For the record: most people (myself included) notice the difference between 25 FPS and 60 FPS in a video game.
Well, my Go 7950 GTX gets 15-20 FPS at those settings (1280x800, 0xAA, 4xAF, all high settings). With a little more optimization in the engine/drivers, it could get into the playable range.
-
1280x800 high settings all effects enabled running at 30+fps on an 8700GT mobile? I really doubt that.
-
I'm here to kill the messenger.
-
He is the cause of my nightmares...
I want his head...
-
BS, man!!! Then why do movies play at 24-point-whatever fps, and movie theaters sometimes at 72-75 fps?
On, let's say, MythBusters, their high-speed cameras run at around 1000 fps. You're telling me you'd see a bullet at twice the speed from the slo-mo?! Wow, man -
Well, you people should also consider that, ironically... Crysis and Far Cry (and probably Far Cry 2) all use the settings LOW, MEDIUM, HIGH, VERY HIGH.
So I'm not talking max settings when I say high... -
Don't shoot me for asking... But will my G2S-A1 (see sig) run Crysis? If so, at what settings and res and how well?
-
haha yes why not?
-
Probably. I'd say right now 1280x800, medium settings, averaging 30+ fps...
Later, with patches and optimization and maybe new NVIDIA drivers (and Crysis has barely been tested on DX10), I'd say 1280x800 with a mix of medium and high settings, maybe some AA/AF, at 30+ fps -
K, don't listen to him. As posted above, a 7950 GTX on high settings at 1280x800 can't get 30 fps, so how could an 8600GT?
-
Well, I'm just glad I can run it. I haven't really read into what's going on with Crysis, but what I have read clearly said that it will cripple most systems. This is a relief!
Thanks for the responses! -
NVidia? Hi. New DX10 card plz... right now! kthxbai!!
-
I did say DX10, man... and I automatically include effects and physics at full
-
Dude, you own an M570RU and don't even list your GPU... wow...
And yes, NVIDIA, BRING ME YOUR NEW GPUs!!!!
Edit: sorry, didn't see it... well, still, it just says 8700M... lol -
Let's just all forgive and forget.
-
How do you know?! He's saying that for the final release.
-
NO! WAR! WAR! WAR! lol jk
-
Bring IT!
(10 chars) -
Movies play at 24 fps because the MOTION BLUR of the actors is already recorded onto the film.
In games without motion blur, everything looks crisp, so you need a higher FPS to deceive the eye/brain into connecting the crisp-looking frames into smooth motion.
In games with motion blur it is harder to detect the difference between frame rates, but 25 fps is still WAY TOO LOW for a first-person shooter.
25 fps is not suitable for an FPS; you need at least 35+.
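(To illustrate the idea: one common way renderers fake motion blur is temporal accumulation, i.e. averaging several sub-frames sampled across each frame's "shutter" window, so fast movers smear the way they do on film. A minimal sketch, assuming NumPy; render_at is a made-up stand-in renderer, not anything from Crysis itself.)
[code]
import numpy as np

def motion_blurred_frame(render_at, t_start, t_end, samples=8):
    # Render the scene at evenly spaced instants inside the frame's
    # "shutter" window and average them; fast movers smear into a blur.
    times = np.linspace(t_start, t_end, samples)
    acc = np.zeros_like(render_at(times[0]), dtype=np.float64)
    for t in times:
        acc += render_at(t)
    return acc / samples

def render_at(t):
    # Hypothetical stand-in renderer: a bright dot sweeping across a tiny screen.
    img = np.zeros((4, 16, 3))
    img[2, int(t * 480) % 16] = 1.0  # dot moving 480 columns per second
    return img

# One 24 fps frame with a 180-degree shutter exposes for 1/48 s.
blurred = motion_blurred_frame(render_at, t_start=0.0, t_end=1.0 / 48)
[/code]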
EDIT:
Beatsiz, I predict from his ignorance, immaturity, and rashness, is not even in uni; he probably just entered high school.
And all you peeps hoping that Crysis will miraculously run much better when it comes out are chasing fool's gold. The BioShock demo ran FASTER than the final product did; how do you know the same won't happen with Crysis? You're all just leaning on the fact that BioShock plays great at low fps...
And seriously, when the recommended card is an 8800GTS, there is no way you're getting high settings on an 8600GT. The 8800GTS gets 8-10k in 3DMark06 at native res; an 8600GT gets 4-5k at best, and that's with DDR3 (the 8800GTS's shader scores are much higher than the 8600GT's; the 8600GT wouldn't even reach those overall scores without the CPU score skewing the results) -
Generally, 30 FPS LOCKED SOLID is fine for an FPS. Obviously, the higher the better (DUH).
Older console and PC games ran at a locked 30 FPS for a long time before kicking things up to 60 FPS and beyond.
It's the stuttering that actually kills the illusion of high FPS...
You can have 80 FPS, but if it suddenly drops to 50 FPS, you'll still notice a huge discrepancy and "low frame rate," even though 50 FPS is still much higher than 30 FPS. However, if the game runs at a rock-solid 30 FPS, you usually won't notice any "low frame rates," even though 30 FPS is quite low by today's standards.
But, alas, PC games don't usually get the optimizations required for a "locked" FPS, since there's so much different hardware out in the wild, unlike consoles.
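(For anyone wondering how a "locked" frame rate is done in practice: the usual trick is a frame limiter that sleeps away the unused part of each frame's time budget, so fast frames can't race ahead and the pacing stays even. A minimal sketch, in Python for brevity; update_and_render is a hypothetical stand-in for the game's per-frame work.)
[code]
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # about 33.3 ms per frame

def run_locked(update_and_render, num_frames=300):
    # Pace frames against fixed deadlines: after each frame's work,
    # sleep off whatever is left of its 1/30 s budget.
    deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(num_frames):
        update_and_render()  # the game's per-frame work goes here
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        deadline += FRAME_BUDGET  # next deadline, keeping the cadence even
[/code]
-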
No, Crysis is currently in beta, which is much different than a demo version. I would almost guarantee that the final release runs noticeably faster than any of the betas.
Also, the recommended settings assume a decent resolution, at least 1400x1050, maybe even 1680x1050. High settings at that kind of resolution is entirely different from high at 1280x800. High at 1280x800 is probably not going to happen, mind you, but on a DDR3 8600M GT I wouldn't rule out 1280x800 with a mix of high and medium settings. -
In fairness, a mobile 8600GT is slower than a desktop one. Do you honestly think the most visually advanced game ever released, which the developers have said scales both years backwards and forwards, is going to run at high settings at 1280x800 on an 8600GT with 'possibly' some AA??? I love my 8600GT, but let's be honest here: it's not a very good card for the very latest games.
Yes, the final release will be better than the beta, but I've never known a game whose performance went up about 200% from beta to final build. Does anyone?? -
What you wrote is sooo true: in F.E.A.R. I get frame rates between 70 and 40 fps, and sometimes it feels like the game is lagging because of the frame drop to 40.
-
The man speaks the truth. My average framerate nearly doubled when they released Beta Client #3. They've repeatedly said on the Beta Tester Forums that the beta is not optimized (neither drivers nor game engine), and that performance in the final game will be much improved (not to mention the bug-collecting/monitoring programs running in the background for the beta, which hurt performance as well).
-
Why would the minimum system requirements be higher than the recommended? -
The minimum CPU is a single core, whilst the recommended is dual core.
-
My Commodore 64 runs it at max settings...
-
Okay, thanks. Thank god I haven't got my laptop yet.
I was looking at an AA M7950 with:
2.0 GHz processor
1 GB RAM (will upgrade myself)
single 7950, 512 MB graphics card (upgrade later)
I wonder if this would stand a chance of playing on at the very least medium; otherwise I'm going to upgrade NOW -
Exactly, it's not going to quadruple performance, and constantly expecting an 8600GT to run new games at the highest settings just because it's DX10 is stupid. Sure, we can't know until it comes out, but since the recommended card is an 8800GTS, which no mobile card can approach in performance, I'd highly doubt the 8600GT will be able to run it on high.
-
Well, right now I can run the beta at 1280x800 medium... so I see no reason why I wouldn't be able to put a couple of settings, perhaps even more, on high in the final release. 200%?!? Who said 200%?
P.S. I think you guys are making too big a deal out of Crysis (I love it, don't get me wrong)... it's only a beta, and a very badly coded one (as a lot of people have claimed) at that... -
Because anything over 18 fps the brain interprets as motion. The eye can see a lot more fps; it's just not needed to perceive motion, especially if the image is blurred. The less blur, the more fps required to perceive motion.
First off, 500 fps is half of 1000, not twice, and that's not how slo-mo works anyway. The camera captures 1000 frames per second, and the footage is then played back at about 30 fps so it's in slow motion yet still interpreted as motion. If you stretched footage from an ordinary TV camera that only captures 30 fps by the same factor, you'd get about 1 fps and it would look like a slide show.
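(The arithmetic, spelled out: the slow-motion factor is just capture rate divided by playback rate, and it also tells you how far ordinary footage would stretch under the same treatment. A quick sketch:)
[code]
def slowdown_factor(capture_fps, playback_fps=30):
    # How many times slower than real time the footage plays back.
    return capture_fps / playback_fps

print(slowdown_factor(1000))        # ~33.3x: 1 s of action takes ~33 s to watch
print(30 / slowdown_factor(1000))   # ~0.9 fps if 30 fps footage were stretched the same way
[/code]
-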
Wow, man... he said the eye can see 500 fps, so if a 1000 fps camera is slowed down to 30 fps... that means he could see... oh yeah, right... But still, if you could see at 500 fps, you'd pretty much be a reflex god.
And Crysis has motion blur... so where's the need for high frame rates, if the eye perceives motion blur or whatever you were talking about?! -
It's still digital motion blur, not analog. The motion blur is there to make it look more "realistic," not to help the game run at lower frame rates.
-
The number of frames per second your eye can perceive has absolutely no bearing on how fast your reflexes are, so I'm not sure what you're saying. Just because you can see something that quickly doesn't mean you can react that quickly.
Yes, it does have motion blur, but motion blur can never be as effective in a variable-fps interactive situation as it is in something like film, where the fps is locked and the frames are pre-rendered. Even in games with motion blur, the improvement up to 60 fps is definitely noticeable. -
Haven't read through all the posts, but I can tell you you're going to need some serious hardware unless the optimizations are amazing. I'm running the beta now with a 2.66 GHz Core 2 Duo, 2 GB RAM, and an 8800GTS 640MB KO. On high without any AA I get around 45 fps indoors and 35 outdoors. This game is a bad mother.
I'm sure it will play on lower-end cards like the 9800 Pro, but it is going to be rough. -
Sorry about the reflex thing... what I meant was you would basically see stuff others couldn't, so a 30 fps movie would (I believe) look laggy to a person who can see at 500 fps...
"Up to 60 fps" what is noticeable?! -
Just because your eye could physically perceive 500 fps (and btw, I'm not saying that's true; I can't vouch for the member who posted it), that wouldn't mean 30 fps would seem "laggy."
By noticeable I meant that a normal person with normal eyesight can easily see animation quality improve in a typical FPS up to about 60 fps. I don't know about beyond that, because I only really have experience gaming on screens with a 60 Hz refresh rate. -
So we do agree that fps above the refresh rate of your screen can't even be shown by it, so you can't perceive them anyway.
-
1 GB of RAM isn't cutting it anymore, lol. I remember the days when 256 and 512 MB of RAM was a lot.
-
Yes, but that doesn't mean our eyes see in 30 fps; that would be like living in one of those flipbook animations.
-
Man, what's the point of you guys arguing about this crap? I'll put my 2 cents in and say that for me, any game that runs over 30 fps is smooth... Of course there's a difference between 30 fps and 60 fps, and a pretty big one, but anything above that is marginal, imo.
-
I agree. Fact of the matter is, 30 is pretty decent. I was happy to play the demo at that frame rate. Yes, 60 would be optimal, but that may not always be possible depending on your hardware.
And again, as far as requirements go, you need a powerful system to make Crysis look its best. You may be able to play it on older hardware, but why buy a game that is coming out in 2007 and scale it down so it looks like a game that came out two years ago? I know some will say it's about how it plays, not how it looks, but I believe the two go hand in hand.
Even if you own a 1710 or 1730, it will bring your laptop to its knees if you plan on playing at the highest settings at native resolution. I would not even dare try to run it on my laptop, as I know from prior experience (STALKER) that it cannot handle a game like that with any eye candy turned on. -
Just sacrifice some resolution for higher settings if you really want the eye candy.
Playing an FPS at 1024x640 isn't all that bad if you can have all the settings at High instead of Medium at 1280x800 (I'm speaking on behalf of the 8600M GT DDR2 here).
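(Rough numbers on why that trade works: pixel-shading cost scales roughly with resolution, and 1024x640 pushes about a third fewer pixels per frame than 1280x800. A quick check:)
[code]
full = 1280 * 800   # 1,024,000 pixels per frame
low = 1024 * 640    #   655,360 pixels per frame
print(1 - low / full)  # ~0.36: roughly 36% fewer pixels to shade each frame
[/code]
-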
Nice... I'll look into that.
13 days till the Crysis demoooo!!!!!!!!!!!!!! -
And you plan to run it on your GMA 950?
-
This is not a laptop game... especially not in DX10. Doesn't matter if you have an 8700M GT... even if you get it to run, it won't be pleasurable.
As a beta tester (online only), I can tell you it can hardly run on my 7900GTX in DX9.
All medium, no AA, 1440x900, and I get 15-20 FPS... looks like crap too.
Wanna play it? Better get yourself a desktop with an 8800-series card. -
We're all entitled to our own opinions... I think the final release will have a lot of stuff fixed, and you'll be able to enjoy the game on your 7900GTX.
-
How is it supposed to look like crap!? Tell me what other games look better...
It's still unoptimized.
How about 1280x800 and 8800M SLI?
-
Of course I do! Why not?!
lol
No, I will be purchasing a new lappie sooner or later... hopefully before the Crysis release... or before Christmas would be sweet... Or any time NVIDIA releases the 8800M!!!