I've read somewhere that you really can't tell the difference between 24 fps and 60 fps. Is it true? I always felt like I could tell the difference... but my brain might just be playing tricks on me.
-
Depends. With movies, you really can't. In interactive games, you most certainly can. Also note that when a game gets 24fps, that's usually an average, which means it can dip well below 24fps at times as long as it also goes above it. If a game has a minimum of 24fps, it will play pretty smoothly.
-
No, there are actually levels; 24-28fps looks about the same, etc. But between 24 and 60 fps... for games, that difference is humongous.
-
Well, maybe not as much difference in how it looks, but you'll most likely have some input lag at 24 fps.
That's why people prefer 60+ fps: it will 'feel' smooth, even if it doesn't necessarily look that much different. -
The human eye can't detect anything above 30FPS? I think that's what they say, anyway.
-
-
But most don't, and that's why, in my experience, anything above 29 FPS is usually perfectly playable without any sort of "lag"
-
Depends on the game. I think flight simulators and racing games are fine at ~15 to 20 FPS, whereas first person shooters should go for maybe 30+. Also depends on visual effects like motion blur, etc.
-
I wouldn't dare play any online shooter at less than 35 fps; say goodbye to fast aiming and movement due to lag.
-
-
Those who play CS swear that they can tell the difference between 100 and 120 FPS... and I believe them. Here is an interesting explanation:
http://www.100fps.com/how_many_frames_can_humans_see.htm -
punch him in the face if he's using a generic lcd. -
If you really want to be able to tell the difference, move your mouse (pan the view) in a first-person game and watch a fixed object in front of you. At 60 fps, the object stays silky smooth as you move the mouse. At 30 fps, you will notice the object juddering. -
-
-
Hmm, I'd say for games there's no difference; for other things the difference between the frame rates could be massive.
Frame tearing, mouse lag, lots of different factors.
But generally, things should play fine at 24 fps. -
And IMO, you really can't tell the difference between 24 and 60 FPS... the only difference you can really tell is when you change the detail level... that is what you can really see. -
This is not necessarily true depending on what we are talking about.
Some games will look flawless at 30fps; others will not look smooth until 60fps...
Everything from clarity of picture to contrast to individual human sensitivity comes into play.
Most of the people who claim a universal truth at "X" fps for every situation are full of it.
The ones who claim they "Need" 120fps on their screen that shows them a max of 60Hz are the most entertaining.
In general, human eyes are more sensitive than 24fps or 30fps for today's high resolution, high contrast games and screens. This means most of us can tell the difference at a given average frame rate because the MINIMUM framerate drops below a comfortable threshold for the game.
Please note that it is the minimum framerate that makes most of us see the difference. More than likely, everyone here would be more impressed with a rock-solid, never-wavering 24fps than with an average framerate of 60fps that varies from 100fps to 10fps.
It's those moments of 10fps that kill it...
Yeah, you might be able to tell when it is "less smooth" but it doesn't really matter until you can't do something. -
60 fps for me; anything lower than 40 is unplayable.
-
Haven't you people ever played games? It's extremely easy to tell the difference. 60+ for me is a requirement, at least in FPSes.
-
All I know is that consoles seem to be the future of gaming and the most prevalent platform. Console gaming is 30FPS, even for online FPS.
60+ definitely is not a requirement for me or anyone I know who plays FPS on the 360 and PS3. -
-
-
24FPS, like in movies, looks smooth because of motion blur. In games, 24FPS is not smooth. Even in games with motion blur, it just doesn't work out like it does in a movie.
-
-
Play UT2000 capped at 24fps, then talk.
Edit: low fps is noticeable when you have a lot of change in a short amount of time, like in twitch shooters.
Say you pan your view by 180 degrees in 0.5 seconds. At 24fps you only see 12 frames, so each image jumps in 15 degree increments. Say there was a static enemy at 90 degrees from your original heading. Then, with a 90 degree FOV (widescreen), you see the enemy in only 6 frames, and you only have those 6 frames to extrapolate his position relative to you.
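Working that math out as a quick sketch (Python purely for illustration; the numbers are the ones from the example above):

```python
# Quick version of the pan example above: how many frames you get to see
# an enemy while whipping the view around, at different frame rates.

pan_degrees = 180       # total pan
pan_seconds = 0.5       # how long the pan takes
fov_degrees = 90        # horizontal field of view (widescreen)

for fps in (24, 60):
    frames_during_pan = fps * pan_seconds              # e.g. 24 fps -> 12 frames
    degrees_per_frame = pan_degrees / frames_during_pan
    frames_enemy_visible = fov_degrees / degrees_per_frame
    print(f"{fps} fps: view jumps {degrees_per_frame:.0f} degrees per frame, "
          f"enemy on screen for ~{frames_enemy_visible:.0f} frames")
```

At 24fps the enemy is on screen for about 6 frames; at 60fps you get roughly 15 frames to track him.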
Edit 2: Nvidia's 3D Vision glasses use this, so it's actually capped at 60. You're wrong on two counts. -
waaaa?
Well, when you are playing games you can DEFINITELY tell, bro. If you can't, get your eyes checked, cuz they are refreshing at 24 blinks per second. -
That link amik777 posted is a good one. That's why Crysis had motion blur. It took some GPU power, but with motion blur on high it's completely playable and looks great at 20fps. With motion blur off, it looks skippy at 30.
I hope more games start implementing good motion blur effects. -
-
That doesn't remove the fact that games are not smooth at 24 fps, dude.
-
To each his own; I'm personally fine playing most games at 24 FPS. I also don't see why consoles get criticized for having bad FPS. All I know is GTA 4 for the PS3/360 was developed to run at a constant 30FPS, whether in action or just standing on the beach. I don't see why 30FPS is so bad...
-
GTA 4 on PS3/360 barely manages an average of 24fps, let alone 30.
-
In a game like Street Fighter 4, which has startup frames and recovery frames, I wouldn't want anything less than 60 FPS.
This topic has been beaten to death on so many forums that it's really pointless to discuss... If 24fps is satisfactory for you, then by all means keep that as your standard... I don't like anything less than 60fps for the games I play. -
I'll play at any framerate as long as it's not an online shooter...
-
-
I always felt that 60 f/s in FPS games was a huge difference over 30 f/s. I feel it is much less important in strategy games...
-
How many FPS did the original 1987 Street Fighter arcade game run at? That was good enough for me
-
30fps is totally fine as long as it's not a competitive online shooter.
I want every edge I can get without going to ridiculous extents or spending a ton of cash, so yeah, 60+ FPS at pretty much all times is preferable (occasional drops to 50 are tolerable but not great).
Every split second counts, and when you can get an edge on top of your natural reaction time, that's what you need. I hate it when I get killed online by some lucky bugger because of crappy framerates or lag or whatever; it just irks me. A lot of this also has to do with the internet connection, but yeah, more frames, why not? It'll only help. -
People often neglect that indoor lighting is pulsed; it mimics the way monitor refresh rates affect how you view things.
Strobe lights make things look funny, yeah? -
I can clearly see the difference between 30 and 60fps, but it doesn't mean I can't play at 30 (at least in single-player).
15fps vs 30fps vs 60fps:
http://www.boallen.com/fps-compare.html -
30 FPS (consistent, not average) is the point at which the average human eye (and no, "training" makes no difference) stops being able to pick up on things being a sequence of stills vs. smooth motion. Yes, blur and persistence of the image on screen can push things a bit lower than that (which is why movies in theaters running at 24FPS look fine), but if you really want to see the difference, watch some stuff on an NTSC (that's US/Canada) TV and then on a PAL (much of Europe) TV. NTSC runs at 60i (60 interlaced fields, roughly 30 full frames per second) and PAL runs at 50i (50 fields, 25 full frames per second). NTSC looks smooth to the vast majority of people, while PAL looks jittery to many, and causes eye strain even in those who don't consciously notice the jitter.
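To put concrete numbers on those rates, here is how long each frame sits on screen (a trivial arithmetic sketch; Python is just for illustration):

```python
# Time each full frame stays on screen at the rates mentioned above.
# Pure arithmetic, no claims about what any individual eye perceives.
for label, fps in [("film", 24), ("PAL (50i)", 25), ("NTSC (60i)", 30), ("60 Hz display", 60)]:
    print(f"{label:>13}: {1000 / fps:.1f} ms per frame")
```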
Also, as pointed out, fluorescent light bulbs pulse at the frequency of the local power grid (60Hz in the US, 50Hz in much of Europe, which is also the original reason the framerates were set at what they were for NTSC and PAL), so anyone who claims they can differentiate framerates above 50 or 60 Hz must have a hell of a time every time they go to the grocery store or the mall. They would be able to pick up on the flashing of pretty much every light in the building. -
I always try to run games at 60 FPS and above. Anything under 45 looks jittery to me. 24 FPS is a slide show.
I really don't think FPS in games is the same as the 24 FPS we are all familiar with. I think 24 FPS translated to gaming is 60 FPS but that's just me -
I can tell the difference between 60 fps and 100 fps, especially in Counter-Strike, where speed is everything.
A slight delay in crosshair resizing and you'll notice the difference in a kill.
Dunno about these visually exciting HD games though... where multiplayer is much easier. -
Saying that you can differentiate above 60FPS is funny, because if you are using an LCD screen that isn't built for 3D, the max it can refresh at is 60Hz, no matter how fast your game can run.
-
That's not completely true. My guess is he plays with v-sync off, so he can technically have partial frames at faster than 60FPS, just with tearing all over the place.
-
Even if you disable vsync, a 60Hz refresh rate is the physical limit of normal LCD monitors; even with tearing and other artifacts, the monitor is still only displaying 60 images per second.
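To make that concrete: with vsync off, rendering faster than the refresh rate just means each displayed image is stitched together from more than one partially rendered frame, which is where the tearing comes from. A rough sketch of the arithmetic (Python purely for illustration):

```python
# Roughly how many rendered frames contribute to each displayed image
# when vsync is off and the game renders faster than the panel refreshes.
refresh_hz = 60
for render_fps in (60, 100, 120):
    frames_per_refresh = render_fps / refresh_hz
    print(f"{render_fps} fps render on a {refresh_hz} Hz panel: "
          f"~{frames_per_refresh:.1f} rendered frames per displayed image")
```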
-
I can easily tell the difference between a 60Hz LCD and an 85Hz CRT.
-
You can easily see up to 60 fps, and even more, but not much more unless you shut down other brain activity and focus completely on visual feedback. Myself, I can only see up to 50fps or so; past that point I can't tell much difference anymore. -
The most important aspect of frame rates to look out for is the minimum frame rate in games. You can have a high average FPS, but what that doesn't show is the minimum FPS. The minimum FPS is more important, as it can make a 100FPS game feel jittery or a 60FPS game feel constantly smooth. Sure, your eyes can't see more than 60FPS on the common 60Hz LCD (because it can't display any more), but they can most definitely tell whether gameplay is smooth or not. Even if the game only dips down to 1fps for a split second, that's long enough for the eye to see a jolt.
Hence, I believe that FPS should be as high as possible to allow for headroom (for future games?). Failing that, I prefer to aim for the highest minimum FPS possible.
I wish more reviews showed the minimum frame rate; the average is not very useful, as it can be skewed by a very high max FPS. In COD4, for example, there is a max FPS lock in the game. Disabling it in the console allows frames above the lock, but the game will still slow down at the same points. If that is benchmarked, it will show a higher FPS but the same experience as with the lock enabled. I believe the FPS lock was 91FPS in COD4.
Failing a high minimum frame rate, at least keep the frame rate constant; a game is no fun if it's switching back and forth between 60 and 20FPS every few minutes.
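To illustrate that average-vs-minimum point with made-up numbers (a minimal sketch; the frame times are purely hypothetical and Python is just for illustration):

```python
# Why a high average FPS can hide stutter: compare a locked frame rate
# with a run that averages higher but has a few very long frames.
# (Frame times in milliseconds are made up, purely for illustration.)

def summarize(frame_times_ms):
    avg_fps = len(frame_times_ms) * 1000 / sum(frame_times_ms)  # benchmark-style average
    min_fps = 1000 / max(frame_times_ms)                        # worst single frame
    return avg_fps, min_fps

locked_30 = [33.3] * 30                  # rock-solid 30 fps
spiky     = [10.0] * 27 + [100.0] * 3    # mostly 100 fps, with three 10 fps hitches

for name, times in (("locked 30", locked_30), ("spiky", spiky)):
    avg, low = summarize(times)
    print(f"{name}: average ~{avg:.0f} fps, minimum ~{low:.0f} fps")
```

The "spiky" run averages around 53 fps but bottoms out at 10 fps, which is exactly the kind of dip a reviewer's average-only chart hides.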
What's with the 24FPS argument? Movies use motion blur. You can prove this by pausing a movie during a high-action scene: it will have blur trails over anything that's moving. Crysis also used motion blur, which is why it was so playable at 25FPS compared to COD4, which just seems so... unsmooth. Good thing COD4 runs really well.
-
Some people's eyes are more sensitive than others. A good friend of mine and I can clearly see the difference between 60Hz and 80Hz refresh rates, likewise with framerates, while some others cannot. Just depends on the person, but in this case, I think most could clearly see the difference between 24 and 60 fps.
24 fps vs 60 fps
Discussion in 'Gaming (Software and Graphics Cards)' started by RedNara, Mar 24, 2010.