Let's not forget that games are interactive media: they respond to your controls, do something, and then show you the outcome on screen.
And if they can only give you visual feedback 24 times a second, there is a perceptible delay 'between frames' while you wait for your control input to show up on screen.
That's why people who play driving games or FPSs perceive a huge difference between low and high framerates. It's as much about control response as it is about smoothness of motion.
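To put rough numbers on that delay (a quick sketch in Python; the frame rates are the only inputs):

    # Worst case, your input waits roughly one frame interval to show up on screen.
    for fps in (24, 30, 60, 120):
        print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms between frames")
    # 24 fps -> 41.7 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms

A 42 ms wait versus a 17 ms wait is exactly the kind of gap a twitch player feels in the controls.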
It's funny you would say that, because I can easily detect flickering in fluorescent light bulbs. A lot of people can, especially out of the corners of their eyes.
-
All I can say is 24 FPS is NOT a slideshow... 3-10 FPS is...
-
Just to be 100% clear on this: if you run at 120 FPS without vsync on a 60 Hz monitor, roughly half of each rendered frame makes it onto the screen. You obviously won't get full-screen 120 FPS, but it's also not technically true that you get no more screen updates than at 60 FPS. As long as you don't mind tearing, and are one of the superhumans who can actually respond to refreshes that quickly, you will see partial frames at a faster rate.
-
For me it depends on the game.
Console games usually run at 1280x720 or lower at 30fps.
Some games, like Wipeout HD on PS3, run at 1920x1080 at 60fps.
But on PC I need at least 60fps for FPS and racing games; flight sims seem playable at 25+ fps, and in WoW I could live with 30+ fps. -
Well, the wiki has some good info:
"The human visual system does not see in terms of frames; it works with a continuous flow of light/information.[citation needed] A related question is, “how many frames per second are needed for an observer to not see artifacts?” However, this question too does not have a single straightforward answer. If the image is switching between black and white each frame, then this image will appear to flicker when the pattern is shown at rates slower than 30 frames per second. In other words, the flicker-fusion point, where the eyes see gray instead of flickering tends to be around 60 Hz. However, for fast moving objects, frame rates may need to be even higher to avoid judder (non-smooth motion) artifacts. And the retinal fusion point can vary in different people, as well as depending on lighting conditions."
Sounds like 60 FPS.
And motion blur also sounds very important:
"Without realistic motion blurring, video games and computer animations would not look as fluid as on film even with the same frame rate. When a fast moving object is present on two consecutive frames there is inevitably a gap between the images on the two frames which can contribute to a noticeable separation of the object and its afterimage left in the eye. Motion blurring helps to mitigate this effect since it tends to reduce this image gap when the two frames are strung together (the effect of motion blurring is essentially superimposing multiple images of the fast-moving object on a single frame). The result is that the motion becomes more fluid to the human eye even as the image of the object becomes blurry on each individual frame."
Sounds like 30fps is acceptable with good motion blur, and you need 60fps without it (e.g. Crysis).
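Putting rough numbers on that "image gap" (a sketch; the pan speed is an assumption):

    # Assume an object crossing a 1920 px wide screen in one second.
    speed_px_per_s = 1920
    for fps in (24, 60):
        print(f"{fps} fps: the object jumps {speed_px_per_s // fps} px between frames")
    # 24 fps: the object jumps 80 px between frames
    # 60 fps: the object jumps 32 px between frames
    # Motion blur smears each image across part of that gap, hiding the jump.

-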
-
Seems to me the frames/sec barrier is usually invented by people getting destroyed, so they can go back to their computer and their cheats...
I can completely see "noticing" a difference... but "I can't play" is a complete cop-out for cheaters who NEED their cheats.
It's the same excuse console gamers use when they get splattered in the PC version of the same game... I even hooked up a PS3 controller so he'd feel more at home... he still got annihilated...
"Dude, you have to aim now." -
-
If it's perfectly smooth and consistent, most people will never notice a difference between 24fps and 60fps.
In video games, though, you can average 24 frames rendered per second and still see a noticeable difference, because a few frames may take much longer to render than others. What framerate looks smooth in a game depends on the game, the settings, the hardware, and the viewer.
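You can see why with frame times instead of averages (a sketch; the numbers are made up):

    # Two captures with the same ~30 fps average but a very different feel.
    even   = [33.3] * 30                # every frame ~33 ms
    uneven = [20.0] * 27 + [153.3] * 3  # mostly fast, with three big hitches
    for name, times in (("even", even), ("uneven", uneven)):
        avg_fps = 1000 * len(times) / sum(times)
        print(f"{name}: {avg_fps:.0f} fps average, worst frame {max(times):.0f} ms")
    # Both report ~30 fps, but the uneven run stutters visibly.

-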
25 FPS is my usual target. I can tell the difference up to ~40 FPS.
-
It's all about INPUT LAG. If you are at all competitive at any FPS game, you will notice any delayed response to your input, and it usually ends with a bullet in your head.
-
When the next generation of consoles has a standard of 60fps, it will suddenly be the most amazing thing ever. Until then, it doesn't matter.
-
To all of you using the "movies are perfectly smooth at 24 fps!" argument: next time you watch a movie, pay attention to the motion, particularly the camera pans. You'll notice that they are very slow. If movie camera pans were anywhere close to as fast as your average motions in gaming, you'd realize just how visibly obvious the low frame rate is.
tl;dr - 60 fps is better than 24, and yes, you can tell. -
Thank you, shroom!! Finally, someone who gets it. I swing my XM8 way faster than the Hollywood guys pan their cameras.
-
I'll just quote myself...
-
24 FPS looks fine in movies because cameras have exposure time: moving things get blurred in proportion to how quickly they move across the frame during the exposure. Another reason film looks smooth compared to the same framerate on a CRT is that the image is projected constantly rather than drawn pixel by pixel, so there isn't the flashing effect a CRT develops at lower refresh rates.
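For scale (a sketch; film's common 180-degree shutter and the pan speed are the assumptions):

    fps = 24
    shutter_s = 1 / 48        # 180-degree shutter: exposed for half the frame time
    speed_px_per_s = 1920     # assumed pan: one screen width per second
    blur_px = speed_px_per_s * shutter_s
    gap_px = speed_px_per_s / fps
    print(f"blur streak ~{blur_px:.0f} px vs inter-frame jump of {gap_px:.0f} px")
    # blur streak ~40 px vs inter-frame jump of 80 px: the streaks bridge
    # half the jump, which is why film pans read as smooth where an
    # unblurred game at 24 fps would visibly strobe.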
-
If the game's not badly programmed, vsync can even increase actual performance.
-
Vsync decreases performance and uses lots of VRAM... BUT it helps with screen tearing in certain games.
Enable it when needed, turn it off when not... -
Also, how can you say that it decreases performance across the board? I said the game had to be properly programmed, and many aren't (the graphics drawing is too tightly tied to the game logic), but for games that are properly programmed, enabling vsync will increase performance. -
-
Maybe I chose the wrong word with "lots"...
All it does is match the framerate to the refresh rate to stop the tearing... I don't see where you get an increase in performance... -
-
Vsync shouldn't really increase performance, but with triple buffering it shouldn't cut the framerate too much even if the game is a heavy one.
-
I tried Vsync, and hated it. I really don't care about tearing, and everything just feels sluggish with Vsync on.
-
I don't know any games specifically that will take advantage of it.
The theory with vsync is that instead of spending time in the drawing loop, the program can spend more time in physics, game AI, whatever, and can make events more accurate because it runs more physics and AI iterations instead of graphical redraw events that are ultimately discarded.
If you want more info, start looking into 3D rendering and game engines and how they are designed and work.
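A minimal sketch of that theory in Python (update_simulation, render, and swap_buffers_vsynced are hypothetical stand-ins for a real engine and graphics API):

    import time

    def game_loop(update_simulation, render, swap_buffers_vsynced):
        last = time.perf_counter()
        while True:
            now = time.perf_counter()
            update_simulation(now - last)  # physics/AI get the spare time
            last = now
            render()                       # draw once per displayed frame...
            swap_buffers_vsynced()         # ...then block until the next refresh,
                                           # instead of re-rendering frames that
                                           # would never reach the screen

-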
Think of it as turning
event draw draw draw draw event
into
event draw event.
The function call to draw isn't that big an impact on run time anyway; the event queue gets truncated to drop extra draw calls so the global timers can move forward.
However, sometimes a call to draw something takes longer than the delay the global timer allots for getting the image onto the screen. Vsync makes sure the images drawn on the screen are ONLY full images rather than constantly-updated partial ones. In other words, vsync prevents the program from drawing half of an image on the screen and then drawing the next full image over it.
The point of vsync is to match the interval between the last frame on the screen and the next. I don't think it has anything to do with saving time to compute AI and physics. It may relieve your computer of the strenuous work of drawing frames as fast as it can, but that's beside the point.
Vsync does cost some responsiveness due to the nature of game state: if you're not drawing what the game knows at the moment you see it, you have a latency issue. This is why FPS players recommend turning it off: when a person actually pops out of a corner and when you see them pop out are already two different moments due to network latency, and adding display latency on top makes the problem worse.
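The extra display latency is easy to bound, for what it's worth (a sketch that ignores render time and any driver-side frame queue):

    # On a 60 Hz panel, a finished frame that just missed the vblank must
    # wait for the next one before it can be shown.
    refresh_ms = 1000 / 60
    print(f"added latency: 0 to {refresh_ms:.1f} ms, ~{refresh_ms / 2:.1f} ms on average")
    # added latency: 0 to 16.7 ms, ~8.3 ms on average

-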
-
So in a sense it's also somewhat of a power-saving technology, since it reduces unnecessary calculations the GPU has to complete.
-
Allocate resources more effectively.
-
From my understanding, vsync shouldn't really boost performance for AI and physics tasks. Capping the FPS helps more with fluidity, because it limits how fast the computer can draw the screen rather than just delaying swaps to ensure full images are shown. That keeps heavy AI and physics work from disrupting drawing, and avoids the stutter that triple buffering tries to eliminate for vsync, except that the possibility of tearing is still there.
Work on BUFFERING will help with CPU and GPU performance, but vsync simply helps eliminate tearing in games.
It doesn't stop the global timer from executing tasks when it's called in the event queue; rather, it just delays the buffer swap to ensure that what's drawn on the screen is a full frame. That creates a latency issue, not necessarily a performance issue: things in the game are working as they should, but that isn't what appears on the screen at that moment. -
@Levenly:
Let's look at it this way: you are drawing at a very fast rate, say rendering 240 times per second and constantly updating the buffer that is scanned out to the display, while the display has a 60Hz refresh rate.
So for example, if you have a very high framerate and scroll very fast while a pillar is in your view, you will see something like this:
|
_|
__|
(The _ stands in for a space, which would otherwise get collapsed by the HTML.)
Instead of
|
|
|
That means a single refresh of the monitor had its buffer updated twice in the middle of the scanout, causing screen tearing. Vsync synchronizes the framerate to the refresh rate of the monitor.
More info here:
http://en.wikipedia.org/wiki/Vsync
http://en.wikipedia.org/wiki/Page_tearing
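You can reproduce that pillar diagram with a few lines (a sketch; 180 fps and a 3-row "screen" are just convenient numbers):

    # The game scrolls the pillar one column per rendered frame at 180 fps,
    # while a 60 Hz refresh is scanned out top to bottom.
    refresh_ms, frame_ms, rows = 1000 / 60, 1000 / 180, 3
    for row in range(rows):
        t = refresh_ms * row / rows      # moment this row is scanned out
        offset = int(t // frame_ms)      # how far the buffer has scrolled by then
        print("_" * offset + "|")
    # |
    # _|
    # __|

-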
Read the wiki pages on vertical synchronization and triple buffering.
Check these other references:
Triple buffering: http://www.gamecritics.com/triple-buffering-improve-your-pc-gaming-performance-for-free
Enabling vsync fixes Dead Space control issues: http://techreport.com/discussions.x/16061
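For what triple buffering actually changes, here's a hedged sketch (the buffer IDs stand in for real GPU surfaces):

    # With two back buffers the GPU never idles waiting for vblank: it keeps
    # rendering, and at each vblank the display takes the newest finished frame.
    completed = []                       # frames finished since the last vblank

    def render_done(frame_id):
        completed.append(frame_id)

    def on_vblank():
        shown = completed[-1] if completed else None  # newest finished frame
        completed.clear()                # older finished frames are discarded
        return shown

    render_done(41); render_done(42)     # GPU finished two frames this interval
    print(on_vblank())                   # -> 42, shown whole: no tearing

The newest finished frame is shown whole, so you keep vsync's tear-free image without making the GPU stall.

-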
The actual tasks are still executed at exactly the same times they would be regardless of vsync, except that with vsync enabled, something may happen in the game that doesn't appear on your screen at the same moment; hence the latency issue.
The draw is delayed so a full image can be shown, rather than whatever happens to be in the buffer at that instant, which may be half an image.
I feel like I'm repeating myself. -
I know vsync will reduce the tearing effect when the fps is above the refresh rate of the monitor. If that is an "increase" in performance, then I agree.
Note: that is not an increase in performance in my book. -
With triple buffering and vsync, nothing is delayed. Best performance, best image. Before triple buffering you would have been correct about most of your assertions, but with newer game engines and newer GPUs that's no longer the case. I play L4D2 with some pretty good players and I get better accuracy with a higher ping than they do, and I have triple buffering with vsync enabled. If anyone would notice it, it would be me. -
The minimum framerate is what you're after.
-
99.9% of PC games are badly coded then... I don't see any of this so-called increase in performance in the games I own.
-
D3DOverrider is what you need for vsync + triple buffering.
-
No tearing, yes... Smoother gameplay, not always... Hence no performance gain.
Simple logic and gaming experience, not theory.