Gaming without V-Sync gives the best responsiveness and FPS, but it causes screen tearing whenever the GPU swaps frames in the middle of a refresh, which is especially obvious once your FPS exceeds your monitor's refresh rate.
Turning on V-Sync gets rid of screen tearing and caps FPS at your monitor's refresh rate. It provides the smoothest visual experience, but whenever your FPS can't keep pace with the refresh rate, the displayed frame rate drops in big steps to a whole divisor of it (60 → 30 → 20, and so on). It also introduces a lot of input lag.
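To put rough numbers on that stepping behaviour, here's a back-of-the-envelope sketch (assuming plain double-buffered V-Sync and a constant render time per frame, which is a simplification real games never match):

```python
import math

def vsync_fps(refresh_hz: float, frame_time_ms: float) -> float:
    """Effective displayed rate under double-buffered V-Sync.

    A finished frame has to wait for the next vertical blank, so the
    displayed rate snaps to the refresh rate divided by a whole number.
    """
    interval_ms = 1000.0 / refresh_hz
    intervals_per_frame = math.ceil(frame_time_ms / interval_ms)
    return refresh_hz / intervals_per_frame

print(vsync_fps(60, 15))   # 60.0 -> the GPU keeps up
print(vsync_fps(60, 18))   # 30.0 -> just missing the ~16.7 ms budget halves the rate
print(vsync_fps(60, 35))   # 20.0 -> the next step down, not the ~28 fps you'd get unsynced
```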
G-Sync provides the best of both worlds with a perfectly tear-free, visually-stunning picture without compromising FPS or input lag at all.
Think of it this way: G-Sync turns the concept of V-Sync on its head. Instead of syncing the GPU's output to the monitor's refresh rate, it syncs the monitor's refresh rate to the GPU's output. In other words, the monitor no longer has a fixed refresh rate. The refresh rate is now variable and dynamically controlled by the number of frames pushed out by the GPU.
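If it helps to picture that, here's a toy comparison of when a finished frame actually reaches the screen on a fixed 60 Hz panel with V-Sync versus a variable-refresh (G-Sync-style) panel. The numbers are made up and the panel's minimum/maximum refresh range is ignored:

```python
import math

def display_times(render_done_ms, refresh_hz=60.0, variable=False):
    """When each finished frame appears on screen.

    variable=False: fixed refresh with V-Sync, the frame waits for the next vblank.
    variable=True : variable refresh, the panel refreshes as soon as the frame is ready.
    """
    interval = 1000.0 / refresh_hz
    if variable:
        return list(render_done_ms)
    return [math.ceil(done / interval) * interval for done in render_done_ms]

frames = [14, 31, 53, 70]                      # ms at which the GPU finishes each frame
print(display_times(frames))                   # fixed:    ~[16.7, 33.3, 66.7, 83.3]
print(display_times(frames, variable=True))    # variable: [14, 31, 53, 70]
```

Notice the 22 ms gap between the second and third frame turns into a 33 ms gap on the fixed-refresh panel (a visible hitch), while the variable-refresh panel simply shows the frame 22 ms later.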
-
I see, a variable refresh rate synchronized with the FPS sounds cool. But what about lower-end Nvidia GPUs that can't push 40-60 FPS, or if I want to play a game near max graphics quality at around 30 FPS? Or, for example, I have a laptop with a 60 Hz screen: couldn't I get the same effect by locking the FPS to 60, or to exactly half of that, 30 FPS?
-
An FPS cap will not prevent tearing. All it does is chop the top off the frame rate. It's not synchronizing the frames produced by the GPU with the monitor's refresh timing.
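A quick toy illustration of why (made-up timing, assuming an ideal in-game limiter and ignoring render-time jitter): even a cap that averages 60 fps isn't phase-locked to the panel's scanout, so buffer swaps keep landing partway through a refresh, and that's exactly where the tear line shows up.

```python
scanout_ms = 1000.0 / 60.0   # a 60 Hz panel starts a new scanout every ~16.67 ms
cap_ms = 16.0                # a crude "60 fps" limiter releasing a frame every 16 ms

for frame in range(6):
    swap_time = frame * cap_ms
    # How far through the current scanout the swap lands
    # (0% = right at vblank; anything else = a tear at that height on screen).
    phase = (swap_time % scanout_ms) / scanout_ms
    print(f"frame {frame}: swap at {swap_time:5.1f} ms, {phase:.0%} into the scanout")
```

The tear line also crawls over time, because the 16 ms cap and the 16.67 ms scanout slowly drift against each other.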
-
Thanks for the explanation! It's quite a difficult topic to understand well, and I was always confused about which FPS/Hz ratio is best for the eyes. However, I'm still uncertain: for example, I used to play at a locked 30 FPS. Isn't that better on a 60 Hz screen than pairing 30 FPS with 30 Hz? Wouldn't 30 Hz be too disturbing for the eyes?
-
30 Hz on a CRT would kill your eyes with the flicker, but on an LCD there's no flicker at all, since they don't operate on the same technical principles.
-
Here is some information on what may happen now that we are out of the CRT monitor dark ages. http://techreport.com/blog/25542/a-few-thoughts-on-nvidia-g-sync
-
The problem is that frame production on your average GPU/CPU setup will inevitably drop below 120, 60, or 30 FPS at some point. That's basically by design, since you can't completely control how long each frame takes to render with program logic alone.
So to work around that, you set V-Sync at, say, 60 Hz. Then, if the frame rate ever drops below that, the output falls back to the next "half-rate" and only whole, completed frames are switched in, repeating the previous frame when the new one isn't ready. In other words, you create the illusion of 60 Hz by inserting dummy frames while still only switching completed frames. You notice it, but you avoid the screen displaying incomplete frames, which is what really causes screen tearing.
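Something like this, roughly (a sketch of the idea as I understand it, with made-up frame times and one frame in flight at a time):

```python
refresh_ms = 1000.0 / 60.0               # vblank every ~16.67 ms on a 60 Hz screen
frame_done_ms = [15, 30, 62, 78, 120]    # when the GPU finishes frames 0..4

shown = []
last_shown = None
next_ready = 0
for vblank in range(8):
    now = vblank * refresh_ms
    # At each vblank, switch to the newest frame that is fully finished;
    # if nothing new is ready, keep showing the previous one.
    while next_ready < len(frame_done_ms) and frame_done_ms[next_ready] <= now:
        last_shown = next_ready
        next_ready += 1
    shown.append(last_shown)

print(shown)   # [None, 0, 1, 1, 2, 3, 3, 3] -- some frames are shown twice,
               # but a half-drawn frame never reaches the screen
```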
And since the industry is run by morons, what we're getting now is a screen that can match the number of frames the graphics card is actually outputting, to create a more seamless, smoother variable frame rate when outputting from badly programmed game engines on sub-par last-generation hardware.
Also, this will make your migraines less intense when watching badly optimized 3D movies and games on bad screens for 12-hour stretches.
Yay.