The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    XBone will not support Mantle or OpenGL

    Discussion in 'Gaming (Software and Graphics Cards)' started by Zymphad, Oct 20, 2013.

  1. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Gaming without V-Sync has the best responsiveness and FPS, but it causes screen tearing because the GPU's output (your FPS) is never in sync with your monitor's refresh rate; it's most noticeable whenever your FPS exceeds the refresh rate.

    Turning on V-Sync gets rid of screen tearing and caps FPS at your monitor's refresh rate. It provides the smoothest visual experience, but FPS drops to a divisor of the refresh rate (e.g. from 60 straight down to 30) whenever your FPS falls below your refresh rate. It also introduces a lot of input lag.
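    To see why that halving happens, here is a minimal Python sketch of double-buffered V-Sync on a 60 Hz panel (the refresh rate and frame times are assumed example values, not measurements from this thread): a finished frame has to wait for the next refresh tick, so the displayed frame rate snaps to 60, 30, 20, 15 and so on.

        import math

        REFRESH_HZ = 60
        REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes

        def vsync_fps(frame_time_ms):
            # With double-buffered V-Sync a finished frame waits for the next
            # refresh tick, so the effective frame interval is the render time
            # rounded up to a whole number of refresh intervals.
            intervals = math.ceil(frame_time_ms / REFRESH_INTERVAL_MS)
            return 1000.0 / (intervals * REFRESH_INTERVAL_MS)

        for ft in (15.0, 17.0, 34.0):  # rendering at roughly 66, 59 and 29 FPS
            print(f"{ft:4.1f} ms/frame -> {vsync_fps(ft):4.1f} FPS displayed")
        # 15.0 ms -> 60.0 FPS, 17.0 ms -> 30.0 FPS, 34.0 ms -> 20.0 FPS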

    G-Sync provides the best of both worlds with a perfectly tear-free, visually-stunning picture without compromising FPS or input lag at all.

    Think of it this way: G-Sync turns the concept of V-Sync on its head. Instead of syncing the GPU's output to the monitor's refresh rate, it syncs the monitor's refresh rate to the GPU's output. In other words, the monitor no longer has a fixed refresh rate. The refresh rate is now variable and dynamically controlled by the number of frames pushed out by the GPU.
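    A toy timeline makes that difference concrete (this is only a sketch with made-up frame times, not a description of the actual G-Sync hardware): on a fixed 60 Hz panel a finished frame still has to wait for the next refresh tick, while a variable-refresh panel simply refreshes the moment the frame is ready.

        import math

        REFRESH_MS = 1000.0 / 60  # fixed panel refreshes every ~16.7 ms

        frame_times_ms = [20.0, 25.0, 18.0, 30.0]  # assumed GPU render times

        t_gpu = 0.0
        print("frame  ready(ms)  fixed 60 Hz shows it(ms)  variable refresh shows it(ms)")
        for i, ft in enumerate(frame_times_ms, 1):
            t_gpu += ft                                          # GPU finishes the frame
            fixed = math.ceil(t_gpu / REFRESH_MS) * REFRESH_MS   # wait for the next tick
            variable = t_gpu                                     # panel refreshes on demand
            print(f"{i:5d}  {t_gpu:9.1f}  {fixed:24.1f}  {variable:28.1f}")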
     
  2. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    I see, the variable refresh rate synchronized with FPS seems cool. But what about lower-end Nvidia GPUs which cannot push 40-60 FPS, or if I want to play a game near max graphics quality at around 30 FPS? Or, for example, if I have a 60 Hz laptop screen, can't I get the same effect if I lock FPS to 60, or to exactly half of that, 30 FPS?
     
  3. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    An FPS cap will not prevent tearing. All it's doing is chopping the top off the frame rate. It's not perfectly syncing the frames produced by the GPU with the monitor's input and refresh rate.
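    A small sketch of that point, using assumed numbers: without V-Sync the buffer swap happens the instant a frame finishes, and capping to 30 FPS doesn't make those instants line up with the 60 Hz scanout, so the tear line just drifts around the screen.

        REFRESH_MS = 1000.0 / 60      # one scanout pass takes ~16.7 ms
        FRAME_MS = 1000.0 / 30 + 0.4  # "capped to 30 FPS", but real frame times drift a bit

        t = 0.0
        for i in range(1, 6):
            t += FRAME_MS                  # when this capped frame actually completes
            phase = t % REFRESH_MS         # how far into the current scanout the swap lands
            print(f"frame {i}: swap {phase:4.2f} ms into the scanout "
                  f"(~{phase / REFRESH_MS:.0%} of the way down the screen)")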
     
  4. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    Thanks for the explanation! It is quite a difficult topic to understand well, and I was always confused about which FPS/Hz ratio is best for the eyes. However, I'm still uncertain: for example, I used to play with a locked 30 FPS; isn't that better on 60 Hz than pairing 30 FPS with 30 Hz? Wouldn't 30 Hz be too disturbing for the eyes?
     
  5. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No because of what I said previously. You will still get tearing at a capped 30 FPS on a fixed 60 Hz screen because that disconnect between the GPU's output and the monitor's refresh rate happens at any FPS unless the two are perfectly in sync. You need to throw out whatever preconceptions you have about monitors and fixed refresh rates because G-Sync changes everything. Getting 30 FPS on a G-Sync monitor is a much better and smoother visual experience because your monitor will always be controlled by your video card. The screen will only ever display whole frames, giving you that perfect visual experience. 30 FPS will also look less laggy on a G-Sync monitor.

    30 Hz on a CRT would kill your eyes with the flicker, but on an LCD there's no flicker at all since they don't operate on the same technical principles.
     
    Atom Ant likes this.
  6. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
  7. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ..problem is that frame production on your average GPU/CPU setup can never be kept from dropping below 120 Hz, 60 Hz, 30 Hz and so on. That's basically by design, since you can't completely control the rendering process with some sort of program logic.

    So to work around that, you would set v-sync, say at 60hz. And then if the framerate ever drops below that, you would drop a frame or two and reduce the output to the next "half-rate", and switch a whole completed frame. In other words, create the illusion of 60hz by adding dummy-frames, and still switching only completed frames. You would notice that, but you would avoid the screen displaying incomplete frames, which is really what's causing screen-tearing.
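    A rough sketch of that repeat-a-whole-frame behaviour (assumed completion times, not any particular driver's logic): at every 60 Hz tick the panel shows the newest completed frame, and if the GPU hasn't finished a new one yet it simply shows the previous frame again.

        REFRESH_MS = 1000.0 / 60
        frame_done_at = [18.0, 40.0, 58.0, 95.0]  # assumed GPU completion times in ms

        done = iter(frame_done_at)
        next_done = next(done, None)
        shown = 0
        for tick in range(1, 7):
            t = tick * REFRESH_MS
            while next_done is not None and next_done <= t:
                shown += 1                 # a newly completed frame becomes available
                next_done = next(done, None)
            label = f"frame {shown}" if shown else "nothing yet"
            print(f"refresh at {t:5.1f} ms -> displays {label}")
        # frame 3 is shown twice (at ~66.7 ms and ~83.3 ms) to bridge the slow frame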

    And since the industry is run by morons, what we're getting now is a screen that can match the number of frames the actual graphics card is outputting. To create a more seamless and more ..smooth-er... variable framerate.. when outputting from badly programmed game-engines on sub-par last-generation hardware.

    Also, this will make your migraines less intense when watching badly optimized 3d movies and games on bad screens for 12 hour stretches.

    Yay.
     