The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Vsync Questions

    Discussion in 'Gaming (Software and Graphics Cards)' started by Robbo99999, Jun 30, 2015.

  1. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    TLDR: read my question in bold at the end!

    Now I thought I knew quite a lot about vsync and how it worked, because I've read up on it over the years, but I've never really used it because it's supposed to reduce your frame rate all things considered. However, I've been experimenting with vsync forced through the NVidia Control Panel on various games & my framerate observations don't align with what I thought I knew about vsync.

    My understandings about vsync were:
    1. Enabling normal vsync (double buffered) will reduce your framerate to half your refresh rate if your GPU can't keep up with the screen refresh rate (the maths of this is sketched in code just below).
    2. Triple Buffered Vsync will allow for a framerate above half your refresh rate if your GPU can't keep up with the screen refresh rate.
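
    To make point 1 concrete, here's a quick toy calculation in Python - purely my own illustration of that textbook model, using my panel's 78Hz refresh rate:

    Code:
    # Toy model of "textbook" double-buffered vsync: a frame can only be
    # shown on a vblank, so the achievable frame rates are refresh_hz / n.
    refresh_hz = 78  # my panel's refresh rate; swap in 60, 75, etc.

    for n in range(1, 5):
        print(f"frame held for {n} refresh(es): {refresh_hz / n:.1f} fps")

    # Output:
    # frame held for 1 refresh(es): 78.0 fps
    # frame held for 2 refresh(es): 39.0 fps
    # frame held for 3 refresh(es): 26.0 fps
    # frame held for 4 refresh(es): 19.5 fps
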
    My observations, though, were sometimes different from the two points above. Here are some examples from my games (Vsync forced through NVidia Control Panel):
    • Games that conformed to my understanding of vsync, that when below screen refresh rate fps will be limited to 1/2 or 1/3 screen refresh rate:
        • Batman Arkham Origins
        • Unigine Heaven Benchmark
    • Games that didn't conform to my understanding of vsync, when below screen refresh rate the fps did not drop to 1/2 or 1/3 screen refresh rate (fps stayed above 1/2 screen refresh rate when fps dipped below screen refresh rate):
        • Bioshock Infinite
        • F1 2012
        • Metro Last Light
        • Tomb Raider
        • Assassin's Creed Unity
        • Titanfall
    Enabling Triple Buffering in the NVidia Control Panel had no effect on fps in any of the games tested, which didn't surprise me too much given it's only supposed to work with OpenGL games & not DirectX-based games.

    My question really is: With vsync forced through NVidia Control Panel, why are some games dropping to 1/2 or 1/3 screen refresh rate, whereas other games don't drop to 1/2 or 1/3 refresh rate when fps drops below screen refresh rate?
     
    Last edited: Jun 30, 2015
  2. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Not in my experience? I always have VSYNC on and have never seen only 60-30-60-30. There are always frames in between. Always. Maybe all the games I've ever played use triple buffering by default? But that's a lot of games!
     
    Robbo99999 likes this.
  3. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I think I'm becoming a v-sync convert now! Just played Titanfall to check it out (that's definitely double buffered, because it says so in the in-game menu), and most of the time it's locked at 78fps (my refresh rate), and when it does dip below 78 it's only dropping to around 60fps. It seems a bit easier on the eyes as well - I think I can track what's going on better - I think screen tearing can mess with your eyes!

    Perhaps dropping to 1/2 or 1/3 of the refresh rate is the exception rather than the rule then. So far only Batman Arkham Origins & Heaven Benchmark have behaved in that manner. I wonder what's causing this disparity between games - anybody got any ideas? I think it was this TweakGuides article where I got my idea of how v-sync works (see the following quote from this link http://www.tweakguides.com/Graphics_9.html):

    "There is however a more fundamental problem with enabling VSync, and that is it can significantly reduce your overall framerate, often dropping your FPS to exactly 50% of the refresh rate. This is a difficult concept to explain, but it just has to do with timing. When VSync is enabled, your graphics card becomes a slave to your monitor. If at any time your FPS falls just below your refresh rate, each frame starts taking your graphics card longer to draw than the time it takes for your monitor to refresh itself. So every 2nd refresh, your graphics card just misses completing a new whole frame in time. This means that both its primary and secondary frame buffers are filled, it has nowhere to put any new information, so it has to sit idle and wait for the next refresh to come around before it can unload its recently completed frame, and start work on a new one in the newly cleared secondary buffer. This results in exactly half the framerate of the refresh rate whenever your FPS falls below the refresh rate."
     
  4. trvelbug

    trvelbug Notebook Prophet

    Reputations:
    929
    Messages:
    4,007
    Likes Received:
    40
    Trophy Points:
    116
    iirc, one of the major issues with vsync (aside from the fps drop) is input lag. if this doesn't bother you, or is not present in the games you play, then vsync on would be better, as long as your gpu can cope with the fps hit.
     
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yes, input lag and fps jumps, which can cause stuttering or just overall choppy performance in situations where the GPU can barely maintain an FPS equal to the refresh rate of the LCD.

    Here's a good example: http://forum.notebookreview.com/thr...nc-review-by-htwingnut.777906/#gamebenchmarks

    This LCD has a 75Hz refresh rate, and you can see some good examples of:

    Bordering 75FPS consistent:
    [FPS-over-time graph]

    Occasional drops below 75FPS:
    [FPS-over-time graph]

    Always above 75 FPS:
    [FPS-over-time graph]

    This is where G-sync is beneficial. You get all the benefits of V-sync with none of the drawbacks.
     
    E.D.U. and Robbo99999 like this.
  6. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    That's because vsync will clamp to factors of the refresh rate; this is where gsync shines, as it doesn't have that limitation. It just doesn't let the frame rate go above the refresh rate, since exceeding it is what causes tearing.
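
    If you model it naively it looks like this - just a toy sketch of my point (ignoring adaptive vsync and frame-time variance):

    Code:
    # Toy comparison: classic vsync snaps to refresh_hz / n, while a
    # variable-refresh panel (G-Sync) shows frames as soon as they finish.
    import math

    def vsync_fps(render_fps, refresh_hz):
        n = math.ceil(refresh_hz / render_fps)  # refreshes each frame occupies
        return refresh_hz / n

    def gsync_fps(render_fps, refresh_hz):
        return min(render_fps, refresh_hz)

    for fps in (80, 70, 50, 40, 30):
        print(f"{fps} fps render -> vsync {vsync_fps(fps, 78):.1f}, "
              f"gsync {gsync_fps(fps, 78):.1f}")

    # 80 fps render -> vsync 78.0, gsync 78.0
    # 70 fps render -> vsync 39.0, gsync 70.0
    # 50 fps render -> vsync 39.0, gsync 50.0
    # 40 fps render -> vsync 39.0, gsync 40.0
    # 30 fps render -> vsync 26.0, gsync 30.0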
     
  7. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    You were getting some pretty big fps drops there in those games - not quite to a factor of the refresh rate, but still bigger fps drops than I saw when enabling v-sync (apart from Arkham Origins, which always dropped to a factor of the refresh rate when not maintaining refresh rate). Curiously, it just seems to depend on the game whether the dips in fps are large or not. I don't really understand why that is - perhaps I don't need to, it's not like I can influence anything by knowing - but I'm curious! Gsync looks awesome though, what we've all been waiting for!

    EDIT: I just checked out your review at the link you gave - good review by the way. Yes, so you also found in the same games that I tested that vsync had no real negative impact on minimum frame rates when dropping below the screen refresh rate: Bioshock Infinite and Metro Last Light were the ones where your results matched mine. And yes, in other games you saw big performance drops when dipping below the screen refresh rate. Definitely game dependent for some reason, and the same games showed similar behaviour comparing my Kepler platform to your Maxwell 980M.


    Yep, that's what I used to think, but my testing & HTWingnut's results there show that it's not always dropping to a factor of the refresh rate (some of his fps drops are close to a factor of the refresh rate), whereas my fps drops were minimal & nowhere near a factor of the refresh rate in most games.


    Input lag I've not really noticed much in Titanfall when it's at 78fps with v-sync on, and besides, I feel that the screen clarity from the lack of tearing allows you to pick up targets more easily, so perhaps that outweighs or balances the small input lag involved. Gsync FTW though, on both counts!
     
    Last edited: Jul 1, 2015
  8. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    Unless you're using Adaptive Vsync, Vsync will DEFINITELY clamp to a factor of the refresh - unless your frame rate counter isn't sampling fast enough and is showing the average of two factors as the frame rate swings between them.

    For a 75Hz refresh it should be 75 / 37.5 (might round up/down) / 25 / etc...
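
    Here's the sort of thing I mean about the counter, as a toy example (the 50/50 time split is just an assumption for illustration):

    Code:
    # If a counter's sampling window is split between two vsync "steps",
    # it reports a blended number that was never a real swap rate.
    refresh_hz = 78
    fast = refresh_hz        # running at full refresh: 78 fps
    slow = refresh_hz / 2    # clamped to the next step down: 39 fps

    frames_per_second = 0.5 * fast + 0.5 * slow  # assumed 50/50 time split
    print(f"counter reports: {frames_per_second:.1f} fps")
    # -> counter reports: 58.5 fps, close to the ~60 fps readings discussed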
     
  9. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    That's what I used to think, but that's not the case. I have a 78Hz screen, and I was seeing fps drops to 60fps in Titanfall at times. Metro Last Light ran the exact same fps curve on the benchmark whether I ran vsync or not (fps varying smoothly anywhere from 30fps to 60fps), and the same goes for Bioshock Infinite and F1 2012 - in fact the only titles that dropped to factors of the refresh rate were Batman Arkham Origins & Heaven Benchmark. (HTWingnut saw similar behaviour in terms of vsync not causing fps drops to factors of the refresh rate - see his link for other games that didn't drop far in fps - and so did Cakefish earlier.) I used to believe exactly what you've just written, but not anymore - it's game dependent whether or not it falls all the way down to factors of the screen refresh rate. It's weird, I don't fully understand why that's the case, but it is.

    I take your point on the frame rate counter though - I'm using FRAPS, and I know what you mean. It didn't feel like the frame rate was fluctuating that rapidly though, so I'm fairly confident that FRAPS' reporting, while being an average over a short time, was still close to the actual range of framerates seen.
     
    Last edited: Jul 1, 2015
  10. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    60 is close to 3/4 of 78... I wonder if it was allowing more than half but still divisible. Doesn't make any sense at all, but I suppose it's possible.
     
  11. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I don't think it works like that. The way I used to understand it, the frame rate had to divide evenly into the screen refresh rate, so for my 78Hz that would be 78, 39, 26, 19.5 (which is 1, 1/2, 1/3, 1/4 of the refresh rate). 60 doesn't divide evenly into 78. However, my testing, and HTWingnut's testing, and Cakefish's show that fps doesn't drop to those extents in all games (it does in some - weird!).
     
  12. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    Yeah that's exactly what I was saying originally, which is why 60FPS at 78Hz refresh with vsync on makes zero sense. Unless it's adaptive vsync.
     
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    My testing was with just v-sync, no adaptive v-sync. But yeah, I did find it odd too how it would run at 75fps, and down to 70fps, and 60fps, and sometimes in between with V-sync enabled. Many times I see it drop from like 60 to 45 to 30 to 20 to 15. But it's still stepped and not linear like with G-sync.
     
  14. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    No steps seen in Metro Last Light in my testing, nor in Bioshock Infinite. 100% definitely not in Metro Last Light: I got the exact same 51fps average for the benchmark run, so it follows the same curve whether v-sync is enabled or not. (This is possible in the first place because that benchmark run never reaches the 78fps of my refresh rate - and note this is not adaptive vsync but normal vsync forced through the Nvidia control panel; vsync was definitely working because fps maxed out at 78 on the loading screen before the benchmark started.) This is also exactly what you saw in the Metro Last Light run you did, HTWingnut - it followed the same curve, vsync or no vsync, when below the screen refresh rate.
     
    Last edited: Jul 1, 2015
  15. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    Is it possible that Fraps captures the framerate differently in different games?
    As in, in some games Fraps captures the framerate the game is actually rendering at, disregarding what is being sent to the screen, while in others it reports the frames actually drawn on screen.
     
  16. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I wondered that for a while, but in the games where it stepped down to 1/2 and 1/3 of the screen refresh rate it really did look like that was the actual framerate - i.e. a greatly reduced framerate (Batman Arkham Origins / Heaven Benchmark). Whereas in the other games I tested, where fps drops were minimal and a lot less than 1/2, the fps felt & looked like only a minor drop - so it looked like FRAPS was reporting accurately (hard to be any more scientific about it though, as this is a subjective and, for me, non-measurable experience).

    I think I read that if Triple Buffering is activated then FRAPS can incorrectly record the frame rate: it records the frames drawn by the GPU, rather than the ones sent to the screen. Triple Buffering lets the GPU run at full pelt, constantly drawing frames into the 2 back buffers, where they keep getting replaced (the 3rd buffer stores the frame that is going to be sent to the screen). So Triple Buffering could show a higher framerate than what is displayed - but I don't think that's the case for normal double buffering. My games were not using Triple Buffering.
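
    Just to illustrate that last point, here's a trivial bit of Python arithmetic (the 110fps render rate is an invented number) showing how a counter hooked to rendered frames could over-report under Triple Buffering:

    Code:
    # With triple buffering the GPU can render faster than the screen
    # consumes frames, so counting *rendered* frames over-reports fps.
    refresh_hz = 78
    render_fps = 110    # unconstrained GPU render rate (made-up number)

    print(f"counter on rendered frames: {render_fps} fps")
    print(f"frames actually displayed : {min(render_fps, refresh_hz)} fps")
    # -> 110 fps reported vs 78 fps shown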
     
    HTWingNut likes this.