TLDR: read my question at the end!
Now, I thought I knew quite a lot about vsync and how it works, because I've read up on it over the years, but I've never really used it because it's supposed to reduce your frame rate, all things considered. However, I've been experimenting with vsync forced through the NVidia Control Panel in various games, and my framerate observations don't align with what I thought I knew about vsync.
My understanding of vsync was:
- Enabling normal vsync (double buffered) will drop your framerate to half your refresh rate if your GPU can't keep up with the screen refresh rate.
- Triple buffered vsync will allow a framerate above half your refresh rate even when your GPU can't keep up with the screen refresh rate.
(Note: enabling Triple Buffering in the NVidia Control Panel had no effect on fps in any of the games tested, which isn't too surprising given it's only supposed to work with OpenGL games and not DX-based games.)
My observations, though, were sometimes different from the two points above. Here are some examples from my games (vsync forced through the NVidia Control Panel):
- Games that conformed to my understanding of vsync (when fps fell below the screen refresh rate, it was limited to 1/2 or 1/3 of the refresh rate):
- Batman Arkham Origins
- Unigine Heaven Benchmark
- Games that didn't conform to my understanding of vsync (fps stayed above 1/2 of the screen refresh rate when it dipped below the refresh rate):
- Bioshock Infinite
- F1 2012
- Metro Last Light
- Tomb Raider
- Assassin's Creed Unity
- Titanfall
My question really is: With vsync forced through NVidia Control Panel, why are some games dropping to 1/2 or 1/3 screen refresh rate, whereas other games don't drop to 1/2 or 1/3 refresh rate when fps drops below screen refresh rate?
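For reference, here's my mental model of the double-buffered case in code form - a minimal sketch I put together to illustrate it (it assumes a constant GPU frame time and a strict two-buffer swap; it's not taken from any driver documentation):

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ   # ~16.7 ms per refresh

def double_buffered_fps(gpu_frame_time_s):
    """Effective fps when every frame must land on a refresh boundary.

    Assumes a constant GPU frame time and a strict two-buffer swap,
    so a frame that misses one refresh waits for the next one.
    """
    refreshes_per_frame = math.ceil(gpu_frame_time_s / REFRESH_INTERVAL)
    return REFRESH_HZ / refreshes_per_frame

for ms in (15, 17, 25, 34):
    print(f"{ms} ms/frame -> {double_buffered_fps(ms / 1000):.1f} fps")
# 15 ms -> 60.0, 17 ms -> 30.0, 25 ms -> 30.0, 34 ms -> 20.0
```

On that model the framerate can only ever be refresh/1, refresh/2, refresh/3 and so on, which is what Batman and Heaven did - but not the other games.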
-
Not in my experience? I always have vsync on and have never seen only 60-30-60-30; there are always frames in between. Always. Maybe all the games I've ever played use triple buffering by default? But that's a lot of games!
-
Robbo99999 Notebook Prophet
Perhaps dropping to 1/2 or 1/3 of the refresh rate is the exception rather than the rule then. So far only Batman Arkham Origins & Heaven Benchmark have behaved in that manner. I wonder what's causing this disparity between games; anybody have any ideas? I think it was in this Tweakguides article that I got this idea of how v-sync works (see the following quote from http://www.tweakguides.com/Graphics_9.html):
"There is however a more fundamental problem with enabling VSync, and that is it can significantly reduce your overall framerate, often dropping your FPS to exactly 50% of the refresh rate. This is a difficult concept to explain, but it just has to do with timing. When VSync is enabled, your graphics card becomes a slave to your monitor. If at any time your FPS falls just below your refresh rate, each frame starts taking your graphics card longer to draw than the time it takes for your monitor to refresh itself. So every 2nd refresh, your graphics card just misses completing a new whole frame in time. This means that both its primary and secondary frame buffers are filled, it has nowhere to put any new information, so it has to sit idle and wait for the next refresh to come around before it can unload its recently completed frame, and start work on a new one in the newly cleared secondary buffer. This results in exactly half the framerate of the refresh rate whenever your FPS falls below the refresh rate." -
iirc, one of the major issues with vsync (aside from the fps drop) is input lag. If this doesn't bother you, or isn't present in the games you play, then vsync on would be better, as long as your GPU can cope with the fps hit.
-
Yes, input lag and fps jumps, which can cause stuttering or just overall choppy performance in situations where the GPU can barely maintain fps equal to the refresh rate of the LCD.
Here's a good example: http://forum.notebookreview.com/thr...nc-review-by-htwingnut.777906/#gamebenchmarks
This LCD has a 75Hz refresh rate, and the benchmark charts there show good examples of three cases: games bordering 75 fps consistently, games with occasional drops below 75 fps, and games always above 75 fps.
This is where G-sync is beneficial. You get all the benefits of V-sync with none of the drawbacks.
-
That's because vsync will clamp to factors of the refresh rate; this is where G-sync shines, as it doesn't have that limitation. It just doesn't let the frame rate go above the refresh rate, since exceeding it is what causes tearing.
-
Robbo99999 Notebook Prophet
EDIT: I just checked out your review at the link you gave - good review by the way. So you also saw, in the same games I tested, that vsync had no real negative impact on minimum frame rates when dropping below the screen refresh rate: Bioshock Infinite and Metro Last Light were the ones where your results matched mine. And in other games you saw big performance drops when dipping below the refresh rate. Definitely game dependent for some reason, with the same games showing the same behaviour across my Kepler platform and your Maxwell 980M.
Input lag I've not really noticed much in Titanfall when it's at 78fps with v-sync on, and besides, I feel the screen clarity from the lack of tearing lets you pick up targets more easily, so perhaps that outweighs or balances out the small input lag involved. Gsync FTW though on both counts!
-
Unless you're using Adaptive Vsync, Vsync will DEFINITELY clamp to a factor of the refresh rate; the exception is if your frame rate counter isn't sampling fast enough and is showing the average of two factors while the rate swings between them.
For a 75Hz refresh it should be 75 / 37.5 (the counter might round it up or down) / 25 / etc...
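To put numbers on that averaging idea (a hypothetical trace, not measured data): if a game alternates between making one refresh and missing one at 75Hz, the instantaneous rate swings between 75 and 37.5, but a counter that averages over the full second would read 50:

```python
REFRESH_HZ = 75
INTERVAL = 1.0 / REFRESH_HZ

# Hypothetical trace: every other frame misses one refresh, so frame
# times alternate between one and two refresh intervals.
frame_times = [INTERVAL, 2 * INTERVAL] * 25   # 50 frames in 1 second

avg_fps = len(frame_times) / sum(frame_times)
print(f"{avg_fps:.1f} fps")   # -> 50.0, though no frame ran at 50 fps
```
-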
Robbo99999 Notebook Prophet
I take your point on the frame rate counter though; I'm using FRAPS, and I know what you mean. It didn't feel like the frame rate was fluctuating that rapidly though, so I'm fairly confident that the FRAPS reading, while being an average over a short time, was still close to the range of framerates actually seen.
-
60 is close to 3/4 of 78... I wonder if it was allowing rates above half the refresh but still at some divisor. Doesn't make any sense at all, but I suppose it's possible.
-
My testing was with just v-sync, no adaptive v-sync. But yeah, I did find it odd too how it would run at 75 fps, and down at 70 fps, and 60 fps, and sometimes in between, with v-sync enabled. Many times I see it drop from like 60 to 45 to 30 to 20 to 15. But it's still a step factor and not a linear one like with G-sync.
-
Is it possible that FRAPS captures the framerate differently in different games? As in, in some games FRAPS captures the framerate the game is actually rendering at, disregarding what is being sent to the screen, and in others it reports the frames actually drawn on screen.
-
Robbo99999 Notebook Prophet
I think I read that if triple buffering is active then FRAPS can record the frame rate incorrectly: it records the frames drawn by the GPU rather than the ones sent to the screen. Triple buffering lets the GPU run at full pelt, constantly drawing frames into the two back buffers where they keep getting replaced (the third buffer stores the frame that's going to be sent to the screen). So triple buffering could show a higher framerate than what is actually displayed - but I don't think that's the case for normal double buffering, and my games were not using triple buffering.
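A toy model of that measurement gap (my own sketch of the idea, not FRAPS's actual internals):

```python
REFRESH_HZ = 60
GPU_FPS = 90   # GPU renders flat out into the spare back buffers

frames_rendered = GPU_FPS                      # per second
frames_displayed = min(GPU_FPS, REFRESH_HZ)    # screen scans out 60/s max

# The extra frames are overwritten in the back buffers, never shown.
print(f"counter hooked at render time: {frames_rendered} fps")
print(f"actually displayed on screen:  {frames_displayed} fps")
```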