I see this as off by default in the Nvidia Control Panel. Do you guys enable it or keep it off? I read some articles about it, but I didn't really understand them. If someone can explain in layman's terms what it does, I'd greatly appreciate it.
Regards,
Spartan@HIDevolution
-
Normally, when you enable (double-buffered) V-Sync, your frame rate automatically drops to integer fractions of your monitor's refresh rate whenever you cannot maintain a frame rate at least equal to the refresh rate. For example, on a 60Hz monitor, if you cannot maintain 60 FPS, it drops to 30 FPS. If you cannot maintain 30 FPS, it drops to 20 FPS. If you cannot maintain 20 FPS, it drops to 15 FPS. And so on. This happens because any frame that misses a refresh has to wait for the next one. Triple buffering fixes these drops. It does nothing when V-Sync is disabled.
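If it helps, here's a rough Python sketch of my own (not from any article, and it assumes every frame takes the same time to render) showing why the frame rate snaps to those values:

```python
# Toy calculation: with double-buffered V-Sync, a frame that misses a refresh
# interval has to wait for the next one, so the effective frame rate snaps to
# refresh / n for whole numbers n (60 -> 30 -> 20 -> 15 on a 60Hz panel).
import math

def vsync_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective FPS under double-buffered V-Sync for a steady render rate."""
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)  # refresh intervals each frame occupies
    return refresh_hz / refreshes_per_frame

for fps in (75, 59, 45, 29, 19):
    print(f"{fps} FPS rendered -> {vsync_fps(60, fps):.0f} FPS displayed")
# 75 -> 60, 59 -> 30, 45 -> 30, 29 -> 20, 19 -> 15
```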
The triple buffering setting in the Nvidia Control Panel only works in OpenGL applications, while the vast majority of games use Direct3D, so it's pretty much useless. Neither Nvidia's nor AMD's drivers offer native support for triple buffering in Direct3D, so you need to use external tools such as D3DOverrider or RadeonPro.
I never turn on triple buffering because I never play with V-Sync on in the first place. I can tolerate a little tearing, but I can't stand input lag and judder. Variable refresh rate tech (G-Sync/FreeSync) is the ultimate no-compromises solution.
-
Also, he has SLI, so he's getting quad buffering anyway.
-
In what way does triple buffering fix these drops?
-
It used to be that when your FPS exceeded G-Sync's upper range, the monitor defaulted to V-Sync-on behavior to eliminate tearing at the expense of input lag. As you can imagine, this would be disastrous for twitch shooters, so pretty much anybody who played those games disabled G-Sync if they had such a monitor.
However, AMD's FreeSync solution did allow you to choose what you wanted the monitor to do if you exceeded FreeSync's VRR range: you could either let the monitor default to V-Sync on, or de-sync at the expense of (barely noticeable) tearing but with no input lag. Despite many (possibly paid) review sites blasting FreeSync left, right, and center, this was the one thing everybody agreed FreeSync did right, and it obviously bothered nVidia.
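To make the two behaviors concrete, here's a tiny sketch of my own (a simplification, not anything from AMD's or nVidia's documentation) of what happens to a finished frame depending on the above-range policy:

```python
# Simplified illustration of the two above-range policies discussed above.
def handle_frame(render_fps: float, vrr_max_hz: float, vsync_above_range: bool) -> str:
    """What happens to a finished frame, given the VRR ceiling and the chosen policy."""
    if render_fps <= vrr_max_hz:
        # Inside the VRR window: the monitor refreshes when the frame is ready.
        return "shown on a synced refresh (no tearing, no added lag)"
    if vsync_above_range:
        # Above the window with V-Sync on: the frame waits for the next fixed refresh.
        return "held for the next refresh (no tearing, but added input lag)"
    # Above the window with V-Sync off: the frame is scanned out right away.
    return "shown immediately mid-scan (possible tearing, no added lag)"

print(handle_frame(200, 144, vsync_above_range=False))  # the FreeSync-style de-sync option
print(handle_frame(200, 144, vsync_above_range=True))   # the old forced V-Sync-on behavior
```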
So starting with driver 353.06, nVidia finally added that option. But nVidia being nVidia, they had to make setting it as obfuscated as possible. I mean, just read this (feel free to barf at the bolded part):
It's almost as if nVidia was too embarrassed to publicly admit they had overlooked such an important feature, so they had to sneak it in without making it overly obvious. nVidia be nVidia.
-
Obviously I have no idea; I don't use G-Sync. I'm just going by the statements made.
-
That makes sense. What I'm wondering in all this is why anyone would want to use G-Sync without capping the frame rate. The whole purpose of it is to rid yourself of those kinds of artifacts; at least, that's what I'd be using it for.