I have a question that's been keeping me curious regarding top-notch laptop GPUs.
I currently have an NVIDIA GeForce GTX 780M GPU, and I went to Notebookcheck.net to see more info about it.
I looked at what games can run on this GPU and noticed that many games will run smoothly... but when it hits Ultra settings, the frame rate drops significantly.
For example, Hitman: Absolution:
low: 79 FPS
med: 75 FPS
high: 64.7 FPS
and then when Ultra kicks in:
ultra: 33.8 / 33.8 / 34.4 FPS
Same with Sleeping Dogs:
high: 97 FPS
and then when Ultra comes in:
ultra: 30.6 FPS
Quite a major difference in games like these.
SOURCE: NVIDIA GeForce GTX 780M - NotebookCheck.net Tech
My question is: why does that happen? Does it require more than the GPU to run a game smoothly on Ultra settings? (At least for some games.)
NOTE: I'm not trying to figure out how to run games on Ultra settings; I'm fine with High settings. I'm just curious about this.
-
Sometimes Ultra turns on some really resource-intensive features. In the case of Sleeping Dogs, it turns on SSAA (supersampling anti-aliasing), which is a massive performance hit.
-
Ultra usually means maximum anti-aliasing, which is massively bandwidth-intensive and not needed at native resolution, plus extra effects like tessellation, post-processing, and shader options. Most of these are not necessary and not worth the performance hit IMHO. I usually turn them off anyhow because they detract from the gameplay.
-
Yeah, some ridiculously GPU-intensive form of anti-aliasing is usually to blame for the huge performance drop-offs between High and Ultra settings in many games. Other examples are Antialiasing Deferred (a form of MSAA that is actually a custom selective supersampling technique) in BF3 and Ubersampling (SSAA) in The Witcher 2.
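As a rough back-of-the-envelope sketch of why supersampling hurts so much: NxSSAA shades roughly N times as many samples per frame, so in an idealized, purely fill-rate-bound model (a simplification; real games are never this clean, and the sample factor here is an assumption, not Sleeping Dogs' actual setting), FPS divides by the sample factor:

```python
# Idealized model: frame time scales with the number of shaded samples,
# so FPS divides by the SSAA sample factor. Real games also have
# CPU-bound and memory-bound work, so the true drop is usually smaller.

def ssaa_fps_estimate(base_fps, ssaa_factor):
    """Estimate FPS after enabling SSAA with the given sample factor."""
    return base_fps / ssaa_factor

# With an assumed 4 samples per pixel, 97 FPS at High would drop to
# about 24 FPS in this model, in the same ballpark as the 30.6 FPS
# Notebookcheck measured for Sleeping Dogs at Ultra.
print(ssaa_fps_estimate(97, 4))  # -> 24.25
```

The point is only that the drop is multiplicative, not additive: doubling resolution in each axis quadruples the shading work.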
Another high-profile example is when DX11 support was added to Crysis 2 and people discovered that Ultra enabled gratuitous amounts of tessellation in places where it made no visual difference, such as on low-poly concrete slabs and on the invisible water mesh beneath the entire level. Crytek's decision to do this was obviously meant to live up to the "legacy" of Crysis and turn Crysis 2 into a system punisher, because a lot of the criticism at launch was that the game didn't look that great and didn't push PCs to their knees. But then they received even more flak for basically wasting GPU power and crippling frame rates for no visual gain with this lazy implementation of DX11 tessellation.
-
So if a game lets me turn off or lower settings like anti-aliasing, should I be able to run some games on Ultra at around 40-50 FPS instead of 30?
-
Don't forget that on Notebookcheck, High is tested at 1366x768, while Ultra is tested at 1920x1080. The higher resolution is a big performance hit all by itself.
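To put a number on the resolution change alone: 1920x1080 has almost twice the pixels of 1366x768. A quick sketch, again assuming an idealized pixel-bound workload (a simplification), shows how far that alone can move the numbers:

```python
# Rough sketch: if FPS scaled inversely with pixel count (it rarely
# does exactly, since some work is per-frame rather than per-pixel),
# the resolution jump alone would account for most of the Ultra drop.

def pixel_ratio(w1, h1, w2, h2):
    """How many times more pixels the second resolution has vs the first."""
    return (w2 * h2) / (w1 * h1)

ratio = pixel_ratio(1366, 768, 1920, 1080)
print(round(ratio, 2))  # -> 1.98, i.e. nearly double the pixels

# Hitman Absolution: 64.7 FPS at High (1366x768) would land near
# 32.7 FPS at 1920x1080 in this model, close to the ~34 FPS
# Notebookcheck measured at Ultra.
print(round(64.7 / ratio, 1))  # -> 32.7
```

So for that particular game, the resolution change could plausibly explain most of the High-to-Ultra gap even before the extra Ultra effects kick in.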
-
I play Sleeping Dogs on Ultra + HD texture pack with AA on normal. It never drops below 90 FPS. I always lower AA levels in games, because most of the time different levels of AA don't make a visible difference (unless it's TXAA).
-
At native resolution I never use AA; it makes little to no difference. Maybe 2x AA if there are a few visible jaggies, but on a 15" or 17" laptop LCD the pixel density of a 1080p screen is usually tight enough that you don't notice them. And even then, if you're really into the game it doesn't matter. I'll take improved FPS over a visual improvement any day, as long as it doesn't affect the gameplay.
-
Well, that satisfies my curiosity. Thanks, guys ^_^
Curious about FPS on Games
Discussion in 'Gaming (Software and Graphics Cards)' started by alexpre888, Oct 14, 2013.