As you can see, there is no difference between ultra and low; antialiasing makes somewhat of a difference, however. Aside from AA, is there really a point in running ultra over medium settings in most AAA games? I'm thinking of upgrading my laptop, but the games simply don't need the power; I already run almost all games at 60fps on medium. Maybe I'm alone on this one. Do you think games only differentiate low, medium, and ultra by a small factor? I mean, it's not like going from 720p to 4K with TVs, where 4K represents ultra and 720p represents low. 4K is worth it if you enjoy clarity. However, low vs. ultra in most games means only a negligible bump in visuals, unlike what happens with TV resolutions. And I'm not talking about generational changes or Crysis, as obviously games will look better and better with time until they peak at photorealism.
Anyone else notice how games are less and less visually dynamic? Should we be signing a petition or stirring up the industry, asking developers to put our money into developing games that push limits?
and OUT.
Example 2, Example 3, Example 4: [embedded YouTube comparison videos]. I could go on with more games, and I will.
-
yrekabakery Notebook Virtuoso
Using low quality YouTube footage to compare graphical settings? Come on, mate...
-
It varies from game to game. In most cases there is definitely a visual distinction between low and high/ultra. A lot of times ultra just adds on all the extra effects like bloom, depth of field, vignetting, and grass and/or water detail: a lot of stuff that isn't really needed. There is a performance hit, though, which can easily be seen when playing on lower-end hardware. Low settings can give you playable results, but just bumping up to medium or high can make it wholly unplayable. So I like that it gives people with lower-end hardware a decent visual experience that doesn't require a $5000 PC to run.
I don't know that there needs to be an uproar. I like how scalable it is today, and most recently released games struggle to get manageable frame rates on top-end hardware, especially when running higher resolutions like 2560x1440 or even 4K. It's really hard to make a game look that much different between low settings and high settings considering it's all the same polygons. -
MahmoudDewy Gaming Laptops Master Race!
-
If you read my post, you would see we're not talking about clarity or sharpness.
I also mentioned medium vs. ultra. The videos are just an example; obviously low will look worse than ultra to a degree, with a few exceptions like Crysis 3.
Just wondering why games on medium look identical to ultra... I'll test out the most demanding game, Ghost Recon Wildlands, and post comparison pictures of medium vs. ultra in a bit. And don't get me wrong, there are a few games that clearly look superior on ultra. -
EDIT... OK, so there's a pretty big difference between very high and medium... but that's just one game. I'm still pretty sure most games don't bump up visuals very much from medium to very high... thoughts?
-
thegreatsquare Notebook Deity
Tweaking settings can be very effective. Tweaking was integral to my Asus G73jh lasting the better part of 5 years.
http://www.overclock.net/g/a/36947/...layed-2013-retirement/sort/upload_time/page/0
The Thief screenshots are mostly low-medium ...including low textures. The Wolfenstein: TNO, Crysis 3, Witcher 2 & DA:I screenshots also used a mix from low to high, though I don't remember specifics. The Tomb Raider reboot is also heavily tweaked to squeeze in TressFX ...because I had to try. -
Thanks for your post, thegreatsquare. It just crossed my mind that tweaking settings is probably the smartest thing to do vs. upgrading hardware. I want my laptop to last another 2 years, and I have a pretty good feeling I'll be able to game with low/medium/high mixed settings at 1080p 30fps for 2-3 years... what are your thoughts, as you have experience with this? BTW, I got back into gaming a little bit... sick of benchmarking.
-
thegreatsquare Notebook Deity
...Keep native resolution*.
...Try to keep draw distance on high if the detail is noticeable, along with some form of AA and some level of AO.
..."Rob Peter to pay Paul": drop settings to the lowest acceptable level from the start so you have as much room to play with as possible.
...Cut frivolous post-processing effects.
...Keep in mind that towards the end of a console cycle, CPU requirements jump first; if you keep lowering graphics and aren't getting more FPS, the CPU is the issue.
Last resort tweaks:
...Cut FOV by 5-10 degrees in first-person shooters.
...Cut brightness a little; it hides the sins of lower textures.
*The use of variable resolution in some new games has me thinking that I could get away with 1600x900 if it meant better settings ...or really just running at all. That's some time off for me. -
I just tweaked The Witcher 3... looks like ultra and I'm getting 47fps.
I have a 970M... going to upgrade to the 1080's successor and keep it for 4 years.
I don't mind 30fps in most games, and generally, even though my GPU is three and a half years old, it still usually runs games well enough that I don't have to worry about tinkering... with a few exceptions, of course.
I just downloaded the free-to-try game SNOW; it's a new CryEngine snowboarding game... actually quite good... fun.
I run very high and net a 32fps average, but it's consistent... and that's a Crytek engine 2018 game, so I think I'll be OK for one more year... depending on how game requirements evolve, I might hold back on upgrading and get the 3080.. -
thegreatsquare Notebook Deity
I say this because the real next gen [...not the XB1x/PS4 Pro mid-gen refresh for 4k] will be another big jump in CPU/GPU requirements. You don't need to go all out. The closer to a new console gen, the shorter the GPU life [GTX 880m/HD 8970m-R9 m290x buyers can probably attest to this]. Look at the GTX 1070 benchmarks. [The Evil Within 2's numbers are what a CPU bottleneck looks like.]
https://www.notebookcheck.net/Mobile-NVIDIA-GeForce-GTX-1070-Laptop.169549.0.html
The smarter purchase would be to pick up a GTX 2060 laptop, or something with a 1070 when they're being cleared out, because right now you're ~4 years out from new consoles.
...or if you live near a Microcenter, these now:
http://www.microcenter.com/product/486481/1510_156_Gaming_Laptop_Computer_-_Black?ob=1
http://www.microcenter.com/product/486482/1710_173_Gaming_Laptop_Computer_-_Black
[edit] Getting a laptop ~2 years from now would mean looking for a 3050/Ti to run out the generation; you'll want to keep it under $1k as much as possible. -
Yeah, I agree with OP. It feels more like a placebo.
The most significant differences are:
- Texture quality (textures on low are blurred)
- Shadow quality (shadows on low are disabled)
But other than that, the differences are only recognizable if you search for them. It's almost like you need that extra CPU/GPU power only for playing in 4K or virtual reality. -
Yeah, I think I'll go for a 1070 or 2070... thanks so much for the detailed explanation, you are one smart man.
Franzerich: I agree. I play Crysis 3 a lot, and I just set textures to very high and everything else to low; I don't see the need to run everything at very high, as it all looks the same no matter what I do.
So in short:
1) Set textures to ultra
2) Set shadows to medium
3) Set everything else to low and use some AA
4) Just enjoy the game
LOL, I'll be honest, my thread seems like a troll, but I'm serious when I say I have difficulty noticing the difference between medium and ultra, and I have no idea why a microscopic bump in image quality cuts the framerate in half. Developers need to smarten up and make the options worth selecting. -
-
Yeah, not sure what it will be called, but it's going to be epic.
-
I have a 1070 laptop, and I cannot tell the difference between high and ultra, other than occasional differences in FPS. I can, however, almost always see a loss in quality when dropping to medium.
-
Oh wow... it just crossed my mind why there is no difference between medium/high and ultra.
They are console ports.
I just downloaded Fallout 4 and checked medium vs. ultra... there is no noticeable difference.
wonkyfinger, I'm not sure how you can say you see a difference between medium and ultra unless you're avoiding AAA games that are console ports.
Game developers aren't going to make exclusive textures and post-processing just for PC gamers very often; it does happen once in a while, but it's rare. So to say something that is rare is common is kinda trolling.. -
thegreatsquare Notebook Deity
-
Not trolling, just what I see. I do, however, mostly play MMOs.
-
Oh, OK, yeah... that could explain it...
thegreatsquare, I don't really see much difference in those pictures... I looked at one, and the more distant objects that you would not even be looking at while playing the game have more detail, but it's so slight I don't think it justifies half the framerate... what do you guys think? -
-
OK, good... I thought the world was losing its mind.
-
thegreatsquare Notebook Deity
I hate seeing things pop in, so I extend things out to max. The bridge piece floating in midair and the building that will pop in on this shot would annoy the hell out of me.
https://images.nvidia.com/geforce-c...teractive-comparison-001-ultra-vs-medium.html
I also use ini tweaks like uGridsToLoad=7 [stock is "5"], as well as others, to extend lighting and shadows beyond the stock max preset.
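For anyone who wants to try it, this is roughly what that tweak looks like in Fallout4.ini; the exact section placement and the matching cell-buffer value are from memory, so treat them as assumptions and back up your ini first:

[General]
; stock value is 5; higher values load more distant cells at a real CPU/RAM cost
uGridsToLoad=7
; commonly set to (uGridsToLoad + 1)^2, i.e. 64 when uGridsToLoad=7
uExterior Cell Buffer=64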
[Edit] That the distant view of settlements doesn't change with what I build annoys me as well. -
I guess it's understandable; some people don't like pop-in/textures loading. I personally don't look at the far distance while playing; I'm more focused on the character and the enemies I'm fighting, so it is virtually unnoticeable. I wonder how many NBRers don't care for ultra in AAA games; wish I'd made a poll. You never know.
Do games support Multiple graphics settings, or is it placebo
Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jan 2, 2018.