How does the image quality compare on notebooks between ATI and nVidia at lower resolutions (800x600 - 1280x1024)?
-
-
Some stuff looks better with an ATI card, other stuff looks better with an nVidia card. Most stuff will look the same.
-
I'd have to agree with Lithus on this one.
-
Most things will look the same. The two brands are very similar, although nVIDIA is currently doing better with their high-end cards. But at a lower resolution, in my opinion, I don't think it will matter.
-
There will be subtle differences in different applications in Direct3D/OpenGL or whatever, but I doubt you would notice them at that res.
-
TheGreatGrapeApe Notebook Evangelist
The image quality will be fairly similar.
What you will notice most is a difference in the defaults. But tweak them properly and they will both look the same. ATi's defaults are usually a little richer, but you just need to tweak the nV cards and they are right there alongside. Thank God the oversaturated Digital Vibrance is gone as it was; the new vibrance is a bit better.
Regardless of resolution, the IQ differences break down to very, VERY minute differences in the quality of AF (favours nV now) and AA (favours ATi). The differences are very slight, depend on whether you use those settings or not, and also on whether these higher-quality settings are even playable. Having better AA or AF quality but crippling framerates as a result doesn't matter much.
In general, though, both are neck and neck, and despite some noise from misconceptions about IQ differences, there are really very few (probably more in HD playback than gaming).
Also, don't be fooled by people using the old GF7 series issues to talk about ATi's 'better image quality' nowadays; all those old issues were fixed with the GF8 series, and nVidia even boosted the output quality to 10 bits per channel/component to finally match ATi, Matrox and S3 (see the bit-depth sketch after this post).
The main thing is to try to get something that runs your app/game at native resolution, because if you're on an LCD, interpolation is going to be a far greater issue than slightly different algorithms or alpha/gamma differences.
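A quick illustration of the 10-bit vs 8-bit output point above, as a minimal Python sketch of my own (purely illustrative, nothing to do with either vendor's drivers): 8 bits give 256 levels per channel and 10 bits give 1024, so the quantization step on a smooth gradient is roughly four times smaller, which is what reduces visible banding.

# Toy sketch (illustrative only, not vendor code): compare the quantization
# step of an ideal gradient at 8 vs 10 bits per channel.
import numpy as np

def quantize(ramp, bits):
    """Snap a 0..1 ramp to the nearest representable level at this bit depth."""
    levels = 2 ** bits - 1
    return np.round(ramp * levels) / levels

ramp = np.linspace(0.0, 1.0, 4096)        # an ideally smooth gradient
for bits in (8, 10):
    q = quantize(ramp, bits)
    step = np.abs(np.diff(q)).max()       # size of each visible banding step
    print(f"{bits}-bit: {2 ** bits} levels per channel, step ~ {step:.6f}")

Running it prints a step of about 1/255 for 8-bit output versus about 1/1023 for 10-bit, i.e. roughly a quarter of the jump between adjacent shades.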
-
TheGreatGrapeApe Notebook Evangelist
You can tweak them both to look better than default, but once properly adjusted they both produce a near-identical image, with only slight differences in algorithms or how they use alpha and gamma in their image processing (you really have to look EXTRA hard for the differences, and it's not a glaring 'this one is better' difference either; in fact, people often prefer the one that doesn't match the refrast over the one that does).
Think of it like this (just pulling numbers out of my sphincter for illustration, so nobody get all worked up):
The ATi/AMD solution at default may be 96% of perfect right out of the box, while the nVidia solution may be 94% of perfect right out of the box, but when calibrated they are both 99.44% of perfect. So while both can be improved, once they are both tweaked they are the same.
Like I said, the differences are incredibly minor, like nV's AF being less angle-dependent, and ATi having more AA methods to choose from and different colour-corrected AA.
However, the resulting differences are usually so minor that you need a zoomed-in, enhanced still frame to even recognize that there is any difference (see the pixel-diff sketch at the end of this post). Under normal gaming conditions you would never see those minute differences, especially at 30-60 fps, where the difference amounts to a single colour/gray-scale grade in one or a few pixels.
This is nothing like the FX era or even the GF7 shimmering era. Now they're so equal you'd have to pretty much be told where to look and how before you might even really notice a difference.
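To make the 'zoomed-in still frame' point concrete, here is a minimal Python sketch of my own (it assumes you have two screenshots of the same frame, one captured on each card; the filenames are placeholders) that reports the largest per-channel difference and how many samples differ at all. If the largest delta comes out as a grade or two out of 255 in a handful of pixels, that is exactly the kind of difference described above, and not something you would see in motion.

# Minimal comparison sketch; 'ati_frame.png' and 'nv_frame.png' are
# placeholder filenames for screenshots of the same frame from each card.
import numpy as np
from PIL import Image

def compare_frames(path_a, path_b):
    """Return the largest per-channel delta, its location, and the fraction of differing samples."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    assert a.shape == b.shape, "screenshots must be the same resolution"
    diff = np.abs(a - b)                      # per-pixel, per-channel difference
    y, x, c = np.unravel_index(np.argmax(diff), diff.shape)
    return int(diff.max()), (int(x), int(y)), float((diff > 0).mean())

largest, location, changed = compare_frames("ati_frame.png", "nv_frame.png")
print(f"largest per-channel delta: {largest}/255 at pixel {location}")
print(f"fraction of colour samples that differ at all: {changed:.4%}")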