The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ati vs nvidia lower resolution image quality

    Discussion in 'Gaming (Software and Graphics Cards)' started by noseguard20, Mar 7, 2008.

  1. noseguard20

    noseguard20 Notebook Enthusiast

    Reputations:
    1
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    0
    How does the image quality compare on notebooks between the ati vs nvidia at lower resolutions 800x600 - 1280x1024.
     
  2. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    Some stuff looks better on an ATI card, other stuff looks better on an nVidia card. Most stuff will look the same.
     
  3. Ithcandos

    Ithcandos Notebook Consultant

    Reputations:
    16
    Messages:
    113
    Likes Received:
    0
    Trophy Points:
    30
    I'd have to agree with Lithus on this one.
     
  4. ahl395

    ahl395 Ahlball

    Reputations:
    3,867
    Messages:
    8,218
    Likes Received:
    72
    Trophy Points:
    216
    Most things will look the same. The two brands are very similar, although nVIDIA is currently doing better with their high-end cards. But at a lower resolution, I don't think it will matter.
     
  5. ClockedRodent

    ClockedRodent Notebook Consultant

    Reputations:
    72
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
    There will be subtle differences between applications in Direct3D/OpenGL or whatever, but I doubt you would notice it at that res.
     
  6. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    The image quality will be fairly similar.

    What you will notice most is a difference in defaults more than anything. But tweak them properly and they will both look the same. ATi's defaults are usually a little richer, but you just need to tweak the nV cards and they are right there alongside. Thank God the oversaturated Digital Vibrance is gone as it was; the new vibrance is a bit better.

    Regardless of resolution, the IQ differences break down to very, VERY minute differences in the quality of AF (favours nV now) and AA (favours ATi), but the differences are very slight, and depend on whether you use those settings at all and whether the higher-quality settings are even playable. Having better AA or AF quality with crippling framerates as a result doesn't matter much.

    In general though both are neck and neck, and despite some misconception noise about IQ differences, there are really very few (probably more in HD playback than in gaming).

    Also don't be fooled by people using the old GF7-series issues to talk about ATi's 'better image quality' nowadays; all those old issues were fixed with the GF8 series, and they even boosted the output quality to 10 bits per channel/component to finally match ATi, Matrox and S3.

    Main thing is to try and get something that runs your app/game at native resolution more than anything, because if you're on an LCD, interpolation is going to be a far greater issue than slightly different algorithms or alpha/gamma differences.
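    [Archive note: the interpolation issue described above can be sketched in a few lines of Python. This is an illustrative example, not from the original thread; the function name and numbers are made up for demonstration. Linearly rescaling a hard black/white edge to a non-native pixel count produces in-between grey values, which is the softness you see when an LCD runs below its native resolution.]

    ```python
    def upscale_linear(row, new_len):
        """Linearly interpolate a row of pixel values to new_len samples,
        roughly what an LCD scaler does when fed a non-native resolution."""
        out = []
        scale = (len(row) - 1) / (new_len - 1)
        for i in range(new_len):
            pos = i * scale          # fractional source position for this output pixel
            lo = int(pos)
            hi = min(lo + 1, len(row) - 1)
            frac = pos - lo
            out.append(row[lo] * (1 - frac) + row[hi] * frac)
        return out

    # A hard edge: four black pixels then four white pixels.
    edge = [0, 0, 0, 0, 255, 255, 255, 255]
    # Non-integer scale factor, analogous to stretching 800 wide to 1280 wide.
    scaled = [round(v) for v in upscale_linear(edge, 13)]
    # The source row contains only 0 and 255; the scaled row gains intermediate
    # grey values around the edge -- that blur is the interpolation penalty.
    print(scaled)
    ```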
     
  7. noseguard20

    noseguard20 Notebook Enthusiast

    Reputations:
    1
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    0
    You say to tweak the nVidia cards to look the same as ATI cards, but couldn't you tweak an ATI card to look even better than it does at default?
     
  8. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    You can tweak them both to look better than default, but once properly adjusted they both produce nearly the same image, with only slight differences in algorithms or in how they use alpha and gamma in their image processing. You really have to look EXTRA hard for the differences, and it's not a glaring "this one is better" difference either; in fact, people often prefer the one that does not match the reference rasterizer (refrast) over the one that does.

    Think of it like this (just pulling numbers out of my sphincter for illustration, no one get all worked up):

    The ATi/AMD solution at default may be 96% of perfect right out of the box, while the nVidia solution may be 94% of perfect right out of the box, but when calibrated they are both 99.44% of perfect. So while both can be improved from default, once they are both tweaked they are the same.

    Like I said, the differences are incredibly minor, like nV's AF not being as angle-dependent, and ATi having more AA methods to choose from and different colour-corrected AA.

    However, usually the resulting differences are so minor that you need a zoomed-in, enhanced still frame to even recognize that there is any difference. Under normal gaming conditions you would never see those minute differences, especially at 30-60 fps where the difference amounts to a single colour/grey-scale grade in one or a few pixels.
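    [Archive note: the "zoomed-in, enhanced still frame" comparison described above can be sketched as an amplified per-pixel difference, a common way reviewers make near-identical renders distinguishable. This is an illustrative example, not from the original thread; the function name, gain value, and sample frames are made up for demonstration.]

    ```python
    def amplified_diff(frame_a, frame_b, gain=16):
        """Per-pixel absolute difference of two greyscale frames,
        multiplied by `gain` and clamped to the 0-255 range.

        With near-identical renders the raw diff is a few grey levels
        at most, invisible to the eye; amplifying it makes the handful
        of divergent pixels stand out."""
        return [[min(255, abs(a - b) * gain) for a, b in zip(row_a, row_b)]
                for row_a, row_b in zip(frame_a, frame_b)]

    # Two tiny 2x3 greyscale "frames" that differ by one grey level in one pixel.
    frame_a = [[100, 100, 100],
               [100, 100, 100]]
    frame_b = [[100, 101, 100],
               [100, 100, 100]]
    # The lone 1-level difference becomes a clearly visible 16.
    print(amplified_diff(frame_a, frame_b))  # [[0, 16, 0], [0, 0, 0]]
    ```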

    This is nothing like the FX era or even the GF7 shimmering era. Now they're so equal you'd pretty much have to be told where and how to look before you might even notice a difference.