The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Is AA necessary at high resolutions?

    Discussion in 'Gaming (Software and Graphics Cards)' started by midnitdragoon, Jul 19, 2007.

  1. midnitdragoon

    midnitdragoon Notebook Consultant

    Reputations:
    4
    Messages:
    270
    Likes Received:
    0
    Trophy Points:
    30
Wondering if it is or not. Is image quality very different with AA on at 1900 res, for example?
     
  2. conejeitor

    conejeitor Notebook Evangelist

    Reputations:
    28
    Messages:
    652
    Likes Received:
    6
    Trophy Points:
    31
If you can, why not? It will look more like a movie.
     
  3. minxshin

    minxshin Notebook Consultant NBR Reviewer

    Reputations:
    41
    Messages:
    198
    Likes Received:
    0
    Trophy Points:
    30
    Unless I have a really powerful gfx card, I turn AA off most of the time, or set it low, like 2x. The FPS hit doesn't seem worth the marginally smoother lines. I barely notice it anyways since I'm always moving around in games. And since you're gaming at 1900 res, I'm going to assume you have a really good gfx card so yeah....
     
  4. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    Normally, the higher the resolution the less AA you need.
     
  5. Phritz

    Phritz Space Artist

    Reputations:
    68
    Messages:
    1,276
    Likes Received:
    0
    Trophy Points:
    55
Depends on how close the pixels are together (dpi). E.g. once newer games (in a few years' time lol) start to lag my m9750 (screen dpi: 133), I'll turn down/off the AA first thing, since I barely notice the AA anyway on such a dense screen. It doesn't matter how many pixels you have, it's how big they are: the bigger the pixel, the more noticeable aliasing is. Even on my VAIO (screen dpi: ~120) AA is still pretty much undetectable, unless you have really sharp eyes. Also: AA is most noticeable in space games, where there is high contrast between a spaceship and the black background, and a game like Flight Simulator benefits a lot from AA eye candy wise.


See THIS thread to compare how obvious AA is. I've been on many different screens and resolutions, and at 120dpi+ AA starts to get really unnoticeable; below around 90-100dpi AA is almost a must because the jaggies are so obvious.
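The dpi figures quoted above follow directly from a panel's resolution and diagonal size. A minimal Python sketch of that calculation (the 17" 1920x1200 panel used in the example is an illustrative assumption):

```python
import math

def screen_dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal length in pixels divided by diagonal in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# Example: a 17" 1920x1200 widescreen panel.
print(round(screen_dpi(1920, 1200, 17.0)))  # → 133
```

The same formula gives roughly 90-100 dpi for the 15.4" 1280x800 screens common at the time, which is the range where jaggies start to stand out.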
     
  6. J-Bytes

    J-Bytes I am CanadiEEEn NBR Reviewer

    Reputations:
    109
    Messages:
    811
    Likes Received:
    1
    Trophy Points:
    30
I would recommend it on, even at such a high resolution. It doesn't have to be at high settings; like minxshin said, 2X is often sufficient. If your GPU is very powerful, you can turn it up to 4X or 6X for an even better visual experience. I am aware that the ATI Catalyst Control Center has a feature that evaluates a program, application, game or movie and decides if AA should be turned on, with consideration for your GPU. If it decides it should be turned on, it will also decide at which setting. nVidia may very well also have this feature; however, I am not at the moment aware of it.
     
  7. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    i agree with you that the benefits of AA depend strongly on the game, and i also agree that generally, as you increase resolution, aliasing is less noticeable anyway, and therefore there is less benefit to antialiasing.

    however, i disagree about the correlation between dpi and anti aliasing. dpi doesn't change what is physically being put onto the screen. unless you are at the point where you can no longer see specific pixel detail because the pixels are so small, but that would mean that you could simply use a lower resolution to increase performance without compromising image quality.

    imagine a 3200x2000, 13.3" screen.

    "The DPI is so high that anti aliasing doesn't matter"

    true- but the DPI is so high that you can't see any specific pixel detail at all anyway. you could be running 1600x1000 and not see any difference in the image...

also, aa is often personal preference. i really like 2x aa in a lot of games, but i feel like more than 4x or so actually detracts from quality. it's hard to explain, but it makes the edges too soft in my opinion...
     
  8. Phritz

    Phritz Space Artist

    Reputations:
    68
    Messages:
    1,276
    Likes Received:
    0
    Trophy Points:
    55
The whole point of AA is to fool the eye into thinking a row of little squares is not a staircase but a smooth curve; at sufficient dpi the eye can't detect the individual squares and is also fooled into seeing a smooth curve.
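That "fooling the eye" is literally what supersampling-style AA does: render at a multiple of the target resolution, then average each block of samples down to one output pixel, so hard edges pick up intermediate shades. A toy grayscale sketch (the 2x factor and the sample image are illustrative, not any driver's actual implementation):

```python
def downsample_2x(hi_res):
    """Box-filter a 2x-supersampled grayscale image down to target resolution.

    Each output pixel is the average of the corresponding 2x2 block of
    samples, which softens jagged edges into gradients.
    """
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard diagonal black/white edge rendered at 2x the target resolution:
edge = [
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
print(downsample_2x(edge))  # → [[0.0, 0.75], [0.75, 1.0]]
```

The 0.75 values along the edge are the "in-between" gray pixels that make the staircase read as a smooth diagonal.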
     
  9. link1313

    link1313 Notebook Virtuoso

    Reputations:
    596
    Messages:
    3,470
    Likes Received:
    0
    Trophy Points:
    105
Well, going to a higher resolution sort of works like AA, as it makes the picture sharper in areas it otherwise wouldn't be. But high resolution + AA just looks amazing if your GFX card can handle it.
     
  10. midnitdragoon

    midnitdragoon Notebook Consultant

    Reputations:
    4
    Messages:
    270
    Likes Received:
    0
    Trophy Points:
    30
if the resolution is a little lower (1200) but you turn on AA, will the picture look as great as it would at, for example, 1900 res?
     
  11. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    agreed, but if the dpi is so high that you can't detect the individual squares, then you wouldn't be able to discern any image quality differences even if you lowered the resolution

(let's say you had a crt monitor to avoid interpolation concerns)
     
  12. quiong

    quiong Notebook Consultant NBR Reviewer

    Reputations:
    97
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
On my 17" screen at 1920x1200, I find that AA doesn't matter quite so much. I can still tell the difference between no AA and 2xAA, but it's not a huge difference in the image quality imo, because the "jaggies" aren't very noticeable in the first place.

    And yes, I find that a lower resolution (like 1280x800) at 4x AA doesn't look quite as good as 1920x1200 with no AA. When I run a game at the native resolution of my screen, it just looks sharper.
     
  13. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    1280x800 with AA vs 1920x1200 without AA...

    hmm, i think, (performance not being a factor) that 1920x1200 is going to look better.

however, i personally think that quality games at 1280x800 and above look EXCELLENT. you are already a few pixels beyond 720p HD, which the current crop of consoles play at.

    edit: agreed. the most important thing is that whatever screen size you get- that you run your games in native resolution. that sharpness is a huge plus in my opinion. i find that it is better to give up some physical screen size to play games in 1280x800 (on my 1440x900 screen) instead of interpolating the image to fill up the screen.
     
  14. andrewt1187

    andrewt1187 Notebook Consultant

    Reputations:
    2
    Messages:
    261
    Likes Received:
    0
    Trophy Points:
    30
    With everything graphics related, its up to you if it's necessary. If you don't care, then don't use it.
     
  15. Mippoose

    Mippoose Notebook Deity

    Reputations:
    126
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    When I played oblivion I found that full AA on a low resolution was not comparable to no AA on a high resolution.

    The game looked legions better with just the higher resolution.

    This is true for a lot of games.

    I know I personally think the source engine looks kinda iffy on FULL AA (just awkward), but very nice on moderate AA.
     
  16. Myriad

    Myriad Notebook Guru

    Reputations:
    0
    Messages:
    56
    Likes Received:
    0
    Trophy Points:
    15
    So what about if you have a 1900x1200 screen but run a game at 1680x1050? Will AA be able to overcome the inherent "jagged" effect that running at a non-native resolution causes?
     
  17. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
non-native resolution doesn't cause a jagged effect. it just makes things look a little blurry.

the effect may or may not bother you. anti-aliasing won't remove the blurriness caused by interpolation.
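The blur comes from the panel interpolating the lower-resolution frame up to its native pixel grid. A toy 1-D linear-interpolation sketch of why a hard edge turns into a gradient when upscaled (the input values and 2x factor are illustrative; real scalers typically use bilinear or fancier filters in 2-D):

```python
def upscale_linear(samples, factor):
    """Linearly interpolate a 1-D row of pixel values up by an integer factor."""
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        for step in range(factor):
            t = step / factor
            out.append(a + (b - a) * t)
    out.append(samples[-1])
    return out

# A hard black/white edge picks up an in-between 0.5 sample when scaled up,
# which the eye reads as blur rather than jaggedness.
print(upscale_linear([0.0, 0.0, 1.0, 1.0], 2))
# → [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
```

Note that this is the opposite trade from AA: supersampling averages extra detail away at edges deliberately, while interpolation invents in-between values everywhere, including in areas that should stay sharp.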
     
  18. Myriad

    Myriad Notebook Guru

    Reputations:
    0
    Messages:
    56
    Likes Received:
    0
    Trophy Points:
    15
    Ok, that is what I was confused about.