Wondering if AA is really necessary or not at high resolutions. Is image quality very different with it on at 1900 res, for example?
-
-
If you can, why not? It will look more like a movie.
-
Unless I have a really powerful gfx card, I turn AA off most of the time, or set it low, like 2x. The FPS hit doesn't seem worth the marginally smoother lines. I barely notice it anyway, since I'm always moving around in games. And since you're gaming at 1900 res, I'm going to assume you have a really good gfx card, so yeah...
-
usapatriot Notebook Nobel Laureate
Normally, the higher the resolution the less AA you need.
-
Depends on how close together the pixels are (dpi). For example, once newer games (in a few years' time, lol) start to lag my m9750 (screen dpi: 133), I'll turn the AA down or off first thing, since I barely notice AA anyway on such a dense screen. It doesn't matter how many pixels you have, it's how big they are: the bigger the pixel, the more noticeable aliasing is. Even on my VAIO (screen dpi: ~120) AA is still pretty much undetectable, unless you have really sharp eyes. Also: AA is most noticeable in space games, where there is high contrast between a spaceship and the black background, and Flight Simulator likewise benefits a lot from AA, eye-candy-wise.
See THIS thread to compare how obvious AA is. I've been on many different screens and resolutions, and at 120 dpi+ AA starts to get really unnoticeable, while below around 90-100 dpi AA is almost a must because the jaggies are so obvious -
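As an aside, the dpi figures quoted above are easy to sanity-check from a panel's resolution and diagonal. A rough Python sketch; the panel sizes here are my assumptions for illustration, not confirmed in the thread:

```python
import math

def screen_dpi(width_px, height_px, diagonal_inches):
    """Pixel density along the diagonal: diagonal in pixels / diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Assumed panels (my guesses for the machines named above, not confirmed here):
print(round(screen_dpi(1920, 1200, 17.0)))  # ~133 dpi -- a 17" WUXGA panel like the m9750's
print(round(screen_dpi(1440, 900, 14.1)))   # ~120 dpi -- a 14.1" WXGA+ panel like the VAIO's
print(round(screen_dpi(1280, 800, 15.4)))   # ~98 dpi  -- a common 15.4" WXGA panel, below the
                                            #             ~100 dpi mark where jaggies get obvious
```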
I would recommend keeping it on, even at such a high resolution. It doesn't have to be at high settings; like minxshin said, 2X is often sufficient. If your GPU is very powerful, you can turn it up to 4X or 6X for an even better visual experience. I am aware that the ATI Catalyst Control Center has a feature that evaluates a program, application, game or movie and decides if AA should be turned on, with consideration for your GPU. If it decides it should be turned on, it will also decide at which setting. nVidia may very well also have this feature; however, I am not aware of it at the moment.
-
masterchef341 The guy from The Notebook
I agree with you that the benefits of AA depend strongly on the game, and I also agree that generally, as you increase resolution, aliasing is less noticeable anyway, and therefore there is less benefit to anti-aliasing.
However, I disagree about the correlation between dpi and anti-aliasing. DPI doesn't change what is physically being put onto the screen, unless you are at the point where you can no longer see specific pixel detail because the pixels are so small, but that would mean you could simply use a lower resolution to increase performance without compromising image quality.
Imagine a 3200x2000, 13.3" screen.
"The DPI is so high that anti-aliasing doesn't matter."
True, but the DPI is so high that you can't see any specific pixel detail at all anyway. You could be running 1600x1000 and not see any difference in the image...
Also, AA is often personal preference. I really like 2x AA in a lot of games, but I feel like more than 4x or so actually detracts from quality. It's hard to explain, but it makes the edges too soft in my opinion... -
The whole point of AA is to fool the eye into thinking a row of little squares is not a staircase but a smooth curve; at sufficient dpi the eye can't detect the individual squares and is fooled into seeing a smooth curve anyway.
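For what it's worth, here is a minimal Python/NumPy sketch of the supersampling idea behind AA (one common approach, not anyone's actual driver or renderer code): draw the edge with several samples per pixel, then average them down, so the hard staircase becomes fractional shades of grey.

```python
import numpy as np

def rasterize_edge(w, h, scale=1):
    """Render a diagonal edge (y < 0.6*x) as a binary image, optionally at
    'scale' times the target resolution."""
    W, H = w * scale, h * scale
    ys, xs = np.mgrid[0:H, 0:W]
    return (ys + 0.5 < 0.6 * (xs + 0.5)).astype(float)

w, h = 16, 10
aliased = rasterize_edge(w, h)                           # 1 sample per pixel: hard steps
hi_res  = rasterize_edge(w, h, scale=4)                  # 4x4 samples per pixel
aa      = hi_res.reshape(h, 4, w, 4).mean(axis=(1, 3))   # box-filter down: soft edge

print(aliased[5])       # only 0s and 1s -> the visible staircase
print(aa[5].round(2))   # fractional coverage values -> reads as a smoother edge
```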
-
Well, going to a higher resolution sort of works like AA, as it makes the picture sharper in areas it otherwise wouldn't be. But high resolution + AA just looks amazing if your GFX card can handle it.
-
If the resolution is a little lower (1200) but you turn on AA, will the picture look as great as it would at, for example, 1900 res?
-
masterchef341 The guy from The Notebook
(Let's say you had a CRT monitor, to avoid interpolation concerns.) -
On my 17" screen at 1920x1200, I find that AA doesn't matter quite so much. I can still tell the difference between no AA and 2xAA, but it's not a huge difference in image quality imo, because the "jaggies" aren't very noticeable in the first place.
And yes, I find that a lower resolution (like 1280x800) at 4x AA doesn't look quite as good as 1920x1200 with no AA. When I run a game at the native resolution of my screen, it just looks sharper. -
masterchef341 The guy from The Notebook
1280x800 with AA vs 1920x1200 without AA...
Hmm, I think (performance not being a factor) that 1920x1200 is going to look better.
However, I personally think that quality games at 1280x800 and above look EXCELLENT. You are already a few pixels beyond 720p HD, which the current crop of consoles play at.
Edit: agreed. The most important thing is that whatever screen size you get, you run your games at native resolution. That sharpness is a huge plus in my opinion. I find it is better to give up some physical screen size and play games in 1280x800 (on my 1440x900 screen) instead of interpolating the image to fill up the screen. -
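A quick arithmetic check of the "a few pixels beyond 720p" remark above (just a sanity check, not from the thread):

```python
# Pixel-count comparison mentioned above.
wxga   = 1280 * 800   # 1,024,000 pixels
hd720p = 1280 * 720   #   921,600 pixels
print(wxga - hd720p)  # 102,400 extra pixels, i.e. ~11% more than 720p
```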
With everything graphics-related, it's up to you whether it's necessary. If you don't care, then don't use it.
-
When I played Oblivion I found that full AA at a low resolution was not comparable to no AA at a high resolution.
The game looked leagues better with just the higher resolution.
This is true for a lot of games.
I know I personally think the Source engine looks kinda iffy with FULL AA (just awkward), but very nice with moderate AA. -
So what if you have a 1900x1200 screen but run a game at 1680x1050? Will AA be able to overcome the inherent "jagged" effect that running at a non-native resolution causes?
-
masterchef341 The guy from The Notebook
Non-native resolution doesn't cause a jagged effect; it just makes things look a little blurry.
The effect may or may not bother you. Anti-aliasing won't remove the blurriness caused by interpolation. -
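To illustrate the interpolation point, a rough 1-D sketch in Python/NumPy, assuming a simple linear scaler rather than whatever a given panel actually uses:

```python
import numpy as np

def upscale_1d(row, new_len):
    """Linearly interpolate a 1-D row of pixel values to a new length,
    roughly how a scaler stretches 1680 columns onto a 1920-pixel panel."""
    src = np.arange(len(row))
    dst = np.linspace(0, len(row) - 1, new_len)
    return np.interp(dst, src, row)

# A hard black-to-white edge in a 1680-pixel-wide row...
row = np.array([0.0] * 840 + [1.0] * 840)
stretched = upscale_1d(row, 1920)
# ...gains in-between grey values once stretched to 1920 columns; that softening
# is the blur you see at non-native resolution. In-game AA runs before the
# scaler, so it can't undo this.
print(stretched[955:965].round(2))   # 0, 0, ..., 0.06, 0.94, 1, 1, ...
```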