This has been troubling me for a while now.
Is it better to run games at a high resolution without anti-aliasing, or at a lower resolution with 2x/4x AA? I just can't decide what looks better...
The two setups I've been trying are 1920x1200 without AA and 1440x900 with 2x AA (you can guess my native resolution) on Far Cry 2 and Call of Duty 4, and I've basically been switching between those two setups for almost a week now since I can't decide.
-
A higher resolution means more detail and a larger area you can see.
That also means more of the TEARING EFFECT is visible if you do not turn on anti-aliasing (AA). This is obvious because the screen shows so much more.
If you want to remove the TEARING EFFECT, AA (about 4x or above) is needed.
A smaller resolution means you see less, and everything becomes smaller (if windowed).
A little AA covers up a lot of the TEARING EFFECT (2x to 4x should be enough). -
I prefer a higher resolution over lower res + AA.
-
I can't find anything online about AA fixing a "tearing effect". Anyway, after reading some more about how AA, especially FSAA, works, it makes even less sense to say that a lower res with AA would look better. The assumption here is that we are talking about playing the game full screen no matter what, so the size on screen stays basically the same; only the level of detail changes.
Based on how FSAA works (at least with supersampling), it sounds to me like the best scalers would have effects similar to using FSAA. In any case, higher res translates to higher detail, which means the effects of AA matter less and less. Basically, AA tricks the eye into seeing smoother lines by fuzzing the edges of color boundaries in an image. Scalers can have the same effect. However, just running at a higher res reduces the need to trick the eye, as the finer image (due to the higher resolution) looks better on its own, requiring less trickery.
Basically, if you choose AA with a non-native resolution, you are essentially passing the image through two scalers before it hits your screen, which means maximum fuzziness. Running without AA at a very high res means the highest edge definition, with the possibility of an increase in aliasing artifacts. However, given how high the res is, it becomes much less likely you'll notice the artifacts. So in summation, I'd say go high res without AA over non-native with AA. -
-
Well, I say if you can't decide which looks better, it doesn't matter for you. After all, the point of using a nice gaming laptop (or any gaming rig) is to make games look pretty, and pretty is in the eye of the beholder. If you can't decide which looks better, play at the one that gives you the best performance.
-
native resolution ftw... i can't stand lcd scaling.
-
The only time I noticed tearing is during benchmarks, mostly the Far Cry 2 one, where it's really visible at the end. I've never seen tearing during actual gameplay though, so I'm not that worried about it (yet).
The main reason I mostly prefer 1920x1200 is that anti-aliasing seems to add extra "issues" to games. For example, going into a smoke nade in CoD4 with AA on (even at 2x) pretty much kills your fps. That and a few other bugs I've noticed in games when AA was active basically made me lean towards a higher resolution without AA to avoid those issues.
The only thing I was wondering was whether I was missing any eye candy by not using AA, but from your responses it's not that big a deal at such a high resolution. -
Sorry for my poor English.
Some people misunderstood what high resolution and full screen are.
FULL SCREEN HAS NOTHING TO DO WITH HIGH RESOLUTION.
FULL SCREEN MEANS YOUR GAME IS ROUGHLY STRETCHED TO FIT THE SCREEN.
RESOLUTION ADDS DETAIL, COLOR, MORE VIEWING SPACE AND MORE.
A high resolution gives sharp images.
AA is there to clear up the tearing effect.
Theory 1:
The higher the AA, the less of the tearing effect you see.
Theory 2 (no AA):
The higher the resolution, the more of the tearing effect you see (NOT RELATED TO FULL SCREEN). This is based on mathematical calculation.
For example (no AA):
At 800x600, a tall rectangular building is 100cm and has about 50 tearing artifacts.
At 1024x768, the same building is longer, maybe 120cm, and the tearing artifacts will be more like 60.
I don't like the tearing effect, so I choose AA over high resolution. -
I had the impression that AA was mostly for smoothing edges rather than fixing the tearing effect.
-
Those fantastic movies such as Transformers, Transformers II (releasing soon), Terminator Salvation, AvP, and many more don't have any tearing effects or rough edges.
Don't tell me you'd like to watch Transformers II with a lot of tearing and rough edges =.=!?
Basically, gamers prefer high resolution over AA.
I am an art lover and art student; that's why I love AA more than high resolution, because AA makes the images look perfect (nothing is perfect LOL). -
Yeah, the main issue is that it's really hard to tell the difference between those setups. The rough edges are not that bad at high resolutions. I've been taking game screenshots for a few days now, trying desperately to see the difference between the two setups, and it's really hard to spot anything, hence the post.
-
I was under the impression that the tearing effect was fixed by Vsync
-
Yeah, same here; when people mention tearing in their games, usually the "solution" is to turn vsync on.
-
Would it not be possible to drop the resolution to exactly half the dimensions of your existing screen so that the LCD scaling wouldn't make it look like crap? All it would have to do is duplicate each pixel into the three around it. Sure, you'd be at 960x600 instead, but it wouldn't look like crap.
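In numpy terms that pixel duplication is just a `repeat` along both axes; a two-line sketch (the array contents are arbitrary stand-ins for a frame):
```python
import numpy as np

# Exact 2x integer scaling: every pixel becomes a 2x2 block,
# so no blurry interpolation is involved at all.
half = np.arange(6).reshape(2, 3)                     # stand-in for a 960x600 frame
doubled = half.repeat(2, axis=0).repeat(2, axis=1)    # 4x6: each value appears 4 times
print(doubled)
```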
-
Um, tearing has nothing to do with AA. Tearing happens because your frame buffer refresh and screen refresh get out of sync. V-sync forces front buffer and screen refreshes to sync, so it stops tearing. AA is to reduce jaggies and smooth the appearance of edges.
If a scene is rendered to a high-res image/frame buffer (as opposed to rendered to a low-res image/frame buffer and then scaled, as many programs, including most console games that claim to run at 1080p, do), the effects of AA are very much less noticeable. I'm sorry, but you are just wrong here. Of course, if you have a very big screen, AA will still be noticeable, but if you are talking about a 15" or 17" LCD with WSXGA+ or WUXGA res, the pixel density is so high on those screens that the gains from AA drop way off when compared to just upping the rendering res. -
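To picture what the "out of sync" part means, here's a toy Python model of it (the 90 fps / 60 Hz numbers are made up for illustration, and no real driver works this literally):
```python
# Toy model of tearing: the GPU swaps buffers whenever a frame finishes,
# while the screen scans out top-to-bottom at a fixed 60 Hz. A swap that
# lands mid-scanout splits the displayed image between two frames.

REFRESH_HZ = 60.0            # display refresh rate
FPS = 90.0                   # GPU frame rate, out of sync with the refresh
SCANOUT = 1.0 / REFRESH_HZ   # time to scan one full screen, top to bottom

swap_times = [n / FPS for n in range(1, 10)]   # when each frame finishes

for t in swap_times:
    scan_start = (t // SCANOUT) * SCANOUT      # start of the refresh this swap lands in
    fraction = (t - scan_start) / SCANOUT      # how far down the screen the beam is
    print(f"swap at {t*1000:6.2f} ms -> tear line {fraction:4.0%} down the screen")

# With v-sync the swap waits for the next blanking interval, so the
# 'fraction' is effectively always 0 and no tear line is visible.
```
Run it and the tear line wanders down the screen from frame to frame, which is the rolling tear you see in benchmarks. -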
AA is great if you can handle it on top of running at native resolution, but I would highly suggest dropping AA instead of resolution if you have to make a tradeoff. Fortunately, most recent cards are much better at antialiasing than they used to be, so you can often turn it on with little performance drain. And I highly recommend turning on vsync in any game, especially if you might be CPU limited in it. Turning on vsync prevents frames from being rendered and then thrown away, never to be displayed, and leaves the CPU free for more AI/effects/whatever processing (if it's properly programmed) -
True definition of Anti-aliasing,
In computer graphics, antialiasing is a technique for diminishing jaggies - stairstep-like lines that should be smooth. Jaggies occur because the screen display doesn't have a high enough resolution to represent a smooth line. Antialiasing reduces the prominence of jaggies by surrounding the stairsteps with intermediate shades of color. Although this reduces the jagged appearance of the lines, it also makes them fuzzier.
Sorry, I got the tearing effect wrong =.=!
The tearing effect is what vsync is for. XD!
Conclusion: HIGH RESOLUTION & AA work together to produce superb images and animations.
That's why those fantastic movies (Transformers, Kung Fu Panda, Terminator and so on) don't have jagged lines... because they use AA and VERY VERY HIGH RESOLUTION. If I'm not mistaken, animation software has an option for AA as well. AA is for total beauty XD! -
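To make that "intermediate shades of color" definition concrete, here's a toy Python rendering of the same diagonal edge twice: once with a hard on/off test per pixel (jaggies) and once with shades computed from sub-pixel coverage. The geometry, sizes and shade characters are all arbitrary; it's an illustration, not a real renderer:
```python
W, H, SUB = 24, 12, 4              # image size and sub-samples per axis
SHADES = " .:-=+*#%@"              # 10 brightness levels, dark to bright

def inside(x, y):                  # the "geometry": everything under a sloped line
    return y > 0.5 * x

def shade(coverage):               # map coverage in [0,1] to an ASCII shade
    return SHADES[min(int(coverage * (len(SHADES) - 1) + 0.5), len(SHADES) - 1)]

for aa in (False, True):
    print("with AA:" if aa else "no AA (hard edge):")
    for py in range(H):
        row = ""
        for px in range(W):
            if aa:   # average SUB*SUB sub-samples inside this pixel
                hits = sum(inside(px + (i + 0.5) / SUB, py + (j + 0.5) / SUB)
                           for i in range(SUB) for j in range(SUB))
                row += shade(hits / (SUB * SUB))
            else:    # one sample at the pixel centre: fully on or fully off
                row += shade(1.0 if inside(px + 0.5, py + 0.5) else 0.0)
        print(row)
    print()
```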
-
v-sync is a whole other debate. Personally, I'm a fan of it (I HATE how tearing looks), and if a program does triple buffering, most of the negatives of using v-sync are mitigated.
-
Yes, it's another debate. But enabling vsync could theoretically speed up AA performance, as the card isn't trying to draw as many frames and AA them. FSAA usually involves drawing a much larger frame and shrinking it appropriately, essentially taking sub-samples around every pixel displayed to get the color it should have:
http://en.wikipedia.org/wiki/Antialiasing#Full-scene_anti-aliasing
Since vsync limits the number of frames displayed, it could allow AA to effectively run with less penalty. At least that's the theory... -
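For anyone who wants to see the "draw larger, shrink appropriately" step by itself, here's a minimal numpy sketch of 2x-per-axis ordered-grid supersampling (my simplification; real resolve filters are fancier than a plain box average):
```python
import numpy as np

def render(w, h):
    """Stand-in 'renderer': a hard-edged diagonal, white above the line."""
    ys, xs = np.mgrid[0:h, 0:w]
    return (ys > 0.37 * xs).astype(np.float32)

DISPLAY_W, DISPLAY_H, FACTOR = 8, 6, 2

hi = render(DISPLAY_W * FACTOR, DISPLAY_H * FACTOR)   # oversized frame
# Box-filter downsample: reshape so each 2x2 block gets its own axes,
# then average those axes away. Edge pixels end up with fractional values.
lo = hi.reshape(DISPLAY_H, FACTOR, DISPLAY_W, FACTOR).mean(axis=(1, 3))

print(lo.round(2))
```
The fractional values that show up along the edge after the `mean` are exactly the "fuzzing" everyone is talking about. -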
AA is a big frame rate killer, so for the most part I prefer to run at native res first. It will also depend on the game a little, if there are a lot of edges, you might want to go with AA. For example, I'm running Far Cry 2 at native with 2X AA and at times it feels a little laggy. But I feel that AA is a must in this game because the game world is filled with blades of grass and tree branches that look horrible without AA.
-
ViciousXUSMC Master Viking NBR Reviewer
Depends on the game; older games seem to be more prone to bad jaggies (especially things like a chain-link fence or oddly sloped polys), so on those older games some AA and a lower res would be the better solution.
Newer games are so clean already that if you can manage native resolution instead, you won't need the AA, so that's the better way to go to get the sharper image. -
As pitabread says, most (if not all) game AA is done with supersampling, which is essentially rendering a higher-res image (2X, 4X, 8X, etc., the res of the display buffer) and downscaling it. So in essence, if you run 960X600 at 2X FSAA to render to a 1920X1200 display, your GPU is rendering a 1920X1200 image, downscaling it to 960X600, then outputting it to your display, which turns around and upscales it to 1920X1200 again. So if you run 1920X1200 with 2X FSAA, you are actually rendering a 3840X2400 image and then downscaling it on the GPU.
So in theory, I suppose, depending mainly on the quality of the scaler in your monitor vs the scaling method used on the GPU in the downscale process, you *could* get a *slightly* better image by running at a lower, non-native res with AA on. But it would be very slight, most likely, and probably not worth the effort, so you'd most likely just be better off running at native res with AA off. -
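Spelling those numbers out as plain arithmetic (toy code; it assumes pure supersampled FSAA with a per-axis factor, as described above, and a 1920x1200 native panel):
```python
# Trace what the pixels go through in each setup. This is bookkeeping
# only; the per-axis AA factor matches the convention used in this thread.

PANEL = (1920, 1200)

def pipeline(render_w, render_h, aa):
    steps = [(render_w * aa, render_h * aa, "rendered by the GPU")]
    steps.append((render_w, render_h, "downscaled on the GPU (AA resolve)"))
    if (render_w, render_h) != PANEL:
        steps.append((*PANEL, "upscaled again by the monitor's scaler"))
    for w, h, what in steps:
        print(f"  {w:>4} x {h:<4} {what}")
    print()

print("960x600 with 2x FSAA:")
pipeline(960, 600, 2)
print("1920x1200 (native) with 2x FSAA:")
pipeline(1920, 1200, 2)
```
Three resampling stages versus one clean render is the whole argument in a nutshell. -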
What's the difference with multisampling? I know it's higher performance, and a lot of in-game options probably use it.
I prefer native res first, then AA if I can handle it. It's nice running optimized games like L4D at vsync'd 60 fps, 4x AA, 16x AF at native res though, lol -
MSAA seems to refer to methods of FSAA where not all channels (luma, chroma, alpha, etc.) or render passes (object, shadow, etc.) are subjected to AA. It reduces demand on the system by not requiring as much work, but means not as much AA is actually done. It still uses supersampling, though, just less of it.
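The way I've seen MSAA described, the expensive pixel shading runs once per pixel and only the cheap coverage/depth tests are multiplied, rather than shading every sub-sample like pure supersampling does. A back-of-the-envelope comparison (the cost model here is my own simplification; real GPUs are messier):
```python
# Toy cost accounting, not a real GPU model: compare 4x supersampling
# (shade every sub-sample) with 4x multisampling (shade once per pixel,
# but still run a cheap coverage/depth test per sub-sample).

W, H, SAMPLES = 1920, 1200, 4
pixels = W * H

ssaa_shades   = pixels * SAMPLES   # SSAA/FSAA: full shading per sub-sample
msaa_shades   = pixels             # MSAA: one full shading pass per pixel
msaa_coverage = pixels * SAMPLES   # ...plus cheap per-sample coverage tests

print(f"4x SSAA: {ssaa_shades:,} shader runs")
print(f"4x MSAA: {msaa_shades:,} shader runs + {msaa_coverage:,} coverage tests")
```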
-
Finally, you all understand the importance of AA.
But resolution is for overall image size.
So it is more important, of course. -
FSAA is rarely used anymore because it's a huge performance hit. Other methods of anti-aliasing give the same or better results at much smaller performance hits: MSAA, MSAAQ, etc. ATI and NVIDIA each have different names for them.
-
But MSAA is just FSAA, only not all the rendering passes are done to the higher res buffer. The same things that are true for using FSAA are true for MSAA, only to a lesser degree.
Oh, and resolution has nothing to do with picture "size", only image fidelity. -
AA is important if you want high-quality, good-looking images in your game.
I'll show two images here, one with AA and one without.
Image 1: without AA.
Image 2: with AA. -
Also, the higher the res, the less AA is required to smooth out the edges, for the reasons posted before. At 1280x800, I can set 2x AA and that is good enough to produce a smooth edge.
-
But without any AA the images really suck. -
Regardless of the method of AA, it uses supersampling or some derivative thereof. If you take a 1280X800 image that was generated with 2X AA, at least part of it will have been rendered to a 2560X1600 frame, then downsampled to 1280X800. If you are pushing it to a 1920X1200 display, you are rendering to a buffer that is 33% larger (in each dimension) than the display, downsampling to a resolution that is 33% smaller (again, in each dimension) than the display, then upscaling to 1920X1200. You are honestly going to tell me that an image going through that many transformations is going to look cleaner than an image that is rendered once to a 1920X1200 buffer?
And before anyone comments, yes, in this instance, the appropriate metric is the ratio of dimension along each axis, not the ratio between total pixel count. For processing calculations, pixel count is more important, but in terms of scaling/sampling artifacting, dimensional ratio is more important.
The whole point of AA is to make up for the jaggy edges brought about by a lower resolution by fuzzing edges to fool the eye into thinking the edge is smoother. If you are rendering at a higher resolution, the line already looks smoother by virtue of being at a higher res. Again, yes, if you apply AA at that higher res, it will look better still, but that is not what we are talking about here. -
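You can even put a rough number on that chain of transformations. A toy experiment (assumes Pillow and numpy are installed; the stripe pattern is synthetic, and BOX/BILINEAR are only guesses at the GPU resolve and monitor scaler):
```python
from PIL import Image   # assumes Pillow is installed
import numpy as np

def pattern(w, h):
    """Synthetic 'scene': diagonal stripes defined independently of
    resolution, so any render size samples the same underlying image."""
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    return ((np.sin(xs / w * 400 + ys / h * 250) > 0) * 255).astype(np.uint8)

# One clean render straight at panel resolution...
native = pattern(1920, 1200).astype(np.float64)

# ...versus the full chain: 2x AA render of a 1280x800 frame (2560x1600),
# box-downsample for the AA resolve, then the monitor's upscale.
chain = Image.fromarray(pattern(2560, 1600))
chain = chain.resize((1280, 800), Image.BOX)        # GPU AA resolve (guess)
chain = chain.resize((1920, 1200), Image.BILINEAR)  # monitor scaler (guess)
chain = np.asarray(chain, dtype=np.float64)

print("mean abs difference vs the clean render:",
      round(np.abs(native - chain).mean(), 1))
```
The exact error value means little, but it only ever grows as you add resampling stages. -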
Because the higher the resolution, the less AA is needed. -
-
anti-aliasing - off
vertical sync - off
anisotropic filtering - off
the best -
I am in the minority, but I would rather have a game run at Ultra settings at 1366x768 than High at 1920x1200.
I recently ended up getting a G50vt-x1 over a 7805u because I don't think the games I want to run will be smooth at 1920x1200, and I hate running LCDs at anything other than native res.
It's a moot point with anything less than an 8800m/9800 (maybe even a 9600m GT or 4650) because the fill rate on mid-to-low-end cards is insufficient for monster pixel-count panels. -
So, I messed everything up.
After all, I checked the dictionary and found out XD!
But we don't have graphics cards crazy enough to produce a five-digit resolution.
Thus, we need AA.
If we did have such a crazy graphics card, we might need a cinema screen to display it as well. It is not practical. -
-
Sigh, this is still too hard.
I had some free time, so I decided to take some screenshots to help us out.
The first one is at 1280x800 with 4x AA, the second at 1440x900 with 2x AA and the third at 1920x1200 with no AA.
For the first time I managed to see the difference with no AA (notice the electricity poles; that's the main reason I took the screenshots at that spot). But to be honest, you can't really tell the difference while moving around etc.
And here's another random one I took today. Sadly I captured it using Fraps before I noticed that the Print Screen button actually takes a screenshot and saves it in the Far Cry folder, so it's a bit dark ;( But it's mostly to show what I mean when I say you don't notice the difference that much with AA off while you're moving around.
-
That's not 100% representative of what you are comparing, though. Print Screen just saves off the front buffer, so it is at render resolution, not display resolution. To really get the full effect, you'd have to scale the two lower-res images up to 1920x1200 in Photoshop or something, then compare them.
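If you don't have Photoshop handy, a couple of lines of Python does it too (assumes Pillow is installed; the file names here are made up for the example, and bilinear is only a guess at what the LCD's scaler does):
```python
from PIL import Image

PANEL = (1920, 1200)   # native panel resolution
for name in ("fc2_1280x800_4xAA.png", "fc2_1440x900_2xAA.png"):
    img = Image.open(name)
    # upscale the low-res shot the way the monitor would have,
    # then compare it against the native 1920x1200 screenshot
    img.resize(PANEL, Image.BILINEAR).save("scaled_" + name)
```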
-
I'm noticing what look like moire effects in a few places in those images, and they actually look the same or worse in the lower res with AA shots. Do you know which flavor of AA that game is using? I'm guessing some form of MSAA, as it doesn't look like all the textures are getting the AA treatment.
-
This already means that AA is important for producing images without jagged lines.
That way, the images produced can be high quality with a high "art sense". -
The image on the left is low resolution with AA.
The image on the right is high resolution without AA.
IT'S D@MN OBVIOUS that the left image is better than the right image.
My theory was correct after all. T.T -
I'll take a lower res to get AA.
I hate no AA; it looks crappy. -
Soviet Sunrise Notebook Prophet
Strictly arbitrary, my vote goes to high/native resolution with no AA.
High resolution or lower resolution with anti-aliasing?
Discussion in 'Gaming (Software and Graphics Cards)' started by CooLMinE, Jun 4, 2009.