Ditto. I think the high res one looks better, and the moire is generally more pronounced in the low res images. The moire looks like a lighting effect at first, but when you look closely you can see it is moire. For instance, look at the barrel and brown sack in the lower right of the image. There is banding that shows up on them, i.e., moire. It also shows up on some other similar surfaces.
There are still jaggies in the other ones too. And if you look at them at full res, I really don't see that much difference between the jaggies at high res and at low res with AA. And all the textures that aren't getting AA applied (because it is probably MSAA) look much worse in the low res images.
-
Mmm, just tested low resolution with 4xAA and it looks really awful while moving around. Basically I tried a sunset and a sunrise to check which combination looks better, and at low resolution with 4xAA everything looks a bit too blurry, while 1920x1200 looks much, much better.
I think in the end it comes down to what you prefer: smoother edges vs image quality.
As you can see from those two screenshots, even without AA the environment looks stunning; at any resolution other than the native one (1920x1200 in my case) it just looks too blurry to actually "enjoy" it.
Sorry about the first one again, apparently Fraps doesn't like FC -
Yea, I kinda noticed that's what AA does, it almost blurs things into being straight lol. A few pages back someone posted Left 4 Dead images with it on and off... you hardly noticed those edges unless up close anyway. Low AA in the situation above actually serves to make things look "crisper"... sometimes... high resolution = more important than AA
-
I would definitely stick with AA. I don't want to watch Transformers II with jagged, crawling lines.
This is simply image quality (pixels) vs smooth image edges (art sense and a natural look). -
But the edges aren't really any smoother, they are just smudged.
-
The image on the right looks natural (the sun should shine like that). -
A bit of smudging makes the image closer to nature.
As I said before, art sense versus no art sense. -
lol. It has nothing to do with "art sense". I understand the theory of why AA looks better; I just don't agree that it looks better than a native higher res image.
-
Some people do love a VERY BIG IMAGE.
Cheers! -
The problem with those two setups is that the one with AA looks better as far as edges go. There are two problems with that.
1) Even without AA you can't really notice the edges, since to notice them at that resolution you need to pay close attention to whether the edges are smooth or not, AND the thing you are checking has to be against a bright background (the sky, for example) in order to see them clearly.
2) With AA at lower resolution the edges are much harder to see, non-existent to the human eye I could say. The only problem is that the rest of the textures, like the ones I noticed (trees, sunshine, grass, etc.), look too blurry. Not sure if it's because these objects are small and there is a lot for AA to work on, so in the end it blurs the object instead of making it smoother.
So even if AA helps with the edges, it actually creates another problem. Let's face it: in a game with nice image quality (taking Crysis/Far Cry 2 as examples) you don't run around checking edges, you enjoy the environment while you go through the story of the game. In my opinion it's basically trading image quality for object smoothness, which might be a hard decision for some. Well, to be honest, I myself am a bit troubled by it (apparently). But after starting to pay more attention to the difference between AA at low resolution and high resolution without it, you can clearly see the "extra" blurriness you get with it across the whole game, not only on the edges.
I took the first one with Fraps, so the image quality went down a bit and it's too dark, because apparently Fraps and FC are not that compatible. The second one was again without any AA, but I used the Print Screen button to take the picture.
First is sunrise, second is sunset.
Would you really notice the edges in that environment?
-
Cold colour and Hot colour.
Cold colour tone = morning
Hot colour tone = evening
I always notice the jagged lines T.T...
My eyes are very sensitive to jagged lines.
Blurriness is normal. Why is it blurry?
Because in the real world, human eyes capture images with that kind of softness.
That's why some artists comment on matte screens versus glossy screens.
A glossy screen is too sharp and not natural anymore.
A matte screen is not too sharp, and natural. -
Now I can tell the difference between AA and no AA, but again it's really hard to notice while playing.
I doubt I would be able to tell the difference without actively trying to see it.
Blurriness is normal in the real world, yeah (usually when you focus on something in front, the background blurs), but in games it's a bit annoying, since you can't control what you are focusing on, so at some point the part of the screen you actually try to focus on is blurred. I tried to take a few screenshots to show you what I mean, but for some odd reason I can't really see the blurriness that much in the screenshots, while in game, switching between those resolutions, you can immediately tell the difference. It probably has something to do with the screenshots being compressed and losing some quality.
But yeah, at the end of the day it comes down to user preference: some might prefer the blurriness, which smooths the edges, and others might prefer the rougher edges but sharper image/textures. -
Yeah I agree, blurriness is definitely normal in real-world conditions.
That's why I love Crysis even though it runs at about 1/3 the fps of most FPS games. Amazing effects they were able to pull off there using per-object motion blur and depth of field.
BTW, a good example, I think, of high res vs low res w/ AA would be Crysis: it scales/looks better in high res w/o AA compared to low res. Textures look a lot sharper and stuff. Great game.
<-- Crysis addict
This is my first post, I'm nervous :S
Gateway P-Series FX (Santa Rosa - Crestline)
Intel T9500 Core2Duo 2.60Ghz 6mb L2 (Penryn @ 1.02v)
4GB Dual-Channel Kingston HyperX (4-4-4-12) RAM
Nvidia Geforce(m) 8800 GTS OC 512mb @ 600/825/1350
820GB Seagate Momentus 5400rpm (500GB+320GB)
IC 7 Carat Nano-Diamond Thermal Grease’d Heat Sink -
Actually, I think that glossy screens look more lifelike. Screens can never be as sharp as real life, and about what you're saying with our eyes blurring things... it's not like we look at the screen with something other than our eyes, so why not just let our eyes do it for us?
-
higher res, no AA.
-
I think another thing that supports this is the native resolution of the panel itself; this can impact the quality too.
-
What? That doesn't even make sense. How does a matte screen make things out of focus? A matte screen is no more or less sharp than a glossy screen at the same resolution.
-
I *PREFER* native resolution with no AA. But I'm fine with lower resolution (1280x800) with AA. It seems much better these days.
Also, regarding matte screens, I greatly prefer matte. Glossy is just annoying. A matte screen doesn't look any less sharp; most of the complaints are about the colour saturation. But unless you're into photo, video, or art development, who cares? -
Because I am in art development. XD!
But I am using a glossy screen T.T -
I used to have a 17 inch monitor that was crap at non-native resolutions, unlike my last 2 laptops. One day I realized that I could run FEAR at 1280x1024 if I turned anti-aliasing off (this was before I really started to understand graphics). 1280x1024 with no AA looked much better than 800x600 with 8xAA. On a monitor that doesn't scale non-native resolutions well, higher res for sure.
On my current 1366x768 I will play some games at 1280x720 in order to save a little power for higher settings or a little AA. On a proper screen 720p should look pretty good, especially on a TV that can do actual upscaling; it's what you get with a game console anyhow. -
Definitely prefer Native res over lower res w/AA!
I've owned several laptops and many LCDs, and every time it looks much better at native resolution.
I've found this to be true whether matte or glossy finish. -
Native resolution and no AA all the way. Though this is kind of a moot point for me and my 1280x800 monitor.
The real question is: Native resolution and medium/lower settings, or lower resolution and high settings? -
The only issue is AA, since anti-aliasing at 1920x1200 pretty much kills the fps completely -
I've never found AA to be an improvement.
I use powerful graphics cards and a high res screen and play at native resolution.
IMO, AA is way overrated and its performance-diminishing properties far outweigh any benefits.
Looking at the poll, it seems most people prefer high res without AA over lower res with AA. -
It is only user preference. You might just not notice the jagged lines, I guess (some people's eyes are not sensitive to them).
I still prefer AA XD! -
It's not that we don't notice the jaggies, but rather that we prefer slight jaggies to the fuzzy, out-of-focus look that AA/lower res and non-native res give.
-
Assassin's Creed I had to play with AA; there was no option, the game looks terrible without it.
In other games you can get away with no AA. Not every engine is as smooth as the others. -
I do not prefer low resolution with AA. I do not prefer high resolution without AA.
I prefer high resolution with AA!
However, my computer sucks and can't support high resolution with AA. I can only choose either low res+AA or high res+no AA. -
Yea, most racing games are OK with AA since fps still stays pretty high.
But in games like Crysis/Far Cry 2/CoD4 (due to smoke nades) it's pretty much impossible to aim effectively at max resolution with AA, since fps drops pretty low during combat.
The best thing would be 1920x1200 with 2xAA, I'd say, but not many notebooks can handle that with new games (yet). So for now I guess it's max resolution without AA till the 4850s are ready for order.
Another thing, I'm not sure if it's an ATI "problem", but I've found a few bugs that AA causes. For example, smoke in CoD4 goes from 125 fps (capped) down to 35, and in WoW there's a place near Naxx (for those that actually play the game) where with 4xAA the fps drops from 100 to 10. I had a few issues in Far Cry as well which I can't recall just yet. All of those problems disappeared when I dropped the AA.
-
Notebook GPUs are consistently worse at handling particle effects than desktop GPUs, so AA + smoke isn't as big a hit on a comparable desktop GPU as on a notebook model.
-
That explains why no one had the same issue when running into smoke.
No one had a notebook.
-
On desktop GPUs,
a 9600GT and above, such as a 9800GTX, GTX 280 SLI and so on, can support 4x-16x AA for those games (CoD4/5, WoW, Left 4 Dead, GRID and so on).
My friend plays CoD4 at high resolution with 4x AA on his desktop. The gameplay was awesome and the overall graphics were nice. No visible jagged lines. High detail/pixel resolution plus smooth edges from AA is the best. -
I'd prefer a higher res than lower + AA/AF.
Although I have been able to turn on 2x AA & AF in TF2 with my overclock. -
I played Call of Duty 4 in low res. Not knowing much about games, I never went into the settings until I had finished the whole game. At the end I decided to look at the settings and saw that the res was only set to 800x600. Wow, the game looked crappy, so I decided to increase it to 1280x800. WOW, what a difference, everyone doesn't look fat (lol), they were all sandwiched in before. Wow, was I missing out. My framerate did go down, but it didn't bother me much though...
Now that I know more about computers and games I love playing around with the settings...
-
Because 800x600 is a 4:3 aspect ratio (the old NTSC/PAL shape).
Your laptop screen is probably 16:9 or 16:10, known as widescreen. In order to fit 4:3 onto 16:9/16:10, the image gets stretched, which causes the "fatness" and blurriness.
If you try 800x600 in windowed mode, you can see the difference.
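A quick back-of-the-envelope sketch in plain Python of how uneven that stretch is (the resolutions are just example numbers, assuming a 1280x800 16:10 panel):

```python
# Toy sketch: per-axis scale factors when a 4:3 image is stretched
# full-screen onto a 16:10 widescreen panel (hypothetical resolutions).

def stretch_factors(src_w, src_h, dst_w, dst_h):
    """Per-axis scale factors when the image is stretched to fill the panel."""
    return dst_w / src_w, dst_h / src_h

sx, sy = stretch_factors(800, 600, 1280, 800)       # 4:3 source, 16:10 panel
print(f"horizontal x{sx:.2f}, vertical x{sy:.2f}")  # x1.60 vs x1.33
print(f"horizontal vs vertical stretch: {sx / sy:.2f}")  # ~1.20
# Everything ends up ~20% wider than it should be (the "fatness"),
# and the non-integer resampling adds the blurriness on top.
```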
I used to play COD4 using 800x600 windowed mode with 4x AA. -
I noticed most of you are talking about edge aliasing.
I most definitely prefer AA, but not so much for the edge aliasing as for the texture aliasing.
So I would not think of going down in resolution, except perhaps only one notch (not half resolution or something like mentioned here).
Also, as far as I know, to remove texture aliasing you cannot use MSAA, because it handles only edge aliasing and skips alphas and the other textures mentioned earlier in the scene. MSAA's advantage is that it only hits parts of the scene, so there's not so much decrease in performance, whereas if AA hits the whole scene it's a MAJOR resource hog. I am not even sure if any current AA method still does this (I stopped following all the AA methods when it just got too dang confusing!).
But if you have the hardware to turn it on and run it, then in my opinion it gives a massive improvement in the texture aliasing. You also don't need 4x or 8x; only 2x is enough for this effect to work!
Oh, by the way, in case you are wondering: you only notice the texture aliasing when you are moving around and the textures 'shimmer', so a screen capture does not show it, you need a movie or something. I feel like my eyes are drawn to a flame when that happens, and it really distracts me! See the little sketch below for the idea.
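Since a screen capture can't show the shimmer, here is a toy numeric illustration in Python of why it happens and why averaging sub-samples per pixel calms it down. This is roughly the supersampling idea, not how any real GPU does it; the stripe pattern, the pixel footprint and all the numbers are made up:

```python
import math

# Toy 1D "texture": stripes much finer than one pixel footprint.
def texture(u):
    return 1.0 if math.sin(200.0 * u) > 0 else 0.0

def pixel_value(u, samples, footprint=0.1):
    """Average `samples` sub-samples spread across one pixel's footprint."""
    return sum(texture(u + (i + 0.5) * footprint / samples)
               for i in range(samples)) / samples

# "Move the camera": shift the sample position slightly each frame and see
# how much one pixel's value jumps around -- that jumping is the shimmer.
for samples in (1, 4, 16):
    vals = [pixel_value(0.37 + frame * 0.003, samples) for frame in range(30)]
    print(f"{samples:2d} sample(s): min={min(vals):.2f} max={max(vals):.2f}")
# With 1 sample the pixel flickers between 0 and 1 from frame to frame;
# with 16 sub-samples it stays near the pattern's average (roughly 0.5).
```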
Also, someone mentioned moire patterns... I was under the impression that moire is caused by the various aniso methods and not by AA methods, which are two completely different things. Is that thinking incorrect?
Thanks -
Anisotropic filtering is related to textures.
Anisotropic texture filtering affects the crispness of textures.
Anisotropic filtering is a technique used to improve the quality of textures applied to the surfaces of 3D objects when drawn at a sharp angle. Enabling this option improves image quality at the expense of some performance. You can choose either to let the application determine the anisotropic filtering settings, turn anisotropic filtering completely off, or select from a number of available settings. Higher values yield better image quality while reducing performance.
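To put some arithmetic behind that, here is a rough Python sketch loosely following the approximation in the EXT_texture_filter_anisotropic spec; it is not real driver code, and the `aniso_lod` helper and its derivative inputs are made up for illustration. The idea: look at how many texels the pixel covers along each screen axis, take extra samples along the long axis, and you can then use a sharper mipmap than plain trilinear would:

```python
import math

def aniso_lod(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Rough anisotropic-filtering arithmetic from screen-space texcoord
    derivatives: sample count along the long axis and the resulting mip LOD."""
    px = math.hypot(dudx, dvdx)              # texel footprint along screen x
    py = math.hypot(dudy, dvdy)              # texel footprint along screen y
    p_max, p_min = max(px, py), min(px, py)
    n = min(math.ceil(p_max / max(p_min, 1e-6)), max_aniso)
    lod = math.log2(max(p_max / n, 1e-6))    # sharper mip than isotropic filtering
    return n, lod

# A floor seen nearly edge-on: the pixel covers 1 texel across but 12 texels deep.
n, lod = aniso_lod(1.0, 0.0, 0.0, 12.0)
print(n, round(lod, 2))   # 12 samples, LOD 0.0 -> the full-res mip stays usable
# Plain trilinear would have to pick LOD log2(12) ~ 3.58, a much blurrier mip.
```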
Anti-aliasing: in computer graphics, antialiasing is a technique for diminishing jaggies, the stairstep-like lines that should be smooth. Jaggies occur because the screen display doesn't have a high enough resolution to represent a smooth line. Antialiasing reduces the prominence of jaggies by surrounding the stairsteps with intermediate shades of colour. Although this reduces the jagged appearance of the lines, it also makes them fuzzier. Zoomed in, the difference is clear: the performance setting does not use antialiasing, while the quality setting does.
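And a toy illustration of those "intermediate shades" (my own little Python sketch, not from any real renderer): each pixel's shade is the fraction of its sub-samples that fall on one side of a perfectly straight diagonal edge, printed as ASCII art.

```python
def coverage(px, py, samples):
    """Fraction of an evenly spaced samples x samples sub-grid inside the
    half-plane y < x (a perfectly straight diagonal edge)."""
    inside = sum((py + (j + 0.5) / samples) < (px + (i + 0.5) / samples)
                 for i in range(samples) for j in range(samples))
    return inside / (samples * samples)

SHADES = " .:-=@"   # denser character = more coverage

for samples, label in ((1, "1 sample/pixel (jaggies):"), (4, "16 sub-samples (AA):")):
    print(label)
    for py in range(6):
        print("".join(SHADES[round(coverage(px, py, samples) * 5)]
                      for px in range(6)))
# With 1 sample each pixel is all-or-nothing: a hard staircase.
# With 16 sub-samples the pixels the edge crosses get in-between shades,
# which is exactly the slight fuzziness described above.
```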
The performance setting is best used when the 3D image is animated and the fluid motion of the scene is most important.
The quality setting is best used when highly detailed and realistic 3D objects are the primary concern. -
Thanks DarkSilver.
Yes, aniso affects the crispness or sharpness of a texture as it gets further away along your viewing angle (the view frustum), so aniso is needed to see angled textures more clearly the further away they go (think of the textures on a road that you are walking down).
But this can be seen in a screen capture.
This is not what I am referring to by texture aliasing. Alpha textures are possibly the best example of texture aliasing; these are usually large textures that are prevalent in a scene, like glass windows, fences, trees, bushes, grass, etc.
As you move through the environment these textures will 'shimmer'. Some anti-aliasing techniques actually resolve this issue, but they use 'too many' resources because they handle edges and textures (ALL textures, even when not needed).
Basically, I would be willing to forgo the edge anti-aliasing (save those resources) and spend them on texture anti-aliasing for certain types of textures (e.g. alpha and shadow textures) so it isn't so resource-heavy, then use aniso for the regular textures that are not as 'affected' by texture aliasing, for crispness based on viewing angle.
I have never seen an AA method that does not focus on edges... even though there is clearly a debate on the effectiveness of edge anti-aliasing... while there is no real debate on the texture shimmering issue (in my opinion). So I am just bringing it up as a point of discussion and am curious what others think. -
You can set it to 16x or 8x without slowing down the FPS of games.
Anti-aliasing is totally not the same. A resource hog indeed.
So, anisotropic filtering is very light on resources and has nothing to do with the resource-heavy anti-aliasing. -
AF helps smooth transitions between mipmap levels, so, as said, it smooths the appearance of textures on surfaces oblique to the camera. However, it is not the only source of moire. Moire happens when there is a pattern with a higher frequency than the pixel frequency in the image. You can get moire with a digital camera taking real-world pictures.
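A minimal sketch of that frequency argument in Python (toy numbers, nothing to do with any particular game): a stripe pattern just under the sampling rate aliases down to a false low frequency, which is the banding you then see.

```python
import math

def sample_stripes(freq, sample_rate, n):
    """Point-sample a sine stripe pattern at n evenly spaced positions."""
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]

# A pattern with 9 stripes per unit, sampled at only 10 samples per unit
# (below the Nyquist limit of 2 samples per stripe):
vals = sample_stripes(9.0, 10.0, 10)
print([round(v, 2) for v in vals])
# sin(2*pi*9*i/10) == -sin(2*pi*1*i/10): the 9-cycle pattern shows up as a
# slow 1-cycle wave. On screen, that false low frequency is the moire banding.
```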
As for seeing fuzzy textures with lower res+MSAA, you can see that in the images posted in this thread. Look at the text on the signs; it looks much worse at lower res. -
I'm still confused as to why, when taking screenshots, I can't capture the amount of blurriness I see in game. Same with desktop screenshots: resizing the desktop to, let's say, 1680x1050 blurs the image a bit, including the letters, but if I take a screenshot it doesn't look that blurred (even if I resize the image to 1920x1200 in Photoshop). No idea why.
It would be nice if I could "capture" the amount of blurriness at lower resolution to show what I mean. -
-
A camera to capture the screen blurriness?
That would be a bit pointless.
-
I nominate DarkSilver as official ambassador of AA!
-
Thought I'd semi-hijack this thread rather than start a new one. I'm getting the laptop in my sig. I plan to hook it up to my 32" 720p LCD TV, but was wondering what you guys think about gaming resolutions. Basically, would I be better off playing games on my laptop screen (native: 1600x900), or connecting via HDMI and playing games on my TV (1366x768)? I understand that the higher the resolution the better, but in terms of more demanding games where I may need to turn down some visual settings, would I be better off running the game on my TV, or will I lose a lot of detail playing on the lower res TV?
-
Relax, your laptop can handle almost all games at 1600x900 with high settings more than fine, except for a few extremely demanding ones. I can't think of a reason you should game on your TV, since its resolution is quite a bit lower than your laptop's. Unless you get a TV with a resolution higher than 1600x900, you should stick with your laptop screen; it will just look better than games running at 1366x768, it's as simple as that.
-
Well, you can surely try gaming on your TV if you prefer bigger screens. But I personally feel kind of awkward sitting a few feet away from the TV while playing games via wireless mouse/keyboard; games that can use a controller make sitting on a couch while playing more comfortable, I guess.
I gamed for a while on my laptop screen (1366x768) until one day I bought an HDMI cable and hooked my computer up to my 1080p Sony Bravia TV at home. I tried a few games on it at full resolution (1920x1080) and, god, it made my previous gaming experience on a limited 1366x768 screen look like utter trash. The only problem is, as I mentioned above, using a wireless mouse/keyboard while sitting on a couch doesn't feel quite right, so I gave it up and bought a 1080p external monitor. Since then I almost never game on my laptop screen.