I've got one of those XPS 1645 laptops with an i7 720, 4GB of RAM, and an ATI Mobility Radeon HD 4670, and I'm running with the PowerPlay option off so that I can play games.
My GPU stays at a consistent 400MHz, and that lets me run things like Arkham Asylum, Empire: Total War, Resident Evil 5, etc. at just about max settings at 1900x1024 (I think), albeit a tad slow; nothing over the top, right around 30fps.
Now if I drop the resolution to 1280x720 (I think that's what it is), I get about double the fps and can add 2x anti-aliasing.
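That roughly lines up with the raw pixel counts. Here's a quick back-of-the-envelope sketch in Python (the 1900x1024 figure is just the number I quoted above, so treat it as approximate):

# Rough sketch: GPU fill/shading load scales roughly with the number of pixels
# drawn each frame, so the fps gain from dropping resolution should track the
# pixel-count ratio (ignoring CPU limits and other bottlenecks).
high_res = (1900, 1024)   # the "max" resolution as quoted above (approximate)
low_res = (1280, 720)     # the lower resolution I'm testing

high_pixels = high_res[0] * high_res[1]   # ~1.95 million pixels
low_pixels = low_res[0] * low_res[1]      # ~0.92 million pixels

print(f"pixel ratio: {high_pixels / low_pixels:.2f}x")  # ~2.11x
# ...which matches seeing roughly double the fps at 1280x720.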
Does anti-aliasing really do all that much, or should I stick with no anti-aliasing and a higher resolution? I can't seem to tell the difference either way, but I'm not really looking at it under a microscope.
-
Well, your screen will always give the most vivid image at its native resolution, which is your max resolution.
That's the number one advantage of gaming (or doing anything else) at native res.
If gaming at a lower res doesn't bother you, then either your screen's scaler upscales quite well or you're just not sensitive enough to see the difference.
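To illustrate the scaling point: when you game below native res, the lower-res frame has to be interpolated up to the panel's native pixel grid, and that interpolation is what softens the image. Here's a rough Pillow sketch of that upscale step (the 1920x1080 native size and the filenames are just placeholders for the example, not taken from the posts above):

# Rough illustration of why a non-native res looks softer: the 1280x720 frame
# gets interpolated up to the panel's native pixel grid before display.
# (Pillow stands in for the panel/GPU scaler here; sizes and paths are placeholders.)
from PIL import Image

native = (1920, 1080)                    # assumed native panel size for this example
frame = Image.open("frame_720p.png")     # a 1280x720 game frame (placeholder file)

# Bilinear upscaling blends neighbouring pixels to fill the native grid, so fine
# detail and edges come out slightly blurred compared with rendering at native res.
upscaled = frame.resize(native, resample=Image.BILINEAR)
upscaled.save("frame_720p_upscaled.png")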
If you can't tell the difference, then stick with the lower res so your games run faster and smoother.
-
Cool, thanks. I think I'm going to try the higher resolution with maybe slightly lower settings and see how it compares, since the screen will give the more vivid image.
-
masterchef341 (The guy from The Notebook)
Just pick whichever one you prefer. Image quality is complicated, and there usually aren't any right answers.