Hey guys, I'm playing the new COD game in native res - 1920x1200. However, due to some performance issues I decided to downgrade to a lower resolution. To my surprise, the picture still looked pretty crisp. Reminded me of the CRT days when any resolution would work just fine.
I'm not a big gamer... has scaling technology caught up on LCD monitors?
-
-
As long as you have 'use ATI/nVidia scaling' selected instead of 'display scaling', playing at non-native resolutions should look fine
(IMO)
-
Yeah, and some games are good at scaling down. LCDs are still technically supposed to look better at their native resolution, but it depends on the monitor, the game and other factors as well. Some contexts will not yield a significant difference while others will yield a very noticeable one.
Oh well, I'm fine with my old CRT screen lol. Once it dies I'll get myself an LCD >.>
-
LCD monitors are the problem... they have a fixed (native) pixel structure vs. CRTs, which had a phosphor-coated screen and no real 'pixels'... in fact, we usually called it 'lines of resolution' because the gun would aim at the top of the screen, fire electrons at the phosphor, then aim down slightly and fire the next line, like a typewriter....
If you run anything at a non-native resolution on an LCD, the graphics adapter has to 'scale' the image... if your screen is 1920x1200 and you're running the game at 1280x800, it has to take the 1280 pixels across and 'magically' turn them into 1920 pixels... it does this by looking at the color and hue of adjacent pixels, 'guessing' what a pixel in between the two would look like, and displaying that...
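That 'guessing' is interpolation. As a rough illustration only (a minimal sketch, not any driver's actual algorithm), here's how linear interpolation stretches one row of pixel values to a wider row by blending adjacent pixels:

```python
def scale_row(row, new_width):
    """Stretch one row of pixel values using linear interpolation.

    Each output pixel falls 'between' two input pixels; its value is a
    weighted blend of those two neighbors -- the 'guess' described above.
    Assumes new_width >= 2 (sketch only, no edge-case handling).
    """
    old_width = len(row)
    out = []
    for i in range(new_width):
        # Position of this output pixel in input-pixel coordinates
        pos = i * (old_width - 1) / (new_width - 1)
        lo = int(pos)
        hi = min(lo + 1, old_width - 1)
        frac = pos - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# Stretching two pixels to three invents a blended pixel in the middle
print(scale_row([0, 10], 3))  # -> [0.0, 5.0, 10.0]
```

Scaling 1280 values this way to 1920 gives you 640 extra pixels per row that never existed in the rendered frame, which is where the softness comes from.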
Two things have happened...
'Scaling' algorithms (that look at the adjacent pixels and determine what the extra, new pixel will be) are much improved from laptops in the late 90's...
Also, you have a lot more physical pixels per inch on your screen (1920x1200), so imperfections in the scaling aren't as easily seen...they're too tiny...dead/stuck pixels are still out there but they're a lot less annoying on a 17" 1920x1200 screen than a 17" 1680x1050 screen....
So goes the theory... -
I wonder why they don't offer a resolution that divides evenly into the native resolution. For example, a 1920x1200 screen should also offer 960x600. Going from 1920x1200 down to 960x600 is exactly a factor of 2 in each dimension, so all the video card needs to do is duplicate each rendered pixel into a 2x2 block of four physical pixels, making a perfect match. No pixel sharing or guesswork is necessary.
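That 'perfect match' idea is integer (nearest-neighbor) scaling. A minimal sketch of the 2x case — purely illustrative, not how any particular GPU implements it — where each source pixel simply becomes a 2x2 block:

```python
def upscale_2x(image):
    """Integer-scale an image (a list of rows of pixel values) by 2.

    Every source pixel is duplicated into a 2x2 block of identical
    pixels, so the result is a pixel-perfect enlargement: no
    interpolation, no invented in-between values.
    """
    out = []
    for row in image:
        doubled = [px for px in row for _ in range(2)]  # widen the row
        out.append(doubled)
        out.append(list(doubled))  # repeat it to double the height
    return out

# A 2x2 image becomes a 4x4 image made of solid 2x2 blocks
print(upscale_2x([[1, 2],
                  [3, 4]]))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

A 960x600 frame blown up this way fills a 1920x1200 panel exactly, with every output pixel an exact copy of a rendered one.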
-
Thank your lucky stars that you're not a videophile and don't mind a slight bit of blurriness. You've got a P-7811x, right? The high DPI (1920x1200 on a 17" screen) might also contribute to it not looking so bad at a non-native res.
-
ViciousXUSMC:
GPU (Software) scaling has looked great for years. It's much better to scale down a game for faster frame rates and higher game details than it is to play at full native res and have to sacrifice some of the game settings or put up with a bad frame rate.
Monitor (Hardware) scaling is give & take. I find it to be great on my LVM-37W3, but it's designed to be a monitor/HDTV and thus it knows part of its job is to scale things like game consoles to fit the screen.
The last time I tried hardware scaling on a pure computer monitor like my Naga IV, it was not so pretty (but it had 1:1 at least).
Scaling quality will be affected greatly by the resolution you choose; you want one that relates well to the native resolution of your screen, or at the very least has the right aspect ratio (1920x1200 is 16:10).
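Whether a lower resolution keeps the panel's aspect ratio is easy to check by reducing width:height to lowest terms with a gcd — a quick sketch:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce width:height to lowest terms, e.g. 1920x1200 -> 8:5 (i.e. 16:10)."""
    g = gcd(width, height)
    return (width // g, height // g)

# 1280x800 and 1440x900 reduce to the same 8:5 (16:10) ratio as a
# 1920x1200 panel, so they fill the screen without letterboxing or
# stretching; 1280x1024 is 5:4 and does not.
print(aspect_ratio(1920, 1200), aspect_ratio(1280, 800), aspect_ratio(1280, 1024))
```

Any resolution that reduces to the same pair as the native res matches the panel's shape.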
I'm not hypercritical of things like this; cutting the resolution by one setting in the same aspect ratio looks almost the same, but performance can really jump up. Once you go two or three settings down, it becomes pretty obvious how "soft" the image gets if you just swapped from the higher res. However, after 5 minutes of playing you won't notice it anymore. If you had stuck with a max res your computer can't handle, you would still notice, and probably hate, the choppy frame rate.
Plus, since your performance went up, if you did not already have the game details & effects maxed, turning them up is more than worth it. And if you had AA or AF off, turning those on now will really make the game look better. If you balance it right, your picture quality will be better and the game will run faster than trying to play at full native res on lower settings.
This is of course if your computer is right on the limit of making it or breaking it. You may need to really cut things down, or you may be just barely short of what you need to max it out at native; in that case, just cutting one or two of the detail settings may work better for you (shadows are a good place to start, and if you have AA higher than 2x, turn it down).
Screen Resolution Question
Discussion in 'Gaming (Software and Graphics Cards)' started by hax0rJimDuggan, Jan 6, 2009.