The fact is that the higher the resolution of the screen, the more pixels the video card needs to feed it. In other words, a native 1680x1050 screen consists of over 1.7M pixels, whereas a native 1280x800 screen is only about 1M pixels. The program 3DMark06 (or any other program/game) may be set to 1280x800, but if it's running in "full screen" mode, isn't it still pushing around the same number of pixels as the native resolution?
Therefore it would stand to reason that, given the same video card, the higher the native resolution of the screen, the lower the video performance. Is this a true statement?
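To put rough numbers on it (using the two resolutions above), here's a quick back-of-the-envelope comparison of the per-frame pixel counts:

```python
# Rough pixel-count comparison for the two resolutions discussed above.
native = (1680, 1050)   # native panel resolution
game   = (1280, 800)    # resolution set in 3DMark06 / the game

native_pixels = native[0] * native[1]   # 1,764,000
game_pixels   = game[0] * game[1]       # 1,024,000

print(f"Native: {native_pixels:,} pixels per frame")
print(f"Game:   {game_pixels:,} pixels per frame")
print(f"Ratio:  {native_pixels / game_pixels:.2f}x")   # ~1.72x more pixels at native
```

So if the card really did have to render at native resolution whenever it runs full screen, it would be pushing roughly 72% more pixels per frame, which is the crux of the question.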
-
Also, I think the wider the memory bus of a graphics card, the better it can handle games at higher resolutions. -
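The usual reasoning there is memory bandwidth: more pixels per frame means more reads and writes per frame, and bandwidth scales directly with bus width. A minimal sketch (the 1400 MT/s figure is just an illustrative GDDR3 transfer rate, not the spec of any particular card):

```python
# Approximate memory bandwidth = (bus width in bytes) x (effective transfer rate).
# 1400 MT/s below is an illustrative GDDR3 effective rate, not a measured value.
def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_mt_s * 1e6 / 1e9

for bus in (64, 128, 256):
    print(f"{bus:3d}-bit bus @ 1400 MT/s -> {bandwidth_gb_s(bus, 1400):.1f} GB/s")
# 64-bit -> 11.2 GB/s, 128-bit -> 22.4 GB/s, 256-bit -> 44.8 GB/s
```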
If that's the case, then would running in "windowed" mode result in better performance since the graphics card would then only be pushing out the resolution set in the application/game?
I'm in the process of installing Crysis, so I'll be trying it out both ways to see if there's a difference. It would be nice if I could run everything on "high" settings even if it means I'd have to run it in a smaller window. -
OpenGL can handle windows... -
Actually, the "stretching" of an image at a non-native resolution is handled by the monitor and has nothing to do with the video adapter. This is why some older LCDs didn't even stretch the picture, but instead displayed the image centered with black bars around the sides.
1280x800 on a native 1680x1050 screen is no more strenuous on the video adapter than on a native 1280x800 screen. -
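One way to see this from the application side (a minimal sketch using pygame as an illustrative library, assuming it's installed): the surface the game renders into is the size that was requested, regardless of the panel's native resolution.

```python
# Request a 1280x800 full-screen mode and check the size of the surface the
# application actually draws into. Assumes pygame is installed; how the final
# image reaches a 1680x1050 panel (stretched or letterboxed) is handled by the
# monitor's scaler, outside the game's rendering workload.
import pygame

pygame.init()
screen = pygame.display.set_mode((1280, 800), pygame.FULLSCREEN)

print(screen.get_size())                          # (1280, 800)
print(screen.get_width() * screen.get_height())   # 1024000 pixels per frame

pygame.quit()
```

The same applies to a window of that size, which is why windowed versus full-screen at the same resolution shouldn't differ much in raw pixel workload.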