Hi, all,
As the title says, I'm wondering about the performance difference between these two configurations.
For gaming on a 17" laptop, my experience is that 900p and 1080p don't look that different, while extra effects such as AA/AF/post-processing are more noticeable. So I'd really appreciate a 900p screen, which sort of extends the lifespan of the GPU. Unfortunately, it seems almost all high-end gaming laptops nowadays are 1080p, even the 15.6" ones. So far the only 17" 900p configuration I've found is the Toshiba Qosmio with a GTX 560M, which I'm actively considering. Compared with the Sager NP8170, which has a GTX 580M option and a 1080p screen, will the gaming performance (FPS, mostly) be roughly the same?
If it is, then I'd prefer the Toshiba and put the extra few hundred bucks toward an external 1080p monitor for when I need to do programming, web surfing, or video watching, or toward an SSD or some other stuff.
-
You could just run games at 900p...
-
not everybody enjoys playing games at a non-native resolution on an LCD screen
-
Resolution is very dependent upon memory bandwidth. Because laptop GPUs generally have limited bandwidth, people opted for lower resolutions.
This is no longer the case with the GTX 580M and HD 6990M (even the GTX 570M), because they have huge bandwidth, over 100 GB/s, so the performance hit from 900p to 1080p is small, if any.
Likewise, the GTX 560M and HD 5870M take little performance hit from 800p to 900p, but 1080p takes a bigger hit, because they only have around 60 GB/s.
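To put some rough numbers on why the step up to 1080p hurts more than the step up to 900p, here's a quick back-of-envelope pixel-count comparison (illustrative only; real frame cost also depends on shaders, textures, AA, etc.):

```python
# Pixel counts for the resolutions discussed in this thread,
# relative to 1600x900 as the reference. Illustrative only.
resolutions = {
    "1280x800  (800p)":  (1280, 800),
    "1600x900  (900p)":  (1600, 900),
    "1920x1080 (1080p)": (1920, 1080),
}

base = 1600 * 900  # 900p reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x the 900p load")
```

1080p pushes about 44% more pixels per frame than 900p, which is where the extra bandwidth demand comes from.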
The GTX 570M/580M will be better in any game at any resolution, every time, even at a slightly higher res. They just have more graphical power coupled with much more memory bandwidth. -
It's going to look the same whether it's native or non-native. It doesn't make that much of a difference for games.
-
Especially if you use an external monitor with upscaling, even lower res doesn't look as bad. I use 1600x900 for Deus Ex and it runs incredibly fluidly while looking like 1080p on my TV thanks to upscaling.
-
You mean the TV's internal upscaling. Yes, I know about that, and that's how all the current console games do it: rendering at 720p but upscaling to 1080p for play on an HDTV.
But on a laptop monitor it's not as good. I've tried that with recent games such as Alice: Madness Returns and Crysis 2 on my old laptop, which has a GTX 260M and a 1200p screen. At 1200p both games are choppy due to the old GPU; 1680x1050 is better, and only at 1440x900 can I say it's smooth. In those cases, I'd choose to play in a window rather than let it scale up to occupy the whole screen with a blurred image (even with AA). I just can't ignore the difference.
Back to the topic: I see what you mean about the bandwidth limit. But reducing resolution also reduces the pixel count in each polygon being calculated, right? So the computational load should also drop and performance (speed) should improve, right? -
Yes, performance does increase when you reduce resolution due to the somewhat lighter computational load, but overall the main factor for resolution alone, along with textures etc., is memory bandwidth. When you have enough power in your GPU, bandwidth will be the limiting factor when increasing resolution. As for native res, yeah, I also play some games at 1600x900 with black borders (not stretched), or at full native. The difference is very noticeable.
However, the GTX 580M simply has much more computational power than the 560M, so regardless of resolution it will perform better all the time. Even as time goes by, you will only need to reduce effects etc.; the huge memory bandwidth will allow you to keep running at 1920x1080. -
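As a rough rule of thumb for the reasoning above: when a game is limited by pixel throughput (fill rate / memory bandwidth) rather than the CPU, frame rate scales roughly inversely with pixel count. A minimal sketch, with a hypothetical 45 fps baseline, not a measured figure:

```python
# Rough rule of thumb: for a pixel-throughput-limited game, fps scales
# roughly inversely with pixel count. The 45 fps baseline is hypothetical.
def estimated_fps(fps_at_ref, ref_res, target_res):
    ref_pixels = ref_res[0] * ref_res[1]
    target_pixels = target_res[0] * target_res[1]
    return fps_at_ref * ref_pixels / target_pixels

# If a game ran at a hypothetical 45 fps at 1920x1080, dropping to
# 1600x900 would give roughly:
print(round(estimated_fps(45, (1920, 1080), (1600, 900)), 1))  # ~64.8
```

In practice the gain is smaller, since geometry and CPU work don't shrink with resolution, but it shows why 900p is easier on a bandwidth-limited GPU.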
Thanks for the detailed information. I guess a GTX 580M is the better way to go in the long run. Plus, replacing an LCD screen is always cheaper than upgrading a laptop GPU.
gtx560m@900p vs gtx580m@1080p
Discussion in 'Gaming (Software and Graphics Cards)' started by mangos47, Sep 5, 2011.