The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Resolution vs. Graphics power

    Discussion in 'Gaming (Software and Graphics Cards)' started by redrazor11, Nov 5, 2008.

  1. redrazor11

    redrazor11 Formerly waterwizard11

    Reputations:
    771
    Messages:
    1,309
    Likes Received:
    0
    Trophy Points:
    55
    At what point does resolution become a bottleneck for graphics card performance?

    I've always heard people say, "Don't even bother with SLI or CrossFire unless you're running at least 1920x1200 or higher."

    If someone is using a monitor with a native resolution of 1280x800, 1280x1024, or 1400x700, what would be the best card without spending more than you need? Obviously, a 4870 is going to be wasted power at something like 1024x768. Thoughts?
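    For a rough sense of scale, here's a quick back-of-the-envelope sketch (illustrative arithmetic only, not benchmark data) comparing how many pixels a card has to push per frame at these resolutions:

    Code:
# Rough per-frame pixel counts: shading work scales roughly with pixel count,
# ignoring AA, geometry load, and CPU limits.
resolutions = {
    "1024x768":  (1024, 768),
    "1280x800":  (1280, 800),
    "1280x1024": (1280, 1024),
    "1680x1050": (1680, 1050),
    "1920x1200": (1920, 1200),
}

baseline = 1920 * 1200  # the resolution usually quoted as "SLI territory"

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels (~{pixels / baseline:.0%} of 1920x1200)")

    At 1280x800 a card is only pushing about 44% of the pixels it would at 1920x1200, which is the intuition behind the "don't bother with SLI below 1920x1200" rule of thumb.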
     
  2. brian.hanna

    brian.hanna Notebook Evangelist

    Reputations:
    44
    Messages:
    410
    Likes Received:
    0
    Trophy Points:
    30
    I don't think resolution can be counted as a bottleneck; it's not that any of the card's work is being "wasted", it's just not being used to its full potential.

    IMO, anything below a 4870 is fine for 1400x700 and below. I mean, sure, the resolution isn't super intensive, but you would at least like it to run at higher graphics settings.

    Better cards do more than just try to negate the loss of performance as resolutions go up.

    So it's really hard to say at what resolution (going down) a card stops being "worth it".

    The main reason people say CrossFire isn't needed below 1920x1200 is that CrossFire/SLI setups will usually have over 1GB of memory, which is really overkill for most games. That's not to say you won't see an improvement; I mean, look at Crysis: at 1680x1050 even a 4870 can't run everything maxed out, but a CrossFire setup would probably hit around 39 fps average.
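    To put a rough number on the "over 1GB is overkill" point, here's a simplified sketch (my own estimate, and it only counts render targets; textures, shadow maps, and other assets take the rest of VRAM and usually dominate) of what the frame buffers themselves cost at 1680x1050:

    Code:
# Very rough render-target memory estimate. Assumes an RGBA8 color buffer
# (4 bytes/pixel) and a 24/8 depth-stencil buffer (4 bytes/pixel); with MSAA
# those two are multiplied by the sample count, plus one resolved back buffer.
def render_target_mb(width, height, msaa_samples=1):
    pixels = width * height
    color = pixels * 4 * msaa_samples
    depth = pixels * 4 * msaa_samples
    resolved = pixels * 4
    return (color + depth + resolved) / (1024 ** 2)

for samples in (1, 4, 8):
    print(f"1680x1050 @ {samples}x MSAA: ~{render_target_mb(1680, 1050, samples):.0f} MB of render targets")

    Even at 8x MSAA that's only a bit over 100 MB of render targets, so at these resolutions most of a 1GB frame buffer sits idle unless a game loads unusually large textures.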
     
  3. Brickmen

    Brickmen Notebook Consultant

    Reputations:
    0
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    It does depend absolutely on the game and settings, but if a card (or cards) gives over 60 FPS on top settings in a particular game at the desired resolution, it does seem pointless to get an upgraded or multi-card setup. However, there may be games in the future that need it (e.g. Crysis).

    Also, for higher-resolution gameplay SLI and CrossFire are preferred, since you get the extra memory in the cards for the additional pixels, assuming the cards can actually use it.
     
  4. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    In my small opinion, at those resolutions a graphics card can still be the bottleneck. The reason I say that is that while you're not gaming at a higher resolution, you can crank up things like AA and AF and still slow your card down. You could also say that playing at those resolutions and picking a higher-performance graphics card means it will last you quite a bit longer. A good example would be the original 8800 GTX. The card is almost two years old now and it's still playing games at good resolutions and high fps. The card could easily last another year or two if someone was willing to drop the resolution down.
    If it were me, and I was looking for an investment, I would sit down and think about when I next want to upgrade, and then make my decision based on how long I can wait. The cards that really show almost no benefit at lower resolutions are the X2 or CrossFire cards. For NVIDIA, anything below 1440x900 shows diminishing returns for SLI. I think the GTX 260 would be the best choice and should last you at least a year or two at the resolutions you want to play at.
    I'm actually going to return my 4870 1GB in a day or two and get an EVGA GTX 260. While the performance is pretty much identical, the lack of driver support for the Radeon and the lower average fps compared to NVIDIA have won me over. Not only that, but the 55nm GTX cards are expected in December, meaning I could use the Step-Up program offered by EVGA. That, and I could save $50 on top of the restocking fee by changing to a GTX 260.
     
  5. brian.hanna

    brian.hanna Notebook Evangelist

    Reputations:
    44
    Messages:
    410
    Likes Received:
    0
    Trophy Points:
    30
    One thing I love about CrossFire, though, is how freaking well it scales at higher resolutions; a 4870X2 won't even blink at 1920x1200 and up.

    Of course, at those graphics levels, it's usually the CPU that's the bottleneck.
     
  6. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    Fixed. The game has to support it to get that performance, and Radeon usually gets there eventually... their driver support is just dismal compared to NVIDIA's.
     
  7. spradhan01

    spradhan01 Notebook Virtuoso

    Reputations:
    1,392
    Messages:
    3,599
    Likes Received:
    5
    Trophy Points:
    106
    I think if you raise the resolution then lower the graphics settings, or vice versa, unless you have a powerful system.
     
  8. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    I agree with you bwhxeon, the original 8800GTX still kicks butt in games and performance. I have one and have had it for over 2 years now. I only have a max resolution of 1440x900 on my Samsung SyncMaster TFT for my desktop, but I have no trouble maxing out any game; even Crysis runs well at Very High DX10 at 1440x900.

    The 8800GTX will last me another year. That is one hell of a GPU, the best I have owned so far. I think I will wait for the next NVIDIA iteration above the GTX 280 before I upgrade next time.

    The ATI 4870 seems very nice; however, the driver support is what keeps me away. NVIDIA, on the other hand, puts out drivers like crazy and doesn't need any hotfixes for Far Cry 2 or other games like ATI usually does.

    This is what keeps me away from ATI. I owned an ATI board once, the X800 XT; the bad driver support made me jump over to NVIDIA.

    And now, with the Big Bang II drivers NVIDIA released, my performance is top notch, especially on my SLI notebook with 8800M GTX SLI.
     
  9. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    Respectfully, are you high on life?

    First of all, I haven't found a game outside of Crysis that my 4870 512MB doesn't run perfectly.

    Second, ATI releases WHQL drivers monthly. NVIDIA releases drivers nearly at random.

    Third, ATI has been releasing hotfixes within twenty-four hours for the last couple of major releases.

    Fourth, the drivers for my Radeon have been consistently more stable than the drivers for my old 8800 GTS 640MB ever were. I'm running a 4870 and 3450 with three screens and I have yet to see a driver crash. Compare this to when I was running the 8800 with a 7100GS for the third screen and OpenGL games just plain wouldn't run. No, really, I had to disable the 7100 just to get Doom 3 to load.

    This sort of thing really frustrates me. NVIDIA's driver support in Vista has been awful since G80 debuted, and G80 owners (and owners of derivatives thereof) had to at one point wait six months for a WHQL driver update. You guys may remember NVIDIA drivers being the gold standard back in the XP days, but ATI's drivers have been rock solid for the past year. Their driver team actually hangs out on forums and listens to users, and fan control appeared by popular demand in 8.10.

    No, I don't miss the kind of driver juggling I've had to do with NVIDIA hardware, and I had to turn to a special custom driver for my 8400M GS to get it to run happily. NVIDIA's modern driver support is dismal. Big Bang II doesn't even include anything new outside of PhysX that's worth mentioning, except for dual screen operation in SLI that is still inferior to ATI's year old implementation.

    Oh, and by the way, it seems like the vast majority of benchmark sites don't know how to benchmark the Radeon HD 4800 series. At 4xAA it's going to look a little weak next to the GTXes. The 4800s were designed to run at 8xAA, where the GTX line takes a precipitous performance drop while the 4800s scale quite gracefully. I don't even bother running below 8xAA in games where I use AA.

    And just to get back on topic from hijacking the thread, resolution is generally the biggest burden on GPU hardware, with AA coming in second. At lower resolutions (maxing out at 1440x900), a Radeon HD 4670 or GeForce 9600GSO would probably be the most balanced choice, while a Radeon HD 4850 would be more futureproof and max out anything you threw at it at that low resolution.
     
  10. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    I think resolution has the most to do with how much frame buffer the card has. For one, I know Crysis played at 2560x1600 eats more than 1GB of frame buffer (make sure the card can actually use that memory; there are plenty of 8500GTs with 1GB of video RAM sitting on a 64-128 bit bus). Generally speaking, a card can't make good use of much more memory than about twice its bus width: a 128-bit bus won't use much more than 256MB, though with faster RAM like GDDR3 or GDDR5 that number goes up a bit; a 256-bit bus can almost make use of 1GB of GDDR5 memory. I'm assuming you are talking about desktop cards, so something like a 9800GTX or 4850 can handle up to 1680x1050 no problem, and you really don't "need" 3-way SLI GTX 280s or quad-CrossFire 4870X2s, since in those cases the CPU becomes the bottleneck anyway.
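    To make the bus-width point concrete, here's a small sketch (illustrative arithmetic with rough, generic data rates rather than specs for any particular card) of how bus width and memory type combine into peak bandwidth, which is what actually limits how much of a big frame buffer a GPU can make use of per frame:

    Code:
# Peak memory bandwidth (GB/s) = (bus width in bytes) x (effective data rate).
# GDDR5 runs a much higher effective data rate than GDDR3, which is why a
# 256-bit GDDR5 card can usefully feed far more memory than a 64-bit card,
# no matter how much VRAM is soldered on.
def bandwidth_gb_s(bus_width_bits, effective_rate_mtps):
    return (bus_width_bits / 8) * effective_rate_mtps * 1e6 / 1e9

configs = [
    ("low-end 64-bit GDDR3 @ 1.4 GT/s",     64, 1400),
    ("mainstream 128-bit GDDR3 @ 1.8 GT/s", 128, 1800),
    ("high-end 256-bit GDDR5 @ 3.6 GT/s",   256, 3600),
]

for name, bus, rate in configs:
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")

    The roughly tenfold bandwidth gap between those configurations is why 1GB on a 64-128 bit bus is mostly a marketing number: the GPU simply can't read that much memory fast enough per frame for it to matter.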
     
  11. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    Pulp, you sure went on the defensive for something so insignificant. I've tried Radeon and I'm not happy with it. I will give you credit where it's due, and that's that in Vista Radeon has done better. I won't argue with you about that, but I don't use Vista and never will, since Windows 7 will be in beta in just a few more months, and this time Microsoft is pretty much requiring everyone to provide signed drivers. As for 8x AA being more efficient on the 4870, I don't really buy that one, especially on the 512MB version. Not a single benchmark I've read shows either one of the cards handling 8x AA very well. The 4870 might handle it better, but not by enough that it's always playable. As of this morning I'm going back to NVIDIA, and here is why...
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814127361
    I'm going to get almost $70 back and I'm getting a factory-overclocked GTX 260 for under $200. I'll decide for myself which card is better.