The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    COD4 let-down

    Discussion in 'Gaming (Software and Graphics Cards)' started by Simpler=Better, Oct 25, 2009.

  1. Simpler=Better

    Simpler=Better Notebook Consultant

    Reputations:
    74
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    I bought my notebook for work and occasional light gaming. Just for fun I thought I would try to max out COD4 displaying on my friend's 42" 1080p TV.
    Dell Studio 1737, 2.1GHz/2MB, ATI 3650 512MB
    Fresh XP SP3 install, newest ATI drivers, used Mobility Modder.
    HDMI output only, notebook screen disabled
    COD4 single player settings: max everything, resolution at 1920x1080 @ 60Hz

    The game stuttered and had too much lag to play; I'm estimating somewhere in the 10-20fps range.

    I turned everything down to medium (and left the resolution at 1920x1080) and the game was playable (and still looked great), but I was a little let down. I thought a 512MB video card could handle a 1+ year old game on high.
     
  2. scadsfkasfddsk

    scadsfkasfddsk Notebook Evangelist

    Reputations:
    103
    Messages:
    653
    Likes Received:
    0
    Trophy Points:
    30
    It is not the 512MB of memory on the graphics card that is the problem. The problem is that the graphics card isn't strong enough. You need a 256-bit card if you want any hope of playing a recent game at such a high resolution. You would probably need SLI/CrossFire cards, or a 3850/4850 or better from ATI, or an 8800 or GTX 260 from NVIDIA, to have much hope of that.

    Although the amount of memory the card has does have the potential to be important, it is only one of a range of factors that influence the capabilities of the card.
     
  3. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Yeah, the 3650 just isn't powerful enough to expect maximum settings at such a high resolution for COD4. I've got an 8600M GT, which is similarly powerful to the 3650, and usually use medium settings at 1280x800. I'd never dream of trying 1920x1200, even with medium. And it wouldn't matter if I had 2 GB of video memory - that's not the limiting factor, even with the 256 MB I have now.

    At 1280x800, I was able to get over 60 FPS on Low, and 40-60 FPS on medium (without shadows), with a card similar to yours. But you were just overestimating the power of your graphics card - memory alone doesn't mean much.
     
  4. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    yeah, well, you just took a midrange *laptop* gpu, set everything to max in CoD4, at 1080p resolution.

    I'm not sure what you were expecting...

    the consoles play CoD4 in 1024x600. my 9600m GT plays it in 1280x720.

    1920x1080 is roughly a factor of two increase in pixel count from 720p, so you would *basically* need a factor of two increase in GPU power to play at the same frame rate. something along the lines of a 256 bit gpu with 64 shaders or more (i don't know the conversion factor from nvidia shaders to ati shaders).
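
    A quick pixel-count check of the resolution claim above (plain arithmetic, not from the original post — the exact 1080p-vs-720p ratio is 2.25, close to the "factor of two" quoted):

    ```python
    # Ratio of total pixels between two resolutions, a rough proxy for
    # how much harder a GPU has to work at the higher one.
    def pixel_ratio(w1, h1, w2, h2):
        """(w1*h1) / (w2*h2): how many times more pixels the first mode has."""
        return (w1 * h1) / (w2 * h2)

    print(pixel_ratio(1920, 1080, 1280, 720))   # 1080p vs 720p  -> 2.25
    print(pixel_ratio(1920, 1080, 1024, 600))   # 1080p vs the console 1024x600 -> 3.375
    ```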
     
  5. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Graphics performance is pretty dependent on resolution, and unless you have a card with a 256-bit memory bus (not because of the bus width itself, but as a rough class-determining factor), chances are your card is not capable of handling very high resolutions in newer games very well. The card in my laptop, a GeForce 9800M GS, has a hard enough time doing CoD4 at 1080p with a stable framerate at all high settings, and it's a 256-bit memory bus card. Typically 1080p for me is for older games. Generally I run newer games like Crysis at 720p to keep the framerate stable, and I'll lower graphics settings just to get a nice high FPS in multiplayer, where it's crucial.

    And Masterchef, a basic conversion for ATi to Nvidia shaders (remember, I said BASIC — it's not very accurate, but it's a decent way to group ATi vs Nvidia) is to multiply the Nvidia number of shaders by 5, or to divide the ATi number of stream processors by 5. That gives you a rough way of comparing shader counts between the two. It's also interesting to note that ATi cards are designed with the stream processors in groups of 5, forming a single 5-wide unit (I'm not sure what the proper nomenclature is), with one stream processor handling the more complex operations and the other 4 operating alongside it. The cards are then built from these groups to create the parts you see with 40/80/120/etc. stream processors. It can get very complicated, but I'll leave it there.

    Sooooooo, not taking into account the widely varying clock speeds of our various graphics cards (though we can directly compare core speeds and get rudimentary TMU and ROP fillrate comparisons): a Radeon 3850 with its 320 stream processors / 5 = 64 shaders = desktop GeForce 9600GT in terms of class, and so on and so forth. But like I said, core, shader, and memory clock speed is always an important consideration, as are bus width and the amount of memory available.

    But let's make some more examples using laptop GPUs:

    Radeon 2400/3200/3450: 40 stream processors / 5 = 8 shaders ------> GeForce 8400M/9200
    Radeon 4330/4530/4570: 80 stream processors / 5 = 16 shaders ------> GeForce 8600M GS/9400M G
    I must make a quick note of the Radeon 43xx/45xx family here: there is no finer example of how much memory and clock speeds affect performance, even across GPUs with similar shader counts. The 4570 is pretty much in a whole other league compared to its less powerful sibling, the 4330, thanks to much higher core and memory speeds made possible by DDR3 or GDDR3 VRAM. For "low end" it's a beast.
    Radeon 2600/3650/3670: 120 stream processors / 5 = 24 shaders ------> basically this family is midway between the 8600M GS and 9600M GT/8600M GT in terms of performance, though the 3670 is roughly equal to the 9600 thanks to its high core speed.
    Radeon 3850/3870/4650/4670: 320 stream processors / 5 = 64 shaders ------> GeForce 9800M GS/GT 150M/GT 160M
    I must note things here get very blurry and convoluted, since the Radeon 4650 and 4670 have 128-bit memory interfaces, but the 4670 compares pretty decently to the 9800M GS in game performance, though 3DMark06 says otherwise. The 3850 and 3870 are very comparable in this case, though, since they have 256-bit memory interfaces.
    Radeon 4830/4860: 640 stream processors / 5 = 128 shaders -----> GeForce GTX 280M
    In this case, the Radeon 4870 with 800 stream processors (800 / 5 = 160) is closer in performance to the GTX 280M than the 4830/4860, which are BTW very rare GPUs.

    So there you go: some rough equivalences and comparisons between performance classes, but like I said, it's more convoluted than that. This is an admittedly simple comparison method, more suited to low- and mid-range mobile graphics.
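
    The divide-by-five rule above can be sketched as a tiny helper (illustrative only; as the post itself warns, it ignores clock speeds, bus width, and architectural differences):

    ```python
    # Rough ATi <-> Nvidia shader-count conversion from the post above.
    # ATi counts individual stream processors, grouped 5-wide per unit,
    # so dividing by 5 gives a very rough Nvidia-shader equivalent.
    def ati_to_nvidia_shaders(stream_processors):
        return stream_processors // 5

    def nvidia_to_ati_shaders(shaders):
        return shaders * 5

    # The examples from the list above:
    for sp in (40, 80, 120, 320, 640):
        print(f"{sp} ATi SPs ~ {ati_to_nvidia_shaders(sp)} Nvidia shaders")
    ```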
     
  6. naticus

    naticus Notebook Deity

    Reputations:
    630
    Messages:
    1,767
    Likes Received:
    0
    Trophy Points:
    55
    Very nice post
     
  7. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    yeah. order-of-magnitude-ish calculations are fine for me.

    multiply by 5.

    you want something like 64 nvidia / 320 ati shader processors, and a 256 bit bus, in order to run Call of Duty 4 in 1080p. thats a rough estimation, but that is the *type* of card you will need. in more tangible terms, something like an 8800m or better, or an ati 3800 or better, like mobius so eloquently said.
     
  8. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    And even still, my 9800M GS struggles to keep a high framerate at 1920 x 1080 at all highest settings w/o AA. CoD4 is weird; it seems to be pretty unoptimized. Even at 1280 x 720, the framerate likes to dip around. I'm beginning to think that CoD4 doesn't do the typical culling of off-screen polygons like most game engines do, because honestly, CoD4 looks like crap compared to other games these days, and low resolutions do not help.

    Though I must note that the 8800GTS 320 MB I had in my previous desktop rendered the game quite easily at a stable 60 FPS @ 1440 x 900, max settings, with 4x AA on top of that. I'd really like to try out CoD4 on my current desktop with a Radeon 4670, but an incompatibility issue with the sound hardware (I would need a dedicated sound card to get around it) prevents me from doing so. But at least I can say how much better the Radeon 4670 handles higher resolutions, as well as added AA, than the 9800M GS, thanks to the sheer increase in clock speed (530 MHz stock on the 9800M GS vs. 750 MHz stock on the 4670) and ATi's excellent ROP design. I know this from playing Crysis/Crysis Wars/Crysis Warhead in DX9 mode; I'm on XP, so I don't know about DX10 mode, though I'd assume a bit of improvement. If I had Vista, I'd also like to see what kind of framerate improvement I could get in STALKER: Clear Sky with DX10.1.
     
  9. Simpler=Better

    Simpler=Better Notebook Consultant

    Reputations:
    74
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    mobius1-That makes everything beyond clear, thanks for taking the time to run the numbers

    Haha I was definitely overestimating. I wouldn't call it "midrange", it is a stock Dell after all. Maybe "slightly-better-than-onboard" :p


    Regardless, with everything at medium/low running at 1080p it was still quite good looking :D
     
  10. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    My desktop hovers between 40-80 FPS, maxed, which isn't all that high exactly; I wouldn't expect any mid-range card to be even close to that.

    As the others have said it simply does not have the memory bandwidth to push high resolutions in semi-new games.
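
    The memory-bandwidth point can be put in concrete terms with a quick sketch. The formula (bus width in bytes x effective memory clock) is the standard peak-bandwidth calculation; the clock figure used below is illustrative, not the exact spec of any card in this thread:

    ```python
    # Peak theoretical memory bandwidth in GB/s:
    #   (bus width / 8 bits-per-byte) bytes * effective memory clock in Hz.
    # Doubling the bus width doubles bandwidth at the same memory clock,
    # which is why 256-bit cards cope so much better at high resolutions.
    def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
        return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

    print(bandwidth_gb_s(128, 1600))  # 128-bit bus, 1600 MHz effective -> 25.6 GB/s
    print(bandwidth_gb_s(256, 1600))  # same memory on a 256-bit bus    -> 51.2 GB/s
    ```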
     
  11. MrFong

    MrFong Notebook Evangelist

    Reputations:
    57
    Messages:
    654
    Likes Received:
    0
    Trophy Points:
    30
    COD 4, not well-optimised? I don't know, it seems like the coders did a pretty good job, at least for the time. I mean, it's a far cry from the mess that Crysis was.
     
  12. Mechanized Menace

    Mechanized Menace Lost in the MYST

    Reputations:
    1,370
    Messages:
    3,110
    Likes Received:
    63
    Trophy Points:
    116
    i have no problems running cod4 maxed
     
  13. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Well, Crysis is way above CoD4 in terms of visuals. IW could only go so far with the visuals to guarantee 60 fps on the console versions, and the PC version reflects the console version's visuals directly, except for display resolution, texture filtering, and the ability to go over 60 fps. It's just never all that stable — maybe because of the high number of players on the servers I tend to play on? Then it would be a polygon culling issue, like I suspect. If it's not in view or in frame, it really needs to be culled out.
     
  14. Amnesiac

    Amnesiac 404

    Reputations:
    1,312
    Messages:
    3,433
    Likes Received:
    20
    Trophy Points:
    106
    I wouldn't call it "low end". I have one, and it really isn't all that bad. I'd say it's a midranger.

    Unusual, though, because I was playing COD 4 with my Dell E248WFP (1920 x 1200) on medium-high (no AA), and it really performed all right: >30 FPS in most places. Only in heavy action did it drop to something like 23 FPS.

    Have you tried different driver versions?
     
  16. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Especially with these multiplatform games, 512 MB of VRAM is what you should have in general, but the VRAM itself is not the determining factor in how well you'll run a game. Look at VRAM (video memory) as workbench space. You could have the best electric saw in the world, but a small bench is going to hamper your work, since you'll have to keep moving things around to continue. A giant workbench gives you all the room you need (like plenty of video memory), but you could still have a relatively weak saw and cutting blade (the graphics processor) that makes your work slow and arduous. In fact, CoD4 runs pretty well on cards with less than 512 MB of VRAM, like the 8800GTS 320 MB. Even Crysis works very well with it as long as you don't push the resolution too high, but of course the relatively low amount of VRAM becomes a hampering factor at some point, hence the 512 MB of VRAM recommendation.
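
    To put rough numbers on the workbench analogy: a back-of-envelope sketch of how little VRAM the bare framebuffers need at the resolutions discussed here. The buffer count and bytes-per-pixel are simplifying assumptions; textures and geometry are what actually fill the "bench":

    ```python
    # Approximate VRAM used by just the framebuffers at a given resolution:
    # 32-bit color (4 bytes/pixel), front + back + depth buffer (3 buffers).
    # Textures, geometry, and render targets come on top of this.
    def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
        return width * height * bytes_per_pixel * buffers / (1024 * 1024)

    print(round(framebuffer_mb(1280, 720), 1))   # 720p  -> ~10.5 MB
    print(round(framebuffer_mb(1920, 1080), 1))  # 1080p -> ~23.7 MB
    ```

    Even at 1080p the raw buffers are a small slice of 512 MB, which is why sheer VRAM capacity alone says so little about gaming performance.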

    It's funny, though: many weak graphics cards come with buttloads of VRAM. I remember working at Fry's Electronics a couple of years ago and seeing 8400GSs, 8500GTs, and whatnot equipped with 1 GB of VRAM. It didn't help that it was usually low speed DDR2 VRAM. Despite the high load of video memory, they are not very good cards for gaming. The large amount of video memory can be viewed as a marketing gimmick to make people think the card is better, but a large memory pool is genuinely useful for HD video editing and media, which is a target market for low end, high VRAM cards: the emphasis there is a large work space, and high bandwidth isn't as crucial as it is with PC gaming, hence the small memory buses (64-bit) and low speed video memory (DDR2, GDDR2). Low end cards like the Radeon HD 34xx, 43xx, 45xx and GeForce 84xx, 93xx, 205M, 210M, etc. are meant to serve the media and video driving needs of notebooks, something highly valued now, with gaming as a secondary mission. It just happens that GPU designs aimed at gaming also do really well at video decoding and playback. And for all I know, some people who play games might be aiming specifically at low end solutions because that's all they need. BF2, for instance, can pretty much be maxed out at a stable >60 fps on the Radeon HD 4570 @ 1366 x 768. You don't always need to go super high end just to get satisfactory results with specific titles. Some people just want to meet a certain game's needs and still have the rudimentary capability to play future games, even without the eye candy.
     
  17. Amnesiac

    Amnesiac 404

    Reputations:
    1,312
    Messages:
    3,433
    Likes Received:
    20
    Trophy Points:
    106
    You are really making me hate my 8600 GTS 256MB...
     
  18. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    For a lot of games, it's still a perfectly fine amount of VRAM, especially for that card's level of performance.
     
  19. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    To be fair, the DDR2 versions of the 3650 and 8600M GT both have 512MB of RAM, and neither can use all of it in a game, as they are not powerful enough. The DDR3 3650 with half the memory would be around 20% faster, for example.