The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    UPDATED - The Mobile Graphics Card Info Page - Most GPU Qs answered

    Discussion in 'Gaming (Software and Graphics Cards)' started by Charles P. Jefferies, Feb 4, 2006.

  1. hage

    hage Notebook Enthusiast

    Reputations:
    7
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    Actually, hasn't it been confirmed that the 9500M in the XPS 13 is just a 9200M IGP and 9400M IGP working in GeForce Boost mode?
     
  2. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    Notebookcheck is not reliable. Try searching the forums. It is pretty much impossible for the laptop 9600M GT (FX 770M) to beat the desktop 9800 GTX, since the 9600M GT is based on the desktop 9500 GT core. Even if the test is OpenGL-based, the 9800 GTX (desktop) has many more shaders and much higher memory bandwidth.
     
  3. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    Just to clarify, the DDR2 9600 is very much equal to the DDR3 3670. In the XPS 16 review, I can confirm the HDX18's score of 4127 was at 1280x1024. The XPS scored 4855, but presumably at 1280x800 (since it was said every score was at 1280x800).
     
  4. ytrewqxl

    ytrewqxl Notebook Guru

    Reputations:
    0
    Messages:
    71
    Likes Received:
    0
    Trophy Points:
    15
    Notebookcheck is just a collector of information from other websites, including this one, so the reliability depends on the sources. In this particular case their info is correct. You just don't understand. It doesn't matter how many shaders, or how much bandwidth or power, the 9800 GTX has. Nvidia kills the OpenGL and 3D app performance of the GeForce card at the driver level. The GeForce card will also make errors in rendering.
    I don't want to start a Quadro vs. GeForce discussion because there are already millions around the internet; just search and you will find out. Let's assume I'm wrong; then ask yourself why people would pay $600
    for a midrange Quadro if they could get an ultra high-end gaming card for the same price.
    I just wanted to give this guy some advice. If 3D apps are all he wants to run, he'd be better off spending the money on an older midrange Quadro or FireGL, because those are already better than ANY GeForce for that purpose.
     
  5. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Yes, so the NBR review says. I wonder if other reviewers are having this problem?
     
  6. Waveblade

    Waveblade Notebook Deity

    Reputations:
    72
    Messages:
    1,037
    Likes Received:
    0
    Trophy Points:
    55
    Hm, it would be rather interesting to see what the specs are.
     
  7. Cheeseman

    Cheeseman Eats alot of Cheese

    Reputations:
    365
    Messages:
    1,296
    Likes Received:
    1
    Trophy Points:
    56
    No, the GeForce 9500M GS is a slightly improved GeForce 8600M GT DDR2. The GeForce 9200M IGP falls somewhere under the GeForce 8400M G, which was rebranded as the GeForce 9300M GS; unless it's a Dell trick to say a GeForce 9200M + 9400M = GeForce 9500M in performance, which would be false.
     
  8. streather

    streather Notebook Evangelist

    Reputations:
    51
    Messages:
    524
    Likes Received:
    0
    Trophy Points:
    30
    Quick question: which one of these performs the best,

    the ATI HD 3450 or the Nvidia 9400M?


    I'm eyeing up a new system and currently thinking of either a Dell Studio 15 or 17, or an Apple MacBook, and wondering which GPU is better of the two.
     
  9. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,082
    Trophy Points:
    931
    The HD 3200 integrated card is more or less equal in performance to the HD 3450. The 9400M isn't much different performance-wise. As long as the laptop has one of those cards, then I'd decide on other features.
     
  10. Tyo

    Tyo Notebook Deity

    Reputations:
    89
    Messages:
    1,183
    Likes Received:
    1
    Trophy Points:
    56
    Any news on new Nvidia cards, or nothing released yet?
     
  11. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    Well, I got my netbook in (not really a netbook, but it's small enough, and way smaller and lighter than my XPS), and it has an X4500MHD in it. I always saw this card getting trounced around the forums for being bad, but I figured hey, I am only going to install WoW, my word processor for school, Firefox and Winamp, and I have 80-something gigs left over... what the hell, why not try out Steam, right?

    Well, WoW wouldn't install (it ended up being something I was doing, not the GPU), and I saw someone here say that Half-Life 2 was barely playable at low. I downloaded it from Steam and chuckled to myself, because I figured I'd end up running back to my XPS in tears, but when I hit the Video tab, the game preset everything to high. Very odd, I thought, but I went ahead and played at the settings it recommended. To my surprise, it played. And well! It was fluid, and while I didn't download FRAPS, it appeared to be at least 30fps because it wasn't choppy or laggy! I even contemplated a video for the non-believers, but I don't feel I have to prove it to anyone, so I canned the idea.

    Then I tried Left 4 Dead, a game I was SURE wouldn't run. But again, to my surprise, it ran, and pretty well. With all graphics set as low as possible and running in windowed mode, it actually stayed playable. Enough so that I completed the entire Dead Air campaign, hordes and all! The worst was when there was a horde and people were throwing Molotovs; it did have some stuttering, but it was still playable throughout!

    I still wouldn't recommend this card to anyone serious about gaming. It's absolutely no fun worrying about being able to play the next big title at low settings, but I was surprised at how well the card managed to do!
     
  12. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    It's definitely not the worst mobile graphics out there. It stomps the old X3100 and Nvidia's 7150 Go, both with ease.
     
  13. Euquility

    Euquility Notebook Deity

    Reputations:
    198
    Messages:
    1,592
    Likes Received:
    0
    Trophy Points:
    55
    Yes, it definitely is not terrible, but are you sure Left 4 Dead is playable? I'm running it on my desktop with a Q6600 processor and I can't even run the game consistently.

    Half-Life and CS are possible, though, at lower resolutions.
     
  14. JellyGeo

    JellyGeo Notebook Evangelist

    Reputations:
    57
    Messages:
    603
    Likes Received:
    0
    Trophy Points:
    30
    Templesa - Thanks for the info on the x4500MHD - I just ordered a Toshiba U405 with (I think) the same graphics in it. Would you mind telling what system you have? Thanks...
     
  15. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    P8400 2.26GHz C2D w/ 3MB L2 cache, 1066MHz FSB, 4GB of DDR2-800MHz G.Skill RAM, X4500MHD, 120GB 5400RPM HDD. This is all in an MSI 1223 (whitebook). The screen is 12" at 1280x800 resolution. Anything else you'd like to know, I'd be happy to share!


    And yes, I am sure it was playing fine, because I am usually pretty picky about the way things run. Also remember, though, that this was windowed at 640x320 res and the lowest graphical settings. It looks nothing like it would on your quad-core, since there were many unsmoothed jaggies and some detail missing, but it did run well. Putting it into fullscreen mode made it hiccup like crazy. I am even going to try turning up a few of the options and play around with it to see if I can squeeze a tiny bit more out of it.
     
  16. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    His quad core could be playing it on integrated graphics with a Dell desktop, so that doesn't mean much.
     
  17. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    Hehe, in a way I have no idea why this was moved. It wasn't a question at all.
     
  18. johnny89

    johnny89 Notebook Evangelist

    Reputations:
    0
    Messages:
    383
    Likes Received:
    0
    Trophy Points:
    30
    Anyone here have one and know how well it performs? Is it a dedicated card? Could it play Counter-Strike: Source? I'm looking to buy one that's in an ASUS N10J-A1. Thanks.
     
  19. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    Yep, it can play Source, but don't expect it to do well with newly released games.
     
  20. Will_Clay

    Will_Clay Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Hey..
    I didn't see the ATI Mobility Radeon HD 3670 on the chart; where does that compare?
     
  21. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,082
    Trophy Points:
    931
    It is slightly better than the HD 3650. The HD 3670 is a decent card, and it can play all modern games at medium resolution and mostly high settings.
     
  22. Will_Clay

    Will_Clay Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Ok, thanks!
     
  23. jonuslee

    jonuslee Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    9600M GT vs. HD 3670: which one is better in gameplay?
     
  24. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    With the DDR2 9600 it's a tie; the DDR3 9600 is better. Wait for the 4670.
     
  25. Rich.Carpenter

    Rich.Carpenter Cranky Bastage

    Reputations:
    91
    Messages:
    903
    Likes Received:
    0
    Trophy Points:
    30
    Just how big an improvement would a Quadro FX 2700M be over a Mobility Radeon 3650? I'm not talking about trying to jack up the framerate in Crysis. I'm talking about maximizing the graphics quality settings in games like Fallout 3 and Diablo III at WSXGA+.
     
  26. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    The Quadro FX 2700M is based on the GeForce 9700M GTS, so it's a lot better than the HD 3650. I'd expect it to be at least 80% faster.
     
  27. Rich.Carpenter

    Rich.Carpenter Cranky Bastage

    Reputations:
    91
    Messages:
    903
    Likes Received:
    0
    Trophy Points:
    30
    Seriously? 80%?
     
  28. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    9700M GTS and HD3650.

    Some deviation in the results due to different CPUs being used, but it should be close enough.
     
  29. WILLY S

    WILLY S I was saying boo-urns

    Reputations:
    478
    Messages:
    1,784
    Likes Received:
    0
    Trophy Points:
    55
    No-no-no, not 80% :facepalm:

    Up to 50% at a higher resolution, but less at a lower one (talking about playable framerates here).

    You won't be maxing Fallout 3 out at WSXGA+, and Diablo 3 will depend a lot on the processor too, so I doubt that will max out either.
     
  30. HK6

    HK6 Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    I have been looking at lots of laptops with each of these and was wondering which card is better. I have seen that the HD 2600 has 512MB dedicated, while I haven't seen whether the HD 3200 does or not.
     
  31. R4000

    R4000 Notebook Virtuoso

    Reputations:
    736
    Messages:
    2,762
    Likes Received:
    0
    Trophy Points:
    55
    The HD3200 is an IGP, while the HD2600 is a dedicated GPU. The HD2600 scores almost double what the 3200 does in 3DMark06, making it a much better alternative for casual gaming.
     
  32. HK6

    HK6 Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Ok thank you very much.
     
  33. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    Crysis CPU Benchmark, Low 1024x768: 43.33 (DDR2) vs. 74.34 = 71.6%

    Crysis CPU Benchmark, High 1024x768: 10.84 (DDR2) vs. 17.82 = 64.4%

    Doom 3, Ultra 1024x768: 93.1 (DDR3) vs. 122 = 31%

    It depends on what VRAM the HD 3650 has.
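The advantage percentages quoted above are plain relative-speedup arithmetic: how much faster the leading card's framerate is than the trailing one's. A minimal sketch (the helper name `advantage` is just illustrative, not from any benchmark tool):

```python
def advantage(slower_fps, faster_fps):
    """Percent by which the faster card's framerate leads the slower one's."""
    return (faster_fps / slower_fps - 1) * 100

# Figures from the post above:
print(round(advantage(43.33, 74.34), 1))  # 71.6
print(round(advantage(10.84, 17.82), 1))  # 64.4
print(round(advantage(93.1, 122.0), 1))   # 31.0
```

Note this is relative to the slower card; the same two framerates expressed as "percent slower" would give smaller numbers.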
     
  34. baldash

    baldash Notebook Enthusiast

    Reputations:
    132
    Messages:
    33
    Likes Received:
    0
    Trophy Points:
    15
    I have an HD 2600 256MB, although I will be buying a new notebook later this year. I would say if you can't afford something with a newer GPU, the 2600 is good value for money.
    I play mostly shooters. Below are titles/settings:
    COD4/5 - 1024/800, med-high
    BioShock - 1024/800, high
    Gears of War - 1024/800, med
    Crysis/Warhead - 1024/800, med
    Dead Space - 1024/800, high
    FEAR 2 demo - 1024/768?800?, high
     
  35. Rich.Carpenter

    Rich.Carpenter Cranky Bastage

    Reputations:
    91
    Messages:
    903
    Likes Received:
    0
    Trophy Points:
    30
    I've seen a lot of people saying that the data on Notebookcheck can't be trusted. Is that because it's based on anecdotal user-submitted data or something? (I *am* curious what the CPU Benchmark has to do with the GPU comparison, though.)
     
  36. Big Mike

    Big Mike Notebook Deity

    Reputations:
    57
    Messages:
    956
    Likes Received:
    1
    Trophy Points:
    31
    I haven't found any gross inaccuracies in the GPU tables (though they can be influenced by the CPU used, as I'm sure all those tests weren't run with a single CPU), but the CPU table is definitely based on benchmarks that either aren't done well or don't properly utilize the L2 cache, for one thing: you find lots of lower-end CPUs (a T3xxx over a T5xxx or T7xxx at the same MHz, etc.) beating a higher-end CPU with the same clock. On a test that doesn't use much cache that's entirely possible, but real applications that make use of the cache will obviously run faster on the newer CPU with the larger L2. The gap in day-to-day usage, especially with less demanding apps, will be virtually nil from a 2GHz Pentium Dual-Core to a Core 2 Duo at the same clock, but any kind of demanding app that uses L2 will run much faster on the C2D.
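The cache-sensitivity point above can be illustrated with a toy experiment: walk a working set small enough to fit in a large L2 cache, then one that spills well past it, in a randomized order that defeats prefetching. This is a hypothetical sketch, not a real benchmark suite; absolute timings are machine-dependent, and CPython's pointer-chasing mutes the effect compared to compiled code.

```python
import random
import time

def timed_walk(n):
    """Sum n list elements in a shuffled order; returns (sum, seconds)."""
    data = list(range(n))
    order = list(range(n))
    random.Random(0).shuffle(order)   # fixed seed; random order defeats prefetching
    t0 = time.perf_counter()
    total = 0
    for i in order:
        total += data[i]
    dt = time.perf_counter() - t0
    return total, dt

# Small working set (fits comfortably in a multi-MB L2) vs. a large one.
small_n, large_n = 32_768, 1_048_576
small_sum, small_t = timed_walk(small_n)
large_sum, large_t = timed_walk(large_n)
print(f"{small_n} elems: {small_t:.4f}s   {large_n} elems: {large_t:.4f}s")
```

On hardware where the large set misses cache, the per-element time for the large walk rises; a benchmark whose working set never leaves the cache can't see that difference, which is the trap Big Mike describes.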
     
  37. MadHater

    MadHater Notebook Deity

    Reputations:
    229
    Messages:
    1,743
    Likes Received:
    34
    Trophy Points:
    66
    The HD2600 is a way better choice.

    The Radeon HD3200 is in range of the HD2400, which puts it in the low-range GPU class.
     
  38. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    The data is somewhat decent; it just should be taken with a grain of salt. As you said, sometimes they base their information on "on sheet" specs, and their tests are also sometimes influenced by the CPU (e.g. 3DMark scores or some games are influenced by the CPU). They also do not distinguish between the DDR2 and GDDR3 versions of the same GPU, taking the average of both in their charts.

    You can use Notebookcheck, but you have to know how to. In order to do it properly one has to:
    - check the actual laptops Notebookcheck used in their "benchmarking" (they usually state them under the games or somewhere)
    - check the versions of the GPUs/CPUs used
    - check what tests are being done and take that into account

    The table in general isn't that badly made. It's a good rough outline (as in, the general list from good to bad is correct), but I wouldn't go into detail and say "oh, this one is ranked 4 spots higher on Notebookcheck, so it must be loads better". Notebookcheck is mostly used to check the specs of the GPU, which are in general accurate (e.g. core clocks and shaders). THOSE specs are used to compare GPUs with each other.
     
  39. aceZsta

    aceZsta Notebook Geek

    Reputations:
    0
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    Got an M1530 with an 8600M GT DDR3. While playing Crysis Warhead the GPU temp reached 84°C and the CPU temp gets to 80°C. Every 2 minutes the friggin' game downclocks to like 3-4 fps for like 40 seconds before going back to a smooth 25-30 fps...

    Is it supposed to be doing this? Shouldn't a laptop GPU be able to push 95°C without it being too dangerous? I'm currently using the 179.28 official beta drivers from Nvidia.

    Part of it is that the desk in my dorm room is pretty crowded... the desk at my house is not, and I've noticed lower idle/gaming temps there. Would something like a USB cooling pad help a lot with the random downclocks?
     
  40. runee1000

    runee1000 Notebook Consultant

    Reputations:
    0
    Messages:
    176
    Likes Received:
    1
    Trophy Points:
    31
    Question 1: How large a difference is there between the gaming/overall performance of each card? Is one significantly better than the other?

    Question 2: Which card will play games such as Crysis, GRID, and Dead Space better, and how much better will it play them?

    Question 3: Which card do you recommend overall? Does one have advantages over the other?

    Question 4: How big a difference in performance would there be if the Nvidia laptop were running a P9500 (2.53GHz) instead of a P8700 (2.53GHz)? Would the answers to the above questions change?

    Thanks

    *BTW, I know that there are similar posts to this one, but I want to be 100% sure.
    *Both cards have 256MB VRAM.
    *The Nvidia will run on a laptop with 3GB DDR3 RAM, whereas the ATI will run on a laptop with 4GB DDR2 RAM.
    *The Nvidia will run on a P8700 2.53GHz laptop, whereas the ATI will run on a T9600 2.80GHz laptop.
    *Both laptops have 7200RPM drives.
     
  41. WILLY S

    WILLY S I was saying boo-urns

    Reputations:
    478
    Messages:
    1,784
    Likes Received:
    0
    Trophy Points:
    55
    Elevate the back of the laptop and make sure the fan has plenty of breathing space. A laptop cooler would probably help a bit, so would undervolting the cpu.
    Cooling central
     
  42. Sokonomi

    Sokonomi Notebook Guru

    Reputations:
    0
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    *Bump* :eek:

    I'm considering overclocking my 9200M GS a bit to get a tiny bit more power out of it,
    but where's the limit temperature-wise? :p
     
  43. wahad

    wahad Notebook Consultant

    Reputations:
    28
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    I've tried Dead Space on a 9300M GS with a dc3200 and 2 gigs of RAM...
    1024x600 med/high. It can run at 1280x800, which is the native resolution for that 15.4" Acer model, but it's a little laggy.

    I tried Left 4 Dead: 1024x600 med, around 30fps. NFS Undercover low/med at 1024x600 runs well. Test Drive Unlimited: 1024x600 with a desirable framerate.

    Say goodbye to Crysis, and maybe GRID, because these graphics cards have only a 64-bit memory bus...
    FIFA 09 runs like crap on high :)

    Q2: I would recommend you go with Nvidia in this case; as far as I know the 3470 is a very, very low-end card....

    http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html

    Question 1: Overall difference for everyday work and stuff like that? None, I think :)

    Q4: Well, the P9500 has 6MB of L2 cache and the P8700 only 3MB :) so figure it out yourself :).
     
  44. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    I believe the 3470 is actually better than the 9300M GS, but not significantly so. Notebookcheck can be a little unreliable. Also, processor makes little difference in most games.

    Video card memory makes little difference as both are 64-bit. Even 128MB will suffice. The difference is if it's DDR2 GPU memory or GDDR3. Laptop memory (DDR2/3 SO-DIMM) makes little impact on performance, 3 and 4GB are both good. From your updated specs, ATI should win by a decent margin.
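The 64-bit bus point comes down to bandwidth arithmetic: peak memory bandwidth is the bus width in bytes times the effective transfer rate, so it scales linearly with bus width at a given memory clock. A minimal sketch with illustrative numbers, not any specific card's spec sheet:

```python
def bandwidth_gbs(bus_bits, mega_transfers):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_bits / 8 * mega_transfers * 1e6 / 1e9

# A 64-bit card is starved next to a 128-bit one at the same memory clock:
print(bandwidth_gbs(64, 800))    # 6.4  GB/s (64-bit bus @ 800 MT/s effective)
print(bandwidth_gbs(128, 800))   # 12.8 GB/s (128-bit bus, same clock)
```

This is also why DDR2 vs. GDDR3 matters more than the amount of VRAM: GDDR3 versions of these cards typically run at much higher effective transfer rates, and the bandwidth difference shows up long before capacity does.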
     
  45. boypogi

    boypogi Man Beast

    Reputations:
    239
    Messages:
    2,037
    Likes Received:
    0
    Trophy Points:
    55
    Below 90°C is good. The absolute limit is 100°C.
     
  46. wahad

    wahad Notebook Consultant

    Reputations:
    28
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    I doubt it, but OK...

    The processor makes a huge impact on performance in RTS games...

    But again, Nvidia has better driver support, which is crucial for me, for example...
     
  47. Sokonomi

    Sokonomi Notebook Guru

    Reputations:
    0
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    Hmm, considering it's barely tickling 70°C at most, it seems I can give her some gas then. ;) How far do you reckon a 9200M GS can safely go, though?
     
  48. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    Well, the engineering max temp a video card can take is 110-120°C, but as long as it's below 90°C, it's good.
     
  49. MasterAlex

    MasterAlex Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Does anyone know how good this card is at CAD applications like solidworks?
     
  50. Manic Penguins

    Manic Penguins [+[ ]=]

    Reputations:
    777
    Messages:
    1,493
    Likes Received:
    0
    Trophy Points:
    55
    The 9200M GS is a good overclocker.
    Are you serious when you say the vent singes your hair???

    Check the link in my sig for a good (not max) OC and the benefits, but remember: two 9200M GS cards can have different limits and 'the danger of overclocking' (oooh, scary xD)
     