The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Desktop 8600 GT Vs. Laptop 9600m GT

    Discussion in 'Gaming (Software and Graphics Cards)' started by cbee, Aug 25, 2008.

  1. cbee

    cbee Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    The 8600 GT has DDR2, the 9600M GT has DDR3. Both have 512 MB of RAM. Go!

    Asking 'cause I'm about to buy the dv7 from HP with the 9600M GT in it, and I really need to know the difference between the 8600 GT (my current desktop GPU) and the 9600M GT.
     
  2. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Not much difference. Stock, the desktop 8600 GT has a higher core clock and a lower memory clock than the 9600M GT. The 9600M GT might be a little faster overall, probably not enough to notice.
     
  3. WileyCoyote

    WileyCoyote Notebook Evangelist

    Reputations:
    193
    Messages:
    655
    Likes Received:
    0
    Trophy Points:
    30
    The 9600M GT has a higher memory clock than your 8600 GT, which means it can push shaders, vertices, and polygons at faster speeds. The difference is actually huge: the desktop 8600 GT runs its memory at 475 MHz while the 9600M GT runs at 800 MHz.
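
    For rough numbers: peak memory bandwidth scales directly with the memory clock. A minimal Python sketch of that arithmetic, assuming both cards sit on a 128-bit bus and that the quoted MHz figures are the real (pre-double-data-rate) clocks; neither assumption comes from this thread:

        def bandwidth_gb_s(mem_clock_mhz, bus_width_bits=128):
            # Peak bandwidth = clock * 2 transfers per cycle (double data rate) * bus width in bytes
            return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

        print(bandwidth_gb_s(475))  # quoted desktop 8600 GT clock -> ~15.2 GB/s
        print(bandwidth_gb_s(800))  # quoted 9600M GT clock        -> ~25.6 GB/s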
     
  4. cbee

    cbee Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Higher clocks? Are you guys saying the 9600m GT can actually beat the 8600 GT?
     
  5. link747

    link747 Notebook Consultant

    Reputations:
    9
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    I thought the 9600M GTs in the HPs were GDDR2, not GDDR3.
     
  6. cbee

    cbee Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Maybe they are. I assumed they were DDR3 since it's really more standard nowadays and the 9600M GT is so new. I actually haven't looked into it.
     
  7. link747

    link747 Notebook Consultant

    Reputations:
    9
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    I am pretty sure HP used the GDDR2 version.
     
  8. robvya

    robvya Notebook Evangelist

    Reputations:
    0
    Messages:
    335
    Likes Received:
    0
    Trophy Points:
    30
    But if you are gaming, the 9600M GT DDR3 will do the job for today's games at medium-high. Maybe in 2 or 3 years you'll be playing games at low-medium.
     
  9. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Any way you spin it, the performance is going to be basically the same for the end user.

    If the memory clocks were almost double, that would be a big deal, but apparently they aren't.
     
  10. cbee

    cbee Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    If they did, I am gonna get so pissed when I buy it tomorrow.
     
  11. Cheeseman

    Cheeseman Eats alot of Cheese

    Reputations:
    365
    Messages:
    1,296
    Likes Received:
    1
    Trophy Points:
    56
    That is true, the HP dv7t comes with a GeForce 9600M GT DDR2. I'm not sure about the performance difference, but the DDR3 version scores around 700 points higher in 3DMark06. Nothing a quick overclock can't fix, but then again the DDR3 can be OC'd even higher.
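
    For a sense of scale on those 700 points: a quick Python sketch, assuming a ballpark total of about 5,000 3DMark06 points for the GDDR3 version (a hypothetical baseline, not a figure from this thread):

        # The baseline score is an assumption for illustration, not a measured result.
        baseline_gddr3 = 5000   # hypothetical 3DMark06 total for the GDDR3 card
        gap = 700               # difference quoted above
        print(f"DDR2 version scores about {gap / baseline_gddr3:.0%} lower")  # ~14%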
     
  12. cbee

    cbee Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Aww, that's just great! Thanks a lot, Cheeseman!

    Welp, I'm gonna stick to Masterchef's words and hope performance really is the same. Basically, as long as it isn't worse I'm pretty much happy, and yeah, it is for gaming. Mainly Crysis; tweaking it correctly gave me Very High-like settings on the 8600 GT at 25 fps.
     
  13. ryane0840

    ryane0840 Notebook Consultant

    Reputations:
    16
    Messages:
    259
    Likes Received:
    0
    Trophy Points:
    0
    cbee, GDDR2 vs GDDR3 for that version of the card, don't worry, it makes little difference. The only difference will be when overclocking; otherwise it's only like 1 fps faster in games.
     
  14. ryane0840

    ryane0840 Notebook Consultant

    Reputations:
    16
    Messages:
    259
    Likes Received:
    0
    Trophy Points:
    0
    ok to the ignorant lol...

    GDDR2 memories continue to work at 2.5 V. Since they run at higher clock rates compared to DDR memories, they generate more heat. This is the reason why only a few video cards used GDDR2 memories – only GeForce FX 5700 Ultra and GeForce FX 5800 Ultra used this kind of memory. Shortly after GeForce FX 5700 Ultra was released many video card manufacturers released a GeForce FX 5700 Ultra using GDDR3 memories, maybe to lower the heat and power consumption effects.

    GDDR3 memories can work at 2.0 V (Samsung chips) or at 1.8 V (chips from other manufacturers), solving the heat problem. This is the reason why this kind of memory is used by high-end video cards.

    The main advantage DDR2 has over good ole DDR memory is that it runs on a lower voltage, which lowers the power requirements, and allows it to scale higher with a small latency penalty.

    GDDR3 (Graphics Double Data Rate 3) takes this one step further, requiring less voltage than DDR2 and scaling even further (though with some latency penalty). While the motherboard industry is making the transition from DDR to DDR2 memory, right now GDDR3 is only used on graphics cards. There are no current plans to migrate GDDR3 to the motherboard level - but who knows what the situation will be a year or two down the road. GDDR3 and DDR2 do share one thing in common: they are only packaged in BGA modules - leaving the old TSOP-II to be finally retired.

    Micron were first out of the gate with GDDR3 DRAM modules, but the memory found on the Albatron GeForceFX 5700 Ultra is actually using Samsung K4J55323QF-GC20 GDDR3 DRAM. Compared to the Samsung K4N26323AE-GC22 DDR2 DRAM modules we find on earlier cards, the Samsung GDDR3 memory requires just 2.0V compared to 2.5V for DDR2, which should help ease up on power consumption. However, as GDDR3 RAM runs with an even higher latency than DDR2, nVIDIA have had to clock the memory 50 MHz higher as well. This should offset any performance penalty that GDDR3 has over DDR2.

    http://www.pcstats.com/articleview.cfm?articleid=1591&page=2
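
    The voltage figures in that quote are the crux of the heat argument: dynamic switching power scales roughly with voltage squared at a given clock. A minimal Python sketch of that rule of thumb (P ~ C * V^2 * f), using only the voltages the article quotes:

        # Relative dynamic power at a fixed clock, P ~ C * V^2 * f (rule of thumb,
        # not a figure from the article). Baseline is DDR2 at 2.5 V.
        for label, volts in [("DDR2", 2.5), ("GDDR3 (Samsung)", 2.0), ("GDDR3 (others)", 1.8)]:
            print(f"{label}: {volts} V -> {100 * (volts / 2.5) ** 2:.0f}% of DDR2 switching power")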



    lol, so basically I'm right about the stuff I talk about. The only reason the GDDR3 is clocked higher is because it has more latency, not because it can handle more. It also uses less voltage; that's what GDDR3 is for. It DOES NOT mean it's a better GPU. In fact, you can't tell the difference visually between DDR2 and GDDR3; it was made to run at LOWER voltages.

    So I don't know where you guys get all your weird esoteric info from. Being a GDDR3 owner with 120 stream processors vs 32, I know mine is roughly the same as DDR2; it can just be overclocked about 50 MHz higher, that's it. Say the word if you want 50 more articles educating you on this, or if I should head to the library.

    My point is not to discourage the dude because it's not GDDR3, but to tell him the truth instead of just assuming higher clocks mean more performance.
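
    To put that latency-versus-clock trade-off in concrete terms: latency is quoted in clock cycles, but what matters is nanoseconds, so a part with a worse cycle count can still come out about even if it is clocked higher. A short Python sketch; the CAS figures are purely hypothetical illustrations, and only the 50 MHz clock offset comes from the article quoted above:

        # Access latency in nanoseconds = cycles / clock (MHz) * 1000.
        # CAS values below are hypothetical, chosen only to show the arithmetic.
        def latency_ns(cas_cycles, clock_mhz):
            return cas_cycles / clock_mhz * 1000

        print(latency_ns(5, 450))  # "DDR2"  at 450 MHz, CAS 5 -> ~11.1 ns
        print(latency_ns(6, 500))  # "GDDR3" at 500 MHz, CAS 6 -> ~12.0 ns; the 50 MHz bump nearly offsets the extra cycle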
     
  15. ryane0840

    ryane0840 Notebook Consultant

    Reputations:
    16
    Messages:
    259
    Likes Received:
    0
    Trophy Points:
    0
    lol, and to quote the key point from the article above, for those of you who like to argue without proof or knowledge:

    "nVIDIA have had to clock the memory 50 MHz higher as well. This should offset any performance penalty that GDDR3 has over DDR2."

    So this is why I'm not an ATI or NVIDIA fan....

    MISINFORMATION

    the quality of simply not knowing and leaving it at that
    the quality of not analysing information, forming an uneducated guess (the opposite of a hypothesis), and preaching it as fact
    the quality of bias, and partiality of ownership.

    That's all you get for quality when you ask NVIDIA or ATI fanboys for info.
     
  16. Field Marshal Rommel

    Field Marshal Rommel Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Out of curiosity, how big of a difference will there be between my desktop 8800GT and my 9600M GT?