The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Stock 8600GT GDDR2 vs Stock 8600GT GDDR3

    Discussion in 'Gaming (Software and Graphics Cards)' started by Soviet779, Jul 27, 2007.

  1. Soviet779

    Soviet779 Notebook Consultant

    Reputations:
    43
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    Sorry if you clicked this thinking you would get a comparison, lol. I would like to see some kind of comparison between them. Some people have received their Inspirons and Vostros, and there's bound to be other people with a GDDR3 8600GT in some other laptop, so there should be a stock comparison done: no overclocking or messing with it, just a stock comparison, with clocks listed since they will differ. It's obvious the 8600GT GDDR3 will win, but the question is, by how much?
     
  2. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    In what benchmark? (Say 3DMark06 and I will slap you.)

    I recommend either the Half-Life 2: Lost Coast video stress test or the Counter-Strike: Source video stress test.

    I am quite partial to both of those; the Source engine is very refined on the PC. Newer games will show big, sweeping performance differences with new drivers and patches.

    You understand.
     
  3. Doodles

    Doodles Starving Student

    Reputations:
    178
    Messages:
    880
    Likes Received:
    0
    Trophy Points:
    30
    Haha... yeah, look for real game testing with FPS numbers if you're looking. IMO, even though I'm not an expert, the difference won't be major unless the game really needs all of that dedicated RAM anyway...
     
  4. azntfl

    azntfl Notebook Evangelist

    Reputations:
    27
    Messages:
    313
    Likes Received:
    0
    Trophy Points:
    30
    300 MHz
     
  5. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Actually, this doesn't have anything to do with the amount of RAM.

    This is a RAM speed test, which is pretty important: these cards have fast cores and are bandwidth hungry. I can actually do the test myself, assuming I can get a driver that lets me set the clocks...

    It's just 400 MHz memory vs. 700 MHz, and that's the only difference. The core and shaders are clocked to NVIDIA spec, right?
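    To put that 400 MHz vs. 700 MHz gap in bandwidth terms, here is a rough back-of-the-envelope sketch in Python; the 128-bit memory bus is an assumption based on the commonly quoted 8600M GT spec, not something confirmed in this thread.

        # Rough peak-bandwidth estimate, assuming a 128-bit bus and double-data-rate
        # memory (effective transfer rate = 2x the listed memory clock).
        BUS_WIDTH_BITS = 128

        def bandwidth_gb_s(mem_clock_mhz):
            """Peak theoretical bandwidth in GB/s for a DDR-type memory clock in MHz."""
            effective_mt_s = mem_clock_mhz * 2        # two transfers per clock
            bytes_per_transfer = BUS_WIDTH_BITS / 8   # 128-bit bus -> 16 bytes
            return effective_mt_s * bytes_per_transfer / 1000

        print(f"GDDR2 @ 400 MHz: {bandwidth_gb_s(400):.1f} GB/s")  # ~12.8 GB/s
        print(f"GDDR3 @ 700 MHz: {bandwidth_gb_s(700):.1f} GB/s")  # ~22.4 GB/s

    On that assumption, the GDDR3 card has roughly 75% more memory bandwidth feeding the same core.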
     
  6. Soviet779

    Soviet779 Notebook Consultant

    Reputations:
    43
    Messages:
    263
    Likes Received:
    0
    Trophy Points:
    30
    Any 2004-or-later FPS/RTS. All the comparisons I've seen so far usually involve a bit of overclocking, so I don't know how much a stock GDDR2 card will lag behind a stock GDDR3 card in games.
     
  7. Crimsonman

    Crimsonman Ex NBR member :cry:

    Reputations:
    1,769
    Messages:
    2,650
    Likes Received:
    0
    Trophy Points:
    55
    Sometimes if the processor is different, the in-game performance is different, so make sure everything is identical except the GDDR2 vs. GDDR3 difference.
     
  8. ShadowoftheSun

    ShadowoftheSun Notebook Consultant

    Reputations:
    27
    Messages:
    181
    Likes Received:
    0
    Trophy Points:
    30
    Masterchef - I believe you can use hacked drivers and a registry edit to bring up the classic control panel, which has overclocking built in. This should also let you see what the default clocks are for your chip - I hear the MBPs are underclocked, as are the Sagers. I'm not sure if they are underclocked to the same level, however.

    Default clocks (off nVidia's website) are 475/700. http://www.nvidia.com/object/geforce_8600M.html

    Rumored clocks for the GDDR2 version are 475/400, although these are unconfirmed rumors / hotly contested OEM specifications.
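    For reference, the registry edit usually mentioned for the classic panel is the "Coolbits" tweak. The key path and value in the sketch below are assumptions based on commonly circulated instructions for older ForceWare drivers, not something verified in this thread, and later posts note that newer drivers removed the overclocking page anyway.

        # Sketch of the commonly cited "Coolbits" tweak (assumed key path and value;
        # only relevant to older ForceWare control panels). Requires admin rights.
        import winreg

        KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

        def enable_coolbits(value=3):
            """Write the Coolbits DWORD; 3 is the value usually quoted for clock controls."""
            key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                                     winreg.KEY_SET_VALUE)
            try:
                winreg.SetValueEx(key, "Coolbits", 0, winreg.REG_DWORD, value)
            finally:
                winreg.CloseKey(key)

        if __name__ == "__main__":
            enable_coolbits()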
     
  9. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    PLEASE explain how to do the registry edit to unlock the ability to set custom clocks.

    I am using Windows XP.

    Just link me or anything!

    Thanks.

    Currently I have tried nTune, ATITool, and RivaTuner, and none of them will let me set custom clocks. I click the "set" button in each app, and yet the clocks stubbornly sit at 470/635...
     
  10. ShadowoftheSun

    ShadowoftheSun Notebook Consultant

    Reputations:
    27
    Messages:
    181
    Likes Received:
    0
    Trophy Points:
    30
    This thread describes some people who are having the same problem as you. Unfortunately, it seems the 158 drivers are the only way to overclock. As for the classic control panel, it seems they disabled the overclocking functionality in the newer drivers anyway. Sorry about the misinformation.
     
  11. XmDXtReMeK

    XmDXtReMeK Notebook Consultant

    Reputations:
    114
    Messages:
    215
    Likes Received:
    0
    Trophy Points:
    30
    I was really mad when I saw that both my Experience Index and my 3DMark score were lower than the Asus G1S (GDDR3 8600GT).
    GDDR2 (Inspiron 1520):
    Default:
    Core clock: 475 MHz
    Shader clock: 900 MHz
    Memory clock: 400 MHz
    3DMark benchmark: 2839 (I don't remember exactly; it might have been lower but not higher. It was in the 2k range, and I did the benchmark a long time ago.)
    Windows Experience Index: 4.6
    Today I downloaded these drivers:
    http://www.laptopvideo2go.com/forum/index.php?showtopic=15825
    I overclocked with RivaTuner; my stable results were:
    Core clock: 611 MHz
    Shader clock: 1425 MHz
    Memory clock: 495 MHz
    3DMark benchmark: 4097
    Windows Experience Index: 5.9
    Temps: idle 46 C, load 56 C (the load temperature is from 12 hours' worth of gaming)
    The GDDR2 has amazing overclocking headroom, but that memory clock is the max it will go or it won't be stable. I can bear having a memory clock 205 MHz lower than my friend's; his card is already overheating at a core clock of 475 MHz. Anyway, you won't beat it in 3DMark because of the memory clock the GDDR3 has, but you can get close. I'll post the Asus G1S test results as soon as my friend comes back from uni.

    All I know from my experience using the Asus is that it gets so hot it's not even playable, because the heat starts to irritate your hands. On my Dell I don't even feel the temperature difference.

    Anyway, sorry for my horrible English, it's not my first language. But I thought I'd share my overclocking experience, because I'm sure a lot of 1520 users were disappointed in their benchmarks; I hope this changes a lot now. :D
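    Quick arithmetic on the numbers above (all figures taken from XmDXtReMeK's post, nothing new assumed):

        # Percentage gains from the stock vs. overclocked figures reported above.
        stock = {"core": 475, "shader": 900, "memory": 400, "3dmark": 2839}
        overclocked = {"core": 611, "shader": 1425, "memory": 495, "3dmark": 4097}

        for name in stock:
            gain = (overclocked[name] - stock[name]) / stock[name] * 100
            print(f"{name:>7}: {stock[name]:>5} -> {overclocked[name]:>5}  (+{gain:.0f}%)")
        # core: +29%, shader: +58%, memory: +24%, 3DMark score: +44%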
     
  12. XmDXtReMeK

    XmDXtReMeK Notebook Consultant

    Reputations:
    114
    Messages:
    215
    Likes Received:
    0
    Trophy Points:
    30
    NOTE: I just wanted to let you know these numbers are VERY stable. You can overclock even further if you like, but you will start to get blue screens and random crashes, so I wouldn't suggest higher values than these for the GDDR2 version. I have done a constant 12-hour stress test without any crashes or blue screens.
     
  13. ahl395

    ahl395 Ahlball

    Reputations:
    3,867
    Messages:
    8,218
    Likes Received:
    72
    Trophy Points:
    216
    A big, very noticeable difference - worth the extra money!
     
  14. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I thought the default clocks for the 8600M GT DDR2 were 475 core / 400 memory, not 400/400?
     
  15. XmDXtReMeK

    XmDXtReMeK Notebook Consultant

    Reputations:
    114
    Messages:
    215
    Likes Received:
    0
    Trophy Points:
    30
    Oh, I apologize - that was a typo; I've fixed it.