The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    9600M GT Performance

    Discussion in 'Gaming (Software and Graphics Cards)' started by Rekxxor, Oct 6, 2008.

  1. sxusteven

    sxusteven Notebook Evangelist

    Reputations:
    33
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30
    [IMG]
    Ram, please run your 3dmark at 1280x1024 if possible
     
  2. I♥RAM

    I♥RAM Notebook Deity

    Reputations:
    233
    Messages:
    1,596
    Likes Received:
    0
    Trophy Points:
    55
    Sorry, I uninstalled my 3DMark06 demo, so I can't take a screenshot to prove the 1280 resolution; not that I knew how to change it anyway. Take my word for it that it was at the default of 1280.

    [IMG]
     
  3. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    Judging from your score, it's highly likely the FX 770M is based on the 9700M GT. nVidia's website also specified it had a 35W TDP, which matches the 9700M GT. The 9600M/9650M has a 23W TDP.
     
  4. I♥RAM

    I♥RAM Notebook Deity

    Reputations:
    233
    Messages:
    1,596
    Likes Received:
    0
    Trophy Points:
    55
    Really? Almost everywhere I read (this forum, NotebookCheck) says it's based on the 9600M core, but what you said seems to make more sense. I read around and it seems the 9700M does indeed get 5000-6000 points at 1280. I guess that's a better comparison to the 770M Quadro.
     
  5. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    They're correct, because the 9600M/9650M/9700M GTs all use the same G96M core. The differences between them are the core and shader clocks, assuming they are all GDDR3 models.
     
  6. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    Run this and compare your screen to mine.
     
  7. I♥RAM

    I♥RAM Notebook Deity

    Reputations:
    233
    Messages:
    1,596
    Likes Received:
    0
    Trophy Points:
    55
    Sure thing:

    [IMG]

    To fill in the missing info:

    Manufacturer NVIDIA
    Series Quadro FX
    Pipelines 32 - unified
    Memory Bus Width 128 Bit
    Memory Type GDDR2 / GDDR3
    Memory Bandwidth 25.6 GB/sec
    Shared Memory no
    DirectX DirectX 10, Shader 4.0
    Current Consumption 35 Watt
    Transistors 314 Million
    Technology 65 nm
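The "Memory Bandwidth 25.6 GB/sec" line in the specs above follows directly from the 128-bit bus and the memory clock. The card's actual memory clock isn't listed in the table, so the 800 MHz GDDR3 clock below is an assumption chosen to match the quoted 25.6 GB/s; a rough sketch of the arithmetic:

```python
# Sketch: derive the "Memory Bandwidth" spec line from bus width and memory
# clock. The 800 MHz memory clock is an assumption (not in the spec table
# above); it is the value that yields the listed 25.6 GB/s on a 128-bit bus.
def memory_bandwidth_gb_s(bus_width_bits: int, mem_clock_mhz: float,
                          ddr_factor: int = 2) -> float:
    """GB/s = (bus width in bytes) * clock * transfers per clock / 1e9."""
    bytes_per_transfer = bus_width_bits / 8          # 128 bit -> 16 bytes
    transfers_per_sec = mem_clock_mhz * 1e6 * ddr_factor  # DDR: 2 per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

print(memory_bandwidth_gb_s(128, 800))  # 25.6
```

The same formula explains why a GDDR2 model of the card would land well below this figure: only the memory clock changes, so bandwidth scales linearly with it.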
     
  8. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    GPU-z doesn't detect it properly, but yeah, it at least belongs in the same family.
     
  9. sxusteven

    sxusteven Notebook Evangelist

    Reputations:
    33
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30
    I figured it out :)
    It's definitely based on the 9600M GT, or at least similar in performance, because I♥RAM ran it at 1280x800 (which is still 1280, as he said). A WXGA+ panel can't use the resolution we ran it at (1280x1024), because WXGA+ is 1440x900. So it's more or less the same thing.
    The GPU-Z output gives us enough information to show it's not based on the 9700M GT, which has higher clocks.
    I got a 58xx/59xx score at 1280x800 (forgot the exact number, and too lazy to spend time running it again).
    What OS are you using to run these tests, btw (Vista with Aero on, Vista with Aero off, or XP)?
    If you are using Aero on Vista, then the 770M is probably a better card (by 100-200 3DMarks) in some area I can't identify. All the specs tell me they're the same :eek:, but that amount of 3DMark difference is usually not caused only by fluctuation.
    Quadro cards always have a higher TDP than normal GeForce cards (not sure why..)
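The resolution point above (a 1440x900 WXGA+ panel cannot run the 1280x1024 benchmark, but 1280x800 fits) can be sketched as a simple bounds check; the panel and resolution values come straight from the posts, the helper function itself is just illustrative:

```python
# Sketch: can a panel display a given benchmark resolution? A resolution fits
# only if both its width and height are within the panel's native dimensions.
def fits(panel: tuple, res: tuple) -> bool:
    panel_w, panel_h = panel
    res_w, res_h = res
    return res_w <= panel_w and res_h <= panel_h

wxga_plus = (1440, 900)                 # the WXGA+ panel from the thread
print(fits(wxga_plus, (1280, 1024)))    # False: 1024 rows exceed 900 lines
print(fits(wxga_plus, (1280, 800)))     # True: hence the 1280x800 run
```

This is why the two scores are only roughly comparable: 1280x800 pushes about 20% fewer pixels than 1280x1024, which by itself inflates a 3DMark score somewhat.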
     