I thought it would be appropriate to dedicate a thread to the 9800M GTX for Clevo models.
I'm curious about the performance, temps, and stability others are getting with the 9800M GTX in the Clevo D900 and NP5796 series, and any other Clevo models that come along that support it. If you have the most powerful mobile GPU in the world, let us know!
For those of you unfamiliar with the 9800M GTX, it is the latest and most powerful mobile GPU on the market, offered by nVidia. The 9800M GTX is similar to the 8800M GTX and the 9800M GT in speed, but still offers a level of performance unmatched by any other mobile card. The 9800M GTX performs better than the original 8800 GTS 640MB desktop card, which is quite a feat.
Some technical details about the card:
GPU Code Name: G92M-750
Transistors: 754 million
Fab Process Technology: 65 nm
Memory Size: 1 GB
GPU Core Clock: 500 MHz
GPU Shader Clock: 1375 MHz
Unified Shaders: 112
Pixel Fillrate: 3200 MPixel/s
Texel Fillrate: 11200 MTexel/s
Overclocking
The 9800M GTX overclock is limited to shaders only, out of the box. nVidia System Tools allows the shader clock to be overclocked, but not the core or the memory. At the time of this writing, there is no other method of overclocking (outside of setFSB). mvktech.net is currently working on including the 9800M GTX BIOS in the next revision of NiBiToR, which will allow overvolting the card.
PhysX
The stock 176.09 drivers DO NOT allow PhysX to run on the card. Use drivers 177.89 or newer to get optimal PhysX performance. The original PhysX cards used 128MB of GDDR3 along with 12 shader units. If I am correct here, the same resources will be carved out of this card while running PhysX, leaving it with 100 shaders (112 − 12) for graphics processing and 896MB (1024 − 128) of GPU memory.
Stock Synthetic Benchmark Scores:
Test System: Intel P9500 2.53 GHz, 4GB 1066MHz DDR3, Windows Vista x64 SP1, driver rev. 177.98
3DMark06 - Overall 9555 .:|:. SM2 4405 .:|:. SM3 4263 .:|:. CPU 2289 @ 69C
3DMark Vantage (P) - Overall 4998 .:|:. GPU 3980 .:|:. CPU(GPGPU) 21532 @ 68C
Let's open this thread up for general discussion on the 9800M GTX!
-
emike, could we see the Crysis built-in benchmark at 1680x1050, all High, in DX9?
-
Here is mine
Driver: 176.09 (original)
Driver: 177.92
(just noticed this 3Dmark Vantage... thought it was the same as 3DMark06)
Then I noticed you had 177.98 driver >.< so I followed (lol)
Driver 177.98
(uninstalled 3Dmark06 yesterday... so just played with 3Dmark Vantage)
So I noticed the 3DMark Vantage scores from 177.92 to 177.98 are almost identical. The only thing that raised the overall score was the CPU score; the graphics score stayed the same (good thing? or bad thing?). I'm guessing the CPU performed better because the temps were cooler? I really didn't do anything differently... well, I was lying in bed with the laptop when I did the 177.92 run, so it was propped upright... thought that would give better cooling than having it on a desk, which is what I did for the second one, lol. Or maybe it's because of the new driver too =)
Well, I'm content... I'll keep the .98. Time to play with it!
**Specs in Sig**
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Just noticed... my GPU score is lower than yours, emike... should I be worried? It's not by much. You also had a lower CPU score, which should make sense because yours is a 2.5GHz Core 2 Duo, right? Hmmm... did you overclock also? -
Very cool. I would have thought the extra 800MHz from your CPU would have shown more of a difference, but not bad at all!
I don't have Crysis installed anymore. I left the disc at a friend's house and had to do an OSRI, but maybe later! -
exactly!! thats what I was thinking too! >.<
-
Brilliant thread emike! I'm waiting for my Multicom Kunshan M570+ (Clevo M570TU) with the following specifications:
17" Widescreen WSXGA+ 1680x1050
Intel Core 2 Duo P9500 2.53GHz FSB1066 6MB cache 25W
500GB SATA 5400RPM 2.5"
4GB DDR3 SO-DIMM 1066MHz (2x2GB)
NVIDIA® GeForce® 9800M GTX 512MB DDR3
Intel® Turbo Memory 2GB
MICROSOFT Windows Vista Ultimate 64-bit
I thought the 9800M GTX only came in a 1024MB memory configuration, but apparently not. Oh well. Also, I couldn't configure this computer with the 4GB Turbo Memory, but that's not such a big issue for me. Once I have received it I will run the appropriate tests and post my results here. In the meantime, keep posting info to keep me excited until it arrives!
EDIT: The shop mentioned above doesn't know when they will get the 9800M GTX in, but they have the 9800M GT in stock already, so I capitulated and changed my order to a 9800M GT instead, using the difference to upgrade the processor to a T9600 2.8GHz. Well, have fun with the most awesome notebook GPU in the world! -
Thanks! nVidia doesn't make a 512MB model of the GTX. I'm thinking they might have the wrong information on the site. I searched high and low and didn't find a real 512MB model.
-
hmmm don't know if I'm the only one experiencing this... but 177.98 clocks down the video card to 275/300 while playing my CS... not my normal 500/799 >.< going back to .92
-
Still playing CS, eh? What FPS do you get with that? Aren't those the 2D clock speeds? I have no idea why it would drop down to that.
-
lol been playing CS since 1.5... just can't get away from it
Yeah... it's so weird... I'm playing on this server where it's headshots only... so accuracy and 1 bad*** gaming pc/laptop is what is needed.
Lucky for me... I HAVE BOTH! It was easy and a breeze yesterday... then after this update... it's not doing so hot.
I'm not really sure if it is the 2D clock speeds... my in-game settings are 1600x1200 OpenGL, highest, 32-bit, and normal display mode. Didn't touch anything in the nVidia controls except for Vsync... that way I can get my 100 FPS... but even with it off (constant 59-60 fps) it still was downclocking.
Is there a way to keep GPU-Z on top of all windows? Well... doesn't really matter... with it refreshing, it leaves that graph trail, right? And when I hover over it... those were the numbers... pretty much less than half of what it should be >.< Great numbers on the 3DMarks... but why downclock during gaming? -
UGH well... I went back to 177.92 and... same thing!!! 275/301!! It starts off at 500/799 then goes down >.< Any way I can manually keep it on max? Is that with PowerMizer? Then again... shouldn't it do it by itself?
POOR 9800M GTX! Is he tired already?! -
The clocks are usually throttled automatically.
You can watch the clocks go up if you use RivaTuner and monitor temps (and other things like clocks) while in-game.
It's a low-level feature that helps keep the video card cooler and saves power. -
ahh yes! RivaTuner! You have mentioned that to me before... I think I will get it this time... I'm going to go play some COD4 and see if the video card gets clocked down too.
-
Perhaps the drivers do not think it is a 3D game. I don't know why else it would downclock to 2D mode. Normally downclocking drops to idle speeds. Have you tried the stock drivers?
-
*sigh* I can....
is this bad for the vid card? This will be the 3rd time today I've uninstalled and installed a vid driver lol -
Gophn... I just installed RivaTuner...
GPU-Z shows the clocks at 200/100 @ idle... but RivaTuner shows it at 432/601? Should I reset values to match GPU-Z? -
I think GPU-Z may not be accurate yet, since the video card is still so new.
You can also check nVidia's nTune for clock speeds. -
Where can I see this nTune? Something I'll need to download? Or something in the nVidia control panel?
In COD4 atm... it shows 1350 core clock and 799 memory clock...
I'll compare it with nTune and see if I can disregard GPU-Z -
nTune is phased out. Download nVidia System Tools. It has all the monitoring tools you need in it.
-
well... I figure that I'll just work with these numbers... if GPU-Z says 500/799 and RivaTuner says 1350/799... then 500 = 1350, kinda deal. I just want to see if it drops down during CS gameplay... it didn't drop at all during COD4
-
>.<
World of Warcraft
383/301 : GPUZ
828/300 : RivaTuner
CS
275/301 : GPUZ
594/300 : RivaTuner
Okay... I'd like to manually set this then. Any way I can crank it up to full whenever I want to? Safely... can't I just tune it in a way that it hits 500/799?!?!?! I want to unleash the power!
hmmm... should I go back to the original driver too? -
Going back to the original drivers (176.09) might be worth it since they're stable and were made for the video card specifically.
You should install nVidia nTune with them as well for more monitoring, OC, etc. options that will be added to the nVidia Control Panel. -
Is there a difference between nTune and emike's suggestion of the nVidia System Tools?
also, going to the original driver... what will I lose from 177.92 to that 176.09?
Also, nTune.. you said I have an OC capability?
Finally... can't I just somehow engage it to full when I want? using the current driver? -
nTune is part of the System Tools:
http://www.nvidia.com/object/nvidia_system_tools_6.02.html -
It looks like WoW (bleh) is running at the 3D clocks, while CS (bleh) is running at the 2D clocks. I'm wondering if it isn't using the Extra clock mode because those games hardly utilize the card?
-
Actually... good news... WoW seems to be going all out as well, 1350/799. So that's good. I'll do some research on why CS is being downclocked.
So... I read somewhere about this "PowerMizer" feature? Don't seem to have that... have you heard about it?
There has to be like an on/off switch... disabling the "auto" clocking... or something is just up with CS + nVidia -
I don't know much about PowerMizer. I've tweaked it quite a bit but am not quite sure what each setting does.
Search the system registry; you are looking for a value called "PerfLevelSrc".
Change its value data from 3333 to 3322 and click OK.
Find "PowerMizerEnable" and change the value data from 1 to 0.
Find "PowerMizerLevel" and change the value to 1, then find "PowerMizerLevelAC" and change its value to 1.
This should disable PowerMizer.
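If you'd rather not hunt through regedit by hand, here is a rough Python sketch of the same edit. Treat it as an illustration only, nothing official: it assumes the nVidia values live in a numbered subkey of the standard Display adapters class key (the exact location can vary by driver revision), and it assumes the 3333/3322 figures above are the hex DWORDs regedit normally displays. Back up the registry first, run it from an elevated prompt, and restart afterwards, just like with the manual steps.

```python
# Rough sketch only: applies the PowerMizer registry tweak described above.
# Assumptions: the nVidia adapter settings sit in a numbered subkey
# ("0000", "0001", ...) under the standard Display adapters class key, and
# the 3333 -> 3322 change is a hex DWORD as regedit shows it.
# Requires an elevated (admin) Python on Windows; restart afterwards.
import winreg

DISPLAY_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

# Value names and data taken from the post above.
TWEAKS = {
    "PerfLevelSrc": 0x3322,      # was 0x3333
    "PowerMizerEnable": 0,       # was 1
    "PowerMizerLevel": 1,
    "PowerMizerLevelAC": 1,
}

def apply_tweaks():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
        index = 0
        while True:
            try:
                sub = winreg.EnumKey(cls, index)   # "0000", "0001", ...
            except OSError:
                break                              # no more subkeys
            index += 1
            path = DISPLAY_CLASS + "\\" + sub
            try:
                with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
                    # Only patch the subkey that already has PerfLevelSrc,
                    # i.e. the nVidia display adapter entry.
                    winreg.QueryValueEx(key, "PerfLevelSrc")
                    for name, data in TWEAKS.items():
                        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, data)
                    print("patched", path)
            except OSError:
                continue                           # not the right entry, skip

if __name__ == "__main__":
    apply_tweaks()
```
-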
hmm... search the system registry? how do i do that? >.< sorry
is that in nvidia control? -
haha no. Hit Start, type regedit, say OK to the UAC security prompt, then hit the F3 key once regedit is up and search for it. It will dig through the hundreds of thousands of records and find the right one. It's a mess in there, but just stick to what I wrote and you'll be fine.
-
trying it out now! thanks emike!
-
NP. Let me know if it makes a difference for you.
-
hmm ok... now what does it do? hehe
idle still at 200/100 (gpuz) -
Did you restart the computer? I forgot to mention that; registry changes like these need a restart to take effect. Idle should remain the same. Check your game performance after the restart.
-
restarting now! >.< wish me luck!
-
*sigh* no... it did nothing... still the 275/301 (gpuz) deal... >.<
I'll do some research online to see if it's something on CS's end?
I'll report back...
poor 9800M GTX... being held back... -
Well, there is nothing you can do to get CS to take advantage of the power the card has to offer.
When Johan over at mvktech.net releases the NiBiToR version for our cards, you can simply raise the 2D clocks to the 3D speeds, the 3D clocks to the Extra speeds, and the Extra clocks to overclocked speeds! Can't wait for that release.
I hate to ask incredibly stupid questions, but I assume you are plugged in and that Vista's power profile is set to High Performance? -
well then... I guess I'll just wait
I guess I should have paid attention before... maybe it was always like that. It does spike up to 1350/799 (RivaTuner) in the beginning... that's when I feel like I'm hacking! Then it gets clocked down.
And yes... I am plugged in (wow, did I see a loss of performance when unplugged) and yes, I am on High Performance -
The lost performance when unplugged is due to the card clocking down to 200/100/200 speeds (idle). That at least tells you that, plugged in, you aren't running at idle clocks! Wish I could offer more advice. I'm waiting for the card also.
If you want to speed up the process for the GTX hard overclock, would you please email your vBIOS to Johan? Use the green button in GPU-Z next to the BIOS information to save it as a file.
His email is: [email protected]
The sooner you send it, the sooner it will be included. Let me know if you need anything more -
thanks Mike... I'll keep you updated if anything changes. Thanks for all the help
-
Well, I called nVidia this morning... just to see if there was something I could do. Of course the person I got... wasn't knowledgeable about computers (go figure). The only thing she did mention is nTune. After the conversation, though, I came up with a good question that I hope someone has the answer to.
Overclocking the card will affect both the 3D and 2D clock speeds, correct?
Since I am okay with WoW and COD4 (since they are both running 3D clock speeds), I just need to up the 2D clock speeds... which is what CS is seeing. So is there a way I can just OC the 2D clock speeds? Wait... ugh... nvm, you already answered it 4 posts above.
I guess i'll just have to wait. I'll email Johan tonight with my vBIOS. *sigh* I'm going to start a new thread and see if I'm the only one with this issue. -
Ya send him that vBIOS. Can't wait for NiBiToR support and uber overvolting at 12k+ 3dM06 scores
-
Hmm...is there any performance difference between the 8800M GTX and the 9800M GTX?
-
There is. The 8800M GTX is an older G92 core with 96 shaders @ 1250MHz and 512MB of memory. The 9800M GTX is a brand new G92-750 core with 112 shaders @ 1375MHz and 1024MB of memory.
The increased shader count and frequency allow faster rendering of shader-heavy scenes, and the 1GB of vRAM helps at higher resolutions. Overall, the card sees anywhere from 1-15% more performance, depending on the application and its settings.
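For a rough sense of the gap (my own back-of-the-envelope math using the numbers above, nothing official): 112 shaders × 1375MHz is about 154,000 shader-MHz versus 96 × 1250 = 120,000 for the 8800M GTX, so roughly 28% more theoretical shader throughput. Real games land well below that (hence the 1-15% figure) because the core and memory clocks are essentially unchanged between the two cards and usually become the bottleneck first.
-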
Not worth it IMO if you already have the 8800M GTX, though. Man, I wish I had bought the Sager 9262 instead of my Dell.
-
Oh, I agree that the 9262 is a far better system than the 1730. Also agree that if you have an 8800M GTX or the 9800M GT, it is not worth the extra $$$ for the card, but if you've got the $$$ or the option, it's great!
-
Just a question I've been thinking about: what makes the 9800M GTX different from the 8800M GT 1GB? Some things come to mind:
Faster Core/Shader/Memory frequencies
Higher Voltage Core
Better Cooling
Outside those three things, is there anything different from the 8800M GT 1GB? Is there any reason (outside heat, mind you) that the 9800M GTX could not perform as a stock 8800 GT 1GB? Just speculation here... -
But I would've gotten the 9262 myself...
However, the XPS is plenty powerful, and the 8800M GTX SLI in the XPS is a tad faster than the one in the 9262, proven on many accounts -
The 8800M GTX in the XPS is also overvoltable to 1.15V and perhaps even higher, giving it a further advantage.
-
Although I believe the 9262 runs much cooler.
I hit 80C when maxing out COD4 with every possible option... whereas most 9262 users report mid-70s. -
you should sell it then.