So I finally decided to delve into this overclocking business and see if the gains were worth the risk. I've read many threads on this topic around the forum but I still can't wrap my head around how some people are able to pull it off so well.
First off, I'm using an 8600M GT GDDR3 card, so stock clocks are 475/702. I've successfully overclocked up to 665/900; any higher and scary stuff begins to happen. My 3DMark06 score at this clock speed is ~4900 (at 1440x900, which I'm told is almost the same as benching at the standard 1280x1024).
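(For what it's worth, the pixel counts back that up: 1440x900 is 1,296,000 pixels versus 1,310,720 at 1280x1024, so only about a 1% difference.)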
I've been using 3DMark as a general guideline of how I'm doing, and I'm performance testing in UT3 (on Shangri-La). Now, all the guides I've read talk about artifacting and temperature rises as symptoms that you're reaching the limit of your card. I don't see any of this type of behavior. My temperature increases about 2°C over what it would've been at stock. Also, my laptop abruptly shuts off and reboots in the middle of a game if I've gone too high on my core clock (note: I've undervolted my CPU, since that tip supposedly helps with this, but it didn't fix the problem).
Now this isn't any of the typical behavior I've seen from other OCers. What gives? My GPU temperature refuses to go higher than 73°C (mind you, this is a good thing; I just want to use all that headroom for more overclocking), and my laptop shuts off if I go just a tiny bit higher than 670 on the core. The shutting-off part is what really puzzles me, since I've never heard of that kind of thing happening before.
I've seen folks with 5800-point scores at the standard res with this card. I know that a really high-frequency processor inflates your score a little, but some people are managing to push their card to 750/950 and retain safe temperatures. I'm not looking to go that high; I just want a couple more frames in my games so my lows no longer dip into the mid-to-high 20s (which is really annoying).
So, does anyone know what the secret is? Are these people just the lucky few who happen to have GPUs that God himself personally crafted?
EDIT: Just thought I might add that I'm using the official NVIDIA 179.28 drivers. I tried Dox's 181.20 but that just aggravated the reboot problem. Perhaps there's another set of drivers that would be better suited to overclocking?
-
King of Interns Simply a laptop enthusiast
Most likely you need to change the Vcore for the graphics card to go higher. With not enough voltage it will crash, and without the graphics card your computer has to restart. Also, do your test at 1280x1024 - you should be able to turn your res down so you can properly compare scores; you'd have over 5000. The guy who got 5800 on here has the same laptop as me, with the processor clocked at 2.93 GHz, and he definitely upped the voltage, from 1.2 to 1.3 I think.
-
Thanks for the prompt reply.
Hmm, so you're saying my voltage is too low? I haven't looked much into NiBiTor, but do you think it's possible to do a voltage mod on this card?
I'm working on getting the 1280x1024 res (replacing my WXGA+ panel with a WSXGA+); I'll update the numbers when that happens. -
King of Interns Simply a laptop enthusiast
I have never tried it, and each card and computer is different. It should be possible to change it at least a small amount. I believe the change from 1.2 to 1.3 on the C90 is the most one can do; I don't know about your Dell. I gave up early on in the card-OC'ing game, as my DDR2 will crash at 500/500 - maybe when I can afford a GDDR3 replacement I can play around some more. Hopefully someone can chip in and let you know how to change the voltage properly. One more thing: try leaving the core where it is and pushing the shaders alone as high as you can; that apparently gives a decent boost. Good luck
-
I ran some tests with an XPS M1530 (T9300, 4GB RAM, 160GB 7200rpm HDD, BIOS A12, Vista Ultimate SP1 official + Forceware 179.28 for notebooks) about 2 days ago. The results were absolutely impressive.
I tested the frequencies with some long sessions of UT3 (Warfare on Onyx Coast with vehicles, 20-30 mins each), Race Driver: GRID (cycling through the tracks) and CoD4 (the "Countdown" map if I remember right - the one with the missile silos and a huge amount of smoke).
The limit arrived at 685 core / 1650 shaders / 900 memory.
I also tested 685/1700/900, but while UT3 went well, during the first minutes of GRID the card switched to lower frequencies, probably because of the shaders (at 1750 it showed artifacts).
Talking about temperatures, I can say that in UT3 (one of the "hotter" games I tried) my notebook reached 91°C after 20 mins, while with its rear lifted on 2 bottle stoppers (about 1 cm of height) it never went beyond 81°C, which is acceptable for me considering that stock temps stay at 74-76°C with ambient around 25°C.
Raising the memory frequency over 900 gave me artifacts, so I'm sure that's my physical limit, while on the core I can easily reach 700 MHz but with some random reboots (so I assume it's a power issue, like the shader problem below 1750).
Compared with the stock frequencies, I got a boost of 44% on the core, 74% on the shaders (which is AMAZING imho) and 29% on the memory (I've heard of people going over 1000 MHz, but it's quite rare and not entirely believable imho).
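For reference, that's starting from the stock 475 core / 950 shaders / 702 memory: 685/475 ≈ 1.44, 1650/950 ≈ 1.74 and 900/702 ≈ 1.28.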
I'll probably upload some videos of different games to YouTube as proof -
I would push mine a bit more if it weren't for the impotent cooling system. -
Wow, 1650 on the shaders is quite the overclock! I never thought of increasing my shader clock separately, since it's linked to the core by default, so I assumed it too would be unstable. What kind of fps gains do you get from increasing the shaders? Actually, a more pertinent question would be: which games make heavy use of the shaders? I know Crysis off the top of my head, but how about UT3?
I'm not gonna try increasing my memory until I see artifacts - there's no temp sensor and I wouldn't want to risk foolishly frying my chip. 900 is a solid speed for it IMO.
On the subject of NiBiTor, there's a very limited number of threads about BIOS flashing the GDDR3 variant of this card. Apparently, even if you change the voltages and flash, nothing changes, because the voltage isn't controlled by software. I downloaded NiBiTor for the hell of it, but it wouldn't even read my BIOS (it said the card had an unknown device ID and then errored out)! I then attempted to dump the BIOS with GPU-Z, but that spat out a 'BIOS reading not supported on this device' error. I'm somewhat disappointed, since I don't mind going up a few degrees in heat if I can squeeze some more juice out of this card.
If anyone has any more information on this I'd appreciate your input.
EDIT: Wow, this is a stubborn card. I don't see how it could have an integrated BIOS - I just ran nvflash from a USB boot disk and attempted to save the BIOS, and it threw me an 'EEPROM not supported' error.
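(In case anyone wants to try the same thing: if I remember the switches right, the backup command is just nvflash -b backup.rom on the DOS build, or nvflash --save backup.rom on newer versions, so it's not like I was feeding it anything exotic.) -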
Yeah, I did some heavy OC'ing... but I currently don't even have RivaTuner installed.
I don't game as much on the PC as I used to, so I run everything at stock speeds. I do use the latest Dox drivers though, for optimal quality & performance.
-
Ha, you have the exact same core clock as I do. What happens when you try to go any further - artifacts or shutdowns? My 3DMark score comes in at 6229 (at 1280x800, which I'm guessing the score in your signature is benched at); I think our difference in points comes mainly from the fact that you have a T8300 and I have a T7500.
Modding the voltage of this card seems impossible at this point. But maybe we can work at this from another angle: is shutting off a clear-cut case of low voltage? Are there any other software-level tweaks that might alleviate it a little?
Maybe I'm just asking my card to do the impossible, but I'm not giving up just yet. -
BTW, that's not a big problem imho, because with the tons of drivers that have PowerMizer "bugged", keeping the card permanently in the 3D high-performance profile (with OC frequencies) could drastically shorten the life of the card.
-
Yeah, I see that now. People were flashing the BIOS of their 8600M GT DDR2 cards left and right, so I assumed the GDDR3 would have a dedicated BIOS too, but I guess it differs between laptop models. Btw, I know that people are flashing their cards to get permanent OC profiles, which in conjunction with a bugged PowerMizer would kill the card pretty quickly, but I'm looking to do no such thing. I just want to increase my voltage a little.
Anyway, it turns out my 660/900 clock couldn't hold up at 1440x900 resolution (hard reboots after a few minutes in TF2), so now I'm down to 655/870 (stable), which gives 5886 3DMark points at 1280x800... *sigh*
I haven't tried increasing my shader clock separately yet, but I have a feeling I know what will happen. -
King of Interns Simply a laptop enthusiast
Dude that is an excellent score.
-
Not sure if it's already been said, but did you disable PowerMizer? That's what allowed me to hit 600/850/1400 on my card.
When I installed the Windows 7 Beta I forgot to disable it again, and had quite a runaround trying to figure out why my card would no longer OC. PowerMizer was the problem.
There are instructions on disabling it in my guide (click sig), but the short version is below.
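Roughly (this is from memory, so check the guide for the exact values): you add a few DWORDs under your display adapter's {GUID}\0000 subkey in HKLM\SYSTEM\CurrentControlSet\Control\Video - PerfLevelSrc = 0x2222, plus PowerMizerEnable / PowerMizerLevel / PowerMizerLevelAC = 0x1 - then reboot. -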
But I'm not that ambitious.
-
Sorry 'bout that. It's *almost* the same for Win7 - you have to look in a slightly different registry key. Look around, you'll probably find it. I'll post directions later, but I've got to get some rest for tomorrow - I'm going to the NAMM show.
-
Hmm, I searched for 'PowerMizer' earlier in my registry and none of the entries seemed pertinent. I don't need a step-by-step, just point your finger
-
-
Good luck 2 ya -
Iceman0124 More news from nowhere
Those with the sky-high clocks... have any of you run an artifact tester with those clocks? I can get an insane OC on my GTX 260, and it will bench and play games just fine, but it instantly fails AT. IMO a solid, stable OC needs to pass AT for a minimum of 12 hours.
-
Hardcore overclockers probably won't accept these frequencies as "rock solid", but as I can play without issues I consider them enough for my purposes -
-
I haven't extensively tested how far my shader can go. I know my core won't go past 600, and my memory starts getting unstable after 850, but my shader's at 1400 right now, and I may be able to go a bit further. 1650 seems dern high though.
-
Well, I've repeatedly had to decrease my clocks slightly to get stability in new games (went from UT3, to TF2, to the CoD4 demo). I'm convinced there's tons more overclocking potential in this card (no excess heat, thus no artifacts); it's just that pesky voltage.
That got me thinking though... has anyone attempted a physical voltage mod on a laptop GPU? I'm thinking in terms of adding a variable resistor, or doing a pencil mod. We'd most likely need the datasheets to find the feedback (FB) pin and ground, hmm.
I think I'm just asking for trouble at this point, but I'm interested. -
Iceman0124 More news from nowhere
AT works fine on Vista; I use it all the time. You can use the artifact tester without using any other part of the program.
-
I also suggest you try This One - I found it quite useful for pushing the card to its limit -
How the heck do you guys manage to clock the shaders to 1650? The slider maxes out for me at 1425, which my DDR2 8600M GT handles easily.
(RivaTuner, of course).
-
-
Like I said, I'm using nTune - the shader speed maximum is raised to 4x the stock core instead of 3x in RivaTuner.
-
Iceman0124 More news from nowhere
Install ATITool - you'll get the error, but run the program and scan for artifacts.
-
-
Here's how to unlock the max clocks in RivaTuner
-
I managed to overclock my 9500M GS to 630/1500/480 from the stock 475/900/400. The core shows artifacts at >640, the shaders at >1550, and I don't want to go more than a 20% OC on the memory since it doesn't have any cooling. In L4D at 1280x800, with everything at the highest possible settings except 2xAA and 4xAF, and with multicore and vsync disabled, I went from 15/50/30.063 to 25/83/50.423 fps (min/max/avg) with Dox's 180.84.
Keep in mind I only overclock the card when it's on my NotePal, since this thing gets pretty hot anyway -
I can't even overclock my Sager 2090 with the 8600M GT with DDR2 memory past 535 core and 411 memory, from 475 and 400 stock. After I hit 540 it just shuts off. I don't know how you all get past 550 on the core - maybe I got a bad chip - but I don't care; I upgraded to a brand new Gateway 6831 for $650 on eBay and I am loving it. Time to sell my Sager, and I might even break even on the deal.
-
King of Interns Simply a laptop enthusiast
DDR2 8600M GTs suck, as they have additional memory on the bottom of the card which isn't actively cooled - hence the instability. Well, that's the case with the C90S at least. I would rip them off if I had money to burn.
-
The 8600M GT and 8400M have the same architecture, which means it sucks and it can't be OC'ed well. I recommend you keep it at stock or get ready to see sparks fly.