I have just overclocked my GPUs to 600/980/1600 (stock settings were 500/799/1250), and the max temperature they reach is still 80C. I have seen drastic improvements in most games, which are now extremely smooth. I tried doing a BIOS flash update for the graphics cards the other day but I couldn't get nvflash to run on my Vista Ultimate 64-bit. So I decided to try NVIDIA's new 6.02 nTune tool. So far it works great.
I've had my XPS for almost 6 months now, hopefully it will still be working fine for years to come.
Has anyone else overclocked their GPUs, and what settings do you use?
How are your temperatures and fan noise?
Stability?
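For context, the overclock in the first post works out to roughly a 20% core, 23% memory, and 28% shader bump over stock. A quick sketch of that arithmetic (clock values in MHz, taken straight from the post):

```python
# Percentage gains of the overclock described above.
# Stock vs. overclocked clocks (core/memory/shader, MHz) from the post.
stock = {"core": 500, "memory": 799, "shader": 1250}
oc = {"core": 600, "memory": 980, "shader": 1600}

for name in stock:
    gain = (oc[name] - stock[name]) / stock[name] * 100
    print(f"{name}: {stock[name]} -> {oc[name]} MHz (+{gain:.1f}%)")
# core: 500 -> 600 MHz (+20.0%)
# memory: 799 -> 980 MHz (+22.7%)
# shader: 1250 -> 1600 MHz (+28.0%)
```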
-
-
What GPU do you have?
-
The GPU I have is the GeForce 8800M GTX in SLI.
-
Can you post your full specs and 3DMark06 scores?
-
My full specs are:
Dell XPS M1730
Windows Vista Ultimate 64bit
2.5GHz T9300, 800MHz FSB
4GB DDR2 667MHz
NVIDIA GeForce 8800M GTX SLI, 1GB GDDR3
17" 1920x1200 LG Flat Panel
PhysX Card
Sorry, I currently don't have 3DMark06 installed, but I will be getting a copy later. -
What notebook do you use? Specs?
Do you OC? -
Hmm, with 1600 on the shaders you will definitely have artifacts unless you have overvolted the GPUs. Check ATITool for artifacts.
-
I found 600/900/1500 to be the best OC for me; I could push a little higher, but not much.
If I OC'd the CPU as well, fans all on high speed, the GPU temp would get no higher than running at stock speeds (up to 78C max).
Benches are in my sig... -
How did you know that that was your OC limit? Because it's weird for me... I keep increasing a little more each time, then I run a whole heap of benchmarks and games and they all seem to run smoothly and look the same... maybe faster or smoother, but the temps don't change much.
Maybe I should lower my OC clocks, seeing as you guys could only go so high with yours. Just to be safe. -
I've now lowered my Shader clocks to 1500 after reading your posts.
Thanks.
I'm going to download ATITool now and try it, to look out for any problems.
However, I remember trying it last time and I couldn't get it to run. -
Also, I was wondering, what do the shader clocks actually do? And how come they are set so high compared to the core and mem clocks? Is there normally a rule of thumb for how high the shader clocks should be set? For example, given the core is set to a certain value, should the shader clock be set relative to it?
-
I have now lowered my clock values to 600/900/1500.
Are those values safe now or better? Is the mem value still set too high?
My max temps have lowered from 80c to 78c now.
It's now gone up to 80C again. No external coolers used. -
You can push the mem to 950 without any artifacts; the core can go up to 625 without any voltage increase in NiBiTor, and the shaders can be pushed to 1550. Most games these days use a heavy amount of shaders, and that is what the shader clock is for.
I heard Crysis uses 8000 shaders, so the faster the shader clock, the faster the game will run. Though an increase in the shader clock alone won't yield any magical results; you still need the bandwidth, i.e. increase the memory as well. The core clock alone yields good results too. Not all games are shader intensive; some games benefit more from an increase in the core clock than the shader clock. -
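On the rule-of-thumb question asked earlier: there is no hard rule, but these cards ship with a fixed shader:core ratio at stock (here 1250/500 = 2.5x), and a common approach is to keep the shader clock near that multiple when raising the core. A minimal sketch of that, plus the bandwidth point from the reply above (the 256-bit bus width is my assumption for the 8800M GTX; check your own card's specs):

```python
# Sketch of the shader:core ratio implied by this card's stock clocks
# (500 core / 1250 shader), and rough memory bandwidth math.
def shader_for_core(core_mhz, stock_core=500, stock_shader=1250):
    """Scale the shader clock to keep the stock shader:core ratio (2.5x)."""
    return core_mhz * stock_shader / stock_core

def mem_bandwidth_gbs(mem_clock_mhz, bus_width_bits=256):
    """GDDR3 is double data rate, so effective transfer rate is 2x the clock."""
    return mem_clock_mhz * 2 * (bus_width_bits / 8) / 1000  # GB/s

print(shader_for_core(600))    # 1500.0 -- matches the 600/x/1500 OC above
print(mem_bandwidth_gbs(799))  # 51.136 GB/s at stock memory clock
print(mem_bandwidth_gbs(950))  # 60.8 GB/s at the overclocked memory
```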
I've changed my clock settings again to
620/950/1500. -
Do you think it is safe if I OC the mem setting to 980?
I quite enjoyed that setting when playing Crysis, lol.
What's the highest settings you have ever tried and used? -
You can set the memory clocks to anything that doesn't raise your GPU temperatures too high or cause artifacts. Everyone's card will be different. Just because some people's cards can't do 980 on memory doesn't mean that yours can't. Just test your settings to make sure you aren't getting any visual anomalies.
The same goes for the shader and core clocks too. Just make sure you test them individually and together! -
Yeah, every GPU is different; if 980 works, go with that. However, do stress test with ATITool and other programs just to be on the safe side. If ATITool does show those deltas (artifacts) and you run your cards at those clocks all the time, you are bound to damage your GPU. I know I did with my XPS M170: I ignored ATITool and I had deltas, and though I didn't see any artifacts in games, in the end my GPU got destroyed.
Also remember that the temp you see is only the core temp. You don't see what temp the memory on the GPU is at. The first thing that usually craps out from overclocking is the memory on the GPUs if you clock it too high. -
Quote:
If you do have those deltas ATITool shows and you run your cards at those clocks with deltas all the time, you are bound to damage your GPU. I know I did with my XPS M170. I ignored ATITool and I had deltas, though in games I didn't see any artifacts, but in the end my GPU got destroyed.
Hey there,
I was wondering how long it took for the graphics cards in your M170 to die. What exactly happened, and how did you fix it? Did it just die while playing a game or something? Smoke coming out of the notebook? Did you get new replacement cards?
DELL XPS M1730 Overclocking! What settings do you use?
Discussion in 'Dell XPS and Studio XPS' started by reborn2003, Aug 30, 2008.