Some of these recent threads got me interested in how many people overclock their XPS 16's GPU. Any interesting overclocking stories?
-
Well, I kind of just started, but I have a feeling it's going to be "frequently."
Went from 675/800 to 738/898
Saw someone who used like 775/950ish so I figured my numbers were a bit more conservative. -
Well here we go.
I wanted to see how high I could go, but still maintain good temps, no throttling, and stability.
Method:
I tested my overclock by running FurMark 1.8.2 in "Xtreme Burning Mode" at 1000x850, windowed. To simulate a high-demand game, I also ran Prime95 concurrently with FurMark: three instances on 'blend' mode across two different cores. So altogether I had four 100%-load threads (one FurMark and three Prime95) running on three cores. This should be a greater load than any actual game out there.
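If you're repeating this kind of combined-load test, most monitoring tools (GPU-Z, HWMonitor) can log sensors to a CSV file, and a tiny script can pull the max temperature out of the log afterwards. This is just a sketch; the column header and log layout here are assumptions, so check them against your own tool's log file:

```python
import csv
import io

# A few rows in the rough shape of a GPU-Z sensor log (CSV).
# The column header is an assumption -- check your own log file.
SAMPLE_LOG = """Date,GPU Temperature [C]
2010-09-17 20:00:01,62
2010-09-17 20:00:02,71
2010-09-17 20:00:03,78
2010-09-17 20:00:04,77
"""

def max_temp(log_text, column="GPU Temperature [C]"):
    """Return the highest reading seen in the given sensor column."""
    reader = csv.DictReader(io.StringIO(log_text))
    return max(float(row[column].strip()) for row in reader)

print(max_temp(SAMPLE_LOG))  # 78.0 for the sample rows above
```

Point it at the real log after a 30-min run and you get your true max instead of whatever the on-screen gauge happened to show when you looked.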
I was able to clock to 725/925 (stock is 650/800) without any problems. GPU temps were around 78C max; CPU temps were around 84C. This is a pretty mild overclock, but I ran it for over 30 min without any issues or throttling.
The GPU is capable of going much higher. I had it running 750/1000 without any GPU problems. However, the combined GPU/CPU heat was a bit too much for the cooling system, and it caused the CPU to throttle. If you were running a game or 3DMark06 or whatever, this wouldn't be a problem. I just wanted an overclock that would be 100% stable under all conditions.
I had all my regular programs running while developing the overclock w/ Prime95 and FurMark, but I closed all unnecessary programs (anti-virus, bluetooth monitor, smartcard reader processes, etc) when running the 3DMark06 benchmark.
I also tried dropping the voltage from 1.1V to 1.0V, but it did not make any difference at all (speed or temps), so I'm not sure if the voltmod option on the AMD GPU Clock Tool actually works on a mobile GPU.
Benchmarks before overclock:
3DMark06
Total: 8052
SM2: 2907
SM3: 3606
CPU: 3032
Max GPU temp running 3DMark06: 63C
FurMark 1.8 @ 1024x768: 2054 points
After 725/925 overclock:
3DMark06 total score: 8908
SM2: 3261
SM3: 4080
CPU: 3056
Max GPU temp: 66C
FurMark score: 2308
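For scale, the benchmark gains line up roughly with the clock bump. A quick back-of-the-envelope check using the before/after numbers in this post:

```python
def pct_gain(before, after):
    """Percentage improvement going from 'before' to 'after'."""
    return 100.0 * (after - before) / before

# Clock increases: 650 -> 725 core, 800 -> 925 memory
print(round(pct_gain(650, 725), 1))    # 11.5 (core clock up ~11.5%)
print(round(pct_gain(800, 925), 1))    # 15.6 (memory clock up ~15.6%)

# Benchmark increases at 725/925
print(round(pct_gain(8052, 8908), 1))  # 10.6 (3DMark06 total up ~10.6%)
print(round(pct_gain(2054, 2308), 1))  # 12.4 (FurMark up ~12.4%)
```

So the scores scale a little below the clock increase, which is about what you'd expect when the CPU score barely moves.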
I'll have to try overclocking without Prime95 heating up the CPU and causing throttling, and see how high I can get the GPU to go if I sacrifice my 100%-stability and no-throttling goals.
-
-
The max I could get it to complete 3DMark06 at is 825/1100. Please keep in mind that this (like my previous overclock) was without any external cooling.
So 100% stable = 725/925
3DMark06 stable = 825/1100
Max 3DMark06 scores:
Total: 10014
SM2.0: 3770
HDR/SM3.0: 4728
CPU: 3024
-
-
-
There is only 'frequently' and no 'all the time' to choose from. I leave my GPU overclocked in RivaTuner from startup... no troubles.
-
In 3DMark Vantage my GPU score is around 4188. I don't get a total score, it just says NA; I'm not sure if that's because I'm using the trial version. Anyway, the 10014 in 3DMark06 and the 4188 GPU score in Vantage put me just below the GTX 260M / 5850 level, which isn't bad for a 5730 overclock.
Not like I'd run it at 825/1100 all the time though, but 725/925 should be pretty good for a constant overclock. -
-
-
Hmm, interesting... I didn't think the 5730 was that much more powerful. I can only muster 7200 or so overclocked. Maybe the i7 is helping the score too.
-
-
So how do I overclock... procedure, please?
-
Download, install, run, set clocks to whatever you want.
EDIT: I don't know if it works with the 4670. But it's what I use on my 5730. -
825/1100 is definitely not stable for prolonged gaming. The graphics 'stutter' every few minutes (not throttling!) and it locks up randomly! It seems okay at first but the random stutter gives it away. And then of course, the locking up...
Well, on my card at least
I didn't throttle. Well, I should have had ThrottleStop up, but my GPU temp was 68C, so I'm sure I didn't throttle. Ah well. -
I have run 725/925 under torture conditions without any problems at all. I'll try testing 800/1000 next; that might actually be long-term stable as long as the game isn't too CPU-intensive, but anything much over 725 or 750 will cause throttling under a heavy combined CPU/GPU load, at least with a quad-core. But that test had four 100%-load threads running.
-
On paper the 5730 is 20% faster than the 4670, but I don't think the difference is quite that big in real life, maybe 10% or so. More likely, the XPS 16 cooling system just isn't adequate for 4670 overclocking, and you can't get all the potential performance out of it without overheating. What were your overclock temps? The 5730 runs around 9W cooler, and the XPS heatsink/fan is able to keep up. Also, overclocking is always a crapshoot; maybe I just got a really good part, or you not so much. -
I never have heat issues; my temps under sustained full load with the overclock get to 69C, more or less. I had my motherboard replaced at one point, and the previous 4670 I had couldn't be overclocked to the level I'm currently at, which is 843/961.
-
Have you tried bumping up your RAM speed a bit? I don't get artifacts till above 1150, and we use the same RAM, so you might be able to get yours up a bit higher. -
Edit: Does anyone use a tool that can just set the ATI PowerPlay clock values (i.e. it still clocks down when not needed, such as when idle or on battery)? -
Well I run into issues when I bring the memory clock near 1000 so I brought it down below it. Every card is different so who knows...
-
Anyone scan for artifacts with ATITool or RivaTuner?
-
-
But if I scan with ATITool, it gives me errors, because there are artifacts. -
-
Now I'm testing 785/875; over 900 MHz on the memory gives me artifacts -
I just ran ATITool 0.26 for 15 min at 825/1100 with 0 artifacts. In fact, the artifacts began to appear in ATITool at the same point as in FurMark, around 1150, and they appeared instantly after raising the mem clocks. Only in ATITool the artifacts were yellow instead of blue, lol. Regardless, the tests are nearly the same: ATITool looks at a stationary piece of carpet, whereas FurMark looks at a rotating doughnut of carpet.
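The way most of us find that threshold is the same loop by hand: step the memory clock up, scan, back off when artifacts appear. The idea can be sketched like this; `set_mem_clock` and `has_artifacts` are placeholders for whatever tool you actually use (AMD GPU Clock Tool, ATITool, eyeballing FurMark), and the 1150 cutoff just models my card's behavior:

```python
ARTIFACT_THRESHOLD = 1150  # placeholder: on my card artifacts start above ~1150

def set_mem_clock(mhz):
    """Placeholder for your overclocking tool's clock-setting step."""
    pass

def has_artifacts(mhz):
    """Placeholder for an artifact scan; modeled on the behavior above."""
    return mhz > ARTIFACT_THRESHOLD

def find_max_mem_clock(start=800, limit=1300, step=25):
    """Raise the memory clock in steps until artifacts appear, keep the last clean one."""
    best = start
    for mhz in range(start, limit + 1, step):
        set_mem_clock(mhz)
        if has_artifacts(mhz):
            break
        best = mhz
    return best

print(find_max_mem_clock())  # 1150 with the placeholder threshold above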
-
My memory goes up to 1150 without artifacts. It only starts giving me lip at 1200 or so.
-
Not that I'd ever overclock the 720M on this machine in real-world usage; it's just cutting it too close with temps and throttling, but for a 10-min benchmark it should work.
-
I don't think you can overclock it. I don't know of any way, at least. Not much point to it, like you said, though I do get the feeling temperatures on this machine could be greatly improved with TIM reapplication.
I'm buying MX-3 in a week or so, I think. -
I idle in the high 40s (CPU) and mid 40s (GPU). I've never had CPU temps exceed 75C or GPU temps exceed 66C (even in 3DMark), unless I'm running Prime95 + FurMark, which gets my CPU up into the mid-80s (throttling starts at 85) and my GPU to 79C (throttle-free). This is without any external cooling, on BIOS A11, with room temps around 75F. -
Stable run - this is the overclock I apply every time I play any game:
Core: 777, Memory: 950
3DMark score: 8094
SM2.0 Score: 3123
HDR/SM3.0 Score: 3984
CPU Score: 2154
Max overclock - I get quite a few artifacts depending on the area shown:
Core: 790 Memory: 970
3DMark score: 8215
SM2.0 Score: 3192
HDR/SM3.0 Score: 4049
CPU Score: 2157
My conclusion is that the 4670 is very close to the 5730.
It's the newer generation of CPUs that really makes the difference in the 3DMark score.
I doubt you would see these benefits in most games, though.
So I guess I'll need to keep waiting for another version of the Studio XPS before I upgrade - especially since the new Studio XPS notebooks cost over 1300€ here... -
Can you try something for me, please?
Change your power plan to power saver, limit the maximum CPU state to 0% in the advanced settings to lock the CPU at its lowest frequency, and do some light tasks like web surfing, XviD watching (no HD for this test), playing music, etc.
All of this having your laptop on your lap and then on a desk.
Does the fan kick in often in your case?
Thx -
@Dany|R
What are the ati and cpu widgets you are running in that screenshot? -
-
My CPU temps ranged from 49 to 54C, and my GPU temps were rock solid at 46C (they hopped up to 50C, but only for a split second, then dropped right back down for the rest of the test).
The entire time my fan was running steady, but at its lowest speed, where you can barely feel the hot air exiting and can't hear the fan at all over ambient noise. It never raised fan speed, not even once, the entire time. It also never stopped completely (though I don't know if I would have noticed if it did).
I ran the tests with the laptop on a lapdesk (the logitech one, without any fans) on my lap.
The fan does kick on occasionally running the same type of tests on the 'balanced' power mode, without the CPU limited to 0%. -
-
Thanks, Danny, for the gadget links. I was interested in trying to overclock when I saw this thread today, so I grabbed an overclocking program and 3DMark06.
Ran a stock benchmark and got scores of...
3DMarks: 7298
SM 2.0: 2661
SM 3.0: 3138
CPU: 3037
These seemed a little lower than what I should get at stock, so I checked the GPU status and saw a current clock of 99.99/149.97. Am I missing something here? These seem pretty low to me. I'm also running the A09 BIOS, if that helps -
To see this in effect, start up GPU-Z and go to the sensors tab. It will show 100/150 until you start running something like 3DMark, when it boosts up to speed; as soon as it's over, it drops back down to idle. -
-
Any ideas, seeker, why we have such different stock 3DMark scores? I have almost exactly the same system as you, except with a 1080p screen. I tried bumping up to your 725/925 clock and got a score of 8077, which is pretty similar to your stock score and about 900 points less than your 725/925 score. I am running a different BIOS, but could that really make such a difference?
-
Edit: Remove old temp files, clean the registry, run a defrag, update DirectX, adjust control panel settings, set power setting to high performance... -
No, I have not. I guess maybe I'll try flashing the BIOS and then doing optimizations. Do you know any good threads on Windows optimization for the XPS? I removed most of the obvious Dell bloatware, but maybe I missed some of the background stuff.
-
Some of the others may have shorter answers for you, but if you're serious the TGTC is worth reading. Plus, you can skip the first part if you already have a basic understanding of computers. -
775/1100 is stable for me. Max GPU temp 76 (average 68 [started sampling data after OC]). Played the game for about three hours. It's solid.
My core refuses 800 :/. It chokes and throws off the OC randomly. -
-
I'd have no problems running 775/1100, as the memory doesn't produce much heat, so there's nothing much stopping one from higher memory clocks as long as it's below the artifact threshold. However, that throws the core/memory ratio off a bit. I'm not really sure if that means anything, though. Do you get a significant return from boosting the mem clocks so much but the shader clocks so (relatively) little, or do you hit a point of diminishing returns on the memory speed?
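If I have the specs right (128-bit bus, double data rate memory on the Mobility 5730 - an assumption, check your own card), you can put rough numbers on what a memory overclock buys you in theoretical bandwidth:

```python
BUS_WIDTH_BITS = 128  # assumption: Mobility HD 5730 has a 128-bit memory bus

def bandwidth_gbps(mem_clock_mhz):
    """Theoretical memory bandwidth in GB/s for DDR-type memory."""
    bytes_per_transfer = BUS_WIDTH_BITS / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * 2  # DDR: two transfers per clock
    return transfers_per_sec * bytes_per_transfer / 1e9

print(round(bandwidth_gbps(800), 1))   # 25.6 GB/s at stock 800 MHz
print(round(bandwidth_gbps(1100), 1))  # 35.2 GB/s at 1100 MHz
```

Bandwidth scales linearly with the clock, but FPS only does while memory is the bottleneck - which is exactly the diminishing-returns question above.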
-
If you wanted to go all high-speed, you could either learn how to mod a BIOS, or even add a switch to the fan and run it to the hole for the useless Kensington lock, so you could manually cut power to the fan when not needed. It should be safe enough as long as you were careful and monitored the temps.
-
Wow, I can then imagine someone coming to my laptop, doing something intensive, forgetting to switch on the fan, and after a while just saying something like "wow, you have a really quiet laptop..." right before it fries :-D
-
EDIT: roughly a 10% increase in FPS going from 968 to 1100 mem clock. Hardly noticeable in play, but it's a real gain.
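Quick arithmetic on those numbers shows the scaling is slightly sublinear, so memory speed is no longer the sole bottleneck at these settings:

```python
mem_gain = (1100 - 968) / 968 * 100  # memory clock raised ~13.6%
fps_gain = 10.0                      # observed FPS gain from the post above (approx.)

print(round(mem_gain, 1))             # 13.6
# FPS grew less than the clock did, so returns are starting to diminish.
print(round(fps_gain / mem_gain, 2))  # ~0.73 FPS-% per memory-clock-%
```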
I've also brought out my tower fan and managed to lock my temperature at 58C under 100% load. That's really good.
I guess I'll play Battlefield: Bad Company 2. It's the most CPU-intensive current-gen game I have (haven't hopped on the SC2 bandwagon yet...), and it should force throttling, if throttling is possible with 810/1100. If I recall, it keeps all cores turbo'd (x13 multi). So yeah. I'll play an hour or two and see what happens
XPS 16 GPU overclocking
Discussion in 'Dell XPS and Studio XPS' started by gpig, Sep 17, 2010.