Some of these recent threads got me interested in how many people overclock their XPS 16's GPU. Any interesting overclocking stories?
- 
 Well I kind of just started, but I have a feeling it's going to be frequently. 
 
 Went from 675/800 to 738/898
 
 Saw someone who used like 775/950ish so I figured my numbers were a bit more conservative.
- 
 Well here we go. 
 
 I wanted to see how high I could go, but still maintain good temps, no throttling, and stability.
 
 Method:
 
 I tested my overclock by running FurMark 1.8.2 on "Xtreme Burning Mode" at 1000x850 resolution in windowed mode. To simulate a high-demand game, I also ran Prime95 concurrently with FurMark. I ran 3 instances of Prime95 on 'blend' mode on 2 different cores. So altogether I had 4x 100% load threads (1x FurMark, and 3x Prime95) running on 3x cores. This should provide a greater load than any actual game out there.
 
 I was able to clock to 725/925 (stock is 650/800) without any problems. GPU temps were around 78C max; CPU temps were around 84C. This is a pretty mild overclock, but I ran it for over 30 min without any issues or throttling.
 
 The GPU is capable of going much higher. I had it running 750/1000 without any GPU problems. However, the combined GPU/CPU heat was a bit too much for the cooling system, and it caused the CPU to throttle. If you were running a game or 3DMark06 or whatever, this wouldn't be a problem. I just wanted an overclock that would be 100% stable under all conditions.
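 In percentage terms, the stable 725/925 overclock above is a fair bit more aggressive on memory than on core. A quick sketch of the arithmetic (clocks in MHz, taken from the post):

```python
def oc_percent(stock, oc):
    """Return the overclock as a percentage above the stock clock."""
    return (oc - stock) / stock * 100

stock_core, stock_mem = 650, 800
stable_core, stable_mem = 725, 925

print(f"core: +{oc_percent(stock_core, stable_core):.1f}%")  # +11.5%
print(f"mem:  +{oc_percent(stock_mem, stable_mem):.1f}%")    # +15.6%
```

 So the "pretty mild" stable overclock is roughly +11.5% core and +15.6% memory over stock.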
 
 I had all my regular programs running while developing the overclock w/ Prime95 and FurMark, but I closed all unnecessary programs (anti-virus, bluetooth monitor, smartcard reader processes, etc) when running the 3DMark06 benchmark.
 
 I also tried dropping the voltage from 1.1V to 1.0V, but it did not make any difference at all (speed or temps), so I'm not sure if the voltmod option on the AMD GPU Clock Tool actually works on a mobile GPU.
 
 Benchmarks before overclock:
 
 3DMark06
 
 Total: 8052
 SM2: 2907
 SM3: 3606
 CPU: 3032
 
 Max GPU temp running 3DMark06: 63C
 
 FurMark 1.8 @ 1024x768: 2054 points
 
 After 725/925 overclock:
 
 3DMark06 total score: 8908
 SM2: 3261
 SM3: 4080
 CPU: 3056
 
 Max GPU temp: 66C
 
 Furmark score: 2308
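 For comparison, here's a quick sketch computing the before/after gains from the scores listed above (the CPU score is left out since it barely moved):

```python
# Benchmark scores before and after the 725/925 overclock, from the posts above.
before = {"3DMark06": 8052, "SM2": 2907, "SM3": 3606, "FurMark": 2054}
after  = {"3DMark06": 8908, "SM2": 3261, "SM3": 4080, "FurMark": 2308}

for name in before:
    gain = (after[name] - before[name]) / before[name] * 100
    print(f"{name}: +{gain:.1f}%")
```

 That works out to roughly a 10-13% gain across the board, which tracks well with the size of the clock bumps.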
 
 I'll have to try overclocking without Prime95 heating up the CPU and causing throttling, and see how high I can get the GPU to go, sacrificing my 100% stability and no throttling goals.
- 
 The max I could get it to complete 3DMark06 at is 825/1100. Please keep in mind that this (like my previous overclock) was without any external cooling. 
 
 So 100% stable = 725/925
 3DMark06 stable = 825/1100
 
 Max 3DMark06 scores:
 
 Total: 10014
 SM2.0: 3770
 HDR/SM3.0: 4728
 CPU: 3024
- 
 Damn, nice.
- 
 
 There's only 'frequently' and no 'all the time' to choose from. I leave my GPU overclocked in Rivatuner from startup... no troubles.
- 
 In 3DMark Vantage my GPU score is around 4188. I don't get a total points score, it just says NA. I'm not sure if that's because I'm just using the trial version. Anyway, the 10014 Mark06 and 4188 GPU vantage puts me just below the GTX 260M / 5850 level, which isn't bad for a 5730 overclock. 
 
 Not like I'd run it at 825/1100 all the time though, but 725/925 should be pretty good for a constant overclock.
- 
 
 Just out of curiosity what resolution was this test run at?
- 
 Whatever the default is... hold on... 1280x768
- 
 
 hmm interesting... I didn't think the 5730 was that much more powerful. I can only muster a 7200 or so overclocked. Maybe the i7 is helping the score too
- 
 
 It's mostly the i7, probably. :P
- 
 
 So how do I overclock... procedure, please?
- 
 
 AMD GPU Clock Tool v0.9.26.0 For HD 5870 download from Guru3D.com
 
 Download, install, run, set clocks to whatever you want.
 
 EDIT: I don't know if it works with the 4670. But it's what I use on my 5730.
- 
 
 825/1100 is definitely not stable for prolonged gaming. The graphics 'stutter' every few minutes (not throttling!) and it locks up randomly! It seems okay at first but the random stutter gives it away. And then of course, the locking up... 
 
 Well, on my card at least  
 
 I didn't throttle. Well, I should have had ThrottleStop up. But my GPU temp was 68C so I'm sure I didn't throttle. Ah well.
- 
 Yeah, I got 3DMark06, FurMark, and 3DMark Vantage to run at 825/1100, without lockup even after several runs, but I haven't tried any prolonged gaming with those settings yet. I couldn't get 850 to even get past the first test, so 825 is really on the far end of stable. Edit addition: Also, any memory settings above 1150 resulted in artifacts.
 
 I have run 725/925 under torture conditions without any problems at all. I'll try testing out 800/1000 next; that might actually be long-term stable as long as the game isn't too CPU intensive, but anything much over 725 or 750 will cause throttling under a heavy combined CPU/GPU load, at least with a quad-core. But that test had like 4x 100% threads running.
 
 That's the program I've been using as well.
- 
 Well my CPU score was around 3025-3050 in all the tests, so even though my score jumped from 8000 to 10000 with GPU overclocking, the CPU score didn't change. What were your SM2 / SM3 scores? That would be a better comparison.
 
 On paper the 5730 is 20% faster than the 4670, but I don't think the difference is quite so big in real life, maybe 10% or so. More likely, the XPS16 cooling system just isn't adequate for 4670 overclocking, and you can't get all potential performance out of it without overheating. What were your overclock temps at? The 5730 is around 9W cooler, and the XPS heatsink/fan is able to keep up. Also, overclocking is always a crapshoot, maybe I just got a really good part, or you not so much.
- 
 
 I never have heat issues, and my temps under sustained full load while overclocked reach 69C more or less. I had my motherboard replaced at one point, and the previous 4670 I had couldn't be overclocked to the level I'm currently at, which is 843/961.
- 
 I was looking at the specs of the two cards, and the stock speeds and even our overclock speeds are very similar. The biggest difference is that the 4670 has 320 shaders, whereas the 5730 has 400, so that may be where the difference comes from. As we scale up our clock speeds, the performance difference of those 80 cores continues to increase. Also, the 4670 is 55nm and the 5730 is 40nm, but if your temps are low that shouldn't make much of a difference.
 
 Have you tried bumping up your RAM speed a bit? I don't get artifacts till above 1150, and since we use the same RAM, you might be able to get yours up a bit higher.
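 The shader-count point above can be put in rough numbers. A back-of-the-envelope sketch of theoretical shader throughput (shaders x core clock), using the overclocks mentioned in this thread; this ignores memory bandwidth and architectural differences, so treat it as an estimate only:

```python
def throughput(shaders, core_mhz):
    # Theoretical shader throughput in arbitrary units (shaders * MHz).
    return shaders * core_mhz

hd4670 = throughput(320, 843)  # 4670 at the 843 MHz overclock mentioned
hd5730 = throughput(400, 825)  # 5730 at the 825 MHz overclock mentioned

print(f"5730 advantage: {(hd5730 / hd4670 - 1) * 100:.0f}%")  # ~22%
```

 Even with the 4670 clocked slightly higher, the extra 80 shaders give the 5730 around a 22% theoretical edge, which is in the same ballpark as the 3DMark gap being discussed.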
- 
 I used the traditional way (Rivatuner). Good directions here. Overclocking ATI Mobility Radeon HD4570, HD4650 and HD4670 at Twisted-Reviews.com
 
 Edit: Does anyone use a tool that can just set the ATI PowerPlay clock values (i.e. it still clocks down when not needed, such as when idle or on battery)?
- 
 
 Well I run into issues when I bring the memory clock near 1000 so I brought it down below it. Every card is different so who knows... 
- 
 
 Anyone scan for artifacts with ATITool or Rivatuner?
- 
 I just use FurMark. If you turn up the memory too high, artifacts will start to appear immediately (small, bright blue splotches everywhere for me).
- 
 
 If I run FurMark for a few hours, I don't see artifacts.
 But if I scan with ATITool, it gives me errors, because there are artifacts.
- 
 Hmmm... I'll have to try that out. Are you using the one from TechPowerUp?
- 
 
 Yes
 
 Now I'm testing 785/875; over 900MHz gives me artifacts
- 
 I just ran ATI Tool 0.26 for 15 min at 825/1100 with 0 artifacts. In fact, the artifacts began to appear in ATI Tool at the same time as in FurMark, at around 1150, and they appeared instantly after raising the mem clocks. Only in ATI Tool the artifacts were yellow instead of blue, lol. Regardless, the tests are nearly the same: ATI Tool looks at a stationary piece of carpet, whereas FurMark looks at a rotating doughnut of carpet
- 
 
 My memory goes up to 1150 without artifacts. It only starts giving me lip at 1200 or so. 
 
 Well it wouldn't matter much if the GPU was the bottleneck to begin with. The T9550 gets around 2350 in comparison. That's a pretty substantial difference!
- 
 True. Do you know what software tool I can use to overclock the CPU as well? This is my first time trying overclocking on a laptop. I've overclocked on a desktop a few times, but most desktop overclocking tools don't seem to work on a laptop. I wonder how high I can get on 3DMark06 if I overclock both. Not that I'd ever overclock the 720m on this machine in real-world usage, it's just cutting it too close with temps and throttling, but for a 10min benchmark it should work.
- 
 
 I don't think you can overclock it. I don't know of any way, at least. Not much point to it, like you said, though I do get the feeling temperatures on this machine could be greatly improved with TIM reapplication. 
 
 I'm buying MX-3 in a week or so, I think.
- 
 Yeah, I already upgraded my paste to the Shin-Etsu G751. I looked at a lot of reviews and comparisons, and G751 was usually #1 and always in the top 5. I don't think it's that much better than good ol' AS5, though it's significantly better than that poorly applied stock crap.
 
 I idle in the high-40s CPU and mid-40s GPU. I've never had CPU temps exceed 75C or GPU exceed 66C (even in 3DMark), unless I'm running Prime95 + Furmark, which will get my CPU up into the mid-80s (throttling starts at 85) and my GPU to 79C (throttle free). This is without any external cooling and BIOS A11, and room temps around 75F.
- 
 
 Stable run, this is the overclock I apply every time I play any game:
 
 Core: 777, Memory: 950
 
 3DMark score: 8094
 
 SM2.0 Score: 3123
 HDR/SM3.0 Score: 3984
 CPU Score: 2154
 
 
 Max overclock, I get quite a few Artifacts depending on the area shown:
 
 Core: 790 Memory: 970
 
 3DMark score: 8215
 
 SM2.0 Score: 3192
 HDR/SM3.0 Score: 4049
 CPU Score: 2157
 
 
 
 My conclusion is that the 4670 is very close to the 5730.
 It's the newer generation of CPUs that really makes the difference in the 3DMark score.
 I doubt you would see these benefits in most games though.
 
 
 So I guess I will need to keep waiting for another version of the Studio XPS before I upgrade - especially since the new Studio XPS notebooks cost over 1300€ here...
- 
 Sorry for OT, but:
 Can you try something for me, please?
 
 Change your power plan to power saver, limit cpu to 0% in advanced setting to lock the cpu at the lowest frequency and do some light tasks like web surfing, xvid watching (no HD for this test), playing music...etc.
 
 All of this having your laptop on your lap and then on a desk.
 
 Does the fan kick in often in your case?
 
 Thx
- 
 
 @Dany|R 
 What are the ati and cpu widgets you are running in that screenshot?
- 
 
 You can find here CPU and GPU widget  
- 
 I set everything up like you asked. I watched a 480p movie using VLC. At the same time the movie was running, I had Firefox open and running the "Peacekeeper" benchmark (from Futuremark, so it's like 3DMark for browsers) twice in a row to simulate web browsing, though the load was likely higher than the average surfing. The test took about 10 min total. As I'm sitting here typing this now, the movie is still playing, so it's like the testing continues, lol...
 
 My CPU temps ranged from 49C to 54C, and my GPU temps were rock solid at 46C (they hopped up to 50C, but it was really only like a split-second, then right back down for the rest of the test).
 
 The entire time my fan was running steady, but at its lowest speed, where you can barely feel the hot air exiting, and can't hear the fan at all over ambient noise. It never raised fan speed, not even once, the entire time. It also never stopped completely (though I don't know if I would have noticed it if it did).
 
 I ran the tests with the laptop on a lapdesk (the logitech one, without any fans) on my lap.
 
 The fan does kick on occasionally running the same type of tests on the 'balanced' power mode, without the CPU limited to 0%.
- 
 
 That would be pretty awesome if there was one.
- 
 
 Thanks Danny for the gadget links. I was interested in trying to overclock when I saw this thread today, so I grabbed an overclocking program and 3dmark 2006. 
 Ran a stock benchmark and got scores of...
 
 3DMarks: 7298
 SM 2.0: 2661
 SM 3.0: 3138
 CPU: 3037
 
 These seemed a little lower than what I should be at stock, so I checked the GPU status and saw that it has a current clock of 99.99/149.97. Am I missing something here? These seem pretty low to me. I am also running the A09 BIOS if that helps
- 
 Those are normal idle speeds. They'll only kick up to full speed if the GPU is under load. It does that automatically, so nothing to worry about. That's the "PowerPlay" that the other posters are talking about.
 
 To see this in effect, start up GPU-Z and go to the sensor tab. It will show 100/150 until you start running something, like 3dmark when it boosts up to speed, then as soon as it's over it will drop back down to idle.
- 
 
 Great thanks.
- 
 
 Any ideas, seeker, why we have such different stock 3DMark scores? I have almost the exact same system as you except I have a 1080p screen. I tried bumping up to your 725/925 clock and had a score of 8077, which is pretty similar to your stock score and about 900 points less than your 725/925 score. I am running a different BIOS, but could that really make such a difference?
- 
 It might, as the BIOS updates may upgrade the vBIOS as well. Also, have you done any windows optimization? Install all the newest drivers, adjust your CCC settings, close or disable all unnecessary programs and services, remove start-up programs that eat up memory but perform no function, etc.
 
 Edit: Remove old temp files, clean the registry, run a defrag, update DirectX, adjust control panel settings, set power setting to high performance...
- 
 
 No I have not. I guess maybe I will try flashing the BIOS and then doing optimizations. Do you know any good threads for windows ops for the XPS? I removed most of the obvious dell bloat ware but maybe I missed some of the background stuff. 
- 
 This is super long, but a good place to start: TweakGuides.com - The TweakGuides Tweaking Companion
 
 Some of the others may have shorter answers for you, but if you're serious the TGTC is worth reading. Plus, you can skip the first part if you already have a basic understanding of computers.
- 
 
 775/1100 is stable for me. Max GPU temp 76 (average 68 [started sampling data after OC]). Played the game for about three hours. It's solid. 
 
 My core refuses 800 :/. It chokes and throws off the OC randomly.
- 
 You might not believe me, but I'm bothered even by the lowest fan RPM (well, only at night, since like you said, during the day it's not audible due to ambient noise). That's why I wondered whether you can achieve a silent run while doing not so demanding tasks. With the temperatures you mentioned earlier I thought it might be possible (especially GPU temps)...
- 
 I was wondering about the shader to memory speed ratio. The 650/800 stock ratio is about 4:5. When I overclock I try to keep a similar ratio, trying to keep things in balance, assuming that the designers knew what they were doing when assigning the speeds.
 
 I'd have no problem running 775/1100, as the memory doesn't produce much heat, so there's not much stopping one from higher memory clocks as long as it's below the artifact threshold. However, that throws the ratio off a bit. I'm not really sure if that means anything though. Do you get a significant return from boosting the mem clocks so much, but the shader clocks so (relatively) little, or do you hit a point of diminishing returns on the memory speed?
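 The ratio idea above can be sketched as a small hypothetical helper (the function name and the 1150 cap are my own; the 650/800 stock clocks and the artifact threshold come from this thread):

```python
STOCK_CORE, STOCK_MEM = 650, 800   # stock clocks in MHz, per the thread
MEM_ARTIFACT_LIMIT = 1150          # artifact threshold observed on this card

def matched_mem_clock(core_mhz):
    """Memory clock that preserves the stock 650:800 core-to-memory ratio,
    capped at the artifact threshold."""
    mem = round(core_mhz * STOCK_MEM / STOCK_CORE)
    return min(mem, MEM_ARTIFACT_LIMIT)

print(matched_mem_clock(725))  # 892 -> close to the 925 actually used
print(matched_mem_clock(825))  # 1015 -> below the 1100 actually used
```

 By this rule the 825/1100 setting runs the memory a bit hot relative to the stock ratio, which is exactly the imbalance being asked about.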
- 
 With a BIOS mod it should be possible. The engineers may have designed the BIOS to run the fan at all times, even if it's not necessary.
 
 If you wanted to get all high speed, you could either learn how to mod a BIOS, or even add a switch to the fan and run it to the hole for the useless Kensington lock, where you could manually cut power to the fan when not needed. It should be safe enough as long as you were careful and monitored the temps.
- 
 Wow I can imagine then someone coming to my laptop, doing something intensive, forgetting to switch on the fan, after a while just saying something like: "wow you a have a really quiet laptop..." right before it fries :-D 
- 
 
 No returns. Or very little. I didn't notice any, at least.
 
 EDIT: 10% (or so..) increase in FPS from 968 to 1100 mem clock. Hardly noticeable, but it is substantial.
 
 I'm testing, trying to see what the exact max for my card is. It's kind of addicting, heh. So far, I've thoroughly tested the memory and sure enough, 1100 is the stable max for me. As for the core, I've gone up to 840 and slowly gone down to what's stable with Furmark and 810 seems to be the sweet spot. I would totally check with increments of 1MHz instead of 5MHz but man. That's a lot of work. Furmark in Xtreme Burn is... crippling. 100% GPU Load leaves no room for internet surfing on the side and that makes me a sad panda and somewhat bored.
 
 I've also brought out my tower fan and managed to lock my temperature at 58C under 100% load. That's really good.
 
 I guess I'll play Battlefield: Bad Company 2. It's the most CPU-intensive current-gen game I have (haven't hopped on the SC2 bandwagon yet...), and it should force throttling, if throttling is possible with 810/1100. If I recall, it keeps all cores turbo'd (x13 multi). So yeah. I'll play an hour or two and see what happens  
XPS 16 GPU overclocking
Discussion in 'Dell XPS and Studio XPS' started by gpig, Sep 17, 2010.
 Problems? See this thread at archive.org.