In one of the other overclocking threads we saw that the trial version of 3DMark uses a different (non-configurable) resolution for the 900p screen compared to the 1920*1080 one (at least for one user), which makes a very significant difference in the number of marks. If I recall correctly, both screens got the same default horizontal resolution in 3DMark, but the 900p screen got a lower default vertical resolution. Something like 1280*1024 vs 1280*800.
-
Well, I proved myself wrong. I ran a test to see whether memory speed really is limited in relation to core speed.
I did four runs of 3DMark.
My first run was with a mild 700 core speed and a ratio-correct 861 mem speed, which netted me 8517 points. Then I upped the memory speed to my stable max of 1100, which boosted my score to 9437. Then I ran a faster core speed of 800 with the more limited 861 mem speed from my first run, and only got 8837 points. This isn't a perfect comparison, since the difference in core speed was only 100 MHz, where the difference in mem speed was 240 MHz.
So long story short, boosting mem speed actually gained me just as many points as boosting core speed, and the ratio of the two is irrelevant. Lesson learned by me.
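For anyone who wants the rough math, here's a quick points-per-MHz comparison using the scores above (a simple Python sketch; it assumes the score scales roughly linearly with each clock, which is obviously a simplification):

    # Scores from the three 3DMark runs above: (core MHz, mem MHz, score)
    baseline = (700, 861, 8517)
    mem_oc   = (700, 1100, 9437)
    core_oc  = (800, 861, 8837)

    # Points gained per extra MHz, relative to the 700/861 baseline
    mem_gain  = (mem_oc[2]  - baseline[2]) / (mem_oc[1]  - baseline[1])   # ~3.8 pts/MHz
    core_gain = (core_oc[2] - baseline[2]) / (core_oc[0] - baseline[0])   # ~3.2 pts/MHz

    print(round(mem_gain, 1), round(core_gain, 1))

So per MHz, the memory overclock was worth at least as much as the core overclock here, which is the point above.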
-
I run 780/910 on my 4670 using RivaTuner. Any higher on the RAM and it starts to stutter.
-
-
Sorry for my ignorance in this kind of stuff (I've never overclocked a GPU, just my desktop's proc + mem), but is there any particular gain in watching videos from OC-ing the GPU?
For example, I've started watching BD-R movies, and when I jump to a particular position in the movie, I get a couple seconds of graphic/video artifacting before the video renders properly again (sound is fine and in sync). Would doing a small bit of overclocking speed up, or altogether eliminate, the graphical/video artifact recovery time?
Just wondering. It's not a problem for me, but after all this discussion about overclocking, it's got me curious about this! Thanks! -
I see a lot of talk here about speed, but what about temps? I don't know a whole lot about overclocking GPUs, but on CPUs, if you keep your temps in the high 80s all the time under load, you are shaving years off the life of your processor. I would assume the same holds true for GPUs. From what I have read, if you are keeping your GPU above 85C all the time under load, you are asking for trouble.
-
And the CPU temp capped at 70C... GPU at 60.
I think I'm going to catch pneumonia though. It's damn cold in here ;_; (I need a proper laptop cooler)
Anyway, seeker, you should use 1280x1024 (somehow) to benchmark. I can't change it on mine since I ain't payin for it.
My CPU is better than yours by 100 points bro
-
SXPS 1640 with P8700 and HD4670 (835/920) @1280x768
-
-
-
I think you're thinking of 3DMark Vantage. It gives me an error, and then forces me to run at a scaled 1280x1024, maybe that's the problem you're thinking of. Because of it, it doesn't give me an actual Vantage score, just the GPU score. I'm not going to actually pay for Vantage (I wouldn't pay for 06 either, but a copy came for free with my desktop motherboard), so I can't really play around with it.
In other news, I've been able to push my max OC to 830/1115 (before, I was using 25MHz steps and arrived at 825/1100 in only an hour; now I was doing 5MHz and then 1MHz steps and testing for several hours) and it's 99% stable. I'll give more details tomorrow, it's time for me to call it quits today. -
Just out of curiosity... the 5730 in this laptop uses GDDR3 right? And the 4670 uses DDR3.
-
-
-
What differences in temps when running your benchmarks are you guys seeing between the normal clocks and the OC's?
-
-
As an experiment, I decided to turn off ATI PowerPlay and set the clocks with RivaTuner to the normal ATI PowerPlay values (675/800 for the 4670). Performance and temperatures were both way down (about a 30% performance reduction and an 18C temperature drop). Anyone know what else PowerPlay does besides changing the clocks? Does it literally use more power, and what else does it do that changes performance?
-
Well, after some work I was able to test an 830/1105 overclock (down a bit from the 1115 I claimed last night...) as 100% stable. To determine max core speed, I started with my previous max of 825/1100 and ticked the 825 up by 1MHz at a time, running 3DMark06 each step. I got up to 835, where it was consistently stable, and 836, where it would crash every time. I then took 5MHz off for a safety margin and ended up with 830.
Then for max mem speed, I ran Furmark at 1050x850 res in extreme burning mode, with the ATITool artifact scanner running at the same time. I know I mentioned earlier that ATITool made no difference, but I must have been doing it wrong. Apologies to DanyR; ATITool will show artifacts long before FurMark will.
I started at 1135 and slowly worked my way down in 1MHz increments. If you're more than 10MHz over your stable max, the artifacts will appear within 30 seconds. As you get closer, it takes longer to see them; you're within 2-3MHz by the time it takes over two minutes for them to show. I ended up at 1120 with 15 minutes of no artifacts, and once again dropped 5MHz for a safety margin (which is where yesterday's 1115 came from).
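For anyone who wants to script the bookkeeping of that search, here's a rough sketch of the upward core-clock walk (illustration only; is_stable() stands in for whatever manual test you trust, such as a 3DMark06 loop for the core or the FurMark + ATITool artifact scan for the memory, which you'd just walk from the other direction, and the 1MHz step and 5MHz safety margin are simply the values I used above):

    def find_max_stable(start_mhz, is_stable, step=1, safety_margin=5):
        """Walk the clock up in small steps until a run fails,
        then back off by a safety margin for everyday headroom."""
        mhz = start_mhz
        while is_stable(mhz + step):
            mhz += step
        return mhz - safety_margin

    # With the numbers above: stable through 835, crash at 836 -> report 830
    print(find_max_stable(825, is_stable=lambda mhz: mhz <= 835))  # 830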
I then ran 3DMark06 looped 3 times, with 'force full precision' enabled, 1600x900 res, 2xAA, bilinear filtering, and all the feature tests that aren't included in the free version. It ran without any problems.
Today, to verify, I ran Furmark and ATITool as stated above, plus Prime95 on top of it. I ended up having to drop the mem clock down to 1110 for max stable, then 1105 for wiggle room. Unlike the previous test, where I ran 3 Prime95 threads, this time I only ran 2 to avoid a CPU throttling bottleneck, eventually dropping to 1.
I verified 830/1105 to be 100% stable and artifact free. I'll probably stick to 750/1000 (a 10% drop from max) for everyday non-benchmarking purposes though.
Throughout the tests, my GPU never topped 80C. When running Furmark without Prime95, it never topped 76C, and it never topped 67C on just 3DMark06. I think the overheating CPU overwhelmed the shared cooling solution and heated the GPU up higher than it would normally go by itself.
On a side note, I'm now convinced that the CPU throttling on my 1645 is power related, not heat related, even with the 130W adapter. When running the benchmarks with 3x 100% threads (1x Furmark, 2x Prime95), it would throttle after about 30 min, hitting CPU temps of 85C with a 13x multiplier. When dropping down to 2x threads (1x Furmark, 1x Prime95), my multiplier jumped up to 16x and my temps shot up immediately. For the cores running active 100% threads, the temps rose almost instantly to 86C (beyond what I thought was the throttling threshold) and eventually climbed to 90C; however, there was no throttling, and the multiplier never dropped from 16x on the active threads. This leads me to believe that throttling on the 1645 is caused by an inadequate power supply when running three or more of the quad's cores plus the GPU at 100%, and not by temps. However, it could also be based on the average temps of all the cores together, and not necessarily the max temp on only 2 of the 4 cores, but that will require more testing. -
-
My new 3dmark score is 10088, which really isn't that much of a boost from my previous score considering all the work I put into it, but it's something. Plus, I killed a few hours having what a geek (or laptop enthusiast...) can call fun.
-
I decided to mess around on "my friend's" XPS 1645 since I don't want to void "my" warranty. "His" XPS 1645 happens to have the same specs as mine (4670). Here are my observations.
PERFORMANCE
There is a direct relationship between fps and clock (core and memory) speeds in everything I tested: raise both speeds 5% and you get a 5% increase in fps; raise them 10% and you get 10% more fps. This is probably because the GPU is the bottleneck in everything I tested. Just raising one value still seems to help most of the time. The maximum values I could hit were around 850/960. I could probably go higher on the core clock, but RivaTuner showed a (!) symbol right at 850, so I decided not to.
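To make that scaling concrete, here's a tiny sketch of the linear model it implies (the 40 fps baseline is just a made-up example, and the relationship only holds while the GPU stays the bottleneck):

    def expected_fps(baseline_fps, stock_clock_mhz, oc_clock_mhz):
        # Simple linear scaling model for a fully GPU-bound workload
        return baseline_fps * (oc_clock_mhz / stock_clock_mhz)

    # Hypothetical: 40 fps at the stock 675 MHz core -> ~47 fps at 800 MHz
    print(round(expected_fps(40, 675, 800), 1))  # 47.4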
POWER CONSUMPTION
I tried measuring the power consumption of the overclock while running Furmark. I need to redo this sometime: one attempt showed a 1 watt difference and another attempt showed a 7 watt difference. I'm not sure which is right, if either.
TEMPERATURES
Starcraft II, 675/800, max GPU 89C
Starcraft II, 800/950, max GPU 89C
I'll test more some other time. -
Can anyone help me? I have a Mobility Radeon 4670. Latest drivers and ATI CCC installed. Intel P8700 C2D.
I tried using the AMD GPU Clock Tool and messed around with a couple of settings, and when they didn't work, I hit "restore default clocks" (300/795) and saved it. But now when I try to play games, the clock speed won't go higher than 300 MHz.
I'm pretty sure the default settings were 675/795, so I set it to that. Still, the clock won't go higher than 675 while playing games (I think I've hit around 1150-1350 under full load). Is there any way to return the clock speeds back to normal? -
Will this totally brick the system if the overclocking failed...?
-
To make sure it's working like it should, run GPU-Z. Go to the sensors tab and look at what frequency it reports. Then start a game (leaving GPU-Z open), play a few minutes, and exit. Go back to GPU-Z; you should see that while the game was running, the GPU clocks jumped up to normal. -
If you start seeing colored spots appearing randomly across the screen, your mem clocks are too high. Go back and turn them down until the spots disappear. Once again no permanent damage.
Monitor your temps, make sure they don't go over 90, and you'll have no problems. -
What BIOS is the best for overclocking the GPU so it doesn't throttle? A09?
-
I never relax; I use RivaTuner, but you have to mod the config file so it recognizes the 4670 GPU.
-
I set the clocks back to the default PP settings; however, when I play games (I always play with the 90W adapter in, plus I have all the performance settings turned on when on the adapter), the GPU clock will jump from 675 MHz down to 300 and back, even at low temps (50-60C). I'm pretty sure this is "throttling" of sorts, and I've never had this happen before. I could normally play ME2 at max settings at a stable frame rate for hours, but now I can barely play for a few minutes.
Another thing I noticed is in GPU-Z, the default clock is set to 400 MHz. I'm not sure if this is normal or not.
I've also tried OC'ing with RivaTuner, but it never gave me any problems.
I monitor the clock speeds with CPU-Z.
Is there any fix for this? -
You could try uninstalling the clock tool and the drivers, then re-installing the drivers. I'm pretty sure the clock tool / RivaTuner just changes driver parameters and doesn't do anything to the hardware, so resetting the drivers should fix this if that is indeed the problem. If not, I'd suggest getting Dell to upgrade you to a 130W adapter (which you really should have if you're trying to overclock anyway). -
I uninstalled the AMD Clock Tool. Don't think that helped, but am I supposed to uninstall the display driver via Device Manager? And if I do, can I just re-install right after that?
-
-
* Update *
This is going to sound weird as hell, but...
I rolled back from the latest driver (8.762) to the previous one (8.753). This didn't help, still had the same problems as before.
I then "searched for automatically updated software" through Device Manager (it always re-installs the oldest version, 8.631).
I go to CPU-Z, and now the GPU clock sits at 879 MHz. Max is 1350 when playing games. Idling temps are around 50C. Max I've seen is around 72C (I have a Zalman NC2K cooler + 9-cell battery). Also, for some reason, ATI CCC got downgraded to 10.7.
Funny thing is, GPU-Z readings are totally different. Default clock is still 400, GPU clock is 675, and Mem clock is 800 MHz. Sensors list Core clock as 220 MHz and Mem clock at 300 MHz.
In short: everything works now, but should I take a chance and update back to the latest driver software (the one where I was trying to OC and stuff)? -
If you're happy though, there's no reason you can't stay where you are. There wasn't much difference between 10.7 and 10.8, plus 10.10 is going to be out in about a month; you could just wait for that.
-
Wow!! It worked!!
-
What program are you guys using to run the GPU at max? Because in StarCraft II I am only getting 67-68C max with a 4670 at 800/950.
-
-
Most of the time, for me, running a game will cause a slightly higher GPU temp than Furmark alone, since a game stresses both the CPU and GPU, and they share a cooling system.
-
Or if you really want to have fun, run Furmark and Prime95 at the same time; if that doesn't stress your system, nothing will!
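If you'd rather kick both off in one go, here's a trivial sketch (the install paths are just placeholders; point them at wherever FurMark and Prime95 actually live on your machine):

    import subprocess

    # Placeholder paths; adjust to your own installs
    furmark = subprocess.Popen(r"C:\Tools\FurMark\FurMark.exe")
    prime95 = subprocess.Popen(r"C:\Tools\Prime95\prime95.exe")

    # Both stress tests now run in parallel; wait until you close them
    furmark.wait()
    prime95.wait()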
-
Started new thread related to OCing, see http://forum.notebookreview.com/del...os-modding-gpu-oc-fun-profit.html#post6754282
-
OK, my 2 cents. The Studio XPS 1640 (or any XPS 16) is not designed for gaming, and it's been as good as admitted that the ATI graphics cards aren't for gaming either, as the fps is all over the place. The cooling vents are also totally insufficient for the job. I am just not impressed with it. The palmrest also gets really hot from the HDD. If you're looking at getting a new laptop and want good gaming performance, go for an Alienware; it's designed specifically for gaming. Get a refurbished one if you can't afford them new; they are just as good as the new ones at half the price. With the ATI 4670 I can play on high graphics, but if you reduce the graphics you get no increase in FPS, which is so weird. Gaming performance is brilliant, but not for hours of gaming.
Another mod for cooling is to use a notebook cooler like a Cooler Master or Zalman NC2000 and then take the bottom cover off the XPS 1640, which will significantly increase cooling. Do this at your own risk. I will test this when my mom has got her laptop back and I get my XPS back.
With the Zalman NC2000 at idle I get a temp decrease of 7C across the board (hard drive 2C). People think this is not a lot, but when you're running at 85C, dropping to 78C is a huge difference. Also, your laptop will last longer at the cooler temps. -
We all know the SXPS 16 was never a 'gaming laptop.' But it is quite capable: I can run Left 4 Dead 2 at 1920*1080 on maximum settings across the board (besides AA) with an average of 45 fps without an OC, so who cares if it's a 'gaming laptop'? And I have the older 4670 card.
I recently re-pasted with AS5, and now I'm able to run Prime95+Furmark without overheating, so no game can make it overheat either. -
Yeah, I am just stating that for those who want one for gaming; I would highly recommend against it and would rather get an Alienware. Look, it is seriously impressive: I ran Call of Duty 6 at Full HD with high settings getting 40-60 fps, which is amazing, but the fps spikes are so irritating, like a yo-yo. I have the results from a cooling experiment and some thermal paste. This was all with the Zalman NC2000 laptop cooler. Without the cooler, the notebook fan would come on when the laptop (sitting on a table) reached 56C, then stayed on until it dropped to 50C; with the cooler, the fan does not come on.
Sensor   1    2    3    4    5   (temps in C)
TZ00    75   77   74   51   49
TZ01    50   55   52   42   41
TZ02    69   75   73   46   49
CPU     70   72   69   40   43
GPU     78   83   81   46   50
HDD     37   41   37   36   36
1: gaming with back cover on
2: gaming with back cover & gold thermal paste
3: gaming without back cover & gold thermal paste
4: idle without cover
5: idle with cover
Used League of Legends, which is pretty GPU and CPU intensive, at 1920x1080 with all high graphics.
Interesting to note the HDD: since it gets no ventilation with the back cover on, it obviously runs cooler with the cover off. My thermal paste didn't make too much of a difference, but it must be noted that I've read it takes a few days to reach its peak. The Dell thermal paste was a gray, hard compound, which must have been terrible. I noticed the northbridge had a blue pad connecting it to the copper heat pipe; can one use a copper kit to improve cooling on it, and what would the performance difference be? -
I don't get it. Why revive a thread that's been dead for over a month just to say that the XPS 16 isn't a gaming laptop and that an Alienware is? We all already know that. Besides, you can't even buy the XPS 16 new anymore. Not to mention that the 5730 they replaced the 4670 with doesn't have the same heat problems.
-
and from his signature, it's a dual core SXPS with RGB too.
maybe this thread needs to be closed. -
Closing the thread completely might be a bit harsh. There's nothing wrong with bumping an old thread; there's lots of good info in here, and there's still potential for discussion in case anybody has something new. It's just that this dude made his first post on the entire website in this thread and said nothing new that hadn't already been discussed throughout a thread he probably didn't even bother to read.
-
I was seriously considering overclocking my 1645 for some time; you guys talked me into it.
-
Hey, I overclocked mine to 800/1000! I'll keep it at that, no further frequency increases. Will do benchmarking and post results soon!
Thanks guys! -
I tried to OC my GPU (4670), but the screen went black already at 725/925, possibly lower too. Is it the ATI driver overriding the clocks or something?
Info: 1640 (as I understand it, I can't run a custom BIOS on a 1640?), P8600 CPU, 4GB RAM, A14 BIOS, 90W AC, and I also undervolt the CPU to get a cooler laptop (performance on demand). -
Try bumping the core clock and memory clock up by 15 or so at a time from the default, separately. The 925 in your 725/925 is actually quite high; I was only getting around there (945ish) before I re-pasted. The people getting up to 1000 on the memory clock have 5730 cards, which seem to overclock a bit better.