If I'm reading this correctly... the newer one has lower values... how would that be better in any way, when we fear the first one may not even be enough?
-
UPDATE: Still chokes w/the newer adapter... -
-
Should my fan be kicking in at 100% when my CPU goes over 70 C? Is there any way I can stop this? It's really annoying when I'm watching an HD movie and the volume's not cranked...
-
The speakers are listed as a 7-watt sound system, though...
-
What about my XPS 1645, which still has power and throttling issues? Running Prime+Furmark shows that I get a multiplier difference of between 1 and 2 depending on whether my screen is at maximum or minimum brightness. Somewhere along the way my XPS 1645 is having problems with power.
Edit: Here's one log where I switched between maximum brightness and minimum brightness every minute (based on time in FurMark, may be a couple seconds off) for about 7 minutes. I noted in the log where the switches take place.
I get significantly better CPU performance (between a 1x and 2x multiplier difference) every time I run these tests on minimum brightness (as compared to maximum brightness), and can generate more logs if desired. Also to note, the CPU performance gets reduced the longer the test is run, but the effect of the screen brightness is very clear.
-
Your results show a direct link between throttling and power consumption.
The next question might be: is this a sign that power consumption varies significantly from one XPS 1645 to the next, or a sign that not all power adapters are created equal and some are better than others at converting AC to DC without wasting as much of that energy as useless heat?
You can measure DC power consumption with the Windows Performance Monitor utility that is included in Windows 7 and Vista. Just click on the green + sign and tell it to monitor the Battery Status - Discharge Rate. Set the scale to 0.001 so it converts the milliwatts data to watts DC.
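If you'd rather script it than watch Performance Monitor, here is a minimal Python sketch that reads the same data. This is only a sketch under a couple of assumptions: that the third-party wmi package is installed (pip install wmi) and that the laptop exposes the standard BatteryStatus class in the root\wmi namespace, where DischargeRate is reported in milliwatts just like the Perfmon counter.

# Minimal sketch: read the DC discharge rate while running on battery.
# Assumes "pip install wmi" and a machine that exposes the standard
# root\wmi BatteryStatus class (DischargeRate is in milliwatts).
import wmi

def battery_discharge_watts():
    conn = wmi.WMI(namespace="root\\wmi")
    for status in conn.BatteryStatus():
        if status.Discharging:
            return status.DischargeRate / 1000.0  # mW -> W DC
    return 0.0  # plugged in, or no reading available

if __name__ == "__main__":
    print("DC power draw: %.1f W" % battery_discharge_watts())

Logging that value once a second while a test runs gives you the same numbers the Perfmon graph shows.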
You need to be on battery power for this to work. You could run Prime95 or Furmark. I don't agree with running both of them at the same time while on battery power. That will warm up any battery, so if you run both of these at the same time, don't test for very long. -
-
It depends what you're trying to prove.
If you first test Prime95 and Furmark individually then maybe you'd be able to find out if the CPUs and GPUs are fairly consistent for power consumption. If power consumption is consistent from one laptop to the next then that would point the finger at the power adapter.
At the moment, you don't need to prove anything. We already know that your laptop throttles due to power consumption while some other XPS 1645 laptops do not.
Can your XPS 1645 run Prime95 + Furmark at the same time while on battery power? In the early days when trying to do this test, some laptops shut down because the battery could not deliver the necessary power. That's why I didn't recommend this test while on battery power but if you're feeling lucky then give it a try. It might not throttle at all on battery power. -
I tried running both at the same time on battery power (maximum CPU and GPU settings including ATI PowerPlay), and it throttles more, even on low brightness, than it does when plugged in on maximum brightness.
One thing I noticed is that I've never seen a framerate drop in FurMark due to throttling in any combination of settings, ever. -
-
Has anyone ever recorded an actual frame rate decrease during a game due to throttling?
If not, then maybe we need to approach the entire throttling issue from a whole new angle... -
Lol... actually, the very first indicator of CPU throttling in a game is unplayable framerates and/or crackling audio.
GPU throttling (mostly thermal), on the other hand, only causes a framerate drop and occurs intermittently (i.e., it goes away when temps drop).
This is a good way to figure out in-game what sort of throttling you're experiencing.
-
I get the glitching sound coming out of my speakers badly, so that just confirms my suspicion that it's my CPU throttling and not GPU or temperature related.
-
Ok guys, just got my XPS 1645 yesterday and got a chance to do a little throttle testing tonight. Not sure if I am doing this correctly, so if someone sees a setting I need to change or add then let me know. I was running Prime95 and FurMark with all the Dell bloatware still installed. As far as I can tell my log shows no throttling, but like I said that may be because I was missing something. Here is my log and a screenshot. You may notice some erratic temps for a bit; this may be caused by the fact that I moved my computer off my lap and onto my desk.
ThrottleStopLog.txt -
Judging by your framerate in Furmark (which seems rather low - what resolution were you running it in?), while there is no CPU throttling, there might be GPU throttling. Can you run Rivatuner's hardware monitoring module alongside to see if your GPU clocks are throttling back? But your temps seem to be really good! What are your ambient temperatures man?!
From your TS log, there doesn't seem to be any clock modulation, which is always a good sign. But we really can't be absolutely sure until you run it with the CPU multiplier box unchecked as well. Could you run the same test that way? -
Ok, I will uncheck that and run the tests again tomorrow. Sorry about the FurMark settings, I should have dragged them into the screenshot. I was running FurMark at 1080p with Xtreme burn mode on. I just tried without burn mode for a few minutes and got an average of about 18 fps with no hiccuping. Ambient temps are just about 70 F, but with that being said I have a ceiling fan going that gives the room some decent circulation.
On a side note, I just finished playing a 35 minute match of Supreme Commander 2 with all settings maxed except no AA, and did not experience any slowdowns to speak of. Averaged around 50 fps.
Also my exact system specs are..
Dell Studio XPS 16
Core i7-720QM | 6GB DDR3 1333MHz RAM | 7200 500GB Hard Drive | ATI Mobility Radeon HD 5730 1GB | 15.6 WLED 1080 Screen
I will toss these in a sig tomorrow. When I get a chance I will also take some pictures of my AC adapter and post them up, although it might be a few days. -
Well, if you're running Furmark at a high resolution then I think your framerates are just fine. I would presume you have the 130W adapter, so pictures of it really won't be necessary (considering all of us have the same adapter ;-))
Congratulations, you seem to have a system that doesn't throttle. By the way, what BIOS are you running?
-
Going to try to pull off a 20 minute test without burn mode and the CPU multiplier box unchecked before bed right now. 15 mins in and so far no throttling. Yeah, using a 130W with the 9-cell battery and a Razer Diamondback USB mouse plugged in. Where should I go to check the BIOS?
EDIT:
Nevermind I found the BIOS. A9 5/11/2010 release -
Ok, did a full 30 minute test and still no throttling. Here is a screen of my settings and my stop log. One thing to note is that at the beginning and end of the log you will see throttling, but that is from before and after I was running the tests. Also, on my temps: I forgot to reset HWMonitor after playing Supreme Commander 2, so most of the highs listed there were from playing that game. I will try to get in a good hour-long test tomorrow. If anyone has any questions or would like me to run any different tests/settings let me know.
http://img404.imageshack.us/img404/1057/burntesttwo.png
ThrottleStopLogTwo.txt -
I think I never see FurMark throttle because it hardly uses the CPU. I'm sure it would eventually throttle due to heat but I haven't tried running it past 100C on the GPU. -
Just did an hour-long Prime95 + FurMark run and, unless I am doing something incorrectly, still no throttling.
http://img138.imageshack.us/img138/3538/burntest.png
ThrottleStopLog.txt -
Edit: I see your specs now. I need to get my hands on one of those; my 4670 gpu hits 98C running Furmark for 15 minutes. -
@Prime
Yes, your system is doing just fine. Don't think you need to torture it any more. Enjoy your new SXPS 1645!
I'm getting a new one with the exact same config as you - I hope it performs just as admirably -
<<<<<<<<<<<< Is jealous -
I can see Silverstreak has almost exactly the same system. Why does he have throttling issues then? Both systems do require the same wattage. So what's the deal?
-
I get an average of 16 FPS and a max temperature of 86C when I run FurMark on my current system. But when I unplug, I average 11.
But unlike Prime, my multipliers are jumping around from approximately 13 to 16.
My C0% never goes above 25% and my DTS never rises above 45.
Can anyone explain what this means? -
I did just receive my system yesterday and ordered it on the 20th. That, mixed with the fact that I had the A9 and not the A10 BIOS preloaded, maybe means that Dell has just recently found some workaround to the throttling issue. Some recent buyers had A10 on their XPS when they received it. Why else would they roll back to an older BIOS on a newer system? Coupled with the fact that, after digging through these forums for some time, I have yet to see any recent XPSs that did not throttle, with the exception of liquid's XPS. It will be interesting to see how some of the XPSs delivered in the next few weeks fare.
-
Or does anyone else have a cheaper model that does the same basic thing with fewer features?
Anyway, I'll try to see how much DC power my laptop uses on battery using the Windows tool thing.
The ambient temperature was somewhere in the 74-78F range when I tested Prime+Furmark, but I've experienced CPU throttling down to a 7x multiplier in Left 4 Dead even when the room temperature was 70F. I keep the laptop on a flat "Lap desk" (not a cooler). -
-
By itself, Furmark does not fully load your CPU. Not even close. When the CPU is not fully loaded, then it can use more turbo boost which can increase the multiplier up to 21 when a single core is active and the other 3 cores are in the C3/C6 sleep state. As cores wake up to process background info, the maximum multiplier for the cores that are awake is reduced. When lightly loaded, the maximum amount of turbo boost available is constantly changing a hundred times a second. ThrottleStop reports the average multiplier during each sample period. That's why you will see the average multiplier jumping up and down as the CPU continuously adjusts the amount of turbo boost. That's normal.
On Core i CPUs, the C0% is very similar to a load meter and tells you how much of your CPU is being utilized. When running 8 threads of Prime 95 Small FFTs, it should be steady at 100.0%. If you run 4 threads then it should show 50.0% for Prime95 plus another couple of percent depending on how much background junk you have running on your computer.
If you have 8 threads of Prime Small FFTs going and the C0% starts dropping below 100%, that's a very good sign that your CPU is being slowed down internally with clock modulation throttling.
If you are running just Furmark, that's barely loading your CPU, so that's why the maximum C0% isn't that high. The same thing happens while gaming. Even with modern games, it's rare to see a log file with the C0% much above 25%. All those stories about how well-threaded new games are going to be someday are still about the future. Games rarely need the full processing power that a hyper-threaded Core i7 can deliver.
The DTS column is a direct reading of the on chip digital thermal sensor. As the CPU gets hotter, this counts down towards zero so the higher this number is, the better. If you would prefer that this column shows you the approximate core temperature of your CPU then go into the ThrottleStop.ini configuration file and just add this, TJMax=100
Intel's temperature sensors are not 100% accurate from idle to TJMax but the above setting should make your reported temperatures reasonably accurate.
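To make the C0% and DTS points concrete, here's a tiny Python sketch (illustrative arithmetic only, assuming the 720QM's 8 logical threads and the TJMax=100 setting described above):

LOGICAL_THREADS = 8  # i7-720QM: 4 cores x 2 hyper-threads

def expected_c0_percent(busy_threads, background=2.0):
    # C0% acts like a load meter: 8 Prime95 threads should sit near 100%,
    # 4 threads near 50%, plus a couple of percent of background junk.
    return min(100.0, busy_threads / LOGICAL_THREADS * 100.0 + background)

def approx_core_temp(dts_reading, tjmax=100):
    # DTS counts down toward zero as the core heats up, so the approximate
    # core temperature is simply TJMax minus the DTS value.
    return tjmax - dts_reading

print(expected_c0_percent(8))   # ~100.0 -> sustained dips below this during Small FFTs
                                #            point to clock modulation throttling
print(expected_c0_percent(4))   # ~52.0
print(approx_core_temp(30))     # 70 C with TJMax=100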
With ThrottleStop running and opened up, push the F1 key on the keyboard for some more info about C0%, etc. -
Hey Unclewebb (thanks for the info),
Can you please tell me what my log means (TS is disabled and I'm running A09 [no thermal limit] on 130w)? I start playing BF:BC2 at around 03:51:00. My game settings, if they mean anything, are 1280x720, everything on high, shadows on medium, AA off. My framerates average mid-high 40s and are great. Max GPU temp is about 84C. As you see in the log, my multiplier fluctuates between 13 and 12, but stays mostly at 13. Does this mean my processor is turbo boosting on all 4 cores most of the time (13x)? I have always assumed that this was good and that at least my CPU was not throttling on the most demanding game that I have, but I wanted to get the expert's analysis first on what my log looks like. Thank you.
TS LOG HERE -
I did some tests on battery power.
System: i7-720QM with 4670 and WLED screen at full brightness (every power option set to maximum power/maximum performance, including ATI PowerPlay; this makes a major difference with FurMark!)
Here are the results:
Average idle power, 27.4 watts
Prime95 small FFTs 8 threads, 63 watts (12 multiplier the whole time)
Furmark by itself, 75 watts average, steady 18 fps (no throttling), spikes up to 86 watts (probably due to some unrelated service using the CPU)
FurMark+Prime: up to 95 watts (multipliers at 7 pretty much instantly), 92 average, system crash after 2 minutes
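A rough back-of-the-envelope on those figures (only a sketch: it treats the 27.4 watt idle draw as a constant baseline, which is an approximation):

IDLE_W     = 27.4   # measured average idle draw on battery
PRIME95_W  = 63.0   # 8 threads Small FFTs, 12x multiplier throughout
FURMARK_W  = 75.0   # FurMark alone, steady 18 fps
COMBINED_W = 95.0   # peak seen with FurMark + Prime95 before the crash

cpu_load_w = PRIME95_W - IDLE_W                      # ~35.6 W attributable to the CPU load
gpu_load_w = FURMARK_W - IDLE_W                      # ~47.6 W attributable to the GPU load
naive_combined = IDLE_W + cpu_load_w + gpu_load_w    # ~110.6 W if nothing throttled

print(cpu_load_w, gpu_load_w, naive_combined)
# The naive sum (~110 W) is well above the ~95 W actually observed, which fits
# the multipliers dropping to 7 almost immediately: the system is being held
# under a power ceiling rather than delivering the full combined load.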
Of course, this data by itself is pretty much meaningless (I think) without other data to compare it with... anyone interested? Use Windows' built-in tool for this as described by unclewebb here: http://forum.notebookreview.com/del...dates-xps-16-heating-throttling-issue-17.html -
Okay, I did a quick test with FurMark and 8 threads of Prime95 Small FFTs running,
minimum processor state at 100% and PowerPlay on best performance, both on and off battery,
looks pretty bad..
TSLOG HERE
I have a few questions:
Why is my multiplier staying around 7?
Halfway through I unplugged and C0% stayed at 100%... but when I play games and unplug I see a drastic loss in fps.
How is this possible?
Hopefully my replacement system is throttle-free with its new specs. -
E.D.U.: BF:BC2 looks like the perfect game for testing purposes. It's utilizing your CPU a lot more than most games do.
Overall your log file looks excellent. When 3 or 4 cores are loaded, turbo boost allows the multiplier to increase by a maximum of +1 as long as power consumption and the CPU core temperature are within spec. The default multiplier is 12 so with turbo boost, that's why you are seeing a lot of 13.00 multipliers in the log.
As the load increases, the amount of turbo boost can disappear and drop to zero so you are left with the default 12.00 multiplier. When it goes below the default of 12.00, that's a sign of throttling. There were a few times when this happened where it throttled back to the 11 multiplier. This throttling was usually very brief and it quickly recovered back to 12.00 so it's not likely you would ever notice this while playing. This typically happened when the C0% got up to around 60% or higher.
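To put that rule of thumb in code form (a sketch only; the 12x default and the +1 turbo bin under a 3-4 core load come straight from the description above, not from a full turbo table):

DEFAULT_MULTI  = 12.0  # i7-720QM default multiplier
TURBO_3_4_CORE = 13.0  # max turbo bin when 3 or 4 cores are loaded

def read_multiplier(sample):
    # Classify a single ThrottleStop multiplier sample taken under a heavy
    # 3-4 core load such as BF:BC2.
    if sample > DEFAULT_MULTI:
        return "turbo boost active"
    if sample == DEFAULT_MULTI:
        return "default speed (turbo gone, but not throttled)"
    return "throttled below default"

for sample in (13.0, 12.0, 11.0, 7.0):
    print(sample, read_multiplier(sample))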
Your temperatures are fine. With this game you are likely on the edge of throttling due to power consumption. This game and most any game look like they would be very playable without any significant throttling issues. I think a Furmark + Prime95 test would put you over that edge and you would see more throttling when running an extreme synthetic test like that. -
overvu: Your CPU is suffering through some severe throttling. Dell uses two different methods to accomplish this. The first method is usually multiplier throttling. On the Core i7-720QM, this can go as low as 7.0. When it decides that it needs to throttle your CPU further, then it will start to use clock modulation throttling which is designed to slow your CPU down internally.
Clock modulation throttling reduces the load internally so this can sometimes result in the multiplier increasing. The two throttling methods might alternate back and forth a little when this starts to happen.
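A quick sketch of why clock modulation stings more than multiplier throttling (the 133 MHz bus clock is the stock value for these CPUs; the multipliers and modulation percentage below are just example figures):

BCLK_MHZ = 133.0  # bus clock for the Core i7-720QM

def effective_mhz(multiplier, clock_mod_percent=100.0):
    # Multiplier throttling lowers the multiplier; clock modulation throttling
    # additionally forces the core to idle for part of each interval, which is
    # roughly a further percentage cut on top of the clock speed.
    return BCLK_MHZ * multiplier * (clock_mod_percent / 100.0)

print(effective_mhz(12))        # ~1600 MHz, default speed
print(effective_mhz(7))         # ~931 MHz, multiplier throttling alone
print(effective_mhz(9, 25.0))   # ~299 MHz, higher multiplier but heavy clock
                                # modulation: much slower despite the "better" number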
Are you using the 130 watt adapter, and what BIOS version are you on? Most of the newer models don't throttle this badly.
Without knowing the exact moment you pulled the plug, it's hard for me to comment on exactly what happened when. I also don't know the exact throttling scheme that Dell is using. It varies from one laptop to the next and from one BIOS version to the next.
All I know based on your log file is that your laptop is throttling like crazy. Hopefully the XPS 1645 you have in production will work better than this one does. -
Yea, well I don't know how old this laptop is, as it is refurbished.
I use a 130w adapter and I'm on A10.
Catalyst 10.6
I just played a round of Starcraft 2 and I can only handle medium graphics to achieve ~60 fps.
When I run in high I get ~20 fps
I'll test my new system on Tuesday when it comes in and update here!
*And thanks so much for your input*
Oh and I'll do two different logs right now, one with battery and one with the 130w adapter. -
Here are the quickies I promised
On AC Adapter
On Battery
And to confirm I'm doing it right here's a picture:
http://i48.tinypic.com/tasbc7.jpg -
So as you said, my system is on the edge on the power consumption front, but I'm thinking that future games are going to utilize more of the CPU (with more efficient multi-core usage)? So that C0% will likely just keep increasing with newer, more demanding games, unless the GPU becomes the bottleneck by that point. I guess I will be pretty much playing the lottery as to whether these future games push my power consumption over the edge into CPU throttling territory or not
... (especially if Dell does nothing more)
I never use the synthetic tests because I like real-world applications/games like BF:BC2, but it looks like I'm already maxing out the system's power consumption as per Dell's caps. I don't even see them removing the thermal limit anymore, so there is little chance that they will unlock the 130W adapter's full potential (as I believe it was said that Dell capped that as well). Well... I guess we'll have to see what happens, but for the present I'm happy enough that I can even game on this system at essentially the expected levels of performance (hope it lasts). Thanks a lot UW. -
-
I really don't see what the difference is, personally, between A09 and A10. With either one, I can play Bad Company 2 for less than 10 minutes before I go from 40+ FPS to under 8 FPS. Can someone explain to me step by step how to create a ThrottleStop log and what programs to use, so I can post my results here for UncleWebb and others to critique? Thanks.
-
@wetcardboard, here you go: CLICK. This is one of the earliest threads that really got the whole "throttling" issue out there. The first post is most definitely worth a read-through. The directions might be a little outdated (and ThrottleStop's UI has changed a bit from the screenshots on there), but it tells you pretty much what to do. Also, every one of Unclewebb's posts has taught me so much about this stuff, so def. read them when you can. Get to testing, and put 'em up.
-
If anyone has the balls to...
For those who have 130 watt chargers... how about grabbing a volt meter and tapping into the electrical line at the outlet to see how much power the XPS is actually using -
So I played Bad Company 2 for about 5 mins and throttling started occurring. My max CPU temp was 85 C and max GPU temp 86.5 C (6-cell battery, vent blocked by screen). It was CPU throttling because of the glitching sounds coming from the speakers. I wanted to see if I would have GPU throttling at 84 C, as everyone seems to think there is a thermal limit with the A10 BIOS, but it doesn't seem so unless there's something I missed...