If I am using the processor to do basic things (internet, MS Office, streaming video, no games), what would be a rough estimate of the difference in yearly energy cost between these processors, everything else being equal?
-
I'm guessing you're using the 920XM or 940XM. The CPU won't use more energy than it has to when doing basic things; it's not like it's at full throttle all the time. If you only do that, you won't see much of a difference.
-
This is a pretty basic way of looking at it:
Let's make some assumptions here: say a 45W TDP CPU uses 15W when just surfing the internet and the 73W TDP CPU uses 25W. If you surf for one hour, you have used 10Wh more energy with the 73W TDP CPU. It would take 100 of those hours to use 1 kWh, which is the unit utility companies typically use to charge for electricity. Check your utility bill to see what they charge per kWh and you can figure out the difference. Of course, as you do more demanding tasks the difference grows as you get closer to maxing out each processor's power use.
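If you want to plug in your own numbers, here is that arithmetic as a quick Python sketch (the 15W/25W draws and the $0.12/kWh rate are just my assumptions from above, not measured values):

```python
# Rough yearly-cost difference between two CPUs while surfing.
# The 15 W and 25 W draw figures are assumptions, not measurements.

def extra_cost_per_year(low_watts, high_watts, hours_per_day, dollars_per_kwh):
    """Extra yearly cost of the higher-draw CPU for the given usage."""
    extra_watts = high_watts - low_watts
    extra_kwh = extra_watts * hours_per_day * 365.25 / 1000
    return extra_kwh * dollars_per_kwh

# Example: 3 hours of surfing per day at $0.12/kWh.
print(extra_cost_per_year(15, 25, 3, 0.12))  # ~1.31 dollars/year
```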
hope this helps -
Except those assumptions aren't correct. You can easily buy a $30-40 Kill-A-Watt and measure the power consumption directly.
-
I also thought he was comparing a dual core to a quad core.
I'd say there might be a 5 watt difference between the two in lower clocked modes.
If you really want to know exactly, Notebookjournal.de publishes the real power consumption of many notebooks. -
I thought he was comparing one of the extreme CPUs, with ThrottleStop used to unlock the TDP, to a regular 45W quad.
-
At 12 cents per kilowatt-hour (around what they charge here, last I checked), a watt in constant use is about $1.05 per year:
24 [hours/day] × 365.25 [days/year] × 1/1000 [kW/W] × 0.12 [dollars/kWh] = 1.05 [dollars/watt-year]
If your price is a little higher... at 16 cents per kWh, that becomes $1.40 per watt-year.
So if you leave the thing on 24/7, and it uses 7 more watts, that'll cost around $7 to $10 extra per year. If you only have it on 3 hours a day on average, then you're talking about something like $1 per year or so.
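For reference, here is the same math as a short Python sketch so you can substitute your own rate and hours (the 7-watt difference and the 12/16 cent rates are just the examples from this post):

```python
# Dollars per year for one watt of continuous (24/7) draw.
def dollars_per_watt_year(dollars_per_kwh):
    return 24 * 365.25 / 1000 * dollars_per_kwh

# Yearly cost of `watts` of extra draw for `hours_per_day` of daily use.
def yearly_cost(watts, hours_per_day, dollars_per_kwh):
    return watts * (hours_per_day / 24) * dollars_per_watt_year(dollars_per_kwh)

print(dollars_per_watt_year(0.12))  # ~1.05
print(dollars_per_watt_year(0.16))  # ~1.40
print(yearly_cost(7, 24, 0.12))     # ~7.36  (7 extra watts, 24/7)
print(yearly_cost(7, 3, 0.12))      # ~0.92  (7 extra watts, 3 h/day)
```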
Of course, there may also be other benefits to the lower-TDP CPU (less fan noise, less environmental impact, etc.) -
-
When I made my post above, I figured the 45W TDP CPU was a non-extreme quad notebook model and the 73W TDP was a desktop of some sort. To come up with the 15W I used HWiNFO32, which has a feature that can display the power consumption of the CPU. I don't know how accurate it is, but the results I got seemed very reasonable using my engineering judgment. This is the only tool I know of that can approximate CPU power use, as it is virtually impossible to measure otherwise. On my laptop with a 920XM rated at 55W TDP, I measure between 20-25W while surfing the net. So in reality 15W is probably a bit low; conversely, I probably should have rated the desktop CPU at more like 30W. For the sake of argument, a 10W difference allowed an easy build-up for figuring out power use. As it was getting late for me, I did not go through the complete steps to calculate yearly use (someone else did, and from a quick scan it looks right). So I will stand by my assumptions, knowing the real numbers are probably a bit higher than my estimates but accurate enough for this discussion. People are giving way too much credit to the efficiency of a processor in a somewhat idle state. Also, a watt meter only gives the system-level power use, not the CPU's, which is what the OP was asking about.
Edit: As swarmers' calculation shows, it's not much per year in actual dollar savings. However, if 10 million processors were sold worldwide in a year and each CPU saved 10W, that adds up to a significant impact on the amount of electricity that needs to be generated. Something to think about for sure.
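To put a rough number on that aggregate effect, here is a back-of-the-envelope scale-up (my own figures, assuming, unrealistically, that every machine ran 24/7):

```python
# Scale-up: 10 million CPUs each saving 10 W.
cpus = 10_000_000
watts_each = 10
total_mw = cpus * watts_each / 1e6         # 100 MW of continuous demand
gwh_year = total_mw * 24 * 365.25 / 1000   # ~877 GWh/year if run 24/7
print(total_mw, gwh_year)
```
-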
-
My determination of power use is based on the readings from HWiNFO32, as I said in my last post. I also stated I have no idea what the accuracy is, but using my judgment it seems about right. (Just so you know, my judgment is not irrelevant: I have been a practicing professional electronics engineer for 25+ years, with a lot of micro experience over the past 15 years. I am not spewing crap here.) My CPU, an i7 920XM with a TDP rating of 55W, shows power use in the range of 20-25W while surfing the net. Since the 45W TDP CPU the OP was talking about would probably be an i7 720/740/820/840 quad, I was giving it the benefit of the doubt that it would be lower by maybe 5W when performing the same task. You cannot compare your laptop to a machine with one of these i7 quads; these things use a lot more power.
-
The method I used in HWiNFO32 for power consumption monitoring of Nehalem/Westmere CPUs is based on the official Intel algorithm, so I assume it should be accurate.
-
Mumak, you must be the author of HWiNFO32. If so, good job; that is some nice software.
Based on Mumak's comments about how HWiNFO32 comes up with the CPU power figures, I would say the conversation is over: my numbers stand. -
Also, sgogeta4 is correct that the difference would not be so great between two systems with "everything else being equal." Assuming the systems are identical and the processors are of the same family and stepping, with the same power-saving features, we should be looking at possibly a 1-2 watt difference at idle, if there is any at all. Under low load and with all power-saving features enabled, both systems would use similar power as well. At high loads the processor with the higher TDP will use more power, but it will also be doing a lot more work, though not as efficiently on a work-unit-per-watt basis (assuming the higher TDP is there because it is a much higher-clocking model and not complete marketing BS). -
As I said in my previous few posts, until the OP posts the CPUs that are being compared, none of us can accurately compare power consumption. I am just basing my reasoning on what the OP has stated (whether or not that could be wrong is another story). Again, TDP isn't directly related to a CPU's actual power consumption. I am not calling your judgement irrelevant, but I'm trying to say that we don't have enough information to apply our comparisons to the OP's question. I might not have the experience you have in the field, but I also graduated and worked as an electrical engineer specializing in control systems.
-
Okay, a couple of things:
I looked through the spec sheet and found the numbers for idle power for the i7 920 and the 720/820. The 920 is spec'd at 20W max at 50°C and the others at 18W. This is a max, with no typical or minimum given. The state of the CPU is halted, so it would be a little less than idle (how much is anybody's guess). So it makes sense that the values I read from HWiNFO32 are right in the range of 20-25W at idle.
One thing I came away with is that Intel has a serious handle on the power used by these CPUs in order to handle the Turbo Boost features, so the algorithms that determine power use are probably pretty accurate. As far as the accuracy of HWiNFO32 is concerned, I have no reason to doubt it's pretty darn close, and it is obvious from what the software does that it takes a pretty high level of competence to come up with it in the first place. I would suspect the author performed due diligence in making sure the algorithms followed Intel's. Unless Intel has some kind of internal software that could be used to verify this, I just don't see any way to prove one way or another just how accurate it is.
So in the end we would need to know which CPU the OP had in mind for the 73W TDP to get a better idea of its power draw at idle, so we can compare it with the 45W TDP model. My post is still probably close enough to show that the total yearly cost is just not that much to worry about from an individual's point of view.