Does a better processor and a better video card == less battery life?
Like, does a 1.8GHz processor use less power than a 2.0GHz one? (Both Core 2 Duo.)
And will an NVIDIA 8400 use less than an 8600?
-
Yes, a higher-clocked component will generally draw more power, although the difference between a 1.8GHz CPU and a 2.0GHz one is minimal and you probably won't see any real-world difference. I'm not too sure about video cards, though.
-
Oh yeah, and can you shut off one core or something to save battery life? (I think I read that somewhere, or is that on the new Intel Pro chip?)
-
Yes, the new Core 2 Duos are very good about using only the power they actually need for the task at hand. Because of that, the 1.8GHz and 2.0GHz processors will provide about the same battery life for everyday tasks.
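As an aside, if you ever want to watch that downclocking happen on a Linux notebook, the kernel's cpufreq interface exposes the current and allowed clock speeds through sysfs. A minimal sketch (standard cpufreq paths; assumes a Linux system with cpufreq enabled):

```python
# Minimal sketch: watch SpeedStep-style downclocking via Linux cpufreq sysfs.
# Assumes a Linux system with the cpufreq subsystem enabled (standard paths).

def read(path):
    with open(path) as f:
        return f.read().strip()

base = "/sys/devices/system/cpu/cpu0/cpufreq"
cur_mhz = int(read(f"{base}/scaling_cur_freq")) / 1000  # sysfs reports kHz
print(f"current : {cur_mhz:.0f} MHz")
print(f"range   : {read(f'{base}/scaling_min_freq')} - {read(f'{base}/scaling_max_freq')} kHz")
# Idle at the desktop you'll typically see the minimum clock; under load,
# the maximum. That's the "only use the power you need" behaviour in action.
```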
-
No, better performance does not necessarily imply less battery life.
It is often the case, but there are plenty of exceptions. (The Pentium 4 ate plenty of power but was ridiculously slow compared to the competition, and the GeForce 7 generally used less power than the GeForce 6.)
However, if the hardware is identical, then yes, higher performance will usually require more power and thus decrease battery life. (So in your example, yes, a 2.0GHz CPU will *usually* eat more power than a 1.8GHz one, if they're physically identical.)
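For a feel of why, the usual first-order model for CMOS dynamic power is P ≈ C·V²·f: power scales linearly with clock and with the square of core voltage. A quick sketch (the voltages below are illustrative assumptions, not real Core 2 Duo specs):

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# The capacitance and voltages below are illustrative assumptions only.

def dynamic_power(capacitance, voltage, freq_ghz):
    """Relative dynamic power draw, arbitrary units."""
    return capacitance * voltage**2 * freq_ghz

# Same physical chip at two clock speeds; assume the 2.0GHz bin needs a
# slightly higher core voltage (common, but not universal).
p_18 = dynamic_power(1.0, 1.20, 1.8)
p_20 = dynamic_power(1.0, 1.25, 2.0)

print(f"2.0GHz draws ~{(p_20 / p_18 - 1):.0%} more than 1.8GHz at full load")
# -> ~21% more while fully loaded; at idle both downclock and the gap shrinks.
```
-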
Alright, so now I know about the processor; how about the video cards? Will a higher-performance video card eat more power than a lower-performance one?
-
The ability to reduce power consumption will depend on chip design and software. Nvidia make big claims about the power-saving features of their new series of GPUs. I read a review of a notebook with an 8600GT which ran for less than 1.5 hours under full load but lasted about 5.5 hours under light load, which would tend to confirm that the GPU power-saving features are working. However, I can't recommend ATI: I've got an X1700 which guzzles power when doing nothing.
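Rough numbers on that, assuming something like a 57Wh battery (my assumption; the review didn't state the capacity):

```python
# Back-of-the-envelope check on those runtimes. The battery capacity is
# an assumption; the hours are the figures from the review.
battery_wh = 57.0

for label, hours in [("full load ", 1.5), ("light load", 5.5)]:
    print(f"{label}: ~{battery_wh / hours:.0f} W average system draw")
# -> ~38W vs ~10W: throttling the GPU (and CPU) back cuts total draw
#    to roughly a quarter under light use.
```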
John -
Just like CPUs, higher speeds, more pipelines, and a more complex GPU on a video card require more power. Thus an 8600 will consume more than an 8400. Now, this difference might not be much at all (~2-3%), but it's undoubtedly there.
Also, that's at full speed. If software downclocks your GPU to save power, then I would think both cards can sit at the same idle speeds and thus draw roughly identical amounts of power.
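To put some toy numbers on that (every wattage below is a made-up placeholder for illustration, not a real spec for either card):

```python
# Toy model: battery life with each card at idle clocks vs. full load.
# Every wattage here is a made-up placeholder, not an NVIDIA spec.
battery_wh = 57.0      # assumed battery capacity
base_system_w = 15.0   # assumed draw of CPU + screen + rest of the notebook

gpu_watts = {
    "8400": {"idle": 4.0, "load": 15.0},   # hypothetical figures
    "8600": {"idle": 4.5, "load": 22.0},   # hypothetical figures
}

for card, w in gpu_watts.items():
    idle_h = battery_wh / (base_system_w + w["idle"])
    load_h = battery_wh / (base_system_w + w["load"])
    print(f"{card}: ~{idle_h:.1f}h light use, ~{load_h:.1f}h full load")
# -> 8400: ~3.0h / ~1.9h; 8600: ~2.9h / ~1.5h.
# At idle clocks the two are nearly identical; under load the gap shows.
```
-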
Thanks for all the help, guys. I figured the one with more power would consume more power under full load, but I wasn't really sure about when they aren't (like word processing or something).
So the conclusion is that it depends on the workload (full load = yes, it consumes more power, but not much more; not full load = they consume roughly equal amounts).
I was just trying to decide between the NVIDIA 8400 and the NVIDIA 8600 cards and was curious about the processors. Is this PowerMizer software that NVIDIA has developed only usable under Windows, or can you use it under Ubuntu? Because I need to have Ubuntu on my system too.