Why is the power input on laptops DC 19V and the battery 12V? Why not just stick with 12V?
MM -
moral hazard Notebook Nobel Laureate
I guess to charge faster?
Also to be able to charge while still using the notebook? -
The higher the voltage, the lower the amperage.
If you had a laptop which had a 96 watt power consumption, one would need to pass 8 amps of current through the laptop at 12V.
If the manufacturer used 24V, only 4 amps would be needed to go through the laptop.
The lower the current going through, the better for the electronics. Too much current can overload many computer components; they simply cannot tolerate it.
K-TRON -
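K-TRON's arithmetic above can be sketched in a few lines of Python: for a fixed power draw, current scales inversely with supply voltage (I = P / V). The 96 W figure is the one quoted in the post; the 19 V line is just the typical adapter voltage from the thread title.

```python
def current_draw(power_w: float, voltage_v: float) -> float:
    """Current in amps needed to deliver power_w at voltage_v (I = P / V)."""
    return power_w / voltage_v

print(current_draw(96, 12))  # 8.0 A
print(current_draw(96, 24))  # 4.0 A
print(current_draw(96, 19))  # ~5.05 A, at a typical 19 V adapter voltage
```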
TL;DR: I think batteries would have a higher voltage if it were efficient.
I'm going to go out on a limb here and guess it's because it's slightly easier to step voltages downward rather than upward. Why it's specifically 19V, someone else can answer, but it's quite easy to get rid of voltage in a circuit.
I can however guess a little better about the batteries:
Lithium-ion (Li-ion) batteries are the most common batteries in laptops today, and have a nominal voltage of 3.7V per cell. This can fluctuate between 2.7V on a full discharge and 4.2V on a full charge, respectively.
You'll notice most laptop battery voltages are therefore some multiple of 3.7V (3 × 3.7 = 11.1V and 4 × 3.7 = 14.8V), which is why we have 6-, 9-, and 12-cell packs (three cells in series, with two, three, or four groups in parallel) or 4-, 8-, and 12-cell packs (four cells in series).
I am going to further guess and say it'd be more convenient to have more voltage in the batteries since we have 19-20V adapters usually. But now think of the charged and discharged voltages. For example, an 11.1V battery can fluctuate between 8.1V and 12.6V. That's a decent range to be compensating for.
Now imagine if we had a 5-cell battery that put out a nominal 18.5V. That would mean charged and discharged voltages between 13.5V and 21V, which is a 35% fluctuation to compensate for. Any voltage compensation ends up losing some overall energy, so it's most efficient to keep the voltages low while still workable.
Sorry I got carried away :/ -
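The pack arithmetic in the post above can be sketched as a quick helper, assuming the quoted per-cell figures (2.7 V empty, 3.7 V nominal, 4.2 V full):

```python
# Per-cell Li-ion voltages as quoted in the post above.
EMPTY, NOMINAL, FULL = 2.7, 3.7, 4.2

def pack_range(series_cells: int):
    """Return (empty, nominal, full) pack voltages for N cells in series."""
    return tuple(round(series_cells * v, 1) for v in (EMPTY, NOMINAL, FULL))

print(pack_range(3))  # (8.1, 11.1, 12.6) -> the "11.1 V" pack
print(pack_range(5))  # (13.5, 18.5, 21.0) -> the hypothetical 5-cell pack
```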
Please enlighten me as well, I consider K-TRON to be one of the people with the most hardware know-how around here. -
moral hazard -
One of the reasons is that the voltage level from the AC adapter has to be higher than the battery's voltage level, even when the computer is consuming the maximum amount of power possible.
Also the adapter is running at *about* 19V. But it is not a perfect flat-line voltage that is always 19V. There is going to be a ripple voltage, and as small as these adapters are I would not be surprised to see a significant amount of ripple. Engineers have to make sure that even the lowest voltage possible (highest ripple and high power demands from the laptop) is higher than the maximum voltage level of the battery. If not, then the battery would be periodically discharging while the adapter is plugged in. Obviously not a good thing.
Plus, the battery itself does not always output 12V. The 12V rating is nominal: the pack is designed to stay within a usable range even when it is almost entirely drained, otherwise the internal power circuitry on the laptop wouldn't function (voltage regulators need to run within specified voltage ranges). I've seen my fair share of laptop batteries that output 14-15V at full capacity.
Put the two together, and you start to get the idea why the adapter has to be at that high of a voltage.
And yes, it sometimes does have to do with current capacities. Some components will burn out when current is high enough, or perhaps the "wires" on the circuit boards will heat up too much and/or blow. It is also more efficient to transmit power at a higher voltage and lower current, that means less power is lost during transmission. I do not think that is the main reason that engineers choose specific voltage levels, but it *could* be one of them. -
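The headroom argument in the post above can be sketched as a simple check. The numbers below are illustrative assumptions, not measured values: the adapter's worst-case low voltage (nominal minus half the peak-to-peak ripple, under load) has to stay above the battery's worst-case high voltage, or the battery would periodically discharge into the adapter rail while plugged in.

```python
def adapter_has_headroom(v_nominal: float, v_ripple_pp: float,
                         v_batt_max: float) -> bool:
    """True if the adapter's ripple trough still exceeds the battery's peak."""
    v_adapter_min = v_nominal - v_ripple_pp / 2  # trough of the ripple
    return v_adapter_min > v_batt_max

# A 19 V adapter with an assumed 1 V peak-to-peak ripple vs. a "12 V"
# (3-cell) pack that reads 12.6 V at full charge:
print(adapter_has_headroom(19.0, 1.0, 12.6))  # True
# A 13 V adapter with the same ripple would not be safe:
print(adapter_has_headroom(13.0, 1.0, 12.6))  # False
```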
iGrim,
I love your comments
If you were right, how come appliances that use a ton of power, like washing machines and dryers, run at 220V, 480V, or 600V?
If you were right, diesel locomotives would have 1,200-horsepower engines powering a 12V, 42,500-amp generator instead of their normal 600V, 850-amp generators.
Try finding wire which can handle 42,500 amps. It would be a meter thick.
I am not claiming to be 100% right, but it's common sense that you increase voltage to decrease the amperage needed. This is true for appliances, car starters, laptops, etc.
K-TRON -
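K-TRON's locomotive comparison can be sketched numerically: for the same delivered power through the same conductor resistance, conduction loss is I²R, so raising the voltage slashes the loss. The 0.01-ohm resistance below is an assumed illustrative figure, not a real locomotive spec; the power figure comes from the 600 V × 850 A generator quoted above.

```python
def i2r_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Conduction loss in watts for delivering power_w at voltage_v
    through a conductor of the given resistance (loss = I^2 * R)."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

P = 600 * 850  # 510 kW, the generator rating quoted above
print(i2r_loss(P, 600, 0.01))  # 850 A    -> ~7.2 kW lost
print(i2r_loss(P, 12, 0.01))   # 42,500 A -> ~18 MW lost, i.e. absurd
```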
iGrim is no longer with us, so I think we can tone it down a notch now. He isn't coming back either.
Now back to the topic at hand...
(Oh, and if anyone is curious...yes, I do have a degree in Electrical Engineering)
Oops, now let's get back to the topic...
If anyone is curious, start looking up the current carrying capacity of various gauges of wire... -
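Following that suggestion, here is a rough lookup sketch. The ampacity figures are approximate NEC-style 60°C ratings for copper wire and are assumptions for illustration only; real ratings depend on insulation type, bundling, and installation.

```python
# Approximate ampacity (amps) for common copper wire gauges (AWG).
# Illustrative figures only; consult real code tables for actual work.
AMPACITY_AWG = {14: 15, 12: 20, 10: 30, 8: 40, 6: 55}

def smallest_gauge_for(amps: float):
    """Thinnest AWG wire from the table that can carry `amps`.
    A larger AWG number means a thinner wire; returns None if nothing fits."""
    ok = [gauge for gauge, rating in AMPACITY_AWG.items() if rating >= amps]
    return max(ok) if ok else None

print(smallest_gauge_for(8))   # 14 -> a 96 W laptop at 12 V (8 A) needs thin wire
print(smallest_gauge_for(50))  # 6  -> 50 A already needs much heavier wire
```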
I believe that they do choose high voltages and low currents because this increases efficiency in transmission. I learnt this in school when they taught how power is transmitted in the electricity grid, and I assume it's the same here. Also, like greg said, high currents can damage electrical and electronic components, so adapters and batteries typically try to use higher voltage. My own 90W adapter gives an output of 19.4V and 4.77A.
-
Thanks K-TRON and others (minus iGrim) for clearing that up. Much appreciated.
MM -
It just boggles my mind that someone posts up something they don't even have a clue about and everyone sides with them even though there was opposition to the completely incorrect conclusion. -
Power plug / battery voltage question
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Moosemilk, Oct 3, 2009.