Since I couldn't fit this info into my own guide, I'm posting it here. I hope it'll be useful for you.
TDP (Thermal Design Power) is the maximum amount of power the cooling system is required to dissipate. This means we can treat the TDP as the maximum GPU power consumption. Of course, the GPU does not reach its maximum power draw except under heavy gaming, but TDP can still serve as a measure of the GPU's power consumption.
The GeForce Go 7000 series:
Model / Process / TDP
7950 GTX / 90nm / 45W
7900 GTX / 90nm / 45W
7900 GS / 90nm / 20W
7800 GTX / 110nm / 66W
7800 / 110nm / 35W
7600 / 90nm / 16W
7400 / 90nm / 10W
7300 / 90nm / 7W
Mobility Radeon X1000 series:
Model / Process / TDP
X1800 / 90nm / 35W?
X1600 / 90nm / 17W
IGP (Integrated Graphics Processor) solutions:
Model / Process / TDP
Nvidia GeForce Go 6150 / 90nm (IGP), 130nm (chipset) / 5.6W (GPU only), incl. chipset: 7.5W
ATi Radeon X1150 / 110nm / <10W (GPU + chipset)
Any help will be appreciated.
Thanks to miner for the IGP GPU figures.
Sources :
http://www.laptoplogic.com/news/detail.php?id=541
http://www.laptoplogic.com/news/detail.php?id=693
http://laptoplogic.com/news/detail.php?id=775
http://hardocp.com/article.html?art=MTE5NCwyLCwxNjA=
http://www.laptoplogic.com/news/detail.php?id=586
http://www.laptoplogic.com/resources/detail.php?id=45&page=1
EDIT: The TDP part of the guide has been integrated by Meaker. I hope it helps people compare GPUs even better. (The fabrication process part has not been integrated.)
For Meaker's guide, see here: http://forum.notebookreview.com/showthread.php?t=70254
-
17W for an X1600 sounds believable... if only it were less while gaming.
-
Useful thread, but TDP has two definitions: the one you posted and the "Intel one". Intel, in fact, defines TDP as average consumption, so their values are virtually useless.
Anyway, a lot of figures seem reliable, but the 20W TDP of the Go 7900 seems optimistic to me.
In this test (http://www.pcper.com/article.php?aid=293&type=expert&pid=11), the same system with a 7900 GS has a total system consumption only 12W lower than the one equipped with the 7900 GTX (which should have a 45W TDP, so in theory 25W higher than the GS). -
No, it's the common one. The common definition says TDP is the maximum power that the thermal solution should be able to dissipate. (See Wikipedia.)
Intel definition:
For the GeForce Go 7900 GS, go to the first page:
http://www.pcper.com/article.php?aid=293&type=expert&pid=2 -
Well, for AMD it's the maximum amount of power drawn.
I know that the official figures say 20W for the 7900 GS, but it's really unlikely that this consumption is calculated in the same way as the X1600 or 7600 figures, given that the maximum power consumption (at full load) of a comparable system with a 7900 GS is at least 15W higher than my PC's, for example (80W vs 65W).
Maybe 20W is the consumption when you're watching a movie...
-
I don't think so.
3DMark06 is more CPU-dependent than 3DMark05, so I think the 7900 GTX might not be fully utilized; the ACPI settings, VRAM frequency, and some other factors might also be at play. Anyway, open your favorite battery manager and monitor the power consumption: it goes up and down. So never try to compare the power consumption of two internal parts by judging them by the overall system power consumption.
Another example of an unfair and incomplete comparison is the benchmark posted on AMD's site. It compares a 2.0GHz Turion X2 with a 2.0GHz Core Duo, but the AMD machine is equipped with a GeForce Go 7600, and the two GPUs' power draw is not equal. AMD concludes that the AMD machine is slightly better, but that is not down to the CPU alone; the GPU contributes too. So, all in all, we cannot compare machines this easily using their overall power. -
You're right. But under certain conditions it could give you some useful information...
Check these two tests: exactly the same notebook (same CPU, same DVD drive, same screen; only the HDD differs, and it's 4200rpm in the 7900 GS one, so that machine should draw slightly less), but one with a 7600 and the other with a 7900 GS. Look at the consumption data (at the end of the page, under "Battery runtime")... More than 20W of difference at full load, even though the TDP figures in the table above (20W vs 16W) would predict a gap of only about 4W.
http://www.notebookcheck.net/Review-Toshiba-Satellite-P100-191.1303.0.html
http://www.notebookcheck.net/Review-Toshiba-Satellite-P100-102.1291.0.html -
Dunno, but it's better to stay "official" or we'll be making enemies for ourselves for no known reason...
-
heh. Very helpful. Should be a sticky.
-
Couple of problems with using TDP for this.
1: As said, not all companies measure TDP the same way.
2: TDP does not accurately depict heat dissipation. For example, AMD uses the same TDP for almost all their CPUs, despite them having very different heat dissipation characteristics. For convenience, they just stick the same TDP on all of them, so heatsink manufacturers only have one number to worry about, even if most of the chips can only output, say, 70% of that figure as heat. As you said yourself, TDP is the maximum amount of power the heatsink *should* be able to dissipate. That isn't the same as the maximum amount of power the chip *can* dissipate (which is usually lower, and in a few (Intel) cases, higher).
3: heat dissipated != power consumption
So no, I'd say we can definitely *not* "assume that the TDP is equal to the maximum GPU power consumption".
True, but if you're careful, it's still a better method than going by TDP.
Of course, to test properly, all other components should be as fast as possible, so that bottlenecks are (mostly) eliminated.
Then we can be reasonably sure that we're not holding the card back by using a too slow CPU.
It might not be a 100% reliable method, but it's better than TDP, which scores a big zero in accuracy, simply because it measures the wrong thing. It only measures "what the guys compiling data for heatsink manufacturers found convenient to come up with". And if that wasn't bad enough, the data these people are given to come up with numbers from is *also* the wrong thing. (heat dissipation vs power consumption) -
Um... yes it is.
What else happens to power drawn by any integrated circuit but to end up dissipated as heat? It's not emitting light or sound, storing chemical potential, or performing mechanical work. -
Also, a chip spends power only when it's switching between 0 and 1, so thermodynamically the power consumption is almost equal to the heat dissipated.
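As a rough sketch of where that power goes (the textbook CMOS dynamic-power relation, added here for illustration; the symbols are not from the thread):
P_{dyn} \approx \alpha \cdot C \cdot V^{2} \cdot f
where \alpha is the switching activity factor, C the switched capacitance, V the supply voltage, and f the clock frequency. Add a static leakage term for the total, and essentially all of it ends up as heat.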
-
It's also sending electrical signals on to other components, which aren't emitted as heat (at least not by the GPU; sooner or later it obviously does turn into heat, but the GPU isn't responsible for that).
To be honest, I've no clue how big a ratio is emitted as heat, but I agree it's probably most of it. But still not all.
(That said, my reasons 1 and 2 are the main problems with using TDP) -
The TDP is more critical for laptops, so I think we can safely assume that both numbers are realistic. In fact, for laptops you can go and read my sources. One of them says "ATI always targets 35 watts for the high-end mobility GPUs."
Point 2 is almost the same as point 3...
(Note that the GPU does not send wattage out; the power it uses goes into switching between 0 and 1. The data is transmitted in pulses, and making the pulses (the transitions) is what consumes power, not anything else.) -
Updated the data a bit.
-
Well, I don't think so.
And I don't see why the TDP is any more critical for laptops. Of course, on laptops actual power consumption is a much more important factor than on desktops, but that still doesn't imply that TDP should be any more closely related to it.
Which means what, exactly? That they target a TDP of 35W? According to whose definition of TDP? And what does that have to do with actual power consumption?
-
When you are gaming [with up-to-date titles], the GPU is under stress, which means it's at its peak power consumption. This is important for gamers, who are the main people reading this guide.
If you consider the GPU as a thermodynamic system, the input is the power the chip uses to calculate and to switch the pins from high to low and back again [and, if you insist, the electrical energy that carries the input data], and the output is the electrical energy that carries the output data. Since the link is full duplex, the input and output data are transferred at the same time [almost; your nitpicking is really annoying...]. So over a given time interval we can say, on average and statistically, that the electrical energy carrying the input and output data cancels out. We also know that at some point the heat generation and the heat dissipation (TDP) become equal, and since we established that heat generation equals power consumption: when the GPU is under stress, TDP = power consumption.
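To restate that energy balance compactly (my formalization of the argument above, not the poster's own notation): in steady state,
P_{in} = P_{heat} + (P_{sig,out} - P_{sig,in}) \approx P_{heat}
so if the cooling system is sized to dissipate exactly the TDP at full load, then under stress P_{in} \approx \text{TDP}. Whether TDP is really sized that tightly is exactly what gets disputed further down the thread. -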
What about power consumption when PowerPlay/PowerMizer are used...
-
Well, PowerPlay reduces the clock to 1/3 of the maximum, so it should make a difference. But the power consumption depends much more on GPU utilization (as with the CPU). I haven't seen any numbers on that subject, though; any will be welcome.
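A back-of-the-envelope estimate of what that 1/3 clock could mean, using the dynamic-power relation sketched earlier (my assumption; the thread gives no figures for PowerPlay's voltage behavior): at a fixed voltage, P_{dyn} \propto f, so dropping to f/3 cuts dynamic power to roughly one third. If the core voltage is lowered as well, the savings scale with V^{2}; for example, f/3 combined with a 20% voltage drop gives about (1/3)(0.8)^{2} \approx 0.21 of the original dynamic power.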
-
Some more info on IGPs:
Nvidia GeForce Go 6150
TDP: 5.6W (GPU only), incl. chipset: 7.5W
Manufacturing process: 90nm (IGP), 130nm (chipset)
ATi Radeon X1150
TDP: <10W (GPU + chipset)
Manufacturing process: 110nm
Source: http://www.laptoplogic.com/resources/detail.php?id=45&page=1 -
Meaker@Sager Company Representative
Think of a chip as a variable resistor: since it technically does no useful work (it's not emitting light or anything like that which gets used elsewhere), all power = heat.
-
:cry: :cry: 66W! ouch...
-
Meaker@Sager Company Representative
That's peak power; it rarely hits this. -
Nope, we do not know that TDP and heat generation are ever equal. There is no reason for them to be. We know that TDP is a number produced by the manufacturer, and that if the heatsink can dissipate at least as much heat as the TDP indicates, the card will not overheat. In other words, we know that TDP is an upper bound on heat output from the GPU. But the keyword here is "upper bound".
The TDP does not correspond to the max power consumption or heat dissipation. A TDP of 200W is perfectly valid for a GPU that generates 3W of heat at peak. -
But when production costs come in, the companies will do whatever they can to reduce them, and a high TDP certainly means more money spent on the cooling system. Also, if ATI and NVIDIA post numbers that are too high, they will take a hit in sales. If the number is too low, the manufacturers and the users will be unhappy.
-
What a good discussion and source! Especially for me, as I'm in the middle of related research.
Can this TDP be assumed to be the operating power consumption? And can 7W really reach 90-100°C? I'm using a laptop with this IGP and the temperature reaches 80-90°C even at idle.
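For what it's worth, here is a rough way to see how a 7W part can still run that hot (my illustration; the thermal resistance value is an assumption, not a measured figure for this chipset). The temperature rise is approximately
\Delta T \approx P \cdot \theta_{JA}
so 7W through a junction-to-ambient thermal resistance of 10 °C/W would give a ~70°C rise over ambient, i.e. around 95-100°C inside a 25-30°C chassis. High idle temperatures therefore point at a poor cooling path (a high \theta_{JA}) rather than necessarily at a wrong TDP figure.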