^^ What?? The 670 has a 150W TDP, right? That is 6W more than the 7870 and about 100W less than the 7970.
-
-
Another thing is that the power consumption is HIGHER than the TDP, as discussed and concluded earlier in this thread.
Third, the TDP looks pretty much like 150W, or at least pretty close.
Here is a GPU-Z shot of the GTX 680, which is a 195W card:
1536 shaders @ 1006-1059 MHz
Here is a GPU-Z shot of the GTX 670:
1344 shaders @ 915-994 MHz
-
^^ So you trust their performance numbers, but you do not trust their power consumption numbers?
That's amusing.
If you read their test method, they take peak power as the load figure, not the average power usage across all load tests, which should of course be lower. As I have said earlier, Nvidia calculates their TDP differently. Their cards' TDP is generally lower than their AMD counterparts', but Nvidia cards actually peak higher than AMD cards in power usage.
Peak power usage doesn't matter much for a desktop PC, since PSUs can briefly deliver quite a bit more than their rated wattage. That is probably not the case on notebooks; I don't think a GPU peaking much over 100W would be healthy for a notebook.
-
Power consumption is not the same thing as TDP, and I will never believe the chart you are citing. So you seriously believe that the 680 consumes more than the 7970?
-
So it's average power consumption, not the peak load consumption shown on that chart of theirs.
And if you had read my last post, you wouldn't ask a silly question like this. -
According to Anandtech, the 680 consumes 362W while the 7970 consumes 391W. Hmmm, I wonder who we should trust.
This is partly why I don't trust that power chart from the 670 review at all. So I'm sorry, I will await more reviews before coming to a conclusion.
After all, they opened the review by discussing how everyone hates Nvidia. Real unbiased review right there.
As shown, the specs comparing the 680 and the 670 should put the 670 at about a 150W TDP imo. And as discussed earlier in this thread, I saw nearly all Nvidia GPUs having higher power consumption than their TDP. So no, power consumption does NOT equal TDP. But I'm not discussing that all over again. -
OK. After looking at TPU's GTX 680 power consumption numbers, it looks like that power chart was indeed a very bad example to take numbers from.
NVIDIA GeForce GTX 680 Kepler 2 GB Review | techPowerUp
Those are likely the most accurate numbers of them all.
I'm not going to argue about whether TDP = power consumption or not. As I have said many times, AMD and NV just arrive at their numbers differently, so there is no single fact here.
In theory TDP = power consumption, since heat generation = power consumption and TDP = the heat dissipation needed. Simple as that. -
I would like to see some GTX 660M full-load power consumption numbers at last. They could come in handy for predicting GTX 680M numbers.
-
From the Clevo forums... an MSI 680M apparently?
Probs fake. -
The easiest way to explain TDP:
Heat output is measured in watts. TDP is the heat output: a device with a TDP of 10 means it outputs 10 watts of heat that need to be removed (typically by a cooling system). TDP is NOT how much power a device consumes.
TDP and power consumption are not 'directly' linked.
TDP can be thought of as a figure for heat expressed in watts. It's how much power (read: heat power) a given device (GPU) is "expected" to give off.
How "expected" is calculated varies from manufacturer to manufacturer.
For example, AMD gives TDP based on the maximum possible LOAD (NOT power usage) of the device. If it's a GPU, they include everything under 100% load, including any "self powered" fans. So an AMD GPU with a TDP of 10 means it will never give off more than 10 watts of heat under 100% load with STOCK settings.
Intel does NOT calculate it the same way. Intel calculates TDP based on "normal load" (note: they do NOT define this).
An Intel device with a TDP of 10 is NOT NECESSARILY 10 watts of heat output under load.
Snapdragon processors for example (mobile chips) have a TDP of .5 to 1.5 watts. This does NOT mean their power consumption is .5 of a watt.
Because of the differences in "measurement techniques", you CANNOT compare the TDP of brand/chip/line X to Y or Z.
People confuse TDP and power consumption because heat is a byproduct of inefficient power consumption. Inefficient power consumption produces heat; more inefficient power consumption produces more heat, ergo people think higher power consumption = heat = TDP.
It's an easy mistake to make, but it's a mistake nonetheless.
Obviously I simplified things, but in the end the information above is correct.
TDP is NOT power consumption.
- TDP is heat output measured in watts.
- Heat output is the amount of heat dissipation needed to prevent overheating.
- TDP is measured differently by each manufacturer.
- The definition of what TDP is does NOT change.
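Purely to illustrate the "measured differently" point, here's a minimal sketch with invented numbers (the 130 W and 95 W figures are made up, not from any real chip):

```python
# Hypothetical chip with made-up heat figures, only to show why
# cross-vendor TDP comparisons can mislead.
peak_load_heat_w = 130.0     # heat at absolute 100% load (invented value)
typical_load_heat_w = 95.0   # heat under an undefined "normal" load (invented value)

amd_style_tdp = peak_load_heat_w        # rated at maximum possible load
intel_style_tdp = typical_load_heat_w   # rated at "normal load", whatever that means

# Same silicon, two very different spec-sheet numbers:
print(amd_style_tdp, intel_style_tdp)   # 130.0 95.0
```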
-
deleted OT.
-
And even if it were, different companies will likely publish different numbers for TDP, or modify or pad them a bit "to be safe".
TDP is TDP. Power consumption is power consumption. They are different things and should not be compared as if they are equal.
edit: well it seems like you deleted too late -
I'm just not in the mood to continue this.
I have just one question I would like to get answered.
Where does this consumed power go if it does not turn into heat? -
We are not talking about a toaster.
Try reading here and you may start to understand a bit better-
x Watts to run = x Watts of heat?
Only a portion (you can call it unused, leaked, inefficiently cycled, etc.) of the energy used is given off as heat. That heat is easier to understand, manage, and deal with by measuring it in watts. Watts are not a measurement of electricity; they are a measurement of power (energy per unit of time).
The ratio of leaked vs. used energy is referred to as efficiency.
Electric motors can be 50% efficient, or 90% efficient.
A 500 watt EV with 50% efficiency can give off 250 watts of heat, or a TDP of 250 (again, I am simplifying just to get the concept across).
If it gives off 500 watts of heat, it will produce no movement, because none of the energy is being used "efficiently".
the average human body gives off ~76 watts during sleep, ~582 watts during intense physical work.
TL;DR:
TDP = watts released as heat (watts released as heat = watts not used to do things: lost power, leaked power, etc.)
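To put that motor example in numbers, a minimal sketch (the 500 W and 50%/90% figures are just the illustration values from above):

```python
def heat_output_watts(input_power_w: float, efficiency: float) -> float:
    """Waste heat = the share of input power not converted into useful work."""
    return input_power_w * (1.0 - efficiency)

# The electric motor example from above:
print(heat_output_watts(500, 0.50))  # 250 W of heat at 50% efficiency
print(heat_output_watts(500, 0.90))  # ~50 W of heat at 90% efficiency
print(heat_output_watts(500, 0.00))  # 500 W of heat, no useful movement at all
```
-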
Meaker@Sager Company Representative
You are forgetting that electronic chips do no useful work in a mechanical sense, therefore Heat = watts consumed at the wall.
Yes, a tiny, TINY fraction of that power will produce some noise and light from the display, and the fan moves a bit of air, but as an engineer I can tell you those are virtually negligible.
Also, those uses are pretty constant, so if you compare an idle state to a load state you are directly looking at the extra heat generated by that load.
BUT:
THE POWER CONSUMED BY A SILICON CHIP IS EQUAL TO THE HEAT IT WILL PRODUCE.
THIS IS FACT. IF YOU ARGUE WITH IT, YOU ARE WRONG AND NEED TO GO READ A BOOK.
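Following that idle-vs-load comparison, a minimal sketch (the wattage readings are made-up placeholders, not measurements from any review):

```python
def extra_load_heat_watts(idle_power_w: float, load_power_w: float) -> float:
    """Since a chip's consumed power ends up as heat, the idle-to-load power delta
    approximates the extra heat generated by that load."""
    return load_power_w - idle_power_w

# Hypothetical whole-system readings at the wall (placeholder numbers only):
print(extra_load_heat_watts(idle_power_w=90, load_power_w=280))  # 190 W of extra heat
```
-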
That being said, again oversimplifying it:
Energy in > 0s and 1s that become the information of what to display > signal up to the monitor > displayed as light energy > light energy gets out into the world and is absorbed by eyes, etc. > eyes convert light energy into signals, signals go to the brain > brain uses energy to show you a mental picture, etc.
Keep in mind, absolutely NOTHING is 100% efficient; energy is "lost" in every "conversion". And by lost, I mean it turns into heat, or cold, or laser beams, or chocolate-filled hamburger hot dogs that appear in the 10th dimension, etc.
Basically: energy doesn't die, it just changes forms. I always tell students to think of a lawn mower: the energy from the dinosaurs became gas, gas is burned in the lawn mower, it loses some energy as heat, and it uses some to spin a blade and cut the grass. (I know, I know... it's extremely simplified, but hey, it gets the concept across!)
Not trying to be a , but you would greatly benefit from reading up on how electricity works; it will explain things a lot better than we would be able to.
How Does Electricity Work? | EasyApplianceParts
Most importantly, though, TDP will always be TDP; however, the way TDP is measured throughout the chip industry is faulty, full of marketing BS at best and outright lies at worst.
A good way to guess at a TDP for yourself is:
(current temp − ambient temp) × 0.52752793 = heat watts being "wasted"
For example, my CPU wastes ~25 watts under light load, close to its TDP of 35.
Rough estimates, though.
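A minimal sketch of that rule of thumb (the 0.52752793 factor is just my rough constant, and the 70 C / 22 C temperatures below are invented example values):

```python
def estimated_heat_watts(current_temp_c: float, ambient_temp_c: float,
                         factor: float = 0.52752793) -> float:
    """Rough rule-of-thumb estimate of the heat being 'wasted', per the formula above."""
    return (current_temp_c - ambient_temp_c) * factor

# Example roughly matching the ~25 W / 35 W TDP CPU mentioned above,
# assuming ~70 C under light load in a ~22 C room (illustrative values only).
print(round(estimated_heat_watts(70, 22), 1))  # ~25.3 W
```
-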
+rep for you for getting my point.
We are not talking about mechanical instruments here like some may have thought.
-
LOL, this discussion was funny. Forget about GPUs acting like GPUs; no sir, they are electric heaters. Everything is output as heat, nothing is actually consumed. 28nm GPUs are as hot as 55nm GPUs. Heat-wise, an Nvidia is identical to an AMD. GCN, Kepler, VLIW, Fermi, they are all the same. The official specs from Nvidia and AMD are all a lie. The TDP must be, since the power consumption never matches it. They say they tested the GPUs at stock, but they must have secretly overclocked them, since the power consumption is many times higher than the TDP. All the reviews you have seen are a lie.
Everything is a lie. -
And as I said, TDP is a number put out by the manufacturer and doesn't necessarily represent the exact power consumption and heat output. Unless AMD and Nvidia are just really good at designing chips that always have TDPs at a multiple of 25 or 10. -
I'm willing to bet that if you compare a GPU from 1998 to one from 2012, there have been some improvements in the designs. Say that feeding the chip 1W back in '98 gave you a factor of 0.95 as heat output; that factor is lower today...
-
:yes:
Instead of making a constructive argument, you constantly choose to make posts like this.
OK people, enough arguing; we are going a bit off topic.
I work at Micron Italy, memory design division.
It works like this:
Voltage is applied to the GPU and power is distributed across its circuitry and then to other components inside the notebook.
What manufacturers call TDP (at least at an engineering level) is the power figure in watts needed to design the most efficient cooler for that GPU.
Heat is produced only from leakage currents flowing on the chip (EVERY chip produced to date has them).
No leakage currents = no internal heat dissipation (meaning 100% energy transfer, impossible in this world); the only temperature the chip would have would be ambient.
The reason newer architectures produce less heat is optimisations in silicon production (lower leakage currents).
With the same number of transistors, two chips with different architectures would produce different amounts of internal heat.
Given a certain TDP limit (meaning heat is dissipated efficiently only up to a specific power level; past that, the chip overheats), manufacturers try to squeeze in as much processing power as possible.
That is why Intel stopped its GHz race and decided to focus on efficiency and IPS/W, because they were reaching a point where power consumption and heat generation would grow out of control over time.
Regarding power consumption: unless you work on the design team at Nvidia or AMD, calculating the max power consumption of the chip is difficult because there are lots of factors to include.
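If anyone wants a rough starting point anyway, here is a minimal first-order sketch of the textbook CMOS power model (dynamic switching plus static leakage); every value below is an invented placeholder, not a real Nvidia or AMD figure:

```python
def chip_power_watts(switched_capacitance_f: float, voltage_v: float,
                     frequency_hz: float, activity_factor: float,
                     leakage_current_a: float) -> float:
    """First-order CMOS power estimate: dynamic (a*C*V^2*f) plus static leakage (V*I_leak).
    Essentially all of this power ends up as heat inside the chip."""
    dynamic = activity_factor * switched_capacitance_f * voltage_v ** 2 * frequency_hz
    static = voltage_v * leakage_current_a
    return dynamic + static

# Invented placeholder values only (not real GPU parameters):
print(chip_power_watts(switched_capacitance_f=2e-7,  # 200 nF effective switched capacitance
                       voltage_v=1.0,
                       frequency_hz=1.0e9,           # 1 GHz
                       activity_factor=0.5,
                       leakage_current_a=20.0))      # 20 A of leakage at 1.0 V -> ~120 W total
```
-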
Thank you for clearing it up, felix. Like I said earlier, for those who understood it without resorting to personal attacks (no names mentioned): Fermi, Kepler and other architectures have improved and therefore improved TDP. Thankfully we actually have improved over the years, lol, who would have thought.
And I'm pretty sure we don't have 100 percent leakage, lol. -
felix3650, if heat is produced only from leakage, then where does the rest of the consumed energy go? What form does it take?
---
Yes, Cloudfire, you were very constructive indeed. Well done bud. -
In the form of better GPU performance maybe...?
I wonder, though, if it's the lack of efficient architectures that causes the leakage? How efficient are GPUs now? 95%? 90%? -
Meaker@Sager Company Representative
-
Meaker@Sager Company Representative
I have yet to come across a unit of processing energy lol.
You need to understand that a microchip does NO mechanically useful work.
The TDP is the design point for the heatsink to ensure functions continue in spec. -
Now that I'm home with my laptop again, I had a look at the TechPowerUp review of the GTX 680.
According to TweakTown's review of the GTX 670, the GTX 680 consumes 450W while the 7970 consumes 389W. That is 61W more for the GTX 680.
Looking at a review with exact power consumption readings, from TechPowerUp, the GTX 680 consumes a maximum of 228W while the 7970 consumes a maximum of 270W. So according to this, which I trust 100x more, the 7970 consumes 42W more; the 680 does not consume 61W more.
Now I'm curious to see what the real power consumption of the 670 is, since the 680 is 228W max. Could it be 170W max for the GTX 670? If so, then we might see a mobile GPU out of this.
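Just as a back-of-the-envelope check on that guess, here is a minimal sketch scaling the 680's measured maximum by the shader counts and boost clocks quoted earlier in the thread (a very rough first-order estimate, not a measurement):

```python
def scaled_power_estimate(reference_power_w: float,
                          ref_shaders: int, ref_clock_mhz: float,
                          target_shaders: int, target_clock_mhz: float) -> float:
    """Very rough first-order scaling of GPU power by shader count and clock speed.
    Ignores voltage, binning, memory and board-level differences."""
    return reference_power_w * (target_shaders / ref_shaders) * (target_clock_mhz / ref_clock_mhz)

# GTX 680: 1536 shaders @ ~1059 MHz boost, ~228 W max (TechPowerUp figure above).
# GTX 670: 1344 shaders @ ~994 MHz boost.
print(round(scaled_power_estimate(228, 1536, 1059, 1344, 994)))  # ~187 W
```
-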
I am really enjoying this thread, especially as there is no single 680M to be actually compared to, regardless of the thread's title. Good fanboy flame at both ends, though
53 'default view' pages, well done :thumbsup:
-
@felix3650
Yes, there are a lot of factors to include, but you can make an educated guess. If there's one thing you learn in any engineering course in college, it's that you can always ignore 2nd order, 3rd order effects, etc. and approximate to a 1st order and still be within 20% of what you would get if you included everything. That kind of error might seem like a lot, but it really isn't. -
What a marvelous GPU world we could be living in soon. Free energy, nothing is ever consumed -
According to recent leaks (thanks GeoCake, +rep btw), the GTX 680M is identical to the GK104 die, which means we have three different alternatives for what our 680M could be:
GTX 660 Ti
GTX 670
GTX 680
Which one will it be?
-
I can rule one out easily for you, Cloud: the 680.
Still crossing my fingers for a 670-based chip. Please hear us, Nvidia, and pull off a miracle. -
Yeah, I think they won't cut that in half or something when they can easily go with the other two and get better results.
Fingers crossed here as well -
Please please please PLEASE no more bickering about all these theories. It's gone from interesting speculation to a physics debate where everyone and no one is an expert.
-
Meaker@Sager Company Representative
2 Core phases and 1 memory phase.... oh dear, that's a bit... ummm.... not good.
-
We don't even know if it's a GTX 680M in the MSI picture. It could be the first version of it, which has now perhaps been reworked to overcome the 7970M?
Judging by the desktop cards, I see no reason why the GTX 680M should not beat the 7970M, but having seen that the GTX 660M (rated 75W TDP) is way weaker than the 7870M (rated 36W TDP), I have obvious doubts.
-------
There should be only two energy forms that apply to GPUs: electrical energy and thermal energy. That's it. -
Meaker@Sager Company Representative
-
Enough already...
-
From the Chinese forum that leaked the GTX 680M picture
-
OH PLEASE!!! With my 7970M dead (most likely) I need a new GPU! (I just cannot pay twice for the same card
)
or an M18x (please no!)
-
Either way, would it be fair to assume that there is no downside to picking up a 7970M now?
Even though the 680M will most likely outperform the 7970M (releasing an inferior-performing chip at an inflated price would be ridiculous... hi 675M), given recent AMD vs Nvidia flagship GPU trends it would most likely come at a reasonable price hike (+$200 or more). -
-
I think the 7970M is a great decision, especially since it is great value + top-notch performance, and even if the 680M beats it (who knows), it will still be the top-of-the-line AMD GPU for some time. I mean, the 6990M was not a bad choice last summer, so why should the 7970M be?
-
-
How well can a single 7970M card run Battlefield 3 or Crysis 1 on max settings at max resolution?
-
-
-
SINGLE CARD!!! -
My main argument was 7970M now (end of May) vs 680M later (possibly many months) at an increased price (following previous Nvidia vs AMD flagship trends).
AMD 7970m vs GTX 680m
Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.