If the graphics card is stronger, does that mean more heat will be produced and shorter battery life? Is there a chart somewhere that shows the comparison for all the cards?
-
Yes, stronger graphics cards require more energy, and thus generate more heat and draw more power from the battery.
-
Of course, or else most notebooks out there would have very high-end graphics.
-
When both the stronger GPU and the weaker one are idle, will the difference in battery life remain the same, or will it decrease?
-
It just depends on whether or not it's being used. At idle, I believe roughly the same amount of power will be used by both GPUs (high-end and low-end).
When just browsing the web, chatting, or word processing, there won't be a big strain on either GPU, so power consumption will be about the same, for the most part.
I may be wrong on this. If you're looking for battery life, then get an integrated GPU, because it beats a dedicated one in power saving whether the PC is idle or not. -
This holds true only for video cards within the same generation and on the same fabrication process. An 8600M GT is more powerful than a Go 6800 Ultra, but produces far less heat.
When the card is idle at the desktop, it depends on how the chip turns off or reduces the speed of some of its parts. -
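A rough sketch of why that matters: dynamic power scales roughly with clock speed and with voltage squared (P ≈ C·V²·f), so dropping clocks and voltage at idle cuts power a lot. The numbers below are made-up placeholders, not specs for any real card:

```python
# Toy model of idle power savings from downclocking/undervolting.
# Dynamic power scales roughly as P ~ C * V^2 * f.
# All numbers below are invented placeholders, not real GPU specs.

def scaled_power(load_watts, clock_ratio, voltage_ratio):
    """Estimate power after reducing clock and voltage, assuming P ~ f * V^2."""
    return load_watts * clock_ratio * voltage_ratio ** 2

full_load = 35.0  # hypothetical high-end mobile GPU under load, in watts
idle = scaled_power(full_load, clock_ratio=0.3, voltage_ratio=0.8)
print(f"Estimated idle draw with clocks at 30% and voltage at 80%: {idle:.1f} W")
```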
I hope more high-end notebooks will copy the hybrid graphics system used in the Sony SZ.
Sigh... -
I do believe that newer cards of the same power, or even stronger ones, will run cooler than the older ones and might use less power. But I'm not sure, that's why I'm asking. You know, better technology.
-
Let's take two scenarios.
Say you're the lead GPU guy at NVidia. Now:
A: You take your old GeForce 5xxx from 3 years ago and overclock the hell out of it so it becomes fast enough to compete with today's chips.
That's going to burn a hole in your laptop pretty quickly. The GF5 was hot when it launched, and we'd need to speed it up by a factor of 8 or so.
B: You design a new chip from scratch, with the goal of reaching "8 times the performance of GF5" while only producing reasonable amounts of heat.
If you follow strategy A, you're going to see more and more heat for every new card you release. There's just nothing to do about that. That's also why an 8800 Ultra requires more power than an 8800 GTS. It's the same chip, just overclocked.
Option B is much more interesting. Of course transistors get smaller every year, which gives us more freedom in designing the chip because we can afford to put more transistors on it. Perhaps running a GeForce 5 at 3 GHz isn't the most efficient option. Perhaps it'd be better to develop a chip that has 8 times as many shader units. Or perhaps we can afford to spend some transistors now on logic to disable unused parts of the chip. Or...
So more graphics power doesn't *have* to mean more heat. And it doesn't always. GeForce 7 used less power than GeForce 6. GeForce 8 uses a bit more power than 7, but is twice as fast, so it still delivers more performance per watt.
And honestly, GPUs just can't afford to get much more power-hungry. NVidia has already started getting its power consumption under control (from the GF6 onwards). ATI is starting to get the idea now, I believe.
And of course, everyone's darling, the Core 2 CPU uses far less power than a Pentium 4, despite being *a lot* faster.
So basically, if nothing else changes, then yes, higher speed means more heat.
If the entire chip is changed, all bets are off. It might mean more heat, it might not. -
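To put some toy numbers on the performance-per-watt point above (these wattages and frame rates are invented for illustration, not measurements of any real card):

```python
# Toy performance-per-watt comparison; every number here is an illustrative
# assumption, not a benchmark or datasheet value.
cards = {
    "older card": {"fps": 40.0, "watts": 45.0},
    "newer card": {"fps": 80.0, "watts": 55.0},  # a bit more power, twice the speed
}

for name, c in cards.items():
    print(f"{name}: {c['fps']:.0f} fps at {c['watts']:.0f} W "
          f"= {c['fps'] / c['watts']:.2f} fps per watt")
```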
Sorry for the notebook-noob question, but is there any way to scale back the GPU or switch it over to the onboard graphics chip* (manually or automatically, either by a hardware switch, a control panel, or a utility) if you're not doing anything graphically intensive? If the answer is specific to certain laptop models, graphics card manufacturers, or OSes, which ones provide this kind of functionality?
*Assuming there's still an onboard graphics chip in machines with discrete graphics cards. Not sure that's even true.