A friend told me that you shouldn't play too many games on laptops that have discrete graphics because it will reduce the processor's life span. It generates more heat and strain on the processor, which eventually causes it to burn itself out.
Is this true?
I researched this with no solid answer anywhere. I can find a lot of info on integrated graphics but hardly any for discrete graphics.
-
-
moral hazard Notebook Nobel Laureate
You can play games as much as you want, the notebook will turn off before any real damage is caused by the heat.
Sure, if you have an older notebook with the faulty G86 or G84 Nvidia GPUs, then gaming would cause an early death. But for any non-faulty notebook it doesn't matter how long you game for. -
Well, kind of, because all processors deteriorate over time: the more they work, the more they deteriorate, they need more power, and so on.
But that means it could run for 1000 years at idle, and if you hammer it 100% of the time it might only last 100 years under full load at manufacturer-specified conditions.
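To put a very rough number on that idle-versus-full-load gap, reliability engineers often model thermally driven wear with an Arrhenius acceleration factor. The sketch below is purely an illustration: the activation energy and the two temperatures are made-up placeholders, not measurements from any particular chip.

```python
import math

# Rough sketch of the Arrhenius acceleration factor used in reliability
# engineering: how much faster a thermally driven wear-out mechanism runs
# at a hotter junction temperature. All numbers are illustrative only.
K_BOLTZMANN_EV = 8.617e-5    # Boltzmann constant in eV/K
ACTIVATION_ENERGY_EV = 0.7   # hypothetical activation energy for the mechanism

def acceleration_factor(t_idle_c: float, t_load_c: float) -> float:
    """Ratio of degradation rates between a hot (load) and a cool (idle) chip."""
    t_idle_k = t_idle_c + 273.15
    t_load_k = t_load_c + 273.15
    return math.exp((ACTIVATION_ENERGY_EV / K_BOLTZMANN_EV)
                    * (1.0 / t_idle_k - 1.0 / t_load_k))

# Example: a CPU idling at 45 °C versus gaming at 85 °C.
print(f"Wear runs roughly {acceleration_factor(45, 85):.0f}x faster under load")
```

Even "many times faster" than a multi-decade idle lifetime still usually outlasts the rest of the laptop, which is the point the posts here keep coming back to.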
For example, 5 years ago I had an E6600 CPU that could run at 3.8 GHz 24/7, and that was its stable maximum, but after 3 years it couldn't stay stable under prolonged load without a further voltage increase, so I had to keep it at 3.5 GHz to stay stable at sensible voltages. And that's still almost a 50% boost over the stock frequency. At manufacturer-specified settings it still worked like it did on day one, but this is an illustration of CPU degradation.
If I hadn't overclocked it so much, it would still work the same today as when it left the factory, no matter how much torture I put it through, as long as I stayed well inside Intel's specs, and it would keep doing so for many more years, but I willingly gave that up so I could have a 60% faster CPU.
In other words, because you (mostly) can't overclock your laptop CPU, it will outlive its own usefulness, let alone the laptop it's embedded in, no matter what you do to it. -
Short and basic answer - No.
Your "friend" is full of something other than knowledge. -
H.A.L. 9000 Occam's Chainsaw
BUT...
It takes a LONG time for that to happen. A LONG, LONG time. -
The answer to the OP's question is simple: no, playing on integrated/discrete graphics isn't bad for your computer, AT ALL. Just play to your heart's content; your friend has no idea how a computer works. -
Well, a drastic overclock simply speeds up the process a lot, and if my CPU managed to stay in reasonable condition after that much stress for such a long time, a stock CPU or any other chip by itself is going to outlive all the other components inside a notebook.
And I don't believe that was overcomplicating things, as the OP asked if it would hurt it eventually, and that is IMHO a very long period of time -
H.A.L. 9000 Occam's Chainsaw
-
-
TheBluePill Notebook Nobel Laureate
It never ceases to amaze me, the myths people dredge up to make using technology difficult.
We need a forum tack with misinformation info, like: leaving your battery in is OK, or using your CPU/GPU won't harm it (at stock speeds), or any of the other nutty things people come up with. -
Meaker@Sager Company Representative
If you don't game on it, the transistors will never be used and will go to waste. The likelihood is that the CPU transistors will fail before the GPU ones.
-
TheBluePill Notebook Nobel Laureate
Some more food for thought:
AnandTech - Intel's 45nm Dual-Core E8500: The Best Just Got Better
Even at 100% utilization, you can expect to get many Years of life out of a typical CPU or GPU.
At average use, you can get Decades of use out of it.
Only when you start Voltage Modding do things start to degrade quickly. -
I'm surprised that nobody here has mentioned the cooling system, which is critical to keeping any component from overheating and shutting down the computer.
Laptop manufacturers are looking to cut costs in any way they can, and one way is to use aluminum (cheaper) heat pipes and sinks instead of pure copper (more expensive), using less material, cheaper and fewer fans, etc.
So if you have a gaming laptop with high-performance parts (which is advertised) and a low-quality cooling system (which is never advertised), then the answer to your question is yes; the components will face a much shorter lifespan under constant high-stress usage, and they will often fail prematurely.
On the other hand, manufacturers who focus specifically on making gaming laptops generally do well in this area by including robust cooling systems: Sager, Alienware, and MSI are examples of such brands.
EDIT: I read your post too quickly and misunderstood your question. No, a more powerful GPU will not place greater stress on the CPU unless (again) the cooling system is pushed beyond its limits when the system is stressed. -
Short answer: yes and no.
Long answer:
If the GPU/CPU repeatedly comes within 5-10 °C (sometimes not even that close) of TJmax for extended periods of time, then even though the part may never reach its critical temperature, this will over time cause the part to fail prematurely. The failure often stems not from the die itself so much as from a weakening of the solder, which is a problem with the new RoHS lead-free solder. Over time, the solder becomes brittle, much the way a piece of baking clay will if you heat it and let it cool repeatedly: over time, the clay becomes brittle and prone to cracking/breaking. There is also the threat of permanent failure of capacitors and resistors that come too close for too long to their maximum operating temperature.
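If it helps to picture the solder-fatigue part, thermal-cycling life is often described with a Coffin-Manson-style power law: the bigger the temperature swing on each heat-up/cool-down cycle, the fewer cycles the joint survives. The constants in this sketch are made up purely to show the trend, not real lead-free solder data.

```python
# Illustrative sketch only: a Coffin-Manson-style power law relating the size
# of the temperature swing per idle/load cycle to the number of cycles a
# solder joint survives. Constants are hypothetical, chosen to show the trend
# (bigger swings => far fewer cycles to failure), not real solder data.
CYCLES_CONSTANT = 1.0e7   # hypothetical scale factor
FATIGUE_EXPONENT = 2.5    # hypothetical fatigue exponent

def cycles_to_failure(delta_t_c: float) -> float:
    """Estimated thermal cycles a joint survives for a given swing in °C."""
    return CYCLES_CONSTANT * delta_t_c ** (-FATIGUE_EXPONENT)

for swing in (20, 40, 60):  # idle-to-load temperature swings in °C
    print(f"ΔT = {swing:>2} °C  ->  ~{cycles_to_failure(swing):,.0f} cycles")
```

The exact numbers vary hugely by package and alloy; the point is only that big, repeated temperature swings do the damage, not any single hot gaming session.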
On the other side, if the parts are kept cool, for example by reapplying quality TIMs (Thermal Interface Materials), you reduce the likelihood of premature failure due to heat. The use of shims in many instances also helps eliminate low clamp pressure caused by poor heatsink design from some manufacturers. Think of clamp pressure like this: if you have the world's best heatsink held onto a CPU with a rubber band, the heatsink is not being forced close enough to the CPU to be effective, regardless of the TIM you use. If, however, you put a large copper shim between the heatsink and the CPU, the rubber band gets tighter and decreases the amount of space between the CPU and heatsink, causing the TIM to be spread out more evenly and thus become more effective.
The best way to see whether you need a shim due to improper clamp pressure is to apply fresh TIM between the heatsink and the chip to be cooled, tighten the heatsink all the way down, and then remove the heatsink. If the TIM is all globbed (technical term, I know) on the chip and the heatsink, and you cannot see anywhere that the TIM has been spread so thin that you can nearly see the heatsink or chip under it, then you probably need a shim. If, however, you remove the heatsink and can see toward the center of the TIM that it has been squished (another tech term) to the point that you can nearly see the heatsink or chip through it, then you have good clamp pressure and just need to remove the old TIM, reapply, reassemble, and be done.
The layman's answer always comes down to this, whether it be a laptop, desktop, phone, or any other electronic device: as long as the parts stay cool, it's safe. If, however, it gets too hot, too often, for too long, then it will most certainly fail prematurely. -
----
To the OP's question: chips usually last a really long time, unless there's a defect of some sort and/or the chip isn't used as it's supposed to be. And even then, some just keep working.
play your games and be happy -
As far as I know, what I said is not incorrect at all. If you are referring to metal pipes being superior to rods then you'd be correct, but I wasn't talking about that. -
Yes, aluminum fins are often used to reduce cost; however, if they are sized properly, they will get the job done just fine. Both my Asus laptops have copper (or some copper alloy) heat pipes with aluminum fins, and they run quite cool.
Anyway, we're going off topic here. The OP got his answer: as long as the cooling is adequate, there is absolutely no harm in playing games on the Intel integrated graphics. -
Meaker@Sager Company Representative
Aluminium is lighter and gives out heat better than copper (though absorbs it slower) so I can see them being a good combination with heat pipes.
-
TheBluePill Notebook Nobel Laureate
-
Thermal conductivity of aluminum at near-ambient temps: ~240 W/(m·K)
Thermal conductivity of copper at near-ambient temps: ~400 W/(m·K)
Convection coefficients are independent of the metal used.
I don't see where you got the info that aluminum gives out heat better than copper; copper is a much better heat conductor than aluminum, and that's what matters. The choice of aluminum is purely cost- and/or weight-driven, but it gets the job done. -
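If anyone wants to sanity-check the conduction argument, Fourier's law (Q = k·A·ΔT/L) with those two conductivity values is enough. The bar geometry and temperature difference below are made-up numbers purely for comparison, not the dimensions of any real heat pipe or fin.

```python
# Quick check of the conduction argument using Fourier's law, Q = k * A * dT / L.
# Geometry is a hypothetical cross-section; conductivities are standard
# near-ambient values for the two metals.
CONDUCTIVITY_W_PER_M_K = {"copper": 400.0, "aluminum": 240.0}

def conducted_watts(metal: str, area_m2: float, length_m: float, delta_t_c: float) -> float:
    """Steady-state heat conducted through a bar of the given metal."""
    return CONDUCTIVITY_W_PER_M_K[metal] * area_m2 * delta_t_c / length_m

# Hypothetical 10 mm x 2 mm cross-section, 50 mm long, 30 °C end-to-end difference.
for metal in ("copper", "aluminum"):
    watts = conducted_watts(metal, area_m2=10e-3 * 2e-3, length_m=50e-3, delta_t_c=30)
    print(f"{metal:>8}: {watts:.1f} W")
```

Same geometry, same temperature difference: the copper bar simply moves more heat, which is why conductivity is the number that matters for the pipes.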
Hmmm, not really, I think. In my opinion it is the other way around. The reason is that shared graphics would be the bigger worry if you played games on it (well, first things first, you won't be able to play many games on shared graphics anyway), whereas discrete graphics cards have their own cooling fan, and there is no restriction on how long you can play games on them. Obviously your laptop will start to heat up if you play for a very, very long time, but it's still always better than shared graphics.
-
-
The HDX18 has one fan but two pipes (one covers the CPU + northbridge, the other runs over the GPU and its own RAM), although what I find amusing is that the heatsink for the GPU is smaller than the one for the CPU. The fins are aluminum, but it works well enough. One pipe for both (especially for higher-end cards) seems quite silly.
-
A good cooling solution would be two heatsinks (one each for the CPU and the GPU) at the left and right back corners of the laptop, sucking air from the front of the laptop through specifically placed openings so that the airflow cools down other laptop parts as well. The CPU and the GPU should each be placed as close to their heatsink as possible, connected with at least two heat pipes (if not one wide one). There are laptops with this design, though mostly if not exclusively high-end gaming ones. -
Yeah, the dual heatsink with dual heat pipes is mostly limited to high-end laptops. It does increase the weight and cost, but then if you use high-performance components it will already be expensive, and it's only needed for components with higher TDPs anyway.
I'd be curious to know how the engineers design their cooling solutions, because sometimes they can be terrible. Personally, I'd first make a simulation of the heatsink (and fan) based on the TDP of the components it's supposed to cool, oversize it a little to have a safety factor, and then test it in an actual prototype.
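For what it's worth, the first pass of that kind of simulation can be as simple as a lumped thermal-resistance budget from junction to ambient. Every resistance value and the TDP in this sketch are hypothetical placeholders, not figures from any real laptop.

```python
# First-pass heatsink sizing sketch: steady-state junction temperature from a
# lumped thermal-resistance chain, T_junction = T_ambient + power * sum(R).
# All numbers are hypothetical placeholders, not specs of any real laptop.
TDP_WATTS = 45.0        # assumed sustained package power
T_AMBIENT_C = 25.0
SAFETY_MARGIN = 1.2     # oversize a little, as suggested above

thermal_resistances_c_per_w = {
    "die to heat spreader": 0.15,
    "TIM": 0.10,
    "heat pipe + fin stack": 0.45,
    "fins to air (fan)": 0.40,
}

total_r = sum(thermal_resistances_c_per_w.values())
t_junction = T_AMBIENT_C + SAFETY_MARGIN * TDP_WATTS * total_r
print(f"Total junction-to-air resistance: {total_r:.2f} °C/W")
print(f"Estimated junction temperature with margin: {t_junction:.1f} °C")
# If this estimate lands above the chip's TJmax, the cooler needs lower resistance.
```

If the estimate clears TJmax with room to spare, the design moves on to a prototype; if not, the heat pipes, fin area, or fan have to improve before any metal is cut.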
Is it true that playing on integrated/discrete graphics is bad for the computer/processor?
Discussion in 'Hardware Components and Aftermarket Upgrades' started by richie1989, Mar 13, 2012.