The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Is it true that playing on integrated/discrete graphics is bad for the computer/processor?

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by richie1989, Mar 13, 2012.

  1. richie1989

    richie1989 Notebook Enthusiast

    Reputations:
    0
    Messages:
    27
    Likes Received:
    0
    Trophy Points:
    5
    A friend told me that you shouldn't play too many games on laptops that have discrete graphics because it will reduce the processor's life span. It generates more heat and strain on the processor, which causes it to burn itself out eventually.

    Is this true?

    I researched this and found no solid answer anywhere. I can find a lot of info on integrated graphics but hardly any for discrete graphics.
     
  2. moral hazard

    moral hazard Notebook Nobel Laureate

    Reputations:
    2,779
    Messages:
    7,957
    Likes Received:
    87
    Trophy Points:
    216
    You can play games as much as you want; the notebook will turn off before any real damage is caused by the heat.

    Sure, if you have an older notebook with the faulty G86 or G84 Nvidia GPUs, then gaming would cause an early death. But for any non-faulty notebook it doesn't matter how long you game.
     
  3. ivan_cro

    ivan_cro Notebook Consultant

    Reputations:
    23
    Messages:
    121
    Likes Received:
    0
    Trophy Points:
    30
    Well, kind of: all processors deteriorate over time, and the more they work, the more they deteriorate, need more voltage, and so on.

    But in practice that means a chip could run for 1000 years at idle, and if you hammer it 100% of the time it would still last something like 100 years under full load at the manufacturer's specified conditions.

    For example, five years ago I had an E6600 CPU that could run at 3.8 GHz 24/7, and that was its stable maximum. After three years it couldn't stay stable under prolonged load without a further voltage increase, so I had to keep it at 3.5 GHz to stay stable at nominal voltages. That's still almost a 50% boost over the stock frequency. At the manufacturer's specified settings it still worked as well as on day one, but it was an illustration of CPU degradation.

    If I hadn't overclocked it so much, it would still work today the same as when it left the factory, no matter how much torture I put it through within Intel's specs, and it would keep doing so for many more years. But I willingly gave that up so I could have a 60% faster CPU.

    In other words, since you (mostly) can't overclock a laptop CPU, it will outlive even its own usefulness, let alone the laptop it's embedded in, no matter what you do to it.
     
  4. Rishwin

    Rishwin Notebook Deity

    Reputations:
    215
    Messages:
    886
    Likes Received:
    0
    Trophy Points:
    30
    Short and basic answer - No.

    Your "friend" is full of something other than knowledge.
     
  5. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    Yes, electromigration causes a breakdown of the transistors over time.

    BUT...

    It takes a LONG time for that to happen. A LONG, LONG time.
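
    For reference, electromigration wear-out is commonly modeled with Black's equation; treat the form below as a textbook sketch rather than anything specific to a particular CPU:

        \mathrm{MTTF} = A \, J^{-n} \exp\!\left(\frac{E_a}{k_B T}\right)

    Here MTTF is the mean time to failure, J the current density, E_a an activation energy, k_B Boltzmann's constant, and T the absolute junction temperature. At stock clocks and sane temperatures the predicted lifetime is far longer than the useful life of the notebook.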
     
  6. lidowxx

    lidowxx Notebook Deity

    Reputations:
    169
    Messages:
    801
    Likes Received:
    0
    Trophy Points:
    30
    We are talking about whether gaming can damage the CPU, not about deterioration; everything in this universe deteriorates over time. You are over-complicating a simple matter. The OP didn't ask how overclocking/overvolting can affect the lifespan of a laptop.

    The answer to the OP's question is simple: no, playing on integrated/discrete graphics isn't bad for your computer, AT ALL. Just play to your heart's content; your friend has no idea how a computer works.
     
  7. ivan_cro

    ivan_cro Notebook Consultant

    Reputations:
    23
    Messages:
    121
    Likes Received:
    0
    Trophy Points:
    30
    Well, a drastic overclock simply speeds up the process a lot, and if my CPU managed to stay in reasonable condition after that much stress for such a long time, a stock CPU, or any other chip for that matter, is going to outlive all the other components inside a notebook.

    And I don't believe that was over-complicating, as the OP asked if it would hurt the CPU eventually, and "eventually" is IMHO a very long period of time :)
     
  8. H.A.L. 9000

    H.A.L. 9000 Occam's Chainsaw

    Reputations:
    6,415
    Messages:
    5,296
    Likes Received:
    552
    Trophy Points:
    281
    Yep, as long as it's just an overclock. If you're also overvolting, be prepared.
     
  9. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231
    Even with those faulty chips, my notebook from that era is still running strong (minus the GPU). The T8100 still runs the same way it did back in early 2008.
     
  10. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    It never ceases to amaze me, the myths people dredge up to make using technology difficult.

    We need a forum sticky debunking misinformation, with entries like: leaving your battery in is OK, using your CPU/GPU won't harm it (at stock speeds), and all the other nutty things people come up with.
     
  11. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    If you don't game on it, the transistors will never be used and will just go to waste. The likelihood is that the CPU transistors will fail before the GPU ones.
     
  12. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
  13. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    Yes and no.

    I'm surprised that nobody here has mentioned the cooling system, which is critical to keeping any component from overheating and shutting down the computer.

    Laptop manufacturers are looking to cut costs any way they can, and one way is to use aluminum (cheaper) heat pipes and sinks instead of pure copper (more expensive), using less material, cheaper and fewer fans, etc.

    So if you have a gaming laptop with high-performance parts (which is advertised) and a low-quality cooling system (which is never advertised), then the answer to your question is yes; the components will face a much shorter lifespan under constant high-stress usage, and they will often fail prematurely.

    On the other hand, manufacturers who focus specifically on making gaming laptops generally do well in this area by including robust cooling systems: Sager, Alienware, and MSI are examples of such brands.

    EDIT: I read your post too quickly and misunderstood your question. No, a more powerful GPU will not place greater stress on the CPU unless (again) the cooling system is pushed beyond its limits when the system is stressed.
     
  14. WARDOZER9

    WARDOZER9 Notebook Consultant

    Reputations:
    35
    Messages:
    282
    Likes Received:
    8
    Trophy Points:
    31
    Short answer yes and no.

    Long answer:

    If the GPU/CPU repeatedly comes within 5-10°C (sometimes not even that close) of its Tjmax for extended periods, then even though the part may never reach the critical temperature, over time this will cause it to fail prematurely. The failure often stems not from the die itself so much as from a weakening of the solder, which is a real problem with the newer RoHS lead-free solder. Over time the solder becomes brittle, much the way a piece of baking clay will if you heat it and let it cool repeatedly: eventually the clay becomes brittle and prone to cracking/breaking. There is also the threat of permanent failure of capacitors and resistors that spend too long too close to their maximum operating temperature.
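
    To put a rough model behind the solder-fatigue point, thermal-cycling life is often described with a Coffin-Manson style relation; treat the exponent below as a commonly quoted ballpark, not a measured value for any particular notebook:

        N_f \approx C \,(\Delta T)^{-m}, \qquad m \approx 2

    Here N_f is the number of heat-up/cool-down cycles a joint survives and \Delta T is the temperature swing per cycle, so bigger and more frequent temperature swings mean fewer cycles before the solder cracks.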

    On the other side, if the parts are kept cool, for example by reapplying quality TIM (thermal interface material), you reduce the likelihood of premature failure due to heat. The use of shims also helps in many instances to eliminate low clamp pressure caused by poor heatsink design from some manufacturers. Think of clamp pressure like this: if you have the world's best heatsink held onto a CPU with a rubber band, the heatsink is not being forced close enough to the CPU to be effective, regardless of the TIM you use. If, however, you put a large copper shim between the heatsink and the CPU, the rubber band gets tighter and decreases the amount of space between the CPU and the heatsink, spreading the TIM more evenly and making it more effective.

    The best way to see whether you need a shim due to improper clamp pressure is to apply fresh TIM between the heatsink and the chip to be cooled, tighten the heatsink all the way down, and then remove the heatsink. If the TIM is all globbed (technical term, I know) on the chip and the heatsink, and you cannot see anywhere that the TIM has been spread so thin that you can nearly see the heatsink or chip underneath, then you probably need a shim. If, however, you remove the heatsink and can see toward the center of the TIM that it has been squished (another technical term) to the point that you can nearly see the heatsink or chip through it, then you have good clamp pressure and just need to remove the old TIM, reapply, reassemble, and be done.


    The layman's answer always comes down to this, whether it's a laptop, desktop, phone, or any other electronic device: as long as the parts stay cool, it's safe. If it gets too hot, too often, for too long, then it will almost certainly fail prematurely.
     
  15. miro_gt

    miro_gt Notebook Deity

    Reputations:
    433
    Messages:
    1,748
    Likes Received:
    4
    Trophy Points:
    56
    My G86: I OCed that thing by over 50% and have been beating on it hard with gaming for 3.5 years and counting ... no problems :D


    That is not true; heat pipes transfer heat much faster than a solid copper rod does and are harder to manufacture, so the overall cost of a heat pipe can easily surpass that of a copper rod.

    ----

    To the OP's question: chips usually last a really long time, unless there's a defect of some sort and/or the chip isn't used the way it's supposed to be. And even then some just keep working :)

    Play your games and be happy.
     
  16. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    I merely stated that heatsink assemblies using aluminium are commonly used to cut costs instead of copper, even though the latter metal offers superior heat transfer.

    As far as I know, what I said is not incorrect at all. If you are referring to metal pipes being superior to rods then you'd be correct, but I wasn't talking about that.
     
  17. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Yes, aluminum fins are often used to reduce cost; however, if they are sized properly, they will get the job done just fine. Both my Asus laptops have copper (or some copper alloy) heatpipes with aluminum fins, and they run quite cool.

    Anyway, we're going off topic here. The OP got his answer: as long as the cooling is adequate, there is absolutely no harm in playing games on the Intel integrated graphics.
     
  18. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Aluminium is lighter and gives out heat better than copper (though absorbs it slower) so I can see them being a good combination with heat pipes.
     
  19. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Magnesium has about 2/3rds the conductivity of Aluminum. I wonder how it handles heat dissipation and absorption in comparison?
     
  20. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Thermal conductivity of copper at near ambient temps: ~380 W/(m·K)
    Thermal conductivity of aluminum at near ambient temps: ~240 W/(m·K)

    Convection coefficients are independent of the metal used.

    I don't see where you got the info that aluminum gives out heat better than copper; copper is a much better heat conductor than aluminum, and that's what matters. The choice of aluminum is purely cost and/or weight driven, though it gets the job done.
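
    Plugging those conductivities into Fourier's law, and assuming identical geometry and the same heat flow, an aluminum part needs roughly 1.6 times the temperature drop of a copper one:

        q = k A \frac{\Delta T}{L} \;\Rightarrow\; \frac{\Delta T_{\mathrm{Al}}}{\Delta T_{\mathrm{Cu}}} = \frac{k_{\mathrm{Cu}}}{k_{\mathrm{Al}}} \approx \frac{380}{240} \approx 1.6

    And since the convection coefficient doesn't depend on the metal, as noted above, properly sized aluminum fins can still get the job done.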
     
  21. techgadget52

    techgadget52 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    0
    Trophy Points:
    5
    Hmm, not really, I think. In my opinion it is the other way around. The reason is that if you are using shared graphics and playing games on it (well, first things first, you won't be able to play many games on shared graphics anyway), whereas discrete graphics cards have their own cooling fan, so there is no restriction on how long you can play games. Obviously your laptop will start to heat up if you play for a very, very long time, but it's still always better than shared graphics.
     
  22. WARDOZER9

    WARDOZER9 Notebook Consultant

    Reputations:
    35
    Messages:
    282
    Likes Received:
    8
    Trophy Points:
    31
    Only higher-end gaming laptops tend to have dedicated fans/heatpipes for discrete GPUs. Many still share the heatpipe(s) and fans of the CPU cooler. And believe it or not, integrated GFX tend to last far longer and run significantly cooler than discrete GFX, so when you can actually play something on your integrated IGP you're better off in terms of reliability and heat.
     
  23. Kuu

    Kuu That Quiet Person

    Reputations:
    765
    Messages:
    968
    Likes Received:
    18
    Trophy Points:
    31
    The HDX18 has one fan but two pipes (one covers the CPU + northbridge, the other runs over the GPU and its own RAM), although what I find amusing is that the heatsink for the GPU is smaller than the one for the CPU. The fins are aluminum, but it works well enough. One pipe for both (especially for higher-end cards) seems quite silly.
     
  24. miro_gt

    miro_gt Notebook Deity

    Reputations:
    433
    Messages:
    1,748
    Likes Received:
    4
    Trophy Points:
    56
    The part that sits on top of the chip itself is not the heatsink nowadays; it's just the contact plate from which the heat pipes start. The heatsink is the part that gives off the heat, which would be the part with the fan.

    A good cooling solution would be two heatsinks (one each for the CPU and the GPU) at the left and right back corners of the laptop, pulling air from the front of the laptop through specifically placed openings so that the airflow can cool other laptop parts as well. The CPU and GPU should each be placed as close to their heatsink as possible, connected with at least two heat pipes (if not one wide one). There are laptops with this design, though most if not all are high-end gaming ones.
     
  25. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Yeah, the dual heatsink with dual heatpipes is mostly limited to high-end laptops. It does increase the weight and cost, but if you use high-performance components the machine will already be expensive, and it's only needed for components with higher TDPs anyway.

    I'd be curious to know how the engineers design their cooling solutions, because sometimes they can be terrible. Personally, I'd first simulate the heatsink (and fan) based on the TDP of the components it's supposed to cool, oversize it a little to have a safety factor, and then test it in an actual prototype.
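
    As a very rough illustration of that first sizing step (not how any vendor actually does it), here is a minimal back-of-the-envelope check using the standard relation T_junction = T_ambient + P * R_theta; the TDP, ambient, thermal-resistance, and Tj,max numbers are assumed placeholders, not real part specs:

        # Back-of-the-envelope heatsink sizing check (all numbers are assumed examples).
        def junction_temp(power_w, r_theta_c_per_w, ambient_c=35.0):
            """Steady-state junction temperature: T_j = T_ambient + P * R_theta."""
            return ambient_c + power_w * r_theta_c_per_w

        tdp = 45.0       # W, assumed CPU TDP
        r_theta = 1.2    # C/W, assumed junction-to-ambient resistance of the cooler
        tj_max = 100.0   # C, assumed manufacturer limit

        tj = junction_temp(tdp, r_theta)
        print(f"Estimated T_j = {tj:.1f} C, margin to Tj_max = {tj_max - tj:.1f} C")

    A designer would want a healthy positive margin here (the safety factor mentioned above) before committing to an actual prototype.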