The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Are there any notebooks w/discrete gpu that DON'T run hot?

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by cognus, Sep 21, 2012.

  1. cognus

    cognus Notebook Deity

    Reputations:
    186
    Messages:
    877
    Likes Received:
    18
    Trophy Points:
    31
    :confused:
    I"m just wondering if any of you know of a notebook of modern vintage, with discrete/dedicated GPU [for real, not the 'tweener] that do not run "too hot"... that is, the kind of hot that will eventually damage the system unless mods are made. I don't know of any, so far...
     
  2. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    The Asus G series runs cool enough to sit on your lap 24/7.

    As for components, MSI, ASUS, Dell and Clevo systems all run cool enough to run 24/7 without mods.
     
  3. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Every 15" or bigger current gaming and mobile workstations i know of have no heat issues as long as you keep the thing free of dust.
     
  4. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    That is kind of a general question. All notebooks should "sufficiently" cool discrete graphics; the only exceptions would probably be those high-end consumer laptops with entry/mid-range discrete GPUs, like the Inspiron 17R SE, XPS 17, or Envy. Actual gaming laptops all have sufficient cooling; even my MSI barebones had excellent cooling, with 2 massive heatpipes for a 5730M and an i5.
     
  5. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ..this sounds a bit one-sided, I guess. But outside the ROG laptops, I've tested only one other "high end" laptop (see review in sig) that could keep moving more heat out of the chassis than the components produce.

    Not that other systems wouldn't cool the system enough to work, and work for a long time. And maybe work well for years with some maintenance, like ^said. But (at least with my luck) I wouldn't be comfortable using a system for gaming or heavy virtualisation or servers.. desktop or laptop.. that I knew would gather heat somewhere other than the heatsinks. Or where the heatsinks can and do reach a point where they stop transferring heat until you hit 90-95 degrees. That's just asking for something to fail, or for the cooling goop to dry out, and so on.. Open fans that blow the air down onto the mainboard as well, dust gathering in every possible corner -- I was a bit surprised, when I started looking for a new laptop now, that so many laptop designs (expensive ones as well) are built like desktop systems, just with the components much closer together and the GPU's heatsink without its own exhaust. Not completely sure that would get any design awards when the graphics cards draw around 60-100W..

    Not that I build laptops, so I don't really know if it actually is enough. But for what little it's worth, it's the kind of thing I would have avoided from the beginning if I did..
     
  6. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    I prefer dual heatsinks like in some gaming notebooks and the Dell Precision line: see the pics at the end of this review: http://forum.notebookreview.com/del...79326-dell-precision-m6700-owners-review.html. Dual fans, easy to get to and clean as well. Alienwares and Clevos also have that advantage.

    That said, a single fan with heatsinks made from the right materials (copper, yes please!) is perfectly fine too, it might generate more noise, but as long as the airflow is sufficient, it'll be alright.

    Something I noticed, though, is that manufacturers sometimes seem to be stingy with the heatpipes and seem to size their cooling with a minimalistic approach. I don't know if they are or not, but good engineering practice actually dictates that oversizing critical parts (slightly, no need to go nuts with oversizing) is the way to go. If you have a CPU with a TDP of 45W and a GPU with a TDP of 50W, make sure your cooling can handle more than that... (yeah, it sometimes seems like manufacturers forget that fact)
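    A rough back-of-the-envelope version of that sizing argument, with made-up numbers (the 45W/50W TDPs above plus an assumed ~20% safety factor), might look like this Python sketch:

    Code:
    # Rough thermal sizing sketch with hypothetical numbers, only meant to
    # illustrate the "oversize the cooling" argument above.
    cpu_tdp_w = 45.0       # CPU thermal design power (W)
    gpu_tdp_w = 50.0       # GPU thermal design power (W)
    safety_factor = 1.2    # assumed ~20% headroom for dust, aging paste, warm rooms

    total_heat_w = cpu_tdp_w + gpu_tdp_w
    required_cooling_w = total_heat_w * safety_factor

    print(f"Worst-case heat load: {total_heat_w:.0f} W")
    print(f"Cooling should handle at least: {required_cooling_w:.0f} W")
    # Worst-case heat load: 95 W
    # Cooling should handle at least: 114 W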
     
  7. PaKii94

    PaKii94 Notebook Virtuoso

    Reputations:
    211
    Messages:
    2,376
    Likes Received:
    4
    Trophy Points:
    56
    The newer Envys run cool also. With a max overclock, the highest the GPU got was 75°C.
     
  8. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    As many here already know, it's a compromise within the corporate structure: the engineer might want maximum cooling, but some guy in the accounting department wants to save as much money as possible and wants to cut corners if he can. The deciding vote gets to make the best compromise between the two.

    When you design and build a workstation like Dell or Clevo, you need to push it to the limit because that's how the folks that buy these products will use them. That's the difference between companies like John Deere who have a reputation of endurance and those like DeLorean that appear for a brief time, and disappear as quickly as they began. A great exterior, but no engine to match.

    Dell, once king of the PC, looked back for a brief moment and got passed twice. Hopefully they learned a good lesson and will not slow down but keep the momentum going this time around. Only time will tell.
     
  9. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Good point Krane, I'm pretty sure there is a decent amount of headroom taken into account with the gaming notebooks (most of them), workstations and "premium laptops", while the mainstream ones are mostly designed with low cost in mind and the cooling is sized to barely make it.

    However, when you design a notebook, the cooling should always be able to handle the maximum heat output of the system including a fouling factor accounting for dust. If you need to resort to throttling or underclocking parts, then someone somewhere did something wrong.

    I may be wrong, but I get the feeling that during prototyping, there is a good chance that the TIM application is better than the factory application for mass production. If the cooling is barely designed to make it and the test units have decent TIM jobs, then it could explain why production units might not handle the max heat output of the laptop with a sub-par TIM application. It would also explain why repasting solves those kinds of issues for many people.
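    One way to picture why TIM quality (and dust) eats into that margin is a simple thermal resistance chain from die to ambient. The numbers below are hypothetical, purely to illustrate the argument:

    Code:
    # Illustrative die-temperature estimate from a simple thermal resistance
    # chain (die -> TIM -> heatsink/fan -> ambient). All values are assumed,
    # only meant to show why TIM quality and a dust margin matter.
    ambient_c = 30.0       # intake air temperature (deg C)
    gpu_power_w = 50.0     # GPU heat output (W)
    r_heatsink = 0.80      # heatsink + fan thermal resistance (deg C per W)
    r_fouling = 0.10       # extra resistance from dust build-up (deg C per W)

    for label, r_tim in [("good TIM job", 0.10), ("sloppy TIM job", 0.36)]:
        die_temp_c = ambient_c + gpu_power_w * (r_heatsink + r_fouling + r_tim)
        print(f"{label}: ~{die_temp_c:.0f} deg C at the die")
    # good TIM job: ~80 deg C at the die
    # sloppy TIM job: ~93 deg C at the die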
     
  10. cognus

    cognus Notebook Deity

    Reputations:
    186
    Messages:
    877
    Likes Received:
    18
    Trophy Points:
    31
    So, it sounds like your [the collective foregoing] experience is that bigger actually is better - just more room/space/air to work with? Why is it that when I go snoop into the brand/model-specific forums/folders I see so many people drilling holes etc. in systems that SHOULD be designed to not run too hot??? It's a rhetorical question. We run temp monitors, don't like what we see, and go seeking answers that end up with reflow/repaste/drill/add coolers, etc. Some Dells I have worked on are/were notorious for cooked parts - I'm thinking of some of the Nvidia discrete units. And many of the Athlon-based models from various makers, ones with discrete GPU parts, were also notorious for early death due to cookery.

    at any rate, you folks generally display thumbs-up for Precision, Alienware, Clevo....
     
  11. homank76

    homank76 Alienware/Dell Enthusiast

    Reputations:
    601
    Messages:
    1,137
    Likes Received:
    62
    Trophy Points:
    66
    Both my Alienware M11x R3 and M18x run just fine while gaming and even during benchmark tests, without any issue at all. I think you're referring to the older M11x's, especially the R1's, where they did have an issue.
     
  12. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Current platform, 'real' discrete gpu and 'not hot' still don't compute (ime).

    If the keyboard, palmrest, chassis is even noticeably 'warm' - that is too hot for me (because it means the internals are roasting).

    Sure, all the 'gamer' setups are very, very good at keeping all that generated heat from being an immediate issue - but that doesn't mean they're not running 'too hot' imo.

    To satisfy my requirements on a system with a discrete gpu (strictly for 'compute' capabilities - not gaming) I would envision something like a chassis inside a chassis (and as thermally separated as possible), with the heat-generating components inside one chassis and the chassis we interact with being the other. To ensure that the heat doesn't simply 'soak' and transfer between them, it would require the most complete and optimum removal of any heat generated (which would mean that the components are running much cooler than they're allowed to now).

    I am 100% sure that if we can feel ANY heat the system generates (excepting from the exhaust vents, of course...) then the components generating that kind of heat are simply being abused.

    Case in point: Apple products.

    While some say they're 'amazing' as they keep working in spite of their ridiculously high temps, all I know is that what's 'amazing' is that Intel's failsafes and safeguards are simply keeping (most of) those systems alive.

    At the expense of performance (via throttling), of course.

    Until gpu's reach or surpass the process nodes that Intel cpu's are made at - this will continue to be an issue.

    I think though that by the time discrete gpu's become 'civil' (by today's standards), Intel will have already made them effectively obsolete by incorporating them into their cpu dies.

    How many generations are we away from that?

    For me: we are already there (I hate platforms based on discrete gpu's because they're hot, loud and power hungry, with little in return performance/productivity-wise for the high 'costs' they currently demand).

    How many generations are we away for a gamer?

    I would say that the majority of gamers will be happy with an 'igpu' from Intel within three years (introduced sometime in 2015) with the Sky Lake/Skymont platform.

    For the hardcore gamers (and those that need the most 'compute') the discrete gpu's will still be around to annoy us with their then relatively high noise, heat and power issues.
     
  13. homank76

    homank76 Alienware/Dell Enthusiast

    Reputations:
    601
    Messages:
    1,137
    Likes Received:
    62
    Trophy Points:
    66
    My Dell 1537 gets hot just from the current hard drive installed on it.
     
  14. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Don't tell me you're still running Vista on it? ;)

    What HDD btw? The original 250GB 5400RPM version?

    Together with Vista that combo = HOT (but not in a good way...). :)
     
  15. homank76

    homank76 Alienware/Dell Enthusiast

    Reputations:
    601
    Messages:
    1,137
    Likes Received:
    62
    Trophy Points:
    66
    I have Windows 7 Ultimate installed and the HDD is a Seagate Momentus XT 500GB hybrid.
     
  16. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Good to hear!

    What kind of things do you use the system for? And, are you running any notebook cooler on your 1537?

    (I highly recommend the Zalman 2000NC).
     
  17. homank76

    homank76 Alienware/Dell Enthusiast

    Reputations:
    601
    Messages:
    1,137
    Likes Received:
    62
    Trophy Points:
    66
    I just use the 1537 at work as a side computer to watch MLB and NFL while I do my job using the computer that I'm supplied with. Sometimes I use it for photo editing, but that is when it's work related. Everything else I use my Alienwares for at home or on the road.
     
  18. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Seems like you kind of depend on it?

    Notebook cooler highly recommended.
     
  19. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    ..that's how the rog laptops and the n46/56/76 are set up. Channels isolating the heat from the chassis, allowing it to be vented faster than the material soaks it up. Relatively low heat capacity: warm on the inside, cold on the outside. Behold: physics.
    AMD already did that. Nvidia did it a few years before them. IBM did it before them again (with the Cell processor).
    That's not the question. The question is: when will fanboys fawn over the existing technology, because the manufacturers suddenly decide that said technology is worth money. Or to be more specific: that using that technology will make them /more/ money than the currently sold ones.

    For example: why do we still have hdds? There's no technical reason for it. The reason is price. Price, and that the current technology is set up and still earning money. Meaning that the new technology should not, from a marketing standpoint, replace the old one yet.

    Or for example: why do we still have mobile phones that are priced at premium, that break down and stop working after a year?

    Or why are mobile phones still manufactured where the memory access and gpu bridge access are competing for bandwidth? This was a "design flaw" in Qualcomm's soc designs as far back as the first S60 phones. And yet - the new chips they manufacture yet again have the same issue.

    There are very good economical explanations for why this happens. But there's no reason technically that we should have these problems right now.

    In fact, there's no reason why the products that break with these expectations should not outcompete the lesser ones. Not in the least because they often even cost less money to buy.

    And yet.. It's considered the truth by a lot of people that for example the latest iPad has the best technology in existence. Their chips are, to quote Engadget: "an in-house vision of what a mobile chip should be. It's the culmination of four years of hard work and perhaps half a billion dollars of investment."

    Just take that one in - "perhaps" it is the culmination of all kinds of things. We don't know, but we are /convinced/ of it. But Engadget has nothing to say on the licensed arm chipset designs shared among virtually all serious handset manufacturers. Because the technology and how it actually works - isn't the point. It's not what their advertisers care about, it's not what the manufacturers care about.

    And it is also not what their readers, and quite a lot of handset customers care about either..
     
  20. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    Nevertheless, because they don't know what you know.
    Rate the present products, not the past ones.

    Point being?
     
  21. eKretz

    eKretz Notebook Enthusiast

    Reputations:
    37
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    You see people drilling holes mostly when they are overclocking their processors and/or GPU's. Or people who are modding to get every last ounce of heat out of their system that they can. Just because someone does something doesn't always mean that they need to do it. I have an Alienware M15x that can keep up quite easily with the stock 920XM processor and 5850 GPU when running at stock clocks on both. When overclocked to the point of craziness on both, then yeah, it gets hot...but it was never designed to cool an 80+watt load from the CPU, etc. This model has separate sinks and heat pipes for the CPU and GPU, and they work very well, IMO. Could they be better? Sure, but in the form that the factory sent out the machine, they are more than adequate, as evidenced by the fact that I have my CPU overclocked, my GPU overclocked, and it is still able to keep the machine from melting down. The underside of the chassis gets pretty damned hot, but the topside remains relatively cool...and by no means does it get uncomfortable.
     
  22. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    nipsen,

    the 'rog laptops' may be set up like that - but that is not the same thing as saying that is how they perform - just sitting at idle at any big box store they're running too hot imo - I am not a cold-blooded animal; I generate my own heat, thank you. I can only imagine them when they're at full 'load' - a very expensive space heater (and a noisy one at that).

    eKretz,

    you prove my point that the current systems are 'too hot'. I have an extremely high pain threshold and this makes me stronger (as a person). An inanimate object like a notebook doesn't become 'stronger' or 'better' by being abused - it simply dies. Or throttles. Or makes people much more sensitive than me cry for their mommies. ;)


    nipsen,

    as for your rant about hdd's... there is a technological reason we're still using HDD's. The world doesn't have the nand producing capacity (yet) to completely replace our platter based solutions.

    and to touch on the discrete gpu's... no, nobody has made anything yet to replace them (we are talking about 'real' gpu's - not 'tweeners') - not AMD, not nVidia and not even Intel yet.

    But, I'm betting on Intel to get there first (why am I discounting AMD? Because the cpu side of their APU's don't even compare to Intel's from 3-4 years ago...).


    Simply put: the systems the OP is asking about do not exist today.

    What we want from our systems is what modern/current cars enjoy today: more HP (with increased gas mileage), more driveability, more reliability, and all of that with increased efficiency, increased longevity and at the same or lower prices than yesteryear's models.

    What we have today with our portable computers are muscle cars of the 60's/70's that can generate HP but without any real ability to reliably handle any extra heat we can make them produce: but we live with it because the power makes us smile. (In spite of the heat, the noise and the constant tweaking/tuning to keep the system going at 'full power').

    A couple of generations more (2015 - Skymont, as I've said before) and we'll have the civilized mobile computing platforms that will bring us on par with the automobile industry analogy: a platform that fully configures itself for the task at hand without ever breaking a sweat (nor our wallets, vs. the performance we get).
     
  23. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    :) ..hehe. No, it actually works. The g55, for example can be pushed very far before the chassis becomes hotter than your hand. It's really a very simple design (as in.. one layer of plastic coating), but it's another layer in the construction design. And most laptop manufacturers don't bother with it. And why should they, when "people" accept that it's not really a priority.

    Well, there are no physical restrictions involved, that's my point. No one is technically stopping anyone from replacing their production lines. It would have made sense economically too - if there were competition towards actually making the best solution. Best and most convenient for us as customers, I mean.
    Well, some context about that. If we take a look at IBM's Cell processors, these are essentially programmable vector-computers in a very small chip. We would.. you know.. be able to replace a server-farm in Germany with one chip. Do video-encode in real-time on a very low clock-frequency. Have extremely high synthetic performance for prepared and well-scheduled tasks. We could be writing graphics code that goes way beyond the meager scene and object interference we have on the current cpu/gpu setups. And if that was paired with further development of asynchronously reading and writing ram, for an integrated bus -- this would very quickly be able to transform into a cheap consumer product. Capable of real-time ray-tracing, actual shadow construction based on actual occlusion in the object shapes, never mind laughably more complex curve-approximations. Hell, we could scrap curve-approximation altogether.

    The reason why that does not happen is not a technical one. It instead has to do with the existing technology being seen as "good enough", or the upcoming tech being too much of a challenge for Zynga-grade developers to develop on. Microsoft platform software is impossible to port to PPC and scheduled execution. More and more developers deliberately choose intel designs because it cuts down on the skill they need to hire to get the code to work.

    The ION chipset as well was supposed to be an integrated cpu/gpu solution. Or, it essentially was a gpu with general purpose execution capabilities.

    This won't happen because Intel sued Nvidia over infringing on their "CPU" patent. Clever heads insisted that linear execution capabilities would still be below intel designs anyway, and that this wasn't much of a threat. But in the entire business at the moment, embedded systems and even soc designs have a cpu and gpu construction that physically will divide the gpu and cpu parts.

    The Tegra 3 chipset is finally something that actually works - but note how this ends up in "embedded devices", even though the linear, unoptimized, non-parallelized execution speed is on par with an intel core2duo. There's a very good reason why you still get nettops with an intel atom processor, a full mainboard, and the standard ram layout with all the ports and connectors, instead of a 10x10 module with one 4 by 4.5 cm processor chip, an io module and an external connector chipped in. And it's not a technical one.

    So when AMD actually pulls it off with an integrated gpu/cpu solution - where we have minimal out-of-order execution, along with low-level optimization so developers don't have to explicitly program for it - and we start to look more practically at what sort of advantages parallel computing might bring to "real" tasks, then that is a huge deal. This isn't a programmable instruction set over an explicitly parallel assembly language, obviously. And the "cores" are still of different design between gpu and cpu parts, even if they can communicate with each other.

    But it is a huge deal when that actually works as well as it does even in the first iteration. Because we sort of need to understand that clock speed and linear processing is not where we have made any progress at all over the last 8 years or so. We're hitting a point where it costs way too much to increase the actual transfer speeds. And even intel relies on parallelism in their chips internally now to offset the relatively lower clock speeds.

    There's also something to be said for power use and heat envelope. No one needs to have a mobile phone running at 1.5GHz all the time. It's just stupidity incarnate. So none of the larger handset makers stick to that (even if Apple did so for at least two iPhone editions, and a lot of the "yay quadcore" people still do).

    And in the same way, a laptop that has to perform a normal office task consisting of writing a file, updating the screen, and reading information from a server - tasks that all /could/ be run in parallel - has no reason to sport a 4GHz computer that can brute-force the execution time from 3 seconds on a computer made of bamboo down to the glorious target of 1.5s that a computer costing 10 times more can achieve. For example. That's just not worth it. Not just in money, but in terms of battery life, psu draw, heat, durability, stability, etc.
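    For what it's worth, a minimal sketch of that idea (hypothetical stand-in tasks, using Python's asyncio to overlap the I/O waits instead of leaning on raw clock speed):

    Code:
    # Minimal concurrency sketch with hypothetical stand-in tasks: the three
    # "office" steps above overlap their waits instead of running one by one.
    import asyncio

    async def write_file():
        await asyncio.sleep(0.5)   # stand-in for a slow disk write
        return "file written"

    async def read_from_server():
        await asyncio.sleep(1.0)   # stand-in for a network round-trip
        return "server data"

    async def update_screen():
        await asyncio.sleep(0.1)   # stand-in for a UI refresh
        return "screen updated"

    async def main():
        # Run all three concurrently: total wait is ~1.0 s instead of ~1.6 s.
        print(await asyncio.gather(write_file(), read_from_server(), update_screen()))

    asyncio.run(main())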

    So again, here there's no reason for having a high linear processing output. While there should be a huge incentive to reduce execution times on the existing low-powered designs.

    And this is where we see any sort of progress at all. With OpenCL, Android, iPhone, etc (Winmo and windows not included in any way). Where you start to care a bit about the clock-cycles, and finding out where the IO can happen without slowing down input response. And where the graphics unit might overlap with the normal tasks - not just in games, but in graphics acceleration in general.

    But far as 3d graphics goes as well - a 3d card still uses ram and components that are significantly slower and much less costly than a cpu design. Remember that they don't choose those designs for performance first. They choose them for the best balance between cost and performance. And if a new product that's actually cheaper to produce in terms of just the production cost (not including setting up new production lines) would outperform these cards -- how would that make economic sense to embrace, right?

    Because you should understand that there's no practical or technical reason for not gearing graphics into being run on cpu-cores with different instruction sets. There is, however a very good economical reason for many companies to avoid it. "Hardware cycles" will be slower, designs would need to be compatible with each other, etc.

    Apple, for example, dropped PPC because of that. And by most accounts, people in general can't see the difference between a PPC Mac and an intel Mac anyway.

    ..mm. Well, I'm pretty sure it will still be a sum of individually manufactured devices communicating on a pci bus of some sort.

    It's more than just a few years away where that modularity would happen in software, rather than hardware..
     
  24. eKretz

    eKretz Notebook Enthusiast

    Reputations:
    37
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    I really don't see how anything in my post proves your point.
     
  25. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631

    eKretz previously said
     
  26. eKretz

    eKretz Notebook Enthusiast

    Reputations:
    37
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    Dude, you're as bad as one of those muck-raking reporters that's always getting in trouble for tweaking their stories to suit an agenda. If you take my reply out of context then I guess you can make it mean whatever you want.

    Try reading it again:

    eKretz previously said:

    I.e. using NON-stock settings which drive the heat levels through the roof. Again, the cooling system is more than adequate for stock clocks. In other words, when I am not running overclocked and pushing 100% loads, the chassis barely gets warm. I use it all the time sitting on my legs directly with no discomfort at all.
     
  27. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    And now you are contradicting yourself...

    'Damned hot' and 'use it all the time sitting on my legs directly with no discomfort at all' do not compute.


    As I mentioned already, even the 'rog' setups are too hot imo - just sitting at idle (all day) at the big box stores on display, they are too warm for my tastes (even if I have a high pain threshold - computer components don't share that 'strength' that I have and that you seem to have too).

    Having the ability to 'keep a machine from melting down' is not high praise for a cooling system - even if it's overclocked and not in stock form.

    The thread here is for any notebooks with discrete gpu's that DON'T run hot.

    There are not any, yet.
     
  28. eKretz

    eKretz Notebook Enthusiast

    Reputations:
    37
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    Your reading comprehension skills need work. As I (thought I) stated clearly in my last post... I use it all the time sitting directly on my legs when it's not being pushed to 100% loads while overclocked to the max. Which would mean that it stays at a perfectly comfy temp; core temps usually don't go much over 55-60C. But I digress, and am tired of banging my head against this particular brick wall. I will just agree to disagree.
     
  29. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Most gaming laptops don't run "too hot". Rarely is it a design issue; it's typically a poor application of thermal paste from the factory, crappy thermal paste, a mis-aligned heatsink or heatpipe, or a combination of these.

    Regarding drilling holes, etc, usually users do this to overclock which generates more heat, or just to get the coolest running system possible. The OEM's test to make sure they run within a certain temp limit and don't always optimize for absolute best cooling. Although it boggles my mind how there's frequently restricted airflow to the cooling fan due to really small slots, and just opening this up improves cooling by a decent amount.

    The only heat I feel on my Clevo is from the hot air going out the back. Everything else is cool, less than 100°F (about 38°C) at peak load. So I dunno why that would be considered too hot.