I already know that heat degrades how well a processor's gates/transistors function, much like heat affects gasoline engines. Has anything been published recently that documents just how much performance degradation happens as heat builds up?
-
thinkpad knows best Notebook Deity
-
I think the problem is that the performance drop will differ between architectures. Some architectures (and compositions) are surely more resistant to a heat-related performance drop than others, so you'd need to find the specifications for a specific architecture/chip design, and those will probably be considered extremely confidential.
-
The reason performance decreases with heat is that built-in fail-safe mechanisms force the component into lower power states to prevent further heat buildup, so the component is not permanently damaged and users don't get burnt.
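That fail-safe behavior can be sketched as a simple control loop. This is purely illustrative: the power states, thresholds, and hysteresis band below are made up, not any vendor's actual firmware logic.

```python
# Toy sketch of a thermal fail-safe loop. All states and thresholds
# here are illustrative, not real firmware values.

P_STATES = [2600, 2000, 1200, 800]  # clock in MHz, fastest to slowest

def next_p_state(temp_c, current_index):
    """Pick a power-state index based on die temperature."""
    if temp_c >= 100:          # critical: drop straight to the lowest state
        return len(P_STATES) - 1
    if temp_c >= 85:           # hot: step down one state
        return min(current_index + 1, len(P_STATES) - 1)
    if temp_c <= 70:           # cool again: step back up
        return max(current_index - 1, 0)
    return current_index       # inside the hysteresis band: hold

# Example: start at full speed, heat up under load, then cool down
idx = 0
for temp in (60, 88, 92, 101, 75, 65, 60):
    idx = next_p_state(temp, idx)
    print(temp, "C ->", P_STATES[idx], "MHz")
```

The hysteresis band (between 70 and 85 in this sketch) is what keeps real implementations from bouncing rapidly between states.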
-
thinkpad knows best Notebook Deity
RMClock proves this; mine is also undervolted. I'll probably splurge and get a T9300, which I've heard brings a miraculous decrease in temps, like 50s for max temps! -
I think 60-70s would be more like it, but the T9300 should be cooler and faster...
-
-
1) Hardware/manufacturer based, implemented in the BIOS at the low system level.
2) High-level Windows ACPI.
Do you see this in the event log?
-
moral hazard Notebook Nobel Laureate
Advanced CPU settings>throttling>Force thermal throttling.
Now you will see that the CPU does not downclock, but performance drops dramatically.
Throttling is different from downclocking.
Your CPU is throttling because it's too hot.
There are two solutions:
1. Change the thermal paste and turn up the fan speed (maybe use a cooler).
2. Mod the DSDT table to disable thermal throttling (I can help with this). -
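The throttling-vs-downclocking distinction above can be shown with a toy model (all numbers are made up for illustration): downclocking changes the frequency a monitoring tool reports, while thermal throttling via clock modulation gates the clock with a duty cycle, so the tool still shows full speed even though less work gets done.

```python
# Toy model: P-state downclocking vs T-state clock modulation.
# All numbers are illustrative, not real hardware values.

def effective_rate(freq_mhz, duty_cycle):
    """Crude work rate: reported frequency scaled by the gated-on fraction."""
    return freq_mhz * duty_cycle

normal      = effective_rate(2600, 1.0)    # full speed
downclocked = effective_rate(800, 1.0)     # P-state drop: tools report 800 MHz
throttled   = effective_rate(2600, 0.125)  # T-state: tools still report 2600 MHz

# The throttled case does the least work despite "running at 2600 MHz",
# which is why the CPU seems not to downclock while performance tanks.
print(normal, downclocked, throttled)
```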
Off the top of my head, I think the critical temps (where downclocking kicks in) are 105C for Nvidia GPUs and ~100C for Intel CPUs.
-
thinkpad knows best Notebook Deity
Hmmmm... you seem to be forgetting that I just wanted to know if there is a general percentage of performance decrease at certain temps, like 50C, 60C, 70C, etc.
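Since no vendor publishes such a table, about the only way to get that percentage is to measure it on your own machine: run a fixed benchmark while cool, run it again at each temperature, and compare. A rough sketch (the busy-loop benchmark and the workflow are my own suggestion, not from any tool mentioned in this thread):

```python
import time

def bench_score(seconds=0.5):
    """Fixed busy-loop workload; iterations per second as a crude score."""
    end = time.perf_counter() + seconds
    n = 0
    while time.perf_counter() < end:
        n += 1
    return n / seconds

def percent_decrease(baseline, score):
    """Performance drop relative to a cool-CPU baseline, in percent."""
    return (1.0 - score / baseline) * 100.0

# Run bench_score() once while the CPU is cool, then again at 50C, 60C,
# 70C (read temps with any hardware monitor) and tabulate percent_decrease().
```

If the CPU never hits its throttle point, the percentages should stay near zero; once throttling starts, they jump sharply rather than degrading smoothly with temperature.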
-
Thermal management is the single hardest thing that Intel/AMD (and yes, Nvidia) have to do.
Pulling 60+ watts through a piece of silicon 10 mm square is kind of stressful. -
-
H.A.L. 9000 Occam's Chainsaw
I know that in high-power instances, like pulling 25-35 or possibly even 45 watts for a notebook through a 2-3 centimeter silicon die, electromigration becomes an issue, albeit not severe until the heat really gets up there. I know this was an issue for the Prescott-core Pentium 4s because of their long, deep, and inefficient pipelines; over time they failed because of signal degradation across the pipelines. This is not so much the case with the Core microarchitecture, since it has shorter, more efficient pipelines... I have read that over time this can take a pretty noticeable toll on performance as the pipelines degrade.
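For reference, the temperature dependence of electromigration lifetime is usually modeled with Black's equation (a standard reliability formula, added here for context; the symbols are the textbook ones, not something from this thread):

```latex
% Black's equation: mean time to failure from electromigration
MTTF = A \, J^{-n} \exp\!\left( \frac{E_a}{k T} \right)
```

where A is a constant depending on the interconnect material and geometry, J is the current density, n is a model exponent (often around 2), E_a is the activation energy, k is Boltzmann's constant, and T is absolute temperature. The key point is the exponential: lifetime falls off rapidly as the die gets hotter.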
-
Yeah, that would make sense. I don't see any performance drops on the CPU when operating at around 70-80C running random burn tests, but I do know that the northbridge can mess operations up when it gets hot.
Performance decrease with heat
Discussion in 'Hardware Components and Aftermarket Upgrades' started by thinkpad knows best, Jan 10, 2010.