Hey everybody, I'm looking at upgrading from an Intel P8600 to an Intel T9900. The P8600 has a max TDP of 25W, while the T9900 is 35W.
Has anybody upgraded their laptop processor before and gone to a higher TDP like this? Any suggestions?
Thank you!
-
Is that even worth it? Seems like a very minute jump.
-
tilleroftheearth Wisdom listens quietly...
A 40% increase in performance? Yeah, worth it - depending on what you can get the T9900 for and what you can sell the P8600 for.
As to the OP's 'is this possible' question: it depends on what specific system he's considering upgrading.
If this was me (and assuming the cooling system of the notebook in question was adequate for the increased heat that will be generated at maximum output...) the price would have to be less than $100 to consider this 'jump'.
If it is more, I would rather spend in the $500-$600 range (or higher, of course) and get twice or more the performance increase from what I'm at now by considering an i5 2410 with 8GB RAM and Win7x64.
Hope some of this helps.
Good luck. -
It's worth it. I went from a P7350 to a T9600... a 25W versus 35W. The temps dropped in my G51VX... because it ran more efficiently.
http://forum.notebookreview.com/asus-gaming-notebook-forum/550790-g51vx-new-cpu-again-post7268817.html#post7268817 -
Thanks guys for the responses!
The notebook in question is this Toshiba
http://forum.notebookreview.com/toshiba/542594-upgrading-qosmio-g50-cpu.html
My friend uses his laptop as a 'portable workstation' and takes it from his firm back home so he can work after hours.
He has already put in 8GB per my recommendation, and he liked the increased performance.
My worry is stability: going from 25W to 35W, I don't want to endanger his system.
It's a huge laptop, so I'd think the cooling should be good, but it's hard to say. -
Wrong thread, sorry
-
TDP doesn't translate linearly to power consumption. As long as you apply sufficient thermal paste and have a clean heatsink, you shouldn't worry about the different TDP.
-
Meaker@Sager Company Representative
2.4GHz / 3MB cache vs 3.06GHz / 6MB cache.
3.06/2.4 = a 27.5% increase in CPU clock-limited apps.
Maybe up to 30% if it is somewhat cache-sensitive too. -
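The clock-ratio estimate above is just an upper bound for a purely clock-limited workload; a minimal sketch of the arithmetic, using the quoted clocks:

```python
# Upper-bound speedup for a purely clock-limited workload, using
# the clocks quoted above (P8600: 2.4 GHz, T9900: 3.06 GHz).
# Real-world gains will be lower unless the app is fully CPU-bound.
old_ghz = 2.4   # P8600
new_ghz = 3.06  # T9900
speedup = new_ghz / old_ghz - 1
print(f"Clock-limited speedup: {speedup:.1%}")  # → 27.5%
```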
Clock speed does not directly translate to a percent performance increase.
-
tilleroftheearth Wisdom listens quietly...
I was going by the (admittedly limited) PassMark scores for each processor. -
-
TDP != power / temperature.
I did an upgrade from T9400 to T9900 and the new processor runs noticeably cooler than the old one. Both have 35W TDP.
-
Changed from a P7450 to a T9900 and had no issues here. It possibly runs a little warmer, but I changed the mobo at the same time to upgrade my GPU from a 4530 to a 4650, so it's hard to say which (or both) is adding to the slightly warmer temps.
No problems; it runs fine.
If you're looking to increase performance, an SSD might be an easier way to go -
As people stated above, TDP =/= power consumption. I upgraded from a P8400 with a 25W TDP to an X9100 with a 44W TDP with no problem. The temp does increase a bit, but that's because my X9100 is overclocked.
Don't take the TDP for everything; just give your new CPU a good repaste and you should be fine. And if you are that worried about temps, undervolt it. -
This was on the laptop in your signature? Wow.
I've thought about putting the X9100 into my friend's Toshiba Qosmio G50, but going from 25W to 44W does worry me personally, especially not knowing whether the BIOS will even boot with the X9100.
I was thinking of going with the T9900 and then installing the WD Scorpio Black 750GB 7200RPM drive, since 750GB of SSD costs way too much these days. -
-
And if your laptop has a second HDD bay, it's a good idea to buy a small SSD and keep your old HDD in its place. Install your OS and favorite programs on the SSD; this way your computer will boot up/shut down MUCH faster, and so will your most often used programs. -
-
Not to resurrect an old thread, but it came up in a Google search when I wanted to upgrade my Socket P P8700 (25W) to a 35W SLGEB E8335 (nearly identical to a T9800). Wanted to leave a post for others who might search the same. I have an Asus N51Vn-X1A and was able to swap the P8700 for the E8335 with no problems. The BIOS (version 217) recognized it without issue. My temps using Orthos for ~30 mins did go up from ~60C max load on the P8700 to ~70C max load on the E8335. My GT 240M is on the same cooler but different heat pipes (one fan, 2 heatpipes), and they apparently do NOT use the same fins on the heatsink. Stressing both CPU and GPU did not increase the CPU temps like I anticipated, which implies the fins do not touch.
Anyway, if you do your research on TDP you'll know it's a loaded term with lots of gray area. Your temps could actually be lower going from a 25W to a 35W TDP processor. Also note that there are other stock N51Vn models that use a 35W TDP processor so I correctly assumed that Asus used the same heatsink/fan for both TDPs. -
tilleroftheearth Wisdom listens quietly...
There seems to be a few things going on here:
1) the notebook is made for the higher wattage model to begin with (as you noted).
2) the performance increase is not anywhere close to the TDP increase.
See E8335 PM 'score': 2238:
PassMark - Intel Core2 Duo E8335 @ 2.93GHz - Price performance comparison
see: P8700 PM 'score': 1671:
PassMark - Intel Core2 Duo P8700 @ 2.53GHz - Price performance comparison
Max performance increase: 34%
Max TDP increase: 40%
While this was a good match for your specific model, this does not imply the same for a chassis that was already at the 35W TDP limit to begin with.
With a real world performance increase of (max) 25-30% - I don't even think this upgrade would be justified (for myself/my workloads).
Glad it worked out for you - but don't be so quick to dismiss the indicated TDP... it is not a made up number. Your notebook has a superior cooling solution to almost all other setups out there. The difference could even be worse performance (due to throttling) with such an upgrade (on an older Dell or HP setup, for example).
Thanks for the update though!
Take care. -
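The ratios quoted in the post above can be checked with a quick sketch, using the PassMark scores and TDP figures as stated there:

```python
# Ratios from the figures quoted above: PassMark CPU marks
# (E8335: 2238, P8700: 1671) and the TDPs as stated (35W vs 25W).
scores = {"P8700": 1671, "E8335": 2238}
tdp_w = {"P8700": 25, "E8335": 35}
perf_gain = scores["E8335"] / scores["P8700"] - 1
tdp_gain = tdp_w["E8335"] / tdp_w["P8700"] - 1
print(f"Max performance increase: {perf_gain:.0%}")  # → 34%
print(f"Max TDP increase: {tdp_gain:.0%}")           # → 40%
```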
I would take a 25% real-world performance increase any day (for $30, the cost of the upgrade), especially on a 4+ year old notebook. My P8700 was regularly pinned (both cores) when moderately multi-tasking.
I did a lot of research before upgrading, and my cooling solution, while apparently adequate, was less than optimal compared to other notebook solutions (thank you, Google Images). The CPU and GPU on separate pipes and fans is generally the best solution from what I've seen. Also, my vents are designed to pull air from the opposite end of the laptop, across the mobo. It's debatable how much this is needed for mobo components, but with the cover off, temps went down 5C on the CPU.
Regarding TDP, you can get 100 different answers from 100 different people. Some say the binning process will put a 25.1W chip in the 35W bin. I have no idea personally; I don't work for Intel.
-
tilleroftheearth Wisdom listens quietly...
In a 'vacuum' I too would take a 25% improvement for $30.
But the issues here are not as simple as that.
Nor is it what I stated in my previous response.
There are many examples of such 'upgrades' that result in lower overall performance, or a need to 'frankenize' the notebook to enable the potential the processor promises - at the risk of turning a mobile system into a desktop-based platform...
As I've stated; if a chassis' cooling design can handle it; you're golden. If not? The 'upgrade' was a waste of time and money.
Because the TDP rating of a processor or GPU is not arbitrary: it is what's required to deliver the maximum performance as spec'd. -
It is in fact quite arbitrary. Intel can say the TDP is anything they want. The only requirement is that the processor doesn't exceed that. Intel routinely uses these arbitrary TDP numbers to differentiate otherwise similar products, and sell the lower TDP product at a higher price.
A perfect example of this involves Penryns similar to the ones discussed in this thread. Celeron 900: 2.2GHz, single core, 1MB L2 cache, 35W TDP. Core 2 Duo SL9600: 2.13GHz, dual core, 6MB L2 cache, 17W TDP. Both are the same stepping, but the Core 2 has twice the cores, almost the same frequency, similar voltage, and 6 times the L2 cache, yet less than half the TDP... -
tilleroftheearth Wisdom listens quietly...
Your example is great to show why the rated TDP is so important.
For Intel to 'guarantee' the Celeron works to its (much lower) spec, it needs up to that amount of current (and therefore that TDP).
All this does is show just how much better a processor can be by 'binning'.
Same thing can be said for the nand in SSD's - everything is NOT created 'equal'. -
Meaker@Sager Company Representative
Everything is created bell curve
-
Binning is stratifying cpu dies based on presumed quality by using testing and statistical probability. The best dies can reach higher clocks with the same voltage, or run at the same clock speed with a lower voltage. Another part of this is how much of the die is functional. If one of the cores is non-functional, that core can be deactivated and the processor can be sold as a single or tri-core processor. If some of the cache does not work properly, that too can be disabled and the processor sold as a version with less cache. All manufactured processors are tested for basic functionality, but very few are put through rigorous stability, temperature, and voltage testing. Over decades of the manufacture of millions or billions of semiconductors, manufacturers are good at predicting how well cpu's perform based on very small sample sizes and past performance. Of course the exact details are secret, but this is the gist of how they bin.
Although there are some differences in different dies, they are nearly always more capable than the manufacturer rates them. It also allows the manufacturer to have a lot of wiggle room to stratify their processors based on the market and not based on manufacturing capability. This is also where overclocking and undervolting come in.
Finally, the part you seem most confused about, and which you think can be affected by binning. The equation describing the dynamic power dissipation of a processor is P = C * (V^2) * f, where P is power, C is capacitance (the processor's ability to hold electric charge), V is voltage, and f is frequency. In the example we spoke about earlier between the Celeron and the Core 2 Duo, if what you say about power consumption having parity with TDP were correct, the capacitance of the Celeron's die would be about 4 times greater than that of the Core 2 Duo, even though the dies are nearly identical. These are some of the most precisely manufactured devices on the planet, built within the tightest tolerances imaginable, where hundreds of millions of circuits must work flawlessly in unison for the CPU to function at all. What you are saying is as if a car manufacturer could build the same exact engine on the same exact assembly line but get a normal range of 10-40mpg in the same car. Cars are built to tolerances many orders of magnitude looser than processor dies, but even that sounds completely ridiculous.
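The dynamic-power relation above can be illustrated with a minimal sketch; the capacitance value here is purely hypothetical, the point being that for the same die, power scales with V² · f regardless of the TDP label:

```python
# Dynamic power: P = C * V^2 * f.
# C (switched capacitance) is a made-up illustrative value;
# only the *ratios* between settings matter for the argument.
def dynamic_power(c_farads, v_volts, f_hz):
    return c_farads * v_volts ** 2 * f_hz

C = 1.0e-9  # farads, hypothetical
p_stock = dynamic_power(C, 1.10, 2.2e9)      # stock voltage/clock
p_undervolt = dynamic_power(C, 1.00, 2.2e9)  # same clock, -0.10 V
print(f"Undervolt saves {1 - p_undervolt / p_stock:.0%} dynamic power")
```

This is also why undervolting (mentioned earlier in the thread) lowers temps without costing any clock speed: power drops with the square of the voltage.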
The fact that the maximum power consumption of any given processors sharing a similar die, regardless of binning, is determined only by frequency and voltage is apparent to anyone who has tried it. Unfortunately, the community of laptop enthusiasts has limited experience with overclocking and/or undervolting, so a lot of these concepts are unknown, or argued against using flawed logic that arrives at wrong conclusions.
The fact of the matter is that Intel uses TDP as much as a marketing and sales tool as it does as a guide for power consumption. When the same die is used at the same voltage and frequency in two different processors, but their TDP figures vary wildly, or two processors share the same TDP but vary greatly in frequency and/or voltage, you know that TDP is not an accurate predictor of power consumption. -
Just a note: the E8335 has a TDP of 44W.
So, the max TDP increase is actually 76%. -
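With the corrected 44W figure, the ratio works out as:

```python
# TDP ratio with the corrected E8335 figure (44W) vs the P8700 (25W).
tdp_gain = 44 / 25 - 1
print(f"Max TDP increase: {tdp_gain:.0%}")  # → 76%
```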
-
tilleroftheearth Wisdom listens quietly...
If a stepping can make this kind of difference (in the exact same model); then I don't think you are qualified to state which cpu should be equivalent to another, TDP-wise?
I think I'll trust Intel on this. Why? Because TDP is not what is sold to consumers; it is much more directed at notebook manufacturers. And if TDP was as arbitrary as you seem to think; we would have seen the same cpu in a range of different chassis' (with different cooling capacities).
The MBP is a prime example of ignoring TDP: they are the worst notebooks to have when running any sustained, highly demanding workload. Even if they didn't throttle - which they do, a lot - the 3rd degree burns they offer are not part of the Intel computing benefits.
If TDP was used solely as a marketing ploy as you state - then I'm sure at least one notebook manufacturer would have figured it out before you... and offered us quad cores in 14" models years before - and fanless dual cores since a decade ago (all in less than 1" thick chassis). -
You make it sound as if my argument is that Intel has been taking the real power consumption and then vastly increasing that to come up with their TDP numbers. But if that is the case, you have greatly misunderstood me. Sometimes TDP is in line with maximum power consumption, and other times it is completely off the wall. Also overrating the TDP doesn't allow laptop manufacturers to perform cooling miracles with their processors. And besides, laptop manufacturers aren't quite as oblivious as you are to the fiasco with TDP. -
tilleroftheearth Wisdom listens quietly...
Sorry, but the black and white facts speak for themselves...
If you can't provide any facts for your case - it simply falls apart.
My point is that TDP is not over-rated: when maximum performance is expected from the chip, sustained, over time - as per specs.
Manufacturers that blindly ignore the rated TDP or believe the conspiracy myth you're trying to push are the ones who make designs with inferior cooling systems for the chips inside: and they run HOT. Too hot.
TDP fiasco? I think only in your head. -
Yet what I am talking about makes logical sense, and I have provided evidence to back it up. I have explained how processor power consumption works and how this is at odds with TDP ratings when comparing two processors. I have also said that people who swap different processors into their computers can clearly see what I am talking about in action. I used to do it myself on desktops for the longest time, and then on laptops.
For example, I bought a laptop with a Celeron 900, like I talked about earlier. It has a 35W TDP, and I overclocked it from 2.2GHz to 2.93GHz. Later I replaced it with a Core 2 Duo P8600, which has a 25W TDP. Guess what? The 25W TDP chip ran hotter than the overclocked 35W TDP chip. To be expected, of course, but contrary to your blind faith in TDP. -
Just to hijack this thread (because I'm too lazy to start a new one): how long (if ever) does it take for an older gen of CPU to be considered so archaic that it gets the bargain-basement markdown? 3 years, 5 years, 10 years...?
-
Also take a look at i7 920s for socket 1366 on eBay. They came out in 2008 and were the slowest processor available on the platform, yet five years later they still sell for $80. -
Aaarrgghh. That's a good reference point for me. I am looking at upgrading the CPU in the desktop I built for my parents in 2010. It's got an i3-540 in it right now. I am/was considering putting an i5-680 in it, but I will wait a while and see which way the prices go. The used ones are still over $100, so I will see if they drop below that. I just thought that because Intel is now releasing the 4th Gen Core i3/i5/i7 CPUs, the 1st Gen ones would get a price drop.
-
California. Half way between Los Angeles and Mexico. I will look into the i7 LGA 1156 CPUs.
-
Upgrading processor...going from Max TDP 25W to 35W
Discussion in 'Hardware Components and Aftermarket Upgrades' started by mr.rhtuner, Sep 17, 2011.