I recently came by a lightly used Presario CQ60-615DX. I love this little computer, it's just extremely slow. I upgraded the RAM to 4GB and have a T7350 laying around that I'd like to put in it to replace the JUNK Celery 900 that it came with.
But before I tear the whole thing apart I'd like to know:
1) Will it support a dual-core processor?
2) Will it support a 1066MHz FSB processor?
I've scoured the HP website but can't find anything.
-
tilleroftheearth Wisdom listens quietly...
Looking here:
See:
Intel® Celeron® Processor 900 (1M Cache, 2.20 GHz, 800 MHz FSB) with SPEC Code(s) SLGLQ
See:
Intel® Core™2 Duo Processor P7350 (3M Cache, 2.00 GHz, 1066 MHz FSB) with SPEC Code(s) SLB53, SLGE3
It seems like they share a socket spec.
Whether it works on your specific system will depend on the BIOS and possibly the chipset used.
I say give it a try - not only will your performance skyrocket - the notebook should actually run cooler too (35W vs. 25W TDP).
Good luck. -
What processor do you have? There is no such thing as a T7350.
-
tilleroftheearth Wisdom listens quietly...
Trottel, how can it be used as a marketing ploy?
Most people don't compare TDP numbers - right?
Anyway, with a dual core (yeah, I linked to a P7350 not a T7350, oops!) the cpu will get to idle quicker than a single core which will always be 'on' and that will get you a cooler system by itself.
If the TDP numbers are the most that the cpu's can survive/operate at - then, how can they be used as a ploy? -
P7350, not T7350. I guessed, didn't know for sure. It's the 2GHz dual core out of my dead G51VX-RX05.
And I call the Celeron junk because, quite frankly, it is. The clock speed is decent, but it's a stripped down single core with 1MB of cache and 800MHz FSB.
Updated the BIOS to the newest version on HP's website, I guess now all that's left is to tear it apart and try it out. -
A dual core running at full load will consume twice the energy of a single core running at full load, all else being equal.
With a lower TDP, the cooling and energy regulation circuitry doesn't have to be as good and people can say that the lower TDP processors are more energy efficient, even if they really don't have a clue how much energy any of the processors consume. Even though the Celeron 900 could be given a TDP much lower than 35w, it isn't for two reasons. The first is that the Celeron 900 is a budget CPU, the cheapest mobile Core 2 that Intel sells. The second is that lower TDP processors are sold for more money than similar processors with a higher TDP. Remember, TDP is a somewhat arbitrary figure Intel gives to its different processor families.
As an aside, there are a couple of things, irrespective of TDP, that Intel has done to the Celeron 900 that make it a bit less power efficient during low or no use. It does not automatically lower its multiplier or decrease its voltage, although it does downclock using the FSB. During idle, though, it still has the advantage of being only one core. -
-
tilleroftheearth Wisdom listens quietly...
Trottel, I have to be honest, you lost me with your TDP explanation.
With regards to a dual core consuming the same power as a single core cpu doing the same work, sure, if we're comparing the chips on a Windows 3.1 O/S.
On a modern O/S install, with AV software installed, there will always be a real and measurable benefit to a dual core over a single core system - in performance and power/heat metrics.
The single core cpu is never at rest - while the dual core can/will operate at a much lower 'intensity' for much less time to do the same work - faster - at much less heat output too.
This is the whole premise of dual core tech - and on that it delivers. At least in my experience (with Intel CPU's). -
tilleroftheearth, think about it this way:
Let's say we have a single core processor and a dual core processor that is identical to the single core processor in every which way except that it has two cores. If we are running a process that doesn't require full power from the cpu some cycles will be used and some will be idle. If the process requires X cycles every minute on the single core processor, it will also require at least X cycles every minute on the dual core processor. Then if we have Y cycles spent idle for the single core processor during that time frame, we then have 2Y+X cycles spent idle on the dual core processor during that same time frame. So if we have the same amount of cycles being used by both processors, but one has more idle cycles, the one with more idle cycles, the dual core, will consume more energy/release more heat in the same time frame getting the same work done.
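Trottel's cycle accounting above can be put into a minimal numeric sketch. The per-cycle energy figures and cycle counts below are made-up illustrative values, not measurements:

```python
# Toy model: every cycle costs E_HIGH energy units when busy, E_LOW when idle.
# E_HIGH, E_LOW, X, and Y are illustrative assumptions only.
E_HIGH, E_LOW = 1.0, 0.2

def energy(busy, idle):
    """Total energy for a given number of busy and idle cycles."""
    return busy * E_HIGH + idle * E_LOW

X, Y = 30e9, 30e9               # busy (X) and idle (Y) cycles per minute, single core
single = energy(X, Y)           # single core: X busy, Y idle
dual = energy(X, X + 2 * Y)     # dual core: same X busy, but X + 2Y idle cycles

print(single, dual)  # the dual core burns more energy for the same work
```

Under this model the dual core always comes out higher, because it does the same X busy cycles plus strictly more idle cycles, and each idle cycle costs something (E_LOW > 0).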
Of course the advantage of the dual core is that it can get more work done in the same amount of time or get the same amount of work done in less time, but the above paragraph shows that under no circumstances will the dual core use less power in the same time frame or use less power to get the same amount of work done as a single core. -
tilleroftheearth Wisdom listens quietly...
Thanks for trying to explain Trottel, but I still don't see it.
First, your example is not reflected in real life with real cpu's on real O/S installations.
I think you may be stuck on the first-generation dual cores, the Pentium D series.
See:
Intel Dual Core Performance Preview Part II: A Deeper Look - AnandTech :: Your Source for Hardware Analysis and News
But even back then with a very non-optimized architecture a second core only added less than 15% to the power consumption.
I would be very hard pressed to believe that today's processors (Core 2 Duo's and even more to the point, i3's and i5's) are even less efficient than in 2005 when two single cores were basically glued together.
However, I am still open to further persuasion (with appropriate facts). -
You said that a dual core uses less power and puts out less heat than a single core. But there is just no way for that to be true under any circumstances if they are running the same software.
If the single core has 1 billion clock cycles per second, the dual core would have 2 billion clock cycles per second. Each clock cycle used to do work uses a high amount of energy, and each clock cycle idle uses a low amount of energy. If we have a program using 1 billion clock cycles per second, power consumption of the single core would be 1 billion high plus 0 low. Power consumption of the dual core would be 1 billion high plus 1 billion low. Thus the dual core will use more power. The single core cannot exceed a power draw of 1 billion high, so a program that tries to use more clock cycles per second will not increase power consumption of the single core, but may increase power consumption of the dual core all the way up to 2 billion high and 0 low. This holds true no matter what the values of high and low are, as long as high is more than low.
If you still don't agree with me, could you please explain to me how it could even be possible that adding a second core adds less than 0% power consumption?
-
tilleroftheearth Wisdom listens quietly...
Using your examples, I agree with your conclusion.
But, as mentioned before, your examples are not based in today's reality.
Adding a second core is not 'free' in regards to idle power usage - but intelligent power gating makes it (almost) effectively so.
A cpu will always output more heat at the peak of its clock limit than at lower frequencies. A dual core, by running at much lower frequencies (same amount of work shared by two cores) than a single core takes advantage of that property by running cooler.
The case you're stating is only for a single, single threaded application (and like I said above, I can agree with your conclusion).
The case I'm stating is that in a modern O/S with a full complement of apps and A/V software, there is never just a single thread loading the cpu at any one time.
If you also believe this state to be true, then maybe you can see how I come to my conclusions too:
The single core cpu will be loaded/struggling at a higher frequency and for much longer than a dual core system would. This translates into running at a higher inefficiency for a significantly longer period of time (the single core) vs. running at a greater efficiency (the dual core) for a much shorter period of time (where one or both cores are at 'idle' while the single core is still sweating heavily).
I can see what you're trying to present (finally), but knowing that a single core CPU can easily take 4+ times longer to do the same (modern) work as a dual core is something I've seen again and again with many clients' systems compared to my dual+ core setups.
As an example, I'm currently running 102 processes right now at a 0%-5% cpu load (i3 350M cpu). With a single core running less than half as many processes (same basic install + A/V, 0.2GHz slower single core cpu) the cpu was spiking up at idle to 40% constantly and never, ever 'idled' less than 8%.
In a very limited example as you've set up, you are right. In the more real world usage I have seen different cpu/platforms/systems running side by side on, I think the facts are on my side?
When you compound the above with the fact that most single core cpu's do not downclock when idle (as you have stated previously), the dual cores are even more impressive from a (lower) power consumption and a (less) heat generated point of view.
Given that the Celeron 900 in the OP's system is such a CPU (no advanced speedstep circuitry to make it idle more efficiently) maybe you can see why I stated that the change to the P7350 would positively affect how cool the notebook would run compared to the 10W higher TDP of the Celeron processor.
Again, I don't see how the TDP is a 'marketing ploy' by Intel.
The benefits are real and tangible. In smaller, more powerful and cooler running notebooks and in longer run times while on battery power.
On the desktop side:
I can remember, when idling, the computers would heat up the office by themselves (great in the winter). Now, if I don't set the furnace to fire up, the office can be a great imitation of a walk in cooler.
(Above: I'm comparing 8 single core computers to a dozen quad cores, btw). -
Long story short, The P7350 is not compatible with the Compaq CQ60-615DX. I borrowed the T9300 from my sister's computer and it worked, so I know it supports dual core. No idea why the P7350 didn't work, but neither laptop would boot with it installed and I know the processor isn't broken.
-
But look, none of what we have been arguing about has anything to do with TDP. TDP is the thermal design power, the most energy Intel says the cooling system has to be able to remove. The processors reach max heat output at full load. There is no way that a 2.2GHz single core processor could ever consume more power at full load than a 2GHz dual core of the exact same architecture. The only reason Intel gave it a 35w TDP was for marketing purposes. Their low TDP models are sold at a premium, but that doesn't jibe well with the Celeron 900 being the cheapest mobile Core 2 processor.
-
tilleroftheearth Wisdom listens quietly...
Trottel, okay, lets say we're both right (using our own assumptions).
The only thing I know is that there is a way for a 2.2GHz single core to be less efficient than a 2GHz dual core: if it is manufactured on a less efficient process (requiring higher voltage to be stable).
On topic;
I am saddened that the OP couldn't get his cpu upgraded.
Could he keep his sister's CPU and put the P7350 in hers?
Ah! Just saw that it wouldn't boot with the P7350 in either computer.
How much does a T9300 cost? -
Trottel, your theoretical explanation is somewhat convincing, but what about people reporting that after upgrading to a dual core, battery life in everyday usage actually increased, sometimes significantly? And aren't you making a wrong assumption when you talk about identical CPUs differing only in the number of cores, one vs. two? My point: if we have a 2GHz/800MHz/2MB cache dual core CPU, is its equivalent a 2GHz/800MHz/2MB cache single core, or maybe a 2GHz/800MHz/1MB cache single core? What makes a CPU is not only the cores but also, for example, the cache. If one core consumes a theoretical 10W, the second core 10W, and the cache 5W, that gives us a 25W CPU; a CPU with one core and that same cache gives us 15W. Even if we assume the single core has half the cache and that half consumes half the energy (which I don't think we can assume), that still gives us 12.5W. But for a fair comparison, the single and dual core should have the same amount of cache, because the amount of cache influences performance in lots of tasks and programs. In the end we have a 25W dual core vs. a 15W single core, which means the dual core, even at full load, will not consume twice the energy. Or maybe my thinking is wrong? I'm just asking; I'm curious myself how it actually is.
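The shared-cache arithmetic in the post above, as a quick sketch (the 10 W per-core and 5 W cache figures are the poster's hypothetical numbers, not measured values):

```python
# Hypothetical wattages from the post: 10 W per core, 5 W for the shared cache.
CORE_W, CACHE_W = 10.0, 5.0

dual = 2 * CORE_W + CACHE_W     # two cores + shared cache = 25 W
single = 1 * CORE_W + CACHE_W   # one core + same cache   = 15 W

print(dual / single)  # ~1.67x, i.e. less than 2x at full load
```

The point being made: any power spent on shared uncore parts (cache, memory controller, FSB logic) is paid once, not per core, so doubling the cores never quite doubles the total draw.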
-
Also, Intel has disabled multiplier downclocking and voltage dropping on their mobile Celerons, and although they still allow downclocking via the FSB, the missing voltage drop works against the Celerons as far as power consumption goes.
What I am not trying to say is that the Celeron 900 is awesome or that the single core Core 2 and beyond processors Intel has released are great or anything like that. All I was trying to say was that despite Intel's arbitrary TDP rating, it isn't exactly power hungry compared to other similar processors and in fact could have a much lower TDP than most other mobile Core 2 processors, and that there are no advantages inherent in a dual core design versus a single core of the same architecture.
-
In your particular example, I think any difference in power consumption would be attributable to the P9700's lower voltage range (1.012-1.175 V vs 1.00-1.25 V), as well as possibly its 33% faster FSB (1066 vs 800). The P9700 also came out half a year later than the T6400; we don't know what refinements to efficiency may have been made within the CPU in that time frame.
Either way, this is starting to wander rather far afield from the original topic. This thread ( http://forum.notebookreview.com/har...es/334009-gl40-chipset-cpu-compatibility.html) may be helpful with the CPU upgrade; there is mention of a Compaq CQ62-219WM being successfully upgraded from a Celeron 900 to a T6670. -
-
Power consumption for a processor is directly proportional to the number of cores and the clock speed, and proportional to the square of the voltage.
So you can see, at the top of their voltage ranges and at full load, the P9700 will consume roughly 24% more power than the T6400, yet the T6400 has a TDP 25% higher than the P9700! I.e., TDP in this case is complete bull.
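That comparison can be reproduced from the rule just stated, P ∝ cores × f × V², using the max voltages quoted earlier in the thread (1.175 V for the P9700, 1.25 V for the T6400) and their 2.8 GHz vs 2.0 GHz clocks; this is a back-of-the-envelope sketch, not a measurement:

```python
# Relative dynamic power: P is proportional to cores * frequency * voltage^2.
def rel_power(cores, ghz, volts):
    return cores * ghz * volts ** 2

p9700 = rel_power(2, 2.8, 1.175)  # 28 W TDP part
t6400 = rel_power(2, 2.0, 1.25)   # 35 W TDP part

extra = (p9700 / t6400 - 1) * 100
print(f"P9700 vs T6400 at full load: +{extra:.0f}%")  # roughly +24%
```

Note the mismatch being argued: the estimated full-load draw and the TDP labels point in opposite directions (35/28 puts the T6400's TDP 25% higher).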
For these reasons, TDP is rarely useful information for end users. It is part guide to follow for manufacturers, and part marketing hoopla. -
This is not to say that Intel's TDPs are exact, you could be completely correct in that Intel has overestimated the actual TDP of a T6400, and they just slapped 35 watts on it because that's "close", but I think the only way for us to be sure would be for actual measurements to be taken.
I agree that TDP is rarely useful information for end users, I'm just not sure that as much of it is marketing hoopla as you suggest. Much like, say, the cubic capacity of an internal combustion engine; it has a very definite use, but people will often take it to mean more than it actually does. -
tilleroftheearth Wisdom listens quietly...
Trottel,
Yeah, that is where you are going off on a tangent.
Your bolded statement above is not (always) true. Too many other factors come into play that ultimately turns that extra power into heat. Including (as I've mentioned before) better manufacturing (or better optimized) processes.
My direct observations may be apples to oranges for you, but I think I am pretty good at weeding out the significant and the relevant from the chaos called 'real world'.
In a laboratory environment, with very narrow and restricted testing, I agree with most of what you're saying. In actual use those same variables you accuse me of 'comparing apples to oranges with' are coming back to bite you too.
I'll stand by what I said in multiple posts earlier:
If a 2005-era dual core accounts for only 15% more power consumption than a single core, then today, with all the current tech available (your 'oranges'), I can believe a second core will consume at most that much 'extra' power.
What we get though in return is performance that is many multiples of the power increase introduced with a second/3/4th core and more importantly (my initial point I made) a significant decrease in the heat output of the system running basic/intermediate tasks - compared to the one core solution.
A major part of this heat reduction is not only the power of the additional cores, which complete a modern O/S's complex (multi-threaded) tasks faster, but also the fact that CPUs are most inefficient and generate the most heat when pushed to their highest clock speed (made worse for single cores by the exponentially greater amount of time they spend at 100% to complete the same complex tasks).
You can play with theories and numbers and say Intel is just using marketing BS to sell chips to manufacturers, but my view is that manufacturers who believe that line of thought are also introducing poorly designed/engineered systems that not only risk the longevity of the components in question, but also impact the performance of those same systems compared to other, more properly engineered examples.
HP and Apple are the worst in this regard ime (run way too hot and get throttled way too much to be thought of as 'productivity' hardware at even the low/medium end of the spectrum). -
-
{Snip FSB explanation and effect on power consumption}
Ok.
Compaq Presario CQ60-615DX Processor Upgrade
Discussion in 'Hardware Components and Aftermarket Upgrades' started by cuytastic101, Oct 28, 2010.