This is something I never quite figured out. What kind of difference does it make if I have, let's say, a ~0.5 GHz faster CPU (of the same generation) than an otherwise identical laptop?
I know it makes hardly any difference in games, but that's because game performance isn't very dependent on the CPU.
Is there anything besides benchmark numbers that can tell me what's the difference? Videos perhaps?
-
What are you doing with your laptop, exactly? That will give us an idea of whether or not you will even see a difference.
-
I'm not asking whether I specifically need a better one, I'm just asking what's the difference in general.
-
That's like asking what's the difference between a 1.5L V6 and a 2.0L V6 when driving on a 25 mph street...
Who knows what the difference is "in general"? Depending on what you do, it could be no difference at all, or it could be a very big one.
We need more specifics. -
-
tilleroftheearth Wisdom listens quietly...
A CPU upgrade within the same generation/platform as what you're currently on (even a 0.5 GHz upgrade), while noticeable, is not worthwhile imo.
Even in an example of a 2.0 GHz CPU being upgraded to a 2.5 GHz CPU, we only get up to a 25% increase in performance (depending on the program/workload we're most interested in).
When we change to a newer/current platform, we can expect to see up to a doubling of performance (a 100% increase) regardless of the actual GHz rating of the new CPU. (This is why I don't recommend upgrading CPUs on old platforms.)
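To put rough numbers on those two claims, here's a minimal back-of-envelope sketch (it assumes a fully CPU-bound, clock-limited workload, and the platform-jump factor is purely illustrative, not a measured value):

# Best-case speedup estimate: clock ratio times any per-core efficiency gain.
# ipc_factor = 1.0 means same architecture; > 1.0 models a newer platform.
def estimated_speedup(old_ghz: float, new_ghz: float, ipc_factor: float = 1.0) -> float:
    return (new_ghz / old_ghz) * ipc_factor

# Same generation, 2.0 GHz -> 2.5 GHz: up to ~1.25x (the 25% figure above).
print(f"same-gen bump: {estimated_speedup(2.0, 2.5):.2f}x")

# Newer platform at the same clock, using the illustrative "up to 2x" factor above.
print(f"new platform:  {estimated_speedup(2.0, 2.0, ipc_factor=2.0):.2f}x")

# Real workloads rarely scale this cleanly; storage, RAM and GPU often dominate.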
If we're comparing current-generation CPUs, then that 'example' 25% increase is worthwhile at almost any $$$ cost if we're using the system for work-related productivity (even if it only gives us 5% faster performance for our specific workload).
In my specific case, I can see/feel the difference in 'snappiness/responsiveness' between otherwise identically set up systems with as little as a 10% CPU speed difference. Up to a $300-$400 'premium' over a slower processor, I feel it is worth it in the long run (considering that I keep such systems for at least 3 years).
And you're wrong: it does make a difference in games. Maybe not in the maximum FPS quoted by most, but in the minimum FPS, which makes a very real difference in how playable a game feels.
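A tiny illustration of why the minimum FPS matters (the frame times below are made up for the example, not a benchmark): one long frame barely moves the average, but it is exactly the stutter you feel.

# 99 smooth frames at ~60 FPS plus one big 100 ms hitch.
frame_times_ms = [16.7] * 99 + [100.0]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000.0 / max(frame_times_ms)

print(f"average FPS: {avg_fps:.0f}")   # ~57 FPS - looks fine on paper
print(f"minimum FPS: {min_fps:.0f}")   # 10 FPS - the visible stutter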
To complete this thought: not only does the CPU's GHz rating come into play, but also the number of (real) cores it has, even for gaming. While dual cores can certainly power most setups today, quad cores simply do it effortlessly. In the very near future, dual cores will feel as sick and weak as single-core systems do today.
Hope some of this helped? -
Not much of a difference. The usual ~0.2-0.3 GHz bump you get from upgrading a tier (e.g. i7-3610QM to i7-3720QM) is not going to affect the vast, vast majority of users. That ~10% will never be noticeable in games/internet browsing/general use. But sometimes an upgrade brings extra features (e.g. the i7-3720QM is partially unlocked for overclocking and has some extra extensions) that might make it worthwhile if you have specialized needs.
Major upgrades, though, like dual core -> quad core or 3-4 tier jumps (e.g. i7-3610QM to i7-3940XM), will probably have an impact, depending on the workload. -
I am aware that previous-generation CPUs can be a lot worse even with a higher GHz rating; that's why I included "of the same generation." I don't think I'm ever buying a dual-core CPU anyway, so that doesn't really matter much either.
The difference seems to be 0.1-0.3 GHz with each upgrade, and the upgrade I'm currently looking at is $150 for +0.3 GHz (on a $1300+ laptop). It seems worth it to me considering I do a lot of heavy non-gaming stuff, and I can easily afford it too. But even if it's relatively irrelevant price-wise whether I take it or not, I'd rather not if there's no actual noticeable real-world difference.
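As a rough sanity check on the cost side (the base clock below is only a guess, since I haven't listed the exact CPUs yet):

# Compare the price premium to the best-case performance gain from the clock bump.
base_price, upgrade_cost = 1300.0, 150.0
base_ghz, bump_ghz = 2.3, 0.3        # base clock is hypothetical

print(f"price premium:  {upgrade_cost / base_price:.1%}")   # ~11.5% more money
print(f"clock increase: {bump_ghz / base_ghz:.1%}")         # ~13% more clock, best case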
-
tilleroftheearth, that's quite a vague post.
A 25% increase in performance where? In casual usage, it's more like a 0% performance gain.
An identically set up system, but not the same individual system. There are large variables between hard drive access times and random reads, which would account for the snappiness factor far more than a 1 GHz change in processor (2 GHz vs 3 GHz). I've upgraded processors before, oftentimes from 2 to 2.8 GHz, and from C2D to Arrandale. Sorry, but there's no difference in casual use; you only see a difference in more intensive operations.
For example: I can get a 5-year-old system feeling more responsive than my M4600 if I swap the M4600's drive for a low-density-platter but high-capacity HDD (3x250GB platters) while the old C2D runs a newer drive of the same speed but higher platter density (2x320GB platters).
The OP isn't clear what his use is, so far it's a crap shoot. He may see 0 difference, or a 25% difference. The phrase "in general" doesn't really say much. -
tilleroftheearth Wisdom listens quietly...
Crimsoned, I don't think you read my post carefully enough.
compuNaN, 0.3 GHz is significant, I agree. Especially when you state 'I use a lot of heavy professional software almost daily'.
It seems like you'll put that extra HP to good use. I think it's money well spent in your case (with the facts as presented so far).
Could you let us know the actual CPUs you're deciding between? -
Didn't see that last comment of the OP regarding usage.
-
If your computer usage or workflow is to stare at a progress bar or percentage all day, then get a CPU upgrade (or a storage/RAM/GPU upgrade). If you don't do that every day, don't bother.
-
You'll see some difference if you're doing photography or video editing work. Rarely would a game rely on the CPU exclusively, yet I know of one - Unreal Tournament 3 - it made me upgrade my CPU, lol. There could be some benefit in a CAD work environment as well.
What is your heavy professional software doing?
Just saying. -
The difference would only be noticeable when doing tasks that are CPU intensive (and where you can measure such a difference).
Examples include programs such as 3D Studio Max, LightWave, Blender or Maya, because all of them lean heavily on the CPU during rendering, for example.
A difference between 2.0 and 2.5 GHz will not exactly yield stunning results, but it could cut render times by roughly 20% (a ~25% throughput increase). That alone could be worthwhile on longer renders; however, not if you are paying a premium price as a result (because ultimately, replacing a CPU with a more powerful one from the same generation can end up even more expensive).
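As a minimal sketch (assuming the render is fully CPU-bound and scales linearly with clock speed, which real renders rarely do), that bump shaves roughly a fifth off a long job:

# Best-case render time after a clock upgrade: time scales as 1/clock.
def new_render_time(hours: float, old_ghz: float, new_ghz: float) -> float:
    return hours * old_ghz / new_ghz

old_hours = 10.0                     # hypothetical overnight render
saved = old_hours - new_render_time(old_hours, 2.0, 2.5)
print(f"time saved: {saved:.1f} h of {old_hours:.0f} h (~20% shorter)")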
Comparing such a difference in speed across general tasks (Internet, email, YouTube, etc.), no difference will be noticed at all.
Certain games might benefit, but the difference can easily be too negligible (and it often is, unless we are talking about jumping from a dual core to a quad core) to justify the monetary cost. -
Bottom line: if you crunch a lot of video, audio, or encoding in general, a faster CPU will finish it faster and save you time. How much faster? Well, you'd have to look at the specific chips. But if it's only an occasional heavy workload, I don't think it's worth it. For 99% of your other tasks you will see little difference.
And baiii is right. If you need to get as much processing done as quickly as possible, get the fastest chip you can. Otherwise you get very little value out of a high clocked chip.