I was wondering if the Intel Core 2 Duo T9400 (2.53GHz, 6MB L2, 1066MHz FSB) will improve game performance (load times, fps) compared to the Intel Core 2 Duo P8600 (2.40GHz, 3MB L2, 1066MHz FSB). The graphics card is a Mobility Radeon 3650 256MB. So I'm wondering if the processor will improve games in general, or is the graphics card bottlenecking the overall performance? The T9400 is $120 more, so should I upgrade to it or not?
-
Meaker@Sager Company Representative
Not $120 worth of performance, no, unless you find a particularly CPU-hungry game (harder with the HD 3650, since it's going to run into issues before the CPU does).
-
Thanks, I won't upgrade then if my video card will more likely be the one holding the computer back.
-
Yes, the 2.53GHz will improve performance over the 2.4GHz. I say go for it!
-
Do you have an idea of how much of a performance increase it would get? Like whether it goes up 5-10 more fps or not?
-
Soviet Sunrise Notebook Prophet
In which games?
-
Depends what game!
Some games that aren't as graphics-intensive will see an improvement! -
You will not see any improvement in gaming unless you are playing an RTS at a very low resolution, and even then, the small increase in clock speed would barely give any improvement at all. I've left out CPU-intensive games for one reason, and one reason only: his GPU is fairly weak as well. So even if a game IS CPU intensive, it's still GPU intensive, and chances are he will be bottlenecked by his GPU long before his CPU in any game, short of playing at lower resolutions.
Basically, save your money and stick with the P8600. -
-
Long story short, you will not notice a damn thing, in any game.
-
Lol, heated discussion. Thanks everyone, I don't think I'll dish out the extra $120.
-
OK this is how I like to explain about the difference between CPU & GPU.
Imagine typing this sentence.
Now your CPU would type each letter one after the other (ok, two at a time for dual cores), but each one would be typed very very fast.
Your GPU would type all letters at the same time, but maybe at a quarter of the speed of your CPU.
Now if the sentence was just two letters, your CPU would be much faster at it, whereas if it was 200 letters then your GPU would win out.
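The typing analogy above can be sketched as a toy model in Python. The per-letter and per-batch speeds here are completely made up for illustration, just like in the analogy; only the shape of the result matters.

```python
# Toy model of the typing analogy: the CPU types letters one after the
# other very quickly (two at a time for a dual core), while the GPU
# types every letter at once but each "keystroke" pass is slower.
# All timings are invented units, not real hardware figures.

CPU_TIME_PER_LETTER = 1.0   # serial cost per letter
GPU_TIME_PER_BATCH = 4.0    # one slow pass covers any number of letters

def cpu_time(letters: int, cores: int = 2) -> float:
    """Serial typing, split across the cores."""
    return CPU_TIME_PER_LETTER * letters / cores

def gpu_time(letters: int) -> float:
    """All letters typed in a single parallel pass."""
    return GPU_TIME_PER_BATCH

for n in (2, 200):
    print(n, "letters -> CPU:", cpu_time(n), "GPU:", gpu_time(n))
```

For 2 letters the CPU wins easily; for 200 letters the GPU's single slow pass beats the CPU's long serial run, which is exactly the point of the analogy.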
OK, now transfer this into a gaming environment. Let's use Crysis as an example.
So you shoot a barrel. Your CPU will decide where that barrel goes and the path it will take, because this is one thing after the other (i.e. serial processing: it gets to point Z from point A by going through points B-Y first).
Your GPU will decide how to render both the barrel and the rest of the scene at each step of the barrel's travel. The reason is that there is a lot more to be rendered, and ideally you want it all rendered at the same time.
OK, so now we understand what role each processing unit plays. To answer the original question: would you notice a difference?
Well, probably not. A slightly faster CPU can run calculations faster. However, in games there is typically much, much more to render via the GPU than there is work for the CPU. Would it be faster? Yes, it definitely would be; but at such a minuscule level that you will not notice the difference.
However, although the GPU is the workhorse for most games, there are some that are more CPU intensive than others. Some are out there already that require core 2 duos running at 1.86GHz. As time goes on, these requirements are only going to increase. Typically though, GPU requirements increase at a faster rate than CPU requirements.
In short, I would recommend against spending that amount of money on an upgrade that is not going to yield a noticeable difference. By the time games require a higher-spec processor, I'm almost willing to bet that you will be looking to buy a new laptop anyway. -
Basically, if you're one of those guys who spends hundreds of dollars to put a cold air intake on a 4-cylinder engine, then sure, go for it.
-
Fragilexx, I think that was the best explanation I've ever read. Can a mod sticky just his post?
-
I'm almost certain that there would be another post with this sort of explanation already stickied somewhere. OK, maybe not my sentence analogy, but there are any number that could be applied. Like drawing a picture, ironing a shirt, cooking even.
What would be really great is if I had some statistics for what improvements are actually to be had where the GPU is not the bottleneck. Unfortunately I do not have processors that are only one step up from one another, otherwise I'd try it out. I would expect my comments are still true, though: the difference would be so minuscule that the performance-to-price ratio would radically change. I like Mark6614's comments in that respect; if you are the type of person who buys something because you can, then it doesn't matter what anyone says, you'll buy it anyway
Hopefully though it answered the original poster's question
Regards -
moral hazard Notebook Nobel Laureate
But about the CPU upgrade, you made the right choice not to upgrade.
I noticed overclocking my CPU doesn't help with games. If I overclock my FSB I do get more frames, but that's because my GPU is integrated and the higher RAM bandwidth is what gives me the frame increase, not the CPU speed.
Once I was accidentally running ORTHOS while playing L4D and I didn't notice the difference. -
Ok, let's theorize the maximum performance that's possible with upgrading.
2.4 to 2.53GHz - a theoretical maximum of 5.4%. Realistically, the improvement is around 70% of that, so it ends up at about 3.8%.
3MB to 6MB cache - it'll probably bring 3-4%. It might have been bigger if the CPU did more of the rendering, but it doesn't; most of the code is probably much smaller than 3MB anyway.
Add those together and you'll probably get ~7% out of a game, if it's a CPU-bound game. Is it worth it to you? That's up to you to decide. -
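The estimate above can be checked with a quick back-of-envelope calculation. The 70% realization factor and the 3-4% cache figure are the poster's guesses, not measurements, so treat the result as a rough ceiling rather than a prediction.

```python
# Back-of-envelope for the T9400 vs P8600 upgrade, using the poster's
# assumed factors: clock speedup scaled by ~70% realization, plus an
# assumed ~3.5% (midpoint of 3-4%) from the doubled L2 cache.

clock_gain = 2.53 / 2.40 - 1           # ~5.4% theoretical clock gain
realistic_clock = clock_gain * 0.70    # ~3.8% realized, per the post
cache_gain = 0.035                     # midpoint of the 3-4% guess

# Combine multiplicatively, since the effects compound
total = (1 + realistic_clock) * (1 + cache_gain) - 1
print(f"estimated best-case gain: {total:.1%}")   # roughly 7%
```

This lands at roughly 7% in the best (fully CPU-bound) case, matching the post's estimate; in GPU-bound games the realized gain would be far smaller.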
If you want to improve your load times, get an ssd ;x
-
insanechinaman Notebook Evangelist
You might get better performance with the game Prototype, since it's a CPU intensive game, but it'll be somewhere like a 2-3 fps increase. On all other GPU intensive games, the increase will likely be 0-1 fps. Save your money for a quad core later on
-
Alexrose1uk Music, Media, Game
Depends on the resolution as well, lowering the resolution will reduce the GPU demand, and thus show more effect on the CPU. CPU limitations will also lower minimum framerate.
Typically heavy RTS games like Supreme Commander, and most of the popular MMOs are just as heavy CPU wise as GPU wise; however, in this case no I don't think it's worth it. If you had a P8400 (2.26), or were going to a 2.8-3GHz CPU, then maybe, but at the moment you'll be paying $120 for 5-10% performance at most; unless you sold off your old P8400 for about ~$70.
Even then, that's $50 for minimal performance.
Save your cash for a GPU upgrade if you can find one and your laptop will support an MXM upgrade; or wait until you need that CPU power. -
I realise this might sound odd, but in reality it is not. The GPU has a lot to render, this we all accept, but it can't do its work until certain actions have taken place that are controlled by the CPU. OK, so if a game has lots and lots to render, then you want to give it the "all clear" to start that rendering process as early as possible. The best way to do this is to have a CPU that completes its instructions very quickly.
Now it might be that the work the CPU has to do is only 2 commands. This is hardly CPU intensive. However, the faster these are done, the sooner the GPU can start its work, so you'd still have an increase of some sort.
Let's go through some calculations:
OK, so imagine your GPU takes 0.03 seconds to render a frame. The work your CPU does can be done in 0.01 seconds with a 2.4GHz CPU (for arguments sake).
OK, so the time it takes for one frame to be rendered is 0.03 + 0.01 = 0.04 seconds. That results in a render rate of 25 frames per second.
OK, so now with a CPU that is 2.5GHz. Say for argument's sake the CPU's work time is reduced by 5%, so it now becomes 0.0095 seconds. The time for one frame now becomes 0.0395 seconds, which results in 25.32 frames per second.
25.32 / 25 = roughly a 1.3% increase.
Now let's try the same calculations, but this time we're going to say it's an equally CPU/GPU intensive game, so the CPU work at 2.4GHz takes 0.03 seconds and so does the GPU. I'll just display the result below rather than the full working:
2.4GHz = 1 / (0.03 + 0.03) = 16.67 frames per second
2.5GHz = 1 / (0.03 + (0.03 x 0.95)) = 17.09 frames per second
Increase = 17.09 / 16.67 = roughly a 2.6% increase
Finally, let's try the same calculations, but this time we're going to say it's a CPU-intensive game, so the CPU work at 2.4GHz takes 0.05 seconds while the GPU still takes 0.03 seconds. Again, just the results:
2.4GHz = 1 / (0.03+0.05) = 12.5 Frames per Second
2.5GHz = 1 / (0.03+ (0.05 x 0.95)) = 12.9 Frames per second
Increase = 12.9/12.5 = 3.2% increase
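The three scenarios above can be reproduced with a small Python sketch of the same frame-time model: total frame time is CPU work plus GPU work, and a faster clock shrinks only the CPU part. The timings are the post's made-up figures, not measurements.

```python
# Frame-time model from the worked examples above. Total frame time is
# the CPU portion plus the GPU portion; the faster CPU cuts only the
# CPU portion by ~5%, as assumed in the post.

def fps(cpu_s: float, gpu_s: float) -> float:
    """Frames per second for given CPU and GPU times per frame."""
    return 1.0 / (cpu_s + gpu_s)

scenarios = {
    "GPU-bound": (0.01, 0.03),   # CPU seconds, GPU seconds per frame
    "balanced":  (0.03, 0.03),
    "CPU-bound": (0.05, 0.03),
}

for name, (cpu_s, gpu_s) in scenarios.items():
    base = fps(cpu_s, gpu_s)
    faster = fps(cpu_s * 0.95, gpu_s)   # 5% less CPU time per frame
    print(f"{name}: {base:.2f} -> {faster:.2f} fps (+{faster / base - 1:.1%})")
```

Running it gives the same 1.3%, 2.6%, and 3.2% gains as the hand calculations: even the most CPU-bound scenario barely moves the frame rate.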
OK, in reality it is much more complicated than this, and of course the times I've given are completely made up, but you get the point. The more CPU-intensive games will get a small boost; the less CPU-intensive games will still get a very, very small boost. Overall though, would you really be willing to spend over $100 for what could be only a 1-3% increase in frames per second?
Personally, for me the answer is a resounding no. Especially because in reality things are much more complicated than those figures above and therefore performance increases are likely to be smaller.
EDIT: Oh, and with regards to the L2 cache difference that I've not accounted for: this one is really difficult to quantify, seeing that the L2 cache really only comes into its own when the same data are being used again and again, or the CPU just needs somewhere quick to dump some data to retrieve again in a few cycles. -
How about going from P8400 to a T9600 with a 9800GTS?
-
Highly unlikely that you'll notice any difference at all.
-
Just to add to this where I left off with regards to the L2 Cache - I found this article here at the AnandTech site which is interesting; the chart below has been taken from their site:
This shows that an increased level 2 cache can have different impacts based on what application you are using, although it's worth noting that for the games there can be quite a wide variance in the performance increase.
To explain why this is we need to understand what the level 2 cache is exactly.
OK so we know your CPU operates at very high speeds, but it can't do that without any data to work with. It could get that data from the hard drive, but these are incredibly slow compared to the CPU, so instead data is loaded from the HDD to the RAM. The CPU also doesn't really want to get the data from the RAM, because although it is much much faster than the HDD, it's still pretty slow compared to the CPU.
This is where the cache comes into play. Data is loaded from the RAM into the cache, and from here the CPU has much faster access times. The reason being that in modern CPUs, the cache is actually integrated into the chip (though it wasn't always like so).
Cache is extremely fast in comparison to the RAM, but due to its design and manufacturing process, cache is typically much, much more expensive per megabyte than RAM. Therefore we don't often have anywhere near as much of it as we do RAM.
OK, so we now know what cache is, why is there so much difference between different programs?
Well that is actually quite easy to explain.
Your CPU executes incredibly basic instructions; but lots and lots of them very fast. Depending upon what it is being asked to do, it might need to use the same data time and time again. OK, so let's say that the CPU has some work to do on some data as detailed below:
Data  Size
A     1MB
B     1MB
C     1MB
So it needs to work with this data in the following sequence: A, B, C, A, C
You have a 2MB cache
Data A is pulled from the RAM and placed in the cache. The CPU accesses it from there and stores the result back in the cache. Data B is then pulled from the RAM and placed in the cache for the CPU to work on. The cache is now full. The CPU completes its work and places data B back in the cache.
Data C is pulled from the RAM, but it has nowhere to go, so data A is overwritten (after being sent back to the RAM); the CPU finishes its work and writes it back to the cache. Oh dear, now the CPU needs to work with data A again, but that needs to be grabbed from the RAM to the cache again.
Having a larger cache means that data A would still be in the cache and would be faster for the CPU to access rather than having to wait for the data to be brought from the RAM to the cache again.
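The A, B, C, A, C walkthrough above can be simulated in a few lines of Python. This uses a simple least-recently-used eviction rule, which is an assumption for illustration (real cache replacement policies are more involved, as the post itself notes below).

```python
# Tiny simulation of the cache walkthrough above: a cache holding a
# fixed number of 1MB blocks, evicting the least recently used block
# when full. Counts how many accesses have to go out to RAM.

from collections import OrderedDict

def ram_fetches(sequence, cache_blocks):
    cache = OrderedDict()                   # keys kept in LRU order
    misses = 0
    for block in sequence:
        if block in cache:
            cache.move_to_end(block)        # hit: refresh LRU position
        else:
            misses += 1                     # miss: fetch from RAM
            if len(cache) >= cache_blocks:
                cache.popitem(last=False)   # evict least recently used
            cache[block] = True
    return misses

seq = ["A", "B", "C", "A", "C"]
print("2MB cache:", ram_fetches(seq, 2), "RAM fetches")   # A gets evicted and re-fetched
print("3MB cache:", ram_fetches(seq, 3), "RAM fetches")   # only the three cold fetches
```

With a 2-block cache the sequence costs 4 RAM fetches (A is evicted and re-fetched, as in the walkthrough); with a 3-block cache it costs only the 3 unavoidable cold fetches.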
As with my other explanations, this is incredibly simplified. In truth there are intelligent memory controllers that organise when things need to be sent to different places, and recognise that different pieces of data will be needed again in a few cycles and so would overwrite the cache with a little more finesse; but again, this is just so you get the picture.
So different programs might need the same data to be worked on time and time again. These programs will benefit more from having a larger cache. Other programs require lots of different data to be worked on. These will still benefit from having a larger cache as more data can be available to the CPU at any one point, but not nearly as much of a performance increase as the first group of programs.
I hope that comes across relatively clear; and combined with my other two explanations this should just echo that a faster processor with more cache will always result in better performance, but looking at the results from AnandTech, this can be marginal or it can be quite large. It's down to each individual person to determine whether a potentially small increase in performance outweighs the additional cost.
Also, as an interesting tidbit: it has been said that in a dual-core processor, losing the level 2 cache can have a larger impact than only running on a single core! -
ViciousXUSMC Master Viking NBR Reviewer
That is a very small boost in cpu power, and not worth $120 at all.
Get the cheaper cpu, and then overclock it -
Yes indeed. My point however is that many people say there will be no improvement with a better CPU whereas in fact there will always be an improvement - just might be very very small. As I'd said earlier, the price to performance ratio can become incredibly skewed; but if you really want to then you will buy it regardless
-
I don't believe many people can see a 1% difference (in gaming, where most people ask about upgrading their CPU), so while technically there is an improvement, in real life you can't notice it, so there basically is no improvement.
-
Have a read of my post earlier on. Gaming is quite a generic name. Some games are more cpu intensive than others. 1% increase can very often be had, but whether people would actually notice the difference - that I doubt very much
-
Go for the P8600 without hesitation.
Processor improve gaming performance?
Discussion in 'Hardware Components and Aftermarket Upgrades' started by VietTran, Jul 14, 2009.