Is the T5250 (1.5GHz) bad for playing games?
I mean games which say 1.6GHz minimum... even though the system they're talking about is a Pentium 4. Thanks.
-
Since the C2D is based on the P3 architecture, it's actually faster per clock than the P4... so even a single-core 1.5GHz CPU based on the Core 2 architecture will be faster than a 1.6GHz P4. -
They're all very efficient chips, and highly scalable in terms of clock speed, unlike their predecessor (the P3). -
yay.
I shall order this laptop at the weekend and hopefully get it for Christmas.
I was just concerned. I thought it would be slow and useless...
It's not - IS IT?!
It can handle things pretty well... internet, music, and MSN at the same time, plus games like CSS (Counter-Strike: Source) - the laptop has a mediocre GeForce 8400M GS.
Meh, what can you do for £550? It's a steal! -
Hmmm... would you say that the T5250 is capable of running Crysis at, say, low to medium settings (when paired with a good graphics card like the 8600M GT)?
-
Probably not. But then again, the 8600M GT is a pretty powerful card, so yeah, low/medium might do it...
-
Charles P. Jefferies Lead Moderator Super Moderator
Are there any higher CPU options?
-
Or you could go for a cheaper desktop with higher specs...
Or buy a used one with higher specs but still under warranty, etc.
1.5GHz is pretty much the minimum for anything nowadays... -
I mean, we are talking about a laptop here. I already know I can't expect a gaming monster out of it; no matter how good laptops get, desktops apparently will always be better.
-
There seems to be a lot of bad advice going around in this thread. First of all, Centrino is NOT a processor; it's a marketing sticker that essentially says there's an Intel processor, motherboard chipset, and wireless card in the machine - that's it.
Anyway, the processor should be enough for most games, seeing as you have a dedicated video card. There will be a minor boost in performance with higher-end processors, but nothing too significant. -
I strongly advise you to save a bit more and get a faster CPU, at least a T7250 if possible. You will not regret it. -
Current games pretty much specify the minimum as a Core Duo or Core 2 Duo...
That pretty much implies either a CD, C2D, or X2 CPU. -
Ahh... this thread has grown a lot.
1.5GHz...
But I won't need to play new games... the point of getting a cheap one is to be able to afford a PS3.
But I want the odd game on it, like CSS, and our house has a pretty decent desktop anyway.
You had me worried about the 1.5 then... but it's a laptop anyway, and I'll probably only have it for two years... 1.5 should be enough. I do not intend on playing Crysis; I took it back last week... I didn't like it.
Haa. -
"Centrino" is not a processor.
And do you know what a "Hertz" is?
-
It can be applied to anything periodic in nature (e.g., oceanic tides can be described in hertz), but it's generally applied to electronics and their frequencies, or to CPU clock speed in this context (see the sketch below).
I don't see what that has to do with anything, as it's a well-known fact that the P3 didn't scale very well and couldn't hit very high clock speeds, so Intel opted for the P4 architecture. However, the Core 2 architecture is, more or less, an improved P3 architecture that can actually scale very nicely to high clocks.
And Centrino isn't a processor, you're right. It's a branding scheme. I already know that. Read my post right above yours.
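To make the unit concrete: hertz is just cycles per second, so a clock speed translates directly into a cycle time and a cycle budget per frame. A minimal Python sketch (the 60fps frame rate is only illustrative):

```python
# Hertz = cycles per second; a 1.5GHz CPU completes 1.5 billion cycles each second.
GHZ = 1e9

def clock_period_ns(freq_hz):
    """Duration of a single clock cycle, in nanoseconds."""
    return 1e9 / freq_hz

def cycles_per_frame(freq_hz, fps=60):
    """Clock cycles available per rendered frame at a given frame rate."""
    return freq_hz / fps

t5250 = 1.5 * GHZ
print(clock_period_ns(t5250))   # ~0.667 ns per cycle
print(cycles_per_frame(t5250))  # 25,000,000 cycles per 60fps frame
```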
-
I play Crysis on medium/high settings just fine on a desktop with a SINGLE-CORE processor (an Athlon 64 3700+). Just goes to show the minimum requirements are crap and it's all about the video card (8800GTS).
-
worst processor ever
-
P.S. It's good to see there are quite a few intelligent fellows around here, such as yourself, that I can learn from.
-
In the end the specs won't really matter; what matters is how well you play the game. I had a friend back in college who was able to beat the crap out of his opponents in CS on a Celeron 333MHz with an integrated video card, up against P3s. The pixels were so big on the Celeron, yet he could still score headshots with an AK-47.
-
Dustin Sklavos Notebook Deity NBR Reviewer
A couple things:
1. So everyone is clear: Core 2 Duos are derived from Core Duos, which are derived from Pentium Ms, which are derived from Pentium IIIs, with some improvements borrowed from the Pentium 4 architecture.
2. The Core 2 Duo T5250 may be the low man on the totem pole, but I've spent the last few hours benching it against a 1.9GHz Turion 64 X2, and the T5250 beat it soundly at everything I threw at it except wPrime. A 1.5GHz Core 2 Duo may be slow by C2D standards, but it's still freaking FAST.
3. People who flippantly dismiss technology just because there's much faster stuff available don't seem to understand that technology doesn't work on that "zero to 100" scale. The baseline is constantly being raised and right now, the baseline is pretty freaking high.
4. If you kids can't behave yourselves, I'm gonna start making you take turns playing Blair Witch Project in the corner, you feel me? -
The T5250 should match that 8400 perfectly. Unless you do a lot of encoding, you should be happy with it.
-
I'm not sure how you could scale hertz??? -
I've got the T5250 in my XPS M1330; it works perfectly for Vista, internet, Office 2007, and medium-settings gaming (where the 8400 is the limiting factor).
Don't let the fanboys drive you crazy; they don't know what they're talking about. -
Don't let the T5250 owners tell you to settle for a slow processor, they don't know what they're talking about.
But yeah, there are very few applications that a 2.0GHz C2D would handle smoothly but a 1.5GHz C2D wouldn't. It's also not worth upgrading just for gaming. -
Name one application that a T5250 cannot handle, please.
-
A T7500 (2.2GHz) would give you roughly 47% more clock speed to work with before the CPU maxes out (a T7250 at 2.0GHz, about 33%). This is especially helpful when multitasking. -
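For what it's worth, the headroom arithmetic is simple. A quick sketch using Intel's listed clock speeds:

```python
# Extra clock headroom of one CPU over another, as a percentage:
def extra_headroom_pct(slower_ghz, faster_ghz):
    return (faster_ghz - slower_ghz) / slower_ghz * 100

print(extra_headroom_pct(1.5, 2.0))  # T5250 -> T7250: ~33.3%
print(extra_headroom_pct(1.5, 2.2))  # T5250 -> T7500: ~46.7%
```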
LOL, bad implementation then:
My T5250 plays both Blu-ray and HD-DVD at 1920x1080p via HDMI out on my 50" plasma without any problems.
Compared to your 576p TV card, that's roughly five times the number of pixels transferred at the same time, all while handling H.264 or VC-1 decompression.
LOL, I can do all of that at real-world tempo. I'll just finish two nanoseconds later...
As stated before, modern processors have so much headroom that it really doesn't make much difference.
BTW, from a CPU UTILISATION perspective, there is NO difference:
when my system doesn't respond, it might snap out of it after 5 seconds;
yours might snap out of it after 4.9999 seconds.
But when mine is at 100%, yours will be at 100% as well... -
Charles P. Jefferies Lead Moderator Super Moderator
Please keep the conversation non-confrontational; it was getting a bit too warm in here for my preferences. Thanks.
-
Dustin Sklavos Notebook Deity NBR Reviewer
A faster CPU will NOT automatically increase gaming performance when paired with any given GPU. If it did, benchmark sites the world over wouldn't run games at low resolutions to benchmark CPUs, since the difference would be evident even at high resolutions. The fact of the matter is that with games, the graphics card is almost always the "limiting factor," or as most people refer to it, the bottleneck. A modern CPU has to be pretty lousy to be the bottleneck in a game; the T5250 would only ever see a problem like this paired with some kind of obscenely powerful video card like an 8800 series.
But I will tell you right now: it doesn't matter if you're running a Pentium Dual-Core T2060 or a desktop Q9770 overclocked to 4GHz - if you're running Crysis on a GeForce 8400GS, that GeForce is going to deliver the same performance when pushed at higher settings.
There is ALWAYS a bottleneck. In ANY application. If that weren't the case, I should theoretically be able to put 8GB of RAM in my desktop instead of 4GB and see performance gains across the board, instead of just in specialized programs where RAM is the most deciding factor in performance.
And I have practical experience with this, since I'm actually sitting here with a 1.9GHz Turion and a 1.5GHz Core 2 Duo T5250, testing them for this site. Anyone who'd tell you a T5250 is slow is sorely mistaken and clearly hasn't spent a whole lot of time with one. It may be slow compared to other Core 2 Duos, but that doesn't make it actually slow.
The fact of the matter is that a system needs to be balanced so that no one component severely holds back the entire machine's performance, and believe me when I say that I've spent the last year learning that the hard way with my desktop.
If you were anything resembling right, it wouldn't be half as funny as it is that Apple ships a lowly GeForce 7300GT in their $2,499 Mac Pro. Clearly four cores running at 2.66 GHz can pick up the slack from that in a game, right? -
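The bottleneck point is easy to model. In a toy sketch (the millisecond figures below are invented purely for illustration), a frame is ready only when the slower of the CPU's work and the GPU's work finishes, so speeding up a CPU that's already faster than the GPU buys nothing:

```python
# Toy frame-time model: a frame completes only when BOTH the CPU work
# (game logic, draw calls) and the GPU work (rendering) are done.
# Real pipelines overlap the two, but max() captures why the slower
# component sets the pace.
def fps(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

print(fps(cpu_ms=8.0, gpu_ms=16.0))  # 62.5 fps: GPU-bound
print(fps(cpu_ms=6.0, gpu_ms=16.0))  # still 62.5 fps: a faster CPU changed nothing
print(fps(cpu_ms=8.0, gpu_ms=10.0))  # 100.0 fps: only a faster GPU helps here
```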
@Pulp:
So what do you think - how would a T5250 @ 1.5GHz compare to a desktop P4 at 3.0GHz, or the Athlon equivalent? -
I didn't want to read through the whole flamewar you guys got going here, but this is how it goes:
For most games the GPU is the bottleneck; even the lowest Core 2 will easily handle the geometry processing, AI, physics, etc. The games that would make the processor the bottleneck are usually RTSes like Supreme Commander, where there are many units on the field. -
Dustin Sklavos Notebook Deity NBR Reviewer
But honestly, the performance-per-clock of the T5250 is pretty freaking high. It'd murder the P4 in anything multithreaded. In single-threaded apps I'd still take the T5250. It's better optimized and more efficient.
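A rough way to see why per-clock efficiency trumps raw clock speed: throughput scales roughly with IPC (instructions per clock) times clock. The IPC figures in this sketch are illustrative guesses, not measurements:

```python
# Crude throughput model: perf ~ IPC x clock speed.
# IPC values are rough illustrative assumptions, not benchmark data.
def relative_perf(ipc, ghz):
    return ipc * ghz

p4_3ghz  = relative_perf(ipc=1.0, ghz=3.0)  # baseline: desktop P4 at 3.0GHz
c2d_core = relative_perf(ipc=2.2, ghz=1.5)  # one Core 2 core at 1.5GHz
print(c2d_core / p4_3ghz)  # ~1.1x per core, before counting the second core
```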
-
I bought a 1.8GHz Core Duo on purpose instead of a 2.0GHz one, because those extra 200MHz would have cost me 150+ euros, and I wouldn't notice any difference at all in most games. What I want to say is: Core Duos and Core 2 Duos are more powerful than you think. But if you buy an 8800M GTX, I think it's advisable to pair it with a really powerful CPU, since that card is about twice as fast as my GPU.
-
I think this conversation has started to drift from its original topic. Pretty much, the straight answer is that the T5250 will play ANY game out there, and ANY game for the next three years or so.
Whether your GPU will handle higher settings is another question... -
As long as it's dual-core, it's fine for any gaming in the near future... heck, I'm still running most games on a 1.7GHz P4, so I can only imagine the improvement any dual core would bring, be it Pentium D, Core, or Core 2.
-
Don't believe me? Underclock your CPU, paired with ANY graphics card, run ANY modern game, and record its average FPS. Then overclock your CPU and do the same.
Post results please.
Contrary to what your rant states, weak and/or integrated GPUs actually prove my point the most. I know because I've actually tried it on both an Intel GMA 3000 and an 8800GTS.
Nice rant for someone who has absolutely no idea what they're talking about. -
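If anyone actually runs that underclock/overclock test, here's a minimal sketch for judging the results (the FPS averages and the 5% noise threshold are hypothetical):

```python
# Compare average FPS between an underclocked and an overclocked run.
# A few percent of run-to-run variance is normal; treat small deltas as noise.
def relative_change_pct(base_fps, new_fps):
    return (new_fps - base_fps) / base_fps * 100

underclocked, overclocked = 60.0, 60.4   # hypothetical averages
delta = relative_change_pct(underclocked, overclocked)
print(f"{delta:+.1f}%")                  # +0.7%
print("CPU-bound" if abs(delta) > 5 else "within noise / GPU-bound")
```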
Dustin Sklavos Notebook Deity NBR Reviewer
Now, first of all, I'm going to point out that the choice of hardware you tried this on is inherently flawed: the GMA 3000 still runs a LOT of its functions in software, so CPU speed is going to have an effect on it by virtue of the fact that it's barely doing any of the heavy lifting. The 8800s, at least when they came out, were also well known to be CPU limited, though as my independent test proved, that doesn't seem to be the case anymore.
So I did you a solid. I took your word for it and I tested FEAR on my desktop as seen in my sig. Now, since my 8800GTS is overclocked, it stands to reason that even at the highest settings it should be somewhat CPU bound.
I ran my Q6600 locked at 1.6GHz and 2.4GHz, its lowest and highest power states respectively. FEAR was run with all settings maxed, Soft Shadows off (because no one ever uses them ANYHOW), VSync off, at 16xCSAA and 16xAF, and at 1920x1200. So theoretically, my 8800GTS should be getting pushed pretty freaking hard, even overclocked.
So:
@ 1.6GHz: min. 32 fps, max. 127 fps, avg. 60 fps
@ 2.4GHz: min. 31 fps, max. 122 fps, avg. 60 fps
Well now that's weird. Those variances are within the margin of error. So at 800MHz slower on the processor, FEAR shows...no performance difference!
What have we learned here that applies to the T5250? Well, let's go apples-to-apples.
The T5250's FSB runs at 667MHz against the Q6600's at 1066MHz. So there's a deficit there.
The T5250 runs 100MHz slower than the Q6600's lowest power state, but that lowest power state didn't really seem to be an issue, so I'm pretty sure we've got some wiggle room.
And the T5250 has a quarter the L2 of the Q6600, but the Q6600 is just two E6600s anyhow, so as far as performance goes, it's more appropriate to say "half the L2." Another deficit, then; this usually results in about a 10% performance difference or less.
Yet somehow, I doubt these things are going to come together to really punish the T5250. Certainly it might run a bit slower, but it's pretty clear that we've hit a threshold where we're...
Wait for it...
Entirely GPU bound!
So now that I've passed your test and proven exactly what I said before, I'm going to go ahead and use some of that weird logic crap that seems to have become really passé around here as of late: would it make sense for one random guy on a forum with low rep to be right about gaming software and hardware if that meant that paid writers who work with this stuff on a daily basis all over the internet would have to be wrong about it?
So, I've successfully proven you wrong using logic, using research, and using straight up number-based facts. Is there anything else you want me to pull from or should we consider this case closed?
As for the term "comparatively slow": while the T5250 is certainly on the slow side of Intel's modern lineup, Intel's modern lineup is grossly faster than most of AMD's lineup, not to mention grossly faster than most of the processors out in the world right now. I hope I've proven the T5250 will be just. Freaking. Fine. -
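Putting the FEAR numbers from above through the same kind of noise check makes the point numerically - a 50% clock increase bought no average frame rate at all:

```python
# Dustin's FEAR results at the Q6600's two power states: (min, max, avg) fps
runs = {1.6: (32, 127, 60), 2.4: (31, 122, 60)}

clock_gain = (2.4 - 1.6) / 1.6 * 100                           # +50% clock...
fps_gain = (runs[2.4][2] - runs[1.6][2]) / runs[1.6][2] * 100  # ...for +0% avg fps
print(f"clock: +{clock_gain:.0f}%, avg fps: {fps_gain:+.0f}%")
```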
I get 63.6 FPS in the opening scene of 3DMark06 with my E4300 clocked at 1.8GHz, and a whopping 72.8 when it's clocked at 3.0GHz. But why should anyone believe personal unpublished claims?
http://www.extremetech.com/article2/0,1697,1996940,00.asp
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2963&p=8
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2903&p=6
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2866&p=16
There are hundreds, even thousands more of these published benchmarks by major websites. If you need more proof, let me know.
Good game? Another very long rant, by the way - you're good at it!!
EDIT: Here's one with the exact same system over and under clocked.
http://www.anandtech.com/printarticle.aspx?i=2452 -
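For scale, here's the arithmetic on those E4300 numbers - a roughly 67% clock increase bought about a 14% frame rate increase:

```python
# E4300 3DMark06 opening scene, as reported above:
fps_18, fps_30 = 63.6, 72.8                   # at 1.8GHz and 3.0GHz
clock_gain = (3.0 - 1.8) / 1.8 * 100          # ~66.7%
fps_gain = (fps_30 - fps_18) / fps_18 * 100   # ~14.5%
print(f"+{clock_gain:.0f}% clock -> +{fps_gain:.0f}% fps")  # +67% clock -> +14% fps
```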
Dustin Sklavos Notebook Deity NBR Reviewer
But since you won't believe me, I'll go ahead and point out the parts in each article that refute what you've been saying. Did you actually read any of them?
1. Uses both terms "CPU bound" and "GPU bound" in the description, refuting your entire argument.
2. An 8800 GTX is an absurdly powerful card that is itself CPU-bound with most modern processors; it isn't fully fed until you start hitting the newer high-end Intel chips.
3. The game was deliberately tested at a point where it can still be CPU-bound: no AA or AF was applied, since applying them would've stressed the card to the point where it would've been the limiting factor.
On that page, you'll notice gaming performance levels off, with the top four processors showing at most a 10fps difference (well over 100fps, past the point of being noticeable), despite a clock difference of nearly 25% from the lowest- to highest-clocked chips.
On the second and third pages, F.E.A.R. and Oblivion continue this trend, and the only rogue is Rise of Legends; Rise of Legends is an RTS, a genre which is notorious for taxing the CPU heavily. Modern RTSes are frequently CPU-bound.
"Looking closely it is very interesting that two of the most recent games, Doom 3 and Half Life 2, seem to have their performance almost entirely dictated by the graphics card. With the increases in graphics power we tested all games at 1280x1024 where possible. Whether 2.2GHz, 2.4GHz with double the cache, or 2.7GHz, Doom 3 and Half Life 2 performed about the same using the same graphics card at the same 1280x1024 resolution.
The game benchmarks we use for memory testing were much more responsive to processor speed increases. Wolfenstein ET saw a 18.4% increase in a 22.5% CPU speed boost, and Q3 increased 16%. UT2004 performed similarly at 16%, while Far Cry was in-between at 9.5%. These results should give you a good idea of why we use Wolfenstein-ET and Q3 for memory benchmarking."
It bears noting that Wolfenstein is running over 100fps on even the lowest config, and that Quake 3 is already running in excess of 400fps; it's pretty obvious neither of these games is GPU bound at 1280x1024, especially since Quake 3 ran on the freaking Intel Extreme Graphics 2 at 1024x768.
I'm not getting into this with you anymore. All these articles have been proving what I've been saying. I'd appreciate it if Chaz could close this thread.
Honestly, if you were some random troll that would be one thing. But I don't want people being fed misinformation here. They come to NBR because they know that we know our stuff, and they just want correct information to make proper buying decisions. When you don't know what you're talking about, don't read the articles, and drag me into a flamewar and force me to beat you over the head with facts, it winds up wasting a lot of time, giving people the wrong information that will lead them to waste money on hardware they don't need. -
Charles P. Jefferies Lead Moderator Super Moderator
I am closing this thread because we've reached the end of this topic and if left open it will degenerate further.