Is the T5250 (1.5GHz) bad for playing games?
Games which say 1.6GHz minimum.. even though the system they're talking about is a Pentium 4.. Thanks.
-
If the requirements say 1.6GHz single core (based on P4 speeds), then yes, your 1.5GHz C2D T5250 will handle it fine. I think most requirements are based on single-core speeds anyway.
Since the C2D is based on the P3 architecture, it's actually faster per clock than a P4... So even a single-core 1.5GHz CPU based on the Core 2 architecture will be faster than a 1.6GHz P4. -
A Pentium M is also based on a Pentium 3..
-
Correct.
They're all very efficient chips, and highly scalable in terms of Hertz, unlike their predecessor (the P3). -
yay.
I shall order this laptop at the weekend, and hopefully get it for Christmas.
I was just concerned. I thought it would be slow and useless..
It's not - IS IT?!
It can handle things pretty well.. such as.. internet, music, MSN at the same time, and games like CCS - the laptop has an 8400M GS.
Meh. What can you do for £550!? It's a steal! -
Hmmm... would you say that the T5250 is capable of running Crysis at, say, low to medium settings (when paired with a good graphics card like the 8600M GT)?
-
Probably not. But then again, the 8600M GT is a pretty powerful card, so yeah, low/medium might do it...
-
Charles P. Jefferies, Lead Moderator
Are there any higher CPU options?
-
lol, nah not for me, i'm on a rather tight budget, but i guess i already knew what the T5250 is and is not capable of anyway. actually, i'm planning to get a Vostro with the very similar T5270 next year, assuming Dell keeps their prices stable into quarter two of next year lol
-
Or you could go for a cheaper desktop with higher specs...
or buy a used one with higher specs, but still in warranty, etc.
1.5GHz is pretty much the minimum for anything nowadays... -
maybe.... but i definitely need a notebook for the portability...
i mean, we are talking about a laptop here. i already know i can't expect a gaming monster out of it
no matter how good laptops get, desktops apparently will always be better
-
There seems to be a lot of bad advice going around in this thread. First of all, Centrino is NOT a processor; it's a marketing sticker that essentially says there's an Intel processor, motherboard chipset, and wireless card. That's it.
Anyway, the processor should be enough for most games, seeing as how you have a dedicated video card. However, there will be a minor boost in performance with higher end processors, but nothing too significant. -
There is a reason minimum requirements were invented, and it's not because of a MINOR boost. It's a must-have for decent gaming.
I strongly advise you to save a bit more and get the faster CPU, at least a T7250 if possible. You will not regret it. -
I have to agree. Unless you need it right away, try and save up and get the best possible setup you can afford. When buying computers you get what you pay for.
-
I won't disagree with you in terms of minimum requirements, but a 1.5GHz dual core processor is leagues ahead of a minimum requirement of a 1.6GHz single core processor. A T5250 will outperform a P4 3.0GHz processor in nearly 100% of benchmarks and real applications.
-
Current games pretty much specify the minimum as being a Core Duo or Core 2 Duo...
That pretty much implies either a CD, C2D or X2 CPU
-
Ahh.. this thread has grown a lot..
1.5GHz..
But I won't need to play new games.. the point of getting a cheap one is to be able to afford a PS3
But I want the odd game on it. Like CCS, and our house has a pretty decent desktop anyway..
You had me worried about the 1.5 then.. but it's a laptop anyway, I'll probably only have it for 2 years.. and 1.5 should be enough. I don't intend on playing Crysis. I took it back last week.. I didn't like it.
Haa. -
Doh! Thanks for the wake-up call. I do remember reading something along those lines a long while back.
-
Not correct
"Centrino" is not a processor.
And do you know what a "Hertz" is?
Actually, a faster processor will make a larger difference in systems with dedicated graphics, and the boost is significant. FPS in newer games increased about 15% on my computer with an 8800GTS after a CPU overclock.
Low settings at a modest resolution. FYI, the minimum requirement for Crysis is 2.0GHz with XP and 2.2GHz with Vista on a Core 2. So you would need a T7500 or better to reach it at stock speed. -
Yarp. One complete wave cycle.
Can be applied to anything that's wave-like in nature (e.g. oceanic tides can be described using "hertz"), but it's generally applied to electronics and their frequencies, or CPU clock cycles in this context.
I don't see what that has to do with anything, as it's a well known fact that P3 didn't scale very well, and couldn't hit very high hertz, so Intel opted for the P4 architecture. However, the Core 2 architecture is, more or less, an improved P3 architecture that can actually scale very nicely to high hertz.
And Centrino isn't a processor, you're right. It's a branding scheme. I already know that. Read my post right above yours.
-
The Core architecture is nothing like the Pentium 3 architecture or the NetBurst (P4) architecture. With each successive generation, the clock frequency increased substantially. The maximum frequency of a P2 was somewhere around 400-500MHz; the fastest mass-released Pentium 3 was 1GHz, if I'm not mistaken, compared to the P4's 3.6GHz. That being said, an E6300 @ 1.8GHz will outperform a Pentium D 960 while running at half the frequency and with significantly less heat output.
-
I play Crysis on medium/high settings just fine on a desktop with a SINGLE CORE processor (AMD 3700). Just goes to show the minimum requirements are crap and it's all about the video card (8800GTS).
-
worst processor ever
-
Fair enough. I'm not saying they are similar, just that the Pentium M was based on the P3, and that the Core architecture was based on the Pentium M... Thus the root for both is the P3 architecture, no?
P.S. It's good to see there are quite a few intelligent fellows around here, such as yourself, that I can learn from.
-
Doing a quick search, you're actually right about the Pentium M having some similarities with the P6 (Pentium 3) architecture. In fact, it's highly derived from that architecture. Intel went back to its roots and improved the clock-for-clock efficiency. Pretty brilliant, actually.
-
In the end the specs won't really matter; what matters is how well you play the game. I have a friend back in college who was able to beat the crap out of his opponent in CS on a Celeron 333MHz with an integrated video card against a P3. The pixels were so big on the Celeron, yet he could still score headshots with an AK-47.
-
If they were completely similar, then P-Ms wouldn't be P-Ms, but P3s.
-
Dustin Sklavos, NBR Reviewer
A couple things:
1. So everyone is clear, Core 2 Duos are derived from Core Duos which are derived from Pentium Ms which are derived from Pentium IIIs with some improvements from the Pentium 4 architecture.
2. The Core 2 Duo T5250 may be the low man on the totem pole, but I've spent the last few hours benching it against a 1.9GHz Turion 64 X2, and the T5250 beat it soundly at everything I threw at it except wPrime. A 1.5GHz Core 2 Duo may be slow by C2D standards, but it's still freaking FAST.
3. People who flippantly dismiss technology just because there's much faster stuff available don't seem to understand that technology doesn't work on that "zero to 100" scale. The baseline is constantly being raised and right now, the baseline is pretty freaking high.
4. If you kids can't behave yourselves, I'm gonna start making you take turns playing Blair Witch Project in the corner, you feel me? -
The T5250 should match that 8400 perfectly. Unless you do a lot of encoding, you should be happy with it.
-
No... (the time it takes for) "one complete wave cycle" is the period. One hertz simply means one cycle per second. The term hertz, when referring to a CPU, means how many clock cycles it completes per second.
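To put numbers on it (my own quick arithmetic): frequency and period are reciprocals, f = 1/T. A 1.5GHz CPU completes 1.5 billion cycles every second, so each individual cycle lasts T = 1 / (1.5 x 10^9 Hz), roughly 0.67 nanoseconds.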
I'm not sure how you could scale hertz??? -
I've got the T5250 in my XPS M1330 and it works perfectly for Vista, internet, Office 2007 and medium-settings gaming (where the 8400 is the limiting factor).
Don't let the fanboys drive you crazy, they don't know what they're talking about -
There's no such thing as a "limiting factor" or bottleneck in gaming. A faster CPU would increase gaming performance when paired up with ANY GPU, and the same is true vice versa. This is of course assuming that the computer could actually run the game.
Don't let the T5250 owners tell you to settle for a slow processor, they don't know what they're talking about.
But yeah, there are very few applications that a 2.0GHz C2D would handle smoothly but a 1.5GHz C2D wouldn't. It's also not worth upgrading just for gaming. -
Name one application that a T5250 cannot handle, please?
-
My Pinnacle software for watching TV. My T7500 gets almost maxed out when running at a high resolution. I'm sure a processor that's 33% slower would throttle.
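(Checking that 33% figure against stock clocks, my arithmetic: the T5250 runs at 1.5GHz and the T7500 at 2.2GHz, so the T5250 has about 1 - 1.5/2.2, roughly 32% less clock; put the other way, the T7500 has roughly 2.2/1.5 - 1, about 47% more.)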
A T7500 would give you that extra 33% space to work with before the CPU throttles. This is especially helpful when multitasking. -
LOL, bad implementation then:
My T5250 plays both Blu-ray and HD-DVD at 1920x1080p via HDMI out on my 50" plasma without any problems.
Compared to your 575p TV card, this is 40x the number of pixels being transferred at the same time, while dealing with H.264 or VC-1 decompression.
LOL, I can do all of that in real time. I'll just finish 2 nanoseconds later.....
As stated before, modern processors have so much headroom that it really doesn't make much difference.
BTW, from a CPU UTILISATION perspective, there is NO difference.
When my system doesn't respond, it might snap out of it at 5 seconds;
yours might be out of it at 4.9999 seconds;
but when mine is at 100%, yours will be at 100% as well......
-
Charles P. Jefferies, Lead Moderator
Please keep the conversation non-confrontational; it was getting a bit too warm in here for my preferences. Thanks.
-
Dustin Sklavos, NBR Reviewer
Except that you're actually wrong. You are literally, factually wrong. There is ALWAYS a limiting factor with system performance in anything. By virtue of how computers are built, there is ALWAYS going to be a weakest link.
A faster CPU will NOT increase gaming performance when paired up with any GPU. If that were the case, scores of benchmark sites wouldn't run games at low resolutions to benchmark CPUs, since the difference would be evident even at high resolutions. The fact of the matter is that with games, the graphics card is almost always the "limiting factor," or as most people refer to it, the bottleneck. A modern CPU has to be pretty lousy to be the bottleneck in a game; the T5250 would only ever see a problem like this paired with some kind of obscene video card like an 8800 series.
But I will tell you right now, it doesn't matter if you're running a Pentium Dual-Core T2060 or a desktop Q9770 overclocked to 4GHz, if you're running Crysis on a GeForce 8400GS, that GeForce is going to deliver the same performance when pushed at higher settings.
There is ALWAYS a bottleneck. In ANY application. If that weren't the case, I should theoretically be able to put 8GB of RAM in my desktop instead of 4GB and see performance gains across the board, instead of just in specialized programs where RAM is the most deciding factor in performance.
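To illustrate the point, here's a toy model (my own sketch, nothing more): treat the CPU and GPU as two stages working in parallel across frames; whichever stage takes longer per frame dictates the frame rate.

def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # The slower of the two stages sets the output frame rate.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Weak GPU: the game is GPU-bound, so a faster CPU changes nothing.
print(effective_fps(5.0, 30.0))   # ~33 fps
print(effective_fps(8.0, 30.0))   # still ~33 fps
# Swap in a strong GPU: now the CPU becomes the limiting factor.
print(effective_fps(8.0, 4.0))    # ~125 fps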
And I have practical experience with this, since I'm actually sitting here with a 1.9GHz Turion and a 1.5GHz Core 2 Duo T5250, testing them for this site. Anyone who'd tell you a T5250 is slow is sorely mistaken and clearly hasn't spent a whole lot of time with one. It may be slow compared to other Core 2 Duos, but that doesn't make it actually slow.
The fact of the matter is that a system needs to be balanced so that no one component severely holds back the entire machine's performance, and believe me when I say that I've spent the last year learning that the hard way with my desktop.
If you were anything resembling right, it wouldn't be half as funny as it is that Apple ships a lowly GeForce 7300GT in their $2,499 Mac Pro. Clearly four cores running at 2.66 GHz can pick up the slack from that in a game, right? -
@Pulp:
so what do you think, how would the T5250 @ 1.5GHz compare to a desktop P4 3.0GHz or Athlon equivalent?
-
I didn't want to read through the whole flamewar you guys got going here, but this is how it goes:
For most games, the GPU is the bottleneck; even the lowest Core 2 will easily be able to handle the geometric processing, AI, physics, etc... The games that would cause the processor to be the bottleneck are usually RTSs like Supreme Commander, where there are many units on the field. -
Dustin Sklavos, NBR Reviewer
Depends on the P4.
But honestly, the performance-per-clock of the T5250 is pretty freaking high. It'd murder the P4 in anything multithreaded. In single-threaded apps I'd still take the T5250. It's better optimized and more efficient.
-
I bought a 1.8GHz Core Duo on purpose instead of a 2.0GHz Core Duo, because that extra 200MHz would have cost me 150+ euro, and I wouldn't notice any difference at all in most games. So what I want to say is: Core Duos and Core 2 Duos are more powerful than you think. But when you buy an 8800M GTX, I think it's advisable to buy a really powerful CPU, since that's twice as fast as my GPU.
-
I think this conversation has started to drift from its original topic. Pretty much, the straight answer is that the T5250 will play ANY game out there and ANY game for the next three years or so.
Whether your GPU will handle higher settings is another question.. -
as long as it's dual core it's fine for any gaming in the near future.... heck, i'm still running most games on a 1.7GHz P4, so i can't imagine what improvement any dual core would bring, be it Pentium D, Core, or Core 2
-
Except that you're actually wrong. You are literally, factually wrong. You have absolutely no idea how games work, and how most games push (a single core of) your CPU to 100%.
Don't believe me? Underclock your CPU when paired up with ANY graphics card. Run ANY modern game and record its average FPS. Overclock your CPU and do the same.
Post results please.
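If anyone actually runs that test, here's a minimal Python sketch for crunching the numbers (purely illustrative; it assumes a FRAPS-style log with one frame time in milliseconds per line, and the file names are made up):

def fps_stats(path):
    # Read per-frame times in milliseconds, one per line.
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / ms for ms in frame_ms if ms > 0]
    avg = len(frame_ms) / (sum(frame_ms) / 1000.0)  # total frames / total seconds
    return min(fps), avg, max(fps)

for run in ("frametimes_underclocked.txt", "frametimes_overclocked.txt"):
    lo, avg, hi = fps_stats(run)
    print(run, "min %.0f / avg %.0f / max %.0f fps" % (lo, avg, hi))

If the average barely moves between the two runs, the game is GPU-bound at those settings; if it tracks the clock change, it's CPU-bound.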
Contrary to what your rant states, weak and/or integrated GPUs actually prove my point the most. I know because I've actually tried it on both an Intel GMA 3000 and an 8800GTS.
LoL the words fast and slow are COMPARATIVE words.
Nice rant for someone who has absolutely no idea what they're talking about. -
Dustin Sklavos, NBR Reviewer
I don't want this to degenerate into the flame war it already has, but I just want to say: I'm really going to enjoy this.
Now, first of all, I'm going to point out that the choice of hardware you tried this on is inherently flawed: the GMA 3000 still runs a LOT of its functions in software, so CPU speed is going to have an effect on it by virtue of the fact that it's barely doing any of the heavy lifting. The 8800s, at least when they came out, were also well known to be CPU limited, though as my independent test proved, that doesn't seem to be the case anymore.
So I did you a solid. I took your word for it and I tested FEAR on my desktop as seen in my sig. Now, since my 8800GTS is overclocked, it stands to reason that even at the highest settings it should be somewhat CPU bound.
I ran my Q6600 locked at 1.6GHz and 2.4GHz, its highest and lowest power states respectively. FEAR was run with all settings maxed, Soft Shadows off (because no one ever uses them ANYHOW), VSync off, at 16xCSAA and 16xAF and at 1920x1200. So theoretically, my 8800GTS should be getting pushed pretty freaking hard, even overclocked.
So:
@ 1.6GHz: min. 32 fps, max. 127 fps, avg. 60 fps
@ 2.4GHz: min. 31 fps, max. 122 fps, avg. 60 fps
Well now that's weird. Those variances are within the margin of error. So at 800MHz slower on the processor, FEAR shows...no performance difference!
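(Spelling out the arithmetic, mine not the benchmark's: 1.6GHz to 2.4GHz is a 50% clock increase; the minimums differ by 1 fps, about 3%, the maximums by 5 fps, about 4%, and the averages are identical. That's run-to-run noise, not CPU scaling.)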
What have we learned here that applies to the T5250? Well, let's go apples-to-apples.
The T5250's FSB runs at 667MHz against the Q6600's at 1066MHz. So there's a deficit there.
The T5250 runs 100MHz slower than the Q6600's slowest power state, but the Q6600's slowest power state didn't really seem to be an issue here so I'm pretty sure we've got some wiggle room here.
And the T5250 has a quarter the L2 of the Q6600, but the Q6600 is just two E6600s anyhow, so as far as performance goes, it's more appropriate to say "half the L2." Another deficit here, this usually results in about a 10% performance diff. or less.
Yet somehow, I doubt these things are going to come together to really punish the T5250. Certainly it might run a bit slower, but it's pretty clear that we've hit a threshold where we're...
Wait for it...
Entirely GPU bound!
So now that I've passed your test and proven exactly what I said before, I'm going to go ahead and use some of that weird logic crap that seems to have become really passe around here as of late: would it make sense for one random guy on a forum with low rep to be right about gaming software and hardware if that meant that paid writers who work with this stuff on a daily basis all over the internet would have to be wrong about it?
So, I've successfully proven you wrong using logic, using research, and using straight up number-based facts. Is there anything else you want me to pull from or should we consider this case closed?
As for the term comparative, while the T5250 is certainly on the slow side of Intel's modern lineup, Intel's modern lineup is grossly faster than most of AMD's lineup, not to mention grossly faster than most of the processors out in the world right now. I hope I've proven the T5250 will be just. Freaking. Fine. -
HAHAAHAHHAA Nice FAKE benchmarks!!
I get 63.6 FPS in the opening video of 3DMark06 with my E4300 clocked at 1.8GHz, and a whopping 72.8 when it's clocked at 3.0GHz. But why should anyone believe personal unpublished claims?
http://www.extremetech.com/article2/0,1697,1996940,00.asp
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2963&p=8
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2903&p=6
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2866&p=16
There are hundreds, even thousands more of these published benchmarks by major websites. If you need more proof, let me know.
Good Game? Another nice very long rant btw. You're good at it!!
EDIT: Here's one with the exact same system over and under clocked.
http://www.anandtech.com/printarticle.aspx?i=2452 -
Dustin Sklavos, NBR Reviewer
You told me to test it. I tested it. I deliberately pushed the GPU as hard as I could to ensure that any differences in CPU performance would be eked out. They weren't. If you like, I can tell you exactly how my system was set up when I ran these benches so you can duplicate it yourself.
But since you won't believe me, I'll go ahead and point out the parts in each article that refute what you've been saying. Did you actually read any of them?
From the article: "All games are run at a resolution of 1280x1024, with details turned up high. We wanted to test by running the games the way real gamers do—at a reasonably high resolution with all the eye candy turned on. The vast majority of monitors sold these days are either 17" or 19", and 1280x1024 is almost always the native resolution for these displays. We're using a high-quality, speedy, but affordable graphics card: a GeForce 7900 GT (currently costing less than $300). While we want to play the games the way people expect to be able to—with the graphics options turn up high—we didn't want the graphics card to be the limiting performance factor, so we never enabled anti-aliasing or anisotropic filtering."
"Oblivion has just as many CPU bound situations as it does GPU bound situations, this benchmark is merely one of the CPU bound cases. We tested Oblivion with the 1.1 patch and the game's default Very High quality settings, which can still be CPU bound with an 8800 GTX at 1600 x 1200"
1. Uses both terms "CPU bound" and "GPU bound" in the description, refuting your entire argument.
2. An 8800 GTX is an absurdly powerful card that is, in itself, proven to be CPU bound by most modern processors until you start hitting the newer high-end Intel chips.
3. Game was deliberately tested at a point where it can still be CPU bound. No AA or AF was applied, which would've stressed the card to the point where it would've been the limiting factor.
Test system was, again, running an 8800 GTX. Gaming performance was, again, run at settings that would not stress the GPU. The same settings, in fact, that were used in the previous article.
"Unfortunately, as is the case here, coarse threading often results in limited performance improvements. A system is only as fast as its slowest component, and in the case of Quake 4 the client thread still does the vast majority of work, including all the graphics and sound rendering. The end result is that even with high-end graphics cards, you still quickly become GPU limited."
On that page, you'll notice gaming performance levels off with the top four processors showing at most a 10fps difference (well over 100fps, past the point of being noticeable), despite a clock difference of nearly 25% from the lowest to highest clocked chips.
On the second and third pages, F.E.A.R. and Oblivion continue this trend, and the only rogue is Rise of Legends; Rise of Legends is a RTS, a genre which is notorious for taxing the CPU heavily. Modern RTSes are frequently CPU bound.
All they're doing is proving what I've already been saying.
Doom 3 and Half-Life 2 show virtually no change in performance. Far Cry and UT2004 will not be GPU bound at 1280x1024 by a GeForce 6800 Ultra.
"Looking closely it is very interesting that two of the most recent games, Doom 3 and Half Life 2, seem to have their performance almost entirely dictated by the graphics card. With the increases in graphics power we tested all games at 1280x1024 where possible. Whether 2.2GHz, 2.4GHz with double the cache, or 2.7GHz, Doom 3 and Half Life 2 performed about the same using the same graphics card at the same 1280x1024 resolution.
The game benchmarks we use for memory testing were much more responsive to processor speed increases. Wolfenstein ET saw a 18.4% increase in a 22.5% CPU speed boost, and Q3 increased 16%. UT2004 performed similarly at 16%, while Far Cry was in-between at 9.5%. These results should give you a good idea of why we use Wolfenstein-ET and Q3 for memory benchmarking."
It bears noting that Wolfenstein is running over 100fps on even the lowest config, and that Quake 3 is already running in excess of 400fps; it's pretty obvious neither of these games is GPU bound at 1280x1024, especially since Quake 3 ran on the freaking Intel Extreme Graphics 2 at 1024x768.
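(To make the arithmetic explicit, my reading of those numbers: Wolfenstein ET turned a 22.5% clock increase into an 18.4% FPS increase, i.e. roughly 0.184/0.225, about 82% scaling, which is what a CPU-bound game looks like. Doom 3 and Half-Life 2 turned the same clock increase into roughly 0% more FPS, which is what a GPU-bound game looks like: same rig, opposite behaviour, depending entirely on where the bottleneck sits.)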
I'm not getting into this with you anymore. All these articles have been proving what I've been saying. I'd appreciate it if Chaz could close this thread.
Honestly, if you were some random troll that would be one thing. But I don't want people being fed misinformation here. They come to NBR because they know that we know our stuff, and they just want correct information to make proper buying decisions. When you don't know what you're talking about, don't read the articles, and drag me into a flamewar and force me to beat you over the head with facts, it winds up wasting a lot of time, giving people the wrong information that will lead them to waste money on hardware they don't need. -
Charles P. Jefferies, Lead Moderator
I am closing this thread because we've reached the end of this topic and if left open it will degenerate further.