I know many of you have heard that the M11xR2 isn't that significant an upgrade, and some people even cite the benchmark comparison from Dell's website. Well, after about half an hour to an hour of research, I came to the conclusion that that information is not accurate, or at least is being quoted inaccurately. I made a new post because I believe this is important for people trying to decide which revision of the M11x to buy, or whether to keep or cancel their current M11xR1 order. Because the misquoted Dell numbers have spread so widely, I'm giving you my analysis with links you can follow back so you can verify the results yourself. The truth is buried in the mud somewhere, while a search just comes back with a bunch of wrong benchmark numbers that spread like wildfire. I wrote it here first, so this is where you should look, and the ongoing discussion of the M11xR2 should continue here.
Addendum: BatBoy (mod) wants the discussion of M11xR2 benchmarks to stay here, so I guess benchmark comments/remarks stay in this thread.
-
Reading that, there is no way L4D only gets 33fps with no AF and no AA on the M11x R1.
Hell, I get 35fps while running the custom stereoscopic 3D drivers. -
Yeah, I averaged 35fps on high settings with 2xAA/2xAF.
-
I don't have Left 4 Dead, but the 3DMark Vantage scores likely hold for the R1 as well, because to a large extent they depend on the graphics card and not so much on the CPU (versus, say, 3DMark06).
-
I don't think some people here understand how this works. The GPU doesn't run the game, the CPU does; the GPU runs the 3D engine. That means something has to feed data into the 3D engine, and that's the CPU. So if the CPU is underpowered and only fast enough to feed in 35 frames, that's what you're going to get out of it regardless of how good the GPU is. Imagine the GPU is only being taxed at 80% while running a game because of that bottleneck; what happens when you replace the CPU and remove the bottleneck? The 20% of GPU headroom that wasn't usable before now shows up, and this is especially true when the game is designed to lean heavily on the GPU. So let me go back and point out that in one of the reviews I linked to earlier, there is a graph of the M11xR1's framerate in MW2 comparing the game running with 2xAA versus 4xAA. The graph shows relatively little change in performance when the graphics quality is pushed up, which means there is slack in the GPU and the CPU is the bottleneck. Comparing that with Dell's published MW2 framerates, it does make sense that the M11xR2 makes a significant improvement simply by replacing the CPU, at least in MW2.
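To put the bottleneck idea in concrete terms, here's a toy model (the numbers are made up purely for illustration, not measured on either machine):

```python
# Toy model of a CPU bottleneck: the slower component caps the framerate.
# All numbers are made up for illustration; they are not measured values.

def effective_fps(cpu_fps_limit, gpu_fps_limit):
    """The framerate you actually see is capped by whichever limit is lower."""
    return min(cpu_fps_limit, gpu_fps_limit)

r1 = effective_fps(cpu_fps_limit=35, gpu_fps_limit=50)  # slow CPU, capable GPU
r2 = effective_fps(cpu_fps_limit=55, gpu_fps_limit=50)  # faster CPU, same GPU
print(r1, r2)  # 35 50 -> the GPU headroom only appears once the CPU cap is lifted
```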
-
I think you're thinking too hard about it.
We'll all see with the official benchmarks. For me, synthetic benchmarks are completely different from real-life situations. For games, I can see overall CPU speed making much more of a difference. It would be nice if Dell let us overclock the R1 processors higher, if the cooling can take it. -
It's actually interesting you should note the change in AA settings - that's a very good indicator of wasted GPU potential (where the CPU can't keep up). If you can increase the AA substantially in a game with little framerate drop then you know that it's your CPU holding your framerate back at the lower AA levels. M11xR1 owners could try this out on any games that struggle to run to see if it's the CPU at fault (and therefore whether R2 will help in these games or not).
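For anyone who wants to turn that AA test into a rule of thumb, here's roughly the check I mean (the 10% cutoff is my own arbitrary choice, nothing official):

```python
# Rule-of-thumb AA test: if raising the AA barely drops the framerate,
# the GPU had headroom and the CPU was the limiting factor.
# The 10% threshold is arbitrary; adjust it to taste.

def looks_cpu_bound(fps_low_aa, fps_high_aa, threshold=0.10):
    drop = (fps_low_aa - fps_high_aa) / fps_low_aa
    return drop < threshold

print(looks_cpu_bound(36, 34))  # True  -> tiny drop, probably CPU-limited
print(looks_cpu_bound(60, 42))  # False -> big drop, the GPU is the limit here
```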
-
I have all of the systems here if you guys want me to try something specific... Let me know.
-
And this should provide some insight into the new i5-520UM CPU's performance compared to the old SU7300:
Intel Core i5-520UM benchmarked on Asus UL30JT -
Crysis, Bad Company 2 and DiRT 2 would be interesting to look at. Crysis and Bad Company 2 are two games that could probably benefit from a faster CPU for playability reasons (at medium settings and higher) and DiRT 2, whilst already playable at high, still seemingly benefits from a faster CPU at lower detail levels so that could be another game to check for differences in framerate at medium settings. The present tests on NotebookCheck do point to the R1 CPUs being underpowered for the GPU, but AA tests should help to confirm this.
@ miXwui - I think he simply means he has the presently released ones. And the laptop you linked to would of course be a good relatively direct comparison for tasks that don't make use of the graphics card. -
Also, can anyone clarify what "Overclockable" means specifically on the Arrandales? On the configuration page, it has "- Overclockable" after all the turbo information.
The rep I spoke to via Dell Chat was totally clueless: he thought all i7s were quad-cores, thought there was a 6-core mobile i7 option, and claimed the M11x could reach "3GHz at your own risk," saying it was possible to reach over 4GHz on the Extreme Edition i7-640UM. Which... doesn't exist.
Is there some sort of extra overclock option supported to reach 2.66GHz?
Edit: Random thought... I don't know why everyone uses MW2 to benchmark. I can get a decent 45 FPS on my integrated Intel X3100 (GMA 965), with dips into the 20s depending on when my chipset starts to throttle itself. Granted, it's tweaked and everything is at the lowest, but still... MW2 was at least well coded (I don't really like MW2 that much, but I do give Infinity Ward credit for good coding). -
Lol looks like I was beaten to the punch on the Bad Company 2 request. Could you run through some multiplayer matches to see the performance? Also if it's not too much trouble maybe some WoW performance, maybe running through Dalaran or some instances?
-
-
-
Or better yet, what's the base overclocked frequency of the i7-640UM? That's specific.
Thanks. -
-
More off topic:
You can say whatever you want, guys... but I'm expecting less than a 15% performance increase going from the Core 2 Duo SU7300 to the i7-640UM...
The full OC'd "2.26GHz" performance IS NOT like a regular i7 at 2.26GHz, and I also read somewhere that the top OC rarely shows up on the Core i7s... Also, at 2.26GHz it will consume more energy and produce more heat. The i7-640UM has a TDP of 18W at NORMAL usage, and I expect the power draw to be higher at 2.26GHz... I hope I'm wrong and we see OMG performance.
In my opinion 15-20% is a NICE performance boost, but you know, the price is another topic... with that price tag I could get a basic M15x and upgrade later.
-
133MHz*8 = 1.06GHz (Core i5 stock speed)
166MHz*8 = 1.33GHz (Core i5 max overclock)
133MHz*14 = 1.86GHz (Core i5 max Turboboost)
166MHz*14 = 2.32GHz (Core i5 max overclock and Turboboost)
133MHz*9 = 1.20GHz (Core i7 stock speed)
166MHz*9 = 1.49GHz (Core i7 max overclock)
133MHz*16 = 2.26GHz (Core i7 max Turboboost)
166MHz*16 = 2.66GHz (Core i7 max overclock and Turboboost) -
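Following up on that list: the arithmetic is just bus clock times multiplier, so it's easy to reproduce (note that 133MHz x 16 actually works out to about 2.13GHz, so either that multiplier or the 2.26GHz figure is slightly off; the multipliers here are the ones quoted above, not verified specs):

```python
# Effective clock = base clock (BCLK) x multiplier.
# The multipliers are the ones quoted in the post above, not verified specs.
for bclk in (133, 166):            # MHz: stock bus speed vs. max overclock
    for mult in (8, 14, 9, 16):    # i5 stock, i5 turbo, i7 stock, i7 turbo
        print(f"{bclk}MHz x {mult} = {bclk * mult / 1000:.2f}GHz")
```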
-
It's known as the most unoptimized PC port of perhaps the last ten years. It's a great game, but horribly CPU-demanding.
It was coded for the PS3 with its Cell CPU, which has many smaller cores, each of which can be dedicated to a specific job. Developers will often use one core solely for physics, another core for textures, another for the soundscape, and so on.
When they ported GTA4 to PC to make the Christmas rush, they did a poor job, and it's still reflected in the product today.
The Xbox 360 and PS3 both had much more powerful CPUs than GPUs. It's always been like this for consoles, and thus many console ports are very heavy on the CPU side, but poor porting jobs also often put unneeded stress on the GPU.
If it can run GTA4 smoothly at decent settings, then it truly is a capable machine. It is generally believed that the only people who should bother are those with quad-cores or the most powerful desktop dual-cores.
Besides that?
Source games were also made to run more on the CPU than the GPU. Back in '04, something like 70% of the world's computers ran on integrated graphics. I remember HL2 being decent on a crappy Nvidia FX 5200 *shudders* -
But it wouldn't really max out at 2.66, right?
Indeed it is still a good performance boost.
I hope erawneila shares the numbers! -
-
-
(20-30 minutes less, just talked with a rep)...
But yeah, lol, I meant to say Turbo Boost instead of OC.
Edit: wooooooooooha, 1000 posts!!!!! -
-
-
We're talking literally about minimum running power draw when Dell cites the battery life. With the old model it was 9.0 hours; with the new model it's 8.5 hours. Breaking this down, considering the battery is still 85Wh (meaning it would go from full to empty in one hour if powering a device with a draw of 85 watts), for the old model to last 9.0 hours the average draw would be 9.4W, and for the new model to last 8.5 hours it would be 10.0W; an increase in draw of 0.6W. The new Core i processors, however, have a TDP that is 8.0W higher than that of the Core 2 Duo processors, but that includes their integrated graphics, so it's anyone's guess how much more power they actually use.
EDIT: Turns out it's actually a 63Wh battery; I was thinking of the M15x >.<
So, correcting that, the old model would have seen an average draw of 7.0W for Dell's figure and the new model a draw of 7.4W, making a 0.4W difference. -
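The arithmetic there is just battery capacity divided by rated runtime; a quick sketch if anyone wants to redo it with other figures (the 63Wh capacity and the 9.0/8.5 hour claims are the ones from the post above):

```python
# Average power draw implied by Dell's battery-life claims: capacity / runtime.
battery_wh = 63  # watt-hours (corrected figure from the post above)
for model, hours in (("R1", 9.0), ("R2", 8.5)):
    print(f"{model}: {battery_wh / hours:.1f}W average draw")
# R1: 7.0W, R2: 7.4W -> roughly a 0.4W difference
```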
-
I think The Sims, World of Warcraft, Starcraft, and Counter Strike would like to contest that statement.
-
-
Fact of the matter is, even just including PC sales of "real" games, it's still probably lower than tenth place.
Basically, it shouldn't be the benchmark because it's simply not a game that'll stress test a gaming PC and any gamer would know that. It's a pointless game to test.
(Also, I'm pretty sure Gordon Freeman wants a word too... and that's saying something, 'cos he doesn't like to talk) -
I just hope a retail-release M11x R2 gets reviewed soon... I just want an actual benchmark pitting the old M11x against the M11x R2 before I buy this new version and give my current M11x to my wife as
...
-
Here is an interesting tidbit from AnandTech. It's actually a review of the M11xR1 from March 30, but the reviewer weighs in on the future changes from the upgrade to Core i5/i7. Again, it's still a prediction, but it's from an actual professional reviewer.
-
I still don't understand what this thread is about...
-
-
As the world err thread turns with erawneila
-
I got busy today and didn't have a chance to run any benchmarks, but I'll try them on Monday. The M11xR2 has overclocking options - 15 steps increasing by 2MHz each, so everyone should be able to overclock to some degree. Not everyone will get the max... -
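Assuming those 15 steps are 2MHz bumps to a 133MHz base clock (my guess based on the earlier multiplier discussion, not a confirmed spec for the R2 BIOS), the resulting bus speeds would look roughly like this:

```python
# Possible bus speeds if the 15 overclock steps each add 2MHz to a 133MHz BCLK.
# The 133MHz base and the 16x turbo multiplier are assumptions taken from
# earlier posts in this thread, not confirmed specs for the R2 BIOS.
base_mhz, step_mhz, steps = 133, 2, 15
for i in range(steps + 1):
    bclk = base_mhz + i * step_mhz
    print(f"step {i:2d}: BCLK {bclk}MHz -> i7 turbo ~{bclk * 16 / 1000:.2f}GHz")
```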
Don't forget, the 2.26GHz Turbo Boost only applies to a single core; with both cores loaded it's something like 1.8GHz.
-
Right now we're just looking at M11xR1 benchmarks and trying to determine how much of a bottleneck the SU7300 was, and how much would be lifted if that bottleneck weren't there (yes, speculation works). We're also waiting for new, actual M11xR2 benchmarks as those numbers start to arrive. -
-
-
I would look forward to a 3DMark06 score to compare with the R1.
-
-
I've no experience with these new Core iX CPUs and Turbo Boost, but it would be interesting to study the CPU frequency and active cores while gaming. If it throttles up and down because some of these "factory-configured" limits are exceeded, I can't imagine it being a good gaming experience.
I've seen a couple of other threads regarding Core iX CPUs and throttling... so I am suspicious. It remains to be seen whether it is really possible to keep two cores overclocked at 2.28 GHz steady during a complete gaming session. -
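If anyone with an R2 wants to check this, a minimal logging sketch along these lines would show the clock and per-core load over a gaming session (it assumes the third-party psutil package is installed, and the frequency it reports may not capture every brief Turbo excursion):

```python
# Log the reported CPU clock and per-core load once a second while gaming.
# Uses the third-party "psutil" package (pip install psutil); run it in the
# background, play for a while, then graph the CSV afterwards.
import time
import psutil

with open("cpu_log.csv", "w") as log:
    log.write("time,freq_mhz,per_core_load\n")
    for _ in range(600):  # roughly ten minutes of samples
        freq = psutil.cpu_freq().current        # current clock in MHz
        load = psutil.cpu_percent(percpu=True)  # load per logical core
        log.write(f"{time.time():.0f},{freq:.0f},{'/'.join(map(str, load))}\n")
        time.sleep(1)
```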
We need someone to run Prime95 on both one and four threads to see what it maxes out at!
-
I find that WoW 25-man raids are the most CPU-intensive thing (and I've done them on many different laptops), and I was disappointed that the M11xR1 can't even do 25-man ICC (or other 25-mans) at minimum settings.
People can quote in any thread that "this lappy played WoW on Ultra with FPS in Dal no lower than 20fps," but that's not a real indicator of performance IMO.
I spoke to a few different reps about upgrading the CPU in the machine, and the common theme is that, for those of us going that route, it won't be available for at least a month or so.
Will the CPU upgrade give us better 25-man performance in the M11x? That's yet to be determined, and until I see video benchmark proof with my own eyes (or Blizz codes their games better), I don't expect it to. -
Actually, I played ICC25 quite well. Not on Ultra, but at medium-high settings, with basically shadows toned down a bit, particles a little, and ground clutter a bit. The only completely unplayable game I've tried so far on the M11x (R1) was the APB beta.
-
I really think a lot of the problems people have with the M11x are due to poor configuration of their settings. Things like turning off the page file for WoW, or turning shadows down.
The M11x is short on bandwidth more than anything else, so if you get slowdowns you need to cut things like shadows, which eat bandwidth. There is just no way a game like WoW is CPU-dependent. I ran 40-man Molten Core on all high settings years ago on a P4 and (I think) a Radeon 9700 Pro.
WoW's minimum requirements:
- Intel Pentium 4 1.3 GHz or AMD Athlon XP 1500+
- 512 MB or more of RAM (Vista requires 1 GB or more of RAM)
- 3D graphics processor with Hardware Transform and Lighting with 32 MB VRAM, such as an ATI Radeon 7200 or NVIDIA GeForce 2 class card or better
Seriously guys, it only needs a P4 1.3GHz. A single core of the SU7300 is easily twice as fast. -
You may have noticed that present-day WoW has way better graphics than early-release WoW.
Please try to play WoW with a P4 1.3GHz; good luck mate, you will need it.
Some won't admit it, but WoW is a hardware-hungry game, and the crowded areas especially can kill your machine. -
Particularly with the shadows: you need a great computer to play it smoothly with no drops. I can totally see why the shadow settings would be tied to the CPU.
But wait a minute, guys... can you even play WoW 25-man? The raid UI must be so small it would be impossible to see anything!? Lol... actually I would love to see a screenshot if any of you have one from a 25-man ICC.