Hell... Why can't everything be coded as well as Source... Source may be simple, but at least it just works xD It's so hard to come across a well-coded AAA game these days... Can you think of one?
-
Source? OMG... Maybe because no one wants the crappy physics and bare-looking graphics that the Valve Source engine is known for.
I hate that engine and every damn game made in it (though I liked Garry's Mod with the zombie mod).
It has weak sound effects and dull interactivity with the world, and everything looks so bare (as if something is missing in the texture quality + effects).
Though I saw videos of the new Counter-Strike (CS:GO) and I liked it; maybe they improved the Source engine (after years with no advances, maybe they've now taken another step, or I might just be wrong).
UNREAL ENGINE is, for me, what almost all game engines should look like. It gives realistic effects, tons of good immersive sound effects, tons of shaders that want to jump out of your LCD, with so much texture quality + brilliance. And all at very high performance!
Keep Cool -
Source is very streamlined. It's good because it runs on anything. You can run Source on a toaster. It's just the Valve games themselves that look bad in it. Yes, the sound effects are shoddy, and yes, the textures suck, but it's incredible what CAN be done with it. Look at Portal 2 and Dear Esther.
I love the Unreal Engine too, but it cannot touch Source on the ability to run on virtually any hardware unless you drastically tone down the fidelity of the game. I do have a bit of a CryEngine fetish, mind. It's not well coded and it's hard to run, but the lighting is beautiful, and when you make the textures huge it looks amazing. -
Yeah, but the thing is, heavy physics in Source = LAG OF DEATH (at least it was a hell of a lag in Garry's Mod with not much built; I could never fill a whole map with little things, or the whole server would freeze for seconds and run terribly)...
I like CryEngine as a beautiful storyteller, but not as a true gaming engine. Unreal Engine also supports some super high-quality shader effects, lighting, a sense of reality... and doesn't cost as much to run as CryEngine.
Source only runs on everything because they never update it; it's the same thing it has always been. 2012 games still look so 2005 on the Source engine (not talking about CS:GO); all they did was add a few more effects and little things, but it's still the same old engine, while Unreal Engine has been updated a lot.
Unreal Engine 2 can also run on every PC, and some games with mods even look better in terms of shadowing, lighting and some other effects. America's Army, for example, proved that this engine could make something with a huge feel of reality. And then you have old games like Deus Ex: Invisible War with an HD texture pack; hell, it's years ahead of Source in terms of a true gaming experience (even against today's Source in terms of shadows, lighting effects and a few other things).
Unreal Engine 3 can also run well on every PC of today; the thing is that today's UE3 games use much more power than they did back in 2005, so they require a better PC than the first UE3 games did (at least it keeps getting better and better).
Unreal Engine never stops. From game to game they improve quality so many times, and the engine always offers more and more. Even within the same generation of Unreal Engine it changes a lot (like UE2 to UE2.5)... And now it's going to UE4, HELL YEAH!
Keep Cool -
-
Yes, it's from there... But you do have to keep in mind that they probably tested hundreds of laptops with different specs, and some GPUs even have lower clocks than others despite being the same GPU, because the laptop manufacturer can change things.
It's just a reference to see how good/bad it is.
Also, you surely won't ever have a stable 30 FPS at those settings; realize that even their low settings have things enabled which you disabled, like anti-aliasing...
You're just playing the game the way it was never supposed to be played. For me, playing a huge title at minimum settings with no AA, no anisotropic filtering... nothing at all = losing the title! (I prefer not playing. I have so many games here to try from ages ago, which only ran on low to mid on my old 7600 GT; I'll probably play them now after getting the W110ER.)
You can even play BF3 easily on ultra on the GT 650M, but only if you cut back a lot of other things (disabling AA, anisotropic filtering, blur... basically making the whole game a distorted mess, with everything untextured just a few meters away from you since you have no anisotropic filtering).
And I don't see why everyone talks about the Alienware 11... If you're so proud of it, there's a 14-inch MSI that has latest-generation i7 CPUs, also has the GT 540M, can last about 10 hours on battery too, weighs a little over 1 kg, and doesn't cost that much.
I like Alienware, but I hate that 11-inch one; it looks KIDDY, with an ugly front shaped like an old rubber eraser, and it's way too extreme for my taste. I'd prefer the CLEAN and SIMPLE Clevo W110ER design a million times over. Though a subtle white LED-backlit keyboard with a little brightness would have been very welcome.
But I still don't care much, since even my Saitek Cyborg, which is full of lights, distracts me a lot, and I don't like playing on it.
PS: Drivers also get updated, so the GT 540M and GT 650M they tested will give a few more FPS in those tested games with newer driver/BIOS updates (though probably not much more).
Keep Cool -
Clean and simple? Have you even seen the W110ER lol, it looks plastic and cheap and far uglier. Clearly an Alienware hater; won't waste my breath with you.
-
WingNut: very impressive review with links to all sorts of other reviews~ Are there any post-review follow-ups easily accessible? I'd like to see opinions on battery life and whatnot after having owned the beastie for a while.
Wondering just how much portability I am gaining if, even for document work, I'll need to have a charger block on hand (or just an extra battery) -
Maybe achievable if you turn down everything: processor, GPU, brightness, minimal usage etc... but then, hold on, why not get a ULV and a 540M GT -
Nope. Latest ME/EC/BIOS + Prema UV BIOS mod gives me 5+ hours with wi-fi on, brightness at 30%, browsing with flash even.
-
Flash!!! That settles the argument; I stand down, defeated
We need to get us a "Prema" over on the Alienware forum! -
Oh there is an update goody
That makes me want the laptop much more now lol
Basically, I think the M11x looks and feels great. It's a lovely laptop. I don't think the Clevo looks too shabby either, and it CERTAINLY isn't cheaply made. I used one (the Scan version) at the Scan booth at Eurogamer Expo this year, and it's really nicely made. I just don't like the keyboard and the lettering lol.
I seriously need to ask about getting the W110ER now. I love how big this debate has gotten! It's quite interesting xD
Oh, also, does the battery wear down on the W110ER? My Vostro is brilliant in this respect; I don't think it's lost a minute of runtime even though it's almost constantly plugged into the charger. -
Well, to be fair, that was with a dual core. Going to go back and run it with the 3610QM quad. I just set it to Clevo's "quiet mode" and got 5 hrs 5 mins in my browser test. With the regular balanced mode I got 4 hrs 40 mins. It's still not great, but good. I will throw in my 3610QM soon and do a battery test with it.
To be honest, I am not a huge fan of Alienware, but I did like my M11x R1 a lot. I'm just not sure why Dell decided to stop making them; I thought they were pretty popular. Perhaps an 11" with similar guts to their 14" would have been problematic for sales. Although I don't know why they couldn't have worked with nVidia to get something like a GTX 665M that was faster than most 660Ms, supported 45 W CPUs in the M14x, and offered a dual-core 35 W with a 650M in the M11x. -
It was with a dual core... Explains a lot
-
The 3632QM looks like the best option IMO. I did not expect you to go for the M11x anyway; I was just adding flame onto the fire to get a debate going.
As mentioned, most of us who bought the M11x R3 did so prior to the W110ER. Unless you specifically want a backlit keyboard, the Alienware features etc. and longer battery, the option is always going to be the Clevo. For me those things keep me from moving, but the sheer power is unquestionable, and the fact that the 650M GT actually performs as well if not better despite having DDR3 instead of GDDR5 rules out the M14x as well.
Currently there are barely any R3s for sale; eBay is scarce of them now, which was always going to happen when they pulled them from circulation. -
I'd never buy the thing with a dual core lol; I have one of the best there is.
I mainly want this for the great GPU, smaller screen/higher DPI, small size and future-proofing. -
What the heck, did someone erase my comment?
I believe I had posted something like the following:
And the front of that Alienware reminds me of this:
(the front looks like that rubber eraser; not much my style).
While the Clevo doesn't look that bad with its textured plastic, which looks something like a carbon style.
I just really hate the screen frame. But I don't care much about that, as when I get mine I will sand it and paint it a proper metallic piano black, which will give the final touch the laptop really needed.
I also painted the Asus I'm using all metallic black, and it looks like something of top quality, though it was just cheap made-in-China gray plastic...
Keep Cool -
Really would like to see the 3632QM battery run-downs. Really wish there were a way to cut the power consumption down by a big amount. Only today I was in a Starbucks with no power outlet free D=
-
Lower clocks and voltages are probably the way to save a mass of power. And screen brightness/Bluetooth/Wi-Fi etc. turned down/off.
-
I care more about the lowest power consumption possible than about time on battery, really...
I've never used a laptop outdoors (only once or twice, honestly, just to check something).
PS: But for a 24/7 server and uploading... I want to pay the least I can for electricity at the end of the month...
Keep Cool -
Maybe I should just VPN remote game from my desktop on a crappy netbook ;D
-
90 W * 24 hrs * 30 days = 64,800 Wh = 64.8 kWh
64.8 kWh * $0.10/kWh = $6.48/mo
With regular use it will cost less than a third of that 24/7 figure. -
M11x - $3.24. 8) -
Considering most users will run at most 12 hrs/day, and likely at less than half load (still a very high usage scenario), you're looking at $1-$1.25/mo, and the difference between the M11x and the Clevo will be pennies. Count them if you like.
Here you go. Go for a few pony rides.
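For anyone who wants to plug in their own numbers, the math above is simple enough to script. A minimal sketch (the 90 W / 45 W draws and the $0.10/kWh rate are just the example figures from this thread, not measured values):

```python
def monthly_cost(watts, hours_per_day=24.0, days=30, rate_per_kwh=0.10):
    """Electricity cost of running a device at a constant power draw."""
    kwh = watts * hours_per_day * days / 1000.0  # Wh -> kWh
    return kwh * rate_per_kwh

# 90 W, 24/7, at $0.10/kWh -> matches the $6.48/mo figure above
print(round(monthly_cost(90), 2))                    # 6.48
# 45 W (roughly the M11x example) -> $3.24/mo
print(round(monthly_cost(45), 2))                    # 3.24
# 12 hrs/day at under half load, e.g. 35 W:
print(round(monthly_cost(35, hours_per_day=12), 2))  # 1.26
```

The last case lands right around the $1-$1.25/mo ballpark quoted above, which is why the M11x-vs-Clevo difference works out to pennies under normal use.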
-
Now now, there is no need to be like that, Mr Nut. I much prefer a Noddy car anyhow.
-
Haha...
Yiddo, the W110ER has improved a lot now. It's not 3 hours idle anymore; it's now 5 hours with WLAN on, a few apps in the background, and surfing the net, as people reported in its thread. That isn't bad now.
Though the Alienware lasts 2x longer, it also has a very weak CPU (more than 2x worse), while, for example, an i7 3610QM gives almost the same performance as my top-of-the-line desktop 2600K from last year, and the GT 650M is also a lot more capable...
As I said, if I wanted something for true professional use that needs battery time, I would have gone with MSI: they have a 14-inch laptop that lasts almost as long as that Alienware but has a better CPU and the same GT 540M GPU, and it's one of the lightest and thinnest laptops; the Alienware's price would probably buy two of the MSIs...
Keep Cool -
The CPU is far from crappy; clearly you have not used one, because for general tasks in Windows it processes just as fast as my i7-2760QM did and runs every game I've tried perfectly. You do not need a 3610QM to game when you can game twice as long on the i5/i7 ULV, which has a max TDP of 17 W compared to the 45 W of the quad, and which scores on par with the 720QM quad (2500-3000 3DMark points) anyway. The 540M GT is a very good GPU as well for 768p; you cannot compare it against something released 16 months later with new architecture.
As said, the MSI is 14 inch... which makes it totally unrelated to a discussion specifically about 11-inch models. -
By the time the OP decides between the two, they will be EOL
-
Once it hits January and the rumours start, my brain will switch to "you do not need it, wait for new hardware" mode and I will be safe. 8) -
The reason I need to go now is that the laptop is going to lose a lot more value next year, so if I want to buy, I should ideally do it now. I'm pretty much decided on the W110ER, but I don't know if it's going to go ahead yet lol. I'm just enjoying this as a topic of conversation about the different solutions and people's experiences.
-
True, but Haswell is claiming to be able to double integrated GPU performance, and there's the new architecture coming from Nvidia on the same process.
I want to wait to see that, I think 8) Tick tock! -
Haha, loving the tick-tock reference
There is no new architecture as far as I know, just an improved version of Kepler that Nvidia were unable to release this year due to manufacturing issues. Maxwell comes in 2014, not 2013. There is going to be more power in the GPUs, but not as significant a jump in the mobile sector as Fermi > Kepler, because the process won't be changing from 28 nm as far as I know. Intel are going to Haswell, yes, but I'm not going to see any difference in power on my notebook whatsoever xD As you say, even your ULV really isn't slow in the real world whatsoever.
Not only that, but Intel's integrated chips are improving only on the hardware side of things, not on the driver side. The chips will still handle DX11 horribly and will not work with many smaller engines. Fair enough, it may be usable when you're off the mains, but next year is going to be my desktop year, methinks.
Maxwell claims to be an incredible jump, and if it is, it'll be amazing; it makes me want to hold off getting rid of my 560 Ti next year and get rid of it in 2014 instead. However, I'm not going to get my hopes up.
ALSO: Daft Punk - End Of The Line (Tron Legacy Theme) -
Haswell, not Maxwell, fella, and it will roll out in 2013. Wiki reckons the same as this year, Q2 to Q3. Also, the Nvidia part is supposed to be new architecture according to rumours, on the same process, so a 20-25% jump; not as big as this one.
http://en.wikipedia.org/wiki/Haswell_(microarchitecture) -
No, Maxwell is the Nvidia architecture xD
I know about Haswell. I was mainly talking about GPU performance in that last post, and how Maxwell (a supposed massive leap in GPUs, although that should be taken with a pinch of salt, Nvidia being Nvidia and all), Nvidia's new architecture, isn't coming out till 2014. I thought you were saying that Maxwell was 2013, when it is not. Only Haswell comes out in 2013, and I was saying it's not likely to be that massive a jump, and certainly not too noticeable in everyday performance with current software. -
Let's see...
i7 2617M, for example, from the Alienware M11x; max score: 2959
i7 3610QM from the Clevo W110ER; max score: 8359 (almost 3x the benchmark performance)
MY TOP-OF-THE-LINE CPU FROM MY 2011 DESKTOP:
i7 2600K desktop (not laptop); max score: 9070
Now you can see the HUGE difference.
The W110ER's i7 3610QM gives almost the same performance as the best CPU available for DESKTOPS in 2011 (the ~i7 2600K; only the Extreme version sat above it, and that one is obscene with a price of $999 and doesn't give much more than the 2600K).
The Alienware M11x is FAR, FAR behind the W110ER, which is more than 2x better in overall performance. The Clevo also has a slightly better screen (don't know about the new MATTE screen being produced for the current model).
Keep Cool -
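To put those benchmark scores in perspective, the ratios are easy to check. A quick sketch using the figures quoted above (treating the scores as a rough proxy for relative performance, which is an assumption; synthetic benchmarks don't map 1:1 to games):

```python
# Max benchmark scores as quoted in the post above
scores = {
    "i7-2617M (Alienware M11x)": 2959,
    "i7-3610QM (Clevo W110ER)":  8359,
    "i7-2600K (2011 desktop)":   9070,
}

m11x = scores["i7-2617M (Alienware M11x)"]
w110er = scores["i7-3610QM (Clevo W110ER)"]
desktop = scores["i7-2600K (2011 desktop)"]

# W110ER quad vs the M11x ULV: almost 3x the benchmark score
print(round(w110er / m11x, 2))        # 2.82
# W110ER quad reaches ~92% of the top 2011 desktop chip
print(round(w110er / desktop * 100))  # 92
```

So "almost 3x" and "almost the same as the 2600K" both hold up on these numbers.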
What he is saying, Guily, is that you will not notice the difference in everyday tasks. That extra benchmarking prowess means nothing for anyone who is just gaming and surfing etc. Video encoding, 3D modelling and the like are where the 3610QM becomes relevant. The only limit on the ULV is the clock, and even then, for the purpose most will use it for, it is fine.
However, I'd take the 3610QM for future-proofing and to be better than anybody else's system -
Is that a joke?
The GT 650M + that i7 from the Alienware loses something like 15 FPS or more (it makes the DDR3 version of the GPU struggle like hell).
There's not much difference between the GDDR5 GT 650M and the DDR3 one used in the W110ER, precisely because it has a hugely powerful CPU; as soon as you swap that for even an i5 (which is still better than the old CPU used in the Alienware), you lose a lot of FPS in a lot of games. There's a test of this on the internet.
If it were as you say, I would buy a netbook with an i3 or something; it also wouldn't make any difference for normal tasks, but it would just be crap.
I try to get the best PERFORMANCE for the MONEY, and that's what the Clevo W110ER gives me. For its price I could only buy an Asus or another branded machine with a worse i7 and a GT 540M in it (this is in Portugal).
Keep Cool -
-
No, you guys are the ones saying wrong things...
Why do you think no one EVER buys this Clevo with a Pentium or i3???
You can buy it with an i3 3110M, which scores 3410; that's even a good amount better than the CPU in the Alienware M11x (i7 2617M, score: 2959).
The Clevo W110ER would only cost you around £540 with that CPU, but why do you think everyone buys it with the i7 3610QM or better?
It's the difference between playing BF3 on high and having to play it with things OFF and on medium.
I know the difference between CPU and GPU. But you don't seem to realize this is a DDR3 GPU, not the GDDR5 one; go look at people's tests and read about it. Somehow, with a bigger CPU, the DDR3 version shows almost no performance difference in most games against the GDDR5 version paired with a weaker CPU, on the GT 650M (at lower resolutions only; at 1080p the DDR3 version will struggle 100%, no matter how good your CPU is).
So since we can't change the GPU on this laptop, all we can do is put in an even bigger CPU so that the GPU stays closer to its GDDR5 version. BF3 and so many other games respond more to this CPU increase than to the DDR3 vs GDDR5 difference itself (the bigger the CPU difference, the smaller the DDR3 vs GDDR5 gap in overall gaming performance).
As soon as you put an i3 in it, you lose tons of FPS (some games lose around 20 FPS going from the GT 650M DDR3 with a weaker CPU to the GDDR5 version).
PS: This kills the theory that you can go with the lowest CPU and buy a big GPU. You would only find that your GPU never gives its true performance, because it's being limited by the CPU...
Keep Cool -
You do realise, Guily, that BF3 is almost completely GPU-oriented. You can downclock a processor by about 1.5 GHz and get the EXACT same FPS in BF3. And it's much the same with most games. Only things such as flight simulators are really affected by CPU clocks and cores, and that's because they're badly coded. Most other games are not affected by more than 5 FPS.
The reason people go for the 3610QM or whatever is because they want the best and because the i3 will bottleneck, but honestly, it's not by that big a margin. -
No... You still don't understand that this is a DDR3 GT 650M, not the GDDR5 GT 650M.
Now imagine BF3 running on the GT 650M DDR3 at 30 FPS, and on the GT 650M GDDR5 at 45 FPS. Now imagine you pair a much better CPU with that DDR3 version and it now gives near 40 FPS, while with that same improved CPU the GDDR5 version wouldn't gain as much and would probably make 49 FPS...
Somehow, the DDR3 version of this card seems to struggle a lot less the better the CPU you pair with it. And in fact BF3 still shows a good difference from i3 to i7, usually around a 10 FPS difference (on the maximum peaks, not the average).
But I'm not talking about CPU vs CPU. I'm talking about a CPU holding back the GPU, or vice versa.
Another example: that i7 in the Alienware is getting near the minimum requirements for most games. My friend has a good ATI HD 5000-series card that can play BF3 on high, yet he can't even play BF3 on minimum, because his Intel Core 2 E6600 seems to be below the minimum required for BF3, and that's a 1502-score CPU. The Alienware's i7 is only 2x better and will soon fall below the minimum required for gaming (that i7 sits mid-list in 2012 CPU scores, while any cheap i3 like the 3110M that can go in this Clevo is on the top list).
Then some in-game options have a good impact on the CPU too (though yes, most impact the GPU more).
PS: I would never buy a system where one single component holds back the others. If the RAM were the bottleneck, I'd put in better RAM. If the GPU were holding back the rest, a better GPU... I want a balanced, very POWERFUL LITTLE LAPTOP FOR the best PRICE, and for me only the Clevo W110ER can give that. And yes, the DDR3 version of this GPU holds back the whole laptop's performance a lot, but the thing is... YOU CAN'T BUY ANYTHING MORE POWERFUL THAN IT for its size and even its price.
Keep Cool -
Battlefield 3 is one game... a game that does not even push my ULV to 100% load. Yes, a CPU makes a difference in a few games, but nothing in comparison to what a GPU does. The first-gen i5/i7 is still more than enough for gaming, and they score the same as the Sandy Bridge ULV processors; the only difference is the cores that deal with the load.
Moving from an i7 to an i5 would not make a single difference in 99% of games. If anything, the i5 may take the higher ground, because not all games require a quad. The rule of thumb is always: spend more on the GPU.
Also, the E6600 is a 2006 2-core, 2-thread weak CPU. It is nowhere near the same league... -
100% load means nothing for performance...
Even if a game put you at 20% CPU usage, that doesn't mean a better CPU at 5% usage wouldn't have higher FPS.
The CPU doesn't have to be at 100% for you to be losing performance.
And then there are games and games... In some games, of course, a better CPU won't mean anything (however, I'm totally sure that every single huge title will have better performance and fewer FPS drops on any Clevo W110ER, from i3 to i7, than on that Alienware M11x).
PS: I only agree with you that we save more on the CPU to buy a better GPU, but you're probably wrong about the 99% from what I have been seeing... And believe me, the i7 3610QM will make a huge impact on the W110ER's overall gaming performance. As I told you, I'm talking about the DDR3 vs GDDR5 version, and people with both reported that having a better CPU + the DDR3 version can shrink the FPS gap to minimal differences (again, at lower resolutions only, because the DDR3 can't support 1080p gaming well).
Keep Cool -
...What I am saying is that you seem to think that just because you have an i3 or an i5 it is not going to be as powerful in games as an i7, but that is not the case, because games do not require an extremely fast i7 quad core to do the processing; if the CPU can do the processing, that is it! Additional processing power means nothing if it is not used. But when it comes to a GPU you always want more power, because there are always more and more options to turn on.
Most CPUs from 2009 onwards will be able to run the majority of games just as well as any other CPU, as long as the GPU is powerful enough. It may not appear that way on paper, but it is the case, because I have 1 TB of games on my HDD and every single one runs perfectly with my ULV.
If you buy an i7 for gaming reasons only, that is because you intend to future-proof your machine and make sure that when the extra oomph is required it is there, but you do not necessarily need it. -
You are right on that, and don't you think I know that? But again, lots of games show a 10 FPS difference from i3 to i7, and yes, there are also lots where it has no impact.
But that is not the same as what I'm saying about the DDR3 version of this GT 650M equalizing with its GDDR5 version, and you don't seem to understand that. Somehow a better CPU makes up for the GPU's lower memory bandwidth and therefore makes a huge impact.
This is different from talking about an i3 vs i7 on the same GPU, as that makes almost no difference (though for me, even a 5 FPS difference in lots of games is enough reason to save up longer and buy the i7). But this DDR3 card + better CPU against the GDDR5 + weaker CPU equalizes the performance a lot (that's what lots of people with this Clevo and other GDDR5-GPU laptops reported). On most other DDR3 cards, a better CPU doesn't usually make as much impact as in this case of the GT 650M; I don't know why.
PS: But yes, I haven't tested this myself, as I don't have both; I'm going by people's tests!
Keep Cool -
I haven't mentioned the 650M GT DDR3 vs GDDR5??? So I don't understand what I am wrong about...
Memory bandwidth on a GPU is important, especially at higher resolutions; however, the DDR3 version performs better if you look at the results and HTWingNut's review, because it is clocked a lot higher even with half the bandwidth, and this will be because at such a low resolution on an 11-inch model the fillrates are more important. If you compared them at 1080p, that might not be the case. This is a unique situation, however.
You cannot compare CPU vs CPU the same as GPU vs GPU; it just does not work like that. As above, which you said you agreed with: if a CPU is powerful enough to process the game, that is it. With a single GPU you want as much power as possible and then some, because an intensive game will always want more GPU power; that is where the majority of the processing is done. -
Well, you haven't mentioned it, but I did, and I'm talking about this Clevo.
And if, for you, going from playing on medium with things off to playing on high with everything on is not a big difference, then I don't know what is.
Because this DDR3 version struggles a lot, and with an i3 it will struggle even more, so it's a double shot of struggle. While with an i7, somehow, in most games it closes a lot of the gap against its GDDR5 competition.
About it having higher clocks: I had heard that too before, but after seeing tons of tests from people, I realized the FPS only get close in a few games, and that was the GDDR5 version with a worse CPU against the DDR3 version with a better CPU.
In most of the tests made with the same CPU, most games had 10-20% worse performance on the DDR3 version, and at 1080p it can reach 80% in a few, of course...
Only one or two games performed the same with the same CPU, and only one or two at most had better FPS than the GDDR5 version, but those were crappy games, and every one I play loses around 20% performance on the DDR3 version of this laptop.
GT 650M GDDR5 vs DDR3 versions:
Street Fighter 4: DDR3 136 fps, GDDR5 163 fps; difference +19.9%
Resident Evil 5: DDR3 66.5 fps (weaker CPU-bound), GDDR5 121.4 fps; difference +83%
Lost Planet 2: DDR3 26.6 fps, GDDR5 31.8 fps; difference +19.5%
The DDR3 usually only matches the GDDR5 version in benchmarks; most games really show big differences, and even more so with a worse CPU in most cases.
PS: This is even taken from this forum. As you can see, RE5 shows an 83% difference between GDDR5 and DDR3, but it's also so heavy because of the WEAKER CPU, and there are lots more tests I have seen saying the same as I was: the DDR3 version of this card, which already struggles a lot, will struggle even more in lots of games the weaker the CPU is.
Keep Cool
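The percentage gaps in that list can be recomputed straight from the raw fps figures posted in the thread. A quick sanity-check sketch (game names and fps numbers are the ones quoted above):

```python
# (DDR3 fps, GDDR5 fps) for each game, as posted in the thread
results = {
    "Street Fighter 4": (136.0, 163.0),
    "Resident Evil 5":  (66.5, 121.4),  # DDR3 run was also CPU-bound
    "Lost Planet 2":    (26.6, 31.8),
}

for game, (ddr3, gddr5) in results.items():
    gain = (gddr5 - ddr3) / ddr3 * 100  # GDDR5 advantage over DDR3, in %
    print(f"{game}: GDDR5 +{gain:.1f}% over DDR3")

# Street Fighter 4: GDDR5 +19.9% over DDR3
# Resident Evil 5: GDDR5 +82.6% over DDR3
# Lost Planet 2: GDDR5 +19.5% over DDR3
```

Note the huge Resident Evil 5 gap mixes two variables (memory type and CPU), so only the ~19-20% figures cleanly isolate DDR3 vs GDDR5.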
Should I opt for the Clevo W110ER?
Discussion in 'Sager and Clevo' started by Razyre, Oct 4, 2012.