man I must be going blind - didn't see the dropdown for each component -- thought they still employed that horrible slideshow-type configuration process
-
Mine is still in pre-production. ETA: delivers by 12/16/2014. :hi2: -
-
Yeah, their site definitely sucks.
-
You spent $3500 on two 13" notebooks? Creating a weak botnet with laptops?
-
Will I get the performance of a 980 if I use it with the AMPLIFIER or not?
What performance will I get? -
Desktop-level performance, probably. For games, the graphics power is what matters!
-
Yes, but I heard the CPU might bottleneck the GPU. Is that real?
-
So, will the GPU perform at 100 percent or not?
-
There is no way in hell a GTX 980 desktop card will perform at 100% with a dual-core mobile i5 processor, LOL. Keep dreaming, boys.
The 4710HQ bottlenecks the 980M, and that's barely reaching GTX 780 performance. Déjà vu... haven't I said this before? -
FYI though, at PAX the 13 was playing Arkham Origins at 3K with the 980M, perfectly smooth.
-
I'm not suggesting it won't benefit from it. It just won't perform at its full capacity due to the bottleneck.
-
Meaker@Sager Company Representative
It would also require an entirely new motherboard.
-
I understand bottlenecking, I was just pointing out that it's not bad enough to prevent one from running AAA titles @ 3k.
-
Guys, the Alienware 13 was not made or built for overclocking, to my understanding!!! So bottlenecking is not a big deal to me! I just want the portability.
-
I have been reading that the initial Broadwell impressions on the current Lenovo Yoga are far from what everyone hoped, but it's still unclear how much is due to the software setup and throttling Lenovo put into place. If there are similar issues with the Broadwell i7 used in the AW 13, I might just opt for the Black Friday sale price on even a current i5 AW 13, since it's not my workhorse laptop but my quiet around-the-house one, and avoid any Broadwell throttling if that turns out to be an issue. The problem is Dellienware is likely to introduce Broadwell alongside the 900-series GPUs... though I have a strong suspicion we may still just get an 860M -
@Docsteel, you got a point here! Totally agree on this. The real test is going to be when someone starts posting numbers for a fairly current game that truly recommends and uses a quad-core. But in that case I'll let my M18 take care of that.
-
It's impossible for any dual-core mobile CPU to adequately support a desktop GTX 980. It will be better than the 860M, but a bottleneck will exist. -
Guys, you need to understand what the CPU actually bottlenecks.
Generally, the graphical effects that need CPU power are different from the graphical aspects that need GPU power. So what you do is turn down the effects that require CPU performance and turn up the effects that require GPU performance.
What uses the CPU? Effects such as shadows, dynamic lighting, and physics (unless it's Nvidia PhysX, which runs on the GPU instead of the CPU).
The GPU is responsible for resolution, textures, anti-aliasing, filtering, ambient occlusion, tessellation, and TressFX (as in Tomb Raider; that's also why they used Tomb Raider as the example benchmark for the Alienware 13, because it's VERY GPU-heavy).
HOWEVER! Some games still demand CPU performance even at the lowest settings because of their game mechanics. A good example is managing all the players online in an MMO.
So some games will run VERY well if you optimize the video settings correctly, but then again some games will run VERY badly no matter what GPU you put in the Graphics Amplifier.
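Something like this little sketch captures the idea (the setting names and the CPU/GPU split here are my rough assumptions; every engine divides the work differently, so treat it as a heuristic):

```python
# Rough, engine-dependent split of common settings, per the post above.
CPU_HEAVY = {"shadows", "dynamic lighting", "physics"}   # physics moves to the
                                                         # GPU if it's PhysX
GPU_HEAVY = {"resolution", "textures", "anti-aliasing", "filtering",
             "ambient occlusion", "tessellation", "tressfx"}

def tuning_hint(setting):
    """Suggest which way to push a setting on a CPU-limited machine."""
    if setting in CPU_HEAVY:
        return f"{setting}: turn DOWN (eats CPU time)"
    if setting in GPU_HEAVY:
        return f"{setting}: turn UP (the GPU has the headroom)"
    return f"{setting}: depends on the game"

for s in ("shadows", "textures", "tessellation"):
    print(tuning_hint(s))
```
-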
Yep - that's why it's important to look at GPU-bound game results alongside CPU-bound results when evaluating a system. Each game is a point along a spectrum between CPU-bounded-ness and GPU-bounded-ness... I suspect that even for current and upcoming games that would take full advantage of a quad-core, the ULV dual will in the majority of cases be sufficient, but it will half the time bottleneck the GPU, since all the lighting, physics, etc. have to take place coordinated with GPU rendering. If the CPU is slow... then the GPU will often be sitting idle a percentage of the time.
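A toy model of that idle time (illustrative numbers only, not measurements from any real game): with CPU and GPU work overlapped frame to frame, the slower stage sets the frame time, and the GPU sleeps through the rest of it.

```python
# Toy overlapped-pipeline model: each frame, the CPU prepares work while the
# GPU renders, so whichever stage is slower dictates the frame time.
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)
    fps = 1000.0 / frame_ms
    gpu_busy = gpu_ms / frame_ms     # fraction of each frame the GPU works
    return round(fps), round(100 * gpu_busy)

print(frame_stats(6.0, 14.0))   # GPU-bound: (71, 100) -> GPU fully busy
print(frame_stats(20.0, 8.0))   # CPU-bound: (50, 40)  -> GPU idle 60% of it
```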
-
I think it is more about looking at future games than at present games. In current games, the bottleneck is not important (obviously). However, in the future, it may be problematic. -
There's an entire thread dedicated to VRAM discussion in the Gaming and Graphics board.
In that regard, the Alienware 13 is obsolete before it's even sold, especially if you account for the Amplifier. -
Look at these test results for desktop chips, and see how little difference going from a quad core i5 without HT to a hex core i7 with HT makes in most games. -
Thanks for posting the link, n=1. Maybe I am wrong, but aren't all those games in the tests fairly well known to be GPU-bound games? If so, your point about there not being _as big_ a spread is true based on those results, but I wish we could get a similar list for quad vs dual core for current games that state a benefit from going quad.... anyone know of a good one?
I came across a somewhat old discussion on dual vs quad, with hyperthreading... it confirms something I have been suspecting, that dual cores supporting four threads might give a good bit of the benefit that a quad (granted, with 8 threads) would provide for a lot of games that are probably coded to look for four threads (on a dual core or 4 out of 8 on a quad). Some games coming up will be optimal for say eight threads (quad-core) but very few to utilize hexacores with 12 threads or heaven forbid, octacores with 16 threads. While not optimal, the dual-core with four threads probably will be sufficient for a couple more years... about the life of the AW 13 for this revision.
The Tech Buyer's Guru - Dual-Core, Quad-Core, and Hyperthreading Benchmark Analysis
I'm still in favor of waiting for an i7 though, dual-core or not, simply because the extra cache has been shown to have a definite impact on games, not to mention the speed increase. Not a huge increase, mind you, but enough of one to be worth the difference imho.
Here's the link to the article above: http://techbuyersguru.com/haswellgaming.php
Note this statement near the end of page 3: "With a balanced gaming rig, the GPU usage should never fall below CPU usage, and ideally should always be above 90%."
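For anyone who wants to check that rule of thumb on their own rig, here's a quick sketch (assumes an NVIDIA card with nvidia-smi on the PATH; run it in the background while the game is going):

```python
import subprocess, time

# Poll GPU utilization once a second for ~30s; sustained readings well below
# ~90% while a demanding game is running point to a CPU bottleneck.
for _ in range(30):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    print(f"GPU utilization: {out.decode().strip()}%")
    time.sleep(1)
```
-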
Lost Planet 2 is known to be very CPU intensive and the results do reflect that. Crysis 3 should've been another poster child for more cores + more speed = better performance, but it doesn't really show up in that chart for some reason.
All the desktop i3 CPUs are dual-core with HT, and the Pentiums are dual-core without HT. Luckily that review has representative chips from each segment, so you can directly compare how a dual-core Pentium without HT stacks up against a dual-core i3 with HT, a quad-core without HT (i5), a quad-core with HT (i7), and all the way up to a hex-core i7 with HT.
I was in the "4710HQ is not enough for 980M" camp previously, and spread a lot of misinformation that way. So I'm simply doing what I can to help clean up the mess that I had a part in, which is why you may see me getting worked up about this particular issue.
And just so nobody misunderstands me, I definitely agree even a dual core desktop chip with HT simply won't be enough going forward, let alone a dual core ULV. -
How do I know if the games that I play are CPU-bound or GPU-bound?
I usually find myself wasting time on simulation / strategy games like Civilization / UFO / Total War / Sim City / War Games / NBA 2k15 / World of Tanks
I am not too much of an FPS guy. -
So it's obvious that the ULV will bottleneck any higher-end GPUs, e.g. the GTX 980 and so on, but what about the GTX 860M that comes with the laptop? Any possible bottlenecking there?
Also, just to confirm, if I understand this right, the CPU bottleneck can be somewhat averted if you set the graphics settings higher and put more of the processing strain on the GPU instead?
I'm currently looking for a laptop for on the go before this January, and this caught my attention. The Graphics Amplifier is something I would get way later, so I'm curious whether the GTX 860M will also be bottlenecked by the CPU, especially in games like Battlefield 4 and so on. It's either this or the MSI GS60, which is more expensive, so I'm banking on this. -
-
You are only lying to yourself if you believe the i5-4210U processor will not bottleneck the GTX 980. I'd be willing to put money on the fact that, if a good reviewer (like HTWingNut) were to get his/her hands on an AW 13 and Amplifier with the GTX 980, it would be quite obvious that there is a bottleneck. How dramatic of one? I don't know. But it will exist.
- Fact: The 4710HQ is more powerful than any dual-core ULV processor offered in the AW 13.
- Fact: The 4710HQ bottlenecks the GTX 980M (equal to a GTX 770~780) in some games.
- Fact: The GTX 980 is more powerful than a GTX 980M or GTX 770/780 by a good margin.
-
I don't believe I ever said the 4210U wouldn't bottleneck a 980. In fact I said the exact opposite, that it would bottleneck the crap out of a 980.
And for the love of god, please stop saying the 4710HQ "bottlenecks" the 980M. In my book, something that runs over 100 FPS is not a bottleneck. Hell, for any sufficiently non-demanding game, the CPU will always "bottleneck" the GPU. Let's say you go from 300 to 400 FPS when overclocking from 3.5 to 4.5 GHz. Would you call that a "bottleneck"? Do you see what I'm getting at?
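Putting numbers on that hypothetical (my arithmetic, not a benchmark):

```python
fps_ratio = 400 / 300      # ~1.33x the frames...
clock_ratio = 4.5 / 3.5    # ...from ~1.29x the clock: near-linear scaling,
                           # i.e. technically "CPU-bound", yet utterly
                           # irrelevant when you're already past 300 FPS
print(round(fps_ratio, 2), round(clock_ratio, 2))
```
-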
I'd add too that MMOs also fall into the CPU-bound camp a good deal... this is why I am torn on the AW 13... or just going for an AW 14. If Dellienware would just state that the AW 14 is "dead" I'd consider moving on one, but without a word I am left hanging to see what they do in January. For all we know it could be back with a Broadwell chip and a 900-series GPU... but I doubt it -
I used it as an example to prove that if a better CPU has trouble with a weaker GPU, a weaker CPU will definitely have trouble with a better GPU. I'm not debating the 4710HQ here. I could have chosen any other CPU for my example; I just happened to choose the 4710HQ because it was recently reviewed. -
AW14 isn't even on their website anymore?
-
On the question of bottlenecking - both points in the thread are true - you could get 100 fps or so and be bottlenecking, but for the majority of people it would be a moot point. I fall into the camp that is more than happy with 60 fps, but I could see people getting used to higher frame rates, and then the question of bottlenecking becomes an issue.
SO aggravated with Dellienware for not putting out official word on whether the AW 14 is ended or not... I am half-tempted to buy one, as I need a more general machine on the road, but the issues with the AW 17/18s these days give me pause about even considering one of them, not to mention the greater size (this is an issue for some of us; the thought of trying to use an AW 17 in coach on a plane just doesn't work. I guess AW 17/18 owners must all fly first class).
Dell FUBAR'ed my second order as well, and again canceled it rather than trying to fix it. The '3rd time's the charm' order now has a ship date of Jan 14th, and they "can't" change that despite my previous 2 orders showing a ship date of Dec 16th (both canceled by Dell due to mistakes by Dell). I told them never mind, that delay can eat it. Gonna wait for Broadwell and see what Aorus does.
I wish Dell had a way to order that didn't involve the internet or the phone lol. Both ways resulted in my order being completely wrong. -
I was interested in the Aorus too after Dell dropped the ball.. but so many reports of heat, noise and build issues... ugh....
I've been a Dellienware loyalist since 2009, but these moves now are really making me start to think Clevo/Sager if Dellienware can't pull out a good rabbit in January.. -
I was going to buy the Alienware 13, but then, checking on Notebookcheck, realized that the i5-4210U even bottlenecks the 840M. The AW 13 will probably give less-than-desirable results.
-
Even if I stopped using that example, it wouldn't matter to this discussion. The 4210U will always bottleneck the 980, lol. So you're arguing a moot point. -
Aorus - Linus says it runs cool? But I would wait for the 9xx for sure. And, imo, there's no getting around a certain heat load with a lot of power in a small space. As long as it's cool when doing desktop work, that's what matters to me.
Clevo/sager - not slim enough for me, and holy jebus they're ugly.
Dell - dual core ULV whaaaat?! /wrists
I agree Doc, gonna have to keep waiting. -
All I've ever been saying repeatedly is to stop using the 4710HQ as a bottlenecking example, because it's a terrible one. A much better example would be some of the older MSI models where they paired a 7970M with an AMD A10. Now that was a real bottleneck in every sense of the word.
-
*sigh* Multiple people have chimed in later in that thread to point out it's not a real bottleneck, and I even did some testing at the very end to see how CPU speed would affect FPS in a simulated 980M SLI machine. I'm just gonna put this quote here and leave it at that.