The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    When are the new Alienwares with Maxwell coming?

    Discussion in 'Alienware' started by Cloudfire, Sep 29, 2014.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,832
    Likes Received:
    59,570
    Trophy Points:
    931
Dell may want to throttle the GPU because they might be using a 180 watt power supply.
     
  2. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    180w is fine unless you overclock.
     
  3. eats7

    eats7 Notebook Evangelist

    Reputations:
    168
    Messages:
    305
    Likes Received:
    37
    Trophy Points:
    41
    So if we want to game with the new Alienwares, we have to hook them up to a desktop amplifier? What in the hell is the point of that? Why wouldn't I just go buy a desktop rig? So now if I want to game at, say, the airport, or anywhere for that matter, such as in my basement on my TV, I not only have to bring my laptop, charger, and HDMI cable, but now a DESKTOP AMPLIFIER with its proprietary cord and its power cord? What ridiculous joke of mobile gaming design is this? Who the hell decided this would be a good idea for these larger laptops?

    Well, Alienware. You're not getting any more of my money. Between the BS I went through with my last purchase, and now this, it's safe to say I'm no longer a customer.
     
    Last edited: Jan 6, 2015
  4. Midou

    Midou Notebook Consultant

    Reputations:
    23
    Messages:
    152
    Likes Received:
    14
    Trophy Points:
    31
    When the GA was introduced with the AW 13, I thought it was a cute idea: buying a small notebook with portability as the main goal, but having the option to hook it up and play games when not traveling. Now, buying a big laptop only to hook it up to a GA just doesn't make any sense.
     
    eats7 likes this.
  5. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    The idea of GA is excellent. More upgrade paths are great. Where Alienware is going wrong, intentional or not, is making powerful systems and GA support mutually exclusive and the only upgrade path. It's so goofy.

    Hopefully they get punished where it hurts: in the wallet. It looks like that's going to be necessary for them to see that they can't walk all over their customers, and that upping their game is what people want.

    I'll add to that. It's been almost 6 months and no sign of a 9xx series GPU. It's pretty obvious that the AW brand had a change in direction at the last minute. Look at the lack of decisiveness in them communicating what's around the corner. Sketchy at best.

    It's not that hard to slap some 9-series GPUs in their recently refreshed range.
     
    Last edited: Jan 6, 2015
    Ashtrix, eats7 and papusan like this.
  6. stefan1126-

    stefan1126- Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    6
    Trophy Points:
    16
    kamlesh likes this.
  7. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,158
    Likes Received:
    1,038
    Trophy Points:
    231
    Pretty much. I missed that detail.
    So three GPU options are now known: the GTX 965M, the R9 M295X, and the GTX 980M. The i7-4710HQ is indeed the i7 option.
    Of course, this could be shared with the 17 R2 video that will inevitably come and ruin all this, but I doubt that would be true.
     
  8. stefan1126-

    stefan1126- Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    6
    Trophy Points:
    16
    I really hope you're wrong about the AW 17 R2 video. When I noticed the GPU in this video, I started jumping like a little kid from excitement! I really hope the AW 15 comes with the 980.... :)
     
  9. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    It should come with a 980M. Damn you, AW. I would have been all over this new gen had things been different. Anyway, hurry up and come to market; the least you can do is give us a dump of your vBIOS :laugh:
     
    papusan likes this.
  10. eats7

    eats7 Notebook Evangelist

    Reputations:
    168
    Messages:
    305
    Likes Received:
    37
    Trophy Points:
    41
    lol, a soldered-on 980M. Never thought I'd see that.
     
  11. nightingale

    nightingale Notebook Evangelist

    Reputations:
    182
    Messages:
    418
    Likes Received:
    278
    Trophy Points:
    76
    If you mean a soldered 980M in general, the ASUS G751JY would like a word with you. But Alienware at least looks like they're on some level "trying" to give us a good machine, even if it's soldered. Should Alienware get the GA to actually work properly, a 980M for the road and a 980 for home would be a pretty sweet setup.
     
  12. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Yeah.. it helps Dell sell you another machine when you're out of warranty, your GPU craps out, and you can't replace the part.

    koolaidman.jpg
     
    Cloudfire, bumbo2, Ashtrix and 3 others like this.
  13. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,832
    Likes Received:
    59,570
    Trophy Points:
    931
    I'm thinking that a soldered GPU might have a lower TDP? With a normal Hotwell (MQ/HQ) and a socketed GPU, Dell would have needed to deliver the laptop with a 240W power supply instead of a 180W one.
     
  14. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I really have no idea. They've probably locked down the BIOS so much that OCing without the GA won't be a thing and 180w will be ample.

    I hate being so negative about all this, but it really does stink. Let me balance it out... umm, "it looks good" and it's light enough with carbon fibre that my delicate little arms can carry it.

     
  15. eats7

    eats7 Notebook Evangelist

    Reputations:
    168
    Messages:
    305
    Likes Received:
    37
    Trophy Points:
    41
    Don't forget your honkin' graphics amplifier! Might need another arm lol
     
    bumbo2 and Robbo99999 like this.
  16. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    I have a pretty strong suspicion that they may start offering some provision for upgrading via motherboard replacement. I also have a feeling the move to BGA is going to start standardizing the form factor / chassis design, at least within a vendor for a period. Being able to buy and install your own upgraded motherboard (or have one installed as part of a purchase) with a better chipset+CPU+GPU in the same system is likely.

    On the question of OC'ing, I am very curious how they will handle that on the AW 15/17 (and the 18, if there is one). I've been used to them locking the BIOS on and off for different chips for so long that it doesn't bother me as much. My M17x-R2's stock BIOS is locked, while my M14x's GPU is not, and one of my M11x's had OC options while the others didn't.

    People are panning the G/A, which I understand is being naturally associated with the BGA, but as an idea it has more than a little merit. If we were continuing to see non-BGA systems, with the G/A option on top, I think people would view it much more positively. Just imagine if we had the inevitable slide to BGA and no G/A option....
     
  17. WongJJ

    WongJJ Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    9
    Trophy Points:
    16
    The GA is way too overpriced IMO: $300/£200 for an empty plastic box??? At the very least they should have added four small wheels so I could easily pull it around like a mini trolley.
     
  18. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I'm not 100% sure, but I think soldered 980M = 91W while MXM 980M = 100W.
    The TDP is all relative, though, because the 780M/580M/680M/980M share the same TDP but vary a lot in actual heat output. I'd say a 580M runs much hotter than a 980M, for example.
     
    Mr. Fox likes this.
  20. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    @Cloud: The 980M will reach well over 91W when overclocked, and people are going to be trying to overclock these Alienwares.

    Overclocking is now going to be inhibited by the low-wattage PSUs. There is no longer an SLI system, meaning there will only be small PSUs. Alienware has changed so much.
     
    Mr. Fox likes this.
  21. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    FYI - a single 980M with a big overclock installed in my M18xR2 reaches a system draw of 350W on my CyberPower PSU meter. I am having trouble with throttling, but I have seen 540W draw with 980M SLI using a stock vBIOS and a mickey mouse baby overclock of +135 core offset. So, good luck with a 240W AC adapter and a single 980M.

    980M runs MUCH cooler. At 1300/1400 max of about 71°C in 3DMark11.
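    Those wall-draw numbers lend themselves to a quick headroom check. A minimal sketch, using the draw figures quoted above and assuming the adapter's rating is its maximum sustained DC output with roughly 90% AC-to-DC conversion efficiency (both illustrative assumptions, not measured values):

    ```python
    # Quick sanity check: measured wall-socket draw vs. AC adapter rating.
    # The 350W / 540W figures come from the post above; the adapter
    # wattages and the ~90% efficiency are illustrative assumptions.

    def overloads(adapter_rating_w, wall_draw_w, efficiency=0.90):
        """True if the wall draw implies more DC power than the adapter
        is rated to sustain."""
        dc_demand = wall_draw_w * efficiency  # DC power implied by wall draw
        return dc_demand > adapter_rating_w

    print(overloads(240, 350))  # single overclocked 980M vs a 240W adapter -> True
    print(overloads(330, 540))  # 980M SLI vs a hypothetical 330W adapter -> True
    ```

    Either way the math backs up the skepticism: a single heavily overclocked 980M already implies more DC demand than a 240W brick can sustain.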
     
    Cloudfire, bumbo2, Zero989 and 2 others like this.
  22. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Exactly. I'm not sure why there is a difference in heat between two different GPUs tested on the same machine when they share the same thermal specifications.
    Unless TDP is a maximum, and a 780M runs much closer to that max while a 980M on average runs considerably lower.

    All I know is that it made no sense saying a 980M runs as hot as a 780M (both 100W) when one is based on a 145W GPU (with 128 fewer cores) while the other is based on a 195W GPU. So there is something very fishy about the TDP info going around for mobile GPUs.
     
  23. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    NVIDIA has not (unless something concrete surfaced recently) provided any official documentation on 980M TDP.

    I don't have thermal issues with either GPU. 980M runs cooler than 780M, but I have never had overheating problems with 780M, even with max overclock. What is interesting is that my max stable 780M overclock is about equal to 980M stock, LOL. If svl7, Prema and Johnksss can help me crack the 980M power throttling issues it will be amazing. If not, I'll have to buy a Clevo so I can use them.
     
    TBoneSan likes this.
  24. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
  25. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Technically, yes, but not really. I'm speaking in regards to high resolution.

    It's like saying a game is free, but making it nearly impossible to play without upgrading to a premium account (fewer features, etc.).
     
    Mr. Fox, eats7, maxheap and 1 other person like this.
  26. cj0r

    cj0r Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    5
    Trophy Points:
    6
    Guys, I'm seeing a lot of trash talk about the GA unit. Barring the launch issues with drivers etc. and a slightly high price ($150-200 would have been better, but you're only buying it once... so what the heck), I think it's a fantastic piece of hardware and something that has been missing for years for people like me who were bound to desktop computers. Up until now, external desktop GPU enclosures have been limited by PCIe lane constraints and clunky software issues. I now have the ability (almost out of the box) to get awesome gaming performance at home along with being able to take a fairly powerful computer on the road with me. The need to own both a powerful desktop and a laptop has been eliminated.

    I purchased the Alienware 13 and, excluding the somewhat lackluster processor, which I have been closely monitoring but have not maxed out yet, I could not be happier with the results so far. At home, my gaming performance is great. I have a GTX 970 in the GA and am maxing out all my games at 2560x1440 resolution (BF4, COD, Diablo, etc.). On the road I get excellent productivity and reasonable gaming performance. Is it the highest level of performance? No. Is it more than acceptable? Yes. BF4 specifically plays much better than I anticipated, since I always figured it to be a processor-heavy game. The point is, I'm getting the best of both worlds, and this is just the first iteration from Alienware! It can only improve, as long as the hardware doesn't get dumped.

    Over the past 4-5 years, my desktop has had some version of an Intel i5 processor. Each time I upgraded, I had a suspicion that I didn't really need to and wasn't utilizing the CPU nearly as much as I thought. Eventually I caved and did some research, and found this to be 100% true. As most people in the gaming world know, the GPU tends to be the primary source of bottlenecks unless you're using some extremely high-end and expensive hardware.

    Excluding certain models that allow upgradeable components within a laptop (and even then there's a harsh ceiling on what you can upgrade to), you haven't had this ability before. Your laptop would age very quickly and you would feel the need to buy a new one. Like cars, laptops (and definitely gaming laptops) do not hold their value well at all, so reselling to offset the cost of a new one wasn't very feasible. You would endlessly be dumping money into these systems looking for that faster fix that would bring you to "desktop replacement" status. You now have that faster fix in the form of the GA unit. I'm getting near full performance out of my GPU compared to a similar unit in a friend's i5-4690-based desktop (GTX 970 G1 vs. regular Gigabyte).

    Furthermore, as new GPUs come out, I will have the ability to upgrade to them as well. Beyond that, as new Alienware laptops come out, I'll still be able to upgrade that hardware even further! You're used to buying new laptops anyway, right? I have nowhere near the same level of constraint from GPU performance as I did in the past, and I don't feel like I'll have the itch to buy a new laptop anytime soon. For the first time, I feel like I have found a true desktop replacement. I can game at home at full performance, and game on the road at a reasonable level. I'm scratching my head as to why there are a lot of haters who don't see it that way. I for one do not want to buy a new laptop every year; this gives me that safety blanket. I would have loved a slightly faster processor for peace of mind, or at least a quad core, but from the tests I've been running the past 2 weeks... I have no reason to really be searching for that. The fact that I should only need to buy this GA once is the huge selling point here; however, if Alienware changes the proprietary connector or something to make it incompatible with future laptops... that'll be really screwed up on their part.

    Call me crazy if you don't see it this way, but I find my craziness valid.
     
    bumbo2, TBoneSan, 1nstance and 2 others like this.
  27. 1nstance

    1nstance Notebook Evangelist

    Reputations:
    517
    Messages:
    633
    Likes Received:
    221
    Trophy Points:
    56
    You actually make some very valid points.
     
  28. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    There is a type of person this setup suits. You're one of them. Actually, I am too, just with one more caveat. A lot of the outcry isn't about the GA. It's about the higher-end AWs soldering components to the motherboard, removing any ability to upgrade the machine's CPU/GPU itself. Some users like/need to travel with their machine, and dragging a big box around so the machine can do its job doesn't help in that regard.
    Fair enough for AW to offer a small and slim range with soldered components that can then become a beast when you get back home - certainly, why not. But doing this to high-end flagship products that should be able to work autonomously is a horrendous idea.
     
  29. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Glad you like it, but I think you aren't running very taxing games if you can't observe bottlenecking. I assure you the potential is there with this pairing of CPU+GPU. If you can decrease graphics quality settings without a commensurate change in fps, you have bottlenecking. I have observed it already, and that with a circa-2009 game at max settings. Granted, it's not efficient code, and I am right on the edge of bottlenecking most of the time, but there is no question that the processor will not maximize GPU usage. You see a gain with an external GPU only because it can do certain operations more quickly that don't involve the processor, or only minimally so. More CPU-bound games will not see as great an increase with the G/A and external GPU.

    I see no problem personally with it being something that can augment a high-end, SLI setup, but the option to use a G/A should not be the rationale for disposing of SLI setups altogether which unfortunately looks like it is the way Dellienware is headed.
     
    TBoneSan likes this.
  30. ejohnson

    ejohnson Is that lemon zest?

    Reputations:
    827
    Messages:
    2,278
    Likes Received:
    104
    Trophy Points:
    81
    I'm all for the GA; I actually wish they made a retrofit kit for my current 17 to work with it.

    If they could get something to go through the unused DMC port, that would be amazing!
     
  31. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    MSI is claiming the graphics amplifier is limited to PCIe x4 speeds, btw. I hope this will be either validated or disproven soon; if it's true, AW is robbing people ATM.
     
  32. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    The GA itself is sound and runs at the full x16. The CPUs on offer now are limiting it to x4.
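    For context on what x16 vs. x4 means in raw numbers, here is a back-of-the-envelope PCIe 3.0 bandwidth comparison. The per-lane figures are the standard PCIe 3.0 spec values (8 GT/s with 128b/130b encoding); applying them to the GA link is an assumption for illustration.

    ```python
    # Usable one-direction bandwidth of a PCIe 3.0 link, per lane count.
    # Standard PCIe 3.0 figures: 8 GT/s raw rate, 128b/130b line encoding.

    GT_PER_S = 8e9          # PCIe 3.0 raw transfer rate per lane (transfers/s)
    ENCODING = 128 / 130    # 128b/130b encoding overhead

    def gbytes_per_s(lanes):
        """Usable one-direction bandwidth in GB/s for a PCIe 3.0 link."""
        return lanes * GT_PER_S * ENCODING / 8 / 1e9  # /8 bits -> bytes

    print(round(gbytes_per_s(4), 2))   # x4 link  -> 3.94 GB/s
    print(round(gbytes_per_s(16), 2))  # x16 link -> 15.75 GB/s
    ```

    So a CPU-limited x4 link carries a quarter of the bandwidth of the full x16 the GA hardware supports, which is why the lane count matters for eGPU performance.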
     
    Mr. Fox likes this.
  33. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Trash talk about the GA is mainly based on the fact that they chose a ridiculously slow dual-core low-voltage CPU for the AW13, which is in no way enough to fill the boots of a desktop card! I think the whole GA thing is an OK idea, but they should have put a 47W i7 CPU in that A13. Their current CPU choice is a major oversight, a poor teaming of components, which is only ever gonna become more pronounced if A13 GA users decide to upgrade the GPU in the GA to an even faster model a few years down the line. It effectively makes the A13 + GA combination unupgradeable due to such a slow CPU, which was supposed to be the whole point of the GA to begin with!
     
  34. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Mr. Fox, maxheap, TBoneSan and 3 others like this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Some people don't care about CPU performance. Case in point is AMD fanboys. AMD have a following of loyal customers that are content with lousy CPU performance. If you are an overclocking/benching enthusiast, this matters every bit as much as a beast GPU, maybe even more in some cases. For those that don't care if they have an awesome CPU, BGA doesn't matter unless a repair is needed and then you get a taste of the disposable/non-serviceable hardware poison. For those that don't care about having a great CPU, AMD offers socketed solutions and insane clock speeds that are amusing to look at in spite of their lackluster performance.

    As for why they would offer an R9 M295X, maybe they are trying to offer (a) a low-budget alternative for those who plan to be tethered to the eGPU and don't need much discrete graphics performance, or (b) something to keep the red fanboys happy. Basically, something for everyone except the real high-performance enthusiasts. We are now the abandoned stepchildren who don't matter to them.
     
  36. CSHawkeye81

    CSHawkeye81 Notebook Deity

    Reputations:
    194
    Messages:
    1,596
    Likes Received:
    175
    Trophy Points:
    81
    I see an Alienware 15 in my future with a GTX980M.
     
    Cloudfire and bumbo2 like this.
  37. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I love the new keyboard on the AW 17 R2. They added more macro keys. ;)
     
  38. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    What's the TDP of the R9 M295X, Cloudfire? Performance-wise, do we know how it compares to the 980M/970M?
     
  39. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Yup, the 980M finally seems to be confirmed. Question is, should I return my laptop for an AW 15 with a 980M? :D
     
  40. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    The M295x is a 125W card. It's going to be a problem if you get this, especially with a 150W/180W PSU, lol.

    The 980M does make the AW 15 a very attractive option. ;)
     
    Robbo99999 likes this.
  41. QUICKSORT

    QUICKSORT Notebook Evangelist

    Reputations:
    212
    Messages:
    686
    Likes Received:
    536
    Trophy Points:
    106
    Robbo99999 likes this.
  42. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Me too, my GA is on the FedEx truck for delivery!
     
  43. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Cheers J.Dre, that does seem like an odd choice then for Alienware's now thinner (& maybe reduced cooling capability) laptop models, like Cloudfire was intimating.
     
  44. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I love how the 980M is only a $150 upgrade over the M295X, LOL.

    Worth. Every. Penny.
     
    maxheap and TBoneSan like this.
  45. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    It's 7 pounds though :( Still, the superior cooling and construction quality of AW makes the 15 an attractive option over the P35X and 650SG imho
     
  46. ejohnson

    ejohnson Is that lemon zest?

    Reputations:
    827
    Messages:
    2,278
    Likes Received:
    104
    Trophy Points:
    81
    Whoa! The 17 has 4 M.2 slots.... turbo speed!

    These are looking better and better to me!
     
  47. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    How does the 980M handle 4k? I won't purchase that screen if it's going to be like 20 FPS on ultra.
     
  48. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    It's got to be too much for a single 980M to handle at any decent framerate in a modern, graphically demanding game. If you run at reduced resolution that could be OK, but a lot of people say you get reduced image quality compared to running at native res - so 1080p at native res is better than 1080p on a 4K monitor. I wouldn't choose a screen bigger than 1080p for a single 980M.
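    The pixel-count arithmetic behind that is simple: 4K pushes exactly four times the pixels of 1080p, so a fill-rate-bound GPU has roughly a quarter of the framerate budget per frame. Treating framerate as inversely proportional to pixel count is a rough rule of thumb here, not a benchmark.

    ```python
    # Why 4K is so much harder to drive than 1080p: pure pixel count.

    def pixels(width, height):
        """Total pixels rendered per frame."""
        return width * height

    ratio = pixels(3840, 2160) / pixels(1920, 1080)
    print(ratio)  # 4.0 - a card doing 60 fps at 1080p has ~15 fps of budget at 4K
    ```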
     
  49. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I suppose they expect those with the 4k screen to have the graphics amplifier, lol.

    I'll probably just grab 1080p. Good ol' 1080p has yet to fail me.
     
  50. kamlesh

    kamlesh Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    3
    Trophy Points:
    6
    What are the chances of a 120Hz display being offered after a few weeks or months? Because, if you remember, when the 800M-series GPUs launched, the 120Hz display wasn't compatible with the Alienware 17 at first, but later on it was. So I'm just wondering if I should wait and see, or is it 100% confirmed that it's not going to happen?
     