The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    New Alienwares 2016

    Discussion in 'Alienware' started by vkt62, Sep 2, 2016.

  1. micman

    micman Notebook Evangelist

    Reputations:
    242
    Messages:
    662
    Likes Received:
    119
    Trophy Points:
    56
    Thanks mates!
     
  2. DeeX

    DeeX THz

    Reputations:
    254
    Messages:
    1,710
    Likes Received:
    907
    Trophy Points:
    131
    The WVA (Wide Viewing Angle) TN screens are actually not that bad. The WVA part solves the viewing-angle problem. ;)
    Plus it should be noted that OLED screens have a few issues.
    First, displaying static images results in temporary image retention.
    Second, since OLED panels are organic, they literally degrade as time passes.
    Lastly, there's the infamous burn-in.
     
  3. Diversion

    Diversion Notebook Deity

    Reputations:
    171
    Messages:
    1,813
    Likes Received:
    1,343
    Trophy Points:
    181
    Well.. supposedly LG has figured out how to eliminate the image retention, I'm assuming by using built-in hardware that varies the pixel behavior through dithering.. which ultimately would degrade overall picture quality, especially when you're up close on a laptop and using 1080p. At 4K you wouldn't notice it as much, but it's a good technology to use on big-screen TVs, where you sit so far back you wouldn't see any dithering happening.

    So far I haven't heard any reports on the Alienware 13 with OLED yet.. but I also haven't looked hard enough to know whether anybody has complained about image retention.
     
  4. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    Quite an assumption to make. I haven't seen Optimus advertised in any of their material, and I'm fairly sure they would be making a big deal of it if it were available.

    Some TN panels can actually be really good (other than viewing angles). For instance, the 1080p 120Hz panel in the GT73VR appears to be better than most rival 17" laptops with IPS panels: http://www.notebookcheck.net/MSI-GT73VR-6RE-Titan-Notebook-Review.172916.0.html

    Contrast ratio is 1365 for the TN vs 1100 or less for rivals, colour gamut is 100% sRGB / 76% AdobeRGB vs 85% / 56% for the IPS 1080p panels, and black levels are 0.23 vs 0.3+.

    Obviously the Alienware ones won't be the same panel (theirs are either 15.6" 1080p or 17.3" 1440p), but it's still possible they'll have good colour accuracy and contrast ratios / black levels.
     
    Last edited: Sep 25, 2016
  5. ashknani

    ashknani Notebook Consultant

    Reputations:
    16
    Messages:
    210
    Likes Received:
    17
    Trophy Points:
    31
    I think it's been confirmed that the Alienware 17 with GTX 1070 comes with a 180W charger if you check HIDevolution, meaning there's a 90% chance the Alienware 17 with GTX 1080 won't come with the 330W but with the 240W instead, which is still not enough..
     
    Spartan@HIDevolution likes this.
  6. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    If you configure a 1070 laptop in the Dell US store, it specifically lists the power adapter as 240W. Also, there's a 330W adapter listed as an optional extra to buy from the US store right now, so it seems pretty clear to me that they'll be using that for the 1080 version (otherwise, why would they even bother offering a 330W adapter?)...
     
    Last edited: Sep 25, 2016
    rinneh likes this.
  7. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,879
    Trophy Points:
    931
    hmscott, Cass-Olé and Papusan like this.
  8. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Because most likely no more than 2 RAM slots fit due to the thin "ECHO" chassis design, or Dell believes 2x16GB RAM is enough. Previous ECHO models also had 2 RAM slots (maximum 16GB RAM), but were rescued by the RAM manufacturers when they increased stick capacity to 16GB. Therefore you can have the whole 2x16GB of RAM in all AW "Echo" models now!! 32GB RAM is decent in laptops today :D
    And NOO. You can't use the graphics in the AMP with the internal card for SLI :cool: GTX 1080 BGA will come in a few months.
     
  9. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    Yup, it's already been seen in teardown videos that they only have 2 RAM slots.

    Yeah, supposed to be mid Nov; same time as the 120Hz panels.
     
    Spartan@HIDevolution likes this.
  10. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    I did not need to see Dell's videos or pre-reviews for this. I expected Dell to continue the same line as with previous AW ECHO models, so this was no surprise for me!! The same goes for only a single 2.5" SSD slot in their 17" model :cool: How many 17" laptop models out there have only a single 2.5" SSD slot nowadays??
     
  11. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    A lot? The MSI Ghost series, EVGA SC17, Clevo BGA machines, ASUS 17-inch machines, etc.
     
  12. captn.ko

    captn.ko Notebook Deity

    Reputations:
    337
    Messages:
    981
    Likes Received:
    1,252
    Trophy Points:
    156
    Don't feed the troll ;)
     
  13. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Not to feed the troll, but in the end, I presume, for most users it's just empty, unused space in their laptops. Why not make it more ergonomic and convenient?
     
    Last edited: Sep 26, 2016
    Spartan@HIDevolution likes this.
  14. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,879
    Trophy Points:
    931
    I really wish they'd give us 4 RAM slots and two 2.5" bays so I could finally put an end to this crappy keyboard nightmare I have to live with in my Clevo P870DM and go back to Alienware
     
    hmscott likes this.
  15. Cass-Olé

    Cass-Olé Notebook Evangelist

    Reputations:
    728
    Messages:
    338
    Likes Received:
    986
    Trophy Points:
    106
    @Papusan in your expert opinion, will you weigh in on the following, please: 17R4 6700hq + 1070 = option: add +$100 6820hk
    why add +$100 ... is it for a heatsink?
    let us pretend it costs $100 more for a different heatsink:
    • when I place 6700hq into my cart
    • I've purchased a heatsink for it, don't you agree?
    if I opt for 6820 - add $100:
    • I have to purchase the original HQ heatsink
    • pay for it 1st as cost at checkout
    • opt out of that HQ heatsink 2nd
    • then pay another $100 for the HK heatsink
    • on top of the HQ heatsink I already paid for
    Since the cost of the HQ heatsink is already built into the price before I opt into the 6820, which implies the HQ heatsink has a definite cost I have to pay (let us say all 6700HQ buyers are expected to pay $100 built in as the cost of their heatsink), does that not mean that the true cost to upgrade to the 6820 is more like $150 ... that's one hell of an expensive heatsink

    6820hk heatsink is like the price of two heatsinks
    • pay for hq style heatsink 1st
    • pay for hk style heatsink 2nd
    if the added $100 for the 6820 is not for a heatsink, & since they're both 45W & $378 MSRP, why is the 6820 $100 more?

    or is it for the 'dynamic' 4.1GHz over-clock?​
    If so, isn't a 4.1 over-clock as simple as pushing a button in the BIOS? $100 for them to push a button? The same button the owner can push the very 1st time it is powered up out of the box?

    What is 6820: Dynamically Overclocked up to 4.1GHz? Alienware's infamous for over-clocking just one core & charging for that one-core push-button over-clock, so is the 4.1 an OC on just one core? 180W Adaptor for 6820HK and Dynamic Overclocking

    edit: the original comments made here yesterday have been removed from this post, but they remain in the record of the new user posts which follow. Unedited question for @Papusan: "Last, wasn't 980m in the prior 17 only an 88watt card?"

    The user quotes above were brought into this post for proper vetting, along with the original bookmarked questions: "@Papusan in your expert opinion, will you weigh in on the following, please" & "Any clues Poppy". You may access my original text in follow-up posts for proper context.

    The new post beneath mine by user rinneh is about to make the irresponsible accusation against my user account that 'misinformation' was spread in this post. I've come back as an edit to challenge that accusation right here, right now, by calling on the data given by NBR's pre-eminent expert, user Prema, who may or may not weigh in regarding information vs misinformation in either of our posts, or may let his data speak for itself instead:

    From Prema Tech | Inferno
    Core: 1038Mhz
    Boost: 1127Mhz
    vRAM: 1250Mhz
    TGP (package): 103W
    TDP (chip): 85W

    user rinneh will link to Notebookcheck & Wikipedia as 'expert' sources for data; however, NBC clearly states their data is for MXM, not BGA; NBC clearly states a 122W TDP, refuted by Prema (103W TGP); Wikipedia simply states a nominal round (ideal) figure for the 980m of 100W TDP even, refuted by Prema's data that says 85W TDP

    user Luke Taylor is equally welcome to tell us where his source for an 88W Alien 980m comes from; note that 88W is closer to Prema's 85W than it is to the rinneh/NBR/wiki 100W data


    It is my opinion that not just the data & information provided by user rinneh, but also the use of a dubious source for that information as the evidence to try & blight my account in the post below, is the actual misinformation that's been spread, and it pretends to be 'correct', to settle the matter, and to save the forum from useless information: but when given Notebookcheck/wiki or Prema data, the one must bow to the other. It would be an odd thing to call my account into question for spreading misinformation if the information produced in reply itself becomes the misinformation, which it appears to be; that compounds the original folly of making the accusation in the 1st place. Irresponsibly so. I didn't spread misinformation, I clearly asked for the Luke Taylor 88W quote to be vetted by my friend & forum expert Papusan as correct or not

    user rinneh acts hostile to my account, so it is a given he's about to rush in headstrong & try but fail to undermine the tenets of my original post. This is what happens when an expert is called for & a non-expert gets in the way 1st
    ______________

    edit: Chipset can't explain add $100 for 6820
    user rinneh: "The HK requires a z series chipset thus making it more expensive (if i am correct on this point). Also again we dont know what dell is paying for both cpu's"

    As a near 7-day-a-week active apologist for Alienware, user rinneh offers the wishful, optimistic observation above that the 6820HK 'z-170 chipset' may explain the added $100. This is wishful thinking at its worst & a poor excuse, given that the 6700HQ & 6820 share the chipset due to their sibling Skylake i7 nature; the chipset information is on the 15 / 17 sales page itself, between the Display & Battery options, above the video card selector, & only required 5 seconds of research.

    Debunked: chipset explains $100 surcharge

    "Not knowing what Dell pays" is equally wishful thinking: user rinneh is an apologist for this company & wants there to be a good explanation for add $100 but thus far can not prove nor disprove it is a money grab, an idea which Papasun has no quarrel with in his opinion piece below. Dell uses OEM/Wholesale CPUs, bought in the 'miliions' at bulk discount ... one can hardly attribute 'what they pay for each' (cheap) on their end to account for $100 over the other
    _________________

    information v misinformation: let the forum decide
    From Prema Tech | Inferno
    Core: 1038Mhz
    Boost: 1127Mhz
    vRAM: 1250Mhz
    TGP (package): 103W
    TDP (chip): 85W​
    • 85/103 = x/180 > x = 148
    • 100/122 = x/180 > x = 147
    • 88/100 = x/180 > x = 158
    • TDP ≠ TGP
    • or what Frank Azor called a 150w 1080
    • example: 150/180 = x/100 > x = 83.33
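    For reference, here is a minimal sketch in Python of the cross-multiplication behind the proportions above; the scale() helper is a hypothetical name, and the figures are the ones quoted in this post (Prema's 980M data, the NBC/wiki numbers, the 88W quote, and Azor's 150W remark):

    # Solve a/b = x/reference for x (simple cross-multiplication).
    def scale(a, b, reference):
        return a / b * reference

    print(round(scale(85, 103, 180), 1))   # Prema TDP/TGP over a 180W brick -> 148.5 (truncated to 148 above)
    print(round(scale(100, 122, 180), 1))  # wiki 100W / NBC 122W over 180W  -> 147.5 (truncated to 147 above)
    print(round(scale(88, 100, 180), 1))   # 88W quote / wiki 100W over 180W -> 158.4 (truncated to 158 above)
    print(round(scale(150, 180, 100), 1))  # Azor's "150W 1080" / 180W brick -> 83.3%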
     
    Last edited: Sep 30, 2016
    Ashtrix likes this.
  16. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Again the misinformation. The 980M is a 122W TDP GPU (including its components such as the VRM). Power consumption is 100 watts.

    Thus the misinformation from some users continues.

    sources:
    https://en.wikipedia.org/wiki/GeForce_900_series#GeForce_900M_.289xxM.29_series
    http://www.notebookcheck.net/NVIDIA-GeForce-GTX-980M.126692.0.html

    With the new 10xx series, the laptop cards have slightly lower power consumption than the desktop cards due to better binning. The desktop cards need slightly less power than the 9xx series, but they are harder to cool due to the smaller chip die.
     
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Maybe 88W is standard for the 980m BGA, or Dell has crippled the graphics with firmware? I remember Dell crippled the GPU temperature limit for the GTX 780m. It took almost a year before Dell's engineers fixed the BIOS and increased it to 90 degrees for the Alienware 17 Nvidia GTX 780M!!
    Regarding the 1080 BGA: it will be interesting to see the maximum power consumption in benchmarks with Dell's 1080 BGA card vs. what others provide. This will give us 100% clarity on what Dellienware's engineers have done and on what Azor said about the maxed TDP for their 1080 graphics.
    I am reasonably sure that the i7 HK and HQ BGA use the same heatsink. The price difference between the HK and HQ BGA goes into the pockets of the shareholders. There is no other reason for this price difference... And Dynamically Overclocked up to 4.1GHz on just 1 core is nonsense... What software or games use only 1 core nowadays? Either Dell should set the max OC settings for 4 cores in the BIOS to 4x4.1GHz or clock down to 4x3.9 :cool:
     
    Last edited: Sep 26, 2016
    Ashtrix, TBoneSan and Cass-Olé like this.
  18. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    The BGA 980m is exactly the same as the MXM board. The difference is that it is integrated directly onto the motherboard. Also, the 3DMark scores don't reflect any differences either.

    The HK requires a Z-series chipset, thus making it more expensive (if I am correct on this point). Also, again, we don't know what Dell is paying for either CPU.
     
  19. zeroibis

    zeroibis Notebook Consultant

    Reputations:
    6
    Messages:
    235
    Likes Received:
    41
    Trophy Points:
    41
    Anyone know what PCIe SSD models Alienware will be shipping with these new laptops?

    Also, I want to comment that the issue I have with the screens is the lack of a 1440p option for the 15. You're forced to choose between 1080p and 4K. I do not see needing 4K on a 15", but 1440p would be nice; meanwhile they will have 1440p on the 17, where you really would want 4K, except I guess they still can not get a decently fast panel at that resolution. This also begs the question of why 15" users are forced to stick with 1080p just to get G-Sync and a display with a decent refresh rate...
     
  20. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    WQHD is coming later, if I am correct. But Dell is bound by the panels available and offered by the manufacturers.
     
  21. zeroibis

    zeroibis Notebook Consultant

    Reputations:
    6
    Messages:
    235
    Likes Received:
    41
    Trophy Points:
    41
    Given that the GPUs are now downclocked desktop versions, does this mean we will be able to use the desktop drivers?
     
  22. vkt62

    vkt62 Notebook Consultant

    Reputations:
    2
    Messages:
    241
    Likes Received:
    49
    Trophy Points:
    41
    I doubt it. The 1070m is different from the 1070 (the core count is higher on the m, but the clock is lower). The clocks and some other settings are also different on the 1060m and 1080m.


    Sent from my iPhone using Tapatalk
     
  23. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    Most people who buy a 17" laptop aren't looking to make it more ergonomic. I think most 17" laptop users would actually prefer more slots for HDDs / RAM.

    Can't please everyone. Personally I prefer having a 1080p 120Hz display to 1440p. Even with some current games the 1070 can't drive 120Hz at 1080p, and given I'm planning to keep it for several years, I wouldn't want a 1440p panel that it's going to end up struggling with.
     
  24. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    It depends, I guess. But I find it very annoying if the laptop base is too high. It makes using the keyboard for a long time quite terrible.
     
  25. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    I've never really found that to be an issue - if your palms are on the palmrest then the angle to the keys is the same, which means it's just a matter of desk/chair height adjustment.

    Besides, even in the case of the new 17, for all the fuss they're making about the new range being thinner, it's literally less than 5mm thinner (about a sixth of an inch), so I have a hard time imagining it makes that much difference.
     
  26. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Compared to the old version it doesn't make a difference, indeed. The chair is also important - something I can't avoid while being in Asia: cramped spaces with crappy chairs lacking an armrest. So the thicker the notebook, the less support I can get from a desk.
     
  27. vkt62

    vkt62 Notebook Consultant

    Reputations:
    2
    Messages:
    241
    Likes Received:
    49
    Trophy Points:
    41
    I would have preferred getting rid of the SATA slot for the HDD and just living with 2 to 4 M.2 slots on the laptop, at least for the 15 inch. They could have made it slightly smaller and lighter.


    Sent from my iPhone using Tapatalk
     
    rinneh likes this.
  28. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    The AW15R1 had 4 M.2 slots on the motherboard, but 2 of those were blocked by the 2.5-inch drive, unfortunately. I'd rather ditch the old-school SATA port as well.
     
  29. darkloki

    darkloki Notebook Deity

    Reputations:
    412
    Messages:
    1,829
    Likes Received:
    182
    Trophy Points:
    81
    They could have easily fit a full-size keyboard on the 15-inch version, couldn't they? I would understand a smaller keyboard had the system been thinner, but when I see other Dell 15-inch systems (such as my work system, the Latitude E6540) I can't help but wonder if it was possible for them to fit in a full-size keyboard, unless being centered is that important, but I don't think it is anymore...
     
    Last edited: Sep 26, 2016
  30. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    It is really personal preference. I, for example, hate off-center keyboards. Some can't live without a numpad.
     
  31. darkloki

    darkloki Notebook Deity

    Reputations:
    412
    Messages:
    1,829
    Likes Received:
    182
    Trophy Points:
    81
    Yeah, I believe (again, just my opinion here) the centering is primarily based on the touchpad rather than the actual keyboard. And when gaming I personally hardly ever use the touchpad, so centering isn't all that important. WASD is always to the left, so when the keyboard is centered it's to the left, and when it is off-center it is just more to the left, but again it's hardly noticeable to me between left and left-left, lol.

    My apologies if what I said above is way too confusing, but I think you will somewhat understand what I'm trying to say.

    I think this is why I said in earlier posts that I would get it if the laptop were thinner (like a Razer Blade or MacBook, for portable use). I didn't realize until now just why exactly, but this is my reasoning.
     
  32. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    Not really. They might be downclocked, but a mobile GPU has to support Optimus, MXM standards, custom power delivery, etc. - things that have no place in a standard PCIe 3.0 computer ;). So maybe we'll get a unified driver, but that's useless; it would only make the driver heavier. I'd rather have notebook drivers tailored for us.
     
  33. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    I think it is (in my case it is) about keeping the hands centered relative to the laptop chassis itself, not pushing them to the left side for the majority. Also, the Alienware keycaps are a tad larger. They would have to make those smaller as well to keep at least a small strip of space on both sides.
     
  34. Ashtrix

    Ashtrix ψυχή υπεροχή

    Reputations:
    2,381
    Messages:
    2,082
    Likes Received:
    3,291
    Trophy Points:
    281
    Is this really you? The man who used the Phoenix (FTL drives) wanting to go back to the pre-historic era?
    Cass-Olé gives us so much stuff every day about the false marketing / high-priced, low-grade hardware they are putting out these days, and god, look at the design of the new Aurora, man, wtf is that? The old Aurora was the true Alienware machine; this is just rotten junk knitted inside that old beast's skin. Not only that, all the new systems are like that - a preposterous load of junk.

    Also, even if they have MXM hardware I won't purchase them. Period. Why? Because of their super greedy and mega anti-consumer strategies. That's like welcoming a robber with open arms. Sorry, I lost the brand and its recognition in my heart, I just can't stand them, but they will always be remembered.

    The company has sunk down to the rotting pits of hell and it's not coming back. Instead of this Jaillienware / Failienware / Dellienware 100% gimped hardware and BIOS options, better to opt for MSI, where you can get a nice 2x SLI machine too. You wanted the best, right? GTX 1080 SLI is the best, and that option doesn't exist in Dell's book anymore. All their years of experience put into a book with "Passion and Enthusiasm, Niche, True to hardware, High performance, Dashing looks", all replaced with "Greed, Extortion, Gimping, Overpricing, Low-tier quality, Buzzwords".

    Alienware is the past. The present is an abomination. Are you ready for the genocide?

    I'm currently using desktop drivers here with INF mods, and it has been like that for many years. About performance improvements I can't say much, but the drivers I've used so far are the best Nvidia has put out.
     
    Last edited by a moderator: Sep 27, 2016
  35. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    You can use any desktop drivers on laptops with 980 DT cards :cool: And all Nvidia hotfix drivers can be used on both notebooks and desktops!! Using modded drivers is also a nice way to get rid of all the bloatware pushed out by Nvidia :D
     
  36. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,784
    Trophy Points:
    931
    Well, I wouldn't touch that with a 10-foot pole, but no... you can't do SLI with an internal GPU and an eGPU. I saw that someone did SLI with two eGPUs using two Thunderbolt ports on a MacBook. There is an old post on Tech|Inferno showing that.

    Yeah, the 980M MXM is over 88W just running stock. Start pushing it hard and it can go over 200W. In fact, I just saw that tonight benching in my hotel room using the AC unit under the window. Pulled just over 220W on the 980M running Fire Strike with the @Prema vBIOS. It will be interesting to see what happens with the BGA 1080 in terms of power draw and whether or not it keeps up with an MXM 1080. They will definitely need the 330W AC adapter for that, which may not sit well with the folks who are shooting for easier travel and less weight as their primary objectives.


     
    Last edited: Sep 26, 2016
  37. zeroibis

    zeroibis Notebook Consultant

    Reputations:
    6
    Messages:
    235
    Likes Received:
    41
    Trophy Points:
    41
    They already failed that objective when they got a 17" gaming laptop...
     
    TBoneSan, Papusan and Mr. Fox like this.
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,357
    Likes Received:
    70,784
    Trophy Points:
    931
    Well, I just did that with a 15-inch laptop. If Alienware takes a consistent approach with the new machines they are going to be the same product on the inside. So, for all intents and purposes the 15 and 17 should perform the same, give or take a little bit considering the 17 should run cooler than a 15. At stock they should be close to identical performance.

    Are they even going to offer 1080 in the 15-incher? I thought that was only going to be offered on the 17.
     
  39. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Quite interesting to see their new thin laptop with a massive 330W PSU. An odd combination :D Maybe Azor has something hidden? Maybe a newly designed 260W PSU will turn up for the new AW models with the 1080, so everything's in style? :cool:
    The AW15 R3 is confirmed with a max of the 1070.
     
    TBoneSan, Cass-Olé and Mr. Fox like this.
  40. Jeremy Vaughan

    Jeremy Vaughan Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    1
    Trophy Points:
    6
    I didn't quite get through the entire thread here, but I did see a post earlier that mentioned Optimus. I was told by Dell that the new notebooks don't support it at all, but I had read here that someone talked to a rep who said you could turn it off. If it is indeed still there, does that mean that even if it's off, the video is still going to run through the Intel GPU to the miniDP and just use the Nvidia card for processing? If that's the case, it still won't support a G-Sync external monitor? I need this thing to be able to run with my PG278Q with G-Sync over 120Hz.

    Also, one other question: how much do you think the cost difference will be between the 60Hz IPS and the 120Hz G-Sync display options? I'm trying to decide if I should wait for that.
     
    Last edited by a moderator: Sep 27, 2016
  41. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    I'm fairly certain it won't support Optimus. They've made no mention of it in their marketing, and with one of the panels supporting G-Sync it means that the screen needs to be connected directly to the GPU. It's possible that they have two internal connections (with the G-Sync panel connected directly and other panels connected to the integrated chip), or that they've implemented a mux switch (which theoretically would mean Optimus is supported on the G-Sync panel too), but I think either of these options is unlikely, and as before, if it did support it then I'm fairly sure they'd be advertising it.

    I expect it to be less than the 4K screen (based on the Clevo 120Hz option cost and the cost of desktop panels). Probably something like $100-$150 US to upgrade.
     
  42. noodles-the-cat

    noodles-the-cat Notebook Enthusiast

    Reputations:
    5
    Messages:
    15
    Likes Received:
    13
    Trophy Points:
    6
    So, is the general consensus that Alienware will be releasing the FHD 1080p panel with 120Hz or G-Sync support? I tried to make heads or tails of the big Facebook thread on the Alienware wall, and I'm left rather confused about what will be coming out "soon", besides the QHD 1440p 120Hz panel with G-Sync.
     
  43. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    The 17" laptop will have the following panels:
    1080p 60Hz IPS
    1440p 120Hz TN G-Sync
    4K 60Hz IPS

    The 15" laptop will have the following:
    1080p 60Hz IPS
    1080p 120Hz TN G-Sync
    4K 60Hz IPS

    I'm guessing your confusion is coming from the 15" vs 17" 120Hz G-Sync panels.
     
  44. noodles-the-cat

    noodles-the-cat Notebook Enthusiast

    Reputations:
    5
    Messages:
    15
    Likes Received:
    13
    Trophy Points:
    6
    Yes, sorry, I was referring to the 17" specifically.
     
  45. captn.ko

    captn.ko Notebook Deity

    Reputations:
    337
    Messages:
    981
    Likes Received:
    1,252
    Trophy Points:
    156
    The German configurator is online too. The GTX 1060 gets the 180W power brick, the GTX 1070 the 240W power brick. Let's hope they include the 330W brick with the 1080 :)
     
  46. ashknani

    ashknani Notebook Consultant

    Reputations:
    16
    Messages:
    210
    Likes Received:
    17
    Trophy Points:
    31
    I have contacted HIDevolution and the weird thing is that the FHD and 4K displays are both G-Sync enabled :/
     
  47. CarbonTwelve

    CarbonTwelve Notebook Consultant

    Reputations:
    15
    Messages:
    232
    Likes Received:
    108
    Trophy Points:
    56
    I think that'll just be a bad assumption on Hidevolution's part. I'd be trusting Alienware's specs (which say no G-Sync other than 120Hz panels) over Hidevolution's.

    Incidentally, has anyone who has ordered already got their laptop, or if not, do you have a shipping date?
     
  48. tinker_xp

    tinker_xp Notebook Consultant

    Reputations:
    10
    Messages:
    179
    Likes Received:
    36
    Trophy Points:
    41
  49. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    Will it be possible to upgrade the screen aftermarket to 120Hz if I buy the 60Hz version now?
     
  50. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    Ordered mine on the French side :). 240W for the AW15R3 1070 confirmed.
     