The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Alienware Dock

    Discussion in 'Alienware' started by Lp18, Oct 27, 2014.

  1. Lp18

    Lp18 Notebook Consultant

    Reputations:
    38
    Messages:
    266
    Likes Received:
    4
    Trophy Points:
    31
    Hello everyone, I just saw Engadget posting on Facebook about an Alienware dock, so I opened it and yes, it's real. For now it's only going to work with the Alienware 13 and the new laptops that come after it. It can handle anything up to 375 watts. If you find anything more, just post it. I enjoy the idea, but I would have enjoyed it more if it worked with my Alienware 18; they mention it needs a PCI-Express port, so yeah, that makes the "how about if we mod it" question a little more difficult.

    DSC01828.jpg

    EDIT: So it seems it's already out for whoever wants to buy it. It also uses a PCI-Express x16 slot, and although it has a 460-watt Alienware power supply, it's limited to a 375-watt graphics card. I'm not really sure if that's about the connection or to "give some edge" to a specific brand of graphics cards; it just seems a bit weird.

    Links:
    http://www.ign.com/articles/2014/10/28/alienware-13-laptop-offers-gpu-boosting-amplifier-add-on
    http://www.theverge.com/2014/10/27/7079879/alienware-new-graphics-amplifier-laptop
    http://www.extremetech.com/computin...nally-desktop-quality-graphics-on-your-laptop
    http://www.engadget.com/2014/10/27/alienware-graphics-amplifier/

    Video:
    http://www.youtube.com/watch?v=gSmH3f8i8HU

    WebPage of the product:
    http://www.dell.com/content/product...c=us&cs=19&l=en&s=dhs&sku=452-BBRG&redirect=1
     
    radji likes this.
  2. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    If someone wanted to be a bit daft but go to the extreme, you could technically fit a GTX TITAN Z in the GA.
    Scratch that, never mind. You can't, because it's too wide.
     
  3. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Nice!
    Finally I can use a Titan Z/980 with my Alienware 13's mobile dual-core ULV processor!!!! This is as stupid as it gets unless it can be used with an updated/refreshed A14 or A17.
     
  4. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Actually, you can't install a TITAN Z because the GA only supports up to dual-slot GPUs.
     
  5. hypersonic

    hypersonic Notebook Consultant

    Reputations:
    236
    Messages:
    186
    Likes Received:
    45
    Trophy Points:
    41
    Too much excess space in that enclosure.
    It's nice that the PSU is replaceable,
    but a custom PSU could make the dock much, much more compact.
     
  6. ForestEX

    ForestEX Notebook Guru

    Reputations:
    4
    Messages:
    68
    Likes Received:
    3
    Trophy Points:
    16
  7. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,727
    Messages:
    29,852
    Likes Received:
    59,676
    Trophy Points:
    931
    Does anyone think the Alienware 13's mobile dual-core ULV processor will be the bottleneck with a GTX 980/980 Ti? :D
     
    TBoneSan likes this.
  8. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    Excessive space makes for better airflow -> better cooling.

    That said, it's a step in the right direction for those Alienware users who want to use their Alienware 13 as a laptop on the go and a gaming machine at home. This gives them the better option of using a desktop GPU for gaming at home.
     
    Game7a1 likes this.
  9. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Here's an interesting tidbit from the Verge:
     
  10. hypersonic

    hypersonic Notebook Consultant

    Reputations:
    236
    Messages:
    186
    Likes Received:
    45
    Trophy Points:
    41
    By excessive space I meant the space behind the PSU, populated by the cables,
    but yeah, I believe they're preparing it for non-blower cards on the market, or even dual-GPU cards.
    Maybe the form factor is just plain not appealing to me.
    Such a "tube" shape is extremely uneconomical in space.
    I would love to see something like this; I believe it can be done?
    [IMG]
     
  11. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
  12. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    The current blower cards suck in air from the side (top of the fan) and blow the hot air out of the vents at the rear of the card. The cables appear to be on the opposite side of the GPU's PCB, so I can't see how they would impede airflow on that side. In this case, the intake on the GPU fan is right next to a vent on the dock's shell. Dunno how well that will or will not work, but it's better than nothing, I suppose.
     
  13. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    That information conflicts with other sources saying the PSU is 375 W and the PCI-E slot is x16 (even the same source you pulled up, which is odd). EDIT: Never mind, it is PCI-E x4, but this is due to a CPU limitation.
    Still, you are only using one GPU.
     
    TBoneSan likes this.
  14. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    Why in the hell would Alienware go thru the trouble of engineering an eGPU dock and only make it PCIe x4???

    That-makes-no-sense! :err:
     
  15. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    The CPU is what is limiting the bandwidth to x4, not the dock.
     
    TBoneSan likes this.
  16. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    I wonder if the GS30 Ghost can do x16 with the external MSI dock? I'd buy that over the Alienware 13 if it's possible.
     
  17. elevul

    elevul Notebook Consultant

    Reputations:
    0
    Messages:
    203
    Likes Received:
    4
    Trophy Points:
    31
    I would love it if they made it work with a slim Alienware 17 that only had Intel integrated graphics and a 17" 4K display!
     
  18. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    If it has a quad-core i7, it should support x16 or x8. If it's a dual-core, it'll be x8 or x4. PCIe 3.0 x8 is enough for current GPUs unless you're running a Titan Z or R9 295X2.
     
    Mr. Fox and radji like this.
  19. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    if only they came up with a better docking solution...
     
  20. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    They should have just thrown in a 970M, called it a day, and skipped the whole dock. The 860M is no slouch, but using it in a new product when they could have engineered a 970M into it is absurd. And that 15 W CPU just makes it that much less attractive.
     
    Mr. Fox likes this.
  21. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,907
    Trophy Points:
    931
    ULV CPUs only have that many lanes, though the lane count is fine for a high-end GPU.
     
  22. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    OK, that makes sense. But then, why go thru the trouble of an eGPU if it will be limited by the low-voltage CPU to only PCIe x4? Wouldn't the 860M inside be sufficient?

    No, it's not.

    Even my old M17x R2 supports PCIe x8 across 2 cards. :confused:
     
  23. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,907
    Trophy Points:
    931
    SLI and single card bandwidth requirements are very different.
     
  24. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    That's not the point.

    Yes, PCIe x4 is sufficient for run-of-the-mill or older GPUs. Those don't need to communicate with any other component in the computer at x16 speeds, so not having x16 won't slow anything down at all.

    So a GPU running at 4 lanes will have very little (if any) performance drop.

    See: HARDOCP - GTX 480 SLI x16/x16 vs. x4/x4 - GTX 480 SLI PCIe Bandwidth Perf. - x16/x16 vs. x4/x4

    But for a high-end GPU, you should have a non-ULV CPU that supports at least x8. Those GPUs are designed with the higher PCIe bandwidth in mind.

    Hence my surprise with the fact Alienware would even go to the trouble in engineering an eGPU dock for the Alienware 13 if the laptop's CPU was limited to x4 anyway. :confused:
     
  25. woodzstack

    woodzstack Alezka Computers , Official Clevo reseller.

    Reputations:
    1,201
    Messages:
    3,495
    Likes Received:
    2,593
    Trophy Points:
    231
    Wow... so wait, basically, if I am understanding this correctly: if they were to release the 13 with something powerful like a 980M (which is of course not going to happen), it would be bottlenecked by the CPU's lanes, not by the fact it's pretty slow to begin with?

    So, basically, there is not enough bandwidth to really drive 4K at game-like frame rates, even from a desktop GTX 980. Maybe not even enough bandwidth for a 780, never mind a 780 Ti or variants.
     
  26. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Yes, x4 is the highest the 4210U supports: two x4 links, and they're PCI-e gen 2, not gen 3.

    Definitely not the fastest, but cards should still perform decently, especially at higher resolutions and in GPU-bound games.
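    As a rough sanity check on the numbers above (an editor's sketch, not something posted in the thread), the theoretical per-direction PCIe bandwidth follows from the per-lane rates: gen 2 runs at 5 GT/s with 8b/10b encoding (0.5 GB/s per lane), gen 3 at 8 GT/s with 128b/130b encoding (about 0.985 GB/s per lane):

```python
# Theoretical one-direction PCIe bandwidth, ignoring protocol overhead.
# Gen 2: 5 GT/s with 8b/10b encoding  -> 0.5 GB/s per lane.
# Gen 3: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane.
PER_LANE_GBPS = {2: 5 * 8 / 10 / 8, 3: 8 * 128 / 130 / 8}

def pcie_bandwidth(gen, lanes):
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

print(f"x4 gen 2 (the Amplifier link): {pcie_bandwidth(2, 4):.2f} GB/s")
print(f"x16 gen 3 (a desktop slot):    {pcie_bandwidth(3, 16):.2f} GB/s")
```

    So the Amplifier's x4 gen 2 link offers about 2 GB/s each way, roughly an eighth of what a desktop x16 gen 3 slot provides.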
     
  27. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,250
    Messages:
    39,346
    Likes Received:
    70,714
    Trophy Points:
    931
    I agree with the critics, but all of this hubbub about 4x, PCIe 2 versus 3, etc., really should not be a big deal considering the size, weight and capacity for performance. Anyone that would go for a machine the size and caliber of the 13, or even a large machine by Ultrabook standards, should not be looking for face-melting performance anyway. If they are, they are about to make a huge mistake and need to stop to re-evaluate the situation. These products are not designed to be desktop replacements. They are designed to be super thin, lightweight, easy for anyone to grab and run with, and capable of playing games if you do not already have access to something that is better-suited for that purpose.

    The eGPU is there to try to help the 13 make up for the ULV CPU and 860M shortcomings. Bottlenecked performance should still be a lot better than the machine by itself without the eGPU. Not saying that's a great thing, only that it has got to be better than not having the eGPU. And, it's got to be a humongous improvement over playing Android games on some piece of poop smartphone with a high-def touch screen, even without the eGPU. (Yes, there are people that actually do that. I don't get it, but they're definitely out there.) This is the niche where Ultrabooks serve their most legitimate purpose.
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,907
    Trophy Points:
    931
    A GTX 970 or a "960" (whatever that may be) is likely the best pairing as it does offer 1080p gaming opportunities at home.
     
  29. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    But it still defeats the purpose of using one.

    The whole point of using an eGPU is to make use of the higher power and speed of a desktop GPU when paired with the 13. The discrete GPU in the 13 is specifically designed to give maximum performance within the confines of the ULV's PCIe bandwidth limitations.

    The questions I have are: will an eGPU (no matter which one) be able to outperform the discrete GPU in the 13 when throttled down to x4? If it can outperform the discrete GPU, then by how much? And is the throttling worth the cost of buying the dock and a desktop GPU to go with it?
     
  30. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Something like a GTX 980 would easily outperform the GTX 860M even at x4 speed. The performance loss should only be around 10-15% max compared to an x16 port. A dual-GPU card would of course suffer more from this.
     
  31. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    That is, if a dual-GPU card could fit inside the GA. Only the GTX 690 and HD 7990 can fit in there; most other dual-GPU cards won't fit.
     
  32. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Doesn't mean nVidia/AMD won't release a dual slot GTX 990/R9 395X2. I can see a GTX 990 coming. It would use less power and produce less heat than a GTX 690.
     
  33. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    You're right. The 900 series is too big to fit in that. Same with the TITAN Z. The 700 series also looks too big for the dock.

    I can't see either company doing that. Said card would be too bulky, even for a dual-width. You would either compromise airflow due to proximity with other PCIe cards, or not be able to use both PCIe slots adjacent to the card due to said bulkiness.
     
  34. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    I really did not want to make that post, so crap.
    I actually researched a bit more, and only the R9 295X2 and TITAN Z can't fit in there. All other dual-GPU cards can fit and are usually at or under 375 W. As long as the card isn't triple-slot, it will fit in there.
     
  35. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    The regular TITAN looks like it can fit. I just wonder what the benchmark comparison between x4 and x16 will be.
     
  36. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    It can fit any two-slot, full-size card, which means almost every two-slot card out right now will fit. The TITAN Z is three-slot and the 295X2 has a radiator attached, so those are out of the question. Otherwise, it'll work with most cards. It supports 375 W of power; that's dual 8-pin connectors.

    Here's an x16 to x8 example of performance: Impact of PCI-E Speed on Gaming Performance - Puget Custom Computers

    While x4 is slower, it's not like you can only use 50% of the GPU; you can use a lot more.
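    The point that x4 does not halve GPU performance can be sketched with a simple Amdahl-style estimate (an editor's illustration with a made-up bus-bound fraction, not a measurement from the thread): only the share of frame time spent moving data over the link stretches when the link gets slower.

```python
def slowdown(bus_fraction, link_slowdown):
    """Amdahl-style estimate: total frame time grows only in the share
    spent on PCIe transfers (`bus_fraction`), which stretches by
    `link_slowdown` when the link narrows."""
    return (1 - bus_fraction) + bus_fraction * link_slowdown

# Hypothetical: 3% of frame time on the bus, and an x4 link that is
# 4x slower than x16 at the same PCIe generation.
factor = slowdown(0.03, 4)
print(f"frame time grows {factor:.2f}x -> ~{(1 - 1 / factor) * 100:.0f}% FPS loss")
```

    With these made-up numbers the loss lands in the single digits, in the same ballpark as the 10-15% worst case mentioned above; a game that streams a lot of data over the bus would have a larger bus-bound fraction and lose more.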
     
  37. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,250
    Messages:
    39,346
    Likes Received:
    70,714
    Trophy Points:
    931
    So, 2015-grade 25K 3DMarks on GPU and 2006-grade 3K 3DMarks (LOL) on CPU... electronics schizophrenia.
     
    TBoneSan likes this.
  38. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    After playing around in Star Citizen and watching it eat all my CPU had to offer, I can safely say this 13 should get around 8 FPS. Awesome if you enjoy the slideshow effect.
     
    Mr. Fox and Harryboiyeye like this.
  39. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,907
    Trophy Points:
    931
    Reminds me of putting the 7770M OCed to 1 GHz in my Acer with a 2.53 GHz Core 2.
     
    Mr. Fox likes this.
  40. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    They need to release a "CPU Amplifier" box too.
     
    reborn2003 likes this.
  41. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Maybe if they had a server chipset like the C606, you could have dual Xeons with 2S scaling :D
     
  42. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,907
    Trophy Points:
    931
    Lol, that would be funny, but yes, with CPUs latency would be a real killer.
     
  43. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,250
    Messages:
    39,346
    Likes Received:
    70,714
    Trophy Points:
    931
    Would be like back in the days of adding a math coprocessor to your 486SX system after you realized how bad you screwed up by not paying extra for the 486DX in the first place. Man, the stupid things we do to try to save a few bucks. It sometimes ends up costing more in the long run to fix it. It takes money for performance not to suck. It did way back then and it does now. Some things don't change.
     
  44. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Is latency really still an issue on dual-socket CPU systems? How bad is it with Ivy-E? I've always wanted to build a dual-socket 2011 platform.
     
  45. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    While Ivy-E has shown improvements, people need to realize: Xeon was designed to be a workhorse. It's not meant for speed.
     
  46. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Oh, of course. Dual 12-core Xeons would render much faster than even an overclocked 8-core 5960X, but what I'm saying is: let's say you have a multipurpose desktop and you want to also game on it. Would there be horrible lag during gameplay (FPS dips, etc.), assuming the game is highly threaded?
     
  47. radji

    radji Farewell, Solenya...

    Reputations:
    3,856
    Messages:
    3,074
    Likes Received:
    2,619
    Trophy Points:
    231
    Few (if any) games available today are coded to properly take advantage of an octacore or dodecacore Xeon.
     
  48. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
  49. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,907
    Trophy Points:
    931
    Hey, I have no issue with a port on the 17, since it has a lot of CPU horsepower and it could make extending its life as a desktop replacement easier.
     
    reborn2003 likes this.
  50. woodzstack

    woodzstack Alezka Computers , Official Clevo reseller.

    Reputations:
    1,201
    Messages:
    3,495
    Likes Received:
    2,593
    Trophy Points:
    231