The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    I leave the AGA plugged in while uninstalling and installing the GPU drivers.
     
    etern4l likes this.
  2. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    BTW in the m15 the internal dGPU is still active alongside the eGPU, right?
     
  3. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    If I remember correctly, yes. However, one weird thing is that most games force themselves onto the AGA GPU. For example:
    Laptop HDMI to TV + AGA = the game crashes with some dumb error that makes no sense (I forget what it says, but I know it's really dumb).
    Laptop + AGA HDMI to TV = works fine for normal games.
    However, I am pretty sure something like RPCS3 set to specifically use the laptop GPU works fine.

    Edit: to clear things up, the AGA does work without even using the DP/HDMI-out on it; it can use the laptop's ports. But a few times I have seen some really weird game errors, so weird I can't even recall what they said because they made no sense. It always happened when I used the HDMI-out from my laptop to my audio receiver and then to my TV. I switch them around and forget about it every so often: for example, I won't turn on my 2080 for some Netflix/movies when the laptop GPU is sufficient, but then I forget to switch the HDMI back over (my receiver only has one 4K60 HDR port).
     
    Last edited: Jun 30, 2020
    etern4l likes this.
  4. axtran

    axtran Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    1
    Trophy Points:
    6
    Great news! Upgraded my GPU last night to an XFX 5700 XT blower. Runs great, quiet AND cool! My AGA is definitely modified (Corsair CX650M PSU, Noctua 92mm FLX fan).
     
    etern4l likes this.
  5. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    I wonder if Thunderbolt 4 will replace this as the best external graphics option with its 32 Gb/s PCIe support, or if Alienware will update to PCIe 4.0 with next-gen systems.
     
    etern4l likes this.
  6. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Has anyone proven the AGA to bottleneck a GPU by more than 1-3% yet? Tbh, after matching my 2080 scores against 2080s in desktops, they're pretty much identical; I didn't notice any decrease at all. I actually remember a few scores being better with my 2080 in the AGA, but that was after undervolting. Still, from what I gather, the AGA isn't holding back anything at the 2080 level or below. Can't speak for the 2080 Ti and above.
     
  7. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
  8. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Most of those are 1-3 FPS differences, which I'd call margin of error. Everything on the internet suggests even an RTX 2080 should be bottlenecked by the AGA, and by a humble 7700HQ for that matter, yet neither happened in my gaming or benchmarks. That's why I'd like to see something like a 2080 Ti in the AGA vs. a 2080 Ti in a desktop for Heaven or Fire Strike. I am sure someone on this forum has a 2080 Ti and a Fire Strike/Time Spy score to share with it in the AGA. The best test would be to compare the same GPU at stock in the AGA and then in a desktop, but I'd be satisfied with just an AGA score from someone on this forum to compare against. I'd rather not use a score from Reddit or somewhere, where they may not know what they're doing or may be using the laptop screen as well lol
     
  9. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
  10. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    Unfortunately, there is not a simple answer to this question. I'll try to give the best explanation I can, because it gets asked a lot and misinformation abounds. For context, my comments apply to the recent Alienware models after the 15R3 / 17R4. Some of the earlier devices did wacky things with the routing of PCIe lanes and resource allocation that make them a little different. TL;DR at the end.

    Cause of the AGA Bottleneck

    The bottleneck imposed by the AGA is a direct function of how much information needs to be sent over its 4x PCIe 3.0 bus. The interface technically suffers a very small amount of bandwidth loss compared to the 4x PCIe 3.0 bus of a typical desktop computer because of the way it is implemented in the AGA: the Caldera connector, the cable length and the inferior EMI shielding inherent in its design all contribute, but it's a very small amount we can ignore for the sake of explanation.

    The amount of information traversing the PCIe bus is primarily determined by how much work the CPU has to perform and subsequently transmit to the GPU, and vice versa. In most cases--especially when gaming--the majority of PCIe traffic travels from CPU to GPU, so this is where we are most worried about bottlenecking. This assumption does not necessarily hold when the GPU is working on a task in a headless configuration (no display being drawn at all) or when an image is being rendered on one GPU and drawn on a display attached to another GPU. For simplicity, let's assume we are discussing the AGA bottleneck when gaming on a display directly attached to the GPU in the AGA. When using the AGA to render an image and output it on the internal laptop display, users will experience a wide range of performance results, influenced by the model of laptop, the version of Windows being used and the OS settings applied. Microsoft has been very active in finding new ways to accommodate systems with multiple GPUs, rendering and outputting on different GPU devices, and GPU scheduling, so the results in such a configuration are likely to change even over the next few months.

    Examples of where the AGA bottleneck does not exist
    If an AGA bottleneck is a function of how much information traverses the PCIe bus, let's use a few practical examples to understand where the PCIe bus limits do and do not exist. I am going to generalize a bit here for the sake of example, but the result holds true on average. In almost all cases, a CPU will have more information to transmit across the PCIe bus to the GPU when frame rates increase, all else constant. Frame rates are not the only factor, but they are an easy one to understand. For most games, a CPU generates much more information to send to the GPU at 120 FPS than at 60 FPS, assuming the frame rate is all that changes; such is the case when a frame limiter is used and no other visual settings or the resolution are changed. High CPU loads alone are insufficient to create a bottleneck. For example, in Civilization 6, the AI turn times towards the end of the game can get very slow while the CPU works hard to calculate the thousands of actions the AI must perform. Despite working the CPU hard, these late-game AI turns have largely no effect on any bottleneck imposed by the AGA, because very little of the work performed by the CPU needs to be sent to the GPU.

    Similarly, extremely high GPU loads alone do not exacerbate an AGA bottleneck. A very simple game requiring minimal CPU power but drawn at an absurd 16K resolution will not require a lot of information sent across the PCIe bus, but will require an enormous amount of horsepower from the GPU. Another extreme example of GPU bottlenecking with minimal stress on the PCIe bus exists in a lot of non-gaming GPU work, such as data science and cryptocurrency mining. If you have a moment to look at a cryptocurrency mining motherboard, you'll find many have a dozen or more 1x PCIe 2.0 slots to populate with powerful GPUs. Despite each one of these slots having 1/8th the bandwidth of even our lowly AGA, the 1x PCIe 2.0 interface does not bottleneck even the most powerful GPUs when it comes to crypto mining. The PCIe bus in this case is only used to send raw data to fill the VRAM, which the GPU slowly processes; the rate at which the GPU works through the data stored in its VRAM is slower than even the abysmal bitrate of a 1x PCIe 2.0 interface. Again, no bottleneck exists.
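
    To put rough numbers on those comparisons, here is a minimal back-of-the-envelope sketch in Python (theoretical link rates only; real-world throughput is a bit lower due to protocol overhead):

        def pcie_bandwidth_gbs(gen, lanes):
            """Approximate one-direction bandwidth in GB/s for a PCIe link."""
            # Per-lane data rate after line encoding:
            #   PCIe 2.0: 5 GT/s with 8b/10b encoding    -> 0.500 GB/s per lane
            #   PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
            per_lane = {2: 5.0 * (8 / 10) / 8, 3: 8.0 * (128 / 130) / 8}
            return per_lane[gen] * lanes

        aga     = pcie_bandwidth_gbs(3, 4)    # AGA link: ~3.94 GB/s
        riser   = pcie_bandwidth_gbs(2, 1)    # mining riser: 0.50 GB/s, ~1/8 of the AGA
        desktop = pcie_bandwidth_gbs(3, 16)   # desktop x16 slot: ~15.75 GB/s, 4x the AGA
        print(f"AGA {aga:.2f} | 2.0 x1 {riser:.2f} | 3.0 x16 {desktop:.2f} GB/s")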

    Where the AGA bottleneck does exist
    So if the above extreme cases are where an AGA bottleneck does not exist, where will an AGA bottleneck be the most pronounced? The answer is where neither the GPU nor the CPU is a bottleneck itself, yet the information transmitted from CPU to GPU remains high. One such situation is playing relatively modern titles at low resolutions and exceptionally high frame rates. Such is the case with esports titles, where a game like CS:GO is still regularly played at very low resolutions like 1024x768. Even these games at competitive graphics settings are likely to hit a CPU or GPU bottleneck before the PCIe bottleneck on some hardware. Such was the case with my first AGA-compatible 15R4 (i7-8750h), where I used a GTX 1080 in the AGA. The worst PCIe bottleneck I ever experienced (and was able to reliably measure) with this system was in CS:GO at 1024x768 and low settings. The AGA bottleneck in this scenario was an average frame-rate reduction of about 10%, measured against the exact same GPU in a desktop with an 8700k tuned to the equivalent TDP and frequency limits of the Alienware's 8750h. CS:GO at 1024x768 low was the worst result I was able to identify after trying about 8 different games, all at unrealistically low resolutions. Rocket League at 1280x800 also exhibited very similar behavior. While this is a seemingly significant result, the resolution is ridiculously low. I never experienced a bottleneck with this 15R4/AGA GTX 1080 combination when playing a game at 1080p or higher, no matter the game or graphics settings: the CPU or the GPU would hit a wall before the PCIe bus bottlenecked.

    The 15R4 and GTX 1080 are showing their age, so what about a more modern Alienware configuration? At the other end of the spectrum is my current system: an Area 51m with a 9900ks (tuned to only 4.4 GHz for heat and noise) and a Titan RTX in the AGA. This is more power than almost anyone else will try to force through the AGA, but it presents a useful anecdote at the other extreme. In the same conditions as the earlier CS:GO test on the 15R4, the Area 51m exhibits an average reduction of 28% in frame rate compared to the exact same CPU (identical power and frequency limits) and GPU in a Z390 desktop build. While this is a seemingly huge deficit compared to that exhibited by the 15R4, remember this is at 1024x768, low settings, and with the best CPU + GPU combination available in an Alienware laptop + AGA (at least until the A51m R2 hits shelves). At a more realistic 1080p, the deficit was only 11% on average compared to the desktop equivalent. The gap shrank to 7% and 4% at 1440p and 4K respectively.

    While I was never able to reliably measure a drop in performance for the 15R4 + GTX 1080 system compared to its desktop counterpart at 1080p or above, the same cannot be said for the 51m + Titan RTX system. Across the 8 games I tried, my experience is that you should expect a decrease in performance of 5-10% at 1080p when using the AGA with this level of elite hardware, compared to a desktop equivalent. To oversimplify the result a bit: as the average FPS of a game increases, so too does the prevalence of an AGA bottleneck. At 1440p, the performance difference shrinks to 4-7%, and at 4K it closes to 2-5%.
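
    For anyone who wants to run the same comparison on their own hardware, the deficit figures above are just the relative frame-rate loss against a desktop baseline; a trivial sketch (the FPS pairs below are made up for illustration):

        def aga_deficit_pct(desktop_fps, aga_fps):
            """Percent frame-rate loss of the AGA run vs. the desktop baseline."""
            return (desktop_fps - aga_fps) / desktop_fps * 100

        # Hypothetical averages from the same game, settings and CPU/GPU tune:
        print(f"{aga_deficit_pct(300, 270):.1f}%")  # 300 vs 270 FPS -> 10.0%
        print(f"{aga_deficit_pct(144, 138):.1f}%")  # 144 vs 138 FPS -> ~4.2%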

    I have also tried similar testing on the 51m with a 2070S and a 2080 in the AGA. I concluded there is effectively no practical difference between using them in the AGA and in a desktop: perhaps 1-3% variation at most, but I'm certainly not sharp enough to recognize a difference that small, and it could very well be within the hardware's margin of error. The difference grew slightly at 1080p, but always remained within 5% of the desktop.

    Last thoughts
    A few last things to mention. First, Nvidia is emphasizing GPU features such as tensor cores and raytracing that largely perform without help from a CPU. Turning on raytracing on a compatible AGA GPU results in the frame-rate reduction we would expect from raytracing, but it also causes the GPU to become the bottleneck earlier, reducing the performance impact attributable to the AGA. Said another way, the cost of enabling raytracing on an AGA RTX GPU is comparatively less than enabling it in a desktop: if a desktop were to drop 20% in frame rate by enabling RTX, enabling it on an AGA RTX GPU might only drop your frame rate by 15%. This isn't particularly novel, but if features like raytracing are important to you, the AGA won't hold you back.

    Second, Microsoft is moving towards more work being done on the GPU without help from the CPU, through scheduling and DX12. I expect this to further reduce the stress on the PCIe bus as functions move directly to the GPU and no longer need to traverse the bus. I plan to test some of the new Windows and Nvidia driver updates in the AGA next month to see what improvements have been made.

    Third, many gamers with the money to spend on a fancy Alienware laptop, an AGA and another GPU will also do their AGA gaming on an external monitor with a resolution higher than 1080p. Very little bottlenecking exists under these conditions, even with extremely high-end hardware. Only the most devout esports gamers who want to play at exceptionally low resolutions (<720p) and high FPS should shy away from the AGA in favor of a desktop because of AGA bottlenecking.

    Lastly, a well-thought-out system and properly configured game settings will always provide the best experience. Please don't pair an 8750h with a 2080 Ti and expect the same FPS as people with the same GPU in a 9900k desktop. Your laptop chip is inferior to the desktop ones used for GPU reviews; even the 9900ks in my 51m is grossly inferior to the same chip unrestricted under Noctua copper in a desktop. This is not the fault of the AGA, but rather the result of our world's current fascination with thin-and-light machines. You can always adjust your graphics settings to find the balance of visuals and frames that minimizes any premature GPU, CPU or AGA PCIe bottleneck.

    Overall, the AGA is still a fantastic performer in 2020 and outperforms any TB3 enclosure.


    TL;DR
    • Anyone with an i7-9750h and an RTX 2080 or lower in the AGA will experience little to no bottlenecking due to the AGA when using an external monitor of 1440p or above. Folks using 1080p displays can expect a deficit of 5-7% at most compared to an equivalent desktop.
    • People with elite hardware (i9-9880h / AGA RTX 2080 Ti) should expect a worst-case AGA bottleneck of 10% at 1080p, 7% at 1440p and 5% at 4K compared to an equivalent desktop.
    • Severe cases of AGA bottlenecking can be seen at the extremes, with as much as a 30% decrease in frame rate in certain esports titles played at 1024x768 and low settings. These experiences are rare and limited to situations where very high-end hardware (51m + 9900ks + AGA Titan RTX) is used with exceptionally low visual settings.
     
    Gumwars, FXi, illuMinniti and 2 others like this.
  11. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    The entire post is very good info, but I don't want to quote it all and clutter the thread. You are right, though: I have stupidly been giving my opinion based only on 4K benchmarks, since I won't ever be using my 2080 for 1080p. Funnily enough, I did want to try my 2080 on my 1080p 240Hz monitor... but Modern Warfare (2019) doesn't like the AGA for some reason. It crashes and gives some dumb error if I have the AGA connected when I load the game. Other games had no problem with it, like RB6 Siege. I'll have to give it a try sometime to see if it works nowadays; I did mean to message Activision in case it's a problem everyone with an AGA has with the game.
     
    alaskajoel and etern4l like this.
  12. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    I don't have any issues with the AGA and Modern Warfare, and it looks like we have the same system, except I am running a 9750H.
     
    illuMinniti likes this.
  13. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    I hope this remains true with the RTX 3080 Ti/3090 or whatever they wind up calling it. My worry is that with the next-gen cards moving to PCIe Gen 4, PCIe Gen 3 x4 will have a bigger performance impact. I only game at 4K, and I skipped the 2080 Ti because I did not feel ray tracing at 4K was there yet.
     
  14. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    How long have you been using the AGA with Modern Warfare (2019)? I tried when it released, and even a couple of months ago, and it always gave the same error. I always do clean GPU driver reinstalls, though, so I wonder why it wouldn't work for me unless it was fixed recently. No other games had problems, which is why I didn't even bother or care; especially with people having many complaints about the game not working in general, I was satisfied with the notebook GPU.
     
  15. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    It worked for me a few months ago; I tested it again yesterday and it worked fine.
     
    illuMinniti likes this.
  16. Soprano523

    Soprano523 Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    1
    Trophy Points:
    6
    Hello,

    I know this isn't a technical support forum, but I have pretty much nowhere else to go for help, because Dell won't assist me since the GPU I'm using is not officially supported. I bought a Galax RTX 2060 Super and I am having trouble making it work in my Alienware Graphics Amplifier. The 2060 Super is not showing up in Device Manager; the lights on the GPU are on, but the GPU is not being detected and the fans are not spinning.

    My specs for my Alienware 15 R2:
    Intel Core i7 6700HQ
    32GB RAM
    NVIDIA 965m (internal GPU)
    Windows 10 (version 2004)

    I have tried uninstalling and reinstalling the latest NVIDIA drivers, updating the BIOS (and the GPU's own BIOS), connecting only 6 pins instead of 8 from the PSU to the GPU (I later connected all 8 pins), updating all the Alienware software (Command Center and the AGA software), and connecting an external monitor. Nothing has worked (keep in mind that the 2060 Super fits in the Graphics Amplifier perfectly fine). I have tested the 2060 Super in another desktop and it works flawlessly, so it is not an issue with the GPU. I have also tested the Graphics Amplifier with a different GPU (1060 3GB) and it works fine, so the Amplifier itself is not faulty.
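
    For anyone wanting to reproduce the basic check, here is a minimal sketch (Python on Windows) that lists every display adapter the OS can see; if the 2060 Super is absent from this list, the failure is below the driver level. (wmic is deprecated but still ships with Windows 10 version 2004.)

        # Lists every display adapter Windows enumerates, working or errored.
        import subprocess

        out = subprocess.run(
            ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
            capture_output=True, text=True,
        )
        print(out.stdout)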

    If there is anything anyone can think of that I haven't tried yet that might make it work, please let me know.
     
    etern4l likes this.
  17. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    That is actually extremely weird. My first thought was "how can a GPU not be supported if it simply fits in the PCIe slot and is within the PSU wattage?", so I searched for the AGA supported-GPU list. The 2060 just isn't on it at all. It's weird how a 1060 is compatible and a 2080 Ti is compatible, but apparently a 2060 is considered problematic/unsupported by them? I can't think of a reason why that would be, but it might actually just be a hardware limitation of the 2060 cards.

    I actually did yet another fresh install of AWCC and the GPU drivers a month or so ago, so I am about to try this again and see. Thanks for the info.
     
  18. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Have you asked support when they are planning to pull the finger out and update the drivers as well as the compatibility page:

    https://www.dell.com/support/articl...mplifier-supported-graphics-card-list?lang=en

    There is no good reason for them not to support your card, and look at the list of supported systems: 2 years old!

    Ask on Twitter.
     
    Last edited: Jul 10, 2020
  19. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    This is what I get any time I try to use my AGA with Modern Warfare. I wonder if it simply doesn't like sending the signal from AGA to laptop to monitor. Every other game is fine with it, but I probably didn't care enough to grab an HDMI cord to try it plugged in directly.
    [image: screenshot of the Modern Warfare error]
    EDIT: Yep, just tested the game by plugging HDMI directly into the AGA; CoD runs fine. Weird that it's one of the only games that has ever done this; most others work fine sending display-out through the AGA cable.
     
    Last edited: Jul 10, 2020
  20. dsmrunnah

    dsmrunnah Notebook Guru

    Reputations:
    5
    Messages:
    51
    Likes Received:
    39
    Trophy Points:
    26
    I know I’m a little late, but I’ve benchmarked (Fire Strike & Time Spy) both the RTX 2080 when it first came out and the RTX 2070 Super, and only saw a 2-3% difference in GPU scores between the AGA and a full x16 3.0 desktop slot. Part of that could also have been related to my desktop CPU being much better than my laptop CPU.

    The only time I’ve seen major bottlenecking was when I tried to use the laptop monitor. My internal GTX 1080 did better than the 2070 Super in the AGA when using the laptop screen instead of an external monitor.

    Edit-
    RTX 2080, Firestrike 1.1 comparison - https://www.3dmark.com/compare/fs/17632650/fs/18159004

    RTX 2070 Super (Hybrid), laptop run - https://www.3dmark.com/fs/21721610

    I thought I had a comparison with the 2070 Super, but evidently I never benchmarked the desktop with it. My GF's PC has a 9900k/2070 Super setup now, so maybe I can run a benchmark on hers to see how it does, but I'm betting it'll be close, since it benchmarks almost what my RTX 2080 did on GPU score.
     
    Last edited: Jul 23, 2020
    etern4l likes this.
  21. dsmrunnah

    dsmrunnah Notebook Guru

    Reputations:
    5
    Messages:
    51
    Likes Received:
    39
    Trophy Points:
    26
    I'm pretty sure I've seen people using the 2060 in AGAs before. Does the Galax 2060 use a reference PCB? I have heard of the AGA failing to recognize cards with custom PCBs from the likes of EVGA or Asus.
     
    etern4l likes this.
  22. Zelb

    Zelb Notebook Enthusiast

    Reputations:
    4
    Messages:
    10
    Likes Received:
    7
    Trophy Points:
    6
    Does anyone have any experience with installing the Corsair SF series PSUs in the Amplifier? I just need to know if any mods would be needed.
    The power plug seems to be close enough, and it is a smaller power supply, so it could line up with the hole in the AGA. There is also an SF-to-ATX bracket that could be used.
    I ordered one already, as I like the idea of having the PSU fan facing the outside of the enclosure rather than the GPU. It is also modular, so there's less clutter inside the case.
    The idle noise has been bothering me lately since working from home became permanent, hence replacing the PSU and fan.
    Picture of the power connection on the power supply below:

    [image: power connector on the SF-series PSU]
     
    etern4l likes this.
  23. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Imo I would just get a normal-sized PSU. That SF750 is very expensive for what it does: it is going to do the same thing as a normal-sized PSU, but for almost double the price, while having a smaller fan. I thought it would be weird having a PSU right behind a hot GPU, but I went with an EVGA SuperNOVA G5, which has a 150mm fan, and honestly I can't see temps ever being a problem. You figure the GPU is going to use less than half the PSU's wattage, and on top of that there's a giant fan to circulate air for it as well.
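
    The headroom math is simple enough; a tiny sketch with assumed, approximate board-power numbers (check your own card's spec sheet):

        # Rough PSU headroom check; wattages here are assumptions, not measurements.
        gpu_board_power_w = 225   # approx. reference RTX 2080 board power
        psu_rated_w       = 750   # e.g. a 750 W ATX unit
        print(f"GPU draw is ~{gpu_board_power_w / psu_rated_w:.0%} of PSU capacity")  # ~30%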

    If you go with the SF750, I can't see it being too difficult, aside from the AC plug port very likely being in the wrong spot. I guess there isn't really a standard for those plug locations, since desktops just have a giant opening cut out for the PSU. The metal of that rear grill is pretty thick too (definitely the strongest material on the whole AGA), and you'll need a tool to cut a new AC port. But you would have this problem regardless, unless you went with a PSU you knew 100% would match the AC port location of the AGA.

    But yeah, the coil whine of the stock PSU in the AGA is the most obnoxious thing I've ever heard. I always thought it was my GPU, since things like FurMark and GPU-Z have a few ways of stressing the GPU and producing obnoxious coil whine. But it is the cheap PSU.

    Also, for anyone who cares, I used Krylon COLORmaxx paint on mine, and even though the paint's temperature spec was very close to the max temp my GPU would hit, I checked inside months later and none of the paint was damaged or melted. I had assumed the GPU die would get much hotter than the AGA box itself ever would, due to airflow and the fact that the AGA isn't a heatsink lol, but still.
     
    FXi and etern4l like this.
  24. Zelb

    Zelb Notebook Enthusiast

    Reputations:
    4
    Messages:
    10
    Likes Received:
    7
    Trophy Points:
    6
    Thanks so much for the reply.
    I have to admit I never thought a large PSU fan would help with cooling inside the Amplifier, and I definitely know that cooling is really bad at stock. Heat gets trapped inside at high loads, and one of the culprits is the stock PSU, so your suggestion was great.
    I knew I wanted a modular PSU, and that ruled out the Corsair CX power supplies that are a drop-in fit.
    Then I saw the SF PSUs, and the direction of the fan made sense to me. Originally I was going to go for the 600W, but it was out of stock, so I went for the 750W. It is only slightly more expensive than the G5 if you buy direct from Corsair, so I have no regrets about the price. I might use it for an ITX build since my laptop is getting older now!
    Having to mod the Amplifier is another concern, since I don't really have any tools for that except sharp knives! Might need to get a Dremel or something...
     
  25. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Yeah, the stock PSU has something like an 80mm fan, and from memory it also sucks air in from the GPU side. So if they felt that was adequate, honestly anything is an improvement. Not to mention I don't even know if the stock PSU has any form of 80 Plus certification. I don't think the SF750 is going to perform badly cooling-wise by any means; I just figure there is going to be a lot of extra space in the box regardless, so to me a cheaper, bigger PSU was the better choice.

    https://imgur.com/a/FAqxis4 - Here are pics of mine when I finished modding it. I have since turned the fan around so it blows air out the front of the AGA, which works better temperature-wise for mine (but I did cut away a lot of plastic). You can see the metal cut away on the back of the AGA. A Dremel is what I used for a lot of the modding and cutting. It melts the AGA plastic rather than cutting it, but that's a common thing with Dremels due to the high RPM on plastic.

    But yeah, for sure, go for it if you have the money for an SFX PSU; hopefully one day they are the norm for desktop PSUs. I prefer laptops, but PSUs have been big and heavy for years. Who knows, maybe eventually a new AGA will use the SFX form factor. I just figured that, gaming at 4K, I am going to get years of use out of this AGA, so an ATX PSU wouldn't be a bad investment.
     
  26. Zelb

    Zelb Notebook Enthusiast

    Reputations:
    4
    Messages:
    10
    Likes Received:
    7
    Trophy Points:
    6
    I see you went all out with the paint job! Thanks for the pictures; I see what is involved now as regards cutting. Good tip on trying to expel air at the front, too. Is that a 120mm fan? I didn't know one could fit. I ordered a 92mm Noctua fan, which I have seen other people upgrade to. I will do some testing this week to see what works best. The new PSU should not really spin its fan at all, even at high load; I think it needs around 300W to start spinning. It is completely overkill for this application. If I still get heat trapped inside the case after the upgrades, I might drill some holes in the case on top of the GPU. They should really be offering a mesh cover for the AGA!
     
    illuMinniti likes this.
  27. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Thank you. It is a 140mm fan, but it can't really fit without cutting away a LOT of useless plastic from the front, and doing that also breaks the hinge system. Imo the AGA is a complete pain to open anyway, between the hinge, the lock in the back and the clips on the sides, so I don't miss the hinge at all; either way it's very hard to open.

    The AGA at stock uses a 92mm fan, but I cut away all of that plastic and used epoxy to glue on a 140mm metal mesh grill to mount the 140mm fan. Without the plastic cut away, it really doesn't get much airflow anyway; the shroud literally blocks 40%+ of the fan just so it can be mounted. Some people have drilled holes and modded theirs a lot, and I am sure that's very effective too while being simple. But imo I would cut out the plastic windows before I would drill holes all over.

    It is a pain to work with, though; they didn't really use screws or glue. They melted the plastic to create a seal, and there are like three different layers of thick plastic making up this box. That's the other way I managed to fit the 140mm fan: I cut and sanded down a bunch of the useless plastic sticking out of the top lid.
     
  28. Zelb

    Zelb Notebook Enthusiast

    Reputations:
    4
    Messages:
    10
    Likes Received:
    7
    Trophy Points:
    6
    As an update, I installed the Noctua NF-A9 FLX with the low-noise adaptor and the SF750. The PSU lines up with the hole for the power cable, but there is no way to secure it; to do that, you would need to attach the PSU to the ATX bracket and then attach the bracket to the back of the Amplifier.
    A large section would need to be cut a bit further up and to the right of the power outlet hole on the back of the Amplifier. I started cutting with a Dremel but stopped after I realised how ill-prepared I was for the mod; there is both metal and plastic in that section.
    For now I have the PSU freestanding inside the Amplifier without the bracket, until I get some more effective cutting heads for the Dremel and some safety equipment. I would also need to do it in the garden and not indoors.

    As it stands now, acoustically it is much quieter. The Noctua fan with the low-noise adaptor runs at around 11 dB, the PSU is practically passive as there is not enough load to spin its fan, and the loudest part is the GPU, which idles at 1500 RPM but is quite silent.
    Thermally, the GPU temps are exactly the same as with the stock parts. I was playing Diablo 3 at 4K before at 63-64C GPU temp and 2200 RPM, and it's the same now. Heat still gets trapped inside, and the GPU runs hotter than it should, which makes it noisier under load in more demanding games. The best solution would be to leave it uncovered, but I have cats, so there needs to be a cover of some sort.
    I think I would also need to cut a square on the GPU intake side, and on top of the GPU, for the hot air to escape. To cover the holes, I am thinking of the standard mesh/dust covers you can buy for different fan sizes, somehow attached so they are not exposed. Anyway, knowing me, I will probably procrastinate a lot until that happens!
     
    illuMinniti likes this.
  29. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    It's been months since I looked at prices for this stuff, but Amazon has really well-priced Dremel kits and cheap, simple PC stuff like fan grills. There are a lot of cheap Dremel accessories that people complain about, but I ended up choosing that Dremel pack. I believe they are also sold in stores like Home Depot, but usually the cost difference is less than a dollar, and it got delivered to my door. Corona is likely affecting prices for these.

    Yeah, all that plastic is the most annoying thing I have ever seen on a case, but I am sure that's how Dell made such a sturdy, effective and cheap eGPU box. In the end I spent as much as a more expensive box ($15 paint, $100 PSU, $20 fan, $5 fan grill, $200 AGA, $10 Dremel accessories, $10 epoxy), but it performs better, has higher-quality parts and looks nicer imo.

    Those cheap clear-black side windows are inconvenient, though; without them it would be ugly, open and less sturdy. After cutting so much plastic, mine is noticeably less sturdy, but it's not like I plan on manhandling a $1000 box, so I am ok with it.

    Dremel accessories, was $10.75, now $25:
    https://www.amazon.com/gp/product/B00005LEY1/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1

    $5.15, good-quality thick metal mesh fan grill (it's 140mm, but it can definitely fit on the side):
    https://www.amazon.com/gp/product/B007EVKIXW/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1
     
    etern4l and Zelb like this.
  30. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    Back to the talk about the AGA needing an update: if the rumors about the RTX 3000 series are true, we are looking at a new power connector requirement and a much bigger GPU. I understand that swapping out the power supply is not a huge deal, but judging from the leaked pictures, I doubt an RTX 3090 will fit inside the enclosure.
     
    build6 and etern4l like this.
  31. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Yeah, the 3090 will not fit without the lid off or the case modded. Apparently the 12-pin adapters need to be made specifically for specific PSUs (or maybe just for the different brands?), so it's even more unlikely a stock AGA would work.
     
    etern4l likes this.
  32. nuestra

    nuestra Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    1
    Trophy Points:
    16
    Hey hey ;) I had a Palit Gaming Pro 2080 Ti in the AGA and, with the Area 51m, got over 8000 points in the RTX benchmark...

    Can we put the 3090 in the AGA if we open it up and cut the side away?
    Will the 3090's drivers be ready when it launches?
    For the power cable, I think we'll get adapters...

    I had a lot of problems with the 2080 Super in the AGA on the Area 51m. Are all those problems gone with the latest BIOS for the Area 51m?
     
    etern4l likes this.
  33. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    New cards should be announced Tuesday; after that we will have a better idea of the 3090's dimensions, but the leaks show not just a triple-slot thickness but also a taller and longer card.
     
  34. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    Honestly, since someone had a problem with their 2060 not working, and it wasn't even listed as compatible by Dell, it's also possible the 3000 series won't work at all. Physically, though, without the lid I feel like it should have no problem plugging in. I hope it works; I don't plan on upgrading, but I wanna see the whole PCIe 4.0 + AGA cable performance story (or lack thereof, since PCIe 3.0 should bottleneck the RTX 3000 series, right?). That will only be possible once a CPU that supports PCIe 4.0 is available in Alienwares.
     
  35. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    They would also need to release a new AGA with PCIe 4.0 support. I am going to get a 3090 and hope for the best when they're released. Worst case, I figure I ditch the AGA completely and get a Thunderbolt enclosure.
     
  36. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    TB3 loses a lot of performance afaik, even with something like the GTX 900 series (around -20%). I feel like an RTX 3000 series card would lose a lot more, unless of course you're gaming at higher resolutions.

    I wonder, if a new AGA does come out, whether we could simply buy a replacement PCIe adapter that goes in the AGA. I don't plan on upgrading anyway, but if it came down to spending $30 on an adapter vs. $250 for a whole new AGA, I would cut a square out of the side of mine.
     
  37. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    I doubt that would work if they switch to PCIe 4.0. I am tracking the TB3 issues, but it's still better than the AGA if Dell just decides the AGA isn't worth supporting anymore.
     
    FXi likes this.
  38. CarbonCamaro

    CarbonCamaro Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    1
    Trophy Points:
    6
    FWIW, I was able to get a 2060S and a 2070S to work with the AGA without any issues. Literally just plug and play; no crazy driver issues or anything. Both were Founders Editions, though.
     
    illuMinniti likes this.
  39. illuMinniti

    illuMinniti Notebook Evangelist

    Reputations:
    131
    Messages:
    566
    Likes Received:
    261
    Trophy Points:
    76
    I doubt it would too. I doubt they'll even sell separate PCIe adapters, but it would be cool. Zotac has shown their 3090, and it is not a triple-slot card; it is a little more than dual-slot, but most are like that. I feel like Nvidia is doing what they always do: selling a unique-looking new 'Founders' edition that is slightly better than the previous generation's (in terms of cooling) but still behind the aftermarket triple-fan GPUs.
     
  40. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    Looks like the 3080 will fit, no chance on the 3090.
     
  41. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    Official specs are online:

    3090: 12.3" long, 5.4" tall, 3 slots wide
    3080: 11.2" long, 4.4" tall, 2 slots wide
    3070: 9.5" long, 4.4" tall, 2 slots wide

    Looks like none will fit.
     
    build6 likes this.
  42. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    Twitter reply from Alienware: [image]
     
    build6 likes this.
  43. twin snakes

    twin snakes Notebook Consultant

    Reputations:
    97
    Messages:
    175
    Likes Received:
    117
    Trophy Points:
    56
    lol time to ebay my 2080Ti
     
  44. devilhunter

    devilhunter Notebook Evangelist

    Reputations:
    120
    Messages:
    361
    Likes Received:
    254
    Trophy Points:
    76
    Can you fit a 3090 in the AW Graphics Amplifier?
     
  45. kqmaverick

    kqmaverick Notebook Consultant

    Reputations:
    55
    Messages:
    273
    Likes Received:
    30
    Trophy Points:
    41
    Not with the lid on it. It is too tall, too long and too wide.
     
  46. newvelaric

    newvelaric Notebook Consultant

    Reputations:
    6
    Messages:
    117
    Likes Received:
    15
    Trophy Points:
    31
    Guys, I just bought an M17 R3 about 2 months ago, and then bought the Graphics Amplifier, only to discover the RTX 3000 series is coming out in sizes that will not fit inside the Amplifier. lol

    I am fine with that, since I want to use an RTX 2080 Ti; I am just waiting for a good deal. Because of that, I was wondering whether the current PSU and fan are OK with such a strong card, or should I change to a new PSU and fan? If I have to change, what should I buy as the new PSU and fan? Thanks in advance for the help!
     
    build6 likes this.
  47. doofus99

    doofus99 Notebook Deity

    Reputations:
    284
    Messages:
    1,013
    Likes Received:
    508
    Trophy Points:
    131
    A few days ago I almost pressed the button on a brand-new desktop system with an RTX 2070 Super. Very glad I did not go ahead.

    I have made a quick table comparing Nvidia graphics cards; I would expect a very serious price drop on the 1080s and 2080s now. It's very much worth holding out a bit longer, since the 3080 is a beast and will easily see us through 2+ years.

    [image: comparison table of Nvidia graphics cards]
     
  48. cn555ic

    cn555ic Notebook Deity

    Reputations:
    149
    Messages:
    917
    Likes Received:
    470
    Trophy Points:
    76
    Do you guys think the 3080 will work without the lid? I know the 3090 is a lot thicker, but what about the 3080? About to pull the trigger if it fits.
     
  49. newvelaric

    newvelaric Notebook Consultant

    Reputations:
    6
    Messages:
    117
    Likes Received:
    15
    Trophy Points:
    31
    Ah crap! I just read the specs for the 3000 series! They look great! What to do? The lid is nice, but do you need it on?
    Yikes! I see!!!!!

    Now I am conflicted! Should I wait or jump into the water for the RTX 3080/3090? Choices!!!!
     
  50. newvelaric

    newvelaric Notebook Consultant

    Reputations:
    6
    Messages:
    117
    Likes Received:
    15
    Trophy Points:
    31