The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    BTW there is a better one: an 8-lane proprietary eGPU used in the new Asus Flow. Unfortunately the
    3080 GPU is integrated with the enclosure... And of course 13-inch laptops are not for everyone.
     
  2. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    I have heard of that. Haven't had the chance to check it out yet. That's honestly a solution I'd like to put through its paces though. My only worry is that it's an ASUS product. They seem to have awesome ideas that they only ever do for one year, then drop the next. Like I don't see upgraded versions of the water-cooled laptops, Motherships, etc. So I could see this being the same thing here. But I'll take a look. I feel solutions with an external GPU and a laptop are the way forward.
     
    FXi and etern4l like this.
  3. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Agreed, but vendors might prefer selling laptops rather than eGPU enclosures...
     
    Last edited: Feb 9, 2021
  4. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,092
    Trophy Points:
    931
    I deleted numerous posts that were part of an argument/back and forth. Nobody is in trouble but we’ll just need to keep things lower key going forward.
    Charles
     
    Papusan and Virale like this.
  5. Gumwars

    Gumwars Notebook Evangelist

    Reputations:
    291
    Messages:
    341
    Likes Received:
    371
    Trophy Points:
    76
    That's a $3K+ purchase, and unlike AW, you are forced to buy the whole thing upfront. Asus isn't offering it as two separate units; it's all or nothing. Further, while the Asus solution is better owing to its 8x rather than 4x link, the proprietary enclosure is a single-use affair; you can't swap in a different card, and it appears to be very limited on I/O. Only one DisplayPort and one HDMI port are at your disposal.

    Dell is screwing up big time walking away from the AGA; it's still the best eGPU on the market. As a fair question to you all: will this really hurt Alienware in the long run? It seems the product stack is all about looking a certain part but no longer really being able to play it. I feel like the angle is to hook the whales with deep pockets rather than those who make deliberate decisions about purchases like this. I don't have thousands sitting around to buy a new laptop every year; this is something I do every 5-7 years, and I don't think I'm alone in that. The AGA is why I can go that long between purchases, and without it, I don't have a reason to stick with Dell. They must know that and don't care, because whatever they're doing is working from a profitability standpoint.

    Are we the fringe?
     
    Virale, FXi and etern4l like this.
  6. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Well, last time I priced a reasonable 3080 m15 R4 config it wasn't cheap, so the $3k for the Asus solution looks like a good deal. Of course, this particular eGPU will mainly appeal to those wanting to travel with the eGPU (?). I hope Asus releases a bigger, proper enclosure model and adds the port to more laptops.
     
    Last edited: Feb 9, 2021
    Gumwars likes this.
  7. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    Asus walks away from customers worse than Dell (and that's not something easily done! :)
    But I sure wish they'd consider updating/upgrading the AGA rather than dumping it. That said, in the years since it was released, TB solutions have become much better. Not so much that I wouldn't prefer the AGA, but with the lanes of TB wired direct to the CPU, a change to a TB solution might be at least a possible way out of dumpville. For the moment, since both next-gen chips (Rocket Lake and Tiger Lake) as well as 3xxx GPUs are native PCIe 4.0 solutions, and in theory a 4-lane 4.0 link would double the bandwidth we have now as well as handily match the Asus Flow custom solution, I think it's worth waiting to see if something is forthcoming. I'm not overly hopeful, but it's reasonable to think 4.0 forced something to get altered, and thus the old became EOL rather than simply updated.
     
    Gumwars and etern4l like this.
  8. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Yeah. Sadly TB4 is still 40 Gbps. My guess is Dell won't pick up the ball and we will be stuck with "faster TB3" for a while, which should be good enough for many use cases, including 4K gaming. What's also annoying is that compelling AMD CPU options are cropping up, but none of them support TB. Perhaps USB4 will be the answer, as it basically incorporates TB3.
     
    FXi likes this.
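The link rates being compared in the last few posts can be sketched with some back-of-the-envelope arithmetic. These are nominal encoded rates per direction, not measured game throughput, and note that TB3/TB4's 40 Gbps is shared with DisplayPort and USB tunneling, so usable PCIe bandwidth over Thunderbolt is lower in practice:

```python
# Rough, nominal bandwidth comparison of the interconnects discussed above.

def pcie_gbps(gen: int, lanes: int) -> float:
    """Approximate PCIe bandwidth in Gbit/s per direction.
    Gen3 runs 8 GT/s per lane with 128b/130b encoding; Gen4 doubles the rate."""
    per_lane = {3: 8 * 128 / 130, 4: 16 * 128 / 130}[gen]
    return per_lane * lanes

links = {
    "AGA (PCIe 3.0 x4)": pcie_gbps(3, 4),             # ~31.5 Gbps
    "Asus XG Mobile (PCIe 3.0 x8)": pcie_gbps(3, 8),  # ~63 Gbps
    "Hypothetical PCIe 4.0 x4": pcie_gbps(4, 4),      # ~63 Gbps
    "Thunderbolt 3/4 (total link)": 40.0,             # shared with DP/USB tunnels
}

for name, gbps in links.items():
    print(f"{name}: ~{gbps:.1f} Gbps")
```

This matches FXi's point above: a 4-lane PCIe 4.0 link would double the AGA's bandwidth and land in the same ballpark as the Flow's 8-lane 3.0 connector.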
  9. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    Well I did check out the Asus x13 Flow. And I'm intrigued. Especially if the Graphics Amp is going away. Seriously, the only ones I can find are people trying to resell them at like $500, so something is up. My only concern is I'm getting the feeling the Flow is using the lower-end 90W 3080. Unless someone can confirm otherwise?

    TB4 does still seem to have some improvements. Seems like the lows improved a good bit, so if you're sensitive to FPS dips this is good news. Really makes you wonder whether a proper TB4 eGPU would help more. But we'll have to see.

     
    etern4l likes this.
  10. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Asus will sell the fab ROG Flow X13 without an eGPU after all

    Proprietary design is still disgusting! On top of that, being forced to use castrated GeForce RTX Mobile cards is the last nail in the coffin. No thanks.
     
    FXi, etern4l and Gumwars like this.
  11. Gumwars

    Gumwars Notebook Evangelist

    Reputations:
    291
    Messages:
    341
    Likes Received:
    371
    Trophy Points:
    76
    It would be nice to see the industry leapfrog off of this idea and offer a standardized 8x PCIe port to capitalize on this segment. In the absence of that, proprietary is all we're ever going to see. Your link says you can get the Flow without the eGPU, but they only offer the 3080 in it. My issue with Ampere is the blatant disregard for the customer by obfuscating the TGP for the GPU packaged in the laptop. Nvidia eventually directed OEMs to label it, but only after the vast majority didn't.
     
    FXi, etern4l, Papusan and 1 other person like this.
  12. Virale

    Virale Notebook Evangelist

    Reputations:
    312
    Messages:
    529
    Likes Received:
    775
    Trophy Points:
    106
    Yeah I’m so glad Nvidia is finally putting a halt to this bull crap.

    Hopefully now you’ll be able to see performance numbers before buying like TDP, Clocks, etc. As it should be... not just “3080” or whatever
     
    FXi, Terreos and etern4l like this.
  13. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    OMG, so the Flow "eGPU" is actually a mobile chip... yeah, that explains the size of the unit. Wow, Asus.
     
    FXi, Terreos and Papusan like this.
  14. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    That's exactly what I was afraid of. The lack of specs and how small it was had me thinking it was too good to be true. It does say it's the 150W version, so it should be in 2080S performance range, which is what the laptops with this chip would get, depending on performance loss through the eGPU connection. Plus, like I've said before, Asus will likely only make these for one year and then it will be dead. So it would be nice if someone made a universal standard for this kind of port.
     
    Last edited: Feb 8, 2021
    FXi, Papusan and etern4l like this.
  15. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    There should be no perf loss given the 8x PCIe link, but I agree with the other points. A weird solution from Asus; they should have gone for a full-fat desktop GPU or just an open enclosure, and they would have a winner.
     
    Gumwars and Terreos like this.
  16. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    Yeah, I have mixed thoughts on that. Being portable is nice... but then why not buy a machine that just has a 150W 3080 inside it? If Asus had a better track record of not dumping projects before they even get off the ground, I could see this being decent. Like if they kept making these when new GPUs came out and you could easily buy and upgrade the eGPU. We'll have to see if Gigabyte or Razer steps up to the plate and makes something similar but user upgradeable.
     
    etern4l likes this.
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    I expect Asus will most likely throw this awful idea down the drain, as they did for the water-docked Asus ROG GX700 with liquid cooling.

    Stupidity has no limits, bro Terreos. An eGPU for mobile cards is a damn stupid idea! And it should never have seen the light of day.
     
    FXi, Gumwars, Terreos and 1 other person like this.
  18. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    Yeah, I'm afraid I agree, Buddy. If they did it right and made it an upgradeable eGPU like the Graphics Amp, then that thing would sell like hot cakes.

    They should hire me as an idea guy. I'll straight up tell them what's a good idea and what's stupid. ;)
     
    FXi, Gumwars and Papusan like this.
  19. Suxel

    Suxel Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Good day, good people of the Alienware lounge. I need a bit of help with a recently bought Alienware Amplifier and a 2080 Super Founders Edition. I've got a 13 R3 with a 1060 onboard. The Alienware logo is lit up and the cable LED is lit up, but GeForce Experience won't download drivers, saying "Unable to download recommended driver", so I've installed them manually (desktop variant) from the Nvidia website. Still, I can't use the 2080S; everything runs on the iGPU.

    Device Manager sees the 2080S, I can see the fans spinning, everything is lit up, and the Amplifier app on the 13 R3 can see the Amplifier. I've used DDU to remove and reinstall drivers in safe mode, but still the same thing. I've got no Nvidia Control Panel either; when I fire it up I get "Nvidia Display settings are not available. You are not currently using a display attached to an Nvidia GPU". GPU-Z can see the 2080S but says it has 0MB memory. Unplugging the Amplifier reverts everything back to normal: the 1060 shows up, GeForce Experience works fine, Nvidia Control Panel works fine.

    The panel I've got is a 1440p OLED 60Hz panel with no G-Sync, and I'm using the internal laptop display. What am I missing here please, guys?
     
  20. Suxel

    Suxel Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Fixed it by preventing Windows from automatically downloading 2080 Super drivers. Those were blocking GeForce Experience from downloading its own drivers, because GFE couldn't tell what GPU was in the laptop.
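For anyone hitting the same loop: one common way to stop Windows Update from auto-installing GPU drivers is the ExcludeWUDriversInQualityUpdate policy. This is a sketch of the registry route (run from an elevated prompt); the same setting is also reachable through gpedit.msc as the "Do not include drivers with Windows Updates" policy, and whether it fixes GFE specifically is per this poster's experience, not guaranteed:

```shell
:: Sketch: tell Windows Update not to bundle driver updates (run as admin).
:: Sets the documented ExcludeWUDriversInQualityUpdate policy value to 1.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" ^
    /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 1 /f
```

After setting it, let GeForce Experience (or a manually downloaded package) handle the GPU driver install on its own.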
     
  21. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    I have the same PC, an Alienware 13 R3 OLED. For almost a year I used the 2080 Super with the AGA (Gigabyte OC Edition) and never had any problem. I recommend you reinstall the AGA driver and avoid the 461.33, 461.40 and 461.51 drivers; they are broken. You can use 461.09 or below, in the desktop version.
     
  22. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    My experience with AGA on 3070

    My Alienware is a 13 R3 OLED 1440p
    32gb ddr4 2667mhz
    2Tb SSD
    Intel i7 7700HQ
    Nvidia 1060 mobile

    My first card connected to AGA was a 2080 SUPER Gigabyte OC Edition, I never had any problem.

    Then I bought a Gigabyte 3070 OC Edition, but I could not get it to work in any way, I tried everything, even the multiple methods of plugging and unplugging, both power cable and data cable, nothing worked, the system recognizes it perfectly and the drivers are installed, but it just does not work ...

    I sold it and bought a 2080 Ti Asus ROG Strix and it works perfectly.

    If Dell fails to make the 3000 series work normally on the AGA, I will possibly go for the TITAN RTX in the future, which in next-gen games gives 8-10% more FPS compared to the 3070.

    I read right here that the TITAN RTX is fully AGA compatible, and technically the most powerful GPU that can run normally on AGA without any tricks.
     
    Last edited: Feb 12, 2021
    FXi and etern4l like this.
  23. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Hello bro @illuMinniti, could you share some benchmarks of your 3090 + m15 R1 setup?
     
  24. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    24
    Likes Received:
    15
    Trophy Points:
    6
    Be careful with the old configurations. Putting a big GPU in won't work for long; you will quickly be CPU limited. In open worlds or with crowds (Cyberpunk), I am CPU limited with an i7-6820HK (AW 17 R3) and an RTX 2070 Super OC. I get 60 FPS/Ultra FHD 95% of the time, except in cases where the CPU is very busy. i7-6820HK and i7-7700HQ = 1% difference. The AGA is a backup solution to play comfortably before redoing a configuration. It will never perform like a fixed (desktop) solution with an 8C/16T CPU at 4.5 GHz and an RTX 3090.
     
    Last edited: Feb 12, 2021
    FXi likes this.
  25. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    As I commented in the post above, I have the Alienware 13 R3 with an i7 7700HQ, 2TB SSD, 32GB DDR4 2667MHz, and an AGA, first with a 2080 SUPER and currently with a 2080 Ti, and I have never had a CPU bottleneck. I haven't tested Cyberpunk yet, but I always set the settings to ultra at 1440p in all games; the GPU usually goes to 90% and the CPU always stays between 45 and 70% at most, but usually between 55 and 65%.

    Obviously an 8C/16T CPU is ideal, but I think there won't be any bottleneck until the 3080, maybe you'll get about 10-12 more FPS out of it with a newer CPU, but I don't think there will be any bottleneck until at least the 3090.
     
    Suxel and etern4l like this.
  26. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    24
    Likes Received:
    15
    Trophy Points:
    6
    Start by installing MSI Afterburner with RivaTuner and launch 3DMark. Display FPS with RivaTuner.
    You will see that your GPU score will be top, but the CPU part will be low.
    My i7-6820HK being 1% more powerful than your i7-7700HQ... I know what I'm talking about.
    I am also full SSD with 32GB of RAM, and when a game demands a lot of the CPU, I drop FPS.
    My GPU is rarely at 100% except with ray tracing, and I run 60 FPS Ultra on all recent titles.
    Except... when the CPU is limited, as in Cyberpunk or certain open-world passages in Star Wars Jedi: Fallen Order or Shadow of the Tomb Raider, for example.
    The games are smooth but drop to 45/55 FPS, and I see it in monitoring. The CPU is at 100% on some cores.
    Maybe you don't notice it, or you're not sensitive to small stutters, or you haven't gotten far in the big AAA games, but I guarantee you that your configuration is CPU limited.
    I myself am CPU limited with an i7-6820HK and an RTX 2070S OC.
    Launch RivaTuner and display FPS. Launch Cyberpunk in Ultra, play a bit, and once in town, grab a car. Drive around and watch your CPU and FPS. You will see what CPU limited means.
    It's not a big deal ;).
    You just have to accept that the 4C/8T laptop i7 is at the end of its life :)
    https://pc-builds.com/calculator/Core_i7-7700HQ/GeForce_RTX_2080_Ti/0KS12nlv/
     
    Last edited: Feb 12, 2021
  27. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    I think there are 2 years left to play decently on 4C/8T.

    As I said, in no game have I used 100% of my CPU, partly because I play at 1440p; at higher resolutions there is less bottleneck for the CPU and more load for the GPU, especially in 4K. And my FPS never drops below 65, even with ray tracing. I think Cyberpunk plays in the major leagues; I haven't tested it yet, but I know I won't be able to run 60 FPS with everything on ultra at 1440p.

    But I think 4C/8T still has no bottleneck problem with a 3070 or less.

    As long as you don't use 100% of the CPU there is no bottleneck. It is true that newer CPUs give more FPS, but this is not because of a "bottleneck"; it is simply because of more cores, more GHz, and new technologies.
     
    Last edited: Feb 12, 2021
  28. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    I wonder (thinking on new connectors) if some maker might figure out how to chain two TB4 into a single 8x effective connection. Now that Maple Ridge is out, there are many designs with dual TB ports. Probably not technically easy, but once you've hit 8x in the current world anyway you've got very little bottlenecking left.
     
    etern4l likes this.
  29. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    The new and shiny 4C/8T @ 5.0 GHz is a bottleneck even for the castrated "RTX 3070 laptop GPU with 85 watts". And this 4C/8T has a lot higher IPC + clock speed than the old-gen mobile chips with the same number of cores/threads.

    http://forum.notebookreview.com/thr...cale-on-laptops.834039/page-104#post-11077763

    Tiger Lake-H35 is too weak for RTX 3070
     
    Last edited: Feb 12, 2021
  30. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    That's in benchmarks and synthetic tests; in practice (in games) you barely lose a few FPS, depending on the title, resolution, etc. On average it is about 8-10 FPS. I have a 2080 Ti and I have no bottleneck; in performance the 2080 Ti and the 3070 are practically identical. (I know I would gain more FPS with an 8C/16T CPU, but as I said, it is not because of a "bottleneck"; it is because newer means more cores, more GHz, better technology, etc.)
     
    Last edited: Feb 13, 2021
    etern4l likes this.
  31. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    24
    Likes Received:
    15
    Trophy Points:
    6

    I say without judgment that he does not play AAA Ultra games such as Assassin's Creed Valhalla, Watch Dogs Legion, Cyberpunk, etc., which are CPU intensive... I think if he tested these games in FHD or 4K he would understand that the physics of a game depends on the CPU! Even if he put an RTX 3090 on his i7 7700HQ laptop, he would have a big drop in FPS when he arrived in an area with a lot of NPCs to manage. Same consequences when arriving in a medieval village with a lot of activity to deal with, or driving in a crowded megalopolis processed by the CPU. Once again, I know what I'm talking about.

    On my i7-6820HK OC 4 GHz ( ;) ), when I am CPU limited in the cases listed above, whether I am in 720p or 4K, the game stutters (45/55 FPS). I have one or two cores at 100%, so FPS drops despite an RTX 2070S at 70/80%. 4C/8T CPUs are no longer a viable long-term solution with a large GPU. There will always be drops in FPS in open areas or with a lot of physics on screen.

    I really like my 17 R3 combo with the RTX 2070S, but I feel the CPU limit more and more in FHD or 4K on TV. I wanted to upgrade this year, but the prices are crazy because of Covid. I hope that in 2022 we will have stock of the Nvidia 4000 series... and 8C/16T or 16C/32T CPUs at 5 GHz in a fixed configuration.
     

    Attached Files:

    Last edited: Feb 13, 2021
  32. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    I know the 7700HQ is a bottleneck, because the OC'd 4-core 3770K can't keep up with a 2080 Ti.
    https://www.3dmark.com/fs/22427749

    https://hwbot.org/submission/4365180_papusan_cinebench___r15_core_i7_3770k_922_cb

    https://hwbot.org/submission/4090770_krzyslaw_cinebench___r15_core_i7_7700hq_801_cb

    https://hwbot.org/benchmark/cineben...Id=processor_5361&cores=4#start=0#interval=20
     
    Last edited: Feb 13, 2021
  33. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
  34. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    24
    Likes Received:
    15
    Trophy Points:
    6
    [QUOTE="Juan Phoenix, post: 11078253, member: 737230"] The performance loss at 1440p is 10% and at 4K it is 5%; only at 1080p can there be a considerable loss...

    Test FPS in games! Not in benchmarks and synthetic tests.

    The source:

    https://www.techpowerup.com/review/intel-core-i9-10900k/21.html [/QUOTE]

    Be specific, guys!
    High resolution takes the strain off the processor, okay!
    Why? Because the GPU works hard to calculate UHD (4K) textures.
    But the game mechanics and physics (NPCs, visual depth, shadows, object movements, anything that moves...) use the CPU!
    In the long run, a "good" processor will hold up 100% against a good GPU.
    Otherwise it looks like DIY...
    Obviously, you will get drops in FPS in the big AAA Ultra games!
    In short: today, for 60 FPS non-stop in FHD/4K without drops, an 8C/16T CPU minimum :)
    Tomorrow, and for the next 3-5 years, 16C/32T with an RTX 3090 or RTX 4xxx...
     
    Last edited: Feb 13, 2021
  35. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    Would it be fair to say many eGPU users are using some form of higher-res display? I'm thinking 3440×1440 (21:9) and higher.
     
    etern4l likes this.
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
  37. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Great answer from Chris. Short and sweet, everything is crystal clear. They could have just changed the name of the device and its port to ALGAE (Alienware Legacy Graphics Amplifier EOL), to make it crystal clear to the customers and avoid all those stupid questions and unnecessary load on Community Managers.

    But seriously, there are still some use cases for the AGA: people could buy the lowest GPU variants and hook up a 2080 Ti, or enjoy multiple GPUs for whatever reason; an Alienware laptop can have up to 3 discrete GPUs :)
    A 2080 Ti should dominate the latest Ampere mobile 3080 150W, given its TimeSpy score of over 14k and FireStrike above 36k :)
     
    Last edited: Feb 20, 2021
    Justin Stuever and FXi like this.
  38. Virale

    Virale Notebook Evangelist

    Reputations:
    312
    Messages:
    529
    Likes Received:
    775
    Trophy Points:
    106
    This is true... unlike the 20X0 gen, the 30X0 gen, and in particular the 3080, is PITIFUL in performance vs its desktop sibling. Thermal headroom has reached its limits on laptops. The 3080 is just too power hungry and hot.

    The best gen for mobile was the 2080/2080S max-P like in the 51M... almost neck and neck with a vanilla desktop 2080/2080S.

    If I were looking to buy right now, I'd seriously consider skipping this gen; 30X0 mobile cards are absolute garbage compared to their desktop counterparts. You're not going to get your money's worth. You'll be better off building a portable mini-ITX build than going with a mobile 3080, in both performance and your wallet.
     
    etern4l likes this.
  39. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Unfortunately this is what happens when people prioritize form over function, which is a really stupid mentality. This is what we get for asking for thinner laptops.

    Gaming laptops need to get thicker again so they can handle higher-power parts, or we need innovations in cooling that allow a slim form factor to keep up with a desktop's cooling prowess.
     
    FXi, etern4l and Virale like this.
  40. Gumwars

    Gumwars Notebook Evangelist

    Reputations:
    291
    Messages:
    341
    Likes Received:
    371
    Trophy Points:
    76
    Considering how power-hungry the desktop parts are these days, is that even possible? Nvidia has decided that the solution is to crank everything to 11 and make a card that draws so much juice the lights in your house dim when the thing goes to work. How would that ever work in a mobile solution? Yeah, we can get some big chonky Clevo based solution with vapor chambers all over, but in the end, it will still end up struggling because desktop cooling solutions are simply bigger and better.

    AMD might have it sorted with the drop to 5nm, but that's still at least a year out. Either die size/efficiency and/or a new form of cooling will need to come along before we see parity between desktop and laptop systems.
     
  41. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    If they are able to improve the cooling, they will just accelerate the sick race towards Apple design. You can be sure you won't get better cooling. You will only get skinny, thinner and even slimmer gamingbook models. Welcome to reality :)

    Nope!
     
    Last edited: Feb 21, 2021
  42. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    It will be interesting to see what the Tiger Lake models have. At that point they're dealing with pci-e 4.0 coming off the cpu. Signal integrity is bound to be a problem, or perhaps they just convert it to 3.0 and attach to the "legacy" AGA port. Maybe they skip it entirely and go with TB4 since that's coming off the CPU. Sad if they never continue because the AGA we have is doing wonderful duty on the older AW now, flawless through driver updates and so forth. /sigh
     
    Papusan and etern4l like this.
  43. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Could you expand on the "signal integrity issue"?
     
  44. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    In short, cheaply made motherboards will struggle with signal noise; the cheaper, the worse. Have you already forgotten why Dell crippled the Area-51m with lower RAM specs than Intel's data sheet for later-gen processors? They were forced to by the usual cost cutting.
     
    FXi and etern4l like this.
  45. Suxel

    Suxel Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    So far, no issues. I get about 11K in 3DMark GPU score, which is very close to the standard 2080S FE 3DMark score. It allows me to play Control with RT (but not Ultra) at 1440p@60, and the same applies to CP2077 (Ultra + Medium RT), and I couldn't be happier. Unfortunately, in my case this has to suffice.

    I used to have an MSI GE75-1066UK with a 2080 Max-P, but gaming on my lap (I can't play at a desk, otherwise I would get a desktop) was unbearable. Using a cooling pad helped, but not with cooling down the internals; the hot bottom just didn't touch my legs, and the whole thing became clunky and chunky, which reminded me of my 4.5 kg MSI GT73EVR with a 1080 Max-P, which was great as long as you had a desk to place it on.

    The 1060 is, for the 5th year in a row, the most popular GPU in the Steam survey, so I guess the 2080S will last me a while. Funnily enough, on r/GamingLaptops, people with the 150W 3080 are getting a similar 3DMark (GPU) score. Can't complain (for now), but I can see what you mean.

    P.S. The ~11K GPU score is on a loop R13<->AGA. I don't have an external monitor.
     
    Last edited: Feb 25, 2021
  46. Justin Stuever

    Justin Stuever Notebook Consultant

    Reputations:
    85
    Messages:
    143
    Likes Received:
    141
    Trophy Points:
    56
    This is exactly right! I grabbed a used AGA and a used 2080 Ti hybrid for a reasonable price and use it with my 34" UW AW screen. It helps to get a lot of the heat off the heatsink inside my 51m R2 10900K/2080S. It drops my CPU temps about 15C during heavy load vs using the internal 2080S.

    The 30XX cards are not going to be much of an improvement in laptop form due to heat restrictions as stated. They will need better cooling and a step change in the 40XX cards, instead of just pouring the coals to them with watts.
     
    etern4l likes this.
  47. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Not much will change. Going to 7nm or below means smaller chips (heat density will be even worse than today's refined 10nm). And Nvidia and AMD will continue with the same TGP; how high will depend on the competition.
     
    etern4l likes this.
  48. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    Good News!!

    There is a project called "open caldera" around the AGA and the Nvidia 3000 series, and according to the tests, the incompatibility problem is with the Nvidia drivers, not the AGA hardware. It seems that AW and Nvidia are already aware of the problem, and Nvidia will release a driver in the near future for the 3000 series to work properly on the AGA.

    AW posted this in a topic on Discord:
    "If you see the Discord AMA, AW committed to working with Nvidia to get fixes out 'soon'."

    Link on Reddit:
    https://www.reddit.com/r/Alienware/...ource=amp&utm_medium=&utm_content=add_comment
     
    Virale and Gumwars like this.
  49. Anglijsk

    Anglijsk Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I have an Alienware 17 R2 and an AGA. The laptop won't shut down; the monitor goes out, but the backlight stays on. Sometimes the fans come on and make noise. What is your advice? :bigconfused:
     
  50. Juan Phoenix

    Juan Phoenix Notebook Guru

    Reputations:
    15
    Messages:
    54
    Likes Received:
    29
    Trophy Points:
    26
    Finally, it seems that the 3000 series works normally on the AGA with the new Nvidia 465.89 driver.

    This is confirmed by some users on Reddit and GitHub.

    I sold my 3070 and can't confirm it myself, but if someone has an AGA and a 3000 series card, they can share their experience.

    I leave an image that confirms what I say:
     

    Attached Files:

    Gumwars and alaskajoel like this.