Guys, I need your help please. I'm losing my mind. I have the following setup: Alienware 13 R1, i5-4120u, 16 GB RAM, BIOS A08, plus AGA with a Gigabyte 1060 6 GB WF2. All drivers are up to date, but I'm still having problems with low FPS in games. Is this a CPU bottleneck, or am I installing the drivers wrong? OS is a clean install of Win 10 1607. Downloaded the latest 1060 driver from Nvidia. Please help
-
Download GPU-Z and check whether the GPU in your AGA is connected at PCIe 4x speed. If it's running at 1x only, your cable might be faulty.
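If you're comfortable scripting, the same check GPU-Z does can be automated. This is just a sketch assuming the Nvidia driver's `nvidia-smi` tool is on your PATH; run it while the GPU is under load, since cards often drop to a lower link generation at idle to save power:

```python
import subprocess

QUERY = "pcie.link.gen.current,pcie.link.width.current"

def parse_link_info(csv_text: str) -> tuple[int, int]:
    """Parse nvidia-smi CSV output like '3, 4' into (gen, width)."""
    gen, width = (field.strip() for field in csv_text.strip().split(","))
    return int(gen), int(width)

def current_link() -> tuple[int, int]:
    """Query the active GPU's current PCIe generation and lane width."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_link_info(out.splitlines()[0])

if __name__ == "__main__":
    gen, width = current_link()
    print(f"PCIe Gen{gen} x{width}")
    if width < 4:
        print("Link is narrower than x4 - check/reseat the AGA cable.")
```

A healthy AGA link should report x4; if it reports x1 under load, that lines up with the faulty-cable symptom above.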
-
Yep, so currently no reinstalls are needed when switching between the AGA and the internal 1080. Also, I do have the G-Sync model; do you think that may be why my internal card isn't disabling?
cheers -
Yes, I think it is, because I've seen this behaviour multiple times on G-Sync systems. It seems the AGA relies on the iGPU switching technology to disable the internal GeForce GPU.
Grechie likes this.
-
cookinwitdiesel Retired Bencher
On the Optimus systems it can drive the internal display off the iGPU, which is probably what allows the dGPU to be disabled. I would imagine that on the G-Sync systems, since the internal display is driven directly off the dGPU, it has to keep that fired up to power the internal screen in parallel to the AGA powering the external displays.
-
OK, I got my nice new Asus VG248QE 144 Hz monitor (plus DisplayPort cable) yesterday. I gotta say that adding this to my AGA/GTX 1080 Ti (Zotac 1080 Ti Blower, with 1x HDMI and 3x DisplayPorts), and choosing to display to it only, improves my PC's performance quite a bit compared to my AW 17 R4's internal GTX 1060 and monitor.
Sorry, no benchmarks right now, but non-VR games/apps get roughly 2x better FPS. Using 144 Hz certainly makes non-VR flight/racing sims look buttery smooth. Also, it does not take up as much room on my desktop as I thought it would. Now I luv using this external monitor a lot more than my laptop's! I also find that a USB mouse works a lot better than the touchpad. For some reason, my AW FX lights go crazy when using the touchpad with this setup. Plus, the mouse (with touchpad disabled) is better/easier to use anyway imho.
VR games/apps with my Oculus Rift (with 2x sensors) are not that much better FPS-wise, but I have been able to improve this a little by adjusting my Nvidia control panel 3D app settings, like turning AA up a bit more than I could before while still maintaining good FPS. Also, I did not have any problems using the AGA's USB ports for my Rift headset/sensors. This was definitely a problem with my other Dell WMR system; even Dell support told me to avoid using the AGA's USB for my Dell WMR headset's USB.
I did try different monitor resolutions and refresh rates, but this did not make much difference. I think this is because my AW 17 R4 CPU (i7-7700HQ 2.8/3.6 GHz, with 32 GB RAM btw) is now the bottleneck. With some flight sims like X-Plane 11 (now 11.25rc2) this may improve when they move to Vulkan, maybe later this year, and are then able to better utilise multiple cores. Alternatively, I may still move my GTX 1080 Ti into a better desktop PC later, but not right now.
Overall, (except for the noisy USB hub, lol!) I am now pretty happy with my AGA/1080ti combo and I appreciate all the help I got from supporters, like rinneh, on this thread. Many thanks guys. Cheers.
Aside: I still think that if you already have an AW laptop with at least a GTX 1060, and are mainly into VR, adding an AGA with a GTX 1080 Ti does not make a lot of economic sense imho. Although I get a big boost in non-VR performance, I may just go back to my stock GTX 1060 with this external monitor.
Aside #2; I just tried using my new monitor plugged into my AW display port without the AGA and I got similar good results. Go figure????
Aside #3: Went back to the AGA w/GTX 1080 Ti again and got similar results. All non-VR games/apps are much improved and look great on my new Asus 144 Hz monitor. All my VR games look fine with my Oculus Rift, but performance gains in resource-hungry sims like X-Plane 11.25rc2 and RealFlight 8 are not much better. I think XP11 may move to Vulkan later this year, and that may help a lot because neither of these sims fully utilises multi-core CPUs. Anyway, I'll stick with the AGA for now. I luv my new monitor and at least I kinda feel that I am getting some value from my Zotac GTX 1080 Ti blower.
Last edited: Jul 27, 2018
Mjozko, cookinwitdiesel and rinneh like this. -
Just successfully installed the Graphics Amplifier on the Alienware 17 R5 with Core i9. It took me a long time to get it right, because the Nvidia laptop driver wasn't able to pick up the GTX 1080 Ti and the Graphics Amplifier wasn't detected. I tried loopback mode to run the GTX 1080 Ti on the internal screen, but it didn't work, so I ended up using an external monitor (direct output from the graphics card, not through the laptop, because the laptop only picks up the internal GTX 1080). I believe this could be an interesting result to see, as I have heard other people say that the CPU limits the performance of the eGPU.
For the test, I only ran Fire Strike 1.1. (Sorry that I am not a professional, I only ran the test to see if there is any difference.)
On-board internal GTX1080 Benchmark:
Graphic Amplifier with ASUS GTX 1080 Ti Turbo. Non-overclocked and Overclocked Results:
Overclocked using ASUS GPU Tweak II:
- GPU Clock: 1756 MHz (+174)
- Memory Clock: 11527 MHz (+527)
- Fan Speed up to 100%
- Highest GPU Temperature at 67C; Average at 64C
Overall, I see about a 10% decrease from what the card can achieve installed in a desktop. However, there is about a 25-30% performance increase compared to the internal GTX 1080.
From what I can tell so far, there shouldn't be any bottleneck with the Graphics Amplifier, and the OEM PSU is definitely enough for the card alone.
In addition: this is the result of my old Aorus X7 Pro running 2x GTX 970M in SLI. It is quite amazing to see the difference from a mobile graphics card to a desktop-grade card.
Last edited: Aug 3, 2018
TomC69 likes this. -
cookinwitdiesel Retired Bencher
Curiously, that is almost the exact same GPU score and almost exactly 50% better CPU score vs my laptop with the same GPU in the AGA. So I would assume your rig is running well (or both of ours are equally weird).
https://www.3dmark.com/fs/13682800
Last edited: Aug 3, 2018 -
Glad to hear this is working well for you guys. Benchmarks are great indicators but the real test is when you run some high demand games/apps and see what real world improvements, like FPS, you get. Many thanks for keeping us updated on all this.
Last edited: Aug 4, 2018 -
Hello there!
I would be really grateful if any of you could help me out or suggest me something because I have no idea what the hell is happening.
I have an Alienware 15 R2 with a GTX 970M in it. I bought a Graphics Amplifier to make things better, with an EVGA 1060 SC.
I properly seated the card in the GA and connected the power supply as it should be (I actually tried with the brown and yellow as well; nothing changed). So, my problem is "unrecognised device in graphics amplifier" all the time. I tried multiple versions of the BIOS, Graphics Amplifier software and Command Center (this is happening in Windows 10). I get this message every time I boot up the notebook. I tried with a GTX 960 Windforce and an old GTX 660, same error message every time. If I connect something to the USB ports on the back of the amplifier it works, and I can switch the colours in Command Center as well. The cables and connectors are all fine, no bent pins or damaged connectors. What should I do next? Thank you for your answer. -
Hard to say for sure but maybe check the connection(s) between the AGA power supply and the gpu board. Sometimes these get a bit loose and need reseating. If this does not help I recommend contacting Dell Support. They are usually very helpful imho. It could be that your AGA is faulty and Dell will probably send you a replacement unit. Unfortunately you may also find out that your gpu is not supported. You should check this list before buying;
https://www.dell.com/support/articl...mplifier-supported-graphics-card-list?lang=en
Good luck sorting this out. -
Way too early to even speculate I guess but is there any indication that these new RTX cards are/aren’t likely to work with the amplifier?
-
I don’t see any reason they wouldn’t. There may be some bottlenecking over the AGA connection though. I’m sure we’ll see some benchmarks once the cards are actually released.
Sent from my iPhone using Tapatalk -
@Hstanez @cookinwitdiesel https://www.3dmark.com/3dm/27668858?
I didn't know that my GPU score was higher :S. I have an Aorus Extreme Edition.
Vasudev likes this. -
@Hstanez @cookinwitdiesel also look at my temperature while gaming
-
Hello everyone!! I just got my i9 17 R5 yesterday and was wondering: if I buy an RTX 2080 for it, can it drive the laptop display? My system does have G-Sync.
Let me know! Thanks for all the help! Can't wait to get my replacement system so I can finally use it! -
I don't think you can; the G-Sync panel is routed directly through the internal GeForce GPU. That's why on G-Sync models the GPU isn't disabled.
But you throw away a lot of performance anyway if you use the internal screen. No use in getting a 2080 for that. -
Without looking at every single one of the 340 pages of this thread. are there any cool custom mod setups folks might be able to direct me to? Where they've physically modified the housing or replaced it entirely?
-
I have seen people replace the noisy fan with a Noctua fan of the same diameter; I think it is 80 mm. Others replace the power supply with an 86 mm (H) x 150 mm (W) x 140 mm (D) power supply, which can be this one: https://www.amazon.com/Corsair-Bron...ie=UTF8&qid=1535648283&sr=8-1&keywords=cx550m
Also look at this mod
Muezick likes this. -
cookinwitdiesel Retired Bencher
I think overclocking the card in your AGA, while cute, is largely pointless. That will just exacerbate the PCIe bottleneck and result in even more CPU wait time, where the CPU cannot scale up in clocks nearly as well as the external (and potentially water-cooled) GPU can.
Also, a radiator with fan is NOT going to be a silent solution - AND you still need airflow over the card to cool the ram and VRMs unless you get a full coverage block.
Now excuse me while I go through my spare watercooling parts....lol
Update: Of course I got a "special" GPU that there are no FC blocks for... thanks Asus!
Last edited: Aug 30, 2018 -
Yall use AMD GPUs in your AGAs just fine yeah?
-
I have not heard from anyone using AMD GPUs, just Nvidia. Here is a link to supported GPUs (Nvidia and AMD):
https://www.dell.com/support/articl...mplifier-supported-graphics-card-list?lang=en -
I used to use an R9 285 and RX 480 in my AGA. They worked fine.
-
Excuse me while I call you an idiot.
Sent from my iPhone using Tapatalkjudal57 likes this. -
Because there is no PCIe bottleneck. 4 lanes of PCIe Gen 3 are more than sufficient to run any current card. There is a performance hit when using the AGA to drive the laptop display, but when running an external monitor, it matches the performance you would get in a desktop with similar specs.
Sent from my iPhone using Tapatalkjudal57 likes this. -
Ok. I get it, thanks.
With an external monitor on my 17 R4 I am getting a very nice performance boost with my AGA/GTX 1080 Ti in all non-VR games/apps, like ~2x the FPS compared to my internal GTX 1060. I also like the benefit of lower laptop temps, since the AGA removes the load from the internal 1060. I don't think I have ever heard my laptop fans switch on since I started using the AGA. Of course the AGA fan runs all the time, but it really is not too bad and at least it is keeping the 1080 Ti's temps down.
However, the improvement with my Oculus Rift and VR games/apps is a lot less, ~10-20% (depending on the app). Still a very nice improvement though. This is with the headset HDMI and its USB, plus the 2x sensor USBs, connected to the AGA. I have tried moving the USB connections back to my laptop's USB ports, but this does not make any difference. I think (but do not really know) that this is because my CPU (i7-7700HQ 2.8/3.6 GHz) is now the bottleneck in graphics-intensive VR apps, esp. those that do not utilise the CPU's multiple cores efficiently. One of my most demanding VR apps (X-Plane 11) is planning to incorporate Vulkan later this year, so hopefully this will help.
Maybe we need a revised AGA that also includes the ability to add a better CPU. That would really help make my 17r4 a lot more future-proof, lol! -
Exactly why I ordered a refurb AW 17 R5 with the intel 8950HK, I love my AGA because it's silent and offers better performance than any internal graphics card but my current AW CPU is becoming a bottleneck.
-
cookinwitdiesel Retired Bencher
While there is not a realistic difference with the 10x0 cards (less than 5% performance difference? margin-of-error stuff), the more powerful the card, the more that gap will spread. For example, a new generation of cards with up to 50-100% more performance. Overclocking will also grow the gap.
In the future, I recommend citing sources and testing to back up your point instead of just calling people idiots - you may be surprised to learn that some of them aren't that stupid and are even knowledgeable and helpful. -
And your argument is based on straight up conjecture. Until the cards have been tested in both desktops and the AGA, there is no source or test to cite.
Sent from my iPhone using Tapatalk -
cookinwitdiesel Retired Bencher
You are right, I am giving people advice before they spend $1k+ on a GPU for their laptop. I would hope anyone looking to spend that kind of money is going to wait to see what experience they can expect vs being a guinea pig.
The only conjecture I offered was in my first sentence (the 5% part); the rest is pretty straightforward math. The faster the GPU runs (either through a faster architecture or overclocking), the more bandwidth it takes to feed it fresh instructions. As more bandwidth is needed, a fixed pipe will become a bottleneck to performance. I would be curious to see the peak and average GPU utilization for a few different scenarios with the AGA:
GPU 1 at stock clocks in AGA
GPU 1 overclocked in AGA
GPU 2 (new architecture) in AGA
In addition to looking just at FPS, you need to look at GPU utilization as well. Although your FPS will go up with each test case above, if GPU utilization is not staying the same across them, then the PCIe link is a likely bottleneck. I kept the CPU as a constant here but you could also throw in a 2nd set of all of the above with overclocked CPU (first runs being stock CPU). That just becomes a lot of data and variables to track and can make it harder for some to follow.
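For anyone wanting to actually run those scenarios, here's a rough sketch of how the peak/average utilization logging could be automated in Python. It assumes Nvidia's `nvidia-smi` is on the PATH; the function names and sampling interval are my own choices, not from any official tool:

```python
import statistics
import subprocess
import time

def read_utilization() -> int:
    """One sample of GPU utilization (%) via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.splitlines()[0].strip())

def summarize(samples: list[int]) -> dict:
    """Peak and average utilization for one test scenario."""
    return {"peak": max(samples), "avg": statistics.mean(samples)}

def log_run(seconds: int = 60, interval: float = 1.0) -> dict:
    """Sample utilization for `seconds` while a benchmark runs."""
    samples = []
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        samples.append(read_utilization())
        time.sleep(interval)
    return summarize(samples)
```

Run `log_run()` once per scenario (stock in AGA, overclocked in AGA, new card in AGA) and compare the peak/average numbers; if average utilization falls as the card gets faster while FPS gains shrink, that points at the PCIe link (or CPU) as the limit.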
It is known and accepted that the pipe can limit the performance, that is why the AGA stomps all over any TB3 based solution in terms of peak performance and performance consistency. -
Your original post that I responded to had nothing to do with the new RTX cards, it had to do with overclocking, which works just fine with the current gen and the AGA.
Sent from my iPhone using Tapatalkjudal57 likes this. -
The AGA is not the bottleneck; it's your CPU, my friend. That is why the GPU is not at 100% load. And that depends on the game.
-
cookinwitdiesel Retired Bencher
I am not personally having any issues, I primarily play games from Blizzard which have always been more CPU dependent and don't tax the GPU. I am simply providing advice and an opinion.
-
I've found all your guys info very helpful and informative, thanks. Please don't sweat the small stuff though. Cheers.
c69k likes this. -
Multiple reviews at Guru3D have shown that 4x is barely a bottleneck. You only lose a couple of frames, and this is even with extremely high-bandwidth cards such as the AMD Vega 64 or the Titan V.
The limiting factor could be the CPU, but even in modern titles such as Destiny 2, Assassin's Creed Origins, Far Cry 5, etc., I have yet to see CPU usage touch even 80%. -
I think a lot of CPU limits are due to the poor way that many apps utilise multiple cores.
-
That indeed happens too. Arma 3 is a nice example: the game runs like crap. Barely 40% CPU usage and barely 60% GPU usage, and only 40-60 fps no matter what graphics settings I use.
c69k likes this.
-
cookinwitdiesel Retired Bencher
I would definitely expect software to become a limit well before hardware. We just talk about hardware more here since that is what we have a lot more control over (what components, how they are clocked and cooled, etc)
c69k likes this. -
It's more precise to say workloads that hammer the hardware. Software will claim it works 100% on any hardware, but when actual work is assigned the hardware might not deliver, because the software is generalized rather than making optimal use of each piece of hardware's resources.
c69k likes this.
-
cookinwitdiesel Retired Bencher
Yep, software is made to be flexible in Windows with any hardware, not ideally optimized. If you want fully optimized, get a console.
-
4x vs 16x in a desktop:
There is a slight performance drop when running the highest-end cards at 4x. Although I don't agree with not overclocking: the performance gain might be minor or nil, but if voltages aren't high and temps are under control, there is nothing wrong with overclocking in the AGA. Interestingly, CPU usage drops very slightly when running at 4x. -
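For reference, the theoretical numbers behind the 4x vs 16x comparison are easy to work out. This is a back-of-envelope sketch of per-direction bandwidth, ignoring all protocol overhead beyond the 128b/130b line coding:

```python
# Theoretical per-direction PCIe bandwidth per lane:
# Gen3 signals at 8 GT/s using 128b/130b encoding.
GEN3_GTS = 8.0          # gigatransfers per second per lane
ENCODING = 128 / 130    # 128b/130b line-code efficiency

def gen3_bandwidth_gbps(lanes: int) -> float:
    """Usable bandwidth in GB/s for a Gen3 link of the given width."""
    # 8 GT/s * (128/130) = ~7.88 Gbit/s per lane, then /8 bits per byte
    return GEN3_GTS * ENCODING * lanes / 8

x4 = gen3_bandwidth_gbps(4)    # ~3.94 GB/s (the AGA-style link)
x16 = gen3_bandwidth_gbps(16)  # ~15.75 GB/s (a desktop x16 slot)
```

So a Gen3 x4 link tops out around 3.9 GB/s each way versus ~15.8 GB/s for a desktop x16 slot, which is why the gap only shows up in games that actually need to stream that much data across the bus.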
Rengsey R. H. Jr. I Never Slept
Hey fam... I just want to share some good news from your older sibling, the M17X R4 running a GTX 1080. Currently low specs, no OC.
https://www.3dmark.com/3dm11/12917274
-
Ya, my main VR flight simulator, X-Plane 11, only achieves 50% CPU and 55% GPU usage no matter what graphics settings I use. Apparently XP11 is moving to Vulkan soon (probably not till mid 2019). This should help utilise CPU cores more efficiently and improve things quite a bit (I hope).
Pretty bad that many expensive VR PCs that we spend $2,000+ on end up hampered by $60-100 software apps. Developers of VR apps are going to have to address this soon if VR usage is going to grow.
rinneh likes this. -
Hello guys! I need some advice here.
I have a R4 2017 (7820HK, 16Gigs and a 1070).
I bought an AGA for the future but never had to use until now.
So I recently bought a monitor, the ROG Strix XG32VQ: a 32-inch VA panel with FreeSync/Adaptive-Sync, 2K res, and I will be using the DisplayPort to get its 144 Hz refresh rate.
I am positive it would be hard to hold 144 Hz with a 1070 in games like BF1, BFV, R6 Siege, PUBG, etc.
So my next purchase would be a graphics card to put in my AGA; I also want to take advantage of my new monitor.
Would a 1080 Ti be a good option, or should I wait for the 2080?
Another question: to use FreeSync I must have a Radeon card, right? Is there any Radeon model that beats or competes with a 10-series or 20-series card from Nvidia?
My third question: what is the difference between FreeSync and Adaptive-Sync? The monitor supports both technologies.
Thanks in advance
Stefano -
I would wait until you see how well your internal 1070 handles your new monitor before jumping in with an AGA. Also, you need to evaluate whether or not any syncing really makes any difference.
I do not have any experience with Radeon cards so I cannot comment on these.
I have an Asus VG248 144 Hz external monitor (no G-Sync) with my 17 R4 with 1060, and with the monitor plugged into my mini-DisplayPort it works great, so I think your internal 1070 should work even better. It also works great with my AGA w/ Zotac 1080 Ti blower, with the monitor plugged into the 1080 Ti's DP. I have never experienced any screen tearing. I have played with different refresh rates (60-144 Hz) and am currently using 120 Hz because I think the colours look a little better than at 144 Hz. Your screen resolution will also make a big difference; I only use 1080p.
In both cases it's best to set your display settings to only project to the external monitor. This disables the laptop monitor and bypasses the integrated intel 630 graphics, which seems to give a nice fps performance boost.
If $'s were not a problem, I would wait for the 2080 Ti's to come out and see how they work, esp. with an AGA. If you want to minimise $'s spent, since you already have an AGA, I would wait and buy a GTX 1080 Ti. Prices for these should come down after the 20 series is out there and tested, and there should be a lot of used 1080 Ti cards out there at low prices as well. In either case you are best to stick with Founders Edition (FE) blower-type versions, because these fit the AGA best and allow you to properly close the AGA lid.
One advantage you may find using an AGA is that, since it bypasses your internal 1070, your laptop may run a bit cooler. The AGA also has 3x USB 3.0 ports, which are handy since the 17 R4 is not blessed with many of these.
The main disadvantage of the AGA is that it takes up extra desk space and is a bit noisy (but not too bad). Also, if you need to take your 17 R4 on the road by itself you need to switch off the AGA. This normally works fine for me, but sometimes I need to reinstall graphics drivers (no big deal).
Hope this helps and good luck whichever way you go. Please keep us posted on this, thanks. -
Hello Tom, thank you for your recommendation.
When I get the monitor I will set it up and post the results here.
Stefano
Sent from my iPhone using Tapatalk -
Showing 100 fps on a 144 Hz monitor is still a huge benefit compared to running 100 fps on a 60 Hz panel. You don't need to stay at 144 Hz to see benefits.
gunbolt likes this.
-
Very true, after going 144hz I can't go back. I'm also waiting to see the performance and reviews to maybe upgrade the 1080 in my AGA.
-
Without reading through the backlog of pages here, does anyone know for sure whether an RTX 2080 FE will fit in the Graphics Amplifier, and will it work today?
Thanks!
*OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)
Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.