I recently added an eGPU to my Clevo laptop and thought I'd post some thoughts and results...
My goals
- Good performance -- I wanted to play the latest games and do some TensorFlow work. I have a 1080p, 75Hz display, so that's the top end of my perf requirements.
- Quiet -- I generally use my laptop on the couch where my wife is sitting and reading/gaming/watching TV. Loud fans are pretty much a no-go.
- Cool -- it's annoying, especially in the summer, when my laptop turns into a blazing furnace whenever I try to play anything.
Not my goals
- Price -- I didn't want to spend an insane amount, but cost wasn't a big consideration, particularly if it lets me delay upgrading my laptop, which I tend to do every couple of years.
- Benchmarking -- I want performance that lets me play what I want to, and play it well, but I'm not really interested in chasing benchmark scores.
What I started with
I have a Clevo P775DM3-G that I purchased a couple of years ago. It's a big 17" laptop with a desktop 6700K and a mobile 1080. I generally use it on the couch or on a desk, either at home or at work. Yes, it's heavy and big. No, it doesn't bother me.
The big problem with using this laptop for gaming (or any other really intensive activity) is that, on the "Normal" fan setting (i.e. usably quiet), it ends up thermally throttling both the CPU and the GPU. As far as I can tell, this is just a limitation of the laptop itself -- it's always done this, the heatsinks are clean, and the thermal paste is good. Running a desktop processor and a 1080 in a small enclosure simply means you need to blast the fans to have any hope of keeping up. And on top of the annoyance factor, it's not great to have your CPU and GPU hitting their thermal throttling points regularly.
So, I decided to give an eGPU a try. I'm well aware of the performance hit you take when you use an eGPU, but in doing some testing with a GTX 980 that I had sitting around, it looked like it would improve my perf while also keeping temperatures and noise levels low, so I took the plunge.
What I bought
I ended up picking up the Razer Core X ($300) and an RTX 2080 Ti ($1200). The Core X is a simple, clean little box with a few outputs and a good power supply -- basically a no-muss, no-fuss eGPU that also happens to be on the lower end of the eGPU price spectrum. I totally recommend it.
The 2080 Ti is almost surely overkill, but a) I knew I was going to be taking a perf hit by using an eGPU, and I wanted to make sure it would still improve on my current performance, and b) money wasn't a huge issue, and like I said, if this gets me to delay another $3000 laptop purchase for a year, it ends up making more sense. The specific model was the Gigabyte Gaming OC, mostly because it was the one that briefly popped up in stock on Amazon.
How it works
So I got the Core X, put the 2080 Ti in it (it fits comfortably), and bought the 6' powered TB3 cable from Cable Matters (~$60). When I plugged it into the TB3 port on my laptop, Windows first installed some drivers and then popped up a dialog:
[screenshot of the Windows warning dialog]
Well, THAT message doesn't look promising. Thankfully, it's a total lie -- eGPUs work great on this computer, so don't worry about it.
Then I had to install the desktop video drivers from NVIDIA, since previously I only had the mobile ones installed.
The only thing you really need to do is make sure you're on the 1803 version of Windows, which lets you specify which GPU each program runs on. Annoyingly, you have to do this on an executable-by-executable basis, but it works great, and it remembers all the settings, so you don't need to do it more than once for anything you want to run. Basically, just go to Display Settings and select Graphics Settings at the bottom:
[screenshot of the Graphics Settings page]
Then you can Browse to apps and add them to the list:
[screenshot: browsing for an app and adding it to the list]
And then if you select Options, you can pick which GPU to run on (this was when I was testing with the 980):
[screenshot: per-app GPU options, shown with the GTX 980]
And that's it. Then you can just run whatever you want, and it'll render on the eGPU while your internal GPU handles getting the frames to your display. You can also use an external display, which will get you slightly better performance, but that wasn't what I was interested in.
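If you'd rather script this than click through the Settings UI for every game, here's a minimal sketch. It assumes the per-app preference lives under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, which is where the 1803 Graphics Settings page appears to store it (worth verifying on your own machine), and the executable path below is just a made-up example:

```python
# Minimal sketch: write the same per-app GPU preference that the 1803
# Graphics Settings page does, via the registry. Assumes the key
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences -- verify locally.
import winreg

EXE_PATH = r"C:\Games\SomeGame\game.exe"  # hypothetical executable path
KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

# "GpuPreference=2;" corresponds to "High performance" (the dGPU/eGPU);
# 1 is "Power saving", 0 is "Let Windows decide".
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, EXE_PATH, 0, winreg.REG_SZ, "GpuPreference=2;")

print(f"Set high-performance GPU preference for {EXE_PATH}")
```

Deleting the value (or setting it back to "GpuPreference=0;") should revert that app to "Let Windows decide."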
And now, I can run whatever I want, and I just need to plug in the eGPU cable. The box itself is sitting next to me behind the end table, and is almost completely silent, even under heavy load. And whereas previously I'd end up with my CPU around 95C and my GPU at 90C, both thermally throttling, now my CPU never goes above 80C and neither GPU breaks 75C, all while on the lowest fan setting! It's great.
Of course, I did some benchmarks to see what kind of performance difference there was between the throttling mobile 1080 and the eGPU 2080 TI:
- Ashes of the Singularity
- Far Cry 5 (fairly CPU-limited, it seems)
- Fire Strike
- Superposition (Extreme)
- Time Spy
- Shadow of the Tomb Raider

[benchmark screenshots for each, comparing the throttling internal 1080 to the eGPU 2080 Ti]
Overall
I couldn't be happier! It works great, and does everything I was looking for. Happy to answer any other questions or do other benchmarks, if people are interested.
-
yrekabakery Notebook Virtuoso
Wow. Your FS Graphics Score with the 2080 Ti eGPU is lower than what I get on my internal 1080 at stock...
Edit: Looks like your 2080 Ti is losing about 40% of its performance
Did you try using an external monitor hooked up to the 2080 Ti instead of the laptop display?
Your internal 1080 is also being throttled by almost 30%.
-
Yep, like I said, I'm aware of the perf loss. But thermal throttling is losing a lot too.
Didn't try using an external monitor -- I'd just buy a desktop machine if I wanted that.
The other interesting thing is that it varies a lot by benchmark. Fire Strike is a pretty big drop, but something like Superposition shows a much smaller drop (~18%), and Time Spy is smaller too (~20%). And then Far Cry isn't much faster than my 1080 at all, but Shadow of the Tomb Raider is a LOT faster.
-
yrekabakery Notebook Virtuoso
I suggested an external monitor hooked directly to the 2080 Ti because it would probably not destroy performance as badly.
But idk. $1600 ain't exactly chump change, and seems like a hugely unnecessary expenditure versus fixing what you already had, which would've gotten you better performance and mobility to boot. A set of P775TM1 heatsinks, liquid metal, and a delid tool would've gotten the job done, then you could spend the rest building a kickass desktop with a discounted 1080 Ti. -
I guess I'm not convinced I could have gotten any better performance out of my existing laptop -- it's already using liquid metal on both the CPU and GPU, and afaik you can't delid the GPU anyways? That's a lot of work, not without some risk, and there's no guarantee I end up with anything better than where I started.
Anyways, this is the eGPU section of the forum, is it not? Not much point telling people that they shouldn't use an eGPU. I was pretty clear about my goals, and given that I now have a silent laptop that's faster than any other (single-GPU) laptop that I'm aware of, I'm pretty happy. Mostly I posted in case others were interested in trying it themselves, so they can see the tradeoffs and what to expect.
-
yrekabakery Notebook Virtuoso
Your bottlenecks in FC5 and SotTR are huge at 1080p. In FC5, the desktop 2080 Ti gets 130 FPS to your 83 FPS (-36%). In SotTR, the desktop 2080 Ti gets 129 FPS to your 77 FPS (-40%), and you are GPU bound only 30% of the time, so 70% of the time you are bottlenecked by the eGPU interface or CPU. In both cases, your 2080 Ti is still slower than a non-throttling 1080, in FC5 massively so.
Your poor 1080 is throttling so badly, even a massively bottlenecked 2080 Ti eGPU looks amazing next to it. Compared to a non-throttling 1080, your 1080 is getting 70% of its normal performance in FC5 and FS, and only 50% (!) of its normal performance in Superposition, Time Spy, and SotTR.
-
In this community there are people who insist that notebook hardware can keep up with desktop hardware, which is an absolute joke. As you can already read from yrekabakery's posts, he is absolutely ignoring facts and claiming stuff that doesn't make any sense whatsoever.
Best example here:
https://www.3dmark.com/spy/3766000
Meanwhile, you beat it with ease. You probably have some bottlenecks in your RTX 2080 Ti setup that aren't only bandwidth-bound, though.
-
yrekabakery Notebook Virtuoso
Even a 2080 Ti bottlenecked by that much still beats his 1080 (albeit just barely in FC5), because his particular 1080 is throttling so badly that it's only getting 50% to 70% of its normal performance. -
Also, he probably has some settings wrong, etc. eGPUs can be a little fiddly to work with. I will help the guy out, and if he accepts my help, he will probably show you some better numbers.
Also, yes, an external screen does reduce the bottleneck even further.
yrekabakery Notebook Virtuoso
-
?????????????????????????
What are you getting at?? Do you even bother reading posts?
I said his modded GTX 1080. Stop posting bogus claims, tyvm.
yrekabakery Notebook Virtuoso
-
| Game | Desktop GTX 980 Ti | eGPU GTX 980 Ti |
| --- | --- | --- |
| (title missing from the quote) | 59.9 FPS | 58.1 FPS |
| Tom Clancy's The Division | 59.8 FPS | 46.8 FPS |
| Tom Clancy's Ghost Recon | 87.9 FPS | 56.6 FPS |
| Shadow of Mordor | 128.5 FPS | 97.3 FPS |

source
I repeat: bogus. I have no idea why you made that up, but it's certainly not how eGPUs work. Higher FPS doesn't mean a stronger bottleneck; that's just something you made up for no apparent reason.
yrekabakery Notebook Virtuoso
-
-
yrekabakery Notebook Virtuoso
Oh and this:
-
If you still can't see it, please find professional help.
Here is some elementary school math for you, since you probably need this as well:
87.9 FPS is less than 128.5 FPS.
1080p:
56.6 / 87.9 = 64.3%
97.3 / 128.5 = 75.7%
100 - 64.3 = 35.7% bottleneck
100 - 75.7 = 24.3% bottleneck
1440p:
45.9 / 66 = 69%
73.6 / 89.2 = 82.5%
100 - 69 = 31% bottleneck
100 - 82.5 = 17.5% bottleneck
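(For anyone who wants to redo this arithmetic on other results, here is the same calculation as a tiny script -- the FPS pairs are just the 980 Ti numbers quoted above:)

```python
# Percentage of desktop performance lost over the eGPU link:
# retained = eGPU FPS / desktop FPS, loss = 1 - retained.
def egpu_loss(desktop_fps: float, egpu_fps: float) -> float:
    return (1 - egpu_fps / desktop_fps) * 100

print(f"Ghost Recon 1080p:      {egpu_loss(87.9, 56.6):.1f}% loss")   # ~35.6%
print(f"Shadow of Mordor 1080p: {egpu_loss(128.5, 97.3):.1f}% loss")  # ~24.3%
```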
You might want to improve your reading and math skills.
Checkmate, pal.
-
yrekabakery Notebook Virtuoso
-
Yeah, we can agree on that.
As you showcased yet again with that screenshot.
It would be interesting to find out why 4K is less bottlenecked, though.
yrekabakery Notebook Virtuoso
-
Are you seriously that desperate to prove a point that doesn't exist?
Literally in the first three Valley benchmarks your statement is wrong. The highest bottleneck is in the test with the lowest FPS.
Shadow of Mordor has the most FPS in the gaming part, yet Ghost Recon has a much bigger bottleneck despite having less FPS.
You are extremely deluded.
Not to mention that you're currently saying that 4K uses less bandwidth than 1080p. Obviously the OP of that thread has no idea what he's talking about and completely fails to interpret the benchmarks provided. Nobody in his right mind is going to say that 4K uses less bandwidth than 1080p; the very idea is just stupid. There is a reason why older HDMI ports cannot handle 4K 60Hz output, but can easily handle 1080p 120Hz. The quote you're hanging on to is so incredibly contradictory that only a person who is extremely deluded or has no idea about anything would cling to it.
yrekabakery Notebook Virtuoso
No, you're the one who's deluded. You're talking about the display interface. He's talking about the GPU interface (TBT/PCIe). Two totally different things. The monitor's resolution and refresh rate affect the bandwidth usage on the display interface. The frame rate that the GPU is outputting, which is separate from the monitor's resolution and refresh rate, affects the bandwidth usage on the GPU interface. He and I are both saying that higher FPS (due to lower resolution) is more bandwidth on the GPU interface, and lower FPS (due to higher resolution) is less bandwidth on the GPU interface.
You're the one who's being obtuse by comparing different games in terms of absolute FPS values. You can't seem to grasp that different games have different inherent GPU interface bandwidth usage. However, a higher frame rate will require more bandwidth than a lower frame rate within the same game.
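As a rough illustration of the "more FPS = more link traffic" point when the laptop's internal display is used (every finished frame has to travel back over Thunderbolt 3 to the laptop's GPU), here's a back-of-envelope sketch. It assumes ~4 bytes per pixel, uncompressed, and ignores draw-call/asset traffic and protocol overhead, so treat the numbers as order-of-magnitude only:

```python
# Back-of-envelope: bandwidth consumed just by copying rendered frames back
# to the laptop's GPU for display in internal-display eGPU mode.
def frame_copy_gbps(width: int, height: int, fps: float, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * fps * 8 / 1e9  # gigabits per second

# Same game, same 1080p resolution, two different frame rates:
print(f"1080p @ 130 FPS: {frame_copy_gbps(1920, 1080, 130):.1f} Gbps")  # ~8.6 Gbps
print(f"1080p @  75 FPS: {frame_copy_gbps(1920, 1080, 75):.1f} Gbps")   # ~5.0 Gbps
```

Against the roughly 22 Gbps of PCIe bandwidth that TB3 is commonly cited as exposing to an eGPU, that readback alone is a meaningful chunk, and it scales linearly with frame rate.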
As to why higher resolution = less bottleneck:
The higher the resolution and the more stressed the card is, the smaller the performance gap becomes compared to the same card installed in a desktop. The bottleneck starts to shift from PCIe bandwidth back to raw GPU performance.
Also, what you described is upscaling, NOT native 4K. The stress is bigger on both the GPU and the CPU interface when playing at 4K. Otherwise we would see little to no performance difference between 1080p and 4K if it worked the way you're making it out to.
Seriously, stop it; it's getting extremely embarrassing for you.
To illustrate even further how correct I am: the performance drops are relative to how well or poorly the game/program can utilize the GPU.
The oldest and weakest Unigine test, which cannot fully utilize the GPU, has the highest performance loss; the second-newest has the second-highest performance loss; and the newest has the least performance loss.
Meanwhile, Fire Strike runs poorly vs. Time Spy.
-
Mastermind5200 Notebook Virtuoso
Wow, all I can say is: what a big waste of money. Try to return that RTX 2080 Ti if you can. Even with an NVMe and external monitor setup, you're still bottlenecked to high hell.
-
yrekabakery Notebook Virtuoso
On one hand, you're agreeing with me with this statement:
Like earlier in this thread, you don't even realize that we're saying the same thing, so you can't decide whether to agree with me or insult me. Honestly it's hilarious to see the dichotomy and you struggling with your lack of self-awareness. Just make up your damn mind already, am I right or am I wrong, which is it gonna be? You can't have both. LMAO
Anyway, I'm done wasting my time with you. It's like trying to explain things to a petulant child. Even the most basic terms go over your head and cause you to throw tantrums.
-
Not sure what I expected posting here; it seems like everything on this forum turns into people telling other people that they're doing something dumb.
I mean, this is the eGPU section of the forum. I started out saying that I know that the 2080 TI is going to be significantly bottlenecked by the TB3 connection. Everyone knows this! Pointing out that it would be faster in a desktop is pointless, of course it is! But -- for my goals -- it works fantastically!
As far as I can tell -- and please let me know if I'm wrong -- there is absolutely no other way to get a similar level of performance to what I'm currently getting, at the fan/noise/heat levels that I want. Yes, I posted a faster Time Spy score when I first got my laptop, but that was with the fans going full blast in order to get the highest score possible, and I've already said that's a dealbreaker for me for normal use.
Additionally, my laptop's panel is only 75Hz, which means that any increased bottlenecking from higher framerates isn't that big a deal -- it mostly shows up in benchmarks, not in real use.
Bottom line: if money isn't a big factor, and you want a quiet/cool laptop that has better performance than pretty much anything else on the market, I think this is a great way to go.
yrekabakery Notebook Virtuoso
-
And no, I'm not interested in replacing the entire heatsink of my laptop, or water-cooling it (lol). That's way more work and risk than I'm interested in, when there's a plug-and-play option out there that I know works just as well.
Are there 1080 laptops that can run with low temps on low fans and match the perf of my current setup? If so, great, point me at them; that's almost surely what I'll be buying when I next upgrade. If I can get comparable temp/noise/perf results without an eGPU, that's great -- I'll buy one of those and throw the 2080 Ti into my VR machine. But I'm not aware of any right now.
Not sure why everything has to turn into a pissing contest here. It's really obnoxious, and it drives away people like me, when I spent some time putting together a post that I thought other people -- on the eGPU forum especially -- would find interesting and informative. -
yrekabakery Notebook Virtuoso
-
Given that it's probably going to give at least a year of life to my laptop, I don't think it's quite as financially impractical as you make it out to be.
Another advantage is that, since I'm using a desktop part, it's easily transplantable -- like I said, if at some point I get a new laptop or decide that I don't want to use the eGPU anymore, I can take the 2080TI and put it in my VR machine, or sell it, or what have you. That's not really possible with a laptop GPU part.
Fundamentally, anyone thinking of going the eGPU route should have their eyes open about the tradeoffs that they're making. If they're not willing to give up 20-40% of raw desktop performance, they shouldn't be doing it, and they SHOULD be turned off by the process. Thankfully, at the moment, that raw performance loss is still ahead of what most laptops can do, and lands in a better performance-vs-noise-vs-temperature place than any laptop that I'm aware of. -
I have a 144Hz 1070 Max-Q laptop that I use for esports titles, which run at over 100 FPS, but for 4K/1440p games I'm using an eGPU setup with an RTX card. Here's my post on egpu.io for comparison:
https://egpu.io/forums/builds/rtx-2080ti-in-razer-core-v2/
Greetings. Recently I wanted to upgrade the 1080 in my P870TM-1 to a 2080, and the options are limited and pricey: a Clevo MXM 2080 upgrade card costs 1799 CAD before tax, or for about the same price I can get a 2080 Ti + Razer Core X RGB. I already have a custom waterblock, which works pretty decently.
So which one would be a bit more worth it?
The waterblock is compatible with the 2080 MXM card, according to the seller.
Mastermind5200 Notebook Virtuoso
2080 internal will likely lead to better performance.
-
Why is it so expensive to buy spare parts in Canada!
-
My 2080 Ti gets a 34,000 Fire Strike graphics score on an Alienware Graphics Amplifier with an Area-51m notebook. Another member here beat my score with around 38,000 points, also using the Alienware eGPU with an Area-51m.
-
Hi everyone,
I just stumbled upon this thread while researching whether adding an RTX 2080 Ti eGPU to my Clevo P775TM1-G with a GTX 1080 is a good idea for reducing rendering times in 3D graphics.
I understand that the eGPU is less performant than its desktop equivalent in real-time applications such as games, but what about 3D production GPU rendering engines such as Octane?
Actually, I would be considering exactly the same solution as yours, @ccarollo. Maybe it is too much to ask, but I thought I would give it a go - would you be so kind and run one more benchmark for me? That would be the OctaneBench with both GPUs enabled, if you could also run the fans in performance mode to minimize throttling it would be awesome. It just takes a few minutes. I would really really appreciate it. It would dispel all my doubts after reading confusing stuff on different forums, and I think such information would be very beneficial for many others.
I was also wondering if it's possible to connect two of the same eGPUs at the same time for rendering, since our Clevos have two Thunderbolt ports. Has anybody tried doing that? Do you think it's theoretically possible?
I would really appreciate any feedback.
Thanks!