The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Clevo + eGPU results

    Discussion in 'e-GPU (External Graphics) Discussion' started by ccarollo, Oct 14, 2018.

  1. ccarollo

    ccarollo Notebook Consultant

    Reputations:
    47
    Messages:
    281
    Likes Received:
    90
    Trophy Points:
    41
    I recently added an eGPU to my Clevo laptop, and thought I'd post some thoughts and results...

    My goals
    • Good performance -- I wanted to play the latest games and do some tensorflow work. I have a 1080p, 75Hz display, so that's the top end of my perf requirements.
    • Quiet -- I generally use my laptop on the couch where my wife is sitting and reading/gaming/watching TV. Loud fans are pretty much a no-go.
    • Cool -- it's annoying, especially in the summer, when my laptop turns into a blazing furnace whenever I try to play anything.
    Not my goals
    • Price -- I didn't want to spend an insane amount, but cost wasn't a big consideration. Particularly if it lets me delay upgrading my laptop, which I tend to do every couple of years.
    • Benchmarking -- I want performance that lets me play what I want to, and play it well, but I'm not really interested in chasing benchmark scores.
    What I started with
    I have a Clevo P775DM3-G that I purchased a couple years ago. It's a big 17" laptop with a desktop 6700K processor and a mobile GTX 1080. I generally use it on the couch or at a desk, either at home or at work. Yes, it's heavy and big. No, it doesn't bother me. :D

    The big problem with using this laptop for gaming (or any other really intensive activity) is that, on the "Normal" fan setting (i.e. usably quiet), it ends up thermally throttling both the CPU and the GPU. As far as I can tell, this is just a limitation of the laptop itself -- it's always done this, the heatsinks are clean, and the thermal paste is good. Running a desktop processor and a 1080 in a small enclosure simply means you need to blast the fans to have any hope of keeping up. And on top of the annoyance factor, it's not great to have your CPU and GPU hitting their thermal throttling points regularly.

    So, I decided to give an eGPU a try. I'm well aware of the performance hit that you take when you use an eGPU, but in doing some testing with a GTX980 that I had sitting around, it looked like it would improve my perf while also keeping temperatures and noise levels low, so I took the plunge.

    What I bought
    I ended up picking up the Razer Core X ($300) and an RTX 2080 Ti ($1200). The Core X is a simple, clean little box with a few outputs and a good power supply. Basically a no-muss, no-fuss eGPU enclosure, and one that also happens to be on the lower end of the price spectrum. I totally recommend it.

    The 2080 Ti is almost surely overkill, but a) I knew I was going to be taking a perf hit by using an eGPU, and I wanted to make sure it would still improve on my current performance, and b) money wasn't a huge issue, and like I said, if this lets me delay another $3000 laptop purchase for a year, it ends up making more sense. The specific model was the Gigabyte Gaming OC, mostly because it was the one that briefly popped up in stock on Amazon.

    How it works
    So I got the Core X, put the 2080 Ti in it (it fits comfortably), and bought the 6' powered TB3 cable from Cable Matters (~$60). When I plugged it into the TB3 port on my laptop, Windows first installed some drivers and then popped up a dialog:

    [​IMG]

    Well, THAT message doesn't look promising. Thankfully, it's a total lie and eGPUs work great on this computer! :D Don't worry about it.

    Then I had to install desktop video drivers from nvidia, since previously I only had the mobile ones installed.

    The only thing you really need to do is make sure you're on the 1803 version of Windows, which lets you specify which GPU an application runs on. Annoyingly, you have to do this on an executable-by-executable basis, but it works great, and it remembers all the settings so you don't need to do it more than once for anything you want to run. Basically, just go to Display Settings and select Graphics Settings at the bottom:

    [​IMG]

    Then you can Browse to apps and add them to the list:

    [​IMG]

    And then if you select Options, you can pick which GPU to run on (this was when I was testing with the 980):

    [​IMG]
    And that's it. Then you can just run whatever you want, and it'll render on the eGPU while your internal GPU handles getting the frames to your display. You can also use an external display, which will get you slightly better performance, but that wasn't what I was interested in.
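For anyone who'd rather script this for many executables instead of clicking through the UI: the Graphics Settings page reportedly stores its per-app choice as string values under the `HKCU\SOFTWARE\Microsoft\DirectX\UserGpuPreferences` registry key. Treat the path and value format in this sketch as assumptions to verify in regedit on your own machine, and the game path as a made-up example:

```python
# Sketch only: the registry path and "GpuPreference=N;" value format are
# assumptions about how Windows 10 1803+ stores the Graphics Settings
# per-app GPU choice -- verify in regedit before running the output.
REG_PATH = r"HKCU\SOFTWARE\Microsoft\DirectX\UserGpuPreferences"

def gpu_preference_value(high_performance: bool) -> str:
    # 1 = power saving (integrated GPU), 2 = high performance (dGPU/eGPU)
    return f"GpuPreference={2 if high_performance else 1};"

# One REG_SZ value per executable, named by the executable's full path
# (the path below is a hypothetical example, not a real install):
apps = {r"C:\Games\Example\game.exe": True}
for exe, high_perf in apps.items():
    print(f'reg add "{REG_PATH}" /v "{exe}" /t REG_SZ '
          f'/d "{gpu_preference_value(high_perf)}" /f')
```

The script only prints the `reg add` commands, so you can inspect them before pasting into an elevated prompt.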

    And now, I can run whatever I want, and I just need to plug in the eGPU cable. The box itself is sitting next to me behind the end table, and is almost completely silent, even under heavy load. And whereas previously I'd end up with my CPU around 95C and my GPU at 90C, both thermally throttling, now my CPU never goes above 80C and neither GPU breaks 75C, all while on the lowest fan setting! It's great.

    Of course, I did some benchmarks to see what kind of performance difference there was between the throttling mobile 1080 and the eGPU 2080 TI:

    Ashes of the Singularity:

    [​IMG]
    [​IMG]

    Far Cry 5 (fairly CPU-limited, it seems):

    [​IMG]
    [​IMG]

    Fire Strike:

    [​IMG]
    [​IMG]

    Superposition (Extreme):

    [​IMG]
    [​IMG]

    Timespy:

    [​IMG]
    [​IMG]

    Shadow of the Tomb Raider:

    [​IMG]
    [​IMG]

    Overall

    I couldn't be happier! It works great, and does everything I was looking for. Happy to answer any other questions or do other benchmarks, if people are interested.
     
  2. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Wow. Your FS Graphics Score with the 2080 Ti eGPU is lower than what I get on my internal 1080 at stock...

    Edit: Looks like your 2080 Ti is losing about 40% of its performance
    [​IMG]

    Did you try using an external monitor hooked up to the 2080 Ti instead of the laptop display?

    Your internal 1080 is also being throttled by almost 30%.
     
    Last edited: Oct 14, 2018
  3. ccarollo

    ccarollo Notebook Consultant

    Reputations:
    47
    Messages:
    281
    Likes Received:
    90
    Trophy Points:
    41
    Yep, like I said, I'm aware of the perf loss. But thermal throttling is losing a lot too.

    Didn't try using an external monitor -- I'd just buy a desktop machine if I wanted that.

    The other interesting thing is that it varies a lot by benchmark. Firestrike is a pretty big drop, but something like Superposition shows a much smaller drop (~18%), and Time Spy is smaller too (~20%). And then Far Cry isn't much faster than my 1080 at all, but Shadow of the Tomb Raider is a LOT faster.
     
    Last edited: Oct 14, 2018
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I suggested an external monitor hooked directly to the 2080 Ti because it would probably not destroy performance as badly.

    But idk. $1600 ain't exactly chump change, and seems like a hugely unnecessary expenditure versus fixing what you already had, which would've gotten you better performance and mobility to boot. A set of P775TM1 heatsinks, liquid metal, and a delid tool would've gotten the job done, then you could spend the rest building a kickass desktop with a discounted 1080 Ti. :p
     
  5. ccarollo

    ccarollo Notebook Consultant

    Reputations:
    47
    Messages:
    281
    Likes Received:
    90
    Trophy Points:
    41
    I guess I'm not convinced I could have gotten any better performance out of my existing laptop -- it's already using liquid metal on both the CPU and GPU, and afaik you can't delid the GPU anyways? That's a lot of work, not without some risk, and there's no guarantee I end up with anything better than where I started.

    Anyways, this is the eGPU section of the forum, is it not? Not much point telling people that they shouldn't use an eGPU. :D I was pretty clear about my goals, and given that I now have a silent laptop that's faster than any other (single-GPU) laptop that I'm aware of, I'm pretty happy. Mostly I posted in case others were interested in trying it themselves, so they can see the tradeoffs and what to expect.
     
    Last edited: Oct 15, 2018
    Lopezco likes this.
  6. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    That's completely normal behavior. The higher the FPS, the more the eGPU gets bottlenecked by the TB3 interface. Superposition and Time Spy don't see as much of a drop as Fire Strike because they run at lower frame rates.

    Your bottlenecks in FC5 and SotTR are huge at 1080p. In FC5, the desktop 2080 Ti gets 130 FPS to your 83 FPS (-36%). In SotTR, the desktop 2080 Ti gets 129 FPS to your 77 FPS (-40%), and you are GPU bound only 30% of the time, so 70% of the time you are bottlenecked by the eGPU interface or CPU. In both cases, your 2080 Ti is still slower than a non-throttling 1080, in FC5 massively so.

    Your poor 1080 is throttling so badly, even a massively bottlenecked 2080 Ti eGPU looks amazing next to it. Compared to a non-throttling 1080, your 1080 is getting 70% of its normal performance in FC5 and FS, and only 50% (!) of its normal performance in Superposition, Time Spy, and SotTR.
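The percentage figures in this post all come from the same relative-loss arithmetic; a quick sketch using the FPS numbers as quoted above (the desktop 2080 Ti figures are the ones cited in the post, not independent measurements):

```python
def perf_loss(egpu_fps: float, desktop_fps: float) -> float:
    """Fraction of performance lost versus the same GPU in a desktop."""
    return 1.0 - egpu_fps / desktop_fps

# FC5: desktop 2080 Ti 130 FPS vs. eGPU 83 FPS
print(f"FC5 loss:   {perf_loss(83, 130):.0%}")   # ~36%
# SotTR: desktop 2080 Ti 129 FPS vs. eGPU 77 FPS
print(f"SotTR loss: {perf_loss(77, 129):.0%}")   # ~40%
```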

    Um excuse me? LOL. You have gotten much better performance out of your existing laptop. This is what you scored in Fire Strike when you first got the laptop 2 years ago. Huge difference. I think it's pretty obvious that you need to take off the heatsink, clean out the fans, and repaste. That should solve your thermal problems, assuming the CPU is already delidded with LM under the IHS and undervolted, and the GPU is also undervolted. If cleaning and repasting don't fix it, like I said, get the upgraded heatsinks from the P775TM1, those will drop your CPU and GPU temps under load by another 10C.

    Don't get me wrong, I'm not criticizing your little eGPU experiment. I'm all for doing things in the name of science and providing more information (even if it's the same information that's been out there for a long time regarding TB3 bottlenecking any eGPU faster than a 970). But getting 1070-level gaming performance out of a $1600 2080 Ti eGPU setup is horrendous when most people can simply purchase an entire 1070-based notebook for that price, and it just serves to further validate the fact that if your laptop has a GPU faster than a 1060, any TB3 eGPU setup is a waste of money. It's not financially practical for most people, nor necessarily useful, since your Clevo's abysmal GPU performance is an outlier to begin with and easily fixable, not to mention the reduced mobility and convenience.
     
    Last edited: Oct 15, 2018
    intruder16, Maleko48 and Mr. Fox like this.
  7. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Look at the results again; interestingly enough, only Fire Strike is bottlenecked that hard -- the rest is utterly destroying the 1080. Look at Time Spy, for instance.

    It was the right decision. Since non-nerfed high-end RTX cards won't exist for MXM, this was the only right thing to do. Modding your notebook would have been a waste of time, and you could never have gotten the performance you have now. Not only that, but eGPU is still in its infancy; it will get better and better over the years with drivers etc.

    In this community there are people who insist that notebook hardware can keep up with desktop hardware, which is an absolute joke. As you can already read from yrekabakery's post, he is absolutely ignoring facts and claiming things that don't make any sense whatsoever.

    Best example here:
    This is the highest modded, TDP-unlocked and overclocked notebook GTX 1080:
    https://www.3dmark.com/spy/3766000

    meanwhile you beat it with ease. You probably have some bottlenecks in your RTX 2080 Ti that aren't only bandwidth-bound, though.
     
    Last edited by a moderator: Oct 15, 2018
  8. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    FC5 and SotTR show similar 36% to 40% bottlenecks, with performance well below a non-throttling 1080. Time Spy and Superposition show little bottlenecking because they run at lower frame rates. The severity of the bottleneck scales with frame rate; that's why a powerful TB3 eGPU setup is far more suited to 4K 60Hz than 1080p 144Hz, and an external monitor plugged directly into the eGPU reduces the bottleneck as well.

    Even a 2080 Ti bottlenecked by that much still beats his 1080 (albeit just barely in FC5), because his particular 1080 is throttling so badly that it's only getting 50% to 70% of its normal performance.
     
  9. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Mr. Fox's modded GTX 1080 doesn't get anywhere close to his 2080 Ti in Time Spy, for instance. So stop with that bogus.

    Also, he probably has some settings wrong, etc. eGPUs can be a little fiddly to set up. I will help the guy out, and if he accepts my help, he will probably show you some better numbers.

    Also, yes, an external screen does reduce the bottleneck even further.
     
  10. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    @Mr. Fox doesn't have a 2080 Ti. And read what I wrote first before posting bogus. I'm talking about 2080 Ti performance inside a TB3 eGPU enclosure, not inside a desktop. I've already explained to you why the eGPU is less bottlenecked in Time Spy and similar low frame rate synthetic benchmarks than in actual games running at more realistic frame rates.
     
    Mr. Fox likes this.
  11. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    https://www.3dmark.com/compare/spy/4670417/spy/3766000



    ?????????????????????????
    What are you getting at?? Do you even bother reading posts?

    I said his modded GTX 1080. Stop posting bogus, tyvm.
     
  12. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
  13. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Rise of the Tomb Raider
    Desktop GTX 980 Ti: 59.9 FPS
    eGPU GTX 980 Ti: 58.1 FPS

    Tom Clancy's The Division
    Desktop GTX 980 Ti: 59.8 FPS
    eGPU GTX 980 Ti: 46.8 FPS

    Tom Clancy's Ghost Recon
    Desktop GTX 980 Ti: 87.9 FPS
    eGPU GTX 980 Ti: 56.6 FPS

    Shadow of Mordor
    Desktop GTX 980 Ti: 128.5 FPS
    eGPU GTX 980 Ti: 97.3 FPS

    source


    I repeat: bogus. I have no idea why you made that up, but it's certainly not how eGPUs work. Higher FPS doesn't mean a stronger bottleneck; that's just something you made up for no apparent reason.
     
  14. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    And in the same article:
    Nice reading skills, pal. :rolleyes:
     
    Maleko48 likes this.
  15. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Where does it state higher FPS higher bottleneck?

    Nice reading skills, pal. :rolleyes:
     
  16. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    In the table below that quote. You can do some basic math, right?

    Oh and this:
    Checkmate, pal.
     
  17. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Since you obviously have insane difficulty reading, I'll have to make a picture for you, because it's getting ridiculous how hard it is for you to read a table.
    [​IMG]

    If you still can't see it, please find professional help.

    Here is some elementary school math for you, since you probably need this as well:
    87.9 FPS is less than 128.5 FPS

    56.6/87.9 = 64.3%
    97.3/128.5 = 75.7%

    100 - 64.3 = 35.7% bottleneck
    100 - 75.7 = 24.3% bottleneck

    1440p:
    45.9/66 = 69.5%
    73.6/89.2 = 82.5%

    100 - 69.5 = 30.5% bottleneck
    100 - 82.5 = 17.5% bottleneck

    You might want to improve your reading and math skills.

    Checkmate, pal.
     
    Last edited by a moderator: Oct 15, 2018
  18. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Good, I'm glad we're on the same page now.

    [​IMG]
     
    Last edited: Oct 15, 2018
    Maleko48 likes this.
  19. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    That your statement about higher FPS = more bottleneck is garbage?
    Yeah, we can agree on that.

    As you showcased yet again with that screenshot.

    It would be interesting to find out why 4K is less bottlenecked, though.
     
  20. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Read this again. Slowly. Several times if you have to. Until you understand.
     
    Maleko48 likes this.
  21. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Literally look at your own picture and mine, which literally conflict with the quote.
    Are you seriously that desperate to prove a point that doesn't exist?

    Literally in the first 3 Valley benchmarks your statement is wrong. The highest bottleneck is in the test that has the lowest FPS.

    Shadow of Mordor has the most FPS in the gaming part, yet Ghost Recon has a much bigger bottleneck despite having lower FPS.

    You are extremely deluded.

    Not to mention that you're currently saying that 4K uses less bandwidth than 1080p. Obviously the OP of that thread has no idea what he's talking about and completely fails to interpret the benchmarks provided. Nobody in his right mind is going to say that 4K uses less bandwidth than 1080p; the very idea is just stupid. There is a reason why older HDMI ports cannot handle 4K 60Hz output but can easily handle 1080p 120Hz. The quote you're hanging on to is so incredibly contradictory that only a person who is extremely deluded or has no idea about anything would cling to it.
     
  22. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    In the same game, rendering at higher FPS (due to lower resolution) results in a larger bottleneck than rendering at lower FPS (due to higher resolution). The data fully supports this.

    No, you're the one who's deluded. You're talking about the display interface. He's talking about the GPU interface (TBT/PCIe). Two totally different things. The monitor's resolution and refresh rate affect the bandwidth usage on the display interface. The frame rate that the GPU is outputting, which is separate from the monitor's resolution and refresh rate, affects the bandwidth usage on the GPU interface. He and I are both saying that higher FPS (due to lower resolution) is more bandwidth on the GPU interface, and lower FPS (due to higher resolution) is less bandwidth on the GPU interface.

    You're the one who's being obtuse by comparing different games in terms of absolute FPS values. You can't seem to grasp that different games have different inherent GPU interface bandwidth usage. However, a higher frame rate will require more bandwidth than a lower frame rate within the same game.
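The "same game, higher FPS = more TB3 traffic" point can be illustrated with a back-of-envelope model. All numbers below are illustrative assumptions (a fixed per-frame upload size, ~22 Gbps of TB3 usable for PCIe data), not measurements of any real setup:

```python
def tb3_traffic_gbps(fps: float, per_frame_upload_mb: float = 10.0,
                     width: int = 1920, height: int = 1080,
                     internal_display: bool = True) -> float:
    """Rough PCIe-over-TB3 traffic estimate for one game at a given FPS.

    Assumes a fixed host->device upload per frame (draw calls, buffers)
    plus a 32-bit framebuffer copied back when rendering to the laptop's
    internal display. Both per-frame figures are illustrative guesses.
    """
    upload = per_frame_upload_mb * 1e6                  # bytes per frame
    readback = width * height * 4 if internal_display else 0
    return (upload + readback) * fps * 8 / 1e9          # Gbps

USABLE_TB3_GBPS = 22.0  # assumption: roughly 22 Gbps of TB3 left for PCIe

for fps in (60, 144):
    used = tb3_traffic_gbps(fps)
    print(f"{fps:>3} FPS -> ~{used:.1f} Gbps "
          f"({used / USABLE_TB3_GBPS:.0%} of the link)")
```

With these made-up per-frame numbers, more than doubling the frame rate pushes the link from well under half utilized toward saturation, which is the direction of the effect being argued about here: traffic scales with frames per second, not with the monitor's refresh rate.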
     
    Last edited: Oct 15, 2018
    Maleko48 likes this.
  23. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Again, an absolutely baseless assumption. Where are the lowered-settings tests showing that more FPS means more bottleneck (aside from resolution, obviously)??
    As to why higher resolution = less bottleneck:
    The higher the resolution and the more stressed the card is, the smaller the performance gap becomes compared to the same card installed in a desktop. The bottleneck starts to shift from PCIe bandwidth back to raw GPU performance.

    Also, what you described is upscaling, NOT native 4K. The stress is bigger on both the GPU and CPU interface when playing at 4K. Otherwise we would see little to no performance difference between 1080p and 4K if it worked the way you're making it out to.

    Seriously, stop it; it's getting extremely embarrassing for you.

    To illustrate even further how correct I am: the performance drops are relative to how well or badly the game/program can utilize the GPU.

    The oldest and weakest Unigine test, which cannot fully utilize the GPU, has the highest performance loss; the second-newest has the second-highest performance loss; and the newest has the least performance loss.

    Meanwhile, Fire Strike runs poorly vs. Time Spy.
     
    Last edited by a moderator: Oct 15, 2018
  24. Mastermind5200

    Mastermind5200 Notebook Virtuoso

    Reputations:
    372
    Messages:
    2,152
    Likes Received:
    826
    Trophy Points:
    131
    Wow, all I can say is: what a big waste of money. Try to return that RTX 2080 Ti if you can. Even with an NVMe and external monitor setup, you're still bottlenecked to high hell.
     
  25. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    No, you're the one embarrassing yourself as per usual. You get burned so much in this forum, you're probably immune to it by this point.

    On one hand, you're agreeing with me with this statement:
    On the other hand, you're flaming and throwing in orthogonal variables such as differing graphical settings and upscaling which were non-factors in the test.

    Like earlier in this thread, you don't even realize that we're saying the same thing, so you can't decide whether to agree with me or insult me. Honestly it's hilarious to see the dichotomy and you struggling with your lack of self-awareness. Just make up your damn mind already, am I right or am I wrong, which is it gonna be? You can't have both. LMAO

    Anyway, I'm done wasting my time with you. It's like trying to explain things to a petulant child. Even the most basic terms go over your head and cause you to throw tantrums.
     
    Last edited: Oct 15, 2018
    Maleko48 likes this.
  26. ccarollo

    ccarollo Notebook Consultant

    Reputations:
    47
    Messages:
    281
    Likes Received:
    90
    Trophy Points:
    41
    Not sure what I expected posting here, seems like everything on this forum turns into people telling other people that they're doing something dumb.

    I mean, this is the eGPU section of the forum. I started out saying that I know that the 2080 TI is going to be significantly bottlenecked by the TB3 connection. Everyone knows this! Pointing out that it would be faster in a desktop is pointless, of course it is! But -- for my goals -- it works fantastically!

    As far as I can tell -- and please let me know if I'm wrong -- there is absolutely no other way to get a similar level of performance to what I'm currently getting, at the fan/noise/heat levels that I want. Yes, I posted a faster TimeSpy when I first got my laptop, but that was with the fans going full blast, in order to get the highest score possible, and I've already said that's a dealbreaker for me for normal use.

    Additionally, my laptop's panel is only 75Hz, which means that any increased bottlenecking from higher framerates isn't that big a deal -- it mostly shows up in benchmarks, not real use.

    Bottom line: If money isn't a big factor, and you want a quiet/cool laptop that has better performance than pretty much anything else on the market, I think this is a great way to go.
     
    Maleko48 likes this.
  27. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    There is. You could water cool your notebook like @bennyg and others have done.

    Then you got into the wrong market (gaming notebooks), when perhaps you would've been better served going the silent desktop route. Fan noise is inevitable on a gaming notebook, and if that is a dealbreaker, then you're simply not understanding the physical constraints of stuffing high-performance, high-heat components inside a small enclosed space.

    But this bottom line is incorrect. All the 1080 and 1070/1080 SLI laptops didn't just suddenly die and go to heaven.
     
    Maleko48 likes this.
  28. ccarollo

    ccarollo Notebook Consultant

    Reputations:
    47
    Messages:
    281
    Likes Received:
    90
    Trophy Points:
    41
    But I didn't! I have a laptop right now that's quiet and fast, and all I have to do is plug another cable in when I'm sitting on the couch with it! That's the entire point of my post. I guess I don't know why you're in this subsection of the forum if you're just going to tell everyone that they're making the wrong decision. There are obviously tradeoffs. I explained those tradeoffs in my original post. Given what I'm looking for, I'm totally okay with those tradeoffs.

    And no, I'm not interested in replacing the entire heatsink of my laptop, or water-cooling it (lol). That's way more work and risk than I'm interested in, when there's a plug-and-play option out there that I know works just as well.

    Are there 1080 laptops that can run with low temps on low fans and match the perf of my current setup? If so, great, point me at them; that's almost surely what I'll be buying when I next upgrade. If I can get comparable temp/noise/perf results without an eGPU, that's great -- I'll buy one of those and throw the 2080 Ti into my VR machine. But I'm not aware of any right now.

    Not sure why everything has to turn into a pissing contest here. It's really obnoxious, and it drives away people like me, when I spent some time putting together a post that I thought other people -- on the eGPU forum especially -- would find interesting and informative.
     
  29. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Like I said in my previous disclaimer, it's a cool little science experiment, but financially and functionally impractical for the vast majority of people. If anything, seeing this degree of performance degradation turns off people from considering the TB3 eGPU route altogether, when there are valid use-cases for it, such as a 1060/RX 580 eGPU with an ultraportable that lacks a dGPU.
     
  30. ccarollo

    ccarollo Notebook Consultant

    Reputations:
    47
    Messages:
    281
    Likes Received:
    90
    Trophy Points:
    41
    Given that it's probably going to give at least a year of life to my laptop, I don't think it's quite as financially impractical as you make it out to be.

    Another advantage is that, since I'm using a desktop part, it's easily transplantable -- like I said, if at some point I get a new laptop or decide that I don't want to use the eGPU anymore, I can take the 2080TI and put it in my VR machine, or sell it, or what have you. That's not really possible with a laptop GPU part.

    Fundamentally, anyone thinking of going the eGPU route should have their eyes open about the tradeoffs that they're making. If they're not willing to give up 20-40% of raw desktop performance, they shouldn't be doing it, and they SHOULD be turned off by the process. Thankfully, at the moment, that raw performance loss is still ahead of what most laptops can do, and lands in a better performance-vs-noise-vs-temperature place than any laptop that I'm aware of.
     
  31. Joikansai

    Joikansai Notebook Deity

    Reputations:
    254
    Messages:
    1,663
    Likes Received:
    704
    Trophy Points:
    131
    It's wasting 2080 Ti performance to run at 1080p, especially in an eGPU setup, imo; I would recommend getting a 1440p or 4K monitor. The drop is due to the TB3 bandwidth limitation and CPU bottleneck, so try a higher resolution. I'm playing SotTR well at 4K and getting almost the same performance as a desktop setup. Another thing to note: at high FPS, the data transferred between the host (laptop) and the device becomes the slow point, which causes a huge FPS drop at high frame rates.

    A TB3 eGPU setup also depends on your laptop's PCIe 3.0 lanes: 2 lanes will give lower FPS than 4 lanes, and you get about half the bandwidth (similar to TB2) if you use the laptop's internal monitor, due to the data looping back. In 3DMark, Fire Strike runs at 1080p, so you'll see a huge GPU score bottleneck, but not in DX12 Time Spy -- still lower than desktop, though not as much as Fire Strike. If you have the full 3DMark, you can compare at higher resolutions. Far Cry could run better if you use full screen instead of borderless, I think.

    The CPU temperature and fan noise, as you said, are a huge improvement, and that's one advantage of an eGPU setup with a gaming laptop. My setup is similar: the CPU never breaks 80 on most games and always stays in the 70s, since there's more cooling headroom from the inactive dGPU inside the laptop. Only AC Odyssey, which like the previous Origins stresses the CPU more than any stress test, can bring it into the 80s.

    I have a 144Hz 1070 Max-Q laptop that I use for esports titles to run over 100 FPS, but for 4K/1440p games I'm using the eGPU setup with an RTX card. Here's my post on egpu.io for comparison:
    https://egpu.io/forums/builds/rtx-2080ti-in-razer-core-v2/
     
    Maleko48 likes this.
  32. GK123

    GK123 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Greetings. I recently wanted to upgrade the GTX 1080 in my P870TM-1 to a 2080. The options are limited and pricey: an MXM 2080 upgrade kit for the Clevo costs 1799 CAD before tax, or for the same price I could get a 2080 Ti + Razer Core X RGB. I already have a custom waterblock, which has worked pretty decently.
    So which one would be more worth it?

    The waterblock is compatible with the 2080 MXM card, according to the seller.
     
  33. Mastermind5200

    Mastermind5200 Notebook Virtuoso

    Reputations:
    372
    Messages:
    2,152
    Likes Received:
    826
    Trophy Points:
    131
    2080 internal will likely lead to better performance.
     
  34. GK123

    GK123 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Why is it so expensive to buy spare parts in Canada!
     
  35. nuestra

    nuestra Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    1
    Trophy Points:
    16
    My 2080 Ti gets a 34,000 Fire Strike graphics score on an Alienware Graphics Amplifier with an Area-51m notebook. Another member here beat my score with 38,000 points, also using the Alienware eGPU on an Area-51m.
     
  36. Antek J

    Antek J Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Hi everyone,
    I just stumbled upon this thread while researching whether adding an RTX 2080 Ti eGPU to my Clevo P775TM1-G with a GTX 1080 is a good idea for reducing rendering times in 3D graphics.

    I understand that an eGPU is less performant than its desktop equivalent in real-time applications such as games, but what about 3D production GPU rendering engines such as Octane?

    Actually, I would be considering exactly the same solution as yours, @ccarollo. Maybe it is too much to ask, but I thought I would give it a go - would you be so kind and run one more benchmark for me? That would be the OctaneBench with both GPUs enabled, if you could also run the fans in performance mode to minimize throttling it would be awesome. It just takes a few minutes. I would really really appreciate it. It would dispel all my doubts after reading confusing stuff on different forums, and I think such information would be very beneficial for many others.

    I was also wondering if it's possible to connect two identical eGPUs at the same time for rendering, since our Clevos have two Thunderbolt ports. Has anybody maybe tried doing that? Do you think it's theoretically possible?

    I would really appreciate any feedback.

    Thanks!