The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Testing possible eGPU bottleneck with Titan XP

    Discussion in 'Gaming (Software and Graphics Cards)' started by tgipier, Aug 6, 2016.

  1. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    So my eGPU bandwidth testing on the Titan XP is finished. A bit of good news for you guys.
    @Mr. Fox @bloodhawk @anassa

    Test Conditions:
    CPU: 5960X OC
    RAM: 16GB DDR4, quad channel @ 3200MHz, 16-18-18-38-2
    GPU: Titan XP with a custom fan profile, 120% power limit, and stock clocks.
    I used GPU-Z to confirm the card was running at PCIe 1.1 x16 (same data rate as PCIe 1.0 x16, and I believe roughly the same as PCIe 3.0 x4).

    Firestrike Ultra:
    PCIE 1.1 x16: http://www.3dmark.com/fs/9668106
    PCIE 3.0 x16: http://www.3dmark.com/fs/9668539

    Heaven 4.0:
    PCIE 1.1 x16: http://imgur.com/a/dfOay
    PCIE 3.0 x16: http://imgur.com/a/gcv1S

    So far, very little difference between the two. It looks like the lower PCIe bandwidth is not bottlenecking the Titan XP heavily! Although it seems that during Heaven, PCIe 1.1 x16 did have a lower minimum fps. I am not sure if that's just run-to-run variance or something to do with the lower bandwidth.
    However, remember that actual TB3 has about 10% less bandwidth than PCIe 3.0 x4 (I believe), and I wouldn't recommend running a Titan XP as an eGPU on anything other than a highly clocked 6700K, especially if you are running it at 1080p. I would be very worried about CPU bottlenecks at 1080p even with an OC'd 6700K.
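The link-speed equivalence claimed above can be sanity-checked from the published per-lane signaling rates and encoding overheads. This is a rough sketch: the PCIe figures follow from the spec numbers, while the TB3 line is an assumption based on the commonly cited ~22 Gb/s usable PCIe tunnel (which would put TB3 further below PCIe 3.0 x4 than the 10% figure in the post; reported numbers vary).

```python
# Effective one-way bandwidth for the link configurations discussed above.
# PCIe gen 1.x signals at 2.5 GT/s per lane with 8b/10b encoding (80%
# efficient); PCIe 3.0 signals at 8 GT/s with 128b/130b encoding.

def pcie_bandwidth_gbs(gt_per_s, encoding_efficiency, lanes):
    """Effective one-way bandwidth in GB/s for a PCIe link."""
    return gt_per_s * encoding_efficiency * lanes / 8  # bits -> bytes

gen1_x16 = pcie_bandwidth_gbs(2.5, 0.8, 16)        # ~4.00 GB/s
gen3_x4  = pcie_bandwidth_gbs(8.0, 128 / 130, 4)   # ~3.94 GB/s

# Assumed TB3 PCIe tunnel limit of ~22 Gb/s usable (not measured here)
tb3 = 22 / 8                                        # ~2.75 GB/s

print(f"PCIe 1.1 x16: {gen1_x16:.2f} GB/s")
print(f"PCIe 3.0 x4:  {gen3_x4:.2f} GB/s")
print(f"TB3 tunnel:   {tb3:.2f} GB/s (assumed)")
```

So PCIe 1.1 x16 and PCIe 3.0 x4 land within about 2% of each other, which is why forcing gen 1.1 on a desktop slot is a reasonable stand-in for a four-lane gen 3 eGPU link.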
     
  2. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    You can't make it run at x4 speeds? As in PCIe 3.0 x4, not PCIe 1.1 x16.
     
  3. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    No, I can only control the PCIe generation. However, I believe PCIe 1.1 x16 is the same speed as PCIe 3.0 x4.
     
  4. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Yes, that's right. PCIe 1.0 x16 is about the same bandwidth as PCIe 3.0 x4.

    You should have turned off four cores and run the tests again to see how it goes, since most eGPU people will have quad cores at best. ;)
     
  5. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    It won't make a massive difference imo. I don't think those programs take advantage of more than 4 cores, if even that. :(

    So if you want to avoid a CPU bottleneck at lower resolutions, an OC'd 6700K is a must imo. Unless, even while CPU bottlenecked, you can still maintain your ideal fps at 1080p.
     
    bloodhawk likes this.
  6. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Nice! Not much difference. The GPU I'll be running as an eGPU will be a 1080, and that at 1080p, tops. Not sure if systems without Optimus let the eGPU drive the internal display, though.

    I don't know of any games that might get bottlenecked by the 6700K running in tandem with a Titan XP or 1080, unless the PCIe lanes come into play. I might be wrong though, since I don't play too many AAA titles.
     
  7. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    Oh, it probably will, considering games still have problems taking advantage of multiple cores...
    It's just that it may give you enough fps anyway. The Titan XP is a beast made for 4K/ultrawide; if you are at 1080p/1440p, just go with a 1070 or even a 1080.
     
    bloodhawk likes this.
  8. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Yeah, gonna shove a Titan XP in the workstation anyway.
    Still deciding between the 1070 and the 1080, IF the Razer Core works.

    HURRY the hell up with 1080 Strix stock, Amazon!
     
  9. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    I would go with the 1070 if it's only 1080p.
     
  10. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Any chance you would be able to hook this up to a dual-core laptop (e.g. a 6500U), since this is a notebook forum? I'd be interested to see how it handles.

    I suppose you could also disable cores on your 5960X to test dual-core, quad-core, etc.
     
  11. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    Sure..... I can actually do that. I will have it tested and posted later. I can adjust the clock speed as well to match the simulated system.

    No, I don't have an eGPU enclosure atm, nor a TB3 laptop.
     
    bloodhawk and J.Dre like this.
  12. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    What is going to happen with

    a Dell XPS 15 with an i5, a Razer Core, and a 1070,

    UHD internal screen,

    Witcher 3?
     
  13. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
  14. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    I have a 5820K with a 1080. Anyone interested in a 2-core benchmark?
     
    D2 Ultima likes this.
  15. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Yeap.

    While you are at it, would it be possible to lock the clock @ 4GHz and cores to 4, then run a few benches?
     
  16. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Think you can also bench Crysis 3, Witcher 3, GTA V, and Black Ops 3 if you have them?

    I know everyone likes to use established benching programs, but I honestly think they might not stress the connection enough. A friend of mine once plugged his 980 Ti into the wrong slot, and while Firestrike only showed a small percentage drop in GPU score (as I made him test), in WoW with a bunch of characters on-screen he was extremely limited. It made his old 780 Ti look twice as strong, and if I remember correctly he couldn't even hold 30fps at 1080p.

    So I want to test some games where there are a ton of draw calls, or ones that are pretty bandwidth dependent in general.
     
    bloodhawk likes this.
  17. anassa

    anassa Notebook Consultant

    Reputations:
    35
    Messages:
    210
    Likes Received:
    109
    Trophy Points:
    56

    Brilliant. Thanks a lot, and thanks to everyone else who is dropping good information. With TB3, using an eGPU really does look like it will be worth it. It's also non-Optimus then, so having a nice FreeSync/G-Sync 1440p IPS monitor + eGPU (1070, or whatever big AMD chip comes in a year) seems like a very nice setup. As mentioned before by others, I am worried about some of the weak CPUs being shipped with TB3 laptops; in this case it makes a good argument for going with a laptop like the P750DM, if its TB3 connection will work with an eGPU.
     
  18. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    @tgipier,

    Any updates? :) Thanks for testing it out, by the way.
     
  19. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    I just woke up. I will go mimic a 6700HQ system later: 4 cores at 3.0GHz, although I still have quad-channel memory.
     
  20. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Okay, sounds good. I doubt the memory will make a huge difference. It will still give us a good idea of what to expect.
     
  21. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    If you have some spare time I'd also love to see 2 cores @ 3GHz as well. If not, I can try and see if I can do this tomorrow on my 5820K + Fury X.
     
    hmscott likes this.
  22. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    Nvm, similar utilization issue on the full setup.
     
  23. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    GTA V is very CPU intensive so I definitely would have expected that.

    What resolution are you testing btw?
     
  24. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    3440 x 1440.
     
  25. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Oh, you should test at 1080p, man! At those resolutions (2K, 3K, 4K) games become more GPU dependent. At 1080p you will see much clearer results from this testing.
     
  26. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    Yeah, I probably should. Maybe I should just use normal Firestrike over Firestrike Ultra...
     
  27. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Yes, and also make sure the games you test are running at 1080p or less. The higher your resolution, the less the CPU matters. At 3K/4K most games become GPU bound.
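The reasoning in the last few posts can be captured with a toy model: frame rate is capped by the slower of CPU-side frame preparation and GPU-side rendering, and only the GPU side scales with pixel count. The numbers below are purely illustrative, not measurements from this thread.

```python
# Toy model of CPU-bound vs GPU-bound frame rates (illustrative numbers).
# CPU frame-prep cost is roughly resolution-independent; GPU render cost
# scales roughly with pixel count.

def achieved_fps(cpu_fps, gpu_fps_1080p, width, height):
    """Frame rate is the minimum of the CPU cap and the GPU cap."""
    pixels_1080p = 1920 * 1080
    gpu_fps = gpu_fps_1080p * pixels_1080p / (width * height)
    return min(cpu_fps, gpu_fps)

# Hypothetical card that renders 200 fps at 1080p, CPU caps frames at 120 fps
print(achieved_fps(120, 200, 1920, 1080))   # 120 -> CPU-bound at 1080p
print(achieved_fps(120, 200, 3840, 2160))   # 50  -> GPU-bound at 4K
```

This is why 1080p testing exposes CPU (and link) differences that Firestrike Ultra at 4K hides: once the GPU cap drops below the CPU cap, the CPU no longer matters.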
     
  28. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
  29. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Wow, the GPU scores in Firestrike pretty much stay the same.

    There is a 10 FPS drop in Unigine. That is still pretty decent and not that much.

    Thank you for doing these. I'll make a thread as soon as we manage to get the 1080 and drop it in the Core.
     
  30. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Surprised by the results. I thought it would be much worse with only two cores running.

    Thank you.
     
  31. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    It will probably be more apparent in games.
     
  32. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Not really. Most of the AAA titles that I was fooling around with don't use the CPU as much as we would like them to. At the end of the day it comes down to single-core clock speed if a game isn't all that optimized.
    Witcher 3 is one example that uses both the CPU and GPU quite well, depending on the scenario.
     
    hmscott likes this.
  33. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    bloodhawk and Papusan like this.
  34. PendragonInc

    PendragonInc Notebook Guru

    Reputations:
    17
    Messages:
    52
    Likes Received:
    26
    Trophy Points:
    26
    Unfortunately I believe you lose way more performance on an actual eGPU setup (I played around with the Razer Core). It's not the bandwidth that's holding it back; that contributes at most a 1-4% performance degradation. For some reason with an eGPU, benching Firestrike on a laptop vs a desktop comes up with similar scores, yet playing games you take a 20-25% hit on performance even with something like a 6280HQ. I honestly don't know why; there's discussion that it's the overhead from converting PCIe to TB3 signals, but I can't say for sure.
     
  35. bloodhawk

    bloodhawk Derailer of threads.

    Reputations:
    2,967
    Messages:
    5,851
    Likes Received:
    8,566
    Trophy Points:
    681
    Which system did you test on? Using the internal display will definitely cause degradation, and even more if most of the lanes are busy.
     
  36. PendragonInc

    PendragonInc Notebook Guru

    Reputations:
    17
    Messages:
    52
    Likes Received:
    26
    Trophy Points:
    26
    Using the internal display adds an additional 10-15% performance degradation on top of the 20-25% hit I mentioned. I was on an XPS 15 benching against a Z170 desktop with a 980; it was my friend's rig. You can get the Razer Core working with a desktop motherboard and see the same performance degradation versus just slotting the GPU into a native PCIe slot on the motherboard.
     
    Last edited: Aug 9, 2016
    hmscott likes this.
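Taking the figures in the last post at face value, the two hits can be combined to estimate what an internal-display eGPU setup retains of native performance. The arithmetic below assumes the hits compound multiplicatively (the post could equally mean them additively, which would give slightly lower numbers); either way it is illustrative only.

```python
# Combine the reported eGPU performance hits: a 20-25% hit for gaming over
# TB3, plus a further 10-15% when driving the internal display.
# Assumption: the hits compound multiplicatively.

def remaining_perf(*hits):
    """Fraction of native performance left after successive fractional hits."""
    perf = 1.0
    for h in hits:
        perf *= (1.0 - h)
    return perf

worst = remaining_perf(0.25, 0.15)  # 0.75 * 0.85 = 0.6375
best  = remaining_perf(0.20, 0.10)  # 0.80 * 0.90 = 0.72
print(f"internal-display eGPU retains roughly {worst:.0%}-{best:.0%} of native performance")
```

So on these numbers an internal-display eGPU would land somewhere around two-thirds of what the same card delivers in a native PCIe slot, consistent with the thread's conclusion that the link generation itself is not the main cost.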