So my eGPU bandwidth testing on the Titan XP is finished. A bit of good news for you guys.
@Mr. Fox @bloodhawk @anassa
Test Conditions:
CPU: 5960X OC
RAM: 16GB DDR4, quad-channel @ 3200MHz, 16-18-18-38-2
GPU: Titan XP with a custom fan profile, 120% power limit, and stock clocks.
I used GPU-Z to confirm the card is running at PCIe 1.1 x16 (which I believe has the same data rate as PCIe 1.0 x16 and PCIe 3.0 x4).
Firestrike Ultra:
PCIe 1.1 x16: http://www.3dmark.com/fs/9668106
PCIe 3.0 x16: http://www.3dmark.com/fs/9668539
Heaven 4.0:
PCIe 1.1 x16: http://imgur.com/a/dfOay
PCIe 3.0 x16: http://imgur.com/a/gcv1S
So far, very little difference between the two. It looks like the lower PCIe bandwidth is not heavily bottlenecking the Titan XP! Although it seems that during Heaven, PCIe 1.1 x16 did have a lower minimum fps; I am not sure if that's just run-to-run variance or something to do with the lower bandwidth.
However, remember that actual TB3 has about 10% less bandwidth than PCIe 3.0 x4 (I believe; see the sketch below), and I wouldn't recommend running a Titan XP as an eGPU on anything other than a highly clocked 6700K, especially if you are running it at 1080p. I would be very worried about CPU bottlenecks at 1080p even with an overclocked 6700K.
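A quick back-of-the-envelope sketch of why those links are roughly equivalent. The per-lane rates are the commonly quoted figures after encoding overhead; the ~10% TB3 haircut is just my estimate above, not a measurement:

```python
# Approximate per-lane PCIe data rates after encoding overhead:
# Gen 1.x: 2.5 GT/s with 8b/10b encoding    -> ~250 MB/s per lane
# Gen 3.0: 8.0 GT/s with 128b/130b encoding -> ~985 MB/s per lane
PER_LANE_MBS = {"1.1": 250, "3.0": 985}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_MBS[gen] * lanes / 1000

print(f"PCIe 1.1 x16: {link_bandwidth_gbs('1.1', 16):.2f} GB/s")  # ~4.00 GB/s
print(f"PCIe 3.0 x4:  {link_bandwidth_gbs('3.0', 4):.2f} GB/s")   # ~3.94 GB/s
# If TB3 really loses ~10% versus PCIe 3.0 x4 (my estimate, not measured):
print(f"TB3 (est.):   {0.9 * link_bandwidth_gbs('3.0', 4):.2f} GB/s")  # ~3.55 GB/s
```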
-
You can't make it run at x4 speeds? As in PCIe 3.0 x4, not PCIe 1.1 x16.
-
You should turn off four cores and repeat the testing, since most eGPU users will have quad cores at best.
-
So if you want to avoid a CPU bottleneck at lower resolutions, an overclocked 6700K is a must, IMO, unless you can maintain your target fps at 1080p even while CPU-bottlenecked.
-
Nice! Not much difference. The GPU I'll be running as an eGPU will be a 1080, and at 1080p tops. Not sure if systems without Optimus let the eGPU drive the internal display, though.
I don't know of any games that might get bottlenecked by a 6700K running in tandem with a Titan XP or a 1080, unless the PCIe lanes come into play. I might be wrong though, since I don't play too many AAA titles.
-
It's just that it may give you enough fps anyway. The Titan XP is a beast made for 4K/ultrawide; if you are on 1080p/1440p, just go with a 1070 or even a 1080.
-
Still deciding between the 1070 and the 1080, IF the Razer Core works.
HURRY the hell up with 1080 Strix stock, Amazon!
-
-
Any chance you would be able to hook this up to a dual-core laptop (e.g., a 6500U), since this is a notebook forum? I'd be interested to see how it handles.
I suppose you could also disable cores on your 5960X to test dual-core, quad-core, etc.
-
No, I don't have an eGPU enclosure at the moment, nor a TB3 laptop.
-
What is going to happen with a Dell XPS 15 with an i5, a Razer Core, and a 1070, on the UHD internal screen, playing The Witcher 3?
-
http://forum.notebookreview.com/threads/i-have-an-xps-15-9550-working-with-a-razer-core-ama.793250/ -
I have a 5820K with a 1080; anyone interested in a two-core benchmark?
-
While you are at it, would it be possible to lock the clock at 4GHz and the core count to 4, then run a few benches?
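For anyone wanting to approximate this without a BIOS trip: disabling cores properly is done in BIOS/UEFI (or via msconfig's boot options on Windows), but a rough stand-in is to pin the game's process to a few logical CPUs. A minimal sketch using the psutil package (the process name is a placeholder); note this only restricts scheduling rather than truly disabling cores, it does not lock clocks, and on a hyper-threaded CPU you need to mind which logical CPUs map to which physical cores:

```python
import psutil

TARGET_NAME = "game.exe"  # hypothetical process name; substitute the real one

# Pin every matching running process to logical CPUs 0-3,
# roughly emulating a quad-core machine for that workload.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET_NAME:
        proc.cpu_affinity([0, 1, 2, 3])
        print(f"Pinned PID {proc.pid} to logical CPUs 0-3")
```
-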
I know everyone likes to use established benching programs, but I honestly think they might not stress the connection enough. A friend of mine once plugged his 980 Ti into the wrong slot, and while Firestrike only showed a small percentage drop in GPU score (I made him test it), in WoW with a bunch of characters on screen he was extremely limited. It made his old 780 Ti look twice as strong; if I remember correctly, he couldn't even hold 30 fps at 1080p.
So I want to test some games where there are a ton of draw calls, or ones that are pretty bandwidth-dependent in general.
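On the same point: if you want to see the raw link speed directly rather than infer it from game fps, you can time host-to-device copies (NVIDIA's CUDA samples also ship a bandwidthTest tool for this). A rough sketch with PyTorch, assuming a CUDA build; the result should land near the GB/s figures sketched earlier in the thread:

```python
import time
import torch

assert torch.cuda.is_available()

N = 256 * 1024 * 1024  # 256M float32 values = 1 GiB
# Pinned host memory gives the best-case transfer rate over the link.
host = torch.empty(N, dtype=torch.float32, pin_memory=True)
dev = host.to("cuda")      # warm-up copy (allocation, context setup)
torch.cuda.synchronize()

reps = 10
start = time.perf_counter()
for _ in range(reps):
    dev.copy_(host)        # host -> device over the PCIe/TB3 link
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

print(f"Host->device: {reps * 1.0 / elapsed:.2f} GiB/s")  # 1 GiB per rep
```
-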
Brilliant. Thanks a lot, and to everyone else who is dropping good information. With TB3, using an eGPU really does look like it will be worth it. It's also non-Optimus, so having a nice FreeSync/G-Sync 1440p IPS monitor + eGPU (1070 or whatever big AMD chip comes in a year) seems like a very nice setup. As mentioned before by others, I am worried about some of the weak CPUs being shipped with TB3 laptops; in this case it makes a good argument for going with a laptop like the P750DM, if the TB3 connection will work with an eGPU.
-
@tgipier,
Any updates? Thanks for testing it out, by the way.
-
Okay, sounds good. I doubt the memory will make a huge difference. It will still give us a good idea of what to expect.
-
GTA V is very CPU-intensive, so I definitely would have expected that.
What resolution are you testing, by the way?
-
-
Yes, also make sure the games you test are running at 1080p or less. The higher your resolution, the less the CPU matters; at 3K/4K, most games become GPU-bound.
-
Regular Firestrike:
http://www.3dmark.com/fs/9682737 (full system)
http://www.3dmark.com/fs/9683064 (PCIe 1.1 x16, quad cores locked at 3.0GHz)
http://www.3dmark.com/fs/9683213 (PCIe 1.1 x16, dual cores locked at 3.0GHz)
Unigine Heaven, 1080p maxed:
http://imgur.com/a/k4Elp (full system)
http://imgur.com/a/nByVB (PCIe 1.1 x16, quad cores locked at 3.0GHz)
http://imgur.com/a/whAme (PCIe 1.1 x16, dual cores locked at 3.0GHz)
-
There is a 10 fps drop in Unigine. That is still pretty decent, not that much.
Thank you for doing these. I'll make a thread as soon as we manage to get the 1080 and drop it in the Core.
-
Surprised by the results. I thought having only two cores running would degrade performance much more.
Thank you.
-
-
The Witcher 3 is one example that uses both the CPU and GPU quite well, depending on the scenario.
-
Now we wait for @bloodhawk to test the Core.
-
Unfortunately, I believe you lose much more performance on an actual eGPU setup (I played around with the Razer Core). It's not the bandwidth that's holding it back; that contributes at most a 1-4% performance degradation. For some reason, benching Firestrike on an eGPU gives similar scores on a laptop and a desktop, but playing games you take a 20-25% hit on performance even with something like a 6820HQ. There's discussion that it's overhead from converting PCIe to TB3 signals, but I honestly have no idea why the degradation occurs.
-
Using the internal display adds an additional 10-15% performance degradation on top of the 20-25% hit I mentioned. I was on an XPS 15 benching against a Z170 desktop with a 980; it was my friend's rig. You can get the Razer Core working with a desktop motherboard and see the same performance degradation versus just slotting the GPU into the motherboard's native PCIe slots.
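Putting those two numbers together, and assuming the penalties compound multiplicatively (my assumption; I didn't measure the combined case separately), the internal-display scenario looks like this:

```python
# Assuming multiplicative compounding of the two penalties (an assumption):
for egpu_loss, internal_loss in [(0.20, 0.10), (0.25, 0.15)]:
    remaining = (1 - egpu_loss) * (1 - internal_loss)
    print(f"{egpu_loss:.0%} eGPU hit + {internal_loss:.0%} internal-display hit "
          f"-> ~{1 - remaining:.0%} total loss")
# Output: ~28% total loss (best case), ~36% total loss (worst case)
```

So on the internal display you could be looking at roughly a quarter to a third of the performance gone versus a native PCIe slot.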