That one was at +207 core / +268 memory
-
Ah... that's not good then. I thought it was OK for stock.
http://www.3dmark.com/fs/9860979
Similar clock. I think it's around +190/180 on core and +450 on memory. -
Well, that's right in line with the 20 percent loss most people are reporting. Just need to figure out why! I'm not convinced it's just TB3
-
Ahh, will he be able to do a stock run?
@Johnksss@iBUYPOWER @Mobius 1
Need your help overclocking this card. For now, I can't go even a MHz beyond +125/+150, which gives me a turbo up to 2012MHz. Perfectly stable. Max temp I have seen is 65C.
Do i need to flash a different BIOS ?
Now that explains it. -
I haven't heard back from him since the last run... But if you filter 3DMark Ultra scores by GTX 970M you will see all his runs on his Razer with the Core.
That way you can compare clock for clock -
Interesting... this means all laptops with internal gsync displays and no MUX switch will never be capable of using an eGPU without a second screen? It makes some form of sense, though. Passing through the dGPU was never really a thing; only the iGPUs. It's possible nVidia's drivers have no allowance for it, and only Intel and AMD do. Why would anybody have had to use an Optimus-esque solution for a dGPU-only machine in the past?
-
You need lower temps than 65c to sustain a given MHz.
The card will drop a 13MHz bin every 5C (after 50C), so your target temperature should be around 49C or lower. Pascal also benefits greatly on OC from lower temps. Do a repaste with liquid metal if you can.
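To make that concrete, here's a rough sketch in Python of the temperature bins described above. The 50C knee and the 13MHz-per-5C bin are the figures from this thread, not official NVIDIA specs, and the rounding at the bin edges is my assumption:

```python
# Rough model of Pascal temperature throttling as described above:
# the card sheds one ~13 MHz bin for every 5 C above 50 C.
# The knee (50 C) and bin size (13 MHz per 5 C) are the numbers
# quoted in this thread, not official specs.

def throttled_clock(boost_mhz, temp_c, knee_c=50, bin_mhz=13, step_c=5):
    """Estimate the sustained clock at a given GPU temperature."""
    if temp_c <= knee_c:
        return boost_mhz
    # Round up: any part of a 5 C step costs a full bin (an assumption).
    bins_dropped = -(-(temp_c - knee_c) // step_c)
    return boost_mhz - bins_dropped * bin_mhz

print(throttled_clock(2012, 65))  # -> 1973 (three bins, ~39 MHz lost)
```

By that model, holding 49C or lower keeps the full boost clock, which is why the target above is 49C.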
Memory (GDDR5X, Micron) should be able to get at least +400. Most cards are stable at around +450 to +500 before the error correction kicks in and you get negative scaling from memory OC.
+125 on core is a lot, especially for a pre-overclocked card. I suggest you reset that for now (apply 0 on afterburner).
Open afterburner settings and select "unlock voltage: third party, force constant voltage". Click apply.
Open the voltage/frequency curve (CTRL+F) and select the 1.093v voltage point. You should be able to click on that point and adjust it with the keyboard arrow keys to increment the MHz added to the final boost clock. Once you set it (say +13MHz), press L on that particular voltage point; MSI Afterburner should now lock the card to operate only at that voltage whenever it is running a 3D load.
If your card stays at a high voltage on idle, use Nvidia Inspector's "Multi Display Power Saver" to force the P8 idle state below a specified GPU load (I set mine to kick into P0 whenever GPU/VPU usage crosses 13%). -
You need an iGPU to control the internal display since the desktop cards don't have any display signal lanes on the PCIe connections. -
Yeah, on further reading, it seems like the laptop needs Optimus / a MUX to use it in loopback. Not a big deal for me, since I only use external monitors.
Gotcha. I was trying to use the ASUS GPU Tweak app, like I did with my 980s in the desktop. But it seems like it's become garbage since then. I can't even set a custom fan curve since it's bugged or something.
Will definitely do a repaste with CLU tonight or later next week.
Does MSI Afterburner / EVGA Precision let you do custom fan curves?
I'll give this a shot later tonight.
Just touched 18536 @ +125 / +300 without any extra voltage and the fans @ 80%.
http://www.3dmark.com/3dm/15193649
http://www.3dmark.com/3dm/15193762
http://www.3dmark.com/3dm/15193921
This leads me to believe we aren't saturating the TB3 link. There is definitely a conversion overhead, but definitely no saturation. I'm sure we should be able to eke out another 10%. -
Afterburner lets you control the fan, yes.
I recommend 100% speed @ 48C -
Run that by me again?
-
I meant for normal use. I usually like to keep the fans @ 30-40% for idle/light use and ramp them up at 40C+.
-
I had absolutely no idea it throttles after 50C... that's ridiculous and very disappointing. Wow! I mean, it is what it is, but that's crazy.
Sent from my iPhone using Tapatalk -
The desktop cards are not capable of sending a video signal down the PCIe lanes if you don't have an iGPU (or at least that's the theory derived from the Core + GA).
For normal use, my 1080 stays completely passive.
Additionally, you can grab the 1.2v XOC 1080 STRIX BIOS from here. Just make sure you only alter the 1.2v target and lock it. -> http://forum.hwbot.org/showthread.php?t=159025
At 50C it throttles from the temperature limiter. That's not accounting for the power limit and maximum voltage limit. -
Throttles with any OC? Not from stock OC settings out of the box though, correct?
-
Doesn't make much sense. What I find more likely is that the dGPUs aren't capable of passing through data to the internal screen. Even though in effect it ends up being the same thing.
-
Pretty much the lack of a MUX, which acts as a data pipe, directing the flow.
-
Looking at some of the results in that thread, my scores aren't all that far off. I added FSU/FSE/Time Spy a few posts back.
Will give the BIOS a shot later tonight, thanks. -
Ok let's see here.
Base clock is a guaranteed clock, while boost is the theoretical max the card is allowed to run at.
When you overclock, you add +13MHz (for example) to the boost clock. The base stays the same (it seems to be set in the BIOS). So in your case the theoretical max would increase from 2000MHz to 2013MHz.
It could stay at 2013 if the temps are below 50C and the power limit is not exceeded. But most of the time it's not that way.
It's easy to briefly short the resistor near the PCIe power connector to get past the second limitation, but staying below 50C is probably only attainable if you fit a waterblock or use the EVGA AIO kit with liquid metal + extreme-duty 120mm fans.
Speaking of that, I think someone managed to run an eGPU on a muxed AW18. -
Zelda and .... ?
-
That's crazy.... I wonder if the 1080 Ti would be a better performer or overclocker. Probably worth waiting to determine the best G4C (GPU for Core, lol)
-
I doubt it. Optimus doesn't use a MUX. If it did, we'd be able to set dGPU-only on those machines with a BIOS hack even where it isn't naturally available. Optimus is designed for the iGPU to pass data through to the internal screen while using another video card as the number cruncher; this is why eGPU solutions work on Optimus notebooks: it's just another video card passing through. Hence why I think the nVidia dGPUs simply lack the capability to pass through.
Internal screen? On the iGPU? It might make sense like that; since it's an SLI board, if you're on the iGPU you could still use it as a passthrough. -
This applies to all Pascal cards, regardless of where you put them.
For a Thunderbolt eGPU, the highest card you can run without a bottleneck is the 1080 -
That's actually not true, as proven by Titan owners vs 1080 owners. I have not seen a single 1080 in a TB3 eGPU score as high as the Titan. Granted, there's still a loss vs a desktop setup, but TB3 is definitely not capping out.
There seems to be a bottleneck no matter what card you use. But it's not down to one specific limit, which makes it that much more confusing.
-
You still need help or did you get it figured out?
-
Part of it. Was able to do +125/+300 without any extra voltage.
Will post a few questions later tonight, once I'm back. -
Ok, I should be back then also.
-
I initially read "once I'm black" and was trying to figure out whether you would use paint or some ancient ritual before I realized I must be tired.
-
Go to bed, man. Rest a little. I'm somewhat chocolatey brown as it is.
-
Well technically you shouldn't need to touch the voltage at all. It scales.....
-
Hmm, I was reading about people going up to 1.15V or something. Mine uses 1.063V at stock, I think.
Gotta double check. -
Don't bother with voltage for now.
-
They flashed a higher vBIOS, but if your temps don't line up... it only makes things worse...
-
Yeah, more concerned about lining up my GPU scores for now. But I did manage to improve the score by about 300 points after a mild OC.
Gotcha. I did download the XOC BIOS, haven't flashed it yet. What's your take on it? Or is there a better one out there?
I don't quite understand how the temperatures not lining up relates to the voltage? -
Ah, gotcha. Now it makes sense.
Two limits kicking in.
The power limit @ 120% managed to draw 227W @ 2035MHz boost.
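As a sanity check on those numbers, the arithmetic in Python: if 227W was drawn with the power-limit slider at 120%, the implied 100% (stock) power target is roughly 227 / 1.2, or about 189W. This is just back-of-envelope math on the figures above, not a measured spec:

```python
# Infer the 100% power target from a reading taken at a raised limit.
# Pure arithmetic on the numbers posted above (227 W at a 120% slider).

def stock_power_target(measured_w, limit_pct):
    """Implied 100% power target given a draw at limit_pct percent."""
    return measured_w / (limit_pct / 100)

print(round(stock_power_target(227, 120)))  # -> 189
```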
Let's see how much CLU helps. -
Just wish I didn't have such a crappy gpu in the lottery :|
-
I'm sure it's better than mine
-
Meaker@Sager Company Representative
Over 50C the speed can go down a bin or two but should remain stable over that.
The other monitoring is on voltage levels which can still kick in even with the shunt mod. -
Here is where I'm confused as hell: my FS Ultra and Time Spy GPU scores seem to be in line with, or not far from, some of the 1080 MXM scores. Seems like the scaling is better with heavier loads.
-
Meaker@Sager Company Representative
Compared to firestrike standard?
-
Well, the 1080 MXM is the same chip.
I think I hit around 22-23k on FS Standard -
FS Standard is where the discrepancy is the highest.
Yeah, that's correct, but FS Standard is where the scores are furthest apart. Once you start looking at stock FS Ultra / Time Spy, they are either super close or fairly similar.
http://www.3dmark.com/compare/fs/10111800/fs/10344058
http://www.3dmark.com/compare/spy/422324/spy/527787
http://www.3dmark.com/fs/10344113 -
Sorry for a newbie/stupid question, but do I understand correctly that to run an eGPU on a P870DM(-G) you absolutely need an external display? Or are there workarounds? I did read the discussion on the last few pages of this topic but still don't really understand.
I mean, the whole point of an eGPU for me was to add just a small-ish box with a GPU and be good, even if it cost 10-20% performance due to the additional data transfer to the laptop's display. -
Yeap, and that will be the case with any laptop that only uses a dGPU.
Also, a question for the masses: why would anyone want to use the internal display with an eGPU? You are already tethered to a desk, so why waste resources on the internal display and lose performance at the same time? eGPUs are meant to give you an extra performance boost over what's already in your system. By doing loopback to the internal display you increase that performance loss to about 40%. -
Tell that to Razer.
-
Thank you for the answer, and I agree with your point to an extent. My view is that I just like how compact a laptop is, the all-in-one design, not taking much space on my desk, and while it is not exactly ultra-mobile I can move it with moderate effort. A GPU case would not make the situation much worse. But if I need to buy a display and build something like an actual desktop setup, then I'm probably leaning towards a Volta-based desktop as my next PC.
-
Hello Bloodhawk. Thank you for shedding an abundance of light on this whole eGPU ordeal. I have been planning to use an eGPU to keep my laptop capable for years to come since the day I bought it. Unfortunately I've been hearing a lot about the performance loss, and how buggy it is to get working, on other laptops that is. But it's good to see it is very possible and simple to set up, as you've shown.
I have the 6700K with a desktop GTX 980. I was actually considering upgrading to the new DM3 for about 3k, for the 1080 SLI, and swapping parts... But thinking about it, that might be a huge mistake and I may well find myself either continuing to waste money in the long run, or just building a desktop, which is what I've decided to do later on when the new Intel chip comes out. But in the meantime I'd like to run an eGPU setup.
Now, coming from a 980, would you recommend running a 1080 eGPU? I see that there is a 20% loss in performance, and seeing as a 1080 gives about a 30 FPS boost, would it be safe to say I'd only see a drop of about 6 FPS due to the eGPU? Is that reasonable? If that's the case it seems a bit worth it, I think...
Also, I noticed you listed the hardware specs of your setup. I got a bit confused: are you using the Akitio enclosure, or the Inxtron? Or both? I'm looking to purchase one or the other by this week, whichever you recommend. Thanks a lot. -
Yeap, it's rather easy. Mostly plug and play.
I was playing Doom yesterday, and with everything maxed out the fps didn't drop under 140 @ 1440p. So coming from the 980 it's definitely a good boost. But unlike in desktops, where the jump is about 50-60%, as an eGPU over TB3 it seems to be about 30-35%.
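A quick back-of-envelope in Python using the rough numbers above (~50-60% desktop uplift over a 980, ~20% TB3 overhead). These percentages are ballpark community figures, not benchmarks, so treat the output as an estimate only:

```python
# Estimate eGPU fps from a baseline on the old card, assuming a
# desktop-class uplift and a flat TB3 overhead. Both percentages are
# rough figures from this thread, not measured constants.

def egpu_fps(baseline_fps, desktop_gain=0.55, tb3_loss=0.20):
    """Baseline fps on the old GPU -> estimated fps via TB3 eGPU."""
    return baseline_fps * (1 + desktop_gain) * (1 - tb3_loss)

print(round(egpu_fps(60)))  # -> 74
```

With a 50-60% desktop uplift and a 20% link loss, the net gain works out to roughly 20-28%, in the same ballpark as the 30-35% mentioned above.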
At the end of the day, it's up to you to decide whether that much of an increase is worth the investment. But then again, the whole setup cost me under $900 (I got the TB3 HDK through my workplace).
There will always be people saying it's not performing as well as it should. But with new tech there are always bugs to be ironed out. And as far as TB3 is concerned, it's a rather closed interface, so there is not much info out there. (Thank Apple and Intel for that.)
It's the TB3 HDK from Inxtron. They are the ODM/parent company of Akitio, so it's basically the same thing.
I would suggest waiting another week or so. I have something coming in later this week, to see if that solution is better than the TB3 implementations.
________________________________________________________________________________________________________________________________________________________
Guys, please refrain from posting eGPU-related queries in this thread. It's meant for OC-related posts. I'll create a dedicated P870DM-G / TB3 eGPU thread in a few hours and will ask one of the mods to move all the related posts there.
Feel free to PM me in the meantime. I'd be happy to answer your questions.
Clevo Overclocker's Lounge
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Mar 4, 2016.