Well, that's actually exactly what you should expect. As a rough approximation, performance is proportional to cores * frequency, and if you've got more cores running at the same wattage you should see a bit of a performance benefit, since pushing clocks higher means scaling several things (voltage among them) upward at once.
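Here's a rough back-of-the-envelope sketch of that idea in Python, assuming performance scales with cores × frequency and that dynamic power scales roughly with V²·f, with voltage rising roughly with frequency (so power goes roughly with the cube of the clock). The core counts and clocks below are made up purely for illustration:

```python
# Rough perf/power model: perf ~ cores * freq, and because voltage has to rise
# with frequency, power ~ cores * freq^3 (very rough, illustration only).

def perf(cores, freq_ghz):
    return cores * freq_ghz

def power(cores, freq_ghz):
    return cores * freq_ghz ** 3   # arbitrary units; only ratios matter

budget = power(4, 4.0)             # hypothetical 4 cores at 4.0 GHz
f6 = (budget / 6) ** (1 / 3)       # clock that puts 6 cores at the same power
print(f"6 cores fit the same budget at ~{f6:.2f} GHz")
print(f"perf ratio vs 4 cores @ 4.0 GHz: {perf(6, f6) / perf(4, 4.0):.2f}x")
# ~3.49 GHz and ~1.31x: more cores at lower clocks win at equal wattage.
```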
Bit over 300W from the wall, then the rest from the battery.
-
Near the end of this thread, post from 1/22.
http://forum.notebookreview.com/thr...titan-w-desktop-980-gpus-owners-lounge.785384
*Dang it, beat by Katiecat himself/herself!
* I want to add to what Katiecat said above. In all my additional testing recently I'm able to pull 330w from the wall every time, stock or overclocked. I was assuming power supply inefficiency would mean something like 250w at the notebook, but if the adapter is around 90% efficient it could actually be close to 300w at the notebook. -
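For what it's worth, the efficiency guess is easy to sanity-check against the 330w wall reading; the efficiency figures here are just assumptions, not measurements:

```python
# Power actually delivered to the notebook for a 330W draw at the wall,
# at a few assumed adapter efficiencies.
wall_w = 330
for eff in (0.76, 0.85, 0.90):
    print(f"{eff:.0%} efficient -> ~{wall_w * eff:.0f}W at the notebook")
# 76% -> ~251W (roughly the "like 250" guess), 90% -> ~297W, i.e. about 300W.
```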
I don't admire this engineering choice. -
-
Just... can't win with notebooks. <3 Desktops.
-
I did some more testing and found that the 2nd adapter, the one that is not getting warm, is drawing around 110w during peak Firestrike load when I get the maximum power pull. Nothing revolutionary; it just confirms that both are being used in parallel as expected, one just gets used a lot more than the other for some reason. I'm still guessing the warm one has a slightly higher output voltage, so the system draws from it first.
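That theory is easy to play with on paper. Here's a minimal sketch that treats the two adapters as ideal sources with a small series resistance feeding a shared rail; the voltages, resistances, and load current are made-up numbers (and it ignores any OR-ing diodes or current limiting the real hardware has), just to show how a small voltage difference skews the split:

```python
# Two adapters in parallel: V1/R1 and V2/R2 feeding a common rail.
# Solve for the rail voltage, then each adapter's share of the load current.
def share(v1, r1, v2, r2, load_a):
    v_rail = (v1 / r1 + v2 / r2 - load_a) / (1 / r1 + 1 / r2)
    return (v1 - v_rail) / r1, (v2 - v_rail) / r2

# Hypothetical numbers: 19.7V vs 19.5V outputs, 50 mohm source resistance each,
# ~17A total load (roughly 330W at ~19.5V).
i1, i2 = share(19.7, 0.05, 19.5, 0.05, 17.0)
print(f"adapter 1: ~{i1:.1f}A, adapter 2: ~{i2:.1f}A")
# ~10.5A vs ~6.5A: a couple hundred millivolts is enough to make one adapter
# do most of the work and run noticeably warmer.
```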
The second thing is that I measured the draw using my UPS. No idea how accurate it is, but it shows somewhat different data than my Kill A Watt does. During the same peak period in Firestrike that repeatably shows 330w or so on the Kill A Watt, the UPS shows the two adapters pulling closer to 400w. The display fluctuates from around 370 to 400+ every time, with similar 3DMark results, so I'm not pulling any more power than at any other time. This takes into account (subtracts) the 18w of "other" load on the UPS (display in standby, speakers sitting idle, etc.). The UPS peaks around 420-430ish on its display, including that small ~18w load.
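One possible explanation for the gap between the two meters, and it's only a guess: the Kill A Watt reports real power in watts, while a lot of UPS displays report apparent power in VA, which is always equal or higher. With an assumed power factor somewhere around 0.8-0.85, the numbers line up surprisingly well:

```python
# Real power (W) vs apparent power (VA): VA = W / power_factor.
# The 330W figure is the Kill A Watt reading; the power factors are assumed.
real_w = 330
for pf in (0.80, 0.85, 0.90):
    print(f"power factor {pf:.2f} -> ~{real_w / pf:.0f} VA")
# 0.80 -> ~413 VA, 0.85 -> ~388 VA: right in the 370-400+ range the UPS shows.
```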
Lastly, are our internal displays G-Sync compatible? I noticed that when I left it enabled on only the internal display my scores were much lower (because it limited the tests to 60fps). Since I thought it wasn't compatible, I didn't figure it would make any difference. Is there a simple way to confirm one way or the other, without loading up games and trying to spot tearing (which for me would be very subjective)? -
Pretty sure the internal display is not G-Sync compatible, but you shouldn't be limited to 60fps. Have you tried overclocking the display? I'm running 96Hz and have been running it overclocked for almost a year now.
Also, it seems you are drawing 400w total from the wall? Is that with a single PSU as well, or only with dual PSUs? -
*I missed your second question. The wattage readings from my UPS are the same whether I use one or two PSUs. I do feel it helps a bit to use both, since neither is being pushed to its limit; one is warm and the other is cool to the touch. I've had plenty of adapters run scorching hot, so that is nice in itself. Also, the system did restart once mid-test, and that was on the single PSU. Maybe that was just chance (50% after all), but I plan to leave both connected so they stay cooler. -
I just did some battery boost (MSI NOS) testing. From previous testing I estimated that at best battery boost was only pulling 13w from the battery, and at worst 26w. That seems low to me, but with only an 80whr battery the math works out. I'm using a figure halfway between those two, a 20w draw while on battery boost, for the estimates below.
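Running those draw figures against the 80whr pack gives a rough idea how long battery boost can last, assuming it stops pulling from the battery around the 30% charge mark mentioned below:

```python
# Rough battery boost runtime: 80Wh pack, usable down to an assumed ~30% charge,
# at the estimated 13-26W draw (20W as the middle figure).
pack_wh = 80
usable_wh = pack_wh * (1 - 0.30)   # ~56Wh before the low-battery limit kicks in
for draw_w in (13, 20, 26):
    print(f"{draw_w}W from the battery -> ~{usable_wh / draw_w:.1f} hours of boost")
# 13W -> ~4.3h, 20W -> ~2.8h, 26W -> ~2.2h
```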
After reaching the lower limit of discharge, which I think is 30%, performance drops by about 20% in FPS in my testing. I used 3DMark Firestrike and The Witcher 3. Firestrike lost about 2.5k points, from about 16k stock down to 13.5k, and The Witcher 3 went from 50+ FPS to about 40+, roughly a 20% FPS loss. Overclocking helped regain some of it; I was able to get back to 15k, which worked out to roughly a 10% FPS loss. This was all with the battery installed and charging. It seemed to charge through all my tests at a rate of 36w. So not only did the system lose the 20w of battery boost, it was also giving up another 36w to the charger, since it was now charging the battery. That's a net swing of 56w from best case to worst case.
Then I decided to try disconnecting the battery, to see if the system would score better with no battery at all, since nothing would be charging and that should free up another 36w. I was somewhat surprised to see my results were pretty much the same as with the battery installed and charging.
As a final test, with the battery reinstalled, I wanted to see what the battery was really doing in game once below the battery boost threshold. I ran HWiNFO64 and logged it while playing. The battery continued to charge, never stopping, but the rate dropped from the normal max of 36w to around 1w (it did fluctuate quite a bit in this short test, though). This explains my initial results above, and why taking the battery out didn't help: the system had automatically slowed charging down to almost nothing, so it was just too small a difference to matter.
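If you keep the HWiNFO64 logs around, it's easy to pull the charge-rate numbers back out of the CSV afterwards instead of eyeballing the live sensors. Here's a rough sketch; the file name and the "Charge Rate [W]" column header are placeholders, since the exact labels vary from machine to machine:

```python
import csv

# Summarize the battery charge rate from a HWiNFO64 CSV log.
# "hwinfo_log.csv" and "Charge Rate [W]" are placeholders -- check your own log
# for the real file name and column label.
rates = []
with open("hwinfo_log.csv", newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        value = (row.get("Charge Rate [W]") or "").strip()
        try:
            rates.append(float(value))
        except ValueError:
            continue  # skip blank or non-numeric samples

if rates:
    print(f"samples: {len(rates)}")
    print(f"min/avg/max charge rate: {min(rates):.1f} / "
          f"{sum(rates) / len(rates):.1f} / {max(rates):.1f} W")
```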
What I still don't understand, after all of this testing, is how roughly 20w can make a 20% difference in FPS. I mean, say each GPU is using 120w, so 240w total; you would think 20w less would cost less than 10% of the performance. I know you can't directly compare power used to performance, but with all else being the same in these tests I would still expect it to only reduce performance by 5-10%.
Maybe someone will find this info useful. -
Well, when the battery hits 30%, the GPUs are throttled to around 105W each, so that sounds about right. It's not that it performs terribly when it's throttling, but that they charge a $1000 premium and don't tell you that it can only outperform stock 980Ms for 3-5 hours at a time. I can live with the fact that heavily overclocked 980Ms can match even lightly OCed 980s on the right laptop, but that really is a bit much for me to stomach.
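To put rough numbers on the 20w question above: once the battery hits the cutoff it isn't just the ~20w of boost that disappears. Using the ~120w-per-GPU guess from earlier and the ~105w cap mentioned here (both ballpark figures, not measurements):

```python
# GPU power budget before and after the low-battery throttle.
before = 2 * 120   # ~240W across both 980s (earlier ballpark estimate)
after = 2 * 105    # ~210W once throttled to ~105W each
print(f"GPU budget: {before}W -> {after}W ({1 - after / before:.0%} less)")
# ~12-13% less GPU power, on top of the lost battery boost, which lines up
# much better with a 10-20% FPS drop than a straight 20W difference would.
```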
-
I did some comparing against my Sager with a 4940MX and SLI 980Ms. Even putting the stock SLI 980Ms with the better CPU up against the "gimped," no-battery GT80S with SLI 980s, the GT80 still does better by 5-10%. Nothing worth writing home about, but still interesting to note. -
It will depend on the power target for the GPUs. When you lost the battery power because of the low charge and then overclocked, you regained performance and got much closer to full power. That just means the throttle programmed for low battery is more aggressive than it strictly needs to be, probably for safety.
What were your GPU clocks, and was your CPU affected? If you don't overclock at all, you could probably test undervolting the GPUs, which might eliminate the battery drain entirely. -
Hooray for not being able to sleep!
I know you've all been hearing all sorts of negativity from me about the 6QF, so here's a positive note for a change.
My battery was not amused though. D: -
Very nice! You beat my high score from yesterday by 111 points! (18,412) I was stuck home during a blizzard and had nothing else to do.
-
You guys are already around 15% ahead of my best overclock, which isn't even stable since it can shut down my laptop, hahaha.
Very nice.