Try setting power management mode to prefer max performance in the Nvidia driver?
-
yrekabakery Notebook Virtuoso
-
When I searched for this problem earlier today, that was certainly the most common advice given, but nearly everyone, like me, already had it set that way, both in the default profile and in the application-specific ones. I have, in fact, tried toggling that particular setting back and forth to see if it has any impact on this issue, and as far as I can tell, it does not.
-
Has anyone here tried running nix on their DM or TM? I'm wondering how things like the fingerprint reader perform.
-
yrekabakery Notebook Virtuoso
Sorry man, I'm running out of ideas. Do you have GeForce Experience installed? Are Windows 10 Fullscreen Optimizations disabled for the game executable? -
Yes - I've tried both of those things. Thanks for your suggestions - I'm also out of ideas at this point. So to summarize, here are the things I've tried (both on and off) that haven't made any difference (the game has the low CPU/GPU glitch with any of these settings on or off):
- NVidia driver setting for Power Management Mode (both the global and app-specific settings in all combinations)
- Windows power plan (High Performance/Balanced - slider at max performance/Balanced - slider at mid performance)
- Windows Game Bar (completely off / on with Game Mode off / on with Game Mode on)
- Clevo control center (Performance setting / Entertainment setting)
- Executable properties (Fullscreen optimizations on / off)
- Windows scaling (fix blurry apps on/off, combined with overriding/not overriding exe's following these settings or not)
- Running executable directly (outside of Steam with its overlay)
- Running executable with elevated privileges
- Uninstalling GeForce Experience (which I had previously installed to see if it made a difference - it didn't. While installed, I also tried various combos of having GeForce's abilities enabled/disabled - things like shadowplay/battery boost/etc.)
- Running the game in fullscreen vs. borderless windowed
- edit: Thought of one more: G-Sync enabled/disabled or enabled only for fullscreen
So it's not so good for testing my system's thermal management, but on the other hand, if I decide to actually do a serious playthrough, it looks like it will be a breeze for my machine to handle! Looks like a stunningly beautiful game now that I've had a chance to visit various parts of the game world - why in the heck did they choose to start you off in such a dreary, uninspiring location? I had actually kind of shelved the game because the starting area was so unappealing - I just didn't feel inspired to try to complete the beginning quest.
Man, I've spent far too long on this today - going to bed now. -
yrekabakery Notebook Virtuoso
One of the weirdest things I've ever seen, to be sure. Thanks for trying though. At this point I'm wondering if the software reading the sensors might be bugged, or somehow Witcher 3 is so much more demanding on the overall system that it's triggering some kind of internal firmware throttling of both the CPU and GPU. -
After some testing, please find the pics below. I made some HWiNFO screenshots: under OW (approx. 10-12 mins) it says 197W, and under Superposition 203W. Yesterday it was 23-24C at home, so during normal summertime it could go higher. Still, at 1.05V without undervolting I am pretty OK with it; capping it around 0.9V would make it significantly cooler.
-
yrekabakery Notebook Virtuoso
Did you forget to include the HWiNFO screenshot? -
eBay has some, I just checked, if you can't get the part from your reseller. -
No, I have it on the laptop at home, I am working atm. Anyway, it turned out it is also a general GPU, so the show seems to be over.
-
yrekabakery Notebook Virtuoso
I remain unconvinced, and it seems you won't provide the proper info, which is unfortunate. Take a read over the last few pages. Your claimed results are better than all of ours, some with LM and heatsink tweaking, so we just want to know why. -
Meaker@Sager Company Representative
Could it be an internal diode reading off, as an edge case?
-
I do not understand your tone... and why you are so unfriendly and pushy... if you remain unconvinced, so be it, that is up to you. I have not shown off with the temps, just published my temps after 15 minutes of testing. My F5 has worse temps after Superposition with a 7700K, so I was surprised that the 751TM1 has good cooling. That is it. If you read my posts, I always provide fair information, even if I have bad results or worse ones than others... (read my CPU posts to see what path I went through). As I said, I will post the info (HW screenshots), but I could only post the pics I had on my phone. Even if I have 2C better temps, why is it so problematic? Why can't I have these temps? Don't get you at all... my friend has a 775TM1 with even better temps, simple MX-4 on his GPU... I will post the shots in the evening. Cheers
-
yrekabakery Notebook Virtuoso
Looking forward to the HWiNFO screenshots.
-
Don't worry, I am keeping my word, you will have it, and hopefully you can explain that my temps, performance, and machine are a piece of s!hit, and everybody can be happy after it
cheers
-
yrekabakery Notebook Virtuoso
No man, you got us all wrong. This is all just in the spirit of comparing information, not to put anyone down. -
Don't worry, I will post the info... I was also thinking about it... maybe there is some throttle, however the Superposition 16.4k result at 82C is convincing for me. It goes to 203W under Superposition, so that also seems good to me. Firestrike 20.1k is also convincing that it runs OK. As I have said, my F5's GPU is capped at 0.9V and does 15.8k-16k at 82-85C in Superposition. The F5 is comfortable with 150W. I still use my F5; the 751TM1 is on the desk in its second skin, so actually I only switch it on sometimes for testing. When I pass on my F5 I will have more time to play with the TM1 and try to lower the voltage to 0.9-0.95V; usually that does not ruin the performance but lowers the temps quite a lot. Cheers
-
Back at home, before the family got home:
1) Superposition result + HWiNFO
2) HWiNFO after Overwatch on Epic
3) Firestrike test
-
I wish you would have done a longer in-game test. Hitting 86C on the GPU after only 7 mins is crazy; I'm guessing it would have been throttling at 91C well before the 60-minute mark.
-
I will test more. I just left OW on the practice screen for about 10 mins (it was almost 8)... however on auto fan... I will test later with Fn+1 for 15-20 min... we will see.
And I'll try to cap the voltage at 1V to see how it improves... -
yrekabakery Notebook Virtuoso
Can you also figure out why your average GPU Power is so low? If the GPU was fully loaded the entire time, the average GPU Power should be much higher, like 150W+.
Edit: All these average values are low. Maybe you forgot to reset the values after starting the test, ran a test much shorter than 7 minutes and took the screenshot after it had ended, or were idling/at low load for most of the test?
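(A possible way to cross-check the HWiNFO averages, assuming a standard GeForce driver install that includes nvidia-smi.exe and that this mobile GPU actually reports power draw through it, is to log readings once per second during the run and average those afterwards, e.g. from a command prompt:

nvidia-smi --query-gpu=timestamp,power.draw,clocks.current.graphics,temperature.gpu,utilization.gpu --format=csv -l 1 > gpu_log.csv

If power.draw comes back as [Not Supported] on this card, then the HWiNFO sensor is the only source and the averaging question still stands.)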
-
No, I assume I reset it before I started OW, but I multitasked and left it in the background while I wrote a short mail. I thought it would be the same... I can redo it with a longer game session, as it was only 7-8 mins; however, the max FPS was around 70, so it was unplayable.
-
yrekabakery Notebook Virtuoso
I understand, 70 FPS is hard to play at in a competitive FPS. You can just sit in the training area with Epic 200% and let it run for 10 mins or so. They did remove the idle kick timer for the training map, right? Or at least extended it much longer. It used to kick you after a few minutes of inactivity, which is ridiculous.
-
Yes, but in practice mode I can leave it longer... I will test it again, however it is hot here today, but that should not count for that much.
-
yrekabakery Notebook Virtuoso
Use max fans.
That would also remove fan speed as a variable, since different BIOS/EC versions have different auto fan curves, and it's also dependent on the CCC setting.
After 8 mins of playing practice it reached 90C at 1.044V, then I switched it off, no miracle here
Most probably the background work resulted in the lower temps before. I will check a lower voltage later on, around 1V.
PS: I left the fans on auto, but I don't think that changes anything, it just reaches the peak temp later.
-
Interesting, but the voltage cap does not work... I set 1885MHz at 0.95V but it still gives 1.05V. Never mind, I will try tomorrow; however, the TM1 still has better cooling than the F5, which still surprises me....
-
yrekabakery Notebook Virtuoso
I just added +160MHz in the main interface and flattened out the curve at 912mV like this:
That gives me a minimum of 1848MHz@913mV unless the power or thermal limit is reached. -
160W seems about right
-
Btw, what could be a realistic price for the 751TM1? Technically brand new, however I bought it as a barebone with a 90-day warranty, which has expired. Specs: IPS 144Hz, 8700K, 16GB HyperX CL14 2400MHz, GTX 1080 (it has warranty till November), 256GB SATA3 SSD, international keyboard, Killer WiFi, new battery (the plastic cover is still on the screen)
-
I would estimate 2000-2500 for a used one, depending on how badly someone wants it or how quickly it needs to sell. I would say no warranty on the entire machine will mean closer to 2000.
-
Thx for the opinion. I definitely would not sell it for 2k; true, it no longer has a warranty, however it is brand new.
Kinda good laptop though.
Can somebody send me a stock GTX 1060 vBIOS from a TM1 system? Thanks in advance! -
I tried, but GPU-Z won't let me dump the BIOS. It's giving me the error that BIOS reading is not supported on this card.
-
GPU-Z is not good for this, could you try with nvflash? It works for sure.
If I remember correctly: "nvflash -b backup.rom"
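(A minimal sketch of the dump, assuming the 64-bit Windows nvflash build run from an elevated command prompt; the exact option names can vary between nvflash versions, so check nvflash --help first:

nvflash --save backup.rom

The long --save form and the short -b form should do the same thing, a read-only backup of the vBIOS to a file, so nothing gets written to the card.)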
-
-
Thank you very much!
-
Hi, I also have a 99% new EVOC P751TM1-G which I bought in March but seldom use. What would be a fair resale price? It's still in warranty. Configuration: 8700K delidded and OC'd to 4.8GHz, Prema BIOS, 16GB DDR4-2400, 256GB SATA SSD, GTX 1070 8GB, 4K AUO IPS display, Intel AC9260. Thanks for any suggestions.
-
Meaker@Sager Company Representative
It's going to be tricky to sell to knowledgeable people on here with upcoming changes.
-
ThatOldGuy Notebook Virtuoso
Still respectable specs, and there are some people who value the CPU over upcoming GPUs.
Probably about $1700 -
All right. I thought it should be worth more than that.
-
I figured a used one with a GTX 1080 and 32GB of memory would be worth more like 2300 (new 3000?), so 1700 sounds about right for a GTX 1070 and 16GB.
-
So does this laptop use MXM GPUs that can be upgraded?
-
MXM yes, upgradeable to a 1080 for sure, but beyond that who knows. Same with the CPU, up to an 8700K for sure, beyond that who knows.
-
That's cool. I guess it's always a gamble whether they continue to make MXM cards for the next generation or not. But it would at least mean that if you damaged a component trying liquid metal, you could buy a replacement.
Thanks. -
Meaker@Sager Company Representative
Plus you could replace it with a slower card if it's relegated to 2D duties later on.
-
I could increase the RAM to 32GB and include a 1TB PM961 NVMe SSD. Then it should be worth more than 2000, agree?
Rumors say Intel's 9th gen LGA processors will be compatible with Z370. So it may be possible to upgrade the 8700K to an i7-9700K (8C/8T) or even an i9-9900K (8C/16T). -
ThatOldGuy Notebook Virtuoso
Actually, you probably would get more money (for the RAM and SSD) by selling them separately. -
Also, the 9900K and 9700K won't be using paste TIM; instead, a soldered heat spreader will be used. So it's good news for thermal management.
-
I wasn't aware they were making new CPUs so soon. Hmm... better hold off until those are released.
-
These are rumors. They might be true or not. But we will see soon. It’s almost August 1st.