I do not like these 1.4-1.5V spikes... I will try constant voltage and see
-
yrekabakery Notebook Virtuoso
raz8020 likes this. -
And what is "VR CURRENT LIMIT" in the overclock menu in BIOS? I will try the static voltage and report back to you guys on the behaviour of the TM1.
-
Meaker@Sager Company Representative
Voltage regulator current limit, i.e. how much current the VRMs are allowed to push through.
raz8020 likes this. -
However the machine runs OK: Superposition 16500 points with GPU max temp 82C, Fire Strike around 20000 points, and the temps seem OK. I will post some screenshots later on. I am trying to figure out the best voltage at 4.7GHz; I do not want to push further... This machine - the P751TM1 - seems to be a very nice build; the only thing I do not like ATM is the BIOS and the voltage fuss... -
Tried undervolting during gameplay today. Here is HWInfo data after over 2 hours of Fallout 4 at Ultimate settings (Full-screen 1920x1080, framerates capped at 90fps). Ambient temps are 75F, with notebook on a lap desk with an aluminum stand to keep about an inch of space under the rear of the machine and the surface. Using a -130mv undervolt, which seems completely stable under all conditions - idle, stress testing, benchmarking, gaming. Stock frequencies, fans on auto (and very quiet). Windows High Performance power plan, Clevo Control Center on "Performance" mode.
-
Last edited: Jul 22, 2018
-
That said, according to HWInfo, my BIOS is 1.05.10RLS2, dated 2/26/2018, which was what came on the machine as directly purchased from the Sager Notebooks website.
I am simply using the Clevo overclocking utility to try and establish a stable undervolt offset that will minimize temps, since most of my games (my most demanding normal usage of my computer) are not much of a challenge for a machine with this kind of hardware - I'm nowhere near needing additional clock speeds for my daily machine use.
I had initially tried setting a negative voltage offset in my machine's BIOS, but couldn't get it to boot, so I've just returned to using the provided Clevo-supplied software for doing so.
So I'm not trying to run at higher than stock clocks, other than doing a bit of playing around with Cinebench just for the lols. Note that for the 8086K, the stock multipliers are 50x, 46x, 45x, 44x, 44x, 43x for 1-2-3-4-5-6 cores in active use. As you can see from my HWInfo shot above, this seems to result in the expected 4.3ghz overall average during extended gaming, since I suspect Fallout 4 is designed to use all available cores. Therefore, I'm not pushing the 8086K chip as hard as you are your 8700K if you are using 4.7ghz multipliers across the board.
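The per-active-core turbo behavior described above can be sketched as a tiny lookup. This is a hedged illustration using the stock i7-8086K multipliers quoted in the post; `turbo_freq_ghz` is just a hypothetical helper name, not real tooling:

```python
# Hedged illustration (not official tooling): Intel Turbo Boost selects the
# multiplier from the active-core count. The table uses the stock i7-8086K
# multipliers quoted in the post above.
TURBO_MULTIPLIERS_8086K = {1: 50, 2: 46, 3: 45, 4: 44, 5: 44, 6: 43}
BCLK_MHZ = 100  # standard base clock

def turbo_freq_ghz(active_cores: int) -> float:
    """Turbo frequency in GHz for the given number of active cores."""
    return TURBO_MULTIPLIERS_8086K[active_cores] * BCLK_MHZ / 1000

# With all 6 cores busy (as in Fallout 4), the expected sustained clock is
# 4.3GHz -- matching the HWiNFO average reported above.
print(turbo_freq_ghz(6))  # 4.3
print(turbo_freq_ghz(1))  # 5.0
```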
I will say that, during Cinebench testing, I was able to increase my CB scores up to the point where I was using 48x multipliers across all 6 cores, but setting them at any higher multipliers resulted in lower scores for me. I posted my best score (1543cb, I think) on the overclocking thread a couple of days ago. -
yrekabakery Notebook Virtuoso
So you tell me which is preferable. Loadline 0 with more spastic (but not unsafe) idle VID behavior and load stability at lower VID. Or Loadline 1 with VID locked at your override voltage at all times, but a higher override needed for stability leading to more throttling. -
However I do not consider it a problem, or something that bottlenecks the usability of the laptop; it is just a little annoying to see those temps in HWiNFO. All in all it is a great laptop. After some testing it easily beats 20k in Fire Strike, the GPU maxes at 1950MHz reaching 200W, and under heavy stress it was 82C. So I was surprised in a positive way.
-
yrekabakery Notebook Virtuoso
What are your GTX 1080 temps when playing Witcher 3 maxed out with unlimited FPS? -
I have tried Overwatch with uncapped FPS at Ultra; it was 82C max after 30 mins at 99% GPU load. I am amazed how it can be 82C@200W@1.05V@1950MHz.
-
PS: I have cooling thermal pads on the graphics card; it might help as well. -
How much do you score in the Superposition basic benchmark (medium)? -
yrekabakery Notebook Virtuoso
I don't remember if the OW main menu is capped at 60 FPS or not. If it is, increase render scale to 200% at epic settings or find something else that will load the GPU power constantly at 190W and let it sit for 10 min.
I use Witcher 3 for heat testing because it uses far more GPU power than any other game or 3DMark/Unigine bench. I get 88C max at [email protected] 180-190W. This is with my heatsink flattened, liquid metal, and extra mounting pressure from 2 paper clips under each retention arm. I also ordered an upgraded heatsink from China, arriving in a week or so, with more pipes over the GPU core; hopefully that will lower GPU temps further. -
16.4k, 82C -
yrekabakery Notebook Virtuoso
Still no idea how you can use 200W constantly on a card limited to 190W... -
I don't know either, but does it count? I assume you are satisfied with yours; every chip is different, maybe mine is better. Don't forget, God plays for X
My CPU spikes, my GPU is fine, so it's equal.
BTW, on my F5 I have similar settings to yours, 0.9V@150-160W, 82-85C under full stress. -
Falkentyne Notebook Prophet
With the AC DC = 0 (auto) setting, that 1.2V would spike to 1.3-1.35V at full load on an MSI board, because since there is no vdroop, the Auto (1.80 or 2.10 mOhm) setting causes a REAL voltage spike at load. So only on MSI boards do you have to be worried (and you can tell by the temps!).
Clevo and eVGA boards do NOT use an "internal" loadline calibration setting, so the IA AC DC setting has to be kept at default. Or if they do, someone with a digital multimeter would have to find the read points to check the true Vcore going into the CPU at full load (not just what the VID says), but this can usually be "guesstimated" by comparing the power draw and temps at different voltage levels.
Setting IA AC DC loadline to 1 on Clevo boards causes instability because there is massive vdroop (this is what "loadline calibration" prevents from happening). So setting 1.2V on your Clevo with IA AC DC loadline=1 will result in about 1.1V at FULL LOAD. The spikes you see are actually happening at very light load, BEFORE vdroop occurs.
Note: the IA AC setting controls the VOLTAGE RISE at full load (before vdroop), and the IA DC setting controls the VID DROOP (not to be confused with Vcore droop!) at full load. Setting IA AC loadline to 2.10 mOhm (value=210) on either an MSI or Clevo board, setting IA DC loadline to 800, and setting a static Vcore of 1.20V, will cause the VID to drop to under 1.0V at full load. This clearly is not the true VID - the DC setting only changes the REPORTED VID. (The AC and DC settings are presets to Intel design characteristics for how adaptive voltage and vdroop are "supposed" to be handled.)
Desktop boards tend to ignore the IA AC DC setting completely when static voltage is used (the VID will be shown very high, but desktop boards also have a VCORE sensor, and the Vcore will be MUCH lower than the VID).
raz8020, Papusan and yrekabakery like this. -
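The vdroop arithmetic described above can be sketched roughly like this. The current and loadline numbers below are my own illustrative picks, not measurements from any board; note that the AC/DC loadline values are in 0.01 mOhm units (210 = 2.10 mOhm, 800 = 8.00 mOhm):

```python
# Rough sketch of the vdroop arithmetic described above; the current and
# loadline numbers here are illustrative picks, not measurements.
def vcore_under_load(v_set: float, i_load_a: float, ll_mohm: float) -> float:
    """Real Vcore at load: the static setting minus I*R vdroop."""
    return v_set - i_load_a * ll_mohm / 1000.0

def reported_vid(vid: float, i_load_a: float, dc_ll_mohm: float) -> float:
    """The DC loadline droops only the VID that monitoring tools report."""
    return vid - i_load_a * dc_ll_mohm / 1000.0

# 1.20V static with ~1.6 mOhm of uncorrected board loadline at 60A gives
# roughly 1.10V at full load, like the Clevo behavior described above:
print(round(vcore_under_load(1.20, 60, 1.6), 3))  # 1.104
# A DC loadline of 8.00 mOhm (value=800) at 60A makes the *reported* VID
# fall far below the real Vcore:
print(round(reported_vid(1.20, 60, 8.0), 3))  # 0.72
```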
yrekabakery Notebook Virtuoso
The P775DMx design has thicker heatpipes and rads, and a 3rd shared heatpipe running over the GPU core:
The P75xDMx/TMx design has thinner heatpipes and rads, and reroutes that 3rd GPU heatpipe to the CPU only:
The P75xDMx heatsink was designed for up to GTX 1070 (115W) since that was the top GPU officially offered for the 15" in that generation. The GPU cooling is probably adequate for that card, but CPU cooling is a little overkill for Skylake/Kaby Lake.
On the TMx gen, they re-used the same heatsink, which is adequate on CPU side for Coffee Lake, but is inadequate on GPU side for GTX 1080 (190W), since it was originally intended for a GPU with a 75W lower (-40%) TDP.
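The TDP gap quoted above checks out arithmetically (values taken from the post itself):

```python
# Quick arithmetic check of the TDP gap mentioned above (values from the post):
gtx1070_tdp_w = 115  # the GPU the P75x heatsink was designed around
gtx1080_tdp_w = 190  # the top GPU the TMx generation actually ships with
gap_w = gtx1080_tdp_w - gtx1070_tdp_w
print(gap_w)                           # 75
print(f"{gap_w / gtx1080_tdp_w:.0%}")  # 39%, i.e. the ~40% stated above
```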
BTW this is the upgraded heatsink I bought. More pipes over the GPU and less "unified" (only one joining heatpipe) compared to the 2nd pic above:
Dennismungai, raz8020, FTW_260 and 1 other person like this. -
Meaker@Sager Company Representative
It's still only officially supplied with the 1070, AFAIK.
-
I'm really enjoying the power from such a small notebook and understood the limitations of having an 8700K and 1080 in this 15" chassis. I do have LM applied to GPU & CPU by HIDEvolution and temps for benchmarking and general usage seem under control. I can bench overclocked a fair amount and can undervolt a good amount around stock clocks too.
That being said, one thing I have found is that under gaming loads the temps get toasty pretty fast. The notebook is on a cooling pad (fans on or off don't really make a measurable difference to peak temps). I play with max fans forced on. At stock CPU and GPU clocks, with a -100mV undervolt on the CPU, what happens after a while (60 mins) is that the external display gets disconnected; it switches to the internal display, then after a few seconds it switches back to the external. Surprisingly, games don't seem to care and continue to run. The GPU hits 90-91C, which appears to be what is causing the disconnect. This is running at 144Hz refresh and allowing the game to run as fast as it can.
I tried lowering the CPU clocks (disabled turbo so it's running at 3.7GHz now) and increasing the undervolt (at -150mV now, which seems stable and not causing any other issues). The GPU is running totally stock (not sure if you can undervolt it, or how). I also lowered the FPS cap to try to identify limits and thermal capability. That helped "fix" the issue, but reading on here, some non-LM, non-Prema-BIOS folks may be getting much better temps. Is my situation typical, or is it running even hotter than it should be?
I didn't write this down, so it's all from memory and may not be 100% correct, but it should give an idea of approximately what I was seeing. All from FC5. I have been playing a ton of FO4, but since I keep it locked to half refresh at 72fps, it never had heat issues or disconnects.
At 144 fps cap - CPU upper 80s, GPU hits 91C and I get external display disconnects
At 80 fps cap - CPU 60s, GPU 80C
At 70 fps cap - CPU 60s, GPU 75C
At 60 fps cap - CPU 60s, GPU 70C
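For what it's worth, the measured points above can be interpolated to guess the highest cap for a given temp target. This is a rough sketch only: `max_cap_for_temp` is a hypothetical helper, and these from-memory numbers are too noisy to treat as precise:

```python
# Rough sketch only: linearly interpolate the from-memory measurements above
# to guess the highest FPS cap that keeps the GPU under a target temp.
measured = [(60, 70), (70, 75), (80, 80), (144, 91)]  # (fps cap, GPU degC)

def max_cap_for_temp(target_c: float) -> float:
    """Interpolate between the two nearest measured points."""
    for (f0, t0), (f1, t1) in zip(measured, measured[1:]):
        if t0 <= target_c <= t1:
            return f0 + (target_c - t0) * (f1 - f0) / (t1 - t0)
    raise ValueError("target temp outside measured range")

# Staying at or below ~85C would suggest a cap somewhere in the low 100s:
print(round(max_cap_for_temp(85)))  # 109
```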
So while I can play games decently, I really hate the disconnects. And by "fixing" my disconnects I am crippling the performance more than I feel I should have to. My previous machine was a thin 6820HK and 1080, 120Hz display, with a 230W power adapter, and had no issues like this. I have tried the latest NVIDIA driver, moving from an older one that was stable and worked well (except for the disconnects, which is the only reason I tried the upgrade).
Last edited: Jul 23, 2018 -
yrekabakery Notebook Virtuoso
You can drop the GPU power and temps significantly while still maintaining a high boost clock if you undervolt using the curve (Ctrl+F) in MSI Afterburner. For example, adding +160MHz to core in the main interface and then flattening out the curve at 913mV lets me run 1848MHz@913mV, which shaves off about 40W under full 3DMark/Unigine/typical gaming loads and drops peak temps from 91C to 80C. However, an extremely GPU heavy game like Witcher 3 that uses the full 190W even with the GPU undervolted will still peak at 88C.
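As a rough cross-check of that ~40W figure: dynamic power scales roughly with V² at a fixed clock, so dropping 1.050V to 913mV at a similar boost clock should save power on that order. This is an estimate under that scaling assumption, not a measurement:

```python
# Back-of-the-envelope check, not a measurement: dynamic power scales
# roughly with V^2 at a fixed clock, so dropping 1.050V -> 0.913V at a
# similar boost clock should save on the order of the ~40W observed above.
def power_after_undervolt(p_watts: float, v_old: float, v_new: float) -> float:
    """Estimate power at a new voltage assuming P ~ V^2 (clock held similar)."""
    return p_watts * (v_new / v_old) ** 2

p_new = power_after_undervolt(190, 1.050, 0.913)
print(round(p_new), round(190 - p_new))  # roughly 144W, i.e. ~46W saved
```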
Keep in mind my temp numbers are with max fans (which doesn't really matter since auto fans spin up to max past 80C anyway on the Prema mod) with GPU undervolted and CPU overclocked to [email protected], in addition to LM on everything and tweaks to improve heatsink contact. So most people will get worse results than those. But I'm not satisfied yet, and am hoping the upgraded heatsink will further lower my peak GPU temps to 85C or less. -
Thanks. I had tried to undervolt the GPU using Afterburner, and Google searches led me to discussions about how it worked, and also claims that it doesn't work on notebooks at all. I will try it again when I get home. I'd never heard of the Ctrl+F curve editor, so maybe that was my lack of reading up on it and just trying to click around.
Keep us up to date on cooling. If you find something that makes a huge difference (more than a degree or two) then I might be interested in trying it too. -
If the P751TM can keep a 1080 below 90C, then why can't the P775TM1-G keep the 1080 below 90C even with an undervolt? This was the case when my system was working. Even with the 1080 running at 0.913V it was still hitting 90C.
-
Well, I'm not having any trouble keeping my 1080 (or my 8086K CPU) way, way below 90C during gaming, but my games are not as challenging as I suppose some of yours are. But with Skyrim SE (Ultra), Fallout 4 (Ultra) and GTA V (Very High) game settings, I just use NVIDIA Inspector to set a global profile that frame-limits to between 75 and 90 fps (I play Fallout 4 and Skyrim at 90 and GTA at 75). Lately, I've been keeping the CPU undervolted by 130mV, which is stable but has no effect on the GPU, of course.
I really don't get the idea of completely uncapping the framerates and letting the machine just kill itself generating frames far and away above what's necessary for completely smooth, buttery gameplay.
These games, run under these settings for extended playtimes, rarely max the GPU higher than 80C (mainly GTA) or 70C (Fallout 4 and Skyrim SE), and average GPU temps are much lower, tending to be at least 10-15C below the max temps, with the bonus of keeping the fans nice and quiet (or at least unobjectionable...).
I play inside where ambient temperatures are 75F or below, with the laptop on a lap desk, atop a compact folding aluminum frame that lifts the rear of the laptop about 1" above the surface of the lap desk.
I am using the fans on Auto, with Windows power plan at maximum performance and Clevo Control Center on "Performance" mode. I only have the stock BIOS, and had Sager upgrade the thermal paste when I ordered the machine.
Temperatures have been amazing, IMO, but, again, I realize I'm not pushing the machine as hard as some of you are trying to do.
-
Playing BF1 with the 8700K at 4.7GHz, 1.175V and the 1080 at 1848/10516 at 0.931V, I still got 84C on the CPU and 88C on the 1080 in my P775TM1-G.
-
Well, I guess I have to try out the paper clip mod then when my system comes back.
-
I haven't really played Witcher 3 much, but I do own it, so I installed it and loaded my last save game and captured some data for you. This is with about 30 minutes of play, but I don't know the map or game well enough to get to the city you posted footage of a while back, so I just galloped my horse around the village I was in and galloped around the dense forests and fields in the surrounding area and killed a few wolves.
For this session, I completely uncapped framerates, left Vsync off, and set the game to Ultra graphics settings at 1920x1080 Full Screen. My framerate never much exceeded the low 70's at any point (saw as low as 60 and as high as 80 once or twice, but mostly hovered around 70-72fps), but I was still surprised when I saw the max and average values from HWInfo. Here they are (captured a few seconds after exiting the game).
What should I be doing in this game to really push the hardware? Seems like the framerates just never naturally got high enough to push things into the red zone...
Porter likes this. -
yrekabakery Notebook Virtuoso
Something is still limiting your performance. Turn off any FPS caps in-game/driver/RTSS and turn off G-Sync and VSync. Go to White Orchard (beginning area of the game after the tutorial) and ride/run around a bit in the foliage. The GPU should heat up very quickly.
Last edited: Jul 23, 2018
raz8020 likes this. -
Meaker@Sager Company Representative
Resolution at 720p by mistake perhaps?
-
@Meaker@Sager: No, I was very careful to check graphics settings. All gameplay was at 1920x1080 with framerates uncapped and vsync disabled. I had the machine running at stock clocks and frequencies, using max performance settings in both the Windows power plan and Clevo Control Center. When the game is at its initial settings screen, the framerates are over 2000fps, so I know the framerates are unlocked.
I managed to do some reconfiguring with RTSS and HWInfo, and tried to set up my sensor displays as close to the ones used by @yrekabakery for his recently-uploaded gameplay video as I could figure out how to do.
Since I don't usually play Witcher 3, I didn't have a save game that enabled me to visit the city in his gameplay video, so I downloaded an endgame-level save game and used it to make a new video. Please excuse my bumbling around in-game - I don't know this game at all well, and got lost several times.
This video is about 12 minutes long, and seems to me to show that the game is not using my machine's full hardware capabilities, although the GPU core-load percent does stay around 95% for most of the video. I don't understand why the CPU clocks and voltages and the GPU clocks and voltages are so low during this gameplay. There are places in the video where I can get framerates above 100fps, but it tends to stay in the low 70's. Even though the GPU usage is pretty high, the temperatures, voltages and frequencies of both CPU and GPU stay quite low, so I don't understand what's going on. I've double-checked to make sure I'm using high-performance settings in the power plan and control center. My other games are quite capable of driving the CPU/GPU to thermal limits if I leave the framerates fully unlocked, so I don't really know what's going on with Witcher 3.
Here is a snip of my selected sensors from HWInfo immediately after the video ended (I added the GPU Core Load) - some of the other sensor names have been changed, as that was the only way I could get the labels I wanted on the OSD in-game:
-
That's really strange. The 1080 should be hitting just under 1800MHz and your CPU should be at 4.3GHz flat.
-
I also played around a bit more with Witcher 3 settings to see if I could improve framerates, and I followed an optimization guide's recommendations to switch off the Hairworks stuff and reduce foliage distance fade to "High" instead of "Very High". By doing those two things, the identical gameplay depicted in my posted video rose from 70-80 fps to 90-120fps, with the average hovering just over 100fps.
But even so, the CPU and GPU frequencies stayed at the lower-than-expected values seen in the video.
I also spent some time searching with Google to try and find if others had seen similar issues, and there are a few cases where people playing Witcher 3 were seeing the GPU drop its frequencies to levels normally used for 2D desktop display. This was variously blamed on updates to Windows / NVidia driver settings / GeForce Experience and several other programs, including MSI Afterburner and its ilk. But though some people report solving the problem by uninstalling or changing an application, I didn't see any clear general solution. It seems if you have the problem, you may be stuck with it.
I did see one single instance where, during Fallout 4 play, the GPU frequencies dropped and "hung" at too low a level, causing framerates to plummet. I waited a few seconds for it to fix itself, and when it didn't, I task-switched away from Fallout 4 and then back to it, and the GPU frequencies instantly returned to their maximum values. I wonder if this is related to the problem I'm seeing in Witcher 3?
Last edited: Jul 24, 2018
*** Official Clevo Sager NP9155 / P750TM-G / P751TM-G Owner's Lounge! ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Oct 6, 2017.