The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *** Official Clevo Sager NP9155 / P750TM-G / P751TM-G Owner's Lounge! ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Oct 6, 2017.

  1. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I do not like these 1.4-1.5V spikes... I will try constant voltage and see.
     
  2. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Those VID spikes on Loadline 0 don't affect the package power. Also Loadline 1 with higher VID has the same package power as Loadline 0 with lower VID under load, which should tell you not to overthink it.
     
    raz8020 likes this.
  3. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    And what is "VR CURRENT LIMIT" in the overclock menu in the BIOS? I will try the static voltage and report back to you guys on the behaviour of the TM1.
     
  4. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,897
    Trophy Points:
    931
    Voltage regulator current limit, i.e. how much current the VRMs are allowed to push through.
     
    raz8020 likes this.
  5. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    Currently it is disabled in the BIOS; if I set it to enabled, what is the right number for it? Meanwhile I have set the voltage to OVERRIDE and 1125, and the machine spikes lower; however it is moving up and down, maxing around 1.375V... even if the CPU goes down to 1200MHz, the voltage is around 1.1-1.2V. Is that OK? With ThrottleStop I could set 1.075V@4.7GHz, in the BIOS I need to set 1125... I do not get this whole thing :(
    However the machine runs OK: Superposition 16500 points with a GPU max temp of 82C, Fire Strike around 20000 points, and the temps seem OK. I will post some screenshots later on. I am trying to figure out the best voltage at 4.7GHz; I do not want to push further... This machine - the 751TM1 - seems to be a very nice build; the only thing I do not like atm is the BIOS and the voltage fuss...
     
  6. TJCacher

    TJCacher Notebook Consultant

    Reputations:
    56
    Messages:
    117
    Likes Received:
    89
    Trophy Points:
    41
    Tried undervolting during gameplay today. Here is HWiNFO data after over 2 hours of Fallout 4 at Ultimate settings (full-screen 1920x1080, framerate capped at 90fps). Ambient temps are 75F, with the notebook on a lap desk with an aluminum stand to keep about an inch of space between the rear of the machine and the surface. Using a -130mV undervolt, which seems completely stable under all conditions: idle, stress testing, benchmarking, gaming. Stock frequencies, fans on auto (and very quiet). Windows High Performance power plan, Clevo Control Center on "Performance" mode.

    Fallout 4 Temps -130mv.png
     
  7. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    What BIOS do you have? The problem is not the temp itself... the issue is that the BIOS gives voltage spikes, which I do not like... @4.7GHz I have 70C during gaming, so temp-wise it is OK; however under stress it is around 1.2V, but under light browsing it is 1.35V...
     
    Last edited: Jul 22, 2018
  8. TJCacher

    TJCacher Notebook Consultant

    Reputations:
    56
    Messages:
    117
    Likes Received:
    89
    Trophy Points:
    41
    @matyee Sorry, I was just posting temps since I'm still getting acquainted with the new machine. I didn't realize you'd been having issues with yours. I don't know that much about over/under clocking/volting except very basic stuff, so can't offer much help, unfortunately.

    That said, according to HWInfo, my BIOS is 1.05.10RLS2, dated 2/26/2018, which was what came on the machine as directly purchased from the Sager Notebooks website.

    I am simply using the Clevo overclocking utility to try and establish a stable undervolt offset that will minimize temps, since most of my games (my most demanding normal usage of my computer) are not much of a challenge for a machine with this kind of hardware - I'm nowhere near needing additional clock speeds for my daily machine use.

    I had initially tried setting a negative voltage offset in my machine's BIOS, but couldn't get it to boot, so I've just returned to using the provided Clevo-supplied software for doing so.

    So I'm not trying to run at higher than stock clocks, other than doing a bit of playing around with Cinebench just for the lols. Note that for the 8086K, the stock multipliers are 50x, 46x, 45x, 44x, 44x, 43x for 1-2-3-4-5-6 cores in active use. As you can see from my HWInfo shot above, this seems to result in the expected 4.3ghz overall average during extended gaming, since I suspect Fallout 4 is designed to use all available cores. Therefore, I'm not pushing the 8086K chip as hard as you are your 8700K if you are using 4.7ghz multipliers across the board.

    I will say that, during Cinebench testing, I was able to increase my CB scores up to the point where I was using 48x multipliers across all 6 cores, but setting them at any higher multipliers resulted in lower scores for me. I posted my best score (1543cb, I think) on the overclocking thread a couple of days ago.
     
  9. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Okay, so what will happen if you get an unlocked BIOS and set IA AC/DC Loadline to 1 is that the 1200mV override you're using now will not boot. You'll need to up it to 1.3-1.4V for stability, at which point it will throttle under load, since these boards get flaky at over 1.3V.

    So you tell me which is preferable. Loadline 0 with more spastic (but not unsafe) idle VID behavior and load stability at lower VID. Or Loadline 1 with VID locked at your override voltage at all times, but a higher override needed for stability leading to more throttling.
     
  10. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I am not quite sure that would happen... based on my experiences with e.g. the 775DM3 or the F5, it will boot with the very same voltage, but the voltage spikes could be eliminated... the same happened with the F5, and it works stable at the very same volts...

    However, I do not consider it a problem, nor does it bottleneck the usability of the laptop; it is just a little annoying to see those temps in HWiNFO :) All in all it is a great laptop; after some tests, it easily beats 20K in Fire Strike, the GPU maxes at 1950MHz reaching 200W, and under heavy stress it was 82C. So I was surprised in a positive way.
     
  11. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Those are different boards, and the F5 is an MSI, which has different loadline characteristics, so the same settings will not work the same way. I have tested the loadline stuff at length on this system with the Prema mod and found that the default loadline is optimal.

    What are your GTX 1080 temps when playing Witcher 3 maxed out with unlimited FPS?
     
  12. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I don't have Witcher :) I have tried Overwatch at uncapped FPS and ultra; it was 82C max after 30 mins at 99% GPU load. I am amazed how it can be 82C@200W@1.05V@1950MHz.

    I would be glad if I could have a Prema BIOS... even if it behaved the same, I have high confidence in Prema's work. I had very good experiences with the Prema BIOS on the 775DM3.
     
  13. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Did you do anything special with your GPU cooling, and did you TDP mod the card? Clevo 1080 is limited to 190W by default.
     
  14. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    Nope, nothing... pasted by TGK. It is measured by HWiNFO, so maybe it is not that accurate... I am not a fan of maxing out the laptop; I am rather chasing high performance at acceptable temps. +150MHz on the core, +450MHz on the memory for testing, that's it. I expected higher temps, but it maxes around 82C...

    ps: I have cooling thermal pads on the graphics card; that might help as well.
     
  15. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Wow. Do you live in an igloo?
     
  16. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    Come on! I tested it in my weekend house; inside it was not that hot, but not extremely cold either, around 25C...
     
  17. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I had to ask because those are far and away better GTX 1080 temps than anyone else has been able to achieve in a P7 system at those settings, so my apologies, I just find it a bit hard to believe.
     
  18. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I will post screenshots if you don't believe me... however, in my P775DM3 I had the very same temps (as it is the same card)... so 82C@200W is very standard, I guess...

    How much do you score in the Superposition basic benchmark (medium)?
     
  19. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It's exceptional. Can you post a video of sitting on the OW main menu (unlocked FPS) for 10 minutes with the GPU temps, clocks, voltage, power on-screen?
     
  20. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I can do it tomorrow; is it OK if I take a screenshot after 10-12 mins? I do not know how to record the screen. However, Superposition is a good benchmark; what's your score/temps?
     
  21. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I guess you could use the GPU sensors in HWiNFO, but make sure to reset the values after the test has started, and take screenshot after 10-12 min with game still running so we have accurate min/max/avg for the duration.

    I don't remember if the OW main menu is capped at 60 FPS or not. If it is, increase render scale to 200% at epic settings or find something else that will load the GPU power constantly at 190W and let it sit for 10 min.

    I use Witcher 3 to heat test because it uses far more GPU power than any other game or 3DMark/Unigine bench. I get 88C max at [email protected] 180-190W. This is with my heatsink flattened, liquid metal, and pressured out using 2 paper clips under each retention arm. I also ordered an upgraded heatsink from China that will arrive in a week or so with more pipes over the GPU core, hope that will lower GPU temps further.
     
  22. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    OK, but Superposition constantly used 100%, and Overwatch used 100% and maxed at 82C. I will check tomorrow; however, I don't expect big changes (hopefully).
     
  23. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It's not about the usage %, it's about the GPU power. For example, 3DMark and Unigine use 150-160W at 100% while Witcher 3 uses 180-190W at 100%.
     
  24. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    Superposition uses 200W constantly, but I will check tomorrow. Do you have a Superposition benchmark result to compare?
    16.4K, 82C
     
  25. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    1080p medium 15.7K 80C max, 150-160W with my undervolt [email protected] while Witcher 3 uses 30W more at the same settings.

    Still no idea how you can use 200W constantly on a card limited to 190W...
     
  26. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I don't know either, but does it matter? I assume you are satisfied with yours; every chip is different, maybe mine is better. Don't forget God plays for X :) my CPU spikes, my GPU is fine, so it's equal :)

    btw, on my F5 I have similar settings to yours: 0.9V@150-160W, 82-85C under full stress.
     
  27. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Well not exactly. Those CPU spikes happen for everyone, but nobody's GTX 1080 runs in this system like yours does, that's why I wanted to see. I'm still not satisfied with my GPU thermals after a couple of attempts, so it's still a WIP pending the arrival of the upgraded heatsink.
     
  28. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    If it is so extremely good, then there must be some other explanation; I don't think this card is that special... however, it ran similarly in my 775DM3. Tomorrow I'll try to play OW on epic for 10-15 mins uncapped and we will see :)
     
  29. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    That is strange too, because the P775DM3 has better GPU cooling (but worse CPU cooling) than the smaller 15" version.
     
  30. matyee

    matyee Notebook Deity

    Reputations:
    254
    Messages:
    861
    Likes Received:
    430
    Trophy Points:
    76
    I don't think it is strange... the two machines are almost the same, and the card is actually the same. I will test tomorrow and we will see, but I am 100% sure that it had the same temps in the DM3.
     
  31. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    AC DC loadline "1" on MSI boards only works because MSI uses a "loadline calibration" setting equivalent to "Medium" on desktop boards, as an unchangeable preset on their laptops. This causes "vdroop" at load to vanish on MSI boards, so putting 1.20v into an MSI F5 at idle gives you "close" to 1.20v at full load with IA AC DC Loadline=1.

    With the AC DC = 0 (auto) setting, that 1.2v would spike to 1.3-1.35v at full load on an MSI board, because, since there is no vdroop, the Auto (1.80 or 2.10 mOhm) setting causes a REAL voltage spike at load. So only on MSI boards do you have to be worried (and you can tell by the temps!).

    Clevo and eVGA boards do NOT use an "internal" loadline calibration setting, so the IA AC DC setting has to be kept at default. Or if they do, someone with a digital multimeter would have to find the read points to check the true vcore going into the CPU at full load (not just what the VID says), but this can usually be "guesstimated" by comparing the power draw and temps at different voltage levels.

    Setting IA AC DC loadline to 1 on Clevo boards causes instability because there is massive vdroop (this is what "loadline calibration" prevents from happening). So setting 1.2v on your Clevo and setting IA AC DC loadline=1 will cause it to be about 1.1v at FULL LOAD. The spikes you see are actually happening at very light load, and those spikes are BEFORE "vdroop" is occurring.

    Note: the IA AC setting is what controls the VOLTAGE RISE at full load (before vdroop) and the IA DC setting is what controls the VID DROOP (not to be confused with vcore droop!!) at full load. Setting IA AC loadline to 2.10 mOhms (value=210) on either an MSI or Clevo board, setting IA DC loadline to 800, and setting a static vcore of 1.20v, will cause the VID to drop to under 1.0v at full load. This clearly is not the true VID---the DC setting is only changing the REPORTED VID. (The AC and DC settings are presets to Intel design characteristics for how adaptive voltage and vdroop are "supposed" to be handled.)
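
    The vdroop behavior described above can be sketched with rough numbers. A minimal illustration (the ~50A load current and the ~2.1 mOhm effective loadline are assumed figures for the sake of the arithmetic, not measured values), using the usual loadline relation V_load ≈ V_set − I_load × R_LL:

    ```python
    # Loadline (vdroop) arithmetic sketched from the explanation above.
    # V_load ≈ V_set - I_load * R_loadline. The 50 A current and 2.1 mOhm
    # effective loadline are illustrative assumptions, not measurements.

    def v_under_load(v_set_volts, i_load_amps, loadline_mohm):
        """Estimate CPU voltage under load from a static setpoint and a loadline."""
        return v_set_volts - i_load_amps * (loadline_mohm / 1000.0)

    # A 1.20 V static override on a board with ~2.1 mOhm of uncompensated
    # loadline resistance, at roughly 50 A of load current:
    print(round(v_under_load(1.20, 50, 2.1), 3))  # 1.095 -> "about 1.1v at FULL LOAD"

    # At near-idle current the droop vanishes, which is why the setpoint
    # (and any spikes above it) shows up at light load:
    print(v_under_load(1.20, 0, 2.1))  # 1.2
    ```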
     
    raz8020, Papusan and yrekabakery like this.
  32. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    There are some non-minor heatsink differences.

    The P775DMx design has thicker heatpipes and rads, and a 3rd shared heatpipe running over the GPU core:
    [​IMG]

    The P75xDMx/TMx design has thinner heatpipes and rads, and reroutes that 3rd GPU heatpipe to the CPU only:
    [​IMG]

    The P75xDMx heatsink was designed for up to GTX 1070 (115W) since that was the top GPU officially offered for the 15" in that generation. The GPU cooling is probably adequate for that card, but CPU cooling is a little overkill for Skylake/Kaby Lake.

    On the TMx gen, they re-used the same heatsink, which is adequate on CPU side for Coffee Lake, but is inadequate on GPU side for GTX 1080 (190W), since it was originally intended for a GPU with a 75W lower (-40%) TDP.

    BTW this is the upgraded heatsink I bought. More pipes over the GPU and less "unified" (only one joining heatpipe) compared to the 2nd pic above:
    [​IMG]
     
  33. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,897
    Trophy Points:
    931
    It's still only officially supplied with the 1070, AFAIK.
     
  34. Porter

    Porter Notebook Virtuoso

    Reputations:
    786
    Messages:
    2,219
    Likes Received:
    1,044
    Trophy Points:
    181
    I'm really enjoying the power from such a small notebook and understood the limitations of having an 8700K and 1080 in this 15" chassis. I do have LM applied to GPU & CPU by HIDEvolution and temps for benchmarking and general usage seem under control. I can bench overclocked a fair amount and can undervolt a good amount around stock clocks too.

    That being said, one thing I have found is that under gaming loads the temps get toasty pretty fast. The notebook is on a cooling pad (fans on or off don't really make a measurable difference to peak temps). I play with max fans forced on. At stock CPU and GPU clocks, with a -100mv on CPU what happens is after a while (60 mins) the external display will be disconnected, it switches to the internal display, then after a few seconds it will switch back to the external. Surprisingly, games don't seem to care and continue to run. The GPU hits 90-91 which appears to be what is causing the disconnect. This is running 144 refresh and allowing the game to run as fast as it can.

    I tried lowering the CPU clocks (disabled turbo, so it's running at 3.7 now) and increasing the undervolt (at -150 now, which seems stable and not causing any other issues). The GPU is running totally stock (not sure if you can undervolt it or how). I also lowered the FPS cap to try to identify limits and thermal capability. That helped "fix" the issue, but I'm reading on here that some non-LM, non-Prema-BIOS folks may be getting much better temps. Is my situation typical, or is it running even hotter than it should be?

    I didn't write this down, so it's all from memory and may not be 100% correct, but it should give an idea of approximately what I was seeing. All from FC5. I have been playing a ton of FO4, but since I keep it locked to half refresh at 72 fps it never had heat issues or disconnects.

    At 144 fps cap - CPU upper 80's GPU hits 91 and get external display disconnects
    At 80 fps cap - CPU 60's GPU 80
    At 70 fps cap - CPU 60's GPU 75
    At 60 fps cap - CPU 60's GPU 70

    So while I can play games decently, I really hate the disconnects. And by "fixing" my disconnects I am crippling the performance more than I feel I should have to. My previous machine was a thin 6820HK and 1080, 120Hz display, with a 230W power adapter, and had no issues like this. I have tried the latest NVIDIA driver, moving from an older one that was stable and worked well (except for the disconnects, which are the only reason I tried the upgrade).
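
    The cap-versus-temp numbers above are close to linear. A quick least-squares fit (illustrative only; this is a single uncontrolled run, and the 144fps point is excluded since the GPU was throttling at its 91C limit there):

    ```python
    # Least-squares line through the non-throttling (fps cap, max GPU temp C)
    # points reported above. Purely illustrative; one run, uncontrolled ambient.
    points = [(80, 80), (70, 75), (60, 70)]

    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n

    print(slope, intercept)  # 0.5 C per fps on a 40 C base, for these points

    # Extrapolating to the 144 fps cap predicts ~112 C, well past the 91 C
    # throttle point, consistent with the throttling seen at that cap:
    print(intercept + slope * 144)  # 112.0
    ```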
     
    Last edited: Jul 23, 2018
  35. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Clevo spec is 1070 max, but Eurocom, HID, Sager and Sager resellers, and CEG in Europe, etc. offer it with 1080.

    91C is the thermal throttle temp for the GPU. In this machine it is normal for the GTX 1080 at stock to sit at its temp limit under full load, such as when playing GPU-heavy games with uncapped FPS, because Clevo never upgraded the cooling to handle anything past the GTX 1070, as I have been alluding to in my last posts. Which is why I am curious to see how @matyee is getting so much better thermal results than everyone else, even with his GPU overclocked and regular Kryonaut thermal paste.

    You can drop the GPU power and temps significantly while still maintaining a high boost clock if you undervolt using the curve (Ctrl+F) in MSI Afterburner. For example, adding +160MHz to core in the main interface and then flattening out the curve at 913mV lets me run 1848MHz@913mV, which shaves off about 40W under full 3DMark/Unigine/typical gaming loads and drops peak temps from 91C to 80C. However, an extremely GPU heavy game like Witcher 3 that uses the full 190W even with the GPU undervolted will still peak at 88C.

    Keep in mind my temp numbers are with max fans (which doesn't really matter since auto fans spin up to max past 80C anyway on the Prema mod) with GPU undervolted and CPU overclocked to [email protected], in addition to LM on everything and tweaks to improve heatsink contact. So most people will get worse results than those. But I'm not satisfied yet, and am hoping the upgraded heatsink will further lower my peak GPU temps to 85C or less.
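
    The ~40W saving from the curve undervolt described above is consistent with the usual dynamic-power approximation P ≈ k·f·V². A back-of-the-envelope check (the ~190W at ~1800MHz/1.05V stock operating point is an assumption pieced together from the surrounding posts, not a measurement):

    ```python
    # Back-of-the-envelope check of the ~40 W undervolt saving, using the
    # dynamic-power approximation P ~ f * V^2. The stock operating point
    # (~190 W at ~1800 MHz / 1.05 V) is an assumption from the thread.

    def scaled_power(p_stock_w, f_stock_mhz, v_stock, f_new_mhz, v_new):
        """Scale power by the clock ratio and the square of the voltage ratio."""
        return p_stock_w * (f_new_mhz / f_stock_mhz) * (v_new / v_stock) ** 2

    p_uv = scaled_power(190, 1800, 1.050, 1848, 0.913)
    print(round(190 - p_uv))  # ~43 W, in the ballpark of the quoted ~40 W
    ```

    The clock barely changes between the two points, so nearly all of the saving comes from the squared voltage term.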
     
    raz8020 and Porter like this.
  36. Porter

    Porter Notebook Virtuoso

    Reputations:
    786
    Messages:
    2,219
    Likes Received:
    1,044
    Trophy Points:
    181
    Thanks. I had tried to undervolt the GPU using Afterburner, and Google searches led me to discussions about how it worked, and also claims that it doesn't work on notebooks at all. I will try it again when I get home. I'd never heard of the Ctrl+F curve editor, so maybe that was my lack of reading up on it and just trying to click around.

    Keep us up to date on cooling. If you find something that makes a huge difference (more than a degree or two) then I might be interested in trying it too.
     
  37. m4gg0t

    m4gg0t Notebook Evangelist

    Reputations:
    64
    Messages:
    393
    Likes Received:
    86
    Trophy Points:
    41
    If the P751TM can keep a 1080 below 90C, then why can't the P775TM1-G keep the 1080 below 90C even with an undervolt? This was the case when my system was working. Even with the 1080 running at 0.913V it was still hitting 90C.
     
  38. Porter

    Porter Notebook Virtuoso

    Reputations:
    786
    Messages:
    2,219
    Likes Received:
    1,044
    Trophy Points:
    181
    I thought that even though they are different-sized chassis, they had the same (or very similar) cooling solutions and limitations?
     
  39. TJCacher

    TJCacher Notebook Consultant

    Reputations:
    56
    Messages:
    117
    Likes Received:
    89
    Trophy Points:
    41
    Well, I'm not having any trouble keeping my 1080 (or my 8086K CPU) way, way below 90C during gaming, but my games are not as challenging as I suppose some of yours are. With Skyrim SE (Ultra), Fallout 4 (Ultra) and GTA V (Very High) game settings, I just use NVIDIA Inspector to set a global profile that frame-limits to between 75 and 90 fps (I play Fallout 4 and Skyrim at 90 and GTA at 75). Lately, I've been keeping the CPU undervolted by 130mV, which is stable but of course has no effect on the GPU.

    I really don't get the idea of completely uncapping the framerates and letting the machine just kill itself generating frames far and away above what's necessary for completely smooth, buttery gameplay.

    These games, run under these settings for extended playtimes, rarely max the GPU higher than 80C (mainly GTA) or 70C (Fallout 4 and Skyrim SE), and average GPU temps are much lower, tending to be at least 10 or 15 degrees below the max temps, with the bonus of keeping the fans nice and quiet (or at least unobjectionable...).

    I play inside where ambient temperatures are 75F or below, with the laptop on a lap desk, atop a compact folding aluminum frame that lifts the rear of the laptop about 1" above the surface of the lap desk.

    I am using the fans on Auto, with Windows power plan at maximum performance and Clevo Control Center on "Performance" mode. I only have the stock BIOS, and had Sager upgrade the thermal paste when I ordered the machine.

    Temperatures have been amazing, IMO, but, again, I realize I'm not pushing the machine as hard as some of you are trying to do.
     
  40. m4gg0t

    m4gg0t Notebook Evangelist

    Reputations:
    64
    Messages:
    393
    Likes Received:
    86
    Trophy Points:
    41
    Playing BF1 with the 8700K at 4.7GHz/1.175V and the 1080 at 1848/10516 at 0.931V, I still got 84C on the CPU and 88C on the 1080 in my P775TM1-G.
     
  41. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The heatsink needs to be tweaked to improve contact so that LM can be used. It was the same for me, 90C and thermal throttling even when undervolted, until I did that, which dropped my GPU 8C compared to the untweaked heatsink and normal paste.

    The P775TMx has a non-unified heatsink and thicker heatpipes and rads:

    [​IMG]

    Because having 144Hz and a GTX 1080, and capping it at 75-90 FPS, is a waste of the screen and GPU. You might as well have gotten a 1070 which has no thermal issues in this machine. CPU and GPU load/power/temps are far lower when you limit the frame rate like that. But it's a band-aid solution that is only effective in not very demanding games, because when you play a game that hits 100% load at your frame cap or lower, it will still overheat, unless you lower the cap even further.
     
    raz8020 and Porter like this.
  42. m4gg0t

    m4gg0t Notebook Evangelist

    Reputations:
    64
    Messages:
    393
    Likes Received:
    86
    Trophy Points:
    41
    Well, I guess I'll have to try out the paper clip mod when my system comes back.
     
  43. TJCacher

    TJCacher Notebook Consultant

    Reputations:
    56
    Messages:
    117
    Likes Received:
    89
    Trophy Points:
    41
    Fair enough. I don't think I have that problem (100% loading, that is) with my games, though, so I guess even though I didn't actually need the 1080, at least it's not hurting me for now, and I hopefully have a bit of future-proofing. (I bought this machine out of necessity, as I was out of a working gaming machine at the time. Otherwise, I would have waited a year or so, but I didn't have that choice.)

    I haven't really played Witcher 3 much, but I do own it, so I installed it and loaded my last save game and captured some data for you. This is with about 30 minutes of play, but I don't know the map or game well enough to get to the city you posted footage of a while back, so I just galloped my horse around the village I was in and galloped around the dense forests and fields in the surrounding area and killed a few wolves.

    For this session, I completely uncapped framerates, left Vsync off, and set the game to Ultra graphics settings at 1920x1080 Full Screen. My framerate never much exceeded the low 70's at any point (saw as low as 60 and as high as 80 once or twice, but mostly hovered around 70-72fps), but I was still surprised when I saw the max and average values from HWInfo. Here they are (captured a few seconds after exiting the game).

    What should I be doing in this game to really push the hardware? Seems like the framerates just never naturally got high enough to push things into the red zone...

    Witcher 3 Temps.png
     
    Porter likes this.
  44. Porter

    Porter Notebook Virtuoso

    Reputations:
    786
    Messages:
    2,219
    Likes Received:
    1,044
    Trophy Points:
    181
    Thanks for this, I will try the same thing tonight just for (very rough) comparisons.
     
  45. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    This is the problem. Your GPU wasn't fully utilized, that's why your metrics are low. Your GPU Core Load is cut off at the bottom but it's not hard to guess it wasn't close to 100% either. Assuming stock, the card should be hitting maximums of 1.05V-1.063V, ~190W, and 1800MHz+ at full load.

    Something is still limiting your performance. Turn off any FPS caps in-game/driver/RTSS and turn off G-Sync and VSync. Go to White Orchard (beginning area of the game after the tutorial) and ride/run around a bit in the foliage. The GPU should heat up very quickly.

    [​IMG]
     
    Last edited: Jul 23, 2018
    raz8020 likes this.
  46. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,897
    Trophy Points:
    931
    Resolution at 720p by mistake perhaps?
     
  47. TJCacher

    TJCacher Notebook Consultant

    Reputations:
    56
    Messages:
    117
    Likes Received:
    89
    Trophy Points:
    41
    @Meaker@Sager : No, very careful to check graphics settings. All gameplay was at 1920x1080 with framerates uncapped and vsync disabled. I had the machine running at stock clocks, frequencies and using max performance settings in both Windows power plan and Clevo Control Center. When the game is at its initial settings screen, the framerates are over 2000fps, so I know the framerates are unlocked.

    I managed to do some reconfiguring with RTSS and HWiNFO and tried to set up my sensor displays as close to the ones used by @yrekabakery for his recently-uploaded gameplay video as I could figure out how to do.

    Since I don't usually play Witcher 3, I didn't have a save game that would let me visit the city in his gameplay video, so I downloaded an endgame-level save and used it to make a new video. Please excuse my bumbling around in-game; I don't know this game well and got lost several times.

    This video is about 12 minutes long and seems to show that the game is not using my machine's full hardware capabilities, even though the GPU core-load percentage stays around 95% for most of it. The CPU and GPU clocks and voltages are both lower than I'd expect during this gameplay. In places I can get framerates above 100fps, but they tend to stay in the low 70s. Despite the high GPU usage, the temperatures, voltages and frequencies of both CPU and GPU stay quite low. I've double-checked that I'm using high-performance settings in the power plan and Control Center, and my other games are quite capable of driving the CPU/GPU to their thermal limits if I leave framerates fully unlocked, so I don't really know what's going on with Witcher 3.

    It does seem like something's going on, but I can't figure out what. Here's a YouTube link to the longer video I made this evening, with an OSD for most major sensors. I downloaded a different person's more advanced save game so I could visit the same area you did in the video you uploaded. Note that I re-encoded the video at a lower resolution (1280x720) and framerate (24fps) to speed up uploading and YouTube processing, so it is currently available at a maximum resolution of 1280x720. I took screenshots of the game settings before jumping into the save game:



    Here is a snip of my selected sensors from HWInfo, taken immediately after the video ended (I added GPU Core Load). Some of the other sensor names have been changed, as that was the only way I could get the labels I wanted on the in-game OSD:

    Witcher 3 Temps.png
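In case anyone wants to cross-check OSD readings like these outside of HWInfo, GPU clocks and load can also be logged with `nvidia-smi`, which ships with the NVIDIA driver. A minimal sketch, assuming your driver version supports the query fields listed below (the field names and thresholds are illustrative, not from the posts above):

```python
import csv
import io
import subprocess
import time

# Fields supported by `nvidia-smi --help-query-gpu` on recent drivers.
FIELDS = "timestamp,clocks.gr,clocks.mem,utilization.gpu,temperature.gpu,power.draw"

def parse_row(line):
    """Parse one CSV row emitted by nvidia-smi into stripped string fields."""
    return [field.strip() for field in next(csv.reader(io.StringIO(line)))]

def poll(interval=1.0, samples=10):
    """Query the GPU once per interval and print the parsed fields."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(parse_row(out))
        time.sleep(interval)

# poll()  # run this in a terminal while the game is going
```

Running it alongside the game gives a second opinion on whether the OSD's low clock readings are real or a sensor-labeling artifact.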
     
  48. m4gg0t

    m4gg0t Notebook Evangelist

    Reputations:
    64
    Messages:
    393
    Likes Received:
    86
    Trophy Points:
    41
    That's really strange. The 1080 should be hitting just under 1800MHz, and your CPU should be at 4.3GHz flat.
     
  49. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I'm stumped. I have no idea why your system is underperforming so badly in Witcher 3.
     
  50. TJCacher

    TJCacher Notebook Consultant

    Reputations:
    56
    Messages:
    117
    Likes Received:
    89
    Trophy Points:
    41
    Yes, and just to be clear: immediately after recording the Witcher 3 gameplay, I ran Fallout 4 without making a single system change, with maximum graphics settings, vsync off and unlimited framerates. The CPU cores stayed at 4300/4400MHz for the whole 12-minute play-through, and the GPU clock alternated between 1793 and 1809MHz. GPU core usage was above 90% the whole time, and the GPU temperature gradually rose into the high 80s, peaking at but never rising above 90°C. This was what I expected Witcher 3 to do with these settings, but for some reason it just doesn't. Fallout 4 framerates during this session were above 120fps, with a maximum of about 150fps at one point.

    I also played around a bit more with the Witcher 3 settings to see if I could improve framerates, following an optimization guide's recommendations to switch off the Hairworks stuff and reduce foliage distance fade from "Very High" to "High". With those two changes, the identical gameplay depicted in my posted video rose from 70-80fps to 90-120fps, with the average hovering just over 100fps.

    But even so, the CPU and GPU frequencies stayed at the lower-than-expected values seen in the video.

    I also spent some time searching Google to see if others had run into similar issues. There are a few cases where people playing Witcher 3 saw the GPU drop its frequencies to the levels normally used for 2D desktop display. This was variously blamed on Windows updates, NVIDIA driver settings, GeForce Experience, and several other programs, including MSI Afterburner and its ilk. Some people report solving the problem by uninstalling or changing an application, but I didn't see any clear general solution. It seems that if you have the problem, you may be stuck with it.
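The "stuck at 2D desktop clocks under load" symptom described above is easy to spot in a clock/utilization log. A toy sketch, with made-up thresholds (roughly sensible for a GTX 1080, whose 3D clocks sit well above 1500MHz and whose desktop idle clocks sit below ~300MHz; adjust for your card):

```python
# Flag log samples where the core clock sits at desktop (2D) levels
# while the GPU still reports significant load.
DESKTOP_CLOCK_MHZ = 600   # anything below this looks like a 2D state...
BUSY_UTIL_PCT = 50        # ...while the GPU is clearly being used

def stuck_samples(log):
    """log: list of (core_clock_mhz, utilization_pct) tuples."""
    return [
        (clk, util) for clk, util in log
        if clk < DESKTOP_CLOCK_MHZ and util > BUSY_UTIL_PCT
    ]

# Example: the third sample shows the "stuck at 2D clocks" pattern.
samples = [(1797, 97), (1784, 95), (290, 88), (1797, 96)]
print(stuck_samples(samples))  # -> [(290, 88)]
```

If a log shows such samples during gameplay, the card is throttling to its 2D power state rather than being thermally or power limited.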

    I did see one instance, during Fallout 4 play, where the GPU frequencies dropped and "hung" at too low a level, causing framerates to plummet. I waited a few seconds for it to fix itself, and when it didn't, I task-switched away from Fallout 4 and back again, and the GPU frequencies instantly returned to their maximum values. I wonder if this is related to the problem I'm seeing in Witcher 3?
     
    Last edited: Jul 24, 2018