The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Any way to disable GPU Boost on GTX1060?

    Discussion in 'Sager and Clevo' started by UzY3L, Oct 24, 2019.

  1. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    Hello,

    I'm looking for a way to disable GPU Boost on my Clevo P775TM1-G which has a GTX 1060 in it.

    The reason I want this is that I have no need for the extra power. I have a huge backlog of old games, and this card is more than enough to run them at 90 FPS even at stock clocks.

    GPU Boost, however, sees thermal headroom and pushes the core clock to 1950MHz, which needlessly drives up both temperatures and fan noise.

    I've tried every piece of software out there, and Nvidia Profile Inspector is the only one that did anything: it managed to keep the clock at 1702MHz, which is still 300MHz too much. V-Sync doesn't help either; GPU Boost does its "magic" and keeps the card at 1950MHz at all times in games.

    Can anyone help me with either disabling GPU Boost or forcibly keeping the GPU clock at its stock speed at all times?

    Thank you!
     
  2. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    You can use the voltage/frequency curve editor in MSI Afterburner (Ctrl+F) and either lock it at one point (L), or flatten the curve past that point, and hit apply. Both will accomplish the same thing, which is to prevent the GPU from boosting past that voltage/frequency point, but with a locked curve the GPU will not downclock at idle.
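    (Side note for anyone finding this later: on cards that support it, Nvidia's own nvidia-smi tool can pin the core clock from the command line. This is only a sketch; the `--lock-gpu-clocks` option generally requires a Volta-or-newer GPU and an elevated admin/root prompt, so it likely does not apply to a Pascal GTX 1060, where the Afterburner curve method above is the way to go.)

```shell
# Sketch: pin the GPU core clock to a fixed value with nvidia-smi.
# NOTE: --lock-gpu-clocks (-lgc) needs a Volta-or-newer GPU and admin
# rights, so this probably won't work on a Pascal GTX 1060; shown only
# as the command-line equivalent of flattening the V/F curve.
nvidia-smi --lock-gpu-clocks=1404,1404   # lock min and max core clock to 1404 MHz

# Undo the lock and return to normal boost behavior:
nvidia-smi --reset-gpu-clocks
```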
     
    Vasudev likes this.
  3. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    The only options available in MSI Afterburner are core and memory clocks. If I set the core clock all the way back, it still boosts. Even with every option turned on in MSI Afterburner, I still don't get access to voltage.

    The same goes for most other software I've tried, including XOC, EVGA Precision, something from Asus, etc.
     
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I suggest reading my post again and paying attention to the hotkeys I listed...
     
    UzY3L likes this.
  5. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    That was a lot of work, but simple and effective. Sorry, I was too caught up in my disappointment at not finding a solution to properly read what you wrote. Thank you very much for your help!

    I barely hit 50C now and the system runs quietly too. Thanks again! I haven't been able to find this anywhere else on the web, at least as far as disabling GPU Boost goes.

    Cheers!
     
    Last edited: Oct 25, 2019
    4W4K3 and yrekabakery like this.
  6. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Have you looked in the nvidia control panel at the power options?
     
  7. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    Yes. It was set to Optimal Power, and the PCI Express power setting in the Windows power plan is set to Maximum power savings for both battery and plugged in. GPU Boost simply ignores this. Don't get me wrong, it's a great option to have on a desktop. On a mobile system with limited cooling and the fans so close to you, not so much.

    yrekabakery's solution works, and even though it takes about 10 minutes to set every single point to my desired frequency, it does the job, which is what I needed so I can play games from years ago without the GPU thinking I need 400FPS :)
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    You can save it once set.
     
  9. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    An FPS limit would’ve been a more elegant solution. ;)
     
    UzY3L likes this.
  10. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    Followed your advice. Thank you!

    I've tried setting the FPS limit in Rivatuner and it works for its intended purpose, but GPU Boost still keeps the card at 1950MHz. Tried a combination of this + V-Sync set from 144Hz to 60Hz and still, GPU Boost does its thing. Your solution with MSI Afterburner of flattening the curve works flawlessly :)
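    (Note for later readers: the RivaTuner FPS limit mentioned above can also be set by editing the global profile file directly instead of through the UI. This is only a sketch, assuming a default RTSS install; the file is typically `Profiles\Global` under the RTSS program folder:)

```
[Framerate]
Limit=60
```

    Setting `Limit=0` disables the cap again. As noted in the thread, though, a frame limit alone reduces GPU load but does not stop GPU Boost from raising the clock, so the Afterburner curve lock is still needed to hold the frequency down.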
     
  11. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Well, if the GPU load is low enough, the card should automatically downclock itself or even drop to a lower performance state. So idk why it's still boosting to 1950MHz, which is above the stock Pascal max boost of 1911MHz (only achievable at temps below 40-something Celsius). Sounds like your GPU is overclocked and also has PowerMizer disabled.
     
  12. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    I use Thermal Grizzly Kryonaut and a Cooler Master Notepal U3. I wish I could swap in the heatsink from the P775DM3, as I think it would help with the GPU temps. The card, however, has a max boost of 1962MHz, so 1950MHz is not the top, but still way more than I want or need at the moment.
     


  13. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    You probably have the 90W OC vBIOS for the 1060 that has a higher boost clock than the 80W version.

    Nah, trust me you don’t want the heat bleed that comes with the unified heatsink.
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Unified heatsinks don't have some kind of heat flow penalty like that.
     
  15. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    IME it does. On my old, more unified heatsink, both CPU and GPU ran hotter while gaming and in other combined loads.
     
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Then that was just down to differences in design, adding a thermal bridge is not going to negatively impact cooling unless it is somehow impacting fin space or airflow.
     
  17. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Well, the proof is in the pudding: the P775TM1 and P870TM1 cool better than the previous P775DM3 and P870DM3, which had unified heatsinks.
     
  18. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    The 870TM has extra fin space, for one thing. I haven't paid too much attention to the 775 personally, but such a change would likely be related to that sort of thing.

    Energy can neither be created nor destroyed, so adding routes for it to go is not going to reduce your cooling.
     
  19. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Doesn’t change the fact that there are more and less efficient heatsink designs. I’ve seen people snip the two heatpipes connecting the CPU coldplate to the GPU vapor chamber on the P870DM3 and get better combined load temps. Same with unscrewing the connecting heatpipe on the MSI MS-16L13. Again, a unified heatsink is not optimal for combined workloads, and there is plenty of evidence out there. I know that as a Sager reseller you can’t say anything negative about unified heatsinks, considering every single current Clevo model now uses them, because it would be a conflict of interest, but it needed to be said.
     
  20. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    You want maximum heat flow in combined loads: one device will go up in temperature and the other will go down, or they will even out if their thermal load is the same.
     
  21. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Thermal load is almost never the same between high performance CPUs and GPUs, neither is thermal density, and the hotter component will raise the temps of the cooler component in a shared design.
     
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Yes, and the hottest part will get cooler as a result, the cooler part will get hotter, and it will balance out.
     
  23. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Yes, it balances out by both components running hotter. ;)
     
  24. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    How do you explain the physics of a heatpipe adding heat to a closed system?
     
    UzY3L likes this.
  25. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Not adding heat, but being less efficient in removing it from the heat producing components.
     
    UzY3L likes this.
  26. UzY3L

    UzY3L Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    Very informative, gentlemen. I was under the impression that if the CPU ran 20-30 degrees cooler, it would help lower the GPU temps too, but it seems to be a matter of heat and less heat, not heat and cold.

    Thanks for all your info. It seems, then, that the only ways to keep the GPU at a max of 70C are either to downclock it or to cut the plastic cover and direct the airflow from the Notepal U3 right at the GPU heatsink and fan.
     
  27. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    There is no such thing as cold, only a lack of heat; that's important to keep in mind with thermodynamics.