The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Average 980m overclock?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Phase, May 27, 2015.

  1. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    I know a lot of people have mentioned overclocked 980Ms still staying in the 70s (°C). What's the max overclock from anyone here on the forum?

    And my main question is: given the typical Clevo cooling systems and cool room temps, what kind of overclock is possible while keeping temps in the high 80s and low 90s (°C) before thermal throttling?

    I know it may decrease lifespan, but for someone who upgrades every year or two, how far can you push the 980M and keep it below 93°C?

    thanks
     
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    About 1450/1450 is the limit for my 980M SLI setup, and that is with AC cooling for benching. The 980M core runs cool, but other components do not. If you push them too hard you will end up with black screens, BSOD, hard locks and what not due to thermal issues with MOSFET or VRM temps even though the core is not getting that hot.

    Around 1325/1400 is as far as I can go without black screen freezes when I'm not using AC cooling. Even at those lower clock speeds, that's still a ton of horsepower.

     
    Dr. AMK likes this.
  3. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    My experiences are very similar to Mr. Fox's when it comes to OCing the 980M. My PSU can't take such an OC at all, though, with readings around 380 watts.

    I use, at best, a +100/100 core/memory OC but to be honest, I haven't had the need yet.
     
    Mr. Fox likes this.
  4. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    For gaming, that amount of overclock (+100/100) with 980M is substantial. Many, if not most, games lose stability with crazy high GPU overclocking anyway.

    That 3DMark 11 run in the spoiler pulled about 675W on my 900W CyberPower UPS digital power meter. The 980M is power efficient at stock clocks, but power demands get really wild with overclocking.
     
    Dr. AMK, Scerate and Prema like this.
  5. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Good thing my CPU is lousy enough to merely pull 47 watts, otherwise my PSU would explode :p

    But yeah, at stock clocks I rarely break 300 watts unless something is very demanding, like The Witcher 3 at 4K maxed out.
     
    Mr. Fox likes this.
  6. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Does having SLI reduce overclocking ability vs. a single card?

    My favorite game is Battlefield 4, and if I get a 980M laptop, I will HAVE to get a 4K screen with it. I know at stock it can do 4K BF4 at around 32-45 fps depending on the action. How many more fps would overclocking give when it comes to 4K in a game like that? That's just my biggest concern.
     
  7. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Why do you HAVE to get a 4K screen?
     
  8. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Photography. 4K video editing. More screen real estate. Plus it's only a few hundred extra to upgrade.
     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    FTFY
     
  10. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    SLI will impede your ability to overclock unless you have a dual power brick setup. My 980Ms at stock, along with my 4940MX locked to 47W TDP for consistent results, yield 321W peak power draw. Due to my heat issues I haven't tested the mod, but since temps go up 8°C with it, I'd imagine the power draw goes into the 350W+ range, especially since the voltage is higher (1.065V on both cards vs. 1.018V on the master and 1.043V on the slave).
     
    Dr. AMK likes this.
  11. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    haha so masters are lazy and the slaves work hard LOL
     
    jaybee83, 1nstance and ps95 like this.
  12. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    I can do +200 MHz (1326 MHz) on the core; +250 MHz crashes in Heaven (haven't tested above a 72.5 mV overvolt).

    With +200 MHz I get a 10-20% increase in min fps and 10% increase in avg fps:

    [image: benchmark results]
     
    Dr. AMK likes this.
  13. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I just discovered something interesting... If I do a +100/+100 OC my cards never drop below boost... In other words when they power limit throttle, they are throttling above stock boost... So never any lower than 1126MHz. Even better, temps are basically the same and this seems to be a stable overclock at stock voltage on 347.88. I'm not really seeing any reason to not run this 24/7?

    The lowest boost drop running Heaven for 20 minutes was 1152.5 with an average of 1208MHz.
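    A toy model of what is being described here, on the assumption that a clock offset shifts every bin of the boost table rather than just the top one; the throttle-bin values below are made up for illustration, and only the ~1126 MHz stock boost and the +100 offset come from the post:

    # Toy illustration (hypothetical bin values): an offset shifts the whole boost table,
    # so a power-limit throttle step that lands up to 100 MHz below stock boost still
    # ends up at or above stock boost once the +100 offset is applied.
    stock_boost = 1126   # MHz, stock boost clock mentioned in the post
    offset = 100         # MHz, the +100 core offset

    hypothetical_stock_bins = [1126, 1100, 1078, 1052]   # made-up throttle steps at stock
    for bin_clock in hypothetical_stock_bins:
        with_offset = bin_clock + offset
        note = "at/above stock boost" if with_offset >= stock_boost else "below stock boost"
        print(f"stock bin {bin_clock} MHz -> {with_offset} MHz with offset ({note})")
    # e.g. a 1052 MHz bin becomes 1152 MHz, in line with the ~1152.5 MHz low reported above.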
     
  14. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    It can. Both cards would have to overclock equally, and in SLI they also seem to be more sensitive to overclocking than a single card. Plus your cooling would have to be great; manufacturers can use a more aggressive cooler on a single-GPU design than what each card gets in a dual-GPU design due to size constraints.
     
    Dr. AMK likes this.
  15. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    Wow, your cards must have very high ASIC score (or voltage throttle) if they 'auto-undervolt' that much...(ASIC score controls stock 3D voltage, which for most people with Clevo MXM GTX980M is 1.062v and represents an ASIC of around 65%).
     
    Last edited: May 28, 2015
  16. Keith

    Keith Notebook Deity

    Reputations:
    889
    Messages:
    788
    Likes Received:
    221
    Trophy Points:
    56
    I can run mine at +150/+400 with the CPU set to 4.0 GHz on all cores with a single PSU. If I go any higher on the GPU clocks, I get a black screen. It makes no difference if I lower the memory clock, or even leave the memory clocks at stock. Overclocking the core above +150 causes a black screen for me.
     
  17. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    That depends on your system. Dual GPU takes twice as much power, but delivers a massive increase in performance and bandwidth. For example, 780M SLI running stock still outperforms a single 980M running with a substantial overclock. AC adapters are a limiting factor whether you have single or dual GPU. The stock AC adapters that ship with most systems are inadequate for heavy overclocking. You need a 330W AC adapter for limit-free overclocking with a single 780M or 980M. With 780M SLI and 980M SLI, dual 330W AC adapters are just barely enough for massive overclocking.
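    For a rough sense of those power budgets, here is a minimal sketch using only the at-the-wall figures quoted elsewhere in this thread (321W stock from Ethrem, ~675W benching from the UPS reading above). Wall/UPS readings include PSU conversion losses and the rest of the system, so treat the headroom as ballpark only:

    # Ballpark adapter-headroom check (sketch) using wattage figures quoted in this thread.
    # Wall/UPS readings include conversion losses and other components, so this only
    # gives a rough comparison, not the true load on the AC adapter itself.
    adapters = {"single 330W": 330, "dual 330W": 660}
    reported_draws = {
        "980M SLI + 4940MX at stock (Ethrem)": 321,
        "980M SLI heavy OC bench run (UPS reading)": 675,
    }
    for load, watts in reported_draws.items():
        for name, capacity in adapters.items():
            headroom = capacity - watts
            status = "fits" if headroom >= 0 else "over budget"
            print(f"{load}: {watts}W vs {name} -> {headroom:+d}W headroom ({status})")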

    If you want the best experience in 4K gaming, nothing short of dual GPU would be ideal for a desktop or laptop. There are some multi-GPU haters out there that vehemently argue that single GPU is better, and say that too many problems come along with SLI and CrossFire. After owning 5 different laptop models with dual GPU I can tell you that, for me, no single GPU machine will suffice. If and when you do run into a situation where a game is a coding abortion that does not support CrossFire or SLI, or seriously misbehaves with it, the solution is simple. Take 3 seconds to disable it, play that crappy game until you get sick of it, then turn it back on again. You can enjoy the best of both worlds. With one GPU, you can't. You are going to see far more benefit running BF4 with SLI at stock clocks than you would ever begin to realize with a single overclocked GPU. There is no substitute for horsepower and brute force.

    [image: 780M SLI stock benchmark result]
    - versus this -
    [image: single overclocked 980M benchmark result]
     
  18. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    You discovered my secret ;)

    Those clocks are like a sweet spot for me, and they don't really require an overvolt, or at most a +12 mV. At least for me.
     
    Dr. AMK and Ethrem like this.
  19. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Well, if they made a 17-inch laptop with a 4K screen and SLI GPUs, I'd go for that instantly. Too bad it's not available.

    And you say SLI offers more bandwidth? Aorus is making that 15-inch laptop with a 4K screen that has 965Ms in SLI. The benchmarks say it's better than a single 980M. And for 4K gaming, it would help more in that aspect too because of bandwidth?
     
  20. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Just wait a few months. ;)
     
    Mr. Fox likes this.
  21. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Q4, according to Prema...
     
  22. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    For 4K screens in 17-inch notebooks!?!?!?
     
  23. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    Don't waste your money on that. A single 980M is going to struggle with 4K, and 965M SLI probably won't fare much better. There are also some thermal issues to contend with on the thin and light options. Plus, the low-TDP BGA CPUs offered in those and similar machines aren't going to work as well for the other tasks you are interested in doing (photography and 4K video editing).

    ^^^^This... if you are not going to use a 4K external monitor, see what shakes out. If you are planning to use an external 4K display, then grab one of the Clevo SLI monsters with 980M SLI now if you are in a hurry.
     
  24. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Guess I'll have to wait. Any chance of NVIDIA releasing their 980M replacement, the 1080M or 990M or whatever it'll be called, during the summer? The 780M came out in the summer. I feel like it's a possibility with Windows 10 coming out and NVIDIA putting out new GPUs to ride the hype machine.
     
  25. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    The only danger in waiting is that you don't know if what you are waiting for is really going to deliver the goods or become another one of the power-crippled throttling turds we have come to expect. NVIDIA is doing progressively horrible things to their own products, so all bets are off on the future with power throttling and clock-blocked abortions.

    With the Clevo P570WM (Eurocom Panther) you can count on having the best experience possible in 4K laptop gaming with a throttle-free beast and a hexacore CPU. The caveat is the 4K external display. Having used QHD+ on two machines for several months, I can tell you (personal opinion... YMMV, but remember you heard it here first, LOL) anything higher than 2560x1440 on a screen 17" or smaller sucks, and 15" is miserable. I don't know why people are going for these insane resolutions, especially on tiny laptop screens. Display scaling only looks right at 100% (some are happy with 125%), but text is too small to read comfortably, even if you have great eyesight. Above 100% display scaling, text in dialog boxes and menus starts to become truncated and you end up losing a huge portion of the screen real estate that is so valuable in higher resolutions. If you stick with a 1080p laptop display you will have a more serviceable unit after the warranty runs out and won't have a detached retina trying to read text.
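    To put rough numbers on the lost-real-estate point (a sketch, assuming Windows-style DPI scaling where the usable layout space is roughly the native resolution divided by the scale factor):

    # Approximate usable desktop layout space on a 4K panel at common scaling factors.
    # Effective workspace ~= native resolution / scale factor.
    native_w, native_h = 3840, 2160
    for scale in (1.00, 1.25, 1.50, 2.00):
        eff_w, eff_h = round(native_w / scale), round(native_h / scale)
        print(f"{scale:.0%} scaling: roughly {eff_w}x{eff_h} of layout space")
    # At 200% scaling a 4K panel gives about the same layout space as a 1080p screen,
    # just rendered more sharply.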

    The other thing to consider is the premium price tag for a replacement native-resolution panel. Even assuming the replacement part is a high quality product and readily available, if you accidentally break that 4K laptop screen (always possible and not uncommon), a replacement 4K panel is going to cost a ton more than a 1080p display. Be sure to get an accidental damage warranty that covers the screen, or you might be using an external display whether you want to or not.
     
    1nstance and yegg55 like this.
  26. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    I like the "retina" displays out there. Apple's 15-inch MacBook Pro with its 1800p screen looks amazing. The PPI would be even higher on a 4K screen. Plus editing and viewing 4K video natively would be nice, along with editing 18-megapixel photos.

    I don't know if anyone has done this, but in Battlefield 4 you could set the resolution to native 4K, then use the resolution scale and bring it down to 150 percent or 175 percent. I'm assuming it'll still look better than native 1080p.
     
  27. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Setting resolution scale above 100% is supersampling. So at 150% it would be 4K x (1.5 x 1.5) and at 175% it would be 4K x (1.75 x 1.75). I'm not sure if even Titan X SLI can handle 2.25 x 4K and 3.1 x 4K.
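    A quick check of that arithmetic, assuming "4K" here means 3840x2160; the resolution-scale percentage applies per axis, so the pixel count grows with its square:

    # BF4-style resolution scale: the percentage is applied to each axis,
    # so total pixels scale with the square of the slider value.
    base_w, base_h = 3840, 2160
    for scale in (1.50, 1.75):
        w, h = int(base_w * scale), int(base_h * scale)
        factor = (w * h) / (base_w * base_h)
        print(f"{scale:.0%}: {w}x{h} ({factor:.2f}x the pixels of native 4K)")
    # 150% -> 5760x3240 (2.25x), 175% -> 6720x3780 (~3.06x)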
     
  28. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    78.8% master and 72.7% slave

    I'm pretty sure that they would run stock @ 1v flat if I could undervolt. There was a reason I kept bugging you about it lol! ;)

    @ryzeki Nice secret. Interesting that it tricks the throttle...
     
    Last edited: May 28, 2015
    Prema likes this.
  29. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,660
    Trophy Points:
    931
    Personally, I think everything looks terrible at those higher scalings, and you basically forfeit the usable screen real estate, so it really diminishes the value of having the higher resolution. Add to that the hit in performance mentioned by @octiceps and you've more or less negated the compelling benefits of having the higher resolution display. On both QHD+ machines I used for a few months, I did a lot of adjusting trying to get things just right and finally gave up and set the resolution to 1080p. QHD+ at 100% scaling on 15" looked really amazing unless you were trying to read text, and then it sucked. Everything was crystal clear and sharp, but VERY TINY.
     
  30. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Whoops. I meant bring it down to like 75 percent, not 175, when you're at 4K.
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You would have to reduce resolution scale to 50% since 1080p is 1/4 of 4K and 0.5 x 0.5 = 0.25
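    A one-line check of that, again assuming 4K = 3840x2160:

    # 50% per-axis scale => 0.5 * 0.5 = 0.25 of the pixels: exactly 1080p from a 4K base.
    print(int(3840 * 0.5), int(2160 * 0.5))   # 1920 1080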
     
  32. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Well, I'm saying: would putting the resolution scale halfway between 1080p and 4K, while at 4K native resolution, still look better than 1080p at 1080p native?
     
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    So at 141%? Idk. What would look better on a 4K screen, 1920x1080 or 2715x1527? My money is on 1527p but is it worth halving performance? 2x isn't exactly great edge smoothing even if it is SSAA. You might find 1080p w/4x MSAA to look and perform better.
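    The arithmetic behind the 2715x1527 figure (a sketch, assuming 1080p = 1920x1080 and 4K = 3840x2160): halving the pixel count of 4K means scaling each axis by sqrt(1/2) ≈ 0.707, which is the same render target as roughly 141% of 1080p:

    import math

    # Render resolution halfway between 1080p and 4K in total pixel count.
    w4k, h4k = 3840, 2160
    axis_scale = math.sqrt(0.5)            # halve the pixels => sqrt(1/2) per axis
    w, h = round(w4k * axis_scale), round(h4k * axis_scale)
    print(w, h)                            # ~2715 x 1527
    print(round(w / 1920, 3))              # ~1.414, i.e. the "141%" scale relative to 1080p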
     
  34. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    In my experience, anything over 1200p shows little gain in smoothing out the edges since the resolution is so high anyway... I can't imagine that you'd really notice any jaggies at 4K...
     
  35. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    My daily overclock is +125/+100.

    Anything over that on the memory produces artifacts. Anything over that on the core results in driver-level crashes at stock voltage. I have yet to have a BSOD-level crash whilst playing around with overclocking. I'm limited by the voltage and my PSU: upping the voltage means the system consumes over the 180W that my PSU is rated for. It never crashed on me, but I'd rather not take the risk of blowing my PSU up.

    P.S. 4K UHD is awesome!
     
  36. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    You know... It is crazy how much of an impact AA has on temps... TW2 ultra + ubersampling w/ vsync brought my 980Ms to 84°C with max fans in minutes. Just turning off ubersampling brought my max temp down to 61°C without the fans kicking in.

    Same story with Sleeping Dogs Definitive Edition: a 22°C temperature drop going from Extreme to High.

    So much for efficiency. But at least I found a workaround to keep my temps low. The GPUs don't power limit throttle as much either, so my +100 overclock actually stays at least +50 all the time. Not bad for stock voltage on the stock vBIOS.
     
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    What was your GPU utilization with and without SSAA? Like 99% with/50% without? :p
     
  38. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Doesn't really matter when the settings don't produce any real noticeable quality change.
     
  39. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Just curious how GPU load affected temps for you is all, since turning AA on/off with V-Sync engaged would have a massive impact on how hard your GPU is working.

    And yeah, Ubersampling is notoriously poor quality. TW2's MLAA and post process sharpening filter are already quite good. Sleeping Dogs though is a jaggy fest on anything lower than High AA. I envy your eyes because some kinds of aliasing I cannot unsee :p. Temporal shimmer on foliage is really what kills me.
     
  40. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I'll check it later. Don't feel like messing with it right now

    And I agree with both of your points but Square Enix seems to have fixed the high setting on the Definitive Edition. It still looks quite good on high compared to ultra. That wasn't the case for the original game.
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah, that's what I said. High and Extreme AA look good. It's Normal AA that's crappy. Definitive Edition didn't change any of the AA settings compared to the original. It didn't do much of anything really, except chop off 1/3 of the performance with some questionable visual changes.
     
  42. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Well, High AA gives a 20°C+ decrease from Extreme with no real quality change, so I'll take it.

    Oddly enough, TR 2013 on Ultimate runs around the same temp as the High setting in Sleeping Dogs. Sometimes I think my system has borked thermal sensors... TressFX should be stressful.
     
  43. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    I never use AA in BF4 anyway; it's not that noticeable on a 15-inch screen. The 980M gets like 75 fps with 4x MSAA, while the 880M is rated at 45 fps. I play with it off and vsync on and it stays at a constant 60 fps. I'm sure the 980M with AA turned off would still allow 60 fps somewhere between 1080p and 4K.
     
  44. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    It seems the best I can do is 1303/1450.
     
  45. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Do you guys overvolt when overclocking for everyday gaming? Or is all the overvolting just for benchmarks? Just curious as I could easily squeeze more juice out of my lemon if I was willing to risk an overvolt.
     
  46. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    So far I'm gaming at stock. When I overclock +200 MHz I need a +12.5 mV overvolt.
    I might overclock when I've moved from Witcher 2 to Witcher 3.
     
  47. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    Yep, looks like +100/+300 is the best I can do.

    EDIT: Still tweaking the settings; probably the best "best" is +100/+395.
     
    Last edited: Jun 7, 2015
  48. menko

    menko Notebook Consultant

    Reputations:
    15
    Messages:
    114
    Likes Received:
    26
    Trophy Points:
    41
    I can't get much more than a 150 MHz core overclock, not even with higher voltage. Temps are good, below 80°C.

    Is it normal, given my temps, that I can't push it higher?
     
  49. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    +250 on the core and +450 on the memory w/ a .5 overvolt has been stable for me. I'm not too worried about the overvolt since the hardware can handle it; it's the same core as the desktop 980 with shaders disabled, and that still uses more voltage than my overvolted setting. Temps are stable.

    As for reducing the life of the GPU? I plan on upgrading it when the next major release from NVIDIA or AMD comes out. Probably NVIDIA, unless AMD can prove otherwise. But definitely not the upcoming AMD part; it would have to be next year's. 14nm and nothing less for the next upgrade.
     
  50. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    Finally, someone else who doesn't care about GPU life. I plan on upgrading every year, two years tops. I feel like having a GPU that's several years old is a waste since it can't max out new game settings, so I prefer to get the most out of my system.
     