I know a lot of people have mentioned overclocked 980Ms still staying in the 70s (°C). What's the max overclock from anyone here on the forum?
And my main question: given typical Clevo cooling systems and cool room temps, what kind of overclock can you run while keeping temps in the high 80s/low 90s before thermal throttling?
I know it may decrease lifespan, but for someone who upgrades every year or two, how far can you push the 980M and keep it below 93°C?
thanks
-
About 1450/1450 is the limit for my 980M SLI setup, and that is with AC cooling for benching. The 980M core runs cool, but other components do not. If you push them too hard you will end up with black screens, BSODs, hard locks and whatnot due to thermal issues with MOSFET or VRM temps, even though the core is not getting that hot.
Around 1325/1400 is as far as I can go without black-screen freezes when I'm not using AC cooling. Even at those lower clock speeds, that's still a ton of horsepower.
-
My experiences are very similar to Mr. Fox's when it comes to OCing the 980M, though my PSU can't take such an OC at all, with readings around 380 W.
I use, at best, a +100/+100 core/memory OC, but to be honest, I haven't had the need yet. -
For gaming, that amount of overclock (+100/100) with 980M is substantial. Many, if not most, games lose stability with crazy high GPU overclocking anyway.
That 3DMark 11 run in the spoiler pulled about 675W on my 900W CyberPower UPS digital power meter. The 980M is power-efficient at stock clocks, but power demands get really wild with overclocking. -
Good thing my CPU is lousy enough to merely pull 47 watts, otherwise my PSU would explode.
But yeah, at stock clocks, I rarely break 300 W unless something is very demanding, like The Witcher 3 at 4K maxed out. -
Does having SLI reduce overclocking ability vs. a single card?
My favorite game is Battlefield 4, and if I get a 980M laptop I will HAVE to get a 4K screen with it. I know at stock it can do 4K BF4 at around 32-45 fps depending on the action. How many more fps would overclocking give at 4K in a game like that? That's just my biggest concern. -
Why do you HAVE to get a 4K screen?
-
Photography. 4K video editing. More screen real estate. Plus it's only a few hundred extra to upgrade to.
-
FTFY
-
SLI will impede your ability to overclock unless you have a dual-power-brick setup. My 980Ms at stock, along with my 4940MX locked to 47W TDP for consistent results, yield a 321W peak power draw. Due to my heat issues I haven't tested the mod, but since the temps go up 8°C with it, I'd imagine the power draw goes into the 350W+ range, especially since the voltage is higher (1.065v on both cards vs. 1.018v on the master and 1.043v on the slave).
-
I can do +200 MHz (1326 MHz) on the core; +250 MHz crashes in Heaven (haven't tested above a 72.5 mV overvolt).
With +200 MHz I get a 10-20% increase in min fps and a 10% increase in avg fps.
-
I just discovered something interesting... if I do a +100/+100 OC, my cards never drop below boost. In other words, when they power-limit throttle, they are throttling above stock boost, so never any lower than 1126 MHz. Even better, temps are basically the same, and this seems to be a stable overclock at stock voltage on 347.88. I'm not really seeing any reason not to run this 24/7.
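If anyone wants to double-check this behavior on their own cards, here's a rough monitoring sketch using NVML's Python bindings (my assumptions: the nvidia-ml-py package is installed, your driver exposes these readings, and GPU index 0 plus the 1126 MHz stock boost figure match your setup):
```python
# Rough sketch: sample core clock, temp and power once a second and flag
# any dip below stock boost. Assumes nvidia-ml-py (pynvml) is installed
# and the driver supports these queries; adjust the index for SLI cards.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetClockInfo, nvmlDeviceGetTemperature,
                    nvmlDeviceGetPowerUsage, NVML_CLOCK_GRAPHICS,
                    NVML_TEMPERATURE_GPU)

STOCK_BOOST_MHZ = 1126  # 980M stock boost, per the numbers above

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    for _ in range(120):  # two minutes of one-second samples
        clock = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)   # MHz
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # deg C
        watts = nvmlDeviceGetPowerUsage(gpu) / 1000.0  # API reports mW
        note = "  <-- below stock boost" if clock < STOCK_BOOST_MHZ else ""
        print(f"{clock:4d} MHz  {temp:2d} C  {watts:5.1f} W{note}")
        time.sleep(1)
finally:
    nvmlShutdown()
```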
The lowest boost drop running Heaven for 20 minutes was 1152.5 MHz, with an average of 1208 MHz. -
It can. It means both cards have to overclock equally, and with SLI they also seem to be more sensitive to overclocking than a single card. Plus your cooling has to be great; manufacturers can use a more aggressive cooler on a single card than what each card gets in a dual setup due to size constraints.
-
Wow, your cards must have a very high ASIC score (or voltage throttling) if they 'auto-undervolt' that much... (ASIC score controls stock 3D voltage, which for most people with a Clevo MXM GTX 980M is 1.062v and represents an ASIC of around 65%).
-
I can run mine at +150/+400 with the CPU set to 4.0 GHz on all of the cores with a single PSU. If I go any higher on the GPU clocks, I get the black screen. It makes no difference if I lower the memory clock, or even leave the memory clocks at stock. Overclocking the core above +150 causes a black screen for me.
-
That depends on your system. Dual GPU draws twice as much power, but delivers a massive increase in performance and bandwidth. For example, 780M SLI running stock still outperforms a single 980M running with a substantial overclock. AC adapters are a limiting factor whether you have a single or dual GPU. The stock AC adapters that ship with most systems are inadequate for heavy overclocking. You need a 330W AC adapter for limit-free overclocking with a single 780M or 980M. With 780M SLI and 980M SLI, dual 330W AC adapters are just barely enough for massive overclocking.
If you want the best experience in 4K gaming, nothing short of dual GPU would be ideal for a desktop or laptop. There are some multi-GPU haters out there that vehemently argue that single GPU is better, and say that too many problems come along with SLI and CrossFire. After owning 5 different laptop models with dual GPU I can tell you that, for me, no single GPU machine will suffice. If and when you do run into a situation where a game is a coding abortion that does not support CrossFire or SLI, or seriously misbehaves with it, the solution is simple: take 3 seconds to disable it, play that crappy game until you get sick of it, then turn it back on again. You can enjoy the best of both worlds. With one GPU, you can't. You are going to see far more benefit running BF4 with SLI at stock clocks than you would ever begin to realize with a single GPU overclocked. There is no substitute for horsepower and brute force.
-
You discovered my secret.
Those clocks are like a sweet spot for me, and they don't really require an overvolt, or at most a +12 mV one. At least for me. -
Well, if they made a 17-inch laptop with a 4K screen and SLI GPUs, I'd go for that instantly. Too bad it's not available.
And you say SLI offers more bandwidth? Aorus is making that 15-inch laptop with a 4K screen and 965Ms in SLI. The benchmarks say it's better than a single 980M. And for 4K gaming, it would help more in that aspect too because of the bandwidth? -
Q4, according to Prema...
-
For 4K screens in 17-inch notebooks!?!?!?
-
Don't waste your money on that. A single 980M is going to struggle with 4K, and 965M SLI probably won't fare much better. There are also some thermal issues to contend with on the thin-and-light options. Plus, the low-TDP BGA CPUs offered in those and similar machines aren't going to work as well for the other tasks you are interested in doing (photography and 4K video editing).
^^^^This... if you are not going to use a 4K external monitor, see what shakes out. If you are planning to use an external 4K display, then grab one of the Clevo SLI monsters with 980M SLI now if you are in a hurry. -
Guess I'll have to wait. Any chance of NVIDIA releasing their 980M replacement, the 1080M or 990M or whatever it'll be called, during the summer? The 780M was a summer release. I feel like it's a possibility with Windows 10 coming out and NVIDIA putting out new GPUs to ride the hype machine.
-
The only danger in waiting is, you don't know if what you are waiting for is really going to deliver the goods or become another one of the power-crippled throttling turds we have come to expect. NVIDIA is doing progressively horrible things to their own products, so all bets are off on the future with power throttling and clock-blocked abortions.
With the Clevo P570WM (Eurocom Panther) you can count on having the best experience possible in 4K laptop gaming with a throttle-free beast and a hexacore CPU. The caveat is the 4K external display. Having used QHD+ on two machines for several months, I can tell you (personal opinion... YMMV, but remember you heard it here first, LOL) anything higher than 2560x1440 on a screen 17" or smaller sucks, and 15" is miserable. I don't know why people are going for these insane resolutions, especially on tiny laptop screens. Display scaling only looks right at 100% (some are happy with 125%), but text is too small to read comfortably, even if you have great eyesight. Above 100% display scaling, text in dialog boxes and menus starts to become truncated, and you end up losing a huge portion of the screen real estate that is so valuable at higher resolutions. If you stick with a 1080p laptop display you will have a more serviceable unit after the warranty runs out and won't get a detached retina trying to read text.
The other thing to consider is the premium price tag for a native 4K display. Even assuming the replacement part is a high-quality product and readily available, if you accidentally break that 4K laptop screen (always possible and not uncommon) a replacement 4K panel is going to cost a ton more than a 1080p display. Be sure to get an accidental damage warranty that covers the screen, or you might be using an external display whether you want to or not. -
I like the 'retina' displays out there. Apple's 15-inch MacBook Pro has an 1800p screen and looks amazing. The PPI would be even higher on a 4K screen. Plus editing and viewing 4K video natively would be nice, along with editing 18-megapixel photos.
I don't know if anyone has done this, but in Battlefield 4 you could set the resolution to native 4K, then use the resolution scale and bring it down to 150 percent or 175 percent. I'm assuming it'll still look better than native 1080p. -
Setting resolution scale above 100% is supersampling. So at 150% it would be 4K x (1.5 x 1.5) and at 175% it would be 4K x (1.75 x 1.75). I'm not sure if even Titan X SLI can handle 2.25 x 4K and 3.1 x 4K.
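To put numbers on that, here's a quick Python sketch of the pixel math (assuming the resolution scale multiplies each axis linearly, which is how BF4 appears to treat it):
```python
# Resolution scale is per-axis, so the pixel count (and roughly the GPU
# load) grows with the square of the scale factor.
base_w, base_h = 3840, 2160  # 4K UHD

for scale in (0.5, 1.0, 1.5, 1.75):
    w, h = round(base_w * scale), round(base_h * scale)
    print(f"{scale:.0%}: renders {w}x{h} = {scale * scale:.2f}x the pixels of 4K")
```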
-
Personally, I think everything looks terrible at those higher scalings, and you basically forfeit the usable screen real estate, so it really diminishes the value of having the higher resolution. Add to it the performance hit mentioned by @octiceps and you've more or less negated the compelling benefits of having the higher-resolution display. On both QHD+ machines I used for a few months I did a lot of adjusting, trying to get things just right, and finally just gave up and set the resolution to 1080p. QHD+ at 100% scaling on 15" looked really amazing unless you were trying to read text, and then it sucked. Everything was crystal clear and sharp, but VERY TINY.
-
Whoops, I meant bring it down to like 75 percent, not 175, when you're at 4K.
-
You would have to reduce resolution scale to 50% since 1080p is 1/4 of 4K and 0.5 x 0.5 = 0.25
-
Well, I'm saying: would putting the resolution scale halfway between 1080p and 4K, while at 4K native resolution, still look better than 1080p at 1080p native?
-
So at 141%? Idk. What would look better on a 4K screen, 1920x1080 or 2715x1527? My money is on 1527p but is it worth halving performance? 2x isn't exactly great edge smoothing even if it is SSAA. You might find 1080p w/4x MSAA to look and perform better.
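For anyone wondering where 2715x1527 comes from, it's the halfway point by pixel count: a √2 (~141%) per-axis scale of 1080p. A quick check (hypothetical numbers, just working the arithmetic):
```python
# Halfway between 1080p and 4K in pixel count is a sqrt(2) (~141%)
# per-axis scale of 1080p, which works out to roughly 2715x1527.
scale = 2 ** 0.5
w, h = round(1920 * scale), round(1080 * scale)
print(w, h)                                       # 2715 1527
print(f"{w * h / (1920 * 1080):.2f}x the pixels of 1080p")  # ~2.00x
```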
-
In my experience, anything over 1200p shows little gains in smoothing out the edges since the resolution is so high anyway... I can't imagine that you'd really notice any jaggies at 4k...
-
My daily overclock is +125/+100.
Anything over that on the memory produces artifacts; anything over on the core results in driver-level crashes at stock voltage. I have yet to have a BSOD-level crash while playing around with overclocking. I'm limited by the voltage and my PSU: upping the voltage means the system consumes over the 180W my PSU is rated for. It never crashed on me, but I'd rather not risk blowing my PSU up.
P.S. 4K UHD is awesome! -
You know... It is crazy how much of an impact AA has on temps... TW2 ultra + ubersampling w/ vsync brought my 980Ms to 84C with max fans in minutes. Just turning off ubersampling brought my max temp down to 61C without the fans kicking in.
Same story with Sleeping Dogs: Definitive Edition. A 22°C temperature plummet going from Extreme to High.
So much for efficiency. But at least I found a workaround to keep my temps low. The GPUs don't power-limit throttle as much either, so my +100 overclock actually stays at least +50 all of the time. Not bad for stock voltage on the stock vBIOS. -
What was your GPU utilization with and without SSAA? Like 99% with/50% without?
-
Doesn't really matter when the settings don't produce any real noticeable quality change.
-
Just curious how GPU load affected temps for you is all, since turning AA on/off with V-Sync engaged would have a massive impact on how hard your GPU is working.
And yeah, Ubersampling is notoriously poor quality. TW2's MLAA and post-process sharpening filter are already quite good. Sleeping Dogs though is a jaggy fest on anything lower than High AA. I envy your eyes, because some kinds of aliasing I cannot unsee. Temporal shimmer on foliage is really what kills me.
-
I'll check it later. Don't feel like messing with it right now.
And I agree with both of your points, but Square Enix seems to have fixed the High setting in the Definitive Edition. It still looks quite good on High compared to Ultra. That wasn't the case for the original game. -
Yeah, that's what I said. High and Extreme AA look good; it's Normal AA that's crappy. The Definitive Edition didn't change any of the AA settings compared to the original. It didn't do much of anything, really, except chop off 1/3 of the performance with some questionable visual changes.
-
Well, High AA gives a 20°C+ decrease from Extreme with no real quality change, so I'll take it.
Oddly enough, Tomb Raider 2013 on Ultimate runs at around the same temp as the High setting in Sleeping Dogs. Sometimes I think my system has borked thermal sensors... TressFX should be stressful.
I never use AA in BF4 anyway; it's not that noticeable on a 15-inch screen. The 980M gets like 75 fps with 4x MSAA; the 880M is rated at 45 fps. I play with it off and V-Sync on, and it stays at a constant 60 fps. I'm sure the 980M with AA turned off would still allow 60 fps somewhere between 1080p and 4K. -
-
It seems the best I can do is 1303/1450.
-
Do you guys overvolt when overclocking for everyday gaming? Or is all the overvolting just for benchmarks? Just curious as I could easily squeeze more juice out of my lemon if I was willing to risk an overvolt.
-
So far I'm gaming at stock. When I overclock +200 MHz I need a +12.5 mV overvolt.
I might overclock when I've moved from Witcher 2 to Witcher 3. -
Yep, looks like +100/+300 is the best I can do.
EDIT: Still tweaking the settings; probably the best "best" is +100/+395. -
I can't get more than a 150 MHz core overclock, not even with higher voltage. Temps are good, below 80°C.
Is it normal, looking at my temps, that I can't push it higher? -
+250 on the core and +450 on the memory w/ a .5 overvolt has been stable for me. I'm not too worried about the overvolt since the hardware can handle it; it's the same core as the desktop 980 with some shaders disabled, and that still uses more voltage than my overvolt. Temps are stable.
As for reducing the life of the GPU? I plan on upgrading it when the next major update from NVIDIA or AMD is released. Probably NVIDIA, unless AMD can prove me otherwise. But definitely not the upcoming AMD release; it would have to be next year's. 14nm and nothing less for the next upgrade. -
Finally, someone else who doesn't care about GPU life. I plan on upgrading every year, two years tops. I feel like having a GPU that's several years old is a waste when it can't max out new game settings, so I prefer to get the most out of my system.