Hello all,
I'm currently in the process of undervolting and underclocking the i7-4720HQ in my P35X v3 (GTX 980M) to reduce CPU temps, as I'm constantly hitting 95°C in several games and thermal throttling, which causes stutters. I haven't repasted yet, but I will do so in the near future with Gelid GC-Extreme, which I expect to bring temps down by up to 5°C. I want to underclock as far as I can get away with and leave it there, so as to keep temps below the throttling range and prevent the stutters.
Currently I've undervolted the dynamic CPU voltage by -60mV and I'm running a 3.2GHz turbo on 4 cores with a 3.1GHz cache. (That seems like a weak undervolt compared to others, sadly; -70mV crashes in daily use. I have seen users with 4710HQs undervolting by -90mV while running 3.3GHz on 4 cores. I expected the 4720HQ to do better, since it's a later revision and Intel's manufacturing process should be more refined by now, producing better silicon overall.)
When playing War Thunder ground forces at 3K ultra I manage 60fps with these settings, but at anything lower than 3.2GHz the CPU becomes the bottleneck and fps drops significantly.
I'd like to know your findings on the minimum CPU speeds needed in order not to bottleneck a 980M in games.
Kind regards,
Arthedes
-
When I think about it, the factory voltage of the 4720HQ might be lower than the 4710HQ's. What's your voltage when running 3.2GHz?
Mine's 1.014V @ 3.2GHz with -60mV
and 1.034V @ 3.4GHz with -40mV
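Side note on what those voltage points mean for heat: dynamic CPU power scales roughly with frequency times voltage squared, so even small offsets add up. Below is a minimal back-of-the-envelope sketch in Python using the voltages quoted above; the 1.074V "stock" figure is just my assumption of measured voltage plus the 60mV offset, and leakage power is ignored entirely.

```python
# Rough dynamic-power comparison: P ~ C * f * V^2, where the capacitance term C
# cancels out when we only look at ratios. Leakage power is ignored, so this is
# a ballpark estimate, not a measurement.

def rel_power(f_ghz: float, volts: float, f_ref: float, v_ref: float) -> float:
    """Dynamic power relative to a reference operating point."""
    return (f_ghz / f_ref) * (volts / v_ref) ** 2

stock_32 = (3.2, 1.074)  # assumed stock voltage at 3.2 GHz (1.014 V + 60 mV offset)
uv_32 = (3.2, 1.014)     # measured with the -60 mV undervolt
uv_34 = (3.4, 1.034)     # measured with the -40 mV undervolt

print(f"-60 mV @ 3.2 GHz vs stock:        {rel_power(*uv_32, *stock_32):.2f}x dynamic power")
print(f"3.4 GHz/-40 mV vs 3.2 GHz/-60 mV: {rel_power(*uv_34, *uv_32):.2f}x dynamic power")
```

By that rough math the -60mV offset trims around 10% of the switching power at 3.2GHz, and the 3.4GHz point costs about 10% more than the 3.2GHz one, so treat it as a ballpark only.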
-
I do notice power limit throttling in some stress tests, but apparently that's because the HQ chip is hard limited to 47W, according to D2Ultima. I can't imagine it going above 47W in real-world applications like gaming or rendering, though, but that remains to be seen...
Definitely going to repaste, BTW; I'm going to use Gelid GC-Extreme. How much of a temperature drop can I expect from that? I find it hard to say, since it's throttling anyway.
-
I never said you couldn't adjust the current limit because you're using an HQ chip. I said your motherboard probably does not allow it.
Next, the clockspeed you need depends on the game. For GTA V at 1080p/60fps, 3.5GHz is not enough if you don't force a GPU bottleneck.
1080p/60 BF4 at max graphics shouldn't be a bottleneck at 3.2GHz.
1080p/60 Mass Effect 3 probably wouldn't even be an issue at 2.5GHz.
Please note: higher resolutions INCREASE CPU USAGE overall. It's not by much, but they do. People don't often notice, because it MUCH more likely forces a GPU bottleneck. But with that 980M you've got there and a TDP-locked CPU... well...
-
How come my processor can sometimes run at 58W, although not sustained, if the motherboard does not allow it?
-
So, any clock speeds you guys have found to be too low for certain games? What's the minimum you can get away with?
-
I use an i7-4710MQ at 2.5GHz at FHD resolution and I have not experienced bottlenecks in games. Sure, I lose a few fps, but not sudden drops like from 60 to 10 fps, which are pretty common with bottlenecks. I run it that way for less heat and fan noise. Actually, 3.0GHz is the best balance between heat, fan noise and performance for the i7-4710MQ, but I still prefer less heat and fan noise in exchange for a little performance. I can't even imagine overclocking my i7-4710MQ to 4.0GHz; that would be horrible when Haswell CPUs already run too hot. Maybe someday I will enjoy 4.0GHz CPU speeds with no heat or noise problems, after I sell my laptop and buy a desktop.
-
Whatever; 4.7GHz does not work with wPrime 1024.
-
FWIW I'm not the only one to achieve 4.7GHz on a 4700MQ, but perhaps it's a moot point. Maybe if I had a fine machine such as yours, Papusan, I could achieve a bit more. Also, ambient temps where I live of over 30°C most of the year don't help. Better if I lived in a country like Norway, where just opening a window can achieve ambient temps of -40°C, hahaha. Well, it felt like -40°C during the times I've been over there.
What's your best wPrime 32M? I see Mr. Fox's 4.9GHz run came in at just over 5 seconds; it should be under 5 seconds at 4.9GHz (~4.7s), so maybe there was some throttling, or it just wasn't run optimally. The 4930MX is IMO a better-binned chip, so results should be better, but running heavy loads at desktop CPU speeds with a much lower pin count is probably not such a good thing for the CPU.
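For what it's worth, the ~4.7s figure is what you get from simple inverse scaling with clockspeed. A quick sketch of that estimate, assuming perfect scaling and no throttling, and using the 5.351s @ 4.3GHz wPrime 32M result quoted just below as the reference:

```python
# wPrime time for a fixed workload scales roughly inversely with clockspeed,
# assuming no throttling and perfect per-clock scaling (real chips fall short).

def scaled_time(ref_time_s: float, ref_clock_ghz: float, new_clock_ghz: float) -> float:
    """Estimate run time at a new clock from a reference run."""
    return ref_time_s * ref_clock_ghz / new_clock_ghz

REF_TIME_S, REF_CLOCK_GHZ = 5.351, 4.3  # wPrime 32M result quoted below in this thread

for clock in (4.7, 4.9):
    print(f"Estimated wPrime 32M at {clock} GHz: {scaled_time(REF_TIME_S, REF_CLOCK_GHZ, clock):.2f} s")
```

So a 4.9GHz run landing at just over 5 seconds rather than ~4.7s does point to throttling, or to the run not being optimal.
-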
We don't need air conditioning at home, if you know what I mean. I normally run my 4930MX fine 24/7 at 4.3GHz, so I haven't tested higher clock speeds. I mostly run wPrime 1024 and Cinebench benchmarks, because I like tests that run for a longer time, and I compare against HQ processors.
... You know this is not a BGA processor with a turbo boost that can only run for 28 seconds before the clock speed drops. I have only tested at 4.3GHz and get a score of 5.351 in wPrime 32M and 167.560 in wPrime 1024. Many laptop processors (also BGA crap) can post an OK score in a short benchmark, but not if the test runs for a longer time.
@Dufus, have you noticed that many of the i7-4940MX processors have more problems reaching a decent overclock than the 4930MX? Mr. Fox and I think maybe they are binned worse. What is your conclusion on that?
-
Meaker@Sager Company Representative
High CPU clock speeds are needed for high-refresh-rate gaming rather than high-resolution gaming. For 60fps, 3GHz should do fine.
-
Would the CPU run at a lower clock speed if the higher speed isn't needed? For example, if in game A you get 40fps at 3.5GHz and also 40fps when clocked at 2.5GHz, would it run at 2.5GHz all the time, or stay at 3.5GHz?
I haven't really noticed this.
EDIT: I just ran AC:U in a busy area at 3K and I get 35fps with either 3.0GHz or 3.4GHz on 4 cores, which indicates that in this case the GPU is the bottleneck. However, I find it strange that the CPU would run at a higher clock speed and a higher temperature when the framerate stays the same: at 3.0GHz the CPU hits 85°C, and at 3.4GHz it hits the thermal cap of 95°C.
-
I thought a CPU bottleneck only occurs when one or more cores is 100% loaded. However, in Borderlands 2 I get fewer FPS at 2.6GHz than at 3.4GHz (even though at 3.4GHz it starts thermal throttling after a few minutes), even when the cores are not fully loaded. I know this because I run MSI Afterburner showing the load of all 8 of my logical cores individually, and not one of them is at 100%; it's more like 40%.
How do you know whether your CPU is the bottleneck or not?
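One way to answer that yourself is to log per-core load and check whether a single logical core is pinned while the overall average looks low. Below is a minimal sketch using the third-party psutil package; the 90% "pinned" threshold and the sampling settings are arbitrary choices, and with hyperthreading a saturated core often reads ~95% rather than 100%, so treat it as a hint rather than proof:

```python
# Minimal per-core load logger: one pinned logical core with a low overall
# average is a classic sign of a single-thread CPU bottleneck.
import psutil

def log_core_load(samples: int = 30, interval_s: float = 1.0, pinned_pct: float = 90.0) -> None:
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        overall = sum(per_core) / len(per_core)
        hottest = max(per_core)
        flag = "  <-- one core pinned?" if hottest >= pinned_pct else ""
        print(f"overall {overall:5.1f}% | hottest core {hottest:5.1f}%{flag}")

if __name__ == "__main__":
    log_core_load()
```

Run it in the background while playing: if the hottest logical core hugs its ceiling while the rest idle, raising the clock will usually raise FPS even though total CPU usage looks modest.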
-
With a BIOS that is functioning properly, an i7-47xxMQ processor is the best choice anyway. I don't bully socketed processors (they are the best choice); it's only the HQ aka BGA processors that are known as garbage... See what is written in my custom title under my avatar.
-
Higher clockspeeds (and voltage) automatically mean higher temperatures™ even if the load is the same (not 50% CPU vs 50% CPU, but the same work showing as 50% CPU at 1.5GHz vs 25% CPU at 3GHz, etc). If you are worried about temps and you find that game X is GPU-bottlenecked™, then you might as well lower the clockspeed via ThrottleStop or something.
As for why you don't see 100% core load even when the CPU is the limit:
1 - Games don't like to use "100%" of a single core when hyperthreading is enabled. This usually tops out at ~95% even when heavily stressed, so to your eyes it looks like more power is available.
2 - Games are sometimes designed for x# of cores. If they see more, they can split the load, and your CPU will appear to be quite un-stressed, but it's actually using all that it is coded to use, and thus clockspeed bumps give a nice boost to FPS (see the sketch after this list).
3 - Windows 8. Some games or programs have a distinct limit of the CPU power they're allowed to use. Windows 8 has screwed up by making "100% CPU" mean somewhere between 75% and 80% CPU. For example: HERE is me rendering with Vegas at Win 8's 100% CPU. And HERE is me using TS8's benchmark, which actually uses 100% CPU. So sometimes Windows 8'll think it's at 100% and a game will stop using the CPU or something, even though there's much more power available. HOPEFULLY Windows 10 fixes that, or you can use Windows 7 if you aren't interested in the benefits that Win 8 or Win 10 bring.
4 - Games might just be single-thread heavy. Mass Effect 3 kind of pissed me off like that. I had set it to lock to 125fps and it had trouble maintaining that, because 3.5GHz on a single core with all the graphics goodies I turned on via NVI wasn't fast enough (see previous statement about a 4.3GHz or faster CPU being wanted by me =D). This one, however, should be pretty easy to tell.
5 - Games could be made by Bohemia Interactive, and for some reason have a minimal load on both CPU and GPU and still run like a sloth with broken legs. For reasons yet unknown, overclocking seems to work.
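To put point 2 above in rough numbers: an engine that only ever keeps a fixed number of worker threads busy caps the "total CPU %" it can show on a 4-core/8-thread chip, even when those threads are the bottleneck. The thread counts below are purely illustrative, not measurements of any particular game:

```python
# If a game only ever keeps N threads busy, the "total CPU %" it can show on a
# 4-core/8-thread chip is capped well below 100%, even when those N threads are
# the bottleneck. Thread counts here are illustrative only.
LOGICAL_CORES = 8  # i7-47xxHQ/MQ: 4 cores / 8 threads

def max_total_usage(busy_threads: int, logical_cores: int = LOGICAL_CORES) -> float:
    """Upper bound on reported total CPU usage when only busy_threads are saturated."""
    return 100.0 * busy_threads / logical_cores

for threads in (1, 3, 5):
    print(f"{threads} saturated thread(s) -> at most ~{max_total_usage(threads):.1f}% total CPU")
```

That ~62.5% figure for five busy threads also happens to line up with the GTA V in-game number mentioned later in the thread.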
-
Wait, come to think of it, I've seen loads of 80-ish sometimes; how could that be explained?
-
So pretty much the aim would be to clock the CPU as high as possible.
After tinkering last night, my optimal setting is 3.0GHz. This way I'm constantly on the brink of thermal throttling, so the CPU stays at a constant 3.0GHz. If I clock to, say, 3.4GHz, thermal throttling kicks in and most of the time it clocks below 3GHz and then back up to 3.4GHz again (a rough comparison of the two profiles is sketched below).
I prefer the constant 3.0GHz to the inconsistent throttling.
Now it's time to order some GC Extreme and remove my crappy stock TIM application.
EDIT: would I benefit from disabling HT? Most games don't seem to utilise 8 threads even in this day and age, and I hear HT only brings temps up.
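On the steady-3.0GHz versus throttling-3.4GHz trade-off, a time-weighted average makes the comparison concrete. The 50/50 split and the 2.8GHz throttle clock below are made-up illustrative numbers, not measurements:

```python
# Time-weighted average of the clocks a profile actually sits at.
# The 50/50 split and the 2.8 GHz throttle clock are hypothetical numbers.
def avg_clock(segments: list[tuple[float, float]]) -> float:
    """segments: (fraction_of_time, clock_ghz) pairs whose fractions sum to 1.0."""
    return sum(frac * ghz for frac, ghz in segments)

steady_30 = [(1.0, 3.0)]                  # constant 3.0 GHz, no throttling
throttling_34 = [(0.5, 3.4), (0.5, 2.8)]  # 3.4 GHz, but throttled half the time

print(f"steady 3.0 GHz profile:     {avg_clock(steady_30):.2f} GHz average")
print(f"throttling 3.4 GHz profile: {avg_clock(throttling_34):.2f} GHz average")
```

Unless the throttling profile averages meaningfully above 3.0GHz, the steady clock delivers about the same effective speed with smoother frame pacing, which matches the preference above.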
-
That's an example of a predominantly single-threaded game, like this one. Most last-gen console ports are dual- or tri-threaded, since that's roughly the core count the PS3 and XB360 offered, with a large main thread due to DX9.
-
The option is not in my BIOS though...
-
Any other findings that could be interesting?
-
Guys, check out this comparison of processors and how they relate to resolution.
http://www.tweaktown.com/tweakipedi...80-sli-vs-gtx-980-sli-at-2560x1440/index.html
By their findings, increasing the resolution does not increase the CPU load.
-
My understanding is that increasing resolution increases the CPU hit per frame, but decreases overall CPU load by forcing a GPU bottleneck.
It's like... if you took a 4.5GHz Haswell i7 and two 980 Ti cards and ran something at 720p and then at 1080p, you'd probably find you get fewer FPS at 1080p without actually maxing the cards out at any point.
The increased CPU hit is so thoroughly masked by the reduced overall CPU load (due to the GPU bottleneck) that people don't consider it.
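A toy frame-time model makes this easier to see: each frame costs some CPU time and some GPU time, the slower of the two sets the framerate, and reported CPU "load" is roughly the fraction of each frame the CPU spends busy. All the millisecond figures below are invented for illustration:

```python
# Toy model: per-frame CPU and GPU costs in milliseconds. The slower stage sets
# the frame time; CPU utilisation is the share of each frame the CPU is busy.
# All numbers are invented to illustrate the argument, not measured.
def frame_stats(cpu_ms: float, gpu_ms: float) -> tuple[float, float]:
    frame_ms = max(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms, 100.0 * cpu_ms / frame_ms

scenarios = [
    ("720p  (GPU has headroom)", 10.0, 6.0),   # CPU-bound: CPU cost sets frame time
    ("1080p (GPU nearly full)",  11.0, 15.0),  # CPU cost per frame up a bit, but GPU-bound
]

for label, cpu_ms, gpu_ms in scenarios:
    fps, cpu_util = frame_stats(cpu_ms, gpu_ms)
    print(f"{label}: {fps:5.1f} fps, CPU busy ~{cpu_util:5.1f}% of each frame")
```

The per-frame CPU cost rises slightly with resolution, yet the measured CPU load falls because the GPU now sets the frame time, which is exactly the masking effect described above.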
-
If higher resolutions don't use more CPU power at all, then regardless of resolution, a CPU bottleneck is a CPU bottleneck. The parts where I drop from 60fps should still drop at 720p, because the CPU is the bottleneck, not the GPU. Does that make sense? If I don't get CPU limited at lower resolutions the same as at higher ones, then it should prove higher resolutions use more CPU power.
-
Also, funnily enough... GTA V can use a legit 99% of the CPU (and I do mean ~98-100% of all 8 threads; ThrottleStop shows C0% at 99%) when first starting up. After it loads its title screen etc. it seems to get locked to ~62% in-game. XD