I have never seen a mobile GPU with a 6000 MHz memory clock by default. Why is that? Even the latest high-end GTX 980M has a 5000 MHz memory clock, while many high-end desktop GPUs are clocked at 6000 MHz and some even at 7000 MHz.
I am currently running my GTX 870M overclocked to 6000 MHz, and I know there are many people running their GPUs at 6000 MHz with an overclock. It certainly adds considerable heat to the GPU, but the performance gains are also good.
I have been using the GTX 870M like this for six months, and now there are crackling noises coming from the GPU when it is stressed. I can't help but worry that the overclock is the reason. It could very well be something else, but that brings me back to the first question: is the lower stock clock for safety, or just for cutting down the power draw of mobile GPUs? I would like to know your opinions on that.
-
If it works without corruption or lockups, no issue. I am surprised they don't run 6000 MHz vRAM considering the cost of the cards compared with desktop ones. I can't imagine the power requirements or heat would be that much more significant.
-
I have overclocked my 970M before to around 6000 MHz, but that was really pushing it... God knows how my new one will do... I'll run some benches and see...
-
Set it to 5600 - problem solved. Way less risk, less heat, a little less performance.
-
For Kepler cards it's fine, I think. I've never seen someone with a 780M, 870M or 770M that couldn't handle 6000 MHz on the memory.
But Maxwell GPUs, at least the 980M, have low voltage vRAM and don't allow the memory clock to push that high normally. It's a similar scenario to the 680M.
Since you didn't list the GPU you were thinking about, I explained both sides. I'm not sure how AMD cards handle things but they're pretty irrelevant right now XD.
-
Meaker@Sager Company Representative
The reason they don't go faster is that the traces on MXM boards are more compact, which causes interference that limits the speed.
-
I meant the 700M series and newer. I didn't know the 980M has low-voltage vRAM. I'm sure I saw some people who managed to reach 6000 MHz with their GTX 970M.
-
Meaker@Sager Company Representative
Look at a desktop card, the chips are more spread out allowing greater spacing for all of the GDDR5 data lines, in MXM they are squished together creating greater interference (noise on the bus).
This is why overvolting the RAM chips does not really help. -
-
No problem for AMD. Always running at 6400.
-
Meaker@Sager Company Representative
Every sample is different. Also, AMD gets less out of that speed, so it could come down to how they handle the bus.
-
870M gets a bigger benefit from memory overclocking than 256-bit cards like 780M because its backend has been chopped off a bit, similar to 660 Ti vs. 670 for example. -
That kind of reinforces my belief that 160GB/s is the sweet spot for memory bandwidth for the most part. I rarely if ever noticed benefits going from 160GB/s to 192GB/s (5000MHz to 6000MHz), but since the 870M's base is 120GB/s, overclocking it from 5000MHz to 6000MHz bumps it to 144GB/s which is a lot better.
And then we have crap like the desktop 960, which even with Maxwell's memory benefits can't hit 160GB/s >____>
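For reference, those bandwidth figures come straight from data rate times bus width divided by eight. A quick sketch (the bus widths are the cards' published specs; the helper name and the treatment of the quoted "MHz" as GDDR5 effective data rate are my own assumptions):

```python
# Rough sketch of GDDR5 peak bandwidth: effective data rate (in MHz, as
# quoted in this thread) * bus width (bits) / 8 bits-per-byte, in GB/s.

def gddr5_bandwidth_gbps(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

# GTX 870M (192-bit bus): 120 GB/s stock, 144 GB/s at 6000 MHz
print(gddr5_bandwidth_gbps(5000, 192))  # 120.0
print(gddr5_bandwidth_gbps(6000, 192))  # 144.0
# GTX 780M (256-bit bus): 160 GB/s stock, the "sweet spot" mentioned above
print(gddr5_bandwidth_gbps(5000, 256))  # 160.0
```

That's why a 5000→6000 MHz overclock is a flat 20% bandwidth bump on any card; it just matters more when the starting point is 120 GB/s rather than 160 GB/s.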
Edit: I said 970M, but I meant 870M. Fixed it now. The 970M benefits too, though. Last edited: Aug 19, 2015 -
Meaker@Sager Company Representative
Yes, I am talking about adding voltage to get beyond 1500-1600 MHz memory clocks.
HBM will allow increases now as cores get more powerful to feed them. GDDR5 is about done on the high end. -
-
I wish I knew what is causing the crackling noises coming from the GPU. I want to be able to use my GPU overclocked. -
Meaker@Sager Company Representative
Noises are usually coil whine.
-
For example, I hear coil whine from my system when playing Witcher 2, but not Witcher 3. -
Get old enough, or lose enough of your high-frequency hearing, and you won't notice any coil whine.
-
It isn't a constant noise like this one; it happens randomly at high loads and temperatures.
Is 6000 MHz memory clock too risky for a mobile GPU?
Discussion in 'Gaming (Software and Graphics Cards)' started by karasahin, Aug 14, 2015.