Hi folks,
We know it is easy to overclock the GPU core and shader clocks; they have sensors so we can monitor temperatures, and they are cooled.
But manufacturers don't tend to cool the memory: the GPU die touches the heatsink, while the memory is usually left on its own.
I have an 8600M GS and I can successfully overclock the shader and core, but I don't know how overclocking the memory clock will work out.
The memory is DDR2 running at 400 MHz. I don't really know how to overclock it; I know the memory clock has an impact on gaming performance, so I really want to overclock it, but what if it fails?
I need suggestions from experts.
-
Let me just warn you... the 8600M GS is from the faulty NVIDIA 8 series and it may die at any time... honestly you shouldn't overclock it at all... but if you really want to, just overclock the shader and core... memory isn't very safe... if you really do OC the memory, keep it to about 100 MHz over stock at most... go in 25 MHz increments and check temps after each step... make sure they stay under 90C...
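If it helps, here's a rough sketch of that step-and-check routine in Python. The set_memory_clock() helper is hypothetical (a stand-in for whatever OC tool you actually use, e.g. RivaTuner or NVClock), and the temperature read assumes nvidia-smi is available; it only reads the core temp, since the memory has no sensor.

import subprocess
import time

def gpu_temp_c():
    # Read the GPU core temperature via nvidia-smi (memory temp isn't exposed).
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip())

def set_memory_clock(mhz):
    # Hypothetical placeholder: apply the clock with your overclocking tool of choice.
    print(f"apply memory clock: {mhz} MHz")

STOCK_MHZ = 400      # DDR2 memory clock on the 8600M GS
MAX_OFFSET = 100     # don't push more than ~100 MHz over stock
STEP = 25            # 25 MHz increments
TEMP_LIMIT_C = 90    # back off if the core gets this hot

clock = STOCK_MHZ
while clock + STEP <= STOCK_MHZ + MAX_OFFSET:
    clock += STEP
    set_memory_clock(clock)
    time.sleep(10 * 60)          # game / stress-test for a while at this clock
    temp = gpu_temp_c()
    print(f"{clock} MHz -> {temp} C")
    if temp >= TEMP_LIMIT_C:     # too hot: drop back one step and stop
        clock -= STEP
        set_memory_clock(clock)
        print(f"too hot, backing off to {clock} MHz")
        break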
-
You won't get much performance gain from OCing the DDR2 memory, and as Sean said, you especially need to watch your overclocks on the 8-series cards (that is, if you really need to overclock them at all), because they had soldering problems.
-
Even though you can't read memory temps, the memory will let you know when it is clocked too high and/or running too hot by introducing artifacts or crashing the 3D application. The memory won't die from overheating during 3D use without warning you well beforehand. This is normal procedure on the desktop. I have gotten blisters from feeling the memory temp with my fingers.
-
So as long as there are no artifacts and the game doesn't crash, it's OK to overclock your memory? I have a GTX 260M and I overclocked it to 650/1625/1100. I ran ATITool for two and a half hours without artifacts. My max temperature in game is 65C. So is that OK?