What are your optimal clock rates for GPU?
Discussion in 'Gateway and eMachines' started by nindustrialny, Aug 21, 2014.
-
When I set the GPU to 730 MHz and the shader clock to 1825 MHz, a lot of artifacts became visible.
I lowered the clocks and, even at settings with no artifacts, I noticed that performance was lower at some of them (e.g. 720/1800) than at 680/1700.
I will experiment with voltages; I am now at 1.11 V instead of 1.12 V.
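If you want to watch what the card actually settles at while testing, here is a minimal Python sketch that polls clocks and temperature through nvidia-smi. It assumes an NVIDIA GPU whose driver exposes these standard query fields; nothing in it comes from the thread itself.

```python
import subprocess
import time

# Poll core/shader clocks and temperature while stress-testing an overclock.
# clocks.gr is the graphics (core) clock and clocks.sm the SM/shader clock;
# whether an old mobile driver exposes these fields is not guaranteed.
QUERY = "clocks.gr,clocks.sm,temperature.gpu"

def poll(interval_s=2.0, samples=30):
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        core, shader, temp = (v.strip() for v in out.split(","))
        print(f"core={core} MHz  shader={shader} MHz  temp={temp} C")
        time.sleep(interval_s)

if __name__ == "__main__":
    poll()
```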
-
What happens is that the GPU, if clocked too high, throttles performance down to preserve the chip. If I remember correctly, 700/1800 was the highest I could run stably, but 700/1750 was more dependable.
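Newer NVIDIA drivers can report why clocks are being pulled down. A minimal sketch of querying those throttle-reason flags, assuming a driver recent enough to support these fields (the drivers for the cards discussed here likely predate them):

```python
import subprocess

# Ask the driver why clocks are being held down. These query fields exist
# on modern NVIDIA drivers; drivers from the 260M era likely predate them.
FIELDS = [
    "clocks_throttle_reasons.active",       # bitmask of all active reasons
    "clocks_throttle_reasons.hw_slowdown",  # hardware slowdown engaged?
    "clocks_throttle_reasons.sw_power_cap", # software power cap engaged?
]

def throttle_status():
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=" + ",".join(FIELDS),
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    for name, value in zip(FIELDS, out.split(",")):
        print(f"{name}: {value.strip()}")

if __name__ == "__main__":
    throttle_status()
```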
-
Now I have 680/1700 at 1.11 V. I will try 690/1725.
-
Right, so 700/1750 is the correct 1:2.5 core-to-shader ratio and, as I said, more reliable, but 1800 was attainable and worked. I found that even at 710 on the GPU I would encounter issues with mine.
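A quick sketch of the ratio arithmetic, checking which clock pairs mentioned in this thread sit exactly on the linked 1:2.5 core-to-shader ratio:

```python
# The core and shader clocks are linked 1:2.5 on this GPU generation.
# Check which pairs from the thread sit exactly on that ratio.
RATIO = 2.5

def linked_shader(core_mhz: int) -> int:
    """Shader clock implied by the 1:2.5 link for a given core clock."""
    return round(core_mhz * RATIO)

for core, shader in [(680, 1700), (700, 1750), (700, 1800), (730, 1825)]:
    linked = linked_shader(core)
    if linked == shader:
        print(f"{core}/{shader}: on ratio")
    else:
        print(f"{core}/{shader}: off ratio (linked shader would be {linked})")
```

As the output shows, 700/1800 is the only off-ratio pair in the thread, which fits the observation that it worked but was less dependable than 700/1750.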
-
GPU overclocking is useless. It sacrifices longevity for a few more frames per second. I've played around with it and only managed to get 2-3 FPS more. I found that using ThrottleStop to overclock and/or underclock the CPU gave a slight extra bump in processing power for CPU-intensive games like Civ V.
-
It all depends on the game; I did do better than 2-3 FPS. That said, this was on games that were already in the 30-60+ FPS range at stock clocks. I found that overclocking the stock card got me about 10% better FPS, moving to the P-79 board and the 260M was another 10% jump, and overclocking that was good for another 10-20%.
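Those percentages compound. A rough sketch of the arithmetic, using a hypothetical 30 FPS stock baseline since the post gives only percentages:

```python
# Rough compounding of the gains described above. The 30 FPS baseline is
# hypothetical; the post only gives percentages, not absolute numbers.
base_fps = 30.0
steps = [
    ("overclocking the stock card", 1.10),   # ~10%
    ("P-79 board + 260M swap", 1.10),        # ~10%
    ("overclocking the 260M", 1.15),         # 10-20%; midpoint used
]

fps = base_fps
for label, gain in steps:
    fps *= gain
    print(f"after {label}: {fps:.1f} FPS")
# Gains compound: 1.10 * 1.10 * 1.15 ~= 1.39, i.e. roughly 39% over stock.
```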