There is a nice thread about overclocking the HD 5730 here:
http://forum.notebookreview.com/asus-reviews-owners-lounges/457517-asus-n71ja-x77ja-review-oc-freak.html
With 750/950 I basically get the same overall effect as OC-Freak at 800/1000 (keep in mind that my CPU is just slightly faster). But there is quite a difference between the stock and overclocked GPU - around +1,500 points in 3DMark06.
Anyway, under load my CPU and GPU temperatures differ by only 1°C between stock and overclocked GPU clocks. Looks like the HD 5730 is made for overclocking!
I have also noticed that when idle, my GPU clocks drop to 100/150, which I believe is very interesting. I don't know why, but it should lengthen my battery life when not plugged in, so I am not touching it.
Regards
-
Thank you and best regards,
K -
-
Meaker@Sager Company Representative
If a chip won't run at 600 MHz at 1.2 V, it may run at 500 MHz at 1.1 V. They reduce the voltage for lower-binned products because lower frequencies don't need as high a voltage for the chip to be stable.
A chip's power consumption scales roughly linearly with frequency, but it has a squared relationship with voltage.
Therefore a 10% voltage increase will cause a MUCH larger increase in power than a 10% frequency increase.
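To put rough numbers on that, here is a minimal sketch assuming the usual dynamic-power approximation, where power scales roughly with voltage squared times frequency; the baseline voltage and clock below are made-up placeholders, not the HD 5730's actual values:

```python
# Rough sketch: dynamic power scales roughly as V^2 * f.
# Baseline values below are placeholders, not the HD 5730's real numbers.
V0, F0 = 1.10, 600e6  # hypothetical baseline voltage (V) and clock (Hz)

def relative_power(v, f, v0=V0, f0=F0):
    """Power relative to the baseline, using P ~ V^2 * f."""
    return (v / v0) ** 2 * (f / f0)

print(relative_power(V0 * 1.10, F0))         # +10% voltage   -> ~1.21x power
print(relative_power(V0, F0 * 1.10))         # +10% frequency -> ~1.10x power
print(relative_power(V0 * 1.10, F0 * 1.10))  # both           -> ~1.33x power
```

So a 10% voltage bump alone costs about 21% more power, a 10% frequency bump alone costs about 10% more, and doing both compounds to roughly a third more.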
That's why overvolting and overclocking can make a chip run so much hotter. -
-
Thank you for the explanation.
I believe that power consumption is proportional to voltage, not a squared function of it (power = current × voltage). Or am I mistaken? It can be a squared function when using the form with resistance (P = V²/R), but resistance depends strongly on temperature. I am not sure that is a credible way of calculating the power consumption (at least not with simple linear equations).
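For what it's worth, here is a small sketch of where the squared term can come from: in the common CMOS dynamic-power model the switching current itself grows roughly with voltage (I ~ C·V·f), so P = I·V ends up proportional to V². The capacitance and clock values below are made up purely for illustration:

```python
# Sketch: P = I * V can still be quadratic in V if the current scales with V.
# Common CMOS dynamic-power model: I ~ C * V * f, so P = I * V ~ C * V^2 * f.
# Capacitance and clock values are made up purely for illustration.
C_EFF = 1e-9   # hypothetical effective switched capacitance (F)
F_CLK = 600e6  # hypothetical clock frequency (Hz)

def dynamic_power(v, cap=C_EFF, f=F_CLK):
    current = cap * v * f  # switching current grows with voltage
    return current * v     # P = I * V, which ends up proportional to V^2

print(dynamic_power(1.10) / dynamic_power(1.00))  # ~1.21, not 1.10
```

This is still P = I·V; the square only appears because the current is not constant when the voltage changes.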
Anyway, I tested the card's temperature at stock settings and at overclocked settings (under similar conditions) and the temperature difference was 1°C. Throughout the test my fan was running at "normal" speed on both settings (according to the NBProbe program) - when the laptop is idle it says "low and quiet".
So I don't believe that a 1°C or even a 10°C increase is problematic, as long as the GPU temperature stays under about 85°C. -
I have "maximize performance" while plugged in, but still the clocks are 100/150 - when i disable the powerplay, the clocks indeed jump to 650/800. But i don't know why this is happening if "maximize performance" is selected?
I am not complaining, i think this is a huge power saver, but i just don't understand it. Maybe it's like with the new processors - like wetcardboard said: when extra power is needed, the card gives it... -