hi guys,
I am having a wonderful experience with my XPS L501X and its GT 435M 2 GB card. I overclocked it to a high 820/820/1640, which lets me play demanding games at playable/fully playable frame rates, averaging 40~52 FPS on mid-high settings (usual setup for most games: 1366x768 or 1600x900 resolution, shadows off, 2x or 4x AA, textures high, world detail high, V-sync on, and 0-2x AF).
My question: I overclocked it with nTune, since PowerStrip and RivaTuner both have the annoying digitally-signed-driver problem on Windows 7 x64. In my first few attempts with nTune I did manage to push the core clock to a frequency independent of the other two (shader and memory), but recently I noticed that it can never be set to anything other than shader frequency / 2.
Whenever I move the shader to a lower or higher frequency, the core automatically jumps to half the shader frequency, rounded up to the nearest integer!!
Is this a normal mandate I just hadn't noticed, or is something wrong???
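The linkage described above (core snapping to half the shader clock, rounded up) can be sketched as follows; the function name and sample values are illustrative, not part of any tool's API:

```python
import math

def linked_core_clock(shader_mhz: int) -> int:
    """Core clock the driver snaps to for a given shader clock:
    shader frequency / 2, rounded up to the nearest MHz."""
    return math.ceil(shader_mhz / 2)

print(linked_core_clock(1640))  # 820 -- matches the 820/1640 pairing quoted
print(linked_core_clock(1641))  # 821 -- odd shader clocks round up
```

This is just a model of the observed behavior; the actual constraint is enforced by the NVIDIA driver, not by nTune itself.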
-
conscriptvirus Notebook Evangelist
As far as I know, it's normal for the shader and core clocks to be linked. I use NVIDIA Inspector to overclock, and I can only set the shader clock (which in turn sets the core clock); the core clock itself is grayed out.
I believe RivaTuner lets you unlink them, but RivaTuner doesn't work with the 420M/435M since it hasn't been updated in a while. -
Okay, now I get it. Thanks for replying. I tried PowerStrip as well, but as you said, it didn't quite work for me, probably for the same reason.
-
Also, am I pushing the card too hard? Stock GT 435M clocks are 650/1300/800 and I overclocked to 820/1640/820; the temperature is about 85~86 °C at full load. Should I continue with this, or will it gradually eat up the card?
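For context, the quoted clocks work out to the percentage increases below; this is just quick arithmetic on the numbers in the post (core/shader/memory), not a statement about safe limits:

```python
# Stock vs. overclocked frequencies (MHz) as quoted above.
stock = {"core": 650, "shader": 1300, "memory": 820 and 800}
stock = {"core": 650, "shader": 1300, "memory": 800}
oc = {"core": 820, "shader": 1640, "memory": 820}

for name in stock:
    pct = (oc[name] - stock[name]) / stock[name] * 100
    print(f"{name}: +{pct:.1f}%")  # core and shader rise ~26%, memory only 2.5%
```

Note the core and shader move together at the same ratio, consistent with the linked-clock behavior discussed earlier, while the memory overclock is comparatively small.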
overclocking GT435M
Discussion in 'Dell XPS and Studio XPS' started by SAM_738844, May 9, 2011.