nah, it's just an upgraded 8600GT. If they come up with a 9600M GT with DDR3 it will beat the 8600M GT DDR2, but not by a lot
-
-
And oh yeah, I had a chance of getting the M1530, but I also wanted other good features apart from just the graphics.
AND about the 9600GT issue, it's a bit obvious that a DDR3 version will be released and that it will beat the 8600GT DDR3. If it didn't, Nvidia wouldn't really understand the branding of "new generation". -
In any case, the new 9 series is a repackaged 8 series that produces less heat and has slightly more performance. The whole scheme was just so Nvidia wouldn't have to name the new 9800 X2 the 8800GTX X2. They really can't afford to look backward-ish now that ATI has the HD3870 X2 chokehold around their neck
But for future reference, if you really want to talk about the 9600GT, it has its own thread for you to comment in. If you really want to do a comparison, start one! -
I mean, you pretty much told the OP to shut up. Who cares if he really likes his GPU? I personally have the same laptop and would like more performance out of the GPU (as in 8800GTX-type performance). But that is what a desktop will be for. But this card works for him, and the lackluster GS is working for you (sorry, had to do it lol).
And AGAIN, it is not the 9600 that is the same as the 8600; it is the 9500 that is the EXACT same as the 8600. Nvidia has been doing this type of numbering scheme for ages. What was it, the 6800 = 7600? Etc.... There is actually a good thread about it around here somewhere. -
the only problem is that now, since it won't downclock no matter what, I'll need to get a cooling pad to keep heat down and maximize GPU life. -
So the GPU does not downclock at all anymore? Booty.
-
-
-
Tested today on ME at OC clocks 640/892; it downclocked 15 minutes into the game. =/ 600/850 works great, however; I played an additional 2 hours today with no problems.
I guess it doesn't solve the downclocking problem, but alleviates it? -
Hmmm, you probably already mentioned it, but did you try the Dell 174.31s?
-
-
it doesn't work that well for me; I almost feel like 600 clocked on the 174.74s is faster than 640 using the 174.31s. Maybe I'll give it one more try.
-
You only do the 3D performance, right?
Where it starts at 475/702.
What's the average stable clock people get? About 650/900? -
BenLeonheart walk in see this wat do?
It all depends on your temperature, and your chips..
they're all different...
some can't even get past 550 core... -
I just did a little test, and after 5 minutes of scanning for artifacts at stock it is up to 76 degrees.
What does that mean?
I just had a look, and apparently artifact scanning deliberately heats it up.
Hmmm -
masterchef341 The guy from The Notebook
you have a lot more research to do.
use Google and Wikipedia, and eventually you will understand what is going on. -
BenLeonheart walk in see this wat do?
I agree with masterchef..
Overclocking isn't just something you come and do...
follow these steps:
read a bit, understand, overclock, game, enjoy, post.
n_n -
I do know in general, I just don't know exactly how ATITool's artifact scanning works.
I just want to stick it up to 575/750, test it for a bit, then go up to 600/800 if it works alright.
I think my ATITool (0.27b) must not be working, as every time I try to adjust the clock it goes back to stock -
The Forerunner Notebook Virtuoso
Bah, ATITool is not reliable in my opinion, and very time consuming. Just use the most demanding game you have, or 3DMark. The basic rule is to leave either core or memory alone (everyone has their priority) and overclock the other until you artifact. Then put that one back at stock and find the other's max. Then combine the two maxes and lower the numbers 10-20 MHz. Then just keep track of temperatures.
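That procedure can be sketched roughly like this (a toy Python sketch: `core_stable` and `mem_stable` are hypothetical stand-ins for a real artifact test or gaming session, and the cutoff numbers are just example clocks from this thread, not real measurements):

```python
# Sketch of the one-axis-at-a-time overclocking search described above.
STOCK_CORE, STOCK_MEM = 475, 702   # 3D stock clocks mentioned earlier in the thread
STEP = 25                          # MHz per increment
BACKOFF = 20                       # safety margin once a max is found

def find_max(start, is_stable, step=STEP):
    """Raise one clock (the other stays at stock) until the test fails."""
    clock = start
    while is_stable(clock + step):
        clock += step
    return clock

# Hypothetical pass/fail results standing in for a real artifact scan.
def core_stable(mhz):
    return mhz <= 650   # pretend artifacts appear above 650 MHz core

def mem_stable(mhz):
    return mhz <= 900   # pretend artifacts appear above 900 MHz memory

max_core = find_max(STOCK_CORE, core_stable)
max_mem = find_max(STOCK_MEM, mem_stable)

# Combine both maxes, then back each off 10-20 MHz as suggested.
final = (max_core - BACKOFF, max_mem - BACKOFF)
print(final)
```

With these made-up cutoffs the search lands at 650/877 and backs off to 630/857; the point is only the shape of the procedure, not the numbers.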
-
I have the basic version of 3DMark and I can't get it to loop, so any suggestions on what to use? -
What I've found is that the most potent combination to test is FurMark and Orthos. That way you can stress both the CPU and GPU and find the max clocks for your GPU that will keep it from artifacting at even the most extreme settings.
EDIT: use the Small FFTs test in Orthos, and in FurMark run it in stability test mode; don't fullscreen it unless you feel like it (also change the resolution to accommodate your fps [a Mobility HD2600 averages 20 fps in 1024x768 windowed mode]) -
I'm using RivaTuner to OC now. A bit annoying, as it seems much more complicated, and backing up the registry and all that seemed a bit dodgy.
It's hovering at about 77/78 degrees with 550/750 clocks. I think I'll leave it there. Actually, it's just turned on a bigger fan and gone down to 70!
I wish RivaTuner had a nice easy setting where it would change the clocks and then close itself, like I used to do with my X1950 Pro in ATITool -
Gaming. Seriously, the best way to stress your entire system. 3DMark is not very good, because it only stresses the GPU. For me, I have found TF2 is very good at telling me whether my OC is stable or not.
Example: I had my OC @ 650/950 and could run 3DMark06 no problem. And I let ATITool scan for artifacts overnight. None; it was good to go. Load up TF2, and once I get into the game, system reboot. Stable in that game is 575/850 -
My stupid 8600 GT downclocks the second it gets to 80 degrees, for some reason I have found.
Disgusting -
-
I disabled SpeedStep in the BIOS config, which cooled my CPU temp by 10 degrees, and now I can OC higher than normal. There's a small CPU performance hit, I think, but the increase in GPU performance more than makes up for it.
It's funny too, because after I disabled SpeedStep, RightMark diagnosed that the CPU is only running @ 1.2GHz, but performance in game increased dramatically.
I can run C&C3 at ultra settings at 30fps no problem, and ME at ultra 1280x800 with a stable 30fps also.
Before, I could only run on high and still would have encountered downclocks.
Strange how that's working out. -
-
-
Uhm, yeah, I mind.
I'm not gonna shake up a stable system.
It works like a charm with the 174.31s from laptopdrivers2go - at least, that's where I downloaded them. -
Is there a big difference if you OC the GPU when it is in Balanced or Performance mode?
Edit: I just ran 3DMark06 on my laptop; in Balanced it was 4999 marks, and in High Performance it was 5007 marks... so I guess it doesn't really matter much? -
-
-
Has anybody been able to successfully increase the voltage on this card yet? I've attempted a few times to flash it to an 8700M GT, or to mod the original 8600M GT BIOS with some 8700M GT values, but none were successful. NiBiTor doesn't appear to be able to increase the voltage either.
-
You really shouldn't try to flash your 8600M GT. It really gives no increase in performance. Yes, you will get higher clock speeds, but these will be unstable, since the 8700 BIOS is geared towards more heat output, which the 8600M won't be able to handle.
Secondly, the 8700M GT is faster not just because of the higher speeds; it also uses a different memory interface. That's physical - you won't be able to get that through a BIOS flash.
Why would you want to anyway? You can already OC the card higher than 8700M GT speeds -
I'm mainly trying to see if a voltage increase will yield higher clocks; the attempted flashes were only to get the voltage increase. Even with the flash, I reduced the clocks to my default BIOS OC of 560/1120/760.
The flashes resulted in the card not being recognized and the screen only having its backlight on. I had memorized how to do a blind BIOS flash beforehand in case of this, so it wasn't a problem. -
-
I was thinking of applying a mild overclock to my system, maybe 5-10%, but I do not have a cooling pad. Is it safe to do this? Will my system be OK?
-
-
Anyone have any experience using Tweakforce modded drivers?
-
I've taken it up to 555/800 stably without a cooler, and 620/850 after downclocking the CPU. I think you're safe with a 10% overclock.
The problem with most OC system issues is that people don't monitor their temperatures correctly. You don't want to use your normal operating temperatures, since your GPU runs in 2D performance mode while windowed. You don't want to take the temperature an hour after gaming either; by then it'll understate the real temperatures during an actual gaming session. -
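To see why idle or cooled-down readings mislead, here's a toy Python model of a GPU warming toward its steady-state load temperature (all numbers are invented for illustration, not real sensor data for this card):

```python
import math

# Toy warm-up model: temperature approaches the sustained-load value
# exponentially, so short tests and idle readings both understate it.
AMBIENT = 45.0   # idle/2D temp in deg C (made-up figure)
STEADY = 80.0    # sustained 3D load temp (made-up figure)
TAU = 600.0      # warm-up time constant in seconds (~10 min, made-up)

def temp_after(seconds):
    """Temperature after a given time under continuous load."""
    return STEADY - (STEADY - AMBIENT) * math.exp(-seconds / TAU)

for minutes in (5, 15, 60):
    print(f"{minutes:>3} min under load: {temp_after(minutes * 60):.1f} C")
```

With these made-up constants, a 5-minute check reads roughly 20 degrees below where a long session actually settles, which is the gap the post is warning about.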
Be careful. Remember, there is NO temp monitor for the GPU memory. I was going to try to push my system to the limits, but after finding out about this, and reading a few (still very few) reports of people frying their cards, I have decided it just is not worth it.
OverClocking the 8600gt ddr3
Discussion in 'Gaming (Software and Graphics Cards)' started by WileyCoyote, May 15, 2008.