That's normal!!!!
I have an Asus G1S and I get the same temperatures...
Under full load it climbs up to 94°C!
So it's NORMAL
What drivers are you using?
Install the 169.xx series; the temperature drops by 10°C then!
-
Dag, well let me restate. That is nowhere close to normal for my Dell.
Idle = 45-50
Maxed out = 60-65 -
Just got 4403 on my inspiron 1520 in 3dmark06
600/1200/500, RivaTuner 2.06, 169.12 drivers, Vista Premium. Haven't broken 65°C yet, so I may try to push it to 625/525. -
Please can anyone tell me the best drivers for my 8600M GT that are out there and have been tested to improve performance... thanks
-
@hyakku, did your 1520's 8600 come with DDR3?
-
No, mine is DDR2. I just actually got 4594 at 625/525, running nice too. I wanna see if XP lets me break 5000 with the same settings.
-
I don't know what it is, but my 1720 becomes unstable at about 550/450. Shouldn't it be able to OC much higher?
-
I would just like to make a quick post to clear up something that has been asked multiple times.
A lot of people are looking for the "best" driver for their 8600M. This question can't really be answered unless, for one, the hardware is badly outdated and the driver releases no longer contain specific updates to fix a bug or improve performance for it.
Your best bet is to grab some of the latest drivers, or drivers that are spoken highly of, and install them one by one. Run 3DMark06, run your favorite games; it's all up to you which driver works best for your setup. Yes, it's a long, tedious process, but well worth it in the end.
You could spend a couple of hours with one driver trying to see if a bug or graphical glitch in your favorite game is fixed, or whether the driver lets you clock your card higher. Sometimes a driver can improve your overclock, marginally of course, but it can also hinder it. Where a newer or latest beta release fixes one bug and improves performance here, it might dock a few frames off another game. Like I've said a couple of times now, it's about which driver works best for you. Personal preference.
Here is the part that really sucks, though. You can spend a little time and be happy with a driver, just for uncle nV to release a new one. They do this. They release drivers as quickly as Intel comes out with new chipsets. So if you are unsatisfied with the answer to "which driver is the best?" while they keep releasing new ones, and you've found one that works pretty well, you either have to grit your teeth and take the time to try the new releases out, or wait a couple of releases for your fellow forum members to post some results and see if it's worth the effort to find out how they perform for you.
I don't mean to sound like an ass, just trying to give perspective :]
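If you do end up cycling through several drivers, a bit of bookkeeping helps. Here's a minimal sketch of tracking a 3DMark06 score per driver and picking the winner; the version numbers and scores below are placeholders, not real measurements:

```python
# Record a benchmark score for each driver tried, then pick the best.
# All versions and scores here are made-up placeholders.
scores = {}

def record(driver_version, score):
    # Keep the best score seen per driver (you might run 3DMark06 more than once).
    scores[driver_version] = max(score, scores.get(driver_version, 0))

record("169.04", 4403)
record("169.12", 4594)
record("163.44", 4150)
record("169.12", 4510)  # a second, lower run doesn't overwrite the best

best = max(scores, key=scores.get)
print(best, scores[best])  # the driver with the highest score
```

Crude, but it beats trying to remember which driver gave which score three installs later.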
-
I am running the 169.04 drivers. Unstable; after 10 minutes or less of gameplay my computer freezes until I reboot.
-
Idle:58 Load:67-70
-
CoD4 runs at around 30-35 FPS with all settings maxed except AA.
Crysis runs surprisingly well; I run everything at medium and some at high (object texture or whatever, physics high, etc.).
Bioshock is also running nice and smooth with maxed settings. -
I just overclocked to 650/540 on my DDR2 256MB 8600M GT, and it has an idle temp of 53 degrees Celsius, full load of ~65 Celsius.
Raised my WEI graphics score from 4.6 to 5.7... xD -
Boosted it again to 690/575; 51 idle, 68 load, but my WEI won't update. The graphics drivers fail when it gets to the Direct3D texture load assessment... wtf?
-
Epic fail.
Lowered to 665/540... testing it now. Temps are fine. -
Dude, mine does that at 525/450. Do I have a driver problem? My temp goes to only 70°C at max load. Another thing: when I set the core clock to, let's say, 535, it shows up in the RivaTuner monitor as 580.
-
The Forerunner Notebook Virtuoso
RivaTuner misreports the core clock. Whatever you set it to is what it's actually running at; as you raise the clock more and more, the displayed number gets more distorted, so just assume the clock you set is the clock you're on.
Oh yeah sellick, just use the edit feature when you're posting right after yourself to avoid multiple posts like that. -
Hey man, same thing. I set it to 665 and it shows up as 700-something...
Ahh ok, sorry forerunner. BTW, at 665/540 I'm at 52°C idle with no problems so far. Testing the load temp using CoD4; will update in a few minutes.
EDIT: Drivers fail after about 8 minutes in a 3D program. Temperatures did not go above 64°C -
The Forerunner Notebook Virtuoso
No problem sellick. Ah, the joys of owning a 17-inch, such as better cooling. My max is 600/490.
-
Any ideas why my drivers fail after 6-8 minutes? Clocked too high?
-
BTW, is there a particular process for getting the clocks that high? Mine seems to become unstable way before that (at like 525/445) under normal temps. Do I need to keep a certain ratio between the core and the memory? Drivers: 169.04
-
I don't think so... people sometimes have the memory higher than the core. By the way, I'm also running 169.04 on Vista Business 32-bit.
-
10 minute stress test @ 600/505 worked fine.
trying 30 minute @ 615/520. -
sellick, temperature isn't everything when it comes to stability. The amount of voltage applied to the chip is also relevant, and some chips just don't run well after a certain frequency.
-
I can overclock the core well past 600, but the memory won't budge past 460. I get crashes and BSOD strangely enough if I go past that.
-
I just thought this might be an interesting read for all you 8600M users.
8600M vs Desktop GPU benchmarks
http://www.techarp.com/showarticle.aspx?artno=476&pgno=7
They use an Inspiron 1501's 8600M GT for the test benchmark, which is probably not the best 8600M GT out there, but it's an interesting benchmark nonetheless -
hmm interesting...
Meh. I plan on upgrading to an 8800GTX sometime next year, unless a newer card comes out.
edit: the stress test has completed. Not a single problem. Going to play America's Army for a while and see if it crashes. -
Can ATI's HD 2600 be compared performance-wise in games to the 8600M GT?
-
How could it be that wrong?
-
Thanks Odin for the goodness over there...pwned!
-
Now I'm having another issue -.-.
Battlefield 2 won't start, although it did before I overclocked/installed the 169.04 drivers.
And Odin, you owned them B-).
edit: Clocked back to stock settings and the game still goes black. Installing 169.12 drivers.
edit2: Installed 169.12 drivers fine, BF2 opens, and I can connect to an OFFLINE account, but whenever I try to connect to an ONLINE account, the game crashes.
edit3: Problem solved (I think) by deleting all the Video.con files in "My Documents/BF2/Profiles/001" etc.
edit4: Yes, problem solved. But a warning: DON'T INSTALL THE 169.12 DRIVERS. THEY ONLY SUPPORT UP TO 1024x768 RESOLUTION! -
Odin I think they deleted your comment
EDIT:sorry I had the wrong link -
Code:
http://forums.techarp.com/reviews-articles/23430-nvidia-geforce-8600m-gt-mobile-gpu-review.html#post318742
-
Ah, I just read your comment. Yeah, just as I suspected, the Dell they use is not a fair comparison. It would be interesting if someone compared a notebook with a desktop CPU and a proper 8600M GT against a desktop counterpart (Asus C90 vs. desktop, maybe?) -
Does anyone know if a decent 8600M GT, like the one in the MacBook Pro, will beat a desktop 7800GT?
Edit: Oops, looks like there is already a comment in this thread about the 8600M GT performing just as fast as a desktop 7800GT -
4149 on 3Dmark06 at 580/445
EDIT: My resolution is set to 1280x854 -
Hi,
Does anyone know how much the 8-series Nvidia cards help with playing high-definition files? I tested a 720p file on my laptop with an 8600M GT while watching the CPU and GPU temperatures and the CPU usage. To my surprise the CPU usage was quite high, between 30 and 45%. I thought the Nvidia card was supposed to offload some of the processing, but it looks like the card wasn't doing anything; I reached that conclusion based on the high CPU usage and the fact that the GPU temperature didn't really move up, as if it wasn't "working" to decode the file.
Can someone tell me how much benefit the card provides when playing high-definition files? -
What type of file was it, and what codec/filter and media player were you using?
-
nVIDIA's downloadable PureVideo is a DVD-dekoder, and you have to buy it. For PureVideo HD (the technology) to work, you need the proper kind of file and the right type of player. An H.264 .mkv file played in WMP, MPC or VLC will not be helped by the GPU.
HD DVD & Blu-ray are accelerated in WinDVD/PowerDVD, DVDs and VC-1 .wmv files are accelerated in WMP, and H.264 MP4 files are accelerated in QuickTime (though that's a bad idea, as QT has a really slow decoder). -
Thanks for the Reply Cinner and Fabarati.
But does PureVideo HD aid at all? I thought the PureVideo technology was built into the card and controlled by the driver, as the 163.44 driver's changelog mentioned improved HD decoding for 85xx/86xx cards. Also, the file is H.264, with VobSub as the decoder, which came from a community codec pack I installed a while ago. Would I see improvement if I purchased PowerDVD and installed an H.264-specific codec?
Looks like PureVideo HD is not required after all? From the Nvidia site:
-
PureVideo HD is not a decoder; it's the technology built inside the card. I essentially said the same thing as the nVIDIA site did.
Now, your decoder is not VobSub (that's for subtitling only). As you are using the CCCP for decoding, you're using the ffdshow filter. ffdshow does not support GPU acceleration.
And it's not merely about the file; it's also about the container (.mkv, .ogm, .mp4). MKV and OGM are not going to get accelerated. MP4 might, if you use a "PureVideo HD compliant movie player from vendors like CyberLink, Intervideo and Arcsoft."
Now the question is, why do you so desperately want to use your GPU? 30-40% is pretty good (depending on file type). Unless you want the post-processing functions, I see no need for it. And I watch a great many things in HD on my laptop.
If you want to buy a filter/codec, get CoreAVC, the best software H.264 decoder. They're planning on adding GPU offloading at some point. -
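The container point above can even be checked programmatically rather than trusting the file extension. A rough sketch (the magic-byte values come from the Matroska/EBML and ISO MP4 format specs; the headers below are synthetic examples, not read from real files) that tells an .mkv from an .mp4 by its first bytes:

```python
# Identify a video container from its opening bytes, since whether the GPU
# can accelerate playback depends on the container, not just the codec.
def sniff_container(header: bytes) -> str:
    if header[:4] == b"\x1a\x45\xdf\xa3":  # EBML magic number -> Matroska (.mkv)
        return "mkv"
    if header[4:8] == b"ftyp":             # ISO base media "ftyp" box -> .mp4 family
        return "mp4"
    return "unknown"

# Synthetic headers for illustration:
print(sniff_container(b"\x1a\x45\xdf\xa3" + b"\x00" * 12))  # mkv
print(sniff_container(b"\x00\x00\x00\x18ftypisom"))          # mp4
```

Useful because a remuxed file can carry the wrong extension, and per the post above, an H.264 stream in an MKV won't be accelerated while the same stream in an MP4 might be. -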
So, referring to your last post: PureVideo is a "Dekoder" (what?!), and PureVideo HD is the hardware built inside the card, is this correct?
Finally, thank you very much for the help, fab. I'll give some of the professional DVD players a try and see if that helps, along with CoreAVC if I can get my hands on a trial, but I doubt it. -
A decoder reads the ones and zeros in a movie file and turns them into a video that you can watch. The Dekoder/decoder thing is because K is used more than C in Swedish; sometimes that bleeds over into my English.
What kind of file were you using again? Not merely how it's encoded (you said H.264), but what container as well (mp4, mkv, ogm)? Also remember that the DVD-playing programs aren't optimal for files; they work better with DVDs. In fact, I prefer WMP over them for DVDs as well. -
@ Fabarati:
So, what you're saying is that my .mkv files won't be accelerated by the GPU if I use ffdshow (I use the CCCP)? Or are .mkv's simply not accelerate-able, period?
Are there any other codec packs or media players I can use so that the GPU helps accelerate my HD playback? The reason I ask is that there are instances where 720p video chugs, and it's really annoying (I'm using a weak 1.6GHz T5470, which is the reason it chugs). It'd be nice if the GPU could pick up the slack for my relatively weak CPU.
EDIT: -----
A question in general to anyone who can answer it:
When I have my clocks OC'd to 545/495, my GPU temp reaches 66°C (via SpeedFan; nTune's nMonitor displays a 7°C cooler temperature for some reason. Anyone know which one is more accurate?) after playing Gears of War for some time. I "think" (I'm not 100% sure) I read somewhere that GPU temperature should not exceed 60 degrees, so is this already overheating my GPU?
Also, I'm seeing numerous OC numbers in the 600s for the core clock. Is this normal for an 8600M GT with DDR2? Or are those extremely lucky cases? I haven't tried pushing my GPU past 545MHz after seeing the temperature it's already reaching...
Nvidia 8600M performance, general discussion
Discussion in 'Gaming (Software and Graphics Cards)' started by mD-, Jul 9, 2007.