interesting, we have the same laptop, but with different temps...
-
-
-
FYI, my room temp is around 16°C -
Going to try and OC my GPU before I install XP. Thanks to all who have posted info in this thread.
Anyone find or make a guide for tearing this lappy apart yet? I'm not sure I'm itching to do it, but the AS5-on-GPU option does sound appetizing. -
He's a Wookie, the fur helps.
-
Alright, so my lappy with Vista here had the 8800M GTS clocked @
Core: 500
Mem: 700
This sounded waaaaayyyyyy low so I checked GPUREVIEW and found default clock rates for the desktop version of the card are
Core: 650
Mem: 970
I used nTune to set my clocks to the 650/970 and temps seem to be steady, running around 70°C under load... running RTHDRIBL @ full screen 1440x900 and 2x instances of CPU-BURN to try and max out temps inside this beast. Will update with further progress, but as of now everything seems stable. Is this low of a default clock standard on these things? I understand the lower clock can save on battery life, but that is a huge difference. Also, how hot is too hot? Thanks -
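(Aside, not from the original post: CPU-BURN is just a busy-loop tool, so if anyone can't find it, a stand-in like the sketch below loads every core the same way for a heat test.)
Code:
# Minimal stand-in for a CPU burner: spin one busy loop per core until killed.
# Illustration only -- not the CPU-BURN program mentioned above.
import multiprocessing

def burn():
    while True:
        pass  # tight loop keeps one core pegged at 100%

if __name__ == "__main__":
    workers = [multiprocessing.Process(target=burn)
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()  # runs until you kill it (Ctrl+C / Task Manager)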
What program did you use to OC? -
I used nTune to OC and to log temps; just got done with a 1+ hr stress test and the temp is pretty much pegged at 68, though something happened to peak it up to 74 for about 15 seconds. I really thought it was odd that the clock was that low too, but that is what mine was set to by default.
Looks like my new clock rates will be at least 650/970 stable. Will update if I find out otherwise.
note: the ambient temp in my house is around 18-20°C
One line showing my temps from the nTune log file:
Code:
Current Time        GPU core (3D) (MHz)  GPU memory (MHz)  SMART status  GPU1 (°C)  CPU (%)  Disk (%)  Network (%)  Memory (%)  Profiles loaded
2008/3/8 17:16:12   650.0                966.0             Healthy       68         100 %    1 %      0 %          35 %
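(Aside, not part of the original post: if you'd rather not eyeball a long nTune log for the hottest reading, a few lines of Python can scan it. The column order is assumed to match the sample line above, and ntune_log.txt is just a placeholder file name.)
Code:
# Scan an nTune-style log and report the peak GPU temperature (sketch only).
import re

def max_gpu_temp(path):
    temps = []
    with open(path) as f:
        for line in f:
            # Expect rows like: 2008/3/8 17:16:12 650.0 966.0 Healthy 68 100 % 1 % 0 % 35 %
            m = re.match(r"\S+\s+\S+\s+([\d.]+)\s+([\d.]+)\s+\S+\s+(\d+)", line)
            if m:
                core, mem, temp = float(m.group(1)), float(m.group(2)), int(m.group(3))
                temps.append((temp, core, mem))
    return max(temps) if temps else None

peak = max_gpu_temp("ntune_log.txt")  # placeholder path
if peak:
    print("Peak GPU temp: %d C at %.0f/%.0f core/mem" % peak)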
After looking @ my logs, the 74°C temp happened immediately after bumping the core & mem clocks up to their current 650/970 settings. It seems possible the fan adapted to the higher temperature and cooled it accordingly, as it never got that hot again over the next hour. -
I'm using RivaTuner 2.07 to monitor my temps and I want to start overclocking. The thing is that I can't figure out how to read the RivaTuner figures at stock.
There are 3 sections, Standard 2D, Low Power 3D, and Performance 3D.
The Performance 3D section (which I think is where I should be going to overclock) has the following figures:
Core Clock: 383 MHz
Shader Clock: 767 MHz
Memory Clock: 301 MHz
Now, I'm guessing RivaTuner is measuring the Memory Clock in DDR, and I would have to multiply that by 3 to get the real clock. That would put it up to 903 MHz, which seems too high for stock clocks. The Core Clock is also very low. I would think I have to multiply that by two to get the correct reading, but that would make it at 766 MHz, which is pretty high.
Is this how it is supposed to be, or do I have a weird reading from RivaTuner? -
Also, DDR means Double Data Rate, not 3x..
But yeah, RT is wrong; nTune reports correctly, but with the 174.31s, overclocking does not work at all for me. -
Alright then, I guess I will have to use nTune if I want to overclock. I wonder if RivaTuner will get updated with the correct clocks soon.
-
-
Well, on nTune, it shows 600/799 MHz clocks. RivaTuner is way off on the core, as I don't think any doubling should be needed. I thought of GDDR3 the same way as you take it, but now I think I'm wrong too. I believe the 3 at the end just signifies a newer type, one that should be faster. DDR3 really translates to Double Data Rate 3, so I think it is still multiply by 2.
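(Aside, not from the posters above: the doubling spelled out. DDR/GDDR3 transfers data on both clock edges, so the effective rate is the reported clock x2, never x3; the numbers below are just the readings quoted in this thread.)
Code:
# Reported vs. effective memory clock for DDR/GDDR3 (illustration only).
def effective_rate(reported_mhz, transfers_per_clock=2):
    # DDR = Double Data Rate: two transfers per clock cycle.
    return reported_mhz * transfers_per_clock

print(effective_rate(799))  # 1598 MT/s from nTune's 799 MHz reading
print(effective_rate(301))  # 602 -- nowhere near the x3 guess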
-
Do any of you guys have an issue where, when you restart, you're back to stock settings for your video card?
-
-
Thanks for the prompt reply and great suggestions.
-
Wow, this thread may be old, but I'm going to post anyway, in case anyone is checking back. I have a P6831FX with a brand new T9300 (2.5) in it. I set the GPU core to 602 (using nTune) and the memory clock to 742. Temp of the GPU was 70°C.
3DMark score:
9576
Attached is the graphic.
-
-
Heh, and my twins, nonetheless.
-
Nice, Seanjohn. I am currently trying to overclock at 640/940 core/mem and am wondering if I could go any higher, or if anybody has done any better? Thanks
-
Which video driver are you guys using? I can't seem to OC using the 175 drivers. nTune doesn't let me OC.
-
They just added the cards to GPUReview.
http://www.gpureview.com/show_cards.php?card1=570&card2=569
To answer someone's question on RivaTuner and the clocks: it will never read them right until RivaTuner comes out with an update to read the 4th clock setting of the 8800M GTS (2D/thr/3D/extra).
Jeff, you should be able to overclock with all these drivers, including stock, but you will need nTune to get almost the full potential of the card. And this is for an 8800M GTS card; not sure what you have. -
Ya, I have nTune. I've used it before way back when. I've always used RivaTuner, but nTune doesn't show the option to OC. I'm using the 175.80 drivers from laptopvideo2go.
So which versions are you guys using, so I can get it to work with nTune? -
http://www.nvidia.com/object/ntune_5.05.54.00.html
and you're going to need to install all the components to invoke the overclocking ability -
Does overclocking help frame rate in gaming at all? I have read some posts saying it doesn't really do anything except give higher benchmark scores
-
yes....it does.
-
cool...good to know.
-
nvm...got it to work...thx
-
Is there a way to make Windows load with the overclocked settings, rather than changing them every time you reboot?
-
yes, under the nTune profile section...you can have it auto-load any profile you set up.
-
at 660/900, the display driver shuts off after 5 minutes of stress testing
-
-
no, you need to be more around 575-600 / 850-925 -
I know; to tell the truth I accidentally misread the number, thought it was 560 lol, just glad my GPU isn't fried
-
With just a small overclock you will notice more frames...but overclocking Crysis seems to trick the game into thinking you have a stronger system than you really do... and it puts your card to the test by maxing out the pixels, textures, and memory bandwidth. And since you aren't controlling the shader.... the driver fails under stress... (speculation of course)
-
I get 3 more fps at 560/860
-
You're not going to see any real major gains till you change your CPU...you're running a 1.83 GHz, 667 MHz FSB, 2 MB cache chip.
The minute you put an 800 MHz FSB chip in, you will see the gains. -
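(Aside, rough numbers to back that up: the front-side bus is 64 bits wide, so peak bandwidth scales directly with the effective transfer rate. Back-of-the-envelope only, not a benchmark.)
Code:
# Peak FSB bandwidth comparison (rough illustration).
# effective FSB rate (MT/s) x 8 bytes per transfer = peak bandwidth
def fsb_bandwidth_gbs(effective_mts, bus_bytes=8):
    return effective_mts * bus_bytes / 1000.0  # GB/s

print("667 MHz FSB: %.1f GB/s" % fsb_bandwidth_gbs(667))  # ~5.3 GB/s (T5550-class)
print("800 MHz FSB: %.1f GB/s" % fsb_bandwidth_gbs(800))  # ~6.4 GB/s (T8300/T9300-class)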
Actually, I just got my T8300 today, installed it with no problem. Getting around 25 fps at 1440x900, no AA, all high except shadow, DX9. I was expecting more, but I can live with that.
-
-
I expected it not to hold back the GPU like the T5550 does. It seems I'm getting the same results that T9300 users are getting, so I'm happy with my purchase.
-
-
Hopefully this is not gonna piss anyone off. With a T8300 and an OC'ed 8800M GTX, I was able to play Crysis at 1680x1050 + all high + 2xAA at 25-30 fps during most of the scenes. However, the discovery site was weird (where I eavesdropped on the professor's conversation with his team); after the alien machine reactivated itself, my framerate dropped down to 2 fps.
-
nope, we were already mad when we saw your tagline flaunting an overclocked T8300 and an overclocked 8800M GTX
nah, I'm messin'
but seriously...your system is what you make of it. and you get out of it what you put in......
hey nirvana, how goes it?
and now I'm going to have to re-look at that section... -
-
ryo1000 wrote: "Actually, I just got my T8300 today, installed it with no problem. Getting around 25 fps at 1440x900, no AA, all high except shadow, DX9. I was expecting more, but I can live with that." <-------- I was assuming this was Crysis you were talking about.
Crysis seems to be the big debate... unless you're talking about some other game? -
edit: nothing to say here -
oh okay lol
-
johnksss!
What GPU/shader/memory clocks do you use when you OC your VGA? Please tell me -
I have gone as high as 690/1005/1600
and I still couldn't even touch my record of 11,230.
I don't really run my system overclocked. What for? My games all run smooth and I have no issues running it at stock.
But a stable overclock is anywhere from 600-630/900-930/1300-1400.
It all depends on what settings you use in the NVIDIA Control Panel. Too much performance and your system will blank out the screen or reset the NVIDIA driver. Best to set some quality settings in there. -
Good points, johnksss. For the most part I'm happy with the GTS at stock speeds. I get a few more FPS out of OC, but my fan is, even at stock clocks, really working its little blades to the bone when pushed in games.
I got a 3-year warranty so I still love to OC. I killed my old soldered-in M10 with OC; that was a $250 out-of-warranty board swap. I would be curious what an out-of-warranty board swap for the P68xx would come to? This might become an issue for some of our out-of-country brothers. -
Now, if GW could clone the Clevo M570RU-U for $1200....