Hi.
It seems strange to me how a core can become unstable at 55C or 64C, when other core types in other notebooks run stable at 80C+.
It makes me think the temp reading is wrong?
Regards
John.
-
Tinderbox (UK) BAKED BEAN KING
-
Ahhh, don't say that :S
I've used a ton of temp sensors on mine and they all say the same temp.
But then again, the heat coming out of the laptop hasn't really increased at all. It's still quite warm when gaming, but that's normal.
It's a new chip, so there is the possibility that there's a temperature fault. -
Tinderbox (UK) BAKED BEAN KING
Hi.
Also, Samsung Q310 owners have reported GPU temps on their 9200M as high as 70C and their cores do not become unstable.
It only happens when overclocking beyond what the core will withstand, and the temperature gives very little warning of when it is about to become unstable.
I think you should take your core temperatures with a major pinch of salt!
Regards
John. -
Darth Bane Dark Lord of the Sith
Shouldn't the PC crash before letting the temps get too high? At the very least, I would think artifacts would appear in games...
-
Yes, if OCing is done properly, you don't actually do it much harm at all.
Repeatedly pushing it too far or running it at the limit can/will shorten the lifespan of the card. -
Please help me with my dv4t and 9200M GS.
I have installed drivers 180.43 DOX Customized and ATI Tool 0.27.
Each time I move the Core or Memory slider and press "Set Clock" it goes back to the default...
Thanks. -
Tinderbox (UK) BAKED BEAN KING
Hi.
ATI Tool is very problematic; use nTune instead, as it is made by Nvidia.
Regards
John. -
Darth Bane Dark Lord of the Sith
-
Darth Bane Dark Lord of the Sith
-
Hi and thanks for the article. I haven't done any overclocking before, so sorry if these questions seem stupid.
I have an HP CQ45 with a discrete 9200M GS 256MB. I think the mainboard (Compal JAL50) is very similar to the dv4t's, however the CQ45 uses DDR2 memory for the GPU, which runs at a standard 400MHz rather than the better 700MHz DDR3.
I see you increased your memory clock from 700 to 850, so would it be reasonable to increase the 400 by 20% to 480?
With the GPU itself I guess I should be able to run the same as everyone else with a 9200M GS, so the core would be 700MHz, but what about the shader? I didn't see any shader frequencies; would 1600 seem reasonable?
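[Editor's aside, not from the thread: a quick back-of-the-envelope check of that 20% figure, using the GPU-Z defaults quoted just below. The article's ~21% memory bump (700 to 850) applied to 400MHz DDR2 lands at roughly 480MHz, so the guess is consistent; actual stable clocks still have to be found by testing.]

```python
# Rough sketch only: scale each default clock by a chosen percentage to get a
# first overclock target. Defaults are the GPU-Z values quoted below for the
# DDR2 CQ45; 21% is the article's 700 -> 850 MHz memory increase.
defaults = {"core": 529, "shader": 1300, "memory": 400}  # MHz

def scaled_targets(clocks, pct):
    """Return each clock raised by pct percent, rounded to whole MHz."""
    return {name: round(mhz * (1 + pct / 100)) for name, mhz in clocks.items()}

print(scaled_targets(defaults, 21))
# -> {'core': 640, 'shader': 1573, 'memory': 484}
```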
FWIW, the default clocks shown in GPU-Z are core = 529MHz, memory = 400MHz, shader = 1300MHz.
Driver: nvlddmkm 7.15.11.7553, Vista.
Thanks.
Darth Bane: there are some posts in the HP section where posters have successfully overclocked their dv5t. -
What socket is this GPU on in the Samsung Q210? Is it MXM-II?
I have a nice 8600M GT DDR3 GPU for an MXM-II socket. It doesn't get much hotter, so it would be very cool to replace the 9200 with it. But what socket is it? Anyone? -
Tinderbox (UK) BAKED BEAN KING
Hi.
Nobody knows, as you have to get to the bottom of the board, and that means taking the entire thing apart. I would love to know myself.
There are pictures of the top of the board, but none of the bottom.
Regards
John. -
It's doubtful that you can change the GPU because it's a very small laptop. I didn't manage to see the underside of the board so I can't be certain.
I'll send a message to Samsung; hopefully they will be helpful. -
Are most video cards replaceable with newer cards in laptops?
-
Darth Bane Dark Lord of the Sith
And all of them are 9600 owners... Show me one 9200M one and I will believe. -
Are there any other parts that are replaceable? Or are there any brands that at least allow a half-year-old laptop to be upgraded somehow?
Or is it preferred to just replace it, because it's expensive for newer ones =( -
What you can upgrade is the RAM, HDD and/or CPU.
EDIT: What system do you have? -
Ok thanks MP. Dunno about the games though; probably the most stressful one I have would be Q4. I was thinking more along the lines of using the 'GPU Caps Viewer' point sprite particle demo. My figuring is that it's a constant load, so I should be able to get a good feel for temperature differences plus see a difference in the average FPS. I'm also a little worried about the DDR2 temperature so I'll try to check that too.
Darth: The CQ45 uses the same BIOS as the DV4t (InsydeH2O), but AFAIK the DV5t uses something else, so I'm not sure what to suggest. -
Darth Bane Dark Lord of the Sith
http://h10025.www1.hp.com/ewfrf/wc/...&dlc=en&cc=us&lang=en&os=2100&product=3744198 -
Okay I have some results here using GPU Caps Viewer point sprite particle demo for benchmarking. I chose the point sprite with default settings because it seems to run my GPU hotter than the other demos.
With the default clocks I got 126 FPS average.
With the overclock I got 189 FPS average, a 50% increase, which seems a bit strange because the clocks were increased a lot less than 50%.
Core temperature appears to have increased by 15% and GPU RAM by 10%.
The GPU RAM is rated at 400MHz; I really wouldn't have expected it to be working at 520MHz.
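[Editor's aside: a quick sanity check of the percentages quoted above, assuming the overclock in the screenshots is the 700/1700/520 setting mentioned later in the thread.]

```python
# Compare relative clock increases with the relative FPS increase reported above.
default = {"core": 529, "shader": 1300, "memory": 400}    # MHz, GPU-Z defaults
overclock = {"core": 700, "shader": 1700, "memory": 520}  # MHz, assumed OC setting

for name in default:
    gain = (overclock[name] / default[name] - 1) * 100
    print(f"{name}: +{gain:.0f}%")
# core: +32%, shader: +31%, memory: +30%

fps_gain = (189 / 126 - 1) * 100
print(f"fps: +{fps_gain:.0f}%")  # +50%, i.e. noticeably more than the clock increase
```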
Well, I'm off to get some sleep, hope you enjoy the pictures.
Default Clocks
GPU-Z with OC
Overclock
-
Are there no artifacts whatsoever when you were testing it?
Because that's an unbelievable OC.
I recommend getting 3DMark06 and running the test throughout, watching out for any glitchiness. -
I did try 750/1800/560 (gpu/shdr/mem) and the first time it worked fine for several minutes. I then let everything cool down and went to run at that setting again and plot temperatures but something gave (video going on and off) and I rebooted.
The lower settings (700/1700/520) appear OK for now, but perhaps, as you say, they can be pushed a lot further. At least it's nice to know I can up it if graphics performance is marginal, not just for games but for general 3D applications also.
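[Editor's aside: the step-up, test, and back-off routine the last two posts describe, sketched as a loop. set_clocks() and passes_stress_test() are hypothetical placeholders here; in the thread this was all done by hand with nTune plus FurMark/ATITool.]

```python
# Hypothetical sketch of the manual find-the-limit procedure described above:
# raise clocks in small steps, stress test, and keep the last stable setting.
def set_clocks(core, shader, mem):
    ...  # placeholder: done by hand in nTune in the thread

def passes_stress_test(minutes=30):
    ...  # placeholder: FurMark + ATITool artifact scan in the thread

def find_stable_overclock(start, steps=(25, 60, 20)):
    """start and steps are (core, shader, mem) tuples in MHz."""
    last_good = start
    while True:
        candidate = tuple(c + s for c, s in zip(last_good, steps))
        set_clocks(*candidate)
        if not passes_stress_test():
            set_clocks(*last_good)  # back off to the last stable setting
            return last_good
        last_good = candidate
```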
I did look at 3dMark but a 600MB download and 1.5GB installation dampened my enthusiasm somewhat. I also tried http://www.ozone3d.net/benchmarks/fur/ (1.6MB download) which at 640x480 gave a FurMark of 613 for default clocks and 799 for the above overclock. Even at this lowest resolution the FPS was only 10 and 13 respectively. Didn't seem to make any difference being run windowed or fullscreen.
As for the memory speed of 520MHz it seems insane.
I read LeetPix's post in http://forum.notebookreview.com/showthread.php?t=328025 and he's using 570MHz for DDR2 that normally uses a default 400MHz clock! You can get these RAMs with 500MHz clock ratings but after seeing my own results I think he is using the 400MHz chips too.
Some extra info that may be of use.
Video BIOS 62.98.3C.0.32
Update:
Okay I ran the FurMark Stability Test at the same time as the ATITool 'Scan for Artifacts' for over 2 hours at 700/1700/520 and no problems. Core temperature was 45C above ambient and video RAM 18C above ambient. These temperatures may be a little cooler than normal since I had to leave the main RAM cover off to insert the temperature sensors and this would have allowed more air circulation to the GPU. If there is a major change after the sensors are removed and the cover put back in place I'll update again.
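[Editor's aside: converting those above-ambient figures to rough absolute temperatures, assuming a room temperature in the 20-25C range another poster estimates later in the thread.]

```python
# Rough absolute temperatures from the deltas reported above; the 20-25C
# ambient range is an assumption taken from elsewhere in the thread.
core_delta, vram_delta = 45, 18  # degrees C above ambient
for ambient in (20, 25):
    print(f"ambient {ambient}C -> core ~{ambient + core_delta}C, "
          f"video RAM ~{ambient + vram_delta}C")
# ambient 20C -> core ~65C, video RAM ~38C
# ambient 25C -> core ~70C, video RAM ~43C
```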
When I installed ATITool and tried using it to set the clocks it made a right mess of it, dropping the values to some ridiculously low numbers. Since this seemed a bit odd I tried installing RivaTuner, and setting the clocks in that made no difference to the real ones. Maybe there's hope for you yet, Darth Bane, and you can change the values on your DV5t; if you want to try let me know, if not that's cool too. -
Darth Bane Dark Lord of the Sith
High 50s C is okay for gaming temps on a 9200M @ 700/1700/500, right?
-
That was using the GPU Caps Viewer Point Sprite Particles demo.
Not sure how accurate these GPU diode temperatures are. An nVidia data sheet would be nice but seems to be as rare as gold dust.
So how did you get the overclock to work after all, nTune? -
I wanna raise Tinderbox's point again about the temp reading possibly being wrong, seeing as the card artifacts at low 60s C (in my case) whereas other laptops run fine at 80C. But the fact that multiple temp monitors all agree suggests it is the right temp. I would like to know why it glitches at that "low" temp? -
Darth Bane Dark Lord of the Sith
-
Darth: That's great that you got it working. When I bought my HP CQ45 notebook and started reading all those horror stories about nVidia chips I was a little concerned about my temperatures, so I downloaded nTune to monitor them. Unfortunately it crashed every time I tried to run it. To cut a long story short, I ended up writing my own program that uses one of the nTune libraries to read and plot the temperatures in real time; the same library also allows changing the max clocks. It's a real shame nTune itself will not run, as it seems it would be a really good application.
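[Editor's aside: the poster's nTune-based program isn't shown in the thread. For reference, a rough modern equivalent of "read and log GPU temps and clocks" using NVML via the pynvml package; this is a sketch, not the poster's code, and it needs a GPU/driver recent enough to expose NVML.]

```python
# Minimal temperature/clock logger, roughly what the poster describes doing
# with the nTune library, but using NVML via pynvml (pip install pynvml).
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetTemperature, nvmlDeviceGetClockInfo,
                    NVML_TEMPERATURE_GPU, NVML_CLOCK_GRAPHICS, NVML_CLOCK_MEM)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # degrees C
        core = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)     # MHz
        mem = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_MEM)           # MHz
        print(f"{time.strftime('%H:%M:%S')}  {temp}C  core {core}MHz  mem {mem}MHz")
        time.sleep(2)
except KeyboardInterrupt:
    nvmlShutdown()
```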
Just a bit of extra info: while not running any graphics-intensive applications, with the clocks auto-idled down to 169/325/100, the GPU temperature settles at about 12C above ambient. -
Darth Bane Dark Lord of the Sith
Is it safe to run @ 700 (core) / 1700 (shaders) / 500 (mem) a lot? PowerMizer is on, so as soon as I am done gaming it drops the clocks a lot, and I have a notebook cooler so temps are great. I haven't had any stability issues whatsoever.
-
Apparently he finds it safe.
-
Darth Bane Dark Lord of the Sith
-
Ah, you've got DDR2, my bad.
*Laughs maniacally* j/k
Well, that is a good OC then, congrats. -
....
-
My roommate has a stock-clocked 9200M GS and it runs Fallout 3 at 25+ FPS at the lowest settings and resolution, which isn't bad.
-
When I play Fallout 3 it may be running at 40 FPS but it's still jittery :S
The only time it was hard to play was when I was rampaging through Rivet City though (<20 FPS).
It's a great lil' card. -
Miscolobo offered some good advice.
Also, use HWMonitor to monitor temps... it is the standard. -
And I do use HWMonitor, always; what the heck d'you think I was doing?
Jamming a thermometer through my laptop's heatsink maybe? haha -
The GPU draws its power from the mobo, and while it has headroom at stock clocks, when overclocked each separate processor consumes more power. So when the card is being completely stressed out by a game or benchmark and it hasn't enough power, you'll get artifacts or even a BSOD.
There is, of course, a limit to how much current the GPU is able to draw through its interface with the mobo. Sometimes this is the limiting factor in an overclock, sometimes heat. -
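[Editor's aside on the post above: dynamic power in a chip scales roughly with capacitance × voltage² × frequency, so at a fixed voltage (the usual case for these software overclocks) a ~30% clock increase means roughly 30% more dynamic power drawn through the same board interface. Static/leakage power is ignored in this rough model.]

```python
# Rough model only: dynamic power ~ C * V^2 * f. With voltage unchanged,
# power scales linearly with the clock.
def relative_dynamic_power(f_new, f_old, v_new=1.0, v_old=1.0):
    return (f_new / f_old) * (v_new / v_old) ** 2

print(relative_dynamic_power(700, 529))  # ~1.32 -> about 32% more core power
print(relative_dynamic_power(520, 400))  # ~1.30 -> about 30% more memory power
```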
Manic: What temperature does your GPU run at when idle and what is your room temperature?
-
Right now 47C, room temp... I would guess is anything from 20C-25C.
-
GeForce 9200: which processor is good with it?
CPU
1)T4200 (2.0 GHz)
2)T6400 (2.0GHz)
3)P7450 (2.13GHz)
4)P8400 (2.26 GHz)
5)P8600 (2.4 GHz)
6)P8700 (2.53GHz) -
What are you using it for?
The difference between the T and P series is how much power they use, the T being 35W and the P being 25W, so the P series will help increase battery life.
I'd recommend the P8400 because it's a great balance between price and power. But if you have the money, options 5 and 6 are good too.
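[Editor's aside, heavily hedged: TDP is a worst-case ceiling rather than a typical draw, but under sustained load the 10W difference adds up. With a hypothetical 47Wh battery and an assumed 15W for the rest of the system:]

```python
# Illustration only: battery capacity and the 15W "rest of system" figure are
# assumptions, and CPUs rarely sit at full TDP, so real gains are smaller.
battery_wh = 47
other_w = 15
for name, cpu_w in (("T-series (35W TDP)", 35), ("P-series (25W TDP)", 25)):
    hours = battery_wh / (cpu_w + other_w)
    print(f"{name}: ~{hours:.1f} h at full CPU load")
# T-series: ~0.9 h, P-series: ~1.2 h
```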
-
I'll be using it for light gaming like StarCraft, and games similar to WoW. What's the best CPU for that?
-
question above
-
Option 3 or 4 would be best, because they use less power
-
Hi guys.
I tried to overclock my 9200M GS 256MB GDDR3 a bit and I obtained 2655 with these frequencies: core 694MHz / memory 827MHz.
With stock clocks I achieved a little more than 2.1k.
The strange thing is that GPU-Z says my 9200M GS has 16 unified shaders instead of 8...
Am I lucky or...?
PS: my laptop is a Samsung P210 with a C2D P7350, 4GB RAM and a 320GB HDD.
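[Editor's aside: a quick look at the relative gain in the result above. The benchmark isn't named in the post, and the GDDR3 part's stock clocks are assumed to be roughly 530 core / 700 memory based on earlier posts in the thread.]

```python
# Relative gains for the P210 result above. Stock score is given as "a little
# more than 2.1k", so 2100 is used as a round figure; ~530 core / 700 memory
# stock clocks are assumptions taken from earlier posts.
score_gain = (2655 / 2100 - 1) * 100
core_gain = (694 / 530 - 1) * 100
mem_gain = (827 / 700 - 1) * 100
print(f"score +{score_gain:.0f}%, core +{core_gain:.0f}%, memory +{mem_gain:.0f}%")
# score +26%, core +31%, memory +18%
```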