These 8400M GS cards are pretty overclockable, but I'm not aware of any of the new drivers that let you overclock mobile NVIDIA cards.
-
-
I haven't tried OCing my 1330 because I haven't had a need to. But I have used the 169.xx series drivers from http://www.laptopvideo2go.com/ to OC my G1S. I have had success with both RivaTuner 2.06 and nTune. I would be curious to know what level of OC has been done on a 1330.
-
Yes, confirmed that you can overclock using latest drivers from laptopvideo2go.com. Install those drivers and report back your overclock results, I'd like to compare overclocks.
-
Hmm, I think I am using the latest drivers. I am on XP rather than Vista. Are you using nTune or RivaTuner?
-
I have both installed, but use RivaTuner more. Both work just fine, though. If you "think" you're using the latest drivers, you probably aren't. Get the latest 167.xx on laptopvideo2go for your OS. I'm using Vista x64 and it's running well.
-
I got it. Using the latest 169.09 drivers with RivaTuner. Doing some testing at the following clocks: 500/1050/800. I might open it up and re-apply some AS5 to help with temperatures. But the more I use this little notebook, the more I like it. With XP Pro, a T7500, 3GB of RAM and a 320GB 5400RPM Samsung drive, this notebook feels as fast as my quad-core desktop.
-
Thanks -
I monitored my M1330 yesterday while running through 3DMark06 and the max temps were quite high:
CORE (GPU): 95C
CPU0*: 71C
CPU1*: 78C
*I included the CPU temps just for the sake of it, although I've had them up in the low 80s running Intel's TAT at 90% load on both cores. The above CPU temps were recorded during the CPU tests.
I also played through Half-Life 2's Lost Coast a few weeks back, and although I didn't "monitor" the temps, judging by the fan's speed (as in way maxed out), I'm sure the GPU was at or above 90C the entire time.
Thus, I can't really imagine OCing the onboard 8400M much higher without having to lower the ambient room temps quite a bit (e.g., working in a "meat locker", lol). Furthermore, the 16x.xx drivers were never intended for the mobile platform and thus they will certainly cost you in terms of battery usage, even when not using 3D apps, imo.
What I noted when monitoring the 3DMark06 bench was that in between the different benchmarks (game 1, game 2, etc.), the GPU and associated memory clocks fell dramatically:
GPU -> 432-297-182 MHz
Shader -> 864-594-365 MHz
Memory -> 602-301-4 MHz (Yes, that's four megahertz...).
This is clearly a stepping mode of sorts to conserve power when not in active use. I'm fairly certain that using the 16x.xx drivers will not have this stepping behavior, and so if you go with the 16x.xx drivers then you'll have to live with a great reduction in battery time...
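As a rough sketch (assuming you can export the monitoring data to a plain CSV - the "monitor_log.csv" filename and the "time"/"core_clock_mhz" column names below are just placeholders for whatever your logging tool actually writes out), a few lines of Python will pick the distinct clock plateaus out of a trace so the stepping stands out:

import csv

def clock_plateaus(path, tolerance_mhz=5):
    """Return (time, clock) pairs, one per sustained clock level."""
    plateaus = []
    last_clock = None
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            clock = float(row["core_clock_mhz"])
            # Start a new plateau whenever the clock moves more than the tolerance.
            if last_clock is None or abs(clock - last_clock) > tolerance_mhz:
                plateaus.append((row["time"], clock))
                last_clock = clock
    return plateaus

for t, mhz in clock_plateaus("monitor_log.csv"):
    print(t, "core at", round(mhz), "MHz")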
One little Notebook with one very hot GPU (and CPU and Northbridge*) and a lot less mobility time: Sorry, but what's the point, then?
*Keep in mind that all the above-mentioned chips "share" one heatpipe, and thus one chip's heat will affect the cooling efficiency of all the chips... Have a look here for a pic of the heatpipe and associated chips if you're interested and haven't bothered to unscrew the back of your M1330 yet.
edit: Also please keep in mind that there have been other problems reported when using other drivers, such as the notebook not going into / coming out of sleep mode correctly, etc. -
-
That's because my main PC GPU's "states" all default to the same clocks, unlike the 8400M GS. For the few of us who don't already know this, the three states for the 8400M GS using the default 15x drivers (and their respective default clocks) are:
Standard 2D (169/100 MHz core/memory)
Low Power 3D* (275/301 MHz)
Performance 3D (400/600 MHz)
*I'm not quite sure what exactly falls under "low power 3D", but maybe Vista's Aero is an example, and/or HW motion compensation used by DVD players, or maybe even 2D menus in 3D games... anyone know?
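For what it's worth, a toy way of putting that list to use (just a sketch in Python, using only the default core clocks quoted above) is to take a sampled core clock and guess which state the GPU is most likely sitting in:

DEFAULT_STATES = {
    "Standard 2D": 169,      # default core clock in MHz
    "Low Power 3D": 275,
    "Performance 3D": 400,
}

def nearest_state(core_clock_mhz):
    """Return the state whose default core clock is closest to the sample."""
    return min(DEFAULT_STATES, key=lambda name: abs(DEFAULT_STATES[name] - core_clock_mhz))

print(nearest_state(182))   # -> Standard 2D (the idle clock seen in the 3DMark run above)
print(nearest_state(297))   # -> Low Power 3D
print(nearest_state(432))   # -> Performance 3D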
Just one other thing bothering me, though: can anyone explain why RivaTuner's monitor shows the GPU "stepping" through different clocks - even up to the 3D Performance values - in 2D mode? Just start up the [RivaTuner] monitor and do a little bit of standard surfing to see what I mean.
-
Oddly enough, RivaTuner's monitor reports 432MHz instead of the default 400MHz (and 862 instead of 800 for the shader), but 3DMark reports the expected figure of 400 and I assume it is 400. So RivaTuner reported my 450MHz OC as 486, but again, 3DMark reported 450. My guess is that the GPU's clock generator can only hit certain discrete frequencies, so RivaTuner may be showing the actual hardware clock rather than the requested one, but I can't say for sure.
Long story short, the GPU temps peaked at 97C vs. the 95C I reported above for the standard clocks. CPU0 and CPU1 reported 72C (+1) and 80C (+2) respectively.
So yes, the temps do increase, but no, not enough to cause any great concern (at least with my modest clock). I haven't tried to go up to Vengance's 500MHz OC which I guess will add another 1-2C but again, still no real concern.
I also remembered that I once tested I8kfanGUI's monitoring capabilities, and that tool logged only 86C as the stock-clocked GPU's peak temp during 3DMark06. That's a very big gap of 9C between the two tools, and I don't really know which one to trust, per se.
=sidebar=
To verify my OC stability, I played through Half-Life 2's Lost Coast for about 40 minutes, and while in general there was no sparkle-show, I did note that during one run, reflective surfaces (but not the water) were covered in a red-pink color*. When I re-loaded the level, the surfaces were reflecting [sun]light again, but I suspect that my OC may already be too high.
Moreover, I had RivaTuner monitoring temps during the session, and the GPU was averaging 94C and peaked at 104C! I'd say that these temps are really too high for my taste, but maybe the thermal pads on my unit are not as well-sealed as those on another unit, and so each unit is bound to vary by a few degrees.
*Edit: From the nVidia Driver Release Documentation - "Half Life 2 Lost Coast: Color corruption occurs in the video stress test after changing the display mode." Helps to read those whitepapers sometimes, lol -
I don't think you should worry about temps. Once I forgot to plug in my fan and my 7900GT was idling at 125 degrees, but after I let it cool down it worked flawlessly (apart from a problem with the Vista installation) and played Crysis at medium.
-
Elsewhere I posted that I had my 7900GT averaging over 100C during gaming for more than a year, but this was only to point out how much cooler the 8800GT ran due to its 65nm (vs. the 7900GT's 90nm) process. My 8800GT 512MB (OCed to 675MHz) averages around 84C in comparison. That's 10C cooler than the 8400M running in my M1330.
There's an "old proverb" (lol) that says an electronic circuit's life is halved for every 10C increase in operating temperature. Nothing to really get all worked up about when we're talking about a PC video card (that will become obsolete well before it dies), but I'd like to see my M1330 achieve a five-year lifespan.
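Taking that proverb at face value (a back-of-the-envelope sketch only - it's a rough heuristic, not a datasheet spec), the halving rule works out like this in Python:

def relative_life(delta_t_c):
    """Lifetime multiplier for a temperature increase of delta_t_c degrees C."""
    return 0.5 ** (delta_t_c / 10.0)

# e.g. the 8400M averaging ~94C vs. the 8800GT averaging ~84C:
print(relative_life(94 - 84))   # -> 0.5, i.e. roughly half the expected life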
Please don't get me wrong, I'm not against overclocking - I do it all the time myself - but generally in areas where I also have control over the cooling process. I think it's wise to document temps along with OCing successes so that the reader has the whole picture... -
Traveller, FYI the drivers you asked about (16x.xx) DO have steppings for 2D, low-power 3D, and 3D. They work flawlessly on my Precision M4300 in Windows Vista Ultimate x64.
Sorry, I forgot to mention that I'm not using an M1330; I just thought I'd chime in with my experience since my lappy uses the same G86 GPU. That being said, I would hesitate to overclock much, if at all, on the M1330 since the chassis is a lot smaller and cooling is thus not quite as good as in a bigger notebook. -
I was just wondering if you still stand by the statement you made about zero increase in GPU temps when OCing? The reason I ask is that, simply put, my own testing proved the contrary...
Thx & cheers 8^) -
13.3" UltraSharp WXGA (1280x800) CCFL Display (220nits) with TrueLife
which came with a standard driver:
Generic PnP Monitor on NVIDIA GeForce 8400M GS
The resolution is nice (1280x800), but it seems the display flickers - it runs at 60Hz with that standard driver - which kills my eyes.
Have you got an idea where I can find a better driver?
cheers,
Vlado