Hey there,
I've mentioned noting a GPU (core) temp variance of up to 10C(!) between I8kfanGUI (3.1) and RivaTuner (2.06) on my M1330 in another post, but I thought I'd "officially" ask the community if anyone else has noted such a variance.
It's not the variance that bothers me (you can offset the values in I8kfanGUI, for example), it's not knowing which one is correct... My money's on RivaTuner (which is the higher of the two values), but of course, I'd rather have it be the opposite...
Anyways, if anyone else has noted a variance (or no variance), I'd love to hear from you, thx!
-
It has to do with the drivers. Before, when I used the 169.04 and 169.09 drivers from LaptopVideo2Go, both I8k and RivaTuner gave me idle temperatures of about 45C; the readings were always exactly the same. However, most of the newer drivers after that (169.12 onwards) have produced a 10C increase in RivaTuner's GPU temperature. I don't know why, as my laptop doesn't feel any warmer than it previously did. So now with the 174.12 drivers, I get idle temperatures of 55C on RivaTuner, while i8k remains at 45C. I believe i8k is producing the correct results.
-
I8kfan has no way of knowing whether it's reading the right temp.
As I mentioned in the other thread, you would have to adjust I8kfan's temp to match ATITool or nTune.
I say go with ATITool and nTune, then make I8kfan match them, because ATITool and nTune should have the same temp readings and have been accurate in my experience. -
I have to disagree. For example, right now i8k is reading 43C, while RivaTuner has it at 53C. It's barely warm, and the fan isn't even turned on.
-
Sigh... why are there never any easy answers to my questions, lol
I took up Achilles' suggestion and installed nTune, and its temps perfectly match RivaTuner's, which means both read the same [sensor] value and apply the same coefficient(s), if any. I8kfanGUI's reading is exactly 10C less when the GPU's idling, but under load, the figure varies considerably.
SmoothTofu, while I follow your logic and respect how you arrived at your conclusion, I've never, ever, ever had a "modern" GPU that idled at 45C. It's just too low. Of course, I am assuming that you're not running the fan above nominal rates (which is most likely 0-1500rpm at system "idle" on the M1330). My 8800GT, which idles just below 60C with moderate [dedicated] fan speed, is probably the coolest modern GPU I've had in a while (with the 6800 & 7900 having had the secondary function of heating up my apartment...).
What I'm getting at is that I tend to believe the RivaTuner / nTune figures more than I8kfanGUI's. I have no idea why I8kfanGUI reports a difference of exactly 10C (at idle); maybe I'll eventually ask Mr. Diefer about it.
Btw, I did read this over at guru3d:
"...RivaTuner ... uses ... NVAPI interfaces for driver-level thermal monitoring...." -
So all you have to do is make I8kfan match nTune like I suggested.
My i8kfan temp matches ATITool's/nTune's temps all the time. As long as it matches, it is a dependable program. -
*For the record, I've only managed to get the OSD to appear while running 3DMarkxx benches, and I haven't been able to view it with the games I tested so far (Half-Life 2, RB6-Vegas, STALKER, ...). -
Don't let other programs manage the fan speed.
If you run it too fast all the time, the heat sink will collect dust much faster and the GPU will overheat.
Let the programs monitor, not control. Let the BIOS take care of the temps. -
Reminder to all: I8kfanGUI's GPU temp value needs to be offset to reflect the GPU's true temp.