So the title speaks for itself... I have been surfing around the gaming area and also the Asus/Compal/Sager areas, and I have noticed that a few people are reporting odd stock core/memory clocks for their new laptops. Well, not necessarily odd, but somewhat out of the norm. For example, there was a Sager post that showed his 8600M GT with the memory clocked at 513, according to RivaTuner. Now, I'm not saying that's incorrect at all, it just seemed weird for the manufacturer to clock it at 513: not 510, not 520, but 513. It may have been 510 and RivaTuner might have rounded it off, I don't know. Also, I think the NVIDIA stock core clock for the 8600M GT is 475, so I don't know why they would overclock it out of the factory; maybe to offset the lower memory clock (that particular laptop's memory was clocked at 400). So I decided to download three apps that show my clock speeds to see the differences: ATITool, nTune, and RivaTuner. This is the result that I got:
[screenshot: clock readings from ATITool, nTune, and RivaTuner]
As you can see, ATITool and nTune are reporting clock speeds of 500 core / 700 memory. But if you look at RivaTuner, it gives me a core clock of 540, 40 MHz higher! If you look at the offset options, you can see that no offset is applied at all. I went back and checked the clock speeds in RivaTuner through the overclocking options, and here is the result:
[screenshot: RivaTuner overclocking options]
It looks like the monitor is displaying the clock speeds 40 MHz higher than they should be. Now, if I take that 40 and subtract it from the Sager's 513, I get 473-ish (yes, I can do a little math), which is very close to the NVIDIA stock clock, which in turn would also make a little more sense. But that's just me.
Please keep in mind that none of this is confirmed at all; this is just an experiment, result, and opinion from a single user. I just wanted to know if anyone could confirm this and maybe try it out for themselves to see if they get the same or a different result. I have a feeling this would affect people who want to overclock their cards and give them a little more room to overclock, if the RivaTuner monitor is indeed incorrect. Oh, and I have an 8600M GS, btw.
-
I think you hit on something serious.
-
Nice work, thank you.
-
Wow, if this is true, then I hope the Sager's GDDR3 clock really is misreported in this instance.
-
Why not turn on Coolbits and see what it says there? Those clock speeds are probably the most accurate, since it's NVIDIA's utility.
-
I thought nTune was NVIDIA's utility. Anywho, I'll download Coolbits and post my results.
-
I downloaded Coolbits and added it to the registry; dunno how to actually "run" the program, though.
-
Coolbits lets you overclock your GPU, but it also tells you what the current clocks are. Read this: http://forum.notebookreview.com/showthread.php?t=61174
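For what it's worth, Coolbits isn't really a program you "run"; it's just a registry value that unlocks the hidden clock controls in the NVIDIA control panel. Here's a minimal sketch of setting it from Python; the key path and the value 3 are the ones commonly cited for drivers of this era, so treat them as assumptions and double-check against the thread linked above.

```python
# Hedged sketch: enable NVIDIA "Coolbits" by writing a registry value.
# The key path and the value 3 are assumptions based on commonly cited
# instructions for ForceWare-era drivers; verify against the linked thread.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 3 is the value usually quoted for unlocking the clock-frequency page.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits set; reopen the NVIDIA control panel to find the clock page.")
```

After restarting the control panel (or rebooting), the clock-frequency settings should show up, and that's where you can read the current clocks.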
-
A couple of points that might be of interest:
Firstly, the frequencies used by graphics chips are generated by PLLs (Phase-Locked Loops). These take a base frequency (usually 14.318 MHz), multiply it by a number (often called N), and divide it by another number (usually called M). However, there are some restrictions: N and M are integers with limited ranges, plus the multiplied frequency (14318000 * N) must fall within a certain range.
This means you can make a lot of frequencies, but not all of them. The way this is usually implemented is that you ask for a frequency (say 530 MHz), whatever bit of maths (in the driver or somewhere else) works out the closest it can get by choosing N and M values that meet the rules, and that's what you get. So you end up with two frequencies: the one you asked for and the one you got. It is possible the tools report different ones (the desired frequency from the registry and the actual frequency from the chip).
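To make that concrete, here's a tiny sketch of the "closest achievable frequency" idea. The N/M ranges and the allowed range for the multiplied frequency below are made up purely for illustration (real hardware has its own limits); the point is just that you ask for one number and get back the nearest one the rules allow.

```python
# Rough sketch of a PLL frequency search, assuming a 14.318 MHz reference.
# The N/M ranges and the multiplied-frequency limits below are made up for
# illustration; real chips have their own restrictions.
BASE_HZ = 14_318_000

def closest_pll_freq(target_hz, n_range=range(2, 256), m_range=range(2, 16),
                     vco_min=100e6, vco_max=1_400e6):
    best = None
    for n in n_range:
        vco = BASE_HZ * n                  # multiplied frequency must stay in range
        if not (vco_min <= vco <= vco_max):
            continue
        for m in m_range:
            f = vco / m                    # actual output frequency for this N, M
            if best is None or abs(f - target_hz) < abs(best[0] - target_hz):
                best = (f, n, m)
    return best

# Ask for 530 MHz; you get back the nearest frequency the N/M rules allow.
freq, n, m = closest_pll_freq(530e6)
print(f"requested 530.000 MHz, got {freq / 1e6:.3f} MHz (N={n}, M={m})")
```

So two tools could disagree simply because one shows the frequency that was requested and the other shows the frequency the PLL actually produced.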
Secondly, the other reason to get odd frequencies is EMC. To sell the notebook they must (allegedly, don't get me started) pass assorted emissions tests. One aspect of the limits they have to keep to is that the limit usually steps up (i.e. becomes less stringent) as frequencies go up. If you have a peak over the limit just below one of those steps, a quick way to get rid of it is to tweak the clock driving the offending signals up a little, moving the peak to the other side of the step and below the less stringent limit (been there, done that ;-)
Of course the utilities might just be wrong too...
John -
Well, on desktops, in reference to the 8800 series, the same thing shows up: the clocks read lower than what they are actually set to. Other information, such as the card's BIOS, the kind of memory, heat, and fan speed/control, works as normal.
I think the deal with the 8 series is something to do with the driver/hardware that makes some programs (I believe ATITool has the same issue as RivaTuner) read a slightly different clock. In either case the clocks are always within range of the desired speed, and a few MHz isn't really going to matter much in the long run.
Or maybe RivaTuner is just broken. -
Anyway, recent NVIDIA video cards (especially mobile ones) have used different clock speeds for different parts of the GPU. The result you get simply depends on which part of the GPU is being measured, and none of them may be what you expect. I don't know if ATI does the same, though. -
I have an 8600M GS and have the same problem with RivaTuner.
It displays my memory clock speed as 799.8, but I don't think that's right.
It looks like a problem with RivaTuner. -
I have an 8600M GT (256 MB DDR2) and RivaTuner is also showing the memory clocked at 799.20; the stock clock for the DDR2 RAM on this card is around 475. I recently updated the driver for the card, and I suspect it's something to do with RivaTuner not being fully compatible with the latest drivers (I read you can't overclock on the latest drivers, as of yet).
Of course I could be wrong, cos I'm a n00b! -
Could it be reading a higher memory clock because of DDR (double data rate)? My 8400M GT's memory clock runs at 600 MHz.
475 seems a little low for a superior card; then again, mine's DDR3, so that could be it.
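Just to spell out that hypothesis with numbers: DDR memory transfers data on both clock edges, so a tool can show either the real clock or double it as the "effective" rate. A trivial sketch, assuming a 400 MHz real memory clock (an assumption, just to show how an ~800 reading could appear):

```python
# Trivial sketch of the DDR (double data rate) hypothesis: if a tool reports
# the effective transfer rate instead of the real clock, the number doubles.
real_clock_mhz = 400.0              # assumed real memory clock, for illustration
effective_mhz = real_clock_mhz * 2  # DDR moves data twice per clock cycle

print(f"real {real_clock_mhz:.1f} MHz -> effective {effective_mhz:.1f} MHz")
```

On that reading, the 799.8 and 799.20 figures mentioned above would line up with a real clock of roughly 400 MHz shown as its effective DDR rate, though that's just a guess.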
-
I've experienced similar things with RivaTuner as the OP has: it reports clock speeds higher than I've set them. It's been a while since I checked, so I don't remember how much higher, but definitely higher (40 MHz sounds familiar).
-
Even on my 8800M GTS in the HDX it displays seriously wrong clocks. Dunno.
-
It's a long story, so I will keep it short:
Basically, NVIDIA changed the way the GPU/memory clock frequencies are calculated starting with the G92 (8800M GTS/9600) and will probably keep this new method in later cards.
The author of RivaTuner noticed this change but hasn't made the corresponding change in the software yet.
So for all of you who want to use RivaTuner with new cards like the 8800M GTS/GTX or later, you will probably have to wait.
Rivatuner displays incorrect clock speeds on new video cards? Or all vid cards?