I have a 6920G with 4GB of RAM, a C2D T9300 and the GeForce 9500M GS, running Vista 64-bit.
For some reason I'm only getting 3DMark03 scores of around 5200-5300, but I've seen benchmarks from identical systems that are much, much higher.
Why is this?
My brother-in-law has a 6920 with a gig less RAM and a lesser C2D T5700, but a Radeon HD 3650, and his 3DMark03 score is 10500.
Should my 9500M GS really suck this much?
-
His card is definitely more powerful, especially with the difference in GPU memory type. The 9500M GS isn't that strong a card, but it handles most games fairly well on medium settings. 3DMark03 scores aren't really accurate for comparing performance IMO, since a lot of other factors (such as resolution) affect the score. What games do you play? The difference would be less noticeable in most games.
-
@.@? Obviously, that number is wrong.
3DMark03 is an old benchmarking tool. Use 3DMark06 or Vantage instead.
They're more accurate.
In addition, the 9500M GS doesn't suck that much. My OC'd 9500M GS with a C2D T8100 @ 2.1GHz and some system tweaking lets me score 48XX points in 3DMark06.
Basically, it should score a 5-digit number in 3DMark03.
Maybe you need to update your driver.
Get the latest driver from Nvidia (the easiest way) and install it.
Download the latest DirectX (currently the August 2009 version) and install it.
These will pump up your laptop's performance. (A quick way to check which driver you have now is below.)
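If you want to double-check what driver Windows thinks is installed before hunting for a new one, here's a minimal sketch, assuming you have Python handy; the wmic tool it calls ships with Vista.

```python
# Minimal sketch: print the GPU name and installed driver version via WMI.
# Note: Windows reports Nvidia drivers as e.g. 8.15.11.8611; the last five
# digits usually map to the ForceWare number (...18611 -> 186.11).
import subprocess

def gpu_driver_info():
    out = subprocess.check_output(
        ["wmic", "path", "win32_VideoController",
         "get", "Name,DriverVersion", "/format:list"],
        text=True,
    )
    # wmic prints "Key=Value" pairs separated by blank lines
    return [line.strip() for line in out.splitlines() if "=" in line]

if __name__ == "__main__":
    for line in gpu_driver_info():
        print(line)
```
-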
It's possible that your CPU is throttling during the test and causing your score to plummet; it's a problem that's been known to affect 6920Gs. Check the Definitive Guide to diagnose the problem.
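Not from the guide, just a rough way to see it yourself: sample the reported CPU clock while 3DMark runs and watch whether it drops. This sketch assumes Python with the third-party psutil package (pip install psutil); dedicated tools like RMClock or HWMonitor give a more detailed picture.

```python
# Rough sketch: print the reported CPU clock once a second while a benchmark
# runs. A T9300 is rated at 2.5 GHz, so readings that keep sagging well below
# that mid-run point to throttling.
import time
import psutil

def log_cpu_clock(duration_s=180, interval_s=1.0):
    for _ in range(int(duration_s / interval_s)):
        freq = psutil.cpu_freq()  # .current is reported in MHz
        print(f"{time.strftime('%H:%M:%S')}  {freq.current:.0f} MHz")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_cpu_clock()
```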
-
3DMark06 results:
SM2.0 = 1140
SM3.0 = 1090
CPU = 2198
I ran this with CPUID Hardware Monitor in the background this time. The GPU never got over 68C and the two CPU cores maxed out at 51C.
I have the very latest WHQL notebook-specific drivers from Nvidia, 186.11 IIRC, and the most recent version of DirectX. -
Right, I don't understand this.
I overclocked to 600/425/1200 then ran 3DMark03 and got... 11889!
I'm assuming the moderate overclock can't account for a jump from 5250 to 11889 (rough numbers below), so PowerMizer must be gimping my scores at stock speeds but disabling itself when overclocked.
GPU is still only getting to 74C... is it worth pushing for more?
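Quick sanity check on that theory (back-of-the-envelope only; I'm assuming roughly 475 MHz core / 950 MHz shader as the 9500M GS stock clocks):

```python
# Even if the 3DMark03 score scaled linearly with the core clock, a
# 475 -> 600 MHz overclock would only buy ~26% more, nowhere near the
# observed jump from ~5250 to 11889. So the stock-speed runs must have been
# held back by something else (PowerMizer downclocking, throttling, ...).
stock_core, oc_core = 475, 600
stock_score, oc_score = 5250, 11889

clock_gain = oc_core / stock_core - 1    # ~0.26
score_gain = oc_score / stock_score - 1  # ~1.26

print(f"clock gain: {clock_gain:.0%}")   # ~26%
print(f"score gain: {score_gain:.0%}")   # ~126%
print(f"expected from clocks alone: ~{stock_score * (1 + clock_gain):.0f}")  # ~6600
```
-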
Sometimes the temperature change is insignificant even with higher clocks.
For example (my 9500M GS):
Stock clocks: max temp = 72C
OC 515/450/1275: max temp = 74C
OC 615/470/1415: max temp = 75C
Take note: the memory doesn't have a thermal sensor, so we don't know its temperature. Avoid a very high OC on the memory clock because of that. In addition, I find the memory is the easiest component to spoil.
Increasing the core and memory clocks can boost performance a lot.
Increasing the shader clock doesn't add much.
Lastly, the shader clock doesn't always need to be 2x the core. It can be 2.1x, 2.2x, 2.3x, 2.4x or 2.5x.
I would suggest not going over 2.4x, of course.
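For reference, those multipliers are just arithmetic; given a core clock, here's what each ratio works out to (600 MHz core used as an example):

```python
# Print the shader clock for a given core clock at each of the ratios
# mentioned above (2.0x through 2.5x).
def shader_clocks(core_mhz, ratios=(2.0, 2.1, 2.2, 2.3, 2.4, 2.5)):
    return {r: round(core_mhz * r) for r in ratios}

if __name__ == "__main__":
    for ratio, shader in shader_clocks(600).items():
        print(f"{ratio:.1f}x core -> {shader} MHz shader")
```

At 600 MHz core that gives 1200-1500 MHz shader, so a 2.4x cap means keeping the shader under about 1440 MHz.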