Yes. Even stock, but since the QX9300 is an "extreme" CPU, the multipliers will be unlocked.
Theoretically:
3.5 GHz * 2 cores = 7
2.5 GHz * 4 cores = 10
And let's take a wild assumption that the QX9300 will allow the full 3x multiplier increase in the BIOS, bringing it to 3GHz. That would be:
3 GHz * 4 cores = 12
So for multithreaded applications, there will be a substantial benefit to the QX9300.
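To put that napkin math in one place, here's a minimal sketch (assuming simple cores-times-clock aggregation, which is only a rough proxy; real gains depend on how well an application threads):

# Rough "aggregate GHz" math for the CPUs discussed above.
# Only a proxy: real scaling depends on how well the app uses all cores.
def aggregate_ghz(cores, clock_ghz):
    return cores * clock_ghz

print(aggregate_ghz(2, 3.5))  # X9100: 2 cores @ 3.5 GHz   -> 7.0
print(aggregate_ghz(4, 2.5))  # QX9300 stock: ~2.5 GHz x 4 -> 10.0
print(aggregate_ghz(4, 3.0))  # hypothetical 3.0 GHz bump  -> 12.0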
-
but that's the way it is:
four cores at 2.53 GHz is for multithreaded applications
&
two cores at 3.50 GHz is better for gaming
but in the long run, wouldn't the four-core be the better buy
for people who are buying now or by the end of this year? -
It all depends on what you are doing. For example, Big Mike likes to test and tweak stuff, so I would think he'd be better off with 2 cores clocked at 3.5GHz. If you like to do a lot of stuff at once, then the 4 cores at 2.5GHz will get her done. Not saying two cores at 3.5GHz won't, but applications will start being configured to take advantage of those 4 cores.
For the stuff I like doing, I'll be much happier with 2 cores clocked at 3.5GHz. That's assuming you can't overclock those 4 cores, of course. -
I do like to tweak and test stuff, but I also would love to boast of my Quad-Core Monster Machine... it just sounds cool to have quads in a mobile platform.
Now, if the QX9300 cannot overclock via multipliers, I'll settle for the X9100. However, as a photographer with CS4 coming soon, I think the Quad would suit me better. But then again, CS4 leans on GPGPU... I should collect my thoughts here... I will be sad saying goodbye to my 25W CPU...
So the 9800M GTX! Great card! Wish it had 128 shaders and 1.1v!
New single card Vantage record: P6474
CPU Score
24272
Graphics Score
5203
Clocks:
705 / 1675 / 975 @ 1.03v 45F ambient
Running temperature under full load was 51C on the GPU. The system becomes unstable with any higher clocks; I believe I've reached the maximum clocks this voltage can handle. But I won't stop till I hit P6500 on this hardware! That's 100 points higher than my last post, and at much higher clocks. The default score for this system was P5000, so an increase of P1474 isn't too bad: roughly 30%, with a theoretical shader throughput of 546 GFLOPS. That makes it a fair bit more powerful than the brand new 112-shader 9800 GT and ahead of the theoretical shader processing rate of the 8800 GTX, which come in at 504 GFLOPS and 518 GFLOPS respectively.
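For anyone wondering where those throughput figures come from, this is just the usual G8x/G9x-era estimate: shaders x shader clock x 3 FLOPs per clock (the dual-issue MADD+MUL). A quick sketch using the commonly published shader counts and clocks (treat the numbers as illustrative, not official):

# Theoretical G8x/G9x shader throughput: shaders * shader GHz * 3 (MADD+MUL)
def gflops(shaders, shader_clock_ghz):
    return shaders * shader_clock_ghz * 3

print(gflops(112, 1.500))  # 9800 GT:   504 GFLOPS
print(gflops(128, 1.350))  # 8800 GTX: ~518 GFLOPS
print(gflops(112, 1.625))  # 546 GFLOPS, the 9800M GTX figure quoted above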
-
What about overvolting with NiBiToR? I set 1.1V in Extra mode in the Voltage Table Editor!?! Will this work?
-
We have about eight voltage options, ranging from 0.70V to 1.03V in steps of 0.05V, except for the last step, which is 1.03V. This means you can select 1.00V or lower as well, and the max is going to be 1.03V and nothing more. To mention it again, this is not controlled by the value you put next to the VID level. The VID level alone dictates the voltage; it doesn't look at the value you put next to it, so you can put whatever fake voltage there. The real voltage is the one NVIDIA initially set in the BIOS and nothing else. That change is not giving you 1.1V, because VID 00, the highest possible level, only gives 1.03V. VID 00 always gives 1.03V, even if you put a value of 2.00V next to it; that value is just a label to make it easy for users.
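In other words, the voltage text next to each entry is purely cosmetic; only the VID level maps to a real voltage. A toy model of the idea (not NiBiToR's actual code, and the table entries here are illustrative):

# Toy model of the VID behavior described above (illustrative values only).
# The regulator maps a VID level to a fixed voltage; the text label stored
# next to it in the vBIOS is ignored entirely.
VID_TO_VOLTS = {0: 1.03, 1: 1.00, 2: 0.95}

def effective_voltage(vid_level, label_text):
    # label_text ("1.10V", "2.00V", whatever) has no effect
    return VID_TO_VOLTS[vid_level]

print(effective_voltage(0, "2.00V"))  # still 1.03 - VID 00 is the hard cap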
I believe a hardware modification would be required in order to make higher voltages work. -
Removed double post
Guess I might as well write something useful...
I've been benchmarking the system at night on my porch. Getting down to around 45F here in the mountains of Utah. Not too cold to do any system damage, but cold enough to get a good overclock.
I've heard of some people spraying NO2 into their systems to cool them. It would take a lot of NO2, but how cool would that be? -
Have you tested the overvolting in NiBiToR? I overvolted the 7950 GTX in my Dell XPS M1710 with NiBiToR, and it worked...
-
Syster, it sounds like somebody was a little rough on their car...
P6504 ! ! !
CPU Score
24274
Graphics Score
5287
Clocks:
712 / 1678 / 977 @ 1.03v 42F ambient
Beat that! (non-SLI / quad-core people...) -
1.03 is the maximum voltage you can go to. If you can prove otherwise, I'd love to bump it up. But as it stands, 1.03v (VID 0) is the max, as can be found via the NiBiToR app.
-
Niiiiiiiiiice
-
This is a common misconception. Setting the voltage to 1.05V as you have indicated isn't possible. Once again, as we have mentioned many times before, the voltage value you enter does not control the voltage. That is ONLY controlled by the VID level; the voltage value is just a label to make it easy to use. So what you've got is support for 1.00V and 1.03V, but not what you tried at 1.05V. If the vBIOS does not have a value above 1.03, then it cannot go any higher. Do a bench at 1.03v, then again at, say, 1.15v. Check the heat from both and you'll see little to no change. Try finding your maximum clock at 1.03v, then try at 1.1 or even 1.15. You'll again see little to no change.
I wish it were that easy, but it's not. -
OK, I know 1GB 9800 GTX SLI x2 is better,
but how much better is it than
ATI 3870 X2 CrossFire?
Will the two 9800s be 20 / 40 / 60% more, or.....
thanks -
@johnksss, I assume you manually configured VID0 to 1.05 from 1.03? If your card is running 1.05v stock, you'll likely be able to clock higher than me. Otherwise, I believe you'll find the same stability @ 1.03v when compared to 1.05v.
-
emike, it seems a completely different BIOS!!!
-
You're right, it does! While our vBIOS is newer, if his has the ability to go up to 1.05, I'll try flashing my board with it and see what happens. His board also shows 3980003 while mine shows 3980503.
johnksss, will you upload a copy of your vBIOS? -
Wow, I hope it will work!
700 MHz GPU clock 24/7 -
lol, how nice would that be... I'd have to use it outside just to cool that beast!
-
It is possible. It just gets hot quick.
I've topped at
712 / 1678 / 977 @ 1.03v @ 42F ambient.
Once the core was down around 30C, I began the benchmarking. It's just barely stable at that. The temp maxes around 60C at that ambient temp, so for me, heat was not a factor, only voltage. -
NiBiToR says this will be an unknown BIOS file. The GPU clock is 28.1 GHz! ^^
<edit> my BIOS file is 62 KB and yours is 61 KB ... ?? -
Alright, since you stated your 9800M GTX is faster than a desktop 8800GTX, we will see who is right. I will install Vantage on my desktop and run the benchmark for the fun of it. Now, since you are overclocked, my XFX 8800GTX is factory overclocked, so let's run some benchmarks, both gaming and synthetic, Emike. It's not that I doubt you or anything; it's just nice to see how that 9800M GTX holds up against a desktop 8800GTX.
I can say right away the max res of my TFT is 1440x900, so I can't run any game at 1920x1200, where your 9800M GTX would start to struggle right away (256-bit bus compared to 384-bit bus). We can also apply AA and AF, where the 8800GTX really shines due to its wider bus.
But I will post later today if I get Vantage installed. We could begin with Crysis at 1440x900, DX10 High, then DX10 Very High. -
Awesome, sounds good. I've got the stock scores on post 1, and my top OC scores on page 12, post 117. Would be interesting to find out. To keep things fair, can you downclock your E6600 to 2.5GHz from your 2.9 OC? While the 9800M GTX only has a 256-bit bus, its 1GB of VRAM helps it out a fair bit at higher resolutions when compared to the standard 512MB 256-bit GPUs.
-
@ emike09
What about the 1.05V BIOS? Will it work? Can you set higher clocks? -
Please don't post information regarding a different product. Just makes things confusing. -
Emike, yes, downclocking to be fair. Well, actually I can downclock it to stock 2.4GHz instead, since yours is at stock too. Will post some benchies in the next few days.
-
Here are my Crysis scores. Since it's hotter today, I'll wait till tonight to bench using a decent overclock.
1440x900 DX10 High Stock
Play Time: 71.46s, Average FPS: 28.04
Min FPS: 18.41 at frame 1941, Max FPS: 35.82 at frame 983
Average Tri/Sec: -27194628, Tri/Frame: -971622
Recorded/Played Tris ratio: -0.94
TimeDemo Play Ended, (4 Runs Performed)
==============================================================
1440x900 DX10 Very High Stock
Play Time: 103.48s, Average FPS: 19.33
Min FPS: 3.04 at frame 68, Max FPS: 23.93 at frame 990
Average Tri/Sec: -8658203, Tri/Frame: -447964
Recorded/Played Tris ratio: -2.05
==============================================================
1440x900 DX10 High Mild OC - 600 / 1600 / 950
Play Time: 57.53s, Average FPS: 34.77
Min FPS: 18.96 at frame 1948, Max FPS: 48.14 at frame 61
Average Tri/Sec: -33764816, Tri/Frame: -971193
Recorded/Played Tris ratio: -0.94
TimeDemo Play Ended, (4 Runs Performed)
==============================================================
1440x900 DX10 Very High Mild OC - 600 / 1600 / 950
Play Time: 80.43s, Average FPS: 24.87
Min FPS: 0.00 at frame 138, Max FPS: 29.40 at frame 1006
Average Tri/Sec: -16208990, Tri/Frame: -651846
Recorded/Played Tris ratio: -1.41
TimeDemo Play Ended, (4 Runs Performed)
==============================================================
In this reviewer's testing with a QX9650 @ 3.0GHz and otherwise far superior hardware specs, my OC'd score is approx 3fps higher than that of a desktop 8800 GTX 768MB card. Were I to take my clocks up to my record benchmark, it would almost be beating the 8800 GT in SLI.
http://www.techspot.com/article/83-crysis-patch-performance-multigpu/page5.html
I know the test is for v1.1, but even still: a P9500 2.5GHz dual core versus a QX9650 quad-core Extreme with a desktop 8800 GTX, and you'd think that desktop would haul @$$ compared to a laptop -
Wow, that's impressive. I just got an M570TU from Xotic too, with almost the exact same specs. I like this card a lot. My question though is: how does NVIDIA set the voltage to 1.03v? What do they do to do that? And is it possible to set the voltage higher the same way?
-
I've wondered that too. No idea. I figure it's set in the firmware somewhere.
-
That's what I think too. I'm curious about the 8800GTS card and the 9800M GTX card. Where are the benches for the 8800GTS?
-
Emike, here are my scores, all stock, no overclocking: 575/1350/900, CPU at 2.5GHz. Vista 32 Ultimate, no Service Pack 1 yet. 178.13 drivers.
DX10 High
!TimeDemo Run 2 Finished.
Play Time: 52.64s, Average FPS: 37.99
Min FPS: 21.22 at frame 142, Max FPS: 48.16 at frame 67
Average Tri/Sec: -36726584, Tri/Frame: -966634
Recorded/Played Tris ratio: -0.95
DX10 Very High.
!TimeDemo Run 1 Finished.
Play Time: 79.77s, Average FPS: 25.07
Min FPS: 13.09 at frame 138, Max FPS: 29.82 at frame 994
Average Tri/Sec: -11095624, Tri/Frame: -442575
Recorded/Played Tris ratio: -2.07
Mild overclock 620/1530/1000
DX10 Very High.
!TimeDemo Run 1 Finished.
Play Time: 72.95s, Average FPS: 27.42
Min FPS: 14.93 at frame 150, Max FPS: 32.93 at frame 998
Average Tri/Sec: -12161351, Tri/Frame: -443574
Recorded/Played Tris ratio: -2.07
Finally, my XPS M1730 with 2.4GHz T7700 and 8800M GTX SLI.
8800M GTX SLI overclocked to 625/1550/950, Crysis DX10 Very High 1440x900
!TimeDemo Run 3 Finished.
Play Time: 63.14s, Average FPS: 31.68
Min FPS: 11.74 at frame 138, Max FPS: 41.80 at frame 990
Average Tri/Sec: -12581973, Tri/Frame: -397182
Recorded/Played Tris ratio: -2.31
TimeDemo Play Ended, (4 Runs Performed) -
I lose :'( Vantage time.
-
I'm surprised too. All the benchmarks I've seen show the 8800GTS 640 being weaker than the 9800M GTX.
-
No probs, Emike, your GPU is still very fast; not too far away from an 8800GTX in my opinion.
-
According to this site, the 9800M GTX at stock clocks scores near a 4GHz quad running an 8800 GT, and I'm just under an 8800 Ultra with OC'd clocks. Passed up the 9800 GTX also. Weird.
http://www.pcgameshardware.de/&menu=browser&image_id=813478&article_id=641615&page=1&show=original -
Yep, though synthetic benchmarks are different from real gaming. I barely rely on either 3DMark or Vantage; it's in games that I want to see the real performance.
-
Right, I agree, but a synthetic benchmark is still a benchmark. It is standardized, simple, and easy to use. You going to do your Vantage scores?
-
Yes will install Vantage today.
-
I noticed on XoticPC that the 9800M GTX is now the only GPU option. Good to see that!
I'm excited to see Magnus's Vantage scores. See if he can top Big Mike's. -
Who says I'm big? I think it's funny that when I'm referred to, it's "Big" Mike, or "Ol" Mike, or something like that
-
Ha, funny. You beat your score yet? And whatever happened to Magnus? I think he realized his score = teh suck and copped out.
-
Hello, I also come from Germany...
I have the D901C with one 9800M GTX.
CPU is a Q9400.
I would like to overclock my 9800M GTX like itsfun did.
@itsfun... Feel free to contact me by PM; maybe you can help me.
But I have another version of the BIOS... in GPU-Z
GPU 0617, and itsfun has 0608.
I flashed my card with nvflash; everything went correctly, "Update Successful."
And then the display is black.
I flashed back to the original ROM and everything works (luckily) correctly.
Can anybody help me?
Thx
Is this the problem? -
Are you saying you have a 617 version and are flashing with a 608? Generally not a good idea, if so.
It is best to flash with your own device ID only. Are you using NiBiToR to set your own clocks? Make sure you are saving your vBIOS WITH a checksum. You should be able to increase your frequencies on everything by at least 100MHz.
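For reference, the usual DOS-era nvflash sequence went something like this (from memory; double-check against nvflash's own help output before flashing anything):

nvflash -b backup.rom    <- save your current vBIOS first
nvflash modified.rom     <- flash the NiBiToR-edited, checksummed ROM

There are also override switches for flashing past device ID mismatches, but given the black screen above, I would not force a mismatched ROM. -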
Yeah, I was wondering about that... my string is also 0617 for a 9800M GTX....
-
If I remember correctly, the 617 card is designed for use in SLI or future SLI, while the 608 does not have SLI capabilities.
-
The problem is that I can flash the 0617, but NiBiToR does not read the clocks out correctly. Do you understand?
        Core    Shader  Memory
2D      21860   400     100
Thrtl   21860   550     301
3D      21860   767     301
What is 21860 MHz?
NiBiToR is not reading it out correctly.
GPU-Z shows:
GPU Clock: 500 MHz
Memory: 799 MHz
Shader Clock: 1250 MHz...
I changed the clocks in my BIOS, and then I flashed... The result was a black screen...
I used my original ROM (0617), which I modified (clocks!).
I hope that you understand me!
Thank you
P.S. I have had 10799 3DMarks,
but wouldn't 12000 be better?
I would like to test it, but I have no more ideas.
h**p://img3.imagebanana.com/img/y1h9ogcw/Unbenannt.jpg
h**p://img3.imagebanana.com/img/7a0vtrrh/Unbenannt.jpg
I also tried to set the 21860 to a real clock, but it won't work... Please help me... Thx