Here's the deal: MacBook Pro 17" 2.4 GHz with 2 GB RAM and an Nvidia 8600M GT 256 MB DDR3 video card. Overclocked to:
Core - 665 MHz (default is 520 MHz)
Memory - 860 MHz (default is 650 MHz)
Overclocked with RivaTuner 2.04 and an old Boot Camp driver (101.34). This gives me a 3DMark '06 score of 5123! Temperatures seem good - never above 81 degrees.
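For the curious, those settings work out to roughly a 28% core and 32% memory overclock. A quick sketch of the arithmetic (the variable names are my own, the clocks are the ones quoted above):

```python
# Overclock margins for the quoted 8600M GT settings.
defaults = {"core": 520, "memory": 650}  # MHz, stock clocks as stated above
oc = {"core": 665, "memory": 860}        # MHz, as set in RivaTuner

for name in defaults:
    gain = 100 * (oc[name] - defaults[name]) / defaults[name]
    print(f"{name}: +{gain:.1f}%")
```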
See the picture for my results!
-
Cool results!!!!
-
Wow...
/drool
And I thought my 3.4k score was good.
I'm afraid to overclock though -
I ran the benchmark in the default settings btw (1280x1024 resolution).
-
Hmmm... I don't think this is possible. That would mean the 8600 GT is as powerful as a 7950 GTX... something is not right.
-
Why do I think that score is from a 1024x768 run...
rather than the default 1280x1024...
If that's the case, please post the 1280x1024 default-resolution test scores. -
The 3DMark benchmark is not necessarily a true indicator of in-game performance, so I wouldn't go so far as to say it's as powerful as the 7950 GTX.
-
I can believe it. I get around 4000 at standard clocks with the 162.18 drivers on my 2.4 GHz MBP, and I think Apple underclocks the card a bit. So if you were willing to overclock it, I could see how you could, possibly, get that high - though it DOES seem quite substantial.
-
Like I said... I think the score is misleading...
since it's probably a score from a 1024x768 test...
NOT the default 1280x1024 score (which most people use to compare their scores) -
What is, generally, the difference between a 1024x768 score and a 1280x1024 score? Is there a standard-ish % increase?
-
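No standard percentage is given in the thread, but one rough data point behind that question: 1280x1024 renders two-thirds more pixels per frame than 1024x768, so fill-rate-bound scores drop substantially at the higher resolution (though 3DMark06's CPU tests don't scale with resolution, so the overall score doesn't shrink linearly). A quick check of the pixel counts:

```python
# Pixel-count comparison between the two resolutions discussed above.
low = 1024 * 768     # 786,432 pixels
high = 1280 * 1024   # 1,310,720 pixels
print(high / low)    # ratio of pixels per frame: exactly 5/3, i.e. ~67% more
```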
That is highly improbable. An 8600 at 600/900 with a more powerful CPU than yours got about 4600 at the default res.
-
Man, that's higher than my 7950 GTX.
Is there any way to overclock a 7950 GTX to get higher performance? There seems to be no mention of an OC for the 7950 GTX.
-
Let's hope this guy is telling the truth, unlike the 725/725 8600M GT OC guy.
-
We really gotta stop these threads. I'm really tired of everyone comparing e-peni. If I said that my GPU gets a 20 gagillion in 3DMark98325409 while running in a liquid nitrogen tank, what good does that do anyone else? Same thing with me going "liar!!! I bet your pants are on fire!"
-
None, it's just a good way to boost your ego.
-
Yup, we'll just have to wait again. BTW, check out the thread by our mental friend: http://forum.notebookreview.com/showthread.php?t=172710
-
Man, will you stop pulling attention to him? Trolls thrive on attention. Ignore him, and hopefully he'll get banned.
-
You never know... maybe he's the next Bill Gates of hardware...
-
He's going to be the next genius, or Thomas Edison. -
-
That screenshot includes the resolution the test was taken at under Settings. -
Alright, I ran the test again, but it will only show my results online, not in 3DMark itself. I guess that's because I have the free version. Anyhow, here you can see my settings after the test completed.
-
Skilled troll is skilled.
-
Guys, 5000+ at 1280x1024 should be correct at those clocks.
Off topic:
Personal record, 3DMark 2006 - 6023 - XP - 1024x768 - 165.01
Vista - 1280x1024 - 4780 (don't remember the drivers)
My only problem is that I, and probably everybody that OCs, can run a 3DMark without problems, since the benchmark lasts what, 8 min? And 2 or 3 min of that are on the CPU! And the CPU test sits between the SM2 and SM3 benchmarks, so the GPU has time to cool down a bit; you have more or less 5 min at top load on the GPU.
My concern here is being 100% stable. I do not need higher 3DMark scores; I want an OC that can game without problems for 5 or 6 hours, not 5 min. I've been struggling with ATITool to find the 100% stable mark, and I found it.
I did not see until last night that the shaders work in steps. The problem with the table below is that it does not include shader clocks below 1188:
http://forums.guru3d.com/showthread.php?t=238083
Set core | Resultant frequency
frequency | Core Shader
---------------------------------
509-524 | 513 1188
525-526 | 513 1242
527-547 | 540 1242
548-553 | 540 1296
554-571 | 567 1296
572-584 | 576 1350
585-594 | 594 1350
595-603 | 594 1404
604-616 | 612 1404
617-617 | 621 1404
618-634 | 621 1458
635-641 | 648 1458
642-661 | 648 1512
662-664 | 675 1512
665-679 | 675 1566
680-687 | 684 1566
688-692 | 684 1620
693-711 | 702 1620
712-724 | 720 1674
725-734 | 729 1674
735-742 | 729 1728
743-757 | 756 1728
Problems: the 8600 comes stock with a 1:2 ratio, e.g. (stock) 475 core / 950 shader. That means the maximum GPU clock will be capped by the maximum stable shader clock. My maximum GPU clock is 564, which lands on the "112x" shader step; at 566 you are on "113x", one step above (I can see this in RivaTuner), and after some minutes of running, ATITool's 3D view crashes my card with desktop artifacts.
The funny thing is that I can play for hours at 580/1160, but once again ATITool's 3D view crashes my card; it's the only thing that crashes the card at those clocks. At 600/1200 the shaders are on another step, and this time hours of gaming crashes the card, but 3DMark runs quite well, LOL.
The problem here is not temps but shader limits. ATITool 100% stable: 564 ("112x" step); not ATITool-stable but gaming-stable: 580 ("116x" step).
To finish: I was alerted that NiBiTor has a VID (voltage) editor, and indeed it does, but I did not see a single gain from it. I changed vcore values and was at least expecting different temperatures, but no, so it seems we are stuck at the stock vcore. -
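The step behavior described above (the set core clock snapping to a discrete resultant core, with the shader clock moving in its own discrete steps) can be sketched as a simple lookup over the guru3d table. The function name and structure below are my own; the numbers are just the table transcribed:

```python
# Lookup mirroring the G84/G86 step table above: the driver snaps a
# requested core clock to a discrete core step, and the linked shader
# clock advances in its own steps.
STEP_TABLE = [
    # (set_min, set_max, resultant_core, resultant_shader) in MHz
    (509, 524, 513, 1188), (525, 526, 513, 1242),
    (527, 547, 540, 1242), (548, 553, 540, 1296),
    (554, 571, 567, 1296), (572, 584, 576, 1350),
    (585, 594, 594, 1350), (595, 603, 594, 1404),
    (604, 616, 612, 1404), (617, 617, 621, 1404),
    (618, 634, 621, 1458), (635, 641, 648, 1458),
    (642, 661, 648, 1512), (662, 664, 675, 1512),
    (665, 679, 675, 1566), (680, 687, 684, 1566),
    (688, 692, 684, 1620), (693, 711, 702, 1620),
    (712, 724, 720, 1674), (725, 734, 729, 1674),
    (735, 742, 729, 1728), (743, 757, 756, 1728),
]

def resultant_clocks(set_core_mhz):
    """Return the (core, shader) pair the hardware actually runs at."""
    for lo, hi, core, shader in STEP_TABLE:
        if lo <= set_core_mhz <= hi:
            return core, shader
    raise ValueError("set clock outside the table's 509-757 MHz range")

print(resultant_clocks(665))  # the OP's 665 MHz setting -> (675, 1566)
```

Note, for example, that the OP's 665 MHz setting and a 679 MHz setting land on exactly the same 675/1566 step, which is why stability can jump suddenly at certain set values rather than degrading smoothly.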
I also never got how people openly lie about stuff that never happened. Use your imagination in a different way. -
-
Oh, my bad, I thought those were the actual scores, not the online ones. My apologies.
-
Everyone's so impressed with all these scores, my desktop gets over 12000, but you don't see me going around talking about it.
Why does anyone care about 3DMark? The only thing I think it's good for is a burn-in test: have it running for a few hours with everything as high as it goes.
Now if he had said "Hey look, I've got Half-Life 3 running at 200 FPS" with a video, that would be cool.
Also, I'd say he's telling the truth; he doesn't have any reason to be lying about it, and nothing looks out of place in any of his screenshots. -
Nice results! -
-
Some of you guys are really suspicious aren't you?
'Trolling' these forums is the least of my intentions (look at my other posts). I thought most members liked this kind of result. I'm not posting to make myself feel better about my purchase or anything like that. I'm just trying to show the 8600 GT has some good overclocking potential and can easily be clocked to the same speeds as its bigger brother, the 8700 GT. That's all, really.
Now if only the latest drivers could be overclocked, that would make these results even better -
Yeah man, I wasn't specifically referring to you; a lot of people do it just because. Nice results, congrats. The thing is, you got the DDR3 version of the card, so your clocks were already better to begin with.
-
http://service.futuremark.com/compare?3dm06=2692773
Futuremark - ORB - Project Comparison
-
That's the ORB comparison, that's not the page you see when you submit your results.
-
Is it possible to uninstall a full version of 3DMark06, then install the basic edition, run it, and then put the serial code back in to make it the full edition again?
I dunno. I remain suspicious of this result; until I see the resolution and the score in the same screenshot, I won't believe it. -
I don't trust screenshots; there are many things you can do with them... -
Are there any guides to OC a 7950 GTX?
I've seen no guide for this before. As it usually runs at 59 degrees under full load, I figure I could overclock it. Just for the fun of it.
-
I mean, what are the usual core and memory speeds it can be overclocked to, etc.?
And how do you actually overclock with RivaTuner?
-
I think it might be true; I'm trying to get to that score on my 15.4" T61p.
I did try to match the OP's settings, but the card gave out even though the temps never went very high:
Core - 632 MHz (default is 475 MHz)
Memory - 820 MHz (default is 702 MHz)
This is as far as I can push it; any higher and 3DMark won't even run. And yet my temps never went above 75C, so I am rather curious how the OP got his score:
Core - 642 MHz (default is 475 MHz)
Memory - 840 MHz (default is 702 MHz)
-
Wow... 5000? My 7900 GS didn't even get that on a crappy Venice core.
-
I wasn't able to duplicate my first test run (with the core at 665 MHz) without getting artifacts, by the way. I don't know why. So I ran it at 660 MHz (5 MHz slower) the second time, and that went well. The 3DMark score was lowered by a good 100 points (still 5000+ though). I also did some gaming at that speed successfully, although I haven't played for a few hours non-stop yet. -
Another board member was kind enough to offer me his 3DMark Pro key just so I could take the following screenshot showing my test results + settings. Hopefully that's enough 'evidence' for everyone, and if not, so be it. I'm not running that damn test again.
I have lowered my overclock settings by 5 MHz because I couldn't duplicate the original settings without artifacts in 3DMark. That's why the score is a little lower:
I used Paint to save the screenshot, btw. I did not use it to edit my results -
my OC'd 8600m GT just got me a 5000+ 3DMark score
Discussion in 'Gaming (Software and Graphics Cards)' started by Cinner, Sep 24, 2007.