Having upgraded my C90P's GPU from the lowly and rather rubbish 8600M GT to the much more powerful ATI 4670 MXM 2.1 Type II card, I proceeded to push my system to its limits. The results amazed me so much I thought I would share them here.
Having OCed my Q9650 CPU to 3.33GHz in the BIOS, and with the 4670 clocked at 831MHz on the core and 981MHz on the memory (100% stable with zero artifacts in ATITool), I ran the test and scored nearly 9300 points in 3DMark06. Something of a record for a mid-range card, methinks!
-
King of Interns Simply a laptop enthusiast
-
IT'S Over 9000!!!
-
ViciousXUSMC Master Viking NBR Reviewer
3DMark06 is a CPU benchmark by today's standards; your high score is not in any way a reflection of your GPU or its performance. It's the big CPU overclock.
-
King of Interns Simply a laptop enthusiast
Fair enough. However, without the CPU overclock I get 9113, so it's not even 200 points more with the 10% CPU overclock. This evening I will try to push the GPU even further and see what I can get. So far everything is nice and stable, as the fans are always going at 100%. I think Vicious certainly knows how loud these fans are
-
ViciousXUSMC Master Viking NBR Reviewer
Not too bad
It's a nice relaxing sound.
You have to be one of the few C90 owners at this point; you should get a collector's item certificate for it -
King of Interns Simply a laptop enthusiast
Yeah lol. I am certainly one of the even rarer C90P owners too hehe. At least there are a few C90S's still hanging around on the forums, so I can pretend I have company
I think it is amazing that so much performance can come from a tiny mxm II card
-
Nice score!!! If you do a search, I was able to push my Toshiba F55 with an Nvidia 9700M GTS to over 10K in 3DMark06 about a year or so back. Sure is fun trying to find the limits of these laptops
God Bless
-
King of Interns Simply a laptop enthusiast
Pretty awesome, JDeluna, however the 9700M GTS is a 256-bit card and is MXM III; it has the advantages that the additional size and power allow. The 4670 card I have is in the same class as the 8600M GT and 9600M GT, on a tiddly MXM 2.1 Type II PCB
-
Wow man, congrats! Those are impressive scores! I really need to push my rig to see how high I can go, but with such high temps... I don't know haha
-
King of Interns Simply a laptop enthusiast
Thanks ryzeki! Just wondering, guys: is it possible to unlock the shader and voltage settings for ATI cards? At the moment the shader clock is not shown and therefore not adjustable, and the voltage settings are 1.2, 1.0 or 0.9; is it possible to up it to 1.3, for example?
-
Congrats for the great numbers, mate. You really pushed it to the limit
I don't think you would be able to run it @ 1.3V though. Either it won't flash the vBIOS, or you won't be able to make a vBIOS with that voltage at all. There is a third option as well: you could fry your GPU. It's highly unlikely though, because I think you won't be able to flash it. Anyway, I'm not going to stop you. Let us know how it goes
-
King Of Interns - you will not be able to change the shaders on an ATI card because they are essentially "unified" with the core. They run at the same speed as the core, and change when the core speed changes.
-
RECORD!!! I am seriously envious
*turns green* Maybe I should have gotten an Envy (pun intended, lame). King Of Interns... gimme your GPU. You have the best HD 4670 ever!!
-
King of Interns Simply a laptop enthusiast
Cheers melthd
What you want is C90 cooling, mate! 4x 9500 RPM fans running at that speed all the time is what keeps everything nice and cool and allows such ridiculous overclocking. I am about to install the original Crysis and see how that performs; I hope quite well
-
That's exactly what I need in order to get better stable clocks. Sadly I don't have the free time to refine my cooling
I can only wonder what kind of temps you'd get if you undervolted the damn thing... maybe below 0°C
-
King of Interns Simply a laptop enthusiast
Lol, that would be nice but almost impossible, unless you put it outside in the winter
At 837MHz/985.5MHz, which I have now found to be the absolute highest clocks I can achieve before artifacts appear, I idle in the lower 40s °C (it idles at the full clocks, no PowerMizer, when OCed) and load at about 66/67°C -
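For context on how aggressive those final clocks are: assuming the Mobility HD 4670's usual defaults of 675MHz core / 800MHz memory (vendor boards can differ), a quick sketch of the percentage gains:

```python
# Overclock gain over assumed Mobility HD 4670 default clocks
# (675 MHz core / 800 MHz memory; vendor boards may differ).
def oc_percent(new_clock, stock_clock):
    """Overclock expressed as a percentage gain over stock."""
    return (new_clock / stock_clock - 1) * 100

print(f"core:   +{oc_percent(837.0, 675.0):.1f}%")   # +24.0%
print(f"memory: +{oc_percent(985.5, 800.0):.1f}%")   # +23.2%
```

Roughly a quarter over stock on both core and memory is very high for a mobile part still on its stock voltage.
-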
Congrats for a great score on such an old platform. Impressive.
(if it is a laptop...!)
Last edited by a moderator: May 8, 2015 -
King of Interns Simply a laptop enthusiast
Ha ha, yeah okashira, it is a laptop. It originally came with a 150W power supply and officially supported up to the E8500 3.16GHz Core 2 Duo desktop processor, but then I decided to stick in a Q9650 and it worked nicely, although it required me to upgrade the power supply to 180W to be able to OC to 3.33GHz stably, while I undervolt from 1.15V to 1.1V for my OC to save power. The C90P is a 15.4" laptop with a P35 chipset and unlocked MXM 2.1 module support; a really great laptop, although a little bulky for 15.4".
What's more impressive is that at 3.33GHz and 1.1V, temps don't even reach 65°C in Prime95 -
Yorkfield running 3.33GHz @ 1.1V? (Is that the BIOS voltage setting, measured at idle, or measured under load?) Now I am impressed and jealous. The Q9450 (not Q9650, that was a typo) in my desktop needs a ~1.33V BIOS setting to run stable at 3.46GHz. Nearly 80°C under load, using a cooler that would never fit in a laptop (or... err... how big is your laptop?). At 1.1V, you're probably all the way down to ~60W TDP. Voltage makes a huge difference.
What's the stability? I run 100 cycles of LinX at a ~3.4GB dataset (approx. 20 hours?) and accept no less myself. But I'm anal like that. My overclock is bleeding edge too. Literally one increment lower voltage (which is like 0.0025V on my mobo) or 1MHz faster FSB and it will eventually crash LinX. -
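On that ~60W guess: the usual first-order model for CPU power is P ∝ f·V², since dynamic power scales linearly with frequency and quadratically with voltage. A minimal sketch, where the 95W TDP and ~1.2V stock voltage are illustrative assumptions (real chips also cut leakage at lower voltage, which this model ignores, so actual savings can be larger):

```python
# First-order dynamic power scaling: P ~ f * V^2.
# Static/leakage power is ignored, so this is only a rough estimate.
def scaled_power(p_stock, f_stock, v_stock, f_new, v_new):
    """Estimate power after a frequency and core-voltage change."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

# Assumed numbers: a 95 W TDP Q9650 at 3.0 GHz with a ~1.2 V stock VID.
print(f"~{scaled_power(95.0, 3.0, 1.2, 3.33, 1.1):.0f} W")  # ~89 W
```

The model alone lands near 89W from these assumed numbers; getting all the way down toward 60W would also require real-world draw below the rated TDP plus the leakage savings the model leaves out.
-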
King of Interns Simply a laptop enthusiast
I haven't run such a test, but after many hours of Prime95 small FFTs, 1.1V is plenty of voltage at 3.33GHz. Any less and it will crash. At stock 3GHz, 1.0375V is enough to be stable. The E0 stepping is the key. All Q9650s are E0, and select Q9550s are also E0. I believe a lot of Q9550s and Q9450s are C1 stepping, which means Intel didn't use as good quality silicon in those chips, which in turn means a hotter chip that needs more voltage. Anyway, I haven't much control: I have to use RMClock to control voltages and multipliers (from 1.025V-1.15V), as there is no option to change voltages in the BIOS, it being a laptop. Nor can I alter the multipliers or dividers there; I can only change the FSB. 3.33GHz is right at the limit of the chipset's stability in this laptop anyway, and is enough for me either way.
I think the weight is somewhere just under 7 lbs, so about 3.5kg, but the power brick must be at least a kilo; it is massive! -
I haven't had or used a laptop since I sold my Lenovo T60p approx. six months ago. That thing would do less than 2000 3DMark06 points. On a whim I placed an order for an HP Envy 15 two Tuesdays ago (Apr 6) (thanks, coupons, etc.)
I already have an i5-540M in the mail to upgrade it with, lol. Given my current desktop is the Q9450 and an ATI 5770, this might be a good complete replacement, except for the internal storage. Maybe I can get close to 10000 with this new lappy. Anyway, hope you are enjoying your system. -
King of Interns Simply a laptop enthusiast
Dude, it is a 15.4" laptop lol; it is 6.83 lbs to be exact
The heatsink is large by notebook standards but small for a desktop processor; however, it has 3x 9500 RPM fans all to itself, so it does pretty well for itself.
I haven't read much about the Envy; what sort of GPU does that have? -
Now see, if I actually used my C90S for more than 10 minutes a month I might try this, but... too much effort for those 10 minutes.
Nice job though. -
Or is it just for noise, to scare off insects and crows? Ha ha. I keed, I keed.
Seriously, I think my desktop motherboard with memory and HSF alone would top 6.8 lbs, but you have a keyboard, video and a monitor in there too, lol.
The Envy has a Mobility 5830, which is a desktop 5770 with a lower-voltage core, lower clocks and slower memory. My desktop 5770 runs its memory at 2600MHz effective; the 5830M runs 800MHz stock, and some overclock to 1100MHz. So it's like 1/2 to 1/3 as fast as the desktop chip. But at 24 watts it's really impressive, and that's why HP went with DDR3. The GDDR5-based 58xx laptop cards easily double the power consumption. It was a good design choice by HP, despite many others' opinions... IMO. Everything is a trade-off. It's not like HP could have slapped in GDDR5 without other compromises (e.g. cost, battery life, integrity under load, weight). I'm happy with $900 after coupon and rebate. -
King of Interns Simply a laptop enthusiast
Yup, it is pretty much the only flying object with jet engines operational in the UK at the moment
In the meantime I have tried the card in a game for the first time; Crysis, of all games, and it is beyond awesome. So far I have played at the native 1680x1050 resolution, in DX10 mode, with all the eye candy cranked to maximum apart from shaders, which I left at high, and it still flies through the game. With maximum shaders it is only very smooth at 1024x768, so that setting is a massive performance hit. Still playing at 800/900 clocks currently, so there is still room for improvement
Very happy customer here, ATI. This is worlds away from the 8600M GT in terms of performance. -
Nice runs man, get some OC'd Vantage runs in.
-
King of Interns Simply a laptop enthusiast
Is this score any good? I have never done a Vantage run before.
-
Awesome, my man! That's the fastest HD 4670 there ever was
My Vantage at my best OC is in my sig. And it ain't even close to what you got there.
-
King of Interns Simply a laptop enthusiast
Cheers! Although you have to factor my Q9650 into the score. What score did you get for the graphics alone? Anyway, I am pushing it further while I am typing here, stress testing at 838 core haha
-
Hmmm, the stock 3DMark06 run with the trial version nets me a little over 7900. I wonder how your quad core doesn't beat my T9900. I never got around to OCing my GPU; I'd rather enjoy it than chance it, since I probably won't get another one. It already pushes 84°C while gaming, undervolted.
-
King of Interns Simply a laptop enthusiast
That is an interesting point. I don't know why either, unless clock speed is more important in 3DMark06 than the number of cores. Even so, I thought it supported more than 2 cores, so I should get a higher score, right? I just remembered that I was running two instances of GPU-Z, plus CPU-Z, RMClock and AVG in the background. When you did your test, did you disable all background programs?
You get 84°C while undervolted too!! I think I am just lucky: at the stock 1.2V, running the core at 830MHz and the memory at 985.5MHz, I have yet to get past 70°C in either Crysis or ATITool. Performance seems to show it; it is smooth at 1280x1024 with all settings at very high in DX10, or at 1680x1050 with the same settings except shaders set to high.
I am going to up the PCI-E frequency to 101MHz and see whether that helps the CPU; maybe there is a bottleneck going on. -
Yes, the GTX 260M is 2.1, as is the GTX 280M.
What you mean to say is fastest MXM Type II card -
Not even close to a GTX 260M.
My GPU score is 5600 with a small overclock.
His GPU score is 3448 with a huge overclock.
His GPU is 61 percent that of a GTX 260M. -
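For what it's worth, the 61 percent figure is simply the ratio of the two GPU sub-scores quoted above:

```python
# GPU sub-scores quoted above (presumably from the Vantage runs).
gtx260m = 5600   # GTX 260M with a small overclock
hd4670 = 3448    # Mobility HD 4670 with a large overclock

print(f"{hd4670 / gtx260m:.1%}")  # 61.6%
```
-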
Indeed, it won't get close, because the GTX 260M is a bigger card with more powerful components, and it gets extra power through the power tab on the card, introduced with MXM Type HE and IV, which isn't present on Type II.
-
I had ESET, GPU-Z, Speedfan and WMP running when I did mine. Granted, my MP3s sounded horrendous.
-
2805 GPU score. And isn't an i5 somewhat comparable to a T9900?
-
Apparently the high 3DMark06 score comes from the Q9650, which easily beats all mobile i7s and dramatically boosts the final score.
-
But YOU are wrong (and right at the same time)
The 2.1-spec cards cover Type II, III, HE and IV,
so the GTX 260M and GTX 280M are both 2.1 and HE cards,
and the 4670 is also 2.1, but Type II.
Type 3.0 was a design change in the cards; the 2.1 types all had the same basic design, they just got larger and had the extra power tab added, with a change in Roman numeral for each size increase. -
King of Interns Simply a laptop enthusiast
The GTX 260M and 280M are MUCH bigger cards, even if they are also 2.1.
Also, Echoshade, I checked my CPU score against the scores of other Q9650s and also the T9900 on Google, and it seems mine is performing as it should. If your CPU score does indeed beat mine, I would love to see evidence of it. I am pretty sure you must have run the test at a lower resolution than 1280x1024 to get over 7900 at stock clocks with that processor.
-
I can't really get it to record or give me a result sheet since it's the trial version; it only dumps the number out for me on their website. Where do you guys keep getting the full version?
Also, since it was the trial version, I wasn't allowed to change any settings, so I just went with the defaults. -
King of Interns Simply a laptop enthusiast
I don't know about you guys, but I am pretty proud that my mere mid-range MXM 2.1 Type II card performs 61% as well as a slightly OCed GTX 260M! Does this mean my card is closer to 70-75% of the performance of a non-OCed high-end card?
It sure makes me happy haha.
FYI sean, MXM HE is also MXM 2.1, so basically he is agreeing with you that there are MXM 2.1 GTX 260/280Ms, just that they aren't MXM 2.1 Type II like the 8600M GT, 9600M GT, ATI 3650, 4650, 4670 etc. -
lol at shadow lame pic lol
-
Interesting results. 9000 for an HD 4670 is impressive, regardless of the Q9650. Did you run 3DMark06 at 1280x1024? I'm going to play around with my new SXPS and see if I can't break 8000 with my 'lowly' T9550 + 4670.
Here's what I have managed so far: http://service.futuremark.com/resultAnalyzer.action?resultId=13760850&resultType=14
My clocks are 750/880, and I've yet to see any artifacts during testing. I'm going to push it further now. -
Good luck with OCing your GPU, aznofazns. But I see no point in OCing the SXPS, because even at the default 675/800 the fans spin up when gaming. During OC it's even worse, and can be tiring on the ears. But then... for bragging rights... lol
Record 3dmark06 score for an MXM II 2.1 card - 9300!!!
Discussion in 'Gaming (Software and Graphics Cards)' started by King of Interns, Apr 18, 2010.