yeah. i was confused. so i'm stuck with the 9800m gtx basically, right?
(still in a confused state. lol.)
-
dondadah88 Notebook Nobel Laureate
-
Soviet Sunrise Notebook Prophet
From the looks of it, yes. The D901C maxes out with two 9800M GTX's, or perhaps one GTX 280M if it is confirmed functional by johnksss. I don't think Clevo will be making any more GTX 280M's for MXM 2.1 slots.
-
i agree with soviet. in fact clevo is smart to use mxm 3.0 for the new gen of laptops; if the d901c could run gtx 280m sli, there would be no m98nu or m17x that could beat it. so an old model would still be the father of benchmarking? that's why there's no sli for the mxm 2.1 gtx 280m
-
What I'd like to know is why Clevo had to shift to an 18.4" model while Alienware was able to put the same thing in a 17". -
i don't think so. the gtx 280m is just too new. it's only a matter of time before the gtx 280m shows its real power.
-
The m98nu shouldn't be able to push the 280m any further than the 5797. Also, it'll be running 2 instead of 1, so they should heat up even faster, thus reducing the OC headroom. Also, you can just look at the shader and clock specs. A fully OC'd 9800m gtx vs an OC'd 280m only shows a 5-10% difference in max specs. -
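For a rough sense of what that 5-10% figure looks like, here's a quick Python sketch. The stock clocks are the commonly listed specs for each card; the "fully OC'd" numbers are hypothetical examples, not measured limits, so treat the output as illustration only.
[code]
# Back-of-envelope comparison of core/shader/memory clocks (MHz).
def pct_diff(a, b):
    """Percentage by which b exceeds a."""
    return (b - a) / a * 100.0

stock_9800m_gtx = (500, 1250, 800)    # commonly listed stock specs
stock_gtx_280m  = (585, 1463, 950)    # commonly listed stock specs
oc_9800m_gtx    = (650, 1625, 975)    # hypothetical "fully OC'd" clocks
oc_gtx_280m     = (700, 1750, 1050)   # hypothetical "fully OC'd" clocks

for label, a, b in (("stock", stock_9800m_gtx, stock_gtx_280m),
                    ("OC'd",  oc_9800m_gtx,    oc_gtx_280m)):
    core, shader, mem = (pct_diff(x, y) for x, y in zip(a, b))
    print(f"{label:>5}: core +{core:.1f}%, shader +{shader:.1f}%, memory +{mem:.1f}%")
[/code]
With those example overclocks the gap shrinks to roughly 8% on every clock, which is in line with the 5-10% estimate above.
-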
Soviet Sunrise Notebook Prophet
-
patience. i know it's the same g92; in fact nvidia still uses the g92 because even an old chip is capable of beating ati cards hahaha. about the heating in the m98nu, the gtx 280m is made on 55nm so it just runs cooler, and don't forget it's a clevo. if clevo can get an i7 cpu working in a laptop, gtx 280m sli is just a piece of cake, i'm sure
-
Soviet Sunrise Notebook Prophet
There is nothing more to squeeze out of the G92M. Even if the cards were watercooled, the GTX 280M cannot clock any higher than it does now and maintain stability unless the voltage is raised, and even then we cannot push past 1.05v. That is the hardware limitation of the G92M. Nvidia's R&D has better things to spend their money on, such as the 40nm GT200M cards promised to us later this year.
-
i'm not even going to test it if it doesn't have the sli connector on the bottom left-hand side of the card. i'll just send it back.
-
too bad to hear that, john. but gtx 280m sli is mxm 3.0 only
-
can someone compare this with 9800m gtx sli please...
http://www.driverheaven.net/reviews.php?reviewid=782&pageid=9
what is your score john ^? -
just checked and it has no sli connector on the bottom of mine.
-
sorry john
you should make your own -
that is bad news ... Sh*T
johnksss, how does your laptop perform in comparison to this review of the m17x?
http://www.driverheaven.net/reviews.php?reviewid=782&pageid=9 -
that is a standard no-updates/no-real-fixes run by a reviewer who reviews computers as-is and not with the latest and greatest fixes... so we will have to wait till some real home users get their hands on this... competing against reviewers only seems to bring on the endless debate of what-ifs and you-don't-knows and yada yada... we should have real world answers soon though.
-
john, but did you test that single card in your laptop?
-
single card?
just got beat at the dual card vantage mark.. asus w90 -
the 280...
-
That'll last about a month, then SLI 280's will take it back. I'm gonna put the nail at 11,800. -
i was talking about a single gtx 280m card in the np9262. if you have the card, why don't you try and find out if it works?
-
Single card should work.
-
so i think that's the last gpu upgrade for the np5797; the next gen of gpus is going to be mxm 3.0
-
yup, seems they are dead, and the switch to mxm 3.0 will be just like the switch to DTV
-
i'm inclined to agree...
-
... but you don't want to ....
the 4850's are MXM 2..... you could try those but they are equal or lesser than your 9800m gtx so there would be no point -
nahhhh ati cards s*ck.
-
ATI Mobility HD 4850 is an MXM-III (2.x) module.... with ~50-70W TDP
MXM-II has a 25W TDP cap.... so you won't find high-end (256-bit) video cards using this form factor. -
i think mr. moo was talking about the mxm version of the hd4850; that card is a Type III module but on the 2.1 spec
-
it gets confusing now since we have to distinguish between the numeric versions and the roman numeral types. -
i figured y'all knew i was talking about the cards for the m17 -
I've finally managed to put a GTX 280m in my M860tu, which came with a 9800m GTS by default. It was hard, as I had to remove the back plate of the 9800m GTS to place it on the back of the GTX 280m. Lots of thanks to H-emmanuel for helping me out here.
Previously I stated that it runs quite hot, and yes it does; however, after removing the back panel and peeling off the black sticker surrounding the GPU core, my temps were fine. Now I'm running @ 550/1300/900. That will change soon though, once I figure out how to configure my card's BIOS ROM via NiBiTor.
Many of you might be wondering why I would underclock the core and shader. It's because I'm using setFSB for that extra 200MHz, which inadvertently overclocks my GPU as well. As you can see from the picture below, my processor is a tad pathetic compared to everyone else's. Since I did not want to overvolt my card, seeing as it runs quite hot (the climate in my country isn't that cool, it's hot and humid all year round), I had to do this, as the GTX 280m isn't that kind to overclocking on stock voltage.
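To make the underclock-then-raise-the-bus idea concrete, here's a rough Python sketch. It assumes the GPU clocks scale linearly with the reference clock that setFSB raises (which is the behaviour described above); the P8400's 266MHz stock FSB, the 8.5x multiplier and the reference GTX 280m stock clocks are values I'm plugging in for illustration only.
[code]
# Rough sketch: why the card was set below stock in the BIOS.
# Assumption: GPU clocks scale linearly with the reference clock
# that setFSB raises, as described in the post above.
CPU_MULTIPLIER = 8.5        # P8400 multiplier
STOCK_FSB = 266.0           # MHz (1066 MT/s bus)
CPU_GAIN = 200.0            # extra CPU MHz gained through setFSB

new_fsb = STOCK_FSB + CPU_GAIN / CPU_MULTIPLIER   # ~289.5 MHz
scale = new_fsb / STOCK_FSB                       # ~1.09x

bios_clocks = {"core": 550, "shader": 1300, "memory": 900}   # set in the BIOS
stock_280m  = {"core": 585, "shader": 1463, "memory": 950}   # reference stock clocks

for name, mhz in bios_clocks.items():
    effective = mhz * scale
    print(f"{name:>6}: set {mhz} MHz -> effective ~{effective:.0f} MHz "
          f"(stock {stock_280m[name]} MHz)")
[/code]
With those numbers the effective core and memory clocks land a touch above stock while the shader stays just under it, which is the general idea: set the card low enough in the BIOS that the bus overclock doesn't push it past what stock voltage can handle.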
Perhaps the 3DMark06 score doesn't seem that good (5.4k GPU score in Vantage), but after going through some games, I know that it outperforms the 9800m GTS. I'll post more screenshots of benchmarks later.
Idle Temps after leaving it on for a night (was downloading Fedora 11 with Thunder 5)
I ran ATItool's artifact scanner for 2 hours and my temps were hovering around 77C, which is good. I opened CPUID a little late though; that's why the minimum is at 72C.
Here's more, back with DMC 4 DX 10 @ 1680*1050 Super High without MSAA.
Crysis benchmark, DX 10, Very High on all settings, 50 runs, max temp: 87C
And here's Crysis Warhead DX 10 Enthusiast settings Framebuffer Benchmark [UPDATED:Ambush, Avalanche, Frost]
The Last Remnant is a really hard game. Hard in the sense that you don't get to specifically choose what to do in a battle, and the game rubs it in your face sometimes.
@1680*1050
3d Mark Vantage. This benchmark is seriously unforgiving even in the performance preset.
GTA 4 Benchmark
Statistics
Average FPS: 32.47
Duration: 37.39 sec
CPU Usage: 97%
System memory usage: 81%
Video memory usage: 86%
Graphics Settings
Video Mode: 1680 x 1050 (60 Hz)
Texture Quality: High
Texture Filter Quality: Highest
View Distance: 40
Detail Distance: 45
The other settings were all on High, except Water, which was on Very High. Shadow scroll was on 6 while vehicle density was on 30. I guess it's a bit too much for the P8400 to handle. -
With the amount of downclocking you performed on the core and shader, it did not really affect things too much. I have also installed the GTX 280M on my Alienware M15X, and with my T9500 CPU at 2.6GHz and stock settings on my 280M I am only getting a little over 10,700 in 3DMark2006. I hope this helps. God Bless -
Yeah I got the same thing stock with the P9600. 10,664 with 185.85. With a little bit of overclocking and a CPU swap, I got 3000 more points, heh.
Do you think the 260M would have been more comfortable in your NP8660? -
ok... so where are you guys getting these new avatars?
-
dondadah88 Notebook Nobel Laureate
lmao, i'm saying the same thing. but i posted for a new one in the nbr lounge (the name should be changed lol). hopefully aan310 is still on.
-
The card's overclocked (after raising the FSB, the core and shader are operating above their default clocks), so I guess the temps are fine for me as long as it doesn't exceed 95C.
P.S.: Currently I'm using the 185.85 driver. Is this driver any good for the 200 series? -
I think 185.85 is the only driver out that officially supports the 200M series. However, I've had better experiences with Dox's 185.85 than the standard 185.85.
-
no, the m17x with the 280 runs 171 i think, i'm not sure, check it out
-
I'll get to posting DMC 4 benchmarks and stuff; as of right now, I'm looping 3DMark06 50 times (excluding the CPU tests) as a stability check. I've got GPU-Z logging in the background to check the min/max/avg temps.
I think I'll change my card's voltage soon. Does anyone know how to get NiBiTor 5.0 to recognize the GTX 280m? -
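For anyone who wants to do the same min/max/avg check, here's a minimal Python sketch that pulls the temperature column out of a GPU-Z sensor log. The log file name and the way the temperature column is found are assumptions on my part; GPU-Z versions label the columns slightly differently, so adjust it to match your own log.
[code]
# Minimal sketch: min/max/avg GPU temperature from a GPU-Z sensor log.
# The file name and the "Temperature" column match are assumptions.
import csv

LOG_FILE = "GPU-Z Sensor Log.txt"   # assumed default log file name

temps = []
with open(LOG_FILE, newline="", encoding="latin-1") as f:
    reader = csv.DictReader(f, skipinitialspace=True)
    # Pick the first column whose header mentions "Temperature".
    fields = reader.fieldnames or []
    temp_col = next((c for c in fields if "Temperature" in c), None)
    for row in reader:
        try:
            temps.append(float(row[temp_col].strip()))
        except (KeyError, TypeError, ValueError, AttributeError):
            continue  # skip blank or malformed entries

if temps:
    print(f"min {min(temps):.0f}C  max {max(temps):.0f}C  "
          f"avg {sum(temps) / len(temps):.1f}C over {len(temps)} samples")
else:
    print("No temperature samples found - check the column headers.")
[/code]
-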
Looping 3DMark2006 is not really a good stability test; try ATITool and use the built-in stability test, that is the best way. You will see that it scans for artifacts and heats up your GPU. Again, I hope this helps. God Bless -
Nah, the best solution is to run FurMark and ATITool in parallel.
-
Well, I did loop every test except the CPU ones 50 times continuously (via the options); it didn't crash on me and the test went on for 5+ hours, so that counts for something, right? The highest temp I got after reading through the GPU-Z log is 81C, which I think is satisfactory.
Anyway, I will try out ATITool and FurMark soon.
@Slouby
Here's a link to the BIOS; I'm not a registered user so this is the best I can do.
http://uploading.com/files/RQKSC7FO/060A.bin.html -
If you have one of those cards, it would be great if you could help the community by posting your BIOS here:
http://www.mvktech.net/component/option,com_joomlaboard/Itemid,34/func,view/id,54482/catid,13/
Thanks
-
I've updated my post on page 59 to include DMC 4 DX 10 benchmarks as well as a 2-hour run of ATItool along with the temps and a simple benchmark of GTA 4. My temps aren't that high anymore because of a little thing I did when I reopened my laptop's back panel: I removed that black sticker surrounding the GPU core, which I guess was keeping the heatsink from making proper contact with the GPU.
Now inclusive of Crysis Warhead benchmarks -
dondadah88 Notebook Nobel Laureate
yeah, dmc4 and the first fear will bring the heat out of your gpus. i know that for a fact. the third scene is the worst, as it stresses out your stream processors with all that fire. run dmc4 some more and see what you get.
-
After all the stability tests with ATITool, 3DMark06 and the game benchmarks, I guess I've found my stable OC'd clocks for the GTX 280m in the NP8660 without overvolting it.
In fact, I compared some of the benchmarks I posted with the ones anothergeek posted. I don't think I'm missing out on much performance-wise, am I? Let's not talk about my CPU.