Not bad at all.
-
Settings used = fullscreen @ 1920 x 1200, no AA, settings to "Gamer"
If you ain't in the ice map, "Gamer" is playable. If you are in the ice map, change to "Mainstream" or you get unplayable FPS.
End of benchmark tool.
Good enough for you? Hehe -
Did this thread die or something?
-
Yah, well we'll get it up and running again!
For starters, I have been worried about some graphical glitches I have been getting in every game... they're very small, just little hair-sized blips, but I can see them. In Fear I sometimes get black stretched walls; in other games I get the little half-circle, hair-sized blips every once in a while. I've always had artifacting with my cards for the first week, and they break in afterwards... but I'm thinking this is a driver issue due to the new card? I'm using the newest WHQL Nvidia driver. I was using Dox, but Fear was acting up in-game, and I found out it was due to some stupid HID device. I dunno, should I be worried yet? I have a year warranty of course, and about 15 more days of return time with PcTorque. -
What was the best score for a single GTX 280M?
With the D900F I was able to push it to around 16,000 on 3DMark06 (default everything).
As for the Core i7.... I got it to a whopping 4.0GHz in the notebook.
... too bad it was not stable, so I brought it down to 3.8-3.9 GHz and it was fine. -
Larry@LPC-Digital Company Representative
-
I assume his i7 965 contributed to that ridiculous laptop score.
-
Well, on stock settings, with my P9700 I score about 11,000 .... it's ridiculous how CPU-dependent 3DMark06 is.
-
I did not write down all of the settings... but I reached 15k with 625/975/1650 (since I have a snap of that).
As I said in the review, 3DMark06's score is too affected by the CPU.
So compare the SM2.0 and HDR/SM3.0 scores:
SM2.0 = 6134
HDR/SM3.0 = 5860 -
Gophn, why don't you use Vantage? IMO it depicts performance differences much closer to real gaming performance.
-
Yeah, I've just checked your review and I've seen you did Vantage as well. Good review by the way. Just the way they should be done.
-
A number of people that gave me feedback actually like the lengthy style of review.
I might start a new type of review... the "Ultra-Detailed Review". -
I have an M17x with dual 280Ms. Can anyone give me the safest OC clocks? Thanks
-
Soviet Sunrise Notebook Prophet
Start at 600/1500/950 and slowly work your way up in increments of 10MHz. Keep the core and shader clocks linked for now and make sure you do not exceed 1000MHz. You can skim through this thread to help you out a bit more. Each card is different in quality so you will need to find out the limit of your card on your own.
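For what it's worth, here is a minimal sketch of that stepping routine, purely as an illustration. It assumes the usual 1:2.5 core/shader link (600 -> 1500), reads the 1000MHz ceiling as applying to the memory clock, and uses apply_clocks/stress_test as hypothetical stand-ins for whatever OC tool and stability test you actually use:

```python
# Sketch of the "work your way up in 10 MHz increments" approach described above.
# Assumptions (not from the original post): shader stays linked to core at 2.5x,
# the 1000 MHz ceiling is for the memory clock, and apply_clocks()/stress_test()
# are hypothetical callables wrapping your actual OC tool and stability check.

CORE_START, MEM_START = 600, 950
STEP = 10               # MHz per attempt
MEM_CEILING = 1000      # do not push memory past this
SHADER_RATIO = 2.5      # keep shader linked to core (600 -> 1500)

def find_stable_clocks(apply_clocks, stress_test, max_steps=20):
    """Walk the clocks up until a stress test fails, then return the last good set."""
    best = None
    for i in range(max_steps + 1):
        core = CORE_START + i * STEP
        mem = min(MEM_START + i * STEP, MEM_CEILING)
        shader = int(core * SHADER_RATIO)
        apply_clocks(core, shader, mem)      # set clocks via your tool of choice
        if not stress_test():                # e.g. a benchmark loop + artifact check
            break                            # first failure: stop climbing
        best = (core, shader, mem)
    return best
```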
-
I've played around with the clocks of my video card today.
Conclusion:
Memory is stable at 940 MHz.
Core can go way up; I've got it to 650 and stopped, with 1.05V.
But I think I will stick with 1V and do 620/1550/910 -> it runs much cooler at 1V.
And anyway, once you hit somewhere around 100C with the 620/1550 clocks, things go to hell and the card becomes very unstable. In real game performance it will rarely get close to 85C, but given that I have to work with this notebook for a couple more years from here on... I prefer to play on the safe side. -
-
I too would like to play it safe as I don't plan on buying another laptop within the next 2 years (although I may upgrade to the 380m in time) but would also like the best performance I can safely get out of this card as I'm quite the enthusiast gamer.
Edit: Ah, just noticed you have MXM 2.1, would that make much of a difference then? -
Argh - sorry for the double post, don't know why that happened.
-
Yep, MXM 3.0 is much better for overclocking and stability. You should probably be able to run 620/1550/980 at 1.05V with no problem and fully stable. With the voltage at 1V I can only run 600/1500/900 fully stable; you might be able to run higher than that. You should get a boost of about 8-9% with clocks set at 620/1550/980 over stock.
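As a quick sanity check on a percentage-over-stock estimate like that, here's a tiny sketch of the arithmetic. It assumes GTX 280M stock clocks of roughly 585/1463/950 (core/shader/memory), which is not stated in the post, and the realized fps gain in a given game depends on whether it is core/shader-bound or memory-bound:

```python
# Per-domain clock gain over (assumed) GTX 280M stock clocks of 585/1463/950.
# The overclock quoted above is 620/1550/980.
stock = {"core": 585, "shader": 1463, "memory": 950}
oc    = {"core": 620, "shader": 1550, "memory": 980}

for domain in stock:
    gain = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain:>7}: +{gain:.1f}%")   # core/shader roughly +6%, memory roughly +3%
```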
-
Yes, to get the 1.05V you need to flash the BIOS. But once you have settled on your permanent overclock you should flash the BIOS anyway, as it will make your card more stable and allow you to enjoy years of good gaming.
-
I've done it many times... and a lot of other people on this forum have done it as well.
Just know what you are doing before you start. Do some research, googling... etc. -
-
Regarding overclocking, I saw a big increase in performance in Saints Row 2 from a small GPU overclock on the GTX260M. Since the MXM 3.0B clocks are higher than the MXM 2.1 by default, I expect better performance when my laptop arrives from Kobalt. The GTX285M will give me about a 20% increase in performance over the GTX260M in gaming at its default clocks. -
Worst case scenario, I just ask them to change my order as they've yet to send it out and likely won't have by tomorrow morning regardless. The price is identical (actually a bit cheaper as they upgraded the HDD for free but I don't care enough to argue that) and as I only upgraded my RAM (hardly taxing work), I'm hopeful that they'd agree. -
If you decided to go with the standard GTX280M, I guess you could always use Nvidia Tools to up the clocks to the GTX285M's, but stability isn't guaranteed since it varies from chip to chip. -
I wouldn't mind having to wait an extra few days if it meant getting a 10% performance boost in some games, which Justin from Xoticpc showed the 285m was capable of over the 280m in Crysis in a different thread. -
-
Now to wait until the new ETA of Tuesday 16th - a little annoyed that it's taken 3 weeks instead of 2, but the added horsepower more than makes up for it. As for the MXM, that was the final thing that clinched the deal in favour of this unit instead of the ASUS - if the 380m provides an extra 30% boost or more over the 285m, I'd probably take the plunge in a year or so.
-
The benchmarks for the GTX 285M I saw are probably the same ones you have seen here. If you take a look at the table chart at the bottom you can see a 15% performance increase in COD. Regarding ATI cards, many say Nvidia gets better game stability due to more developer support as well as more frequent driver updates, although that is changing.
I heard that the Asus G73JH-A1 has a proprietary card format, which to me means I would never purchase it regardless.
Regarding the G860, you also have the possibility of unofficial upgrades. For example, your warranty may have run out so you decide to upgrade to an unsupported CPU/GPU to prolong the performance of the unit at your own risk.
An example is the previous model, the Nexus / NP8662: many upgraded to the QX9300 and/or the GTX280M, but the officially supported max were the Q9000 and the GTX260M. -
Also, the driver support was another reason I chose Nvidia over ATI - I currently have an x1700 and that card isn't even recognised on ATI's driver page and apparently doesn't exist. A bad experience made me wary of going for them again.
And you're right, that was the chart I had been looking at - still, only 5 more days and I get to benchmark it myself, I can hardly wait *crosses fingers for no more delays/problems* -
At some point, I might do benchmarks similar to the Core2Duo vs Core2Quad ones. -
Just tested Bioshock 2 at 600/1500/950..
Crashed during start-up..
but Fallout 3 is working fine.
Just want to confirm, the Sager 8690's is at 1.05V? I don't need to flash my BIOS for overclocking the GPU, right? I'm using the GTX 280M card.
Thanks... -
No, you don't have to flash, but it is better than having Nvidia System Tools do it every time you want to game. Also, there's a limit to how far you can overclock before you have to increase the voltage.
-
Soviet Sunrise Notebook Prophet
-
First, forgive me if this has been discussed, but the NBR search function does not pull up anything on how to overvolt the 280M.
OK, so I have this ovolted280m.rom file I downloaded from this thread. It looks compressed. Do I just burn an image to a CD and then let my notebook boot from the CD? If that's what I do, then what? -
Soviet Sunrise Notebook Prophet
Do not flash the ovolted280m.rom onto your notebook. I will say it again, do not flash the ovolted280m.rom onto your notebook. You have the new D900F, which bears the new MXM 3.0 GTX 280M. The BIOS you downloaded was from an MXM 2.1 GTX 280M from an M570TU. The D900F did not exist at the time that BIOS was put up for download. It is not compatible with your card. You need to extract your own card's BIOS, or the BIOS from another MXM 3.0 GTX 280M (preferably from another D900F), modify it yourself, and then flash it.
-
Got him before he did anything... he's in dons and scooks hands now...
and he is pushing up on 15k in 3DMark now.... go Q56!! -
I will be posting the detailed settings/res I finally settle on for each game once I'm done tweaking for the best performance, along with the average FPS I get, though.
Kobalt finally updated their order status page - the laptop is currently in the build phase and scheduled to arrive Tuesday (although they'll probably get a call Monday morning just to check that's the case and that I did indeed get the 285m - they've already assured me once on the phone that I would, but this all just seems too good to be true and, with the amount of stuff that's gone wrong over the past 2 months in attempting to order this from various other sellers, I've become a little paranoid...)
-
Well, despite the fact that I didn't even know the 285m was coming out, looks like things worked out well for me in the end, having only decided on a laptop in late January after 2 months of searching - awesome. -
Are the 280 GTX and 285 GTX identical except for the stock clocks?
I am running my 280 GTX at 600/1020/1500 with no problem at all. -
A few of us contributed to those benchmarks in that link I posted, it was very interesting to finally get a clear and comprehensive comparison of Dual vs Quad.
What will be interesting with the GTX 285M benchmarks is the performance at different resolutions. I would have done a direct comparison to your full HD screen, but the problem is that I went with the Core i7-820QM as opposed to the base Core i7-720QM, so it would make the results unreliable to an extent, even if most games are GPU-dependent.
At least you can still switch resolutions and see the different results. I estimate a minimum difference of 9-10 fps or more in some cases. Either way, you will be able to max out Batman (apart from Physx settings), Mass Effect 2 will max out and all other games will either play on high, max or a combination. You won't have any problems enjoying any game that I can think of regardless of whether they are Dual, single, tri or Quad Core optimised. GTA IV should run pretty well too.
Fine by me, they are very rigorous in their testing, where some other resellers just wouldn't bother before shipping. I bet you are glad you ordered from them now; if I remember correctly you got an upgraded HDD and a better GPU at no extra cost. Someone should let Kobalt know about all of these unofficial sales reps on these forums :laugh:
I remember you were planning to buy overseas before. -
I'm pretty happy having gone with Kobalt (although the laptop has yet to arrive so I shouldn't start singing their praises too much just yet) and finding out that I have a free upgraded GPU definitely made my day.
Ended up costing me £200 more to buy in the UK as opposed to overseas, but the European warranty, larger screen (originally I was going for the 15.3" model) and better GPU have all made it worthwhile - even if I've had to wait slightly longer than originally intended.
Can't wait to get my hands on this
Now to decide if I want to play Crysis at 1920x1080 with mainly high settings and shaders/shadows on medium or at 1600x900 with all on high (or CCC level 4)... -
Thanks man, I never did it! Thank goodness. -
What program do you use to overclock the GTX 280M? Thanks