Some advice please..
I will be buying a Clevo X72000 primarily for gaming, I'm not really interested in OCing (for fear of destroying the thing) or running 3 different benchmarks at the same time. Will I really need the extra 'daisy chained' PSU for maxing out the latest games with a 990X and SLI 580M?
The double-PSU issue does put me off a bit, as it really seems like a cop-out on Clevo's part, or an effort to shift a warehouse full of 300W PSUs. Has there been any mention/rumour of a future 350W or 400W PSU for the X7200?
Thanks for any help.
Am I the only one seeing 14%-39% better performance than the 6970M?
Or noticing that, because you have more power-efficient cores, you can OC the 580M too?
Larry@LPC-Digital Company Representative
Probably not worth the extra $200; glad I jumped on the price Mythlogic was offering.
That HDD is ruining the screenshot, babyhemi.
Do you really believe those performance statistics were done in a controlled environment, using game titles that don't favor Nvidia or AMD, and are in no way just a marketing tactic?
Until I see some clear-cut benchmarks from a reseller I don't buy into the propaganda. It is faster than the 6970M, no doubt, but not by 30%.
Tests provided by Nvidia, both single card and 580M SLI:
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-580M.56636.0.html -
What's the consensus here? I am very confused, and it's hard keeping up with these screenshots flying left and right.
Can someone please explain in words what performance gain (%) we see on the 580M vs. the 485M?
I need to do a cost/performance analysis here...
So far I am getting the vibe that the 580M isn't worth it at all, especially since the 485M is discounted now!
Please help....
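For what it's worth, the cost/performance comparison being asked for here is just two ratios. A minimal sketch in Python, with made-up placeholder prices and frame rates (the real numbers depend on your reseller and the game):

```python
# All figures below are placeholders, not measurements or real quotes.

def percent_gain(new_fps, old_fps):
    """Percentage improvement of the new card over the old one."""
    return (new_fps - old_fps) / old_fps * 100

def price_per_fps(price, fps):
    """Dollars paid per frame of performance."""
    return price / fps

# Hypothetical example: 485M at $450 averaging 52 FPS,
# 580M at $650 averaging 58 FPS in the same title.
print(f"gain: {percent_gain(58, 52):.1f}%")          # ~11.5%
print(f"485M: ${price_per_fps(450, 52):.2f}/FPS")    # ~$8.65/FPS
print(f"580M: ${price_per_fps(650, 58):.2f}/FPS")    # ~$11.21/FPS
```

Swap in your reseller's actual quotes and your own benchmark numbers; if the dollars-per-FPS figure climbs much faster than the FPS does, the upgrade is hard to justify.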
Yes, I know... what about performance gains? Is it worth it???
Speedy Gonzalez Xtreme Notebook Speeder!
lol
Electric Shock Notebook Evangelist
I would estimate the 580M's performance as just under a single desktop GTX 470 or HD 5870. Ubersampling will bring both of those to their knees as well. -
Aside from the obvious 'is it worth the extra money in relation to actual gaming performance' question, I'm interested in the power consumption. Certainly in the P150HM, which doesn't have graphics switching, it will be very important how much more efficient it is when compared to the GTX 485M.
Or, in other words, how much more runtime you can get in various situations.
I don't think we'll know for sure whether this is an overclocked GTX 485M or a completely new chip until someone attempts to flash the GTX 580M vBIOS onto a GTX 485M.
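On the runtime question: once someone measures the actual idle/load draw, the back-of-the-envelope estimate is just battery capacity over average draw. A sketch with assumed numbers (the 78 Wh pack and both wattages are illustrative guesses, not measurements of the P150HM):

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Rough battery runtime: capacity (Wh) divided by average system draw (W)."""
    return battery_wh / avg_draw_w

# Assumed figures: a 78 Wh pack, and a card that idles 5 W higher
# than its predecessor (both wattages are guesses, not measurements).
print(f"{runtime_hours(78, 35):.2f} h")  # ~2.23 h
print(f"{runtime_hours(78, 40):.2f} h")  # ~1.95 h
```

Even a few watts of extra idle draw costs a noticeable chunk of runtime on a machine without graphics switching, which is why the measured idle numbers matter more than the TDP here.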
Due to the price reduction of the 485M, I really don't think the 580M is worth it unless more accurate benches are posted.
That being said, I would also like to know the improvement (or lack thereof) in power consumption for battery life, and its heat/temperature numbers.
Larry@LPC-Digital Company Representative
Lost Planet 2 Benchmark:
Settings:
Resolution: 1920x1080
Display mode: Full screen
Refresh rate: 60.00Hz
Vertical Sync: off
Anti-Aliasing: MSAA4X
Motion Blur: On
Shadow Details: High
Texture Detail: High
Rendering Level: High
DirectX 11 Features: High
Test A:
Rank: B
Average: 30.9
Scene 1: 31.3
Scene 2: 27.3
Scene 3: 34.6
Test B:
Rank: C
Average: 26.9
Scene 1: 26.9
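As a quick sanity check, the per-scene figures for Test A above can be averaged in one line; the simple per-scene mean (~31.07) comes out slightly above the reported Average of 30.9, so the benchmark's own "Average" is presumably frame- or time-weighted rather than a plain mean:

```python
from statistics import mean

# Test A per-scene FPS from the run above
test_a_scenes = [31.3, 27.3, 34.6]
print(round(mean(test_a_scenes), 2))  # 31.07 (reported Average: 30.9)
```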
In Test B I got 42.4 FPS. I imagine an overclocked Nvidia card could hit 60 in this test.
I don't know how long until the 485M is phased out, but if you want one at a nice discount, buy soon before they're gone; after that you will HAVE to pay big bucks for the top-of-the-line Nvidia card. My guess is the stock will be gone within the next 2-3 months.
When will I see a single GPU that, at stock speeds, does over 15000 GPU score in Vantage?
Who designed that? The aliens?
You pay 1500 euros for the cards alone to play at 10 frames per second.
With that money you can buy an Xbox and a PS3 and 20 games for each console and play until you spill your guts out, and you'd still have money left over.
I just don't see how 580M SLI is worth it from THIS point of view.
Either way, I say neither of them is worth it until we have 28nm GPUs.
My opinion: right now the best bang for the buck is CrossFire 6970M. A little bit more expensive than the 460M and very, very powerful for that money.
PhysX and CUDA are marketing bull. OpenCL will clear this out.
1) Dell/Alienware deals with different PSU designers than Clevo
2) Designing a 300W adapter made someone willing to take it further and develop a 330W adapter, given that the M18x came after.
3) Clevo wasn't willing to pay the R&D costs to develop an adapter above 300W; Dell/Alienware, being much larger, was.
I'd say option 3 is most likely. -
Delta makes the 330W PSU. Delta makes PSUs for Gateway, MSI, etc.
^ Bytales' post (I am NOT about to quote and pick out text)
1 - The benchmark table for games is apparently using a driver we haven't even tested; I'd say the true power of that card won't come out until at least 4 months after its release, as with just about all Nvidia cards, because Nvidia doesn't seem to understand that a good driver set at launch is the best thing in the world. Still better than AMD, though.
2 - The Witcher series just might have issues marrying desktop/laptop technologies, or may simply be unoptimized for laptops in general. I've had problems with the first game and random FPS drops that would kill a benchmark on my 280M. I know the 280M is nothing compared to the GTX 400M/500M series, but still, it should not have such issues, and I'm more willing to blame The Witcher's coding than my card. The same thing happened with GTA 4: patch 1.4 = game runs swimmingly; patch 1.5 = inability to get above a fluctuating 15-30 FPS except by changing all settings to low. Hell, at all "medium" I get 10-25 FPS, and at high-highest (a mix) I get 20-30 FPS. Game's fault.
The card just came out today; I think it has more potential than is shown here.
3 - DX9 *CAN* do nice lighting/overall visuals, but models tend to be flat up close. That's its limit. DX10 is like Windows Vista. Fantastic, but heavily underrated and largely passed over. When it's properly implemented, it's often unoptimized, and not worth using *glares at BF:BC2 & Shattered Horizon*. That being said, the PC age seems to be moving into an era where people are optimizing their games once more, and not just taking advantage of the extra power over a console's hardware so that power is wasted and we need significantly more powerful machines to get slightly better visuals. Let's hope this trend continues, and we'll see more and more worth in these cards. Crysis 2 is not under any circumstances optimized. Another example is World of Warcraft, where DX11 gives people 30% better framerates than DX9.
As for paying for the 580M now, if I wanted to buy a laptop for $2400 with a 485M in it, and now I get it with the 580M, happiness abounds. But I agree, I wouldn't sell a 485M for a 580M. At least not with my broke self. Better buying a GPU than doing drugs though =D -
It looks like the 485M is going to be peddled off at fire-sale prices to maximize profit for its successor. Unless you really need the greatest and fastest card available right now, it would make more sense to pay less for the 485M and then use overclocking to reach stock 580M levels.
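To put a number on "overclocking to reach stock 580M levels": going by Nvidia's published reference core clocks (575 MHz for the GTX 485M, 620 MHz for the GTX 580M), the required bump is small. A quick check:

```python
def oc_needed_pct(target_mhz, stock_mhz):
    """Core overclock (%) needed to reach a target clock."""
    return (target_mhz / stock_mhz - 1) * 100

# Published reference clocks: GTX 485M core 575 MHz, GTX 580M core 620 MHz.
print(f"{oc_needed_pct(620, 575):.1f}%")  # ~7.8%
```

A roughly 8% core bump is modest, which fits the thread's suspicion that the 580M is largely a higher-clocked part; real-world parity also depends on shader and memory clocks, which this simple ratio ignores.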
Hm, strange...
In 3DMark 11, my 6970s (single, in P170HM/X7200) score over 31xx; the GTX 580M only 30xx?
In Vantage, the GPU score is about 1300 points higher...
So I am about to buy a new P170HM from Malibal, and I'm trying to decide if I should get the 485M and then buy an SSD from somewhere like Newegg, or go for the 580M. Is it really worth it?
The decrease you MIGHT see will be measured over years, many years, 10+ years. Go to Overclock.net and ask this SAME question and you will get the same answer. At the end of the day, if you want to keep your laptop for 5 years or LESS, then OC responsibly; if you plan on keeping it 15+ years, then overclock some more. This is all I will entertain on this topic.
A question for the experts out there: does it really matter that much that the 580M uses a different core from the 485M, even though the only perceptible difference is a higher-clocked core? (GF104 vs. GF116 for the 580M)
The GTX 550 Ti uses this core, I think. -
Anthony@MALIBAL Company Representative
Despite the lack of a die shrink, it really does appear that the only significant benefit of the core change is the ability to clock existing 485M cards much higher, stably over the long term, without much difference in power consumption and heat wear.
Now to see what my reseller will charge for the new cards compared to the 485M.
My single 6970M is running The Witcher 2 on Ultra, with Uber turned off. -
Also, it might just not like SLI/Crossfire. There are games funny like that.
Gtx 580m
Discussion in 'Sager and Clevo' started by materax, Jun 8, 2011.