Some advice, please...
I will be buying a Clevo X7200 primarily for gaming. I'm not really interested in OCing (for fear of destroying the thing) or running three different benchmarks at the same time. Will I really need the extra 'daisy-chained' PSU for maxing out the latest games with a 990X and SLI 580Ms?
The double-PSU issue does put me off a bit, as it really seems like a cop-out on Clevo's part, or an effort to shift a warehouse full of 300W PSUs. Has there been any mention/rumour of a future release of a 350W or 400W PSU for the X7200?
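For what it's worth, here is the back-of-the-envelope power math that worries me, sketched in Python (the TDPs are the published figures as I understand them; the 'rest of system' draw is purely my guess):

# Rough full-load power budget for an X7200 with a 990X + 580M SLI -- all approximate
cpu_tdp = 130            # Core i7-990X rated TDP, watts
gpu_tdp = 100            # GTX 580M rated TDP per card, watts
rest_of_system = 60      # screen, drives, fans, board -- my rough guess

total = cpu_tdp + 2 * gpu_tdp + rest_of_system
print(total)             # 390 -- comfortably past a single 300W brick at full load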
Thanks for any help.
-
Am I the only one seeing 14%-39% better performance than the 6970M?
Or that, because you have more power-efficient cores, you can OC the 580M too? -
Larry@LPC-Digital Company Representative
Yes, if you are a hard-core gamer or extreme bencher, you may need the extra PSU. If not, it should be fine. Many, many are doing fine with one PSU...
-
Probably not worth the extra $200; glad I jumped on the price Mythlogic was offering.
-
The reason they did that is that they couldn't find a PSU designer willing to exceed 300W.
-
Larry@LPC-Digital Company Representative
-
That HDD is ruining the screenshot babyhemi
-
Do you really believe those performance statistics were done in a controlled environment, using game titles that don't favor Nvidia or AMD, and are in no way just a marketing tactic?
Until I see some clear-cut benchmarks from a reseller, I don't buy into the propaganda. It is faster than the 6970M, no doubt, but not by 30%. -
Tests provided by Nvidia. Both single card and 580M SLI
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-580M.56636.0.html -
What's the consensus here? I am very confused, and it's hard keeping up left and right with these screenshots.
Can someone please explain in words what performance gain % we see on the 580M vs. the 485M?
Need to do a cost/performance analysis here...
So far I am getting the vibe that the 580M isn't worth it at all, especially since the 485M is discounted now!
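To be concrete, this is the kind of math I mean; a toy sketch where the prices and the ~10% gain are placeholders pulled from this thread, not real quotes:

# Toy cost/performance comparison, 485M vs. 580M (all numbers are placeholders)
price_485, price_580 = 400.0, 600.0    # hypothetical card prices in dollars
perf_485, perf_580 = 1.00, 1.10        # 485M normalized to 1.0; assume ~10% gain

for name, price, perf in [("485M", price_485, perf_485),
                          ("580M", price_580, perf_580)]:
    print(f"{name}: {perf / price * 1000:.2f} performance per $1000")
# 485M: 2.50 performance per $1000
# 580M: 1.83 performance per $1000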
Please help.... -
Larry@LPC-Digital Company Representative
The 485M has gone down in price; the 580M now costs what the 485M did yesterday. You are paying the SAME price for a 580M as you would have for a 485M yesterday.
-
Yes, I know... what about performance gains? Is it worth it???
-
What's the deal with Witcher 2?
-
Larry@LPC-Digital Company Representative
I guess it would depend on what your favorite game is. According to the chart below, if you play Lost Planet 2 or similar games, there could be great gains. Of course, more will come out later on the performance of the card.
-
I don't know. I've never played it, but I have heard that it is one of the most demanding games out there right now. And that is at 1080p with high details. But 11 FPS, lol? Maybe someone else knows.
-
Because everyone should rely on a chart where they picked their competition's drivers.
There's a bottleneck of some sort, because SLI only gives a 1 FPS jump. I don't pay enough attention to benches to know if that's a sign of memory bandwidth issues or what. -
Larry@LPC-Digital Company Representative
By no means rely on it, but it is some sort of guide... more results from non-biased sources will be out soon.
-
Larry@LPC-Digital Company Representative
-
Speedy Gonzalez Xtreme Notebook Speeder!
I get 75 FPS on high settings with two 580s, but on Ultra only half of that, and if I try 3D on Ultra I get like 15 FPS, sad.
lol
-
Electric Shock Notebook Evangelist
Ultra Detail probably turns on Ubersampling in The Witcher 2, which renders the scene multiple times over with more and more detail. I'd recommend desktop GTX 480 SLI for that, lol.
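To put rough numbers on why that hurts (the sample count is my guess at what Ultra's ubersampling does, purely for illustration):

# Supersampling cost sketch: the scene gets shaded multiple times per pixel
width, height = 1920, 1080
samples = 4                          # assumed ubersampling factor -- illustrative only

base_shades = width * height         # ~2.07M pixel shades per frame normally
uber_shades = base_shades * samples  # ~8.3M with ubersampling on
print(uber_shades / base_shades)     # 4.0 -- i.e. roughly a quarter of the framerate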
I would estimate the 580M's performance as just under a single desktop GTX 470 or HD 5870. Ubersampling will bring both of those to their knees as well. -
Aside from the obvious 'is it worth the extra money in relation to actual gaming performance' question, I'm interested in the power consumption. Certainly in the P150HM, which doesn't have graphics switching, it will be very important how much more efficient it is compared to the GTX 485M.
Or, in other words, how much more runtime you can get in various situations.
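(For anyone wanting to translate efficiency into runtime, it's just battery capacity over average draw; the numbers below are made up purely for illustration:)

# Toy runtime estimate: capacity (Wh) divided by average system draw (W)
battery_wh = 76.0                    # assumed battery capacity -- placeholder
draw_485m, draw_580m = 45.0, 40.0    # assumed light-load draws, watts -- placeholders

for name, draw in [("485M", draw_485m), ("580M", draw_580m)]:
    print(f"{name}: {battery_wh / draw:.1f} h")
# 485M: 1.7 h
# 580M: 1.9 h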
I don't think we'll know for sure whether this is an overclocked GTX 485M or a completely new chip until someone attempts to flash the GTX 580M vBIOS onto a GTX 485M. -
That's also done with the Verde 270.61 drivers, guys; we're up to 275.33 WHQL, with 275.50 beta (I think that's the right beta number). Those have been stated to give better performance with the GTX 500M series cards, so that's most likely an understated performance report. Of course, since you'd need a modded .inf to install the new drivers, I don't know who can test it right now.
-
Thanks for that. Was having trouble finding a straight answer to that one.
-
In that case I think it's worth it. Sort of like the mid-year refresh that Intel does: a little bit more for the same price. Not bad, but not a reason to upgrade either. See it as a bonus for those who want to buy a laptop now.
-
Not really a bonus. Why pay half again the price of the 485M for a 10% gain, which in gaming will be negligible?
Due to the price reduction of the 485M, I really don't think the 580M is worth it unless more accurate benches are posted.
That being said, I would also like to know about the improvement (or lack thereof) in power consumption for battery life, and its heat/temperature numbers. -
Larry@LPC-Digital Company Representative
Lost Planet 2 Benchmark:
Settings:
Resolution: 1920x1080
Display mode: Full screen
Refresh rate: 60.00Hz
Vertical Sync: off
Anti-Aliasing: MSAA4X
Motion Blur: On
Shadow Details: High
Texture Detail: High
Rendering Level: High
DirectX 11 Features: High
Test A:
Rank: B
Average: 30.9
Scene 1: 31.3
Scene 2: 27.3
Scene 3: 34.6
Test B:
Rank: C
Average: 26.9
Scene 1: 26.9
-
You do realize that the 485M is being phased out?
-
I wish you had overclocked GTX 580M SLI to test. In Test A my overclocked 6970s get 59.3, 50.41, and 73.4, with a Rank A 61.1 FPS average.
In Test B I got 42.4 FPS. I imagine overclocked Nvidia could hit 60 in this test. -
I don't know how long until the 485M is phased out, but if you want one at a nice discount, buy soon before they're gone; after that you will HAVE to pay big bucks for the top-of-the-line NVIDIA card. My guess would be the next 2-3 months until stock is gone.
-
Haha, my 460M SLI is still faster than a single 580M.
When will I see a single GPU that, at stock speeds, does over 15000 pGPU in Vantage?
Hahaha, looks like I'm seeing it now. Stock pGPU Vantage score is 16240? Is this confirmed? I'm seeing a pGPU Vantage score of a little over 13000; is that correct?
The Alienware M18x comes with a 330W PSU.
Who designed that?
The Aliens?
Lol, The Witcher 2 mops the floor with 580M SLI.
You pay 1500 euros for the cards alone to play at 10 frames per second.
With that money you can buy an Xbox and a PS3 and some 20 games for each console and play until you spill your guts out, and you still have money left over.
I just don't see how 580M SLI is worth it from THIS point of view.
Perhaps some future drivers will enhance the SLI performance by more than 100%. It has happened before with some titles.
I don't mind, since I heard it's a crappy game. What I'm interested in is Diablo 3 performance; last I heard it will be DX9 and frames will be off the charts with 460M SLI, and I saw in Crysis 2 that DX9 CAN make nice graphics.
The price you would have paid for the 485M in the past was worth it, since there was nothing you could get from Nvidia for $200 less that would be only 10% slower. But now, when you can have the 485M for $200 less and only 10% slower, it's kinda hard to get the 580M, unless you already have the money you wanted to spend on the 485M.
Either way, I say neither of them is worth it until we have 28nm GPUs.
My opinion now: best bang for the buck is CrossFire 6970Ms. A little bit more expensive than the 460M and very, very powerful for that money.
PhysX and CUDA are marketing bull.
OpenCL will clear this out. -
Three scenarios:
1) Dell/Alienware deals with different PSU designers than Clevo.
2) Succeeding with a 300W adapter made someone willing to push further and develop a 330W adapter, given that the M18x came later.
3) Clevo wasn't willing to pay the R&D money to develop an adapter above 300W; Dell/Alienware, being much larger, was.
I'd say option 3 is most likely. -
Delta makes the 330W PSU. Delta makes PSUs for Gateway and MSI, etc...
-
^ Bytales' post (I am NOT about to quote and pick out text)
1 - The benchmark table for games is apparently using a driver we haven't even tested; I'd say the true power of that card won't come out until at least 4 months after its release, as with just about all nVidia cards, because nVidia appears not to understand that a good driver set at launch is the best thing in the world. Still better than AMD, though.
2 - The Witcher series just might have issues marrying desktop/laptop technologies, or may simply be unoptimized for laptops in general. I've had problems with the game and random FPS drops that would kill a benchmark (the first game) on my 280M. I know the 280M isn't anything compared to the GTX 400M/500M series, but still: it should not have such issues, and I'm more willing to blame The Witcher's coding than my card. The same thing happened with GTA 4. Patch 1.4 = game runs swimmingly. Patch 1.5 = inability to get above a fluctuating 15-30 FPS unless I change all settings to low. Hell, at all "medium" I get 10-25 FPS, and at all high-highest (a mix) I get 20-30 FPS. Game's fault.
The card just came out today; I think it has more potential than is shown here.
3 - DX9 *CAN* do nice lighting/overall visuals, but models tend to be flat up close; that's its limit. DX10 is like Windows Vista: fantastic, but heavily underrated and largely passed over. When it is implemented, it's often unoptimized and not worth using *glares at BF:BC2 & Shattered Horizon*. That being said, the PC age seems to be moving into an era where people are optimizing their games once more, not just taking advantage of the extra power over a console's hardware so that the power is wasted and we need significantly more powerful machines to get slightly better visuals. Let's hope this trend continues, and we'll see more and more worth in these cards. Crysis 2 is not under any circumstances optimized. Another example is World of Warcraft, where DX11 gives people 30% better framerates than DX9.
As for paying for the 580M now: if I wanted to buy a laptop for $2400 with a 485M in it, and now I get it with the 580M, happiness abounds. But I agree, I wouldn't sell a 485M for a 580M. At least not with my broke self. Better to buy a GPU than do drugs, though =D -
It looks like the 485M is going to be peddled off at fire-sale prices to maximize profit on its successor. Unless you really need the greatest and fastest card available right now, it would make more sense to pay less for the 485M and then use overclocking to reach stock 580M levels.
-
Hm, strange...
In 3DMark 11, my 6970s (single cards, in a P170HM / X7200) score over 31XX... the GTX 580M only 30XX?
In Vantage, the GPU score is about 1300 points higher... -
But doesn't that decrease the lifespan of the 485?
-
So I am about to buy a new P170HM from Malibal and am trying to decide if I should get the 485M and then buy an SSD from somewhere like Newegg, or go for the 580M. Is it really worth it?
-
Overclocking? NO. I have yet to see any degradation or other negative effects besides more heat and power consumption from overclocking, and I've been OC'ing for years now. If you know what's going on and do it responsibly, then NO.
-
I would say it's too early to tell if the 580M is really that much better than the 485M. I would give it a few weeks (2-3); then we will have a better idea of what the 580M is capable of.
-
Exactly; increased heat decreases a product's life...
-
LOL! Please don't start this. How many cases have you personally seen where responsible overclocking has led to decreased life (1-2 years)???
The decrease you MIGHT see will be measured over years, many years, 10+ years. Go to Overclock.net and ask this SAME question and you will get the same answer. At the end of the day, if you want to keep your laptop for 5 years or less, then OC responsibly; if you plan on keeping it 15+ years, then overclock some more.
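If you want a feel for the scale, here is the common 'lifetime roughly halves per 10°C' rule of thumb sketched out (the baseline lifespan and the temperatures are made-up example values):

# Thermal-lifespan rule-of-thumb sketch: lifetime roughly halves per +10 C
base_life_years = 20.0        # assumed lifespan at stock temps -- illustrative
stock_temp, oc_temp = 75, 85  # core temps in C -- made-up example values

oc_life = base_life_years * 0.5 ** ((oc_temp - stock_temp) / 10)
print(oc_life)                # 10.0 -- still far longer than anyone keeps a laptop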
That is all I will entertain on this topic.
-
A question for the experts out there: does it really matter that much that the 580M uses a different core from the 485M, even though the only perceptible difference is a higher clock? (GF104 for the 485M vs. GF114 for the 580M)
-
Yep. The core is what dictates performance; the 580M has a newer core, hence better architecture.
The desktop GTX 560 Ti uses this core, I think. -
Anthony@MALIBAL Company Representative
A better core doesn't always equal better architecture. What it means, at least for Nvidia, is that it's the newer technology. That usually means lower power consumption and a smaller process, not necessarily increased performance (as noted going from the 460M to the 560M: it moved to a newer, more power-efficient process that allowed them to clock it higher, which is what makes the speed difference). -
If indeed it does, then YES, it would mean that the OC ability, power consumption, and heat produced would be a different animal, and possibly better in all respects. The reason for core changes such as this one is to produce a more efficient product. I personally would rather have what NVIDIA considers a superior core (580M) than the older model (485M), but right now I don't know for sure which is which, or if it's in fact different at all, so until further details/user reviews come out you can mostly consider this a guessing game.
-
Using 460M SLI I get 25-35 FPS on medium settings. That game is very, very demanding. But it's also the prettiest game I have seen.
-
That also tells a good bit of truth. All my knowledge is from the desktop side of things, but it goes hand in hand for the most part.
-
Despite the lack of a die shrink, it really does appear that the only significant benefit of the core change is the ability to clock what is essentially 485M silicon much higher, and stably over the long term, without too much difference in power consumption and heat wear.
Now to see what my reseller will charge for the new cards compared to the 485M. -
Are you sure SLI is even working?
My single 6970M is running The Witcher 2 on Ultra, with Uber turned off. -
I think The Witcher series has problems with mobile nVidia cards? I mean really, 10 FPS with SLI 580Ms?
Also, it might just not like SLI/CrossFire. There are games that are funny like that.