You have no idea how often that happens. I work for a major telecom company in the US, and we have at least one fiber cut a day because of idiots like that. Crackheads even cut the cable from towers thinking it's copper, when it clearly says FIBER OPTIC CABLE in bright orange.
-
hahahaha
-
lol the reason it's funny is that a single cut cable took out a whole country.
-
Oh wow. That has "EPIC FAIL" written all over it.
-
Hi folks! I'm thinking of getting a high-end gaming laptop, and right now I'm wondering if the 580m warrants the extra $200 over the 485m. How much of a performance increase will it give in terms of fps, 5-10 fps?
-
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-580M.56636.0.html
benchmarks are out.... huh, they seem to vary widely... some games are seeing improvements of about 15%, some are seeing almost none. -
And in BC2 the 485m actually performs better than the 580m, haha. But all in all we are seeing a 10%-15% increase, and this is with unpolished drivers, so I suppose we will see an even bigger jump in performance when new drivers are released. Thanks for the link!
-
as was said before: since the architecture and the GPU core are pretty much identical (the GF104 vs. GF114 nomenclature is just a PR thing, since it only reflects a slightly different transistor placement of the EXACT SAME chip to (very slightly) improve its performance-per-watt efficiency), it's highly doubtful that newer drivers will bring any performance boost to the 580m. But even if that's the case, both the 485m and the 580m will get the same boost, so my money is on the ~10% difference staying exactly the same.
anything else is just wishful thinking! sad but true, the 580m is just gonna give you 10% more, that's it! I definitely expected more myself, tbh...
cheers
PS: if anyones interested, heres a summary of the benchmarks listed on notebookcheck.com so far:
580M vs. 485M (positive values favor the 580M)
synthetic benchmarks (3DMark, Vantage, etc.):
ranging from -3% to +14%, averaging +8.0% across 15 individual benchmark values
game benchmarks (across all resolutions & detail settings where both single 485M and 580M were tested):
ranging from -9% to +33%, averaging +10.7% across 39 individual benchmark values
that should settle any further speculation about the performance differences
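For anyone who wants to redo the math, here's a minimal sketch of how summary stats like those above are derived from raw benchmark tables: take each benchmark's score pair, compute the percent delta, then report the range and average. The scores below are made-up placeholders, not notebookcheck's actual values.

```python
def percent_deltas(pairs):
    """Return the percent difference for each (score_580m, score_485m) pair.

    Positive values favor the 580M, matching the convention above.
    """
    return [(a - b) / b * 100 for a, b in pairs]

# placeholder (580M, 485M) benchmark scores, e.g. fps or 3DMark points
pairs = [(57.0, 50.0), (48.5, 50.0), (66.5, 50.0)]

deltas = percent_deltas(pairs)
low, high = min(deltas), max(deltas)
avg = sum(deltas) / len(deltas)
print(f"range: {low:+.0f}% to {high:+.0f}%, average: {avg:+.1f}%")
```

With real data you'd just swap in the 15 synthetic or 39 game benchmark pairs from notebookcheck's tables.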
-
@jaybee83 agreed and repped, the performance increase due to better drivers will apply to the 485m as well.
-
any updates? still deciding about the GTX 580m. So basically it's a 485m overclocked by ~9.3% at lower voltage... meaning lower temps?
-
What?
It's a card which works slightly better with the same power consumption and temperatures. There are no better or worse temps, just a 10% performance gain over the 485M. The above benches are proof of that. It is exactly what everyone should have expected, it is exactly what Nvidia delivered. OC doesn't mean anything in this case, because you can OC both cards the same way, meaning that the 10% performance gain will remain even in the case of overclocking.
Now the important question is... how will the 6990M fare when going head to head with the 580M? -
well put blacky! especially concerning the OC potential.
if I had to guess, I'd probably place the 6990M either on par with or slightly better than the 485M. Beating the 580M, which would require a performance boost of approx. 20% over the 6970M, is a bit unrealistic imho...
cheers -
I think you are wrong. I think the 6990 will definitely be better than the 485, because even now there are apps where the 6970 fares better than the 485. I think the 6990 will be faster across the board than the 485m, and the final battle of the 40nm GPUs will be between the 6990 and the 580.
Remember that, compared to 485 vs 580, the 6990 will have more shaders (or whatever they're called on Radeon devices) than the 6970. Even at the same clock, 6970 vs 6990 will show a noticeable improvement, or so I'd like to think.
We will see, but what will undoubtedly be the case is that, dollar for dollar, a 6990 crossfire will be way more performant than a 580 SLI. -
i think the 6990m may equal or outperform the 580m
see the PC ColaMax HD 6850 Green? the card doesn't need any power connector, just the mobo's 75W PCIe slot, but it runs at full HD 6850 speed!
ColaMax Radeon HD 6850 Green -- qk123
ati/amd is better than nvidia at performance per watt, which plays an important role in notebooks
-
If the 6990 turns out faster than the 580m, Nvidia will throw out their 585m to keep up the rank, and here we go again.... It seems like the ongoing competition between the red and green teams is what's driving the decision making, and that is why we see small increments in upcoming cards. The 485m would suffice until Kepler, but the 580m is a pre-response to the potential 6990, and the 585m will be another response if the 6990 beats the 580m.
-
Why would there not be ongoing competition?? You really expect anything else?
-
As long as they continue to be compatible with MXM IIIb in my Sager, I'll be a happy camper. Give me upgrade options in a year or so.
-
Nvidia has nothing left.
-
Interesting... I think you're right, and as a supporting point, Ichime's ES 6970M with 1120 shaders benches about 10-15% above the standard 6970M. If AMD ups the clocks a little, we may see a solid 20% gain and the card will be very close to the GTX 580M. Still, everything is just speculation, but there are some rumors that the 6990M will be officially announced in a few days...
-
i never said anything about nvidia having the better performance-per-watt ratio, that's not really the issue at hand here
and true, there are benchmarks where the 6970m outperforms the 485m, but all in all it shows approximately 21.1% lower performance vs. the 485M and 25.5% lower vs. the 580M (source: 51 and 35 gaming benchmark values on notebookcheck.com, respectively)! you're welcome to go over the numbers yourself, in case I made a mistake calculating the averages.
i highly doubt that this gap is gonna be easy to close, no matter the increase in shaders and performance-per-watt ratio. still, I would always be happy to be proven wrong, so that we could all enjoy a monster GPU at a reasonable price!
cheers -
you know what's weird on notebookcheck: the p150hm with a 485m and a 2920xm is always a few fps slower than the one with the 485m and the 2630qm. what's the deal with that? do they have different drivers, or is the 2920 sucking up so much power that it's slightly limiting the gpu?
-
I didn't say there shouldn't be competition. I just believe it should be more "responsible" competition, in a way where it is not all about whose cards are faster or whose cards were released first. Heck, Nvidia and ATI aren't getting enough of a chance to improve their drivers or implement better software utilization just because they're so busy trying to beat the hell out of each other.
-
Your numbers are horrible, especially the 21.1% slower than 485M claim.
Anandtech's testing.
The 485M and 6970M are virtually equal. -
Right 485m and 6970m are for all intents and purposes equal. 485m wins in some games/benches and 6970m in others. The 6990m will most likely beat the 580m in all instances too.
Now that the 6970m has been dropped, the 485m is your best price/performance. And I have a feeling that the 6970m was dropped now for two reasons. One is that the 580m came out and dropped the 485m's price by $200, making it close in price to the 6970m; plus the profit margin may be higher for the 485m than the 6970m. So, having comparable cards at comparable prices, they chose the one with the higher profit margin. -
hmmm, hadn't noticed that before. the 2920xm model was tested in January whereas the 2630qm model was tested in February. could very well be that they were using older / beta drivers the first time around? but then again, the differences are pretty minor at best, averaging around 2-3%. besides, the cpu doesn't play a major role in gaming, especially when it comes to Sandy Bridge. the gpu is the limiting factor looong before the cpu is maxed out! they even put up a test on exactly this subject quite recently: Sandy Bridge Processors in Gaming
there's barely any discernible difference between the quadcore models, and sometimes the 2920xm even shows the lowest fps values, but that's all within measurement tolerance.
these aren't "my" numbers, I'm just quoting notebookcheck's benchmark tests
cheers -
A bit off-topic, but interesting that the idle power consumption was pretty much identical across all CPU's. Good info there!
-
ive seen a lot of talk about the 6970m being phased out or dropped, but so far all the resellers ive checked are still offering the card! so....what info is that based on?
-
I meant Sager specifically. None of the Sager resellers offer it, because they order directly from Sager who dropped it. Some Clevo rebuilders may because they bought the cards themselves and still have some in stock.
-
@jaybee83 it's interesting that the cpu is moving along faster than the gpu, especially considering that Intel basically has no competition in the mobile market. it was also interesting that the XMG A501 pulled more than 120 watts with both the 2630qm and 2720qm, yet it only had a GT 540m which has a max TDP of 35 watts. furthermore, how could it be pulling that much when the power supply was only 90 watts? I wonder if it would have been different with a GTX 485m or 580m.
-
You're mistaken in this regard. Most games *DO* make little use of the CPU, it's true, but these people used the wrong games to test. If they tested with games like Metro 2033 and GTA 4 you'd see a huge difference. Most games that recommend i7 CPUs don't actually need them; the i3 CPUs are fine for those purposes as well because they provide effective quadcores. Even though it's only hyperthreading, it *does* give an effective boost when programs require CPU power. Black Ops showed a real difference because that game is so CPU-hungry it will grab whatever it can; it uses ~80% of my i7-950 just to load a map, even. There are more CPU-intensive games I could name, but I don't remember them off-hand hehe. What I do do is use PlayClaw and its CPU/GPU overlays when gaming, so I can see when my CPU is being eaten up, and I know there are some CPU-hungry games out there. But with the majority of games coming out for Xbox 360/PS3 with shoddy PC ports, it's not like they're going to require *that* much power.. I mean, the graphics are acceptable on 6-year-old hardware. You gotta think =3.
That wall of text being said, with BF3, Bethesda-based games and Valve games all being PC-first, if PC gaming gets a solid refresh within the year (as seen by MW3 devs suddenly rushing the idea that they should, in fact, add dedicated server support for PC MW3, something they weren't planning to do in the first place), then REAL CPU-intensive games may very well see the light of day soon =). -
any idea if the 580m will support 3D?
-
it does in the m17x. oh wait no it doesn't
-
main reason I haven't pulled the trigger yet. -
It does support 3D, but it has not been implemented in the drivers.
-
soon enough it will be approved. if you look on Malibal's site you can order the 3D screen with the 580m, and then eventually it will get approved, but there's also a chance that it won't.
-
their page says you need the 485m for 3D
-
yes but you can still order it that way
-
naturally, some games are more CPU-demanding than others. but still, if you compare the Black Ops results, there's just a 23% difference between the top-model i7 quadcore and the "lowly" i3! on the other hand, compare a low-range gpu of the current architecture (e.g. desktop Radeon 6670) with its high-end counterpart of the same generation (e.g. desktop 580 or 6970) and you'll pretty much triple or even quadruple your framerates in games! that's not going to be possible with a cpu upgrade, even when you OC to 5 GHz with liquid nitrogen
(the same goes for mobile hardware as well, of course)
cheers -
I didn't say it does that mid-game, I just said it was CPU-hungry. When loading maps, though, it gobbles up CPU power like crazy. As for the rest of my post, there will be other games that want high CPU usage; it's simply a matter of when they'll come out. Plus, there are other things people do while gaming; for example, PlayClaw uses my CPU to record. I give it all 8 threads, it uses what it needs, and I'm happy. I don't need to be doing video editing just to need a fast CPU =3.
I will agree that super overclocking to like 4 GHz and stuff is unnecessary though. At least for now. -
So, talking about minute performance enhancements: which would garner the most gain in performance?
Intel i7 2630QM 2.0-2.9GHz--> i7-2720QM, 2.2-3.3GHz
or
GTX 485m-->GTX 580m -
In games? 485m->580m, of course.
-
well, depends what you're planning to do with that hardware. if it's gaming, then definitely the 580m; if it's cpu-intensive tasks like file compression / encryption, video converting (in case you're not using CUDA, of course), etc., then I'd go for the stronger cpu. in terms of raw performance, both pairs have a gap of roughly 10% separating them.
cheers -
Alrighty.. thanks. About to get my first notebook! and this decision has been confounding me for so long!
I think I may wait for some hands-on reviews of the GTX 580m--see how well it overclocks, its temps, etc. -
Larry@LPC-Digital Company Representative
In case you would like to see a bios shot of the Sager NP7282 - Clevo x7200 R2 with the 580M's in SLI....
-
Not really, only if you're willing to send that machine my way free of charge!
-
i just want that bios and ec!!!!!!!!!!
-
Has anyone tried an M18x with GTX 580m SLI? I think it's the only notebook out there with both a Sandy Bridge CPU and the option of an SLI configuration for the 580m, so it'd be interesting to see how it performs. So far as I'm aware, the NP7282 is only available with 1st gen i7s at the moment.
-
They don't have a 580m SLI m18x right now except on the config page to order.
-
First gen desktop i7s are still as fast as 2nd gen mobile Sandy Bridge.
-
on that note, I found that while playing BFBC2 at fully maxed settings my CPU usage was 30%, so honestly it won't make a difference, and I only have the 2630qm.
Gtx 580m
Discussion in 'Sager and Clevo' started by materax, Jun 8, 2011.