Wait a week or two for Cayman to launch. Even if you don't want the AMD cards there's a good chance it will force Nvidia to lower the price some on the GTX 580.
-
-
Nvidia just released the 540m, a midrange card... but there's hope that other 5xxm cards will follow soon.
-
Back to the topic, if there is going to be a GTX 580M I strongly doubt it will be the full-blown Fermi; most likely it will be a renamed desktop GTX 460 part. -
That's still a Fermi...
-
-
I'm talking purely mobile to mobile. A 460M holds its own (trades blows) with a 5870, so I don't see AMD dominating. 80% of DX11 is again just a desktop score. Notebooks are more than 50% of PC sales these days, and that's what we focus on in this forum.
As for the type of GF104 please refer to this:
NVIDIA 400M: DX11 Top to Bottom Solutions Now Available - AnandTech :: Your Source for Hardware Analysis and News
You'll note that the 470M is based on a core that could potentially go to 384 cores (likely at a lower speed granted, but impressive nonetheless). Give that one more respin (Sep 10 to Jan 11 seems like enough time to me) and you have the makings of a 580M.
So I think a lot is coming. And I don't think a 6870 with 10% more ability than the 5870 is going to keep up. I don't think it will lose by much either and price will keep a 6870 "in the game" if you will. But that's what I'm calling a competitive situation, not one where either is dominating.
Joy is, no matter which you might prefer, both should be purchasable in a few months or less -
-
I agree with Phinagle. The fact of the matter is that the 470M is already pushing power limits, and the desktop chip it's based on performs below the desktop chip the 6900M cards will be based on, and knowing AMD it probably will scale and overclock better in notebooks too. This is especially the case if that TDP control function is enabled on them. The 6970M will take the lead unless NVidia implements some serious power-saving method that doesn't hurt performance.
-
6970M will drop a 15-20% deuce on the 470M.
I will bet on that statement. -
-
I agree with dropping-a-deuce... the only thing I wonder is whether AMD will "feel the need" to beat the 470M by that much. 10% will justify the marketing line of "fastest mobile GPU" and lower clocks have other economic advantages.
I think in 3 months NV's doom will be foreseen yet again.
6990 > 580 single
6900s > 460/560/570
6800M/6900Ms > 460M/470M
By ">" I mean overall a "better buy": superior in a balance of price/performance/power.
Separately, I'll be interested to see how much 470M parts drop in $$$ by... -
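Just to make the ">" comparison above concrete, here's a minimal sketch of scoring a card on a balance of price, performance, and power. Every number and weight below is hypothetical, purely to illustrate the idea, not a benchmark:

```python
# Rough "better buy" score: performance per weighted dollar and watt.
# All card figures and weights here are made-up illustrations.

def value_score(perf, price_usd, power_w, w_price=1.0, w_power=0.5):
    """Higher is better: relative perf divided by weighted price and power."""
    return perf / ((price_usd ** w_price) * (power_w ** w_power))

# Hypothetical mobile GPUs: (relative perf, street price $, TDP W)
cards = {
    "470M":  (1.00, 500, 75),
    "6970M": (1.18, 450, 75),
}

ranked = sorted(cards, key=lambda c: value_score(*cards[c]), reverse=True)
print(ranked)  # the "better buy" comes first
```

With these invented inputs the 6970M ranks first, but tweak the weights (e.g. weight power more heavily for a thin notebook) and the ordering can flip; that's the "competitive situation" being described.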
-
-
eBay? I meant retailers. It costs me more than $645 to get it from distributors.
-
-
Then that's a steal
-
but shipping was like $70 for the 470m on the ebay page, so not really a steal. tried to sell mine for $600 but no one wanted it, hahaha
-
-
the lack of official support for the card is what sucks right now; hopefully after the new year things will change.
-
Honestly Phin, at the perf levels we have today, I think you can't really go "wrong" with any of the high end chips.
So let's see how things go. Should be only a month or two till we have all the chips on the table, so to speak.
And
-
-
http://www.shrani.si/f/1y/Q/1WFleUNx/f56cf3e96fab7866177d0748.jpg
but the HD 6970 should be better in terms of performance/power
Radeon HD 6970 has 250W PowerTune Maximum Power
An underclocked hd6950 in a laptop is even a possibility due to the low 140W TDP in gaming power mode
AMD HD 6950 works at 800MHz -
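On the underclocked-6950 idea above: dynamic power scales roughly with frequency times voltage squared, so dropping clocks (and voltage with them) cuts power quickly. The clock and voltage figures below are illustrative assumptions, not AMD specs:

```python
# Rough dynamic-power scaling: P is approximately proportional to f * V^2.
# Base and target figures are illustrative assumptions, not AMD specs.

def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Estimate power after a clock/voltage change via P ~ f * V^2."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# e.g. a 6950-class chip at 140W gaming power, 800MHz/1.10V,
# underclocked to 550MHz/0.95V for a notebook:
p = scaled_power(140, 800, 550, 1.10, 0.95)
print(f"~{p:.0f}W")  # ~72W
```

That back-of-envelope math is why an underclocked desktop Cayman could plausibly land in mobile power territory, though static/leakage power means the real number would be somewhat higher.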
Looks like a LegitReviews.com graph... specifically this one from the review they did on the ASUS Ares 4GB LE card.
Seems odd to me that they wouldn't make up a chart that compares it to the HD6800 cards and GTX 580 since they have all those scores already.
NVIDIA GeForce GTX 580 GF110 Fermi Video Card Review - 3DMark Vantage - Legit Reviews
I'm going to lean towards faked until I see the results confirmed. -
Even then what he claims is 140W in 'gaming power' mode... that's AMD's metric for average power draw while playing the average game. Furmark is another thing and is like 200W+.
No manufacturer will be brave enough to put a 75W 'gaming power' card in without a throttling mechanism, unless they charge so much to offset the kiddies melting their lappys doing OCed Furmark+Prime runs to see how big their e-wang is.
In any case the high-end 6 series laptop GPUs will be from Barts (the 6800s), not Cayman (6900s). -
IMO it's called Fudzilla for a reason. "FUD is generally a strategic attempt to influence public perception by disseminating negative and dubious/false information designed to undermine the credibility of their beliefs"
Fudzilla is one of those tech sites where I cannot fathom why it still exists or why people even go there. -
Does anybody know where the information about the GTX 580M on Wikipedia comes from? Sure, 384 shaders would be awesome, although 336 seems more realistic. Then again it would make some sense: have the GTX 580M with 384 shaders, 100W, and ridiculous pricing simply to have the strongest GPU (and some people always want to buy the best), and have the GTX 570M with 336 shaders as the actual option most manufacturers will choose. In any case, both 384 and 336 shaders would be a huge jump from 288 shaders, and the architecture is almost the same... It's also bad that everybody is jumping onto the 1.5GB+ GPU-memory train, as it increases price unnecessarily.
-
-
SemiAccurate :: Nvidia to launch GF119 at CES
That's the latest info on the 580M. -
The biggest news from that is the GTX 580M being just 70W.
That definitively rules out an unlocked GF104. -
-
By "GPU only", are you saying "not including the memory and MXM module"?
Or am I misunderstanding? -
At least if they do, then you will be able to compare them with AMD's since their figures are just the GPU core.
-
If that's GPU core only, we're talking close to 85W, minimum.
-
A 580M in a laptop sounds like a good way to sterilize yourself.
-
-
Blog - What does TDP mean, Nvidia? | bit-tech.net -
Just a quick question (sorry for reviving an old thread): why have the GTX 480M options, single and SLI, been taken off the Kobalt site? Have they finally decided they didn't want their laptops to double as microwaves?
-
It's because 470M SLI is faster and cheaper. The 480M was a mistake.
Also, new GPUs are on the horizon. -
-
-
NVIDIA GeForce GT 540M refreshes mobile graphics midrange (update: hands-on pics) -- Engadget -
[edit] -
500M will be rebadges/respins of 400M GPU but none of them should be based on GF110. -
At best, you get a 480M v2. Since it's GF104 for mid-range desktop and notebooks, a maximum of 384 shaders paired with a 256-bit interface, at maybe 500MHz core clock. The real thing (GTX 560) is saddled up there at 820MHz. I want one of those, a P67, and an Intel 2500K.
-
-
Still no 580m yet, but Nvidia just announced the 485m as their fastest mobile card. See here: NVIDIA makes GeForce GT 500M family official, introduces GTX 485M as its fastest mobile GPU -- Engadget
-
Well, I was right, except they called the card the 485M instead of 580M. 575MHz is kicking. A full GF104 is an impressive GPU for a notebook. It will cost more than a GTX 580 though.
Maybe 15% faster than the 480M. SLI would be as fast as the GTX 580.
Ties in well with Sandy Bridge on the way. -
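A quick sanity check on that SLI claim: if a single 485M sits somewhere around 60% of a desktop GTX 580 and SLI scales typically in games, two of them land in GTX 580 territory. Both figures below are illustrative assumptions, not benchmark results:

```python
# Back-of-envelope for "485M SLI as fast as a GTX 580".
# Both inputs are hypothetical assumptions, not measured numbers.

single_485m = 0.60   # one 485M relative to a desktop GTX 580 (= 1.0)
sli_scaling = 0.85   # usable fraction of the second GPU in SLI

sli_485m = single_485m * (1 + sli_scaling)
print(f"485M SLI is roughly {sli_485m:.2f}x a desktop GTX 580")
```

The point is just that the claim is plausible arithmetic, not that these exact numbers are right; real SLI scaling varies a lot per game.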
Google Translate of Notebookcheck's 485M review
Monster.
Gtx 580m?
Discussion in 'Gaming (Software and Graphics Cards)' started by Maxiiboii, Nov 15, 2010.