Nvidia is offering their slower GPU for $300 more in a single config, or $600 more in SLI. Did Nvidia get owned this time?
/discuss
-
Graph is somewhat misleading. The difference will probably be fairly minimal between these 2 cards. ATi/AMD have always been better bang for buck, especially in the laptop world. Doesn't surprise me much tbh, Nvidia flagship cards for laptops always carry a stupid price tag.
-
The graph has already been posted in another 580m GTX thread. Note that in the footnotes it is mentioned that ATI used an equivalent PC GPU for the 580m GTX results.
-
Worth noting: the graph does not go down to 0%. It is cut off at 60%.
-
Also, it should be assumed that the 580M is the 100% baseline. It would be better if that were indicated.
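Just to illustrate how much that matters, here is a purely made-up sketch (the 115% figure is a placeholder, not a measured result):

```python
# Illustrative only: assumed numbers (580M = 100% baseline, 6990M = 115%)
# to show how a truncated y-axis exaggerates the visual gap.
import matplotlib.pyplot as plt

cards = ["GTX 580M", "HD 6990M"]
relative_perf = [100, 115]  # percent of the assumed 580M baseline

fig, (full_axis, chopped_axis) = plt.subplots(1, 2, figsize=(8, 4))

full_axis.bar(cards, relative_perf)
full_axis.set_ylim(0, 120)            # axis from 0%: the bars differ by ~15%
full_axis.set_title("Axis starts at 0%")

chopped_axis.bar(cards, relative_perf)
chopped_axis.set_ylim(60, 120)        # chopped at 60%: the 6990M bar now looks ~40% taller
chopped_axis.set_title("Axis starts at 60%")

plt.tight_layout()
plt.show()
```

Same numbers both times; only the axis changes.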
-
AMD already put out a request for websites to pull that graph, because the desktop GPU equivalent card still performed better than the stock 6990M. So it's more like similar or slightly better performance but at a much lower price, still a good deal.
-
I believe the graph is misleading on some points. But still, the 6990M is going to be better than the 580M; I don't doubt that. Maybe not by 20%, but by at least 10%, and at a cheaper price point. Why would you pick the 580M?
-
Maybe because some people needed a laptop 3 weeks ago, and 3 weeks ago only the 580M was available. -
-
Besides, are you sure the 6990M was already available in Alienware 3 weeks ago?
Note: by available I mean the GPU is already in the builder's hands, ready to build and ship in 3 days. -
Yeah little_one. I know what you mean, and if this were 3 weeks ago I would have bought the 580M instead. I mean right now, though.
-
1. PhysX in Batman AA and Mafia II
2. Proven overclocking ability (percentage gains worked out in the sketch below this list):
3DMark Vantage GPU score 16221 / 3DMark 11 GPU score 3733 at 0.87 V (stock voltage)
3DMark Vantage GPU score 17578 / 3DMark 11 GPU score 4035 at 0.92 V
see here http://forum.notebookreview.com/sager-clevo/595787-extreme-gtx580m-overclocking.html
needs to be compared to 6990M overclocking though
3. 3D on the Clevo model -
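For reference, a tiny sketch that turns those two runs into percentage gains (nothing beyond the scores quoted above is assumed):

```python
# Percentage gains between the 0.87 V (stock) and 0.92 V runs quoted above.
vantage_stock, vantage_oc = 16221, 17578   # 3DMark Vantage GPU scores
mark11_stock, mark11_oc = 3733, 4035       # 3DMark 11 GPU scores

print(f"Vantage:   +{(vantage_oc / vantage_stock - 1) * 100:.1f}%")   # ~+8.4%
print(f"3DMark 11: +{(mark11_oc / mark11_stock - 1) * 100:.1f}%")     # ~+8.1%
```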
Of course, software has to be specially tailored to take advantage of these features (Batman Arkham Asylum, Adobe CS5).
Generally speaking, the AMD 6990M offers a better value.
Cheers.
-
In a Clevo machine, 6990M price = 560M + $170. Great deal!
-
2. I hope the 6990M overclocks well too.
3. Wish it had 3D on the M17x, but I'll just hook it up to a 3D TV anyway. -
Is there any indication Nvidia will come out with a 590m? Or are the thermals already too high?
-
The 580M has maxed out Nvidia's options. Nothing beyond the GTX 560 Ti is feasible as a mobile GPU.
-
Heh, Nvidia has worked their way into a bit of a corner, huh.
-
Just as you won't see a 6995M card either. They've both maxed out.
28nm for the next gen, next year. -
-
I can't wait to get my 6990m. I'll be posting lots of benchies, temps, and fps in my games.
-
I'm curious how beneficial it would be to upgrade from the 6970M to the 6990M, or to just overclock.
I love my system right now, however once the 6990M was released just a couple weeks after I got my laptop shipped to me (had to wait 2 months for the screen to arrive to Xotic), I began to wonder what kind of upgrade options were available to me.
Decisions! -
I bet these Clevo / Sager's will accept the new cards. -
The 6990M is about 15-20% better than the 6970M. It's not worth the time or hassle to upgrade, however, because if you think about it, 15% is only 46 fps on the 6990M vs 40 fps on the 6970M.
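A quick sanity check of that math (the 40 fps figure is just the example baseline above, not a benchmark):

```python
# Hypothetical baseline of 40 fps on the 6970M, scaled by the ~15% figure quoted above.
fps_6970m = 40
gain = 0.15
fps_6990m = fps_6970m * (1 + gain)
print(f"6970M: {fps_6970m} fps -> 6990M: {fps_6990m:.0f} fps")  # 40 fps -> 46 fps
```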
-
Those of you who have the M17x R3 with 580M + Optimus: have you ever encountered the Optimus-related problem where a game detects only the integrated GPU and is unable to switch to the discrete one?
-
There have been a lot of posts lately about whether it will be possible to use Nvidia drivers for the M18x with 580M SLI instead of sticking with the Dell drivers.
Does anyone know if M18x machines with 6990M CrossFire would be subject to the same restriction? -
-
What frequencies are you running your 6970M at?
-
That is correct, fgocards. But I have compared my Vantage runs to others with a 6990M (the guy in the M15x forums with a 6990M and a beaver as his picture, lol). My OC'd 6970M GPU score was higher than his stock score. I know they are not the same cards.
Edit: @Walzii, I OC my card to 830/1150 when I game and bench. -
Here is widezu69's vantage run post.
Here is my vantage run. -
-
I mean, you had to overclock your GPU by 22% on the core and 27% on the memory to get a 600-point advantage over a stock HD 6990M.
He overclocked his HD 6990M by 8% on the core and 11% on the memory and overcame your OC'd HD 6970M score by 1k.
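A rough sketch of where those percentages come from. The stock clocks below are my assumptions (about 680/900 MHz core/memory for the 6970M and 715/900 MHz for the 6990M), so adjust if yours differ:

```python
# Assumed stock clocks; only the 830/1150 overclock is taken from the earlier post.
def oc_percent(stock_mhz, oc_mhz):
    return (oc_mhz / stock_mhz - 1) * 100

print(f"6970M core:   +{oc_percent(680, 830):.0f}%")    # ~22%
print(f"6970M memory: +{oc_percent(900, 1150):.0f}%")   # ~28%

# A 6990M pushed roughly +8% core / +11% memory would land around:
print(f"6990M core:   ~{715 * 1.08:.0f} MHz")
print(f"6990M memory: ~{900 * 1.11:.0f} MHz")
```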
Of course, if you have an HD 6970M GPU or an Nvidia 485M, it is not worth upgrading unless you want the absolute best, because the price difference for the performance is not that great. -
That was the only point I was making. I was just telling fgocards that it is not worth the upgrade when the 6970 is already so powerful.
-
6970m and 6990m are both fantastic cards. Now back on track, 580m vs 6990m...
Which one has more overclocking ability? nVidia has the tendency to run hotter in the first place, so I feel that you have more OC headroom with the 6990m. Yes or no? -
-
If it helps, PhysX isn't a huge boost and is only supported in about 3 games. The 6990M is a good bit more powerful too, which should make up the difference.
-
But only Batman AA (and I never cared for it, since walking past paper to make it ruffle is a 50/50 chance of working, and the cape animations aren't really realistic even with PhysX on; this is from my i7 2600K/GTX 580 desktop)
and one other game I don't recall right now actually matter for GPU PhysX, as the rest are purely bad or are just paid-off tech demos that claim to be games.
Lots of games use CPU-based PhysX, including quite a few console games. -
I don't see how anyone can say the 6990m is a "good deal more powerful". If it's even 10%, I'll be legitimately surprised.
And plenty of games utilize PhysX; what are you talking about? If cloth needs to be rendered, PhysX is probably there. Cloth is pretty common, because I'm sure parents aren't fans of nudity.
Anyway, there's really no reason to get the 580m with pricing the way it is. -
GeForce.com - Get the Most Out of Your GPU
and click on "physx" on the left hand side.
Which of those games have you actually played?
For me:
Batman AA: makes the cloak different, but generally still unrealistic. (I'm told it makes the cloak different; I've always played this game with GPU PhysX set to advanced, so I think it looks like crap without ever having seen the other version, and I'd assume I'm not missing much.) Also interesting to note: the console versions didn't use PhysX.
Mafia II: No difference. Maybe in a mission there was a cutscene or a glass breaking differently? I dunno.
Mirror's Edge: PhysX only made the game slower in the ONE area that uses it: glass breaking. And this is with a GTX 580 + i7 2600K. If this setup cannot handle this effect (which, by the way, also happens in this game without PhysX), then I'd have little faith in this gimmick actually meaning anything, especially since it doesn't add anything to the game.
Metro 2033: I have no clue where PhysX was ever used in this game. Maybe it was on the allies' clothing (they have to be dressed or else they'd be nude), but then they are always cut off from you via silly game mechanics and storyline.
Cryostasis: I wouldn't have known this even had PhysX if I hadn't checked. Luckily, I got the game free from eVGA, so I didn't have to spend any money on it.
Supersonic Sled is a tech demo. Not even a fun tech demo, unless you are one of the people who find fun in chase-the-dot flash games.
Darkest of Days: If you call this even a half-decent FPS, you are not a PC gamer. I'd bet a majority of people, if forced or asked to play this game, will agree with me on this.
So in the 7 games I have played (some rather extensively) with GPU PhysX, I haven't found any good reason for them to have it. I've played the games on both my old i7 930/HD 5870 CF setup and, when that died, my i7 2600K/GTX 570 SLI setup (later stepped up one to a GTX 580 and sold the other). Just my opinion on GPU PhysX: it's rather useless in the games that have it, and in some cases terminally useless (Mirror's Edge and Cryostasis come to mind). -
I've seen a video of a desktop i7 2600K + 6870 running Alice 2 with high PhysX on the CPU. Will an i7 2720QM + 6990M manage it too?
If so, do I need to do some mods just to force the CPU to run PhysX?
I don't play a lot of those PhysX games, but Alice, for one, is what I'm intending to play, and I've seen the difference between the off, low and high PhysX settings; it adds some pretty cool touches to the gameplay and the cosmetics.
Alice Madness Returns: PhysX on CPU Test - ATI/AMD GPU PC 1080p - YouTube
There are other videos, too, of PhysX running on the CPU, but I want to know if a laptop (particularly a 2720QM) can do that as well? -
-
On the 28th of July my M17x should arrive. Specs: i7 2820, 8GB 1333MHz, 2x320GB RAID 0, 6990M. Someone mentioned that it would be better to have some single-GPU benchmarks. I will get the newest Catalyst and GPU drivers and run some tests, and I'll post them if no more benchmarks have come out by then. I plan on running PCMark Vantage, 3DMark 11, 3DMark Vantage, and hopefully Cinebench 11.5. Can anyone suggest other tests to run? Thanks.
-
They are similar in power, and one is better for some things and the other for others. Long story short, the 6990M and 580M are, for all intents and purposes, roughly equal.
Things may change as drivers mature, of course.
Truth be told, at the prices we are seeing at resellers, even if the 6990M were 5% slower in everything, it's STILL the overall better mobile GPU on price and performance/price alone.
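A back-of-the-envelope performance/price sketch of that point. The $300 single-card premium comes from the opening post; the notebook base price and the "5% slower" case are assumptions, just to illustrate:

```python
# Hypothetical numbers: only the ~$300 premium is taken from the thread.
base_price = 1800                 # assumed price of a 6990M-equipped notebook, in USD
price_6990m = base_price
price_580m = base_price + 300     # the 580M config runs roughly $300 more

perf_6990m = 0.95                 # pessimistic case: assume the 6990M is 5% slower
perf_580m = 1.00

print(f"6990M: {perf_6990m / price_6990m:.6f} perf per $")   # ~0.000528
print(f"580M:  {perf_580m / price_580m:.6f} perf per $")     # ~0.000476
```

Even in that pessimistic case the 6990M comes out roughly 10% ahead on performance per dollar.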