Actually, we can't say anything definitive; they are both on par in performance, and those physics simulations are a gimmick. AMD wins on bang for the buck, which for me is what matters.
-
Karamazovmm Overthinking? Always!
-
[email protected] Notebook Consultant
Although if you want 3D, a better monitor, and better drivers in your M17x, then you have to get the 580M.
-
Those arguments have been invalid since April 2011.
AMD has made an amazing breakthrough with its drivers. They are full of optimizations and bug fixes, which pleases everyone. You can have 3D on AMD, you know?
So yeah, now it's definitely 6990M > GTX 580M. -
3D is still in its infancy and is not fully developed yet. I wouldn't waste the extra money on a 3D screen and glasses unless the machine already came equipped with them.
Seriously folks, we've been gaming in 2D since the dawn of man; it's not the end of the world if you don't have 3D.
-
Besides, there are already glasses-free 3D laptops out there. I think there is a significant chance that glasses-based stereoscopic 3D will be phased out in the coming years as parallax-barrier technology develops.
Either way, I'm still leaning toward the 6990M for my next purchase. -
I don't know. Glasses bring huge benefits that glasses-free displays can't. The major one is being able to show different content to each pair of glasses and, depending on how capable the machine is, both feeds in 3D.
I will gladly take full-screen co-op over split-screen co-op any day, haha. Additionally, two people could watch completely different shows/movies on the same screen. -
Well, 3D glasses and 3D only work if you have a 3D screen, which does not come with the NP8150, so the 6990M is a no-brainer. However, for those who do have the 3D screen and glasses, by all means go for the 580M; it's only $200 more.
-
Meaker@Sager Company Representative
I think a lot of people just want it so the screen can run at 120Hz.
You can't beat a fast first-person shooter running on a monitor above 100Hz. -
Wait, what? What is Hz? Is it really that important for BF3?
-
Meaker@Sager Company Representative
It's the number of times per second the display refreshes the image.
So your graphics card may be rendering 200fps, but a 60Hz monitor will only show 60 of them. A 120Hz monitor would show 120 of them, and therefore smoother movement.
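To put rough numbers on that, here is a tiny sketch of my own (illustrative only, not from any benchmark):

```python
# A panel refreshing at `refresh_hz` can display at most that many distinct
# frames per second, no matter how fast the GPU renders.

def frames_shown(gpu_fps: float, refresh_hz: float) -> float:
    """Upper bound on distinct frames the display can actually present each second."""
    return min(gpu_fps, refresh_hz)

for hz in (60, 120):
    print(f"GPU rendering 200fps on a {hz}Hz panel: at most {frames_shown(200, hz):.0f} frames shown")
```
-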
Well, Nvidia's own 3D implementation is proprietary, so you will need to buy into that. Unless you use a non-proprietary 3D setup, which is basically what AMD does; then you can use whatever you want.
I don't think the 580M has any advantage other than CUDA, and only in a handful of programs, which is not something I care about, so I will go for the cheapest solution. -
$299 upgrade for the 6990M vs. $599 upgrade for the 580M; I'm hoping Nvidia makes some move on its pricing strategy later.
-
Theoretically true, but I wonder how many people can actually see the difference.
In the old CRT days, 85Hz was all that was required to totally get rid of flickering for human eyes, at least for all the people I know. -
120Hz screens show fast action more smoothly; that's a fact. As for 3D, AMD's 3D driver support is good, but Nvidia's is more mature.
But seriously... unless you're running CrossFire or SLI, it isn't really worth going 3D, since single GPUs just aren't that powerful. I would 100% rather max out a game with all the graphics settings on ultra than run medium settings with 3D. -
There is no "max fps the eye can see" per se. It really depends on what's going on. Very fast action sequences will look choppy and incoherent at 24 frames per second, while something really slow, like a cloud moving, will seem smooth as butter even at 24fps. That's why fast-paced games, typically first-person shooters, are preferred to run at 60fps and sometimes well in excess of that.
Likewise, RTS games are often more than playable and smooth at 25fps.
It just depends on the type of game/video/content you are watching.
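For a concrete feel of why fast action needs the higher frame rates, here is a back-of-the-envelope sketch (my own numbers, purely illustrative):

```python
# The faster something moves across the screen, the bigger the visible jump
# between consecutive frames, which is why fast action feels choppy at low
# frame rates while slow scenes do not.

SCREEN_WIDTH_PX = 1920
CROSS_TIME_S = 0.5        # "fast action": an object crossing the whole screen in half a second

for fps in (24, 60, 120):
    px_per_frame = SCREEN_WIDTH_PX / (CROSS_TIME_S * fps)
    print(f"{fps:3d}fps: ~{px_per_frame:.0f}px jump per frame ({1000 / fps:.1f}ms per frame)")
```
-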
I hate it when they don't start the graph from 0.
-
Woah, better than anything I have xD
-
Isn't this where Vsync comes into play?
-
Megacharge Custom User Title
Those cards are looking sweet if the slide holds true.
-
Your avi is so cute!:laugh:
-
Yes and no.
With vsync off you risk frame tearing, from what I understand. Your graphics card might be producing 300 frames per second, but since your monitor is limited to 60, you can end up with two frames being spliced together on screen. For example, your character might be walking through the wilderness with a large mountain in the foreground. You notice some rustling in the bushes ahead and begin to strafe to the left. As you do so, the display might show parts of two different frames at once, so 70% of the screen makes sense, but the top of the mountain is no longer in line with the bottom of the mountain. It might have been fixed by now, but I haven't come across anything saying so yet.
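Here is a toy model of that (my own simplification, not how any real driver or GPU actually works), just to show where the tear lines land when the GPU swaps buffers mid-scanout:

```python
# With vsync off, every time the GPU finishes a frame while the panel is still
# drawing the previous one, the image splits at that point on screen.

REFRESH_HZ = 60
SCANOUT_S = 1.0 / REFRESH_HZ        # time the panel takes to draw one screen, top to bottom

def tear_positions(gpu_fps: float) -> list[float]:
    """Fractions of screen height (0 = top, 1 = bottom) where tears appear during
    one scanout, assuming the GPU swaps buffers at a perfectly steady rate."""
    frame_time = 1.0 / gpu_fps
    positions = []
    k = 1
    while k * frame_time < SCANOUT_S - 1e-12:   # every swap that lands inside this scanout
        positions.append(k * frame_time / SCANOUT_S)
        k += 1
    return positions

print([round(p, 2) for p in tear_positions(300)])   # 300fps on 60Hz: several tear lines per refresh
print(tear_positions(60))                           # matched rates: no mid-scanout swap in this model
```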
-
Meaker@Sager Company Representative
Vsync introduces its own problems, like input lag, and you still only get 60 frames, so it's still choppy in a fast FPS.
There is still a very nice jump going from 60 to 120 fps. -
Exactly; I play AC Brotherhood at 19fps and it seems fluid to me ;-)
-
It would seem that this card from ATI has actually beaten Nvidia's 580M. Some people on other forums, and most importantly people who resell Sager laptops, say that this card is on par with Nvidia's, and add that there are some games the 580M beats the 6990M in, and vice versa.
As a side note, let me just say that it would be much appreciated, and in order, if someone from this forum with a 580M and another with a 6990M posted their results in this topic. That would help us settle this for certain.
This is a great thing, because it means that, for any gamer out there, Nvidia is making people cough up extra cash just to get this performance with... wait for it... wait for it... WITH DRIVER SUPPORT!
Now wait a minute, doesn't the 6990M already have support from the laptop manufacturer that provided you with the card?
PLUS, a new modding tool for ATI is coming out, known as the CAT Modder! Yes sir, the CAT Modder is the new Mobility Modder; it is currently in beta testing but will prove to be well worth the wait. I modded my laptop with the previous Mobility Modder for my HD2600 and have to say I haven't experienced any problems at all.
If this new modder comes out of beta testing and is released officially, I believe ATI would be the better choice over Nvidia for graphics cards, on both price and performance.
I'm not a fanboy; I was really going to get the 485M until I saw this new card.
ALPHA -
Karamazovmm Overthinking? Always!
It doesn't matter. Brand loyalty is a strong thing, plus CUDA and other features are indeed going to persuade users to go Nvidia. You have to remember that things don't always go the way of the lowest price. I do think the 6990M is a superior card to the GTX 580M, but at the high end those kinds of things aren't that important. -
So we are agreed that this card is the she-it? I have wanted an Nvidia card ever since I bought a laptop, but I was stuck with an ATI, and I must say that my card is on par with, if not better than, the 8600M by Nvidia.
When I was introduced to overclocking, I got these results with a game known as DMC4: before OC I got an average of 32fps; with OC, I got an average of 42fps. This was with everything on high but AA off, at a max resolution of 1280x800.
My card is the Mobility HD2600 @ 256MB dedicated + 1,500MB shared memory. The OC was @ 750/600.
In addition, I read on multiple forums at the time that ATI cards have better OC potential than Nvidia cards, because they are generally underclocked.
All the same, I think the 6990m is the best card currently with all things considered.
ALPHA -
So ATI has its own equivalents to PhysX, 3D, and CUDA?
COOLIO
-
Meaker@Sager Company Representative
Well, AMD chips can support 3D tech, but really, who would want that?
120Hz 2D is far, FAR better.
As for PhysX (lol?) and CUDA: the only thing I would use CUDA for is fast video encoding, which you can do on ATI chips too... -
PhysX alternative: AMD gives away GPU physics tools | thinq_
And AMD has their HD3D... the Envy 17 had a 3D version with the Radeon 5850, I believe.
And OpenCL in general is a replacement for CUDA. CUDA ties you to Nvidia... AMD took the "good citizen" route and went with the platform-agnostic language, OpenCL, which Nvidia hardware can run too. The soft-body physics in 3DMark11 is done with DirectX compute shaders, which are also platform-agnostic.
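To make the platform-agnostic point concrete, here is a minimal OpenCL vector-add sketch of my own, using the pyopencl and numpy packages (assumed to be installed, along with an OpenCL runtime); the same code runs on AMD and Nvidia GPUs alike:

```python
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

ctx = cl.create_some_context()          # picks whatever OpenCL device is available (AMD, Nvidia, CPU)
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel itself is plain OpenCL C, with no vendor-specific extensions.
program = cl.Program(ctx, """
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
print(np.allclose(result, a + b))       # True if the kernel ran correctly
```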
PhysX, 3D Vision, and CUDA are just Nvidia's way of showing you they threw a lot of money at a developer to get an exclusive feature. It's nothing AMD can't do technically, and possibly do better. -
hotblack_desiato Notebook Consultant
I hate proprietary stuff, which is why I hate Nvidia PhysX. Plus it generally does nothing for gameplay at all. So AMD for me!
