The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to ensure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    6990M review at anandtech.

    Discussion in 'Gaming (Software and Graphics Cards)' started by Meaker@Sager, Jul 12, 2011.

  1. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Actually, we can't say anything; they are on par in performance, and those physics simulations are but a gimmick. AMD wins on bang for the buck, which for me is the deciding factor.
     
  2. NateN34@gmail.com

    [email protected] Notebook Consultant

    Reputations:
    20
    Messages:
    187
    Likes Received:
    0
    Trophy Points:
    30
    Although if you want 3D, a better monitor, and better drivers in your M17x, then you have to get the 580M.
     
  3. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Those arguments, since April 2011, are invalid ;)

    AMD has made an amazing breakthrough with drivers. They are full of optimizations and bug fixes, which pleases everyone. You can have 3D on AMD, you know? ;)

    So yeah, now it's a definite 6990M > GTX 580M.
     
  4. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    3D is still in its infant years and not fully developed yet. I wouldn't waste the extra money on a 3D screen and glasses unless the machine already came equipped with them.

    Seriously folks, we've been on 2D gaming since the dawn of man; it's not the end of the world if you don't have 3D. :eek:
     
  5. doombug90

    doombug90 Notebook Evangelist

    Reputations:
    15
    Messages:
    311
    Likes Received:
    1
    Trophy Points:
    31
    Besides, there are already glasses-free 3D laptops out there. I think there is a significant chance that stereoscopic 3D with glasses will be phased out in the years to come, thanks to the development of parallax-barrier technology. :D

    Either way, I'm still leaning toward the 6990M for my next purchase.
     
  6. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    I don't know. Glasses bring huge benefits that glasses-free displays can't. The major one is being able to show different content to each pair of glasses, and, depending on how capable the machine is, both in 3D.

    I will gladly take fullscreen co-op instead of split-screen co-op any day hahah. Additionally, watching completely different shows/movies would be possible too.
     
  7. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    Well, 3D glasses only work if you have a 3D screen, which does not come with the NP8150, so the 6990M is a no-brainer. However, for those who have the 3D screen and glasses, by all means go for the 580M; it's only $200 more.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    I think a lot of people just want it for the screen to run at 120hz.

    You can't beat a fast first-person shooter running on a monitor above 100Hz.
     
  9. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    Wait what? What is hz? Is it really that important for BF3?
     
  10. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    The number of times per second the display can refresh.

    So your graphics card may be rendering 200fps, but a 60Hz monitor will only show 60 of those frames. A 120Hz monitor would show 120 of them, and therefore smoother movement.
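    The arithmetic here can be sketched in a few lines (a minimal illustration; the function names are made up for this example):

```python
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    """A monitor shows at most one new frame per refresh, so the
    effective frame rate is capped by the refresh rate."""
    return min(render_fps, refresh_hz)

def frame_interval_ms(refresh_hz: float) -> float:
    """Time between two refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

# A GPU rendering 200fps on different panels:
print(displayed_fps(200, 60))            # 60  -- the panel caps it
print(displayed_fps(200, 120))           # 120 -- twice as many frames shown
print(round(frame_interval_ms(60), 2))   # 16.67 ms between refreshes
print(round(frame_interval_ms(120), 2))  # 8.33 ms between refreshes
```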
     
  11. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    Well, Nvidia's real 3D implementation is proprietary, so you will need to buy that. Unless you use their non-proprietary 3D version, which is basically what AMD does; then you can use whatever you want.

    I don't think the 580M has any advantage other than CUDA, and only in a handful of programs, which is not something I care about, so I will go for the cheapest solution.
     
  12. mangos47

    mangos47 Notebook Consultant

    Reputations:
    84
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
    A $299 upgrade for the 6990M vs. $599 for the 580M; I'm hoping Nvidia makes some move on their pricing strategy later.
     
  13. mangos47

    mangos47 Notebook Consultant

    Reputations:
    84
    Messages:
    194
    Likes Received:
    0
    Trophy Points:
    30
    Theoretically true, but I wonder how many people can actually see the difference.

    In the old CRT days, 85Hz was all that was required to totally get rid of flicker for human eyes, at least for all the people I know.
     
  14. terminus123

    terminus123 Notebook Deity

    Reputations:
    4
    Messages:
    766
    Likes Received:
    4
    Trophy Points:
    31
    120Hz screens show fast action more smoothly -- that's a fact. As for 3D, AMD's 3D driver support is good, but Nvidia's is more mature.

    But seriously...unless you're running Crossfire or SLI, it isn't really worth going 3D, since single GPUs just aren't that powerful. I would 100% rather max out a game with all graphics settings on ultra than run medium settings with 3D.
     
  15. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    There is no "max fps the eye can see" per se. It really depends on what's going on. Very fast action sequences will seem choppy and nonsensical at 24 frames per second. If you are watching something really slow, like a cloud moving, even 24fps will seem smooth as butter. That's why fast-paced games, typically FPS titles, are preferred to run at 60fps, and sometimes well in excess of that.

    Likewise, RTS games are often more than playable and smooth at 25 fps.

    It just depends on the type of game/video/thing you are watching.
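    The point about motion speed can be sketched numerically (illustrative numbers, not from any benchmark):

```python
def motion_step_px(speed_px_per_s: float, fps: float) -> float:
    """How far a moving object jumps between two consecutive frames.
    Big jumps read as choppy; small ones read as smooth."""
    return speed_px_per_s / fps

print(motion_step_px(30, 24))    # a drifting cloud: 1.25 px/frame, looks smooth
print(motion_step_px(3000, 24))  # a fast FPS camera pan: 125.0 px/frame, choppy
print(motion_step_px(3000, 60))  # the same pan at 60fps: 50.0 px/frame, far smoother
```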
     
  16. key001

    key001 Notebook Evangelist

    Reputations:
    776
    Messages:
    657
    Likes Received:
    7
    Trophy Points:
    31
    I hate when they don't show the graph from 0
     

    Attached Files:

  17. inuyasha555

    inuyasha555 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    0
    Trophy Points:
    5
    Woah, better than anything I have xD
     
  18. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    Isn't this where Vsync comes into play?
     
  19. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    Those cards are looking sweet if the slide holds true.
     
  20. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    Your avatar is so cute!:laugh:
     
  21. Tilt

    Tilt Notebook Consultant

    Reputations:
    19
    Messages:
    121
    Likes Received:
    0
    Trophy Points:
    30
    Yes and no.

    With vsync off you risk frame tearing, from what I understand. Your graphics card will be producing 300 frames per second, but since your monitor is limited to 60, you can end up with two frames being spliced together. For example, your character might be walking through the wilderness with a large mountain in the foreground. You notice some rustling in the bushes ahead and begin to strafe to the left. As you do so, your graphics card might splice together a couple of frames so that 70% of the screen makes sense, but now the top of the mountain is no longer in line with the bottom of the mountain. It might have been fixed by now, but I haven't come across anything saying so yet. :)
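    The splice described here can be mocked up with text "frames" (a toy sketch, not how a GPU actually scans out):

```python
def torn_image(frame_a, frame_b, tear_row):
    """With vsync off, the buffer can swap mid-scanout, so the visible
    image is frame A above the tear line and frame B below it."""
    return frame_a[:tear_row] + frame_b[tear_row:]

# A 4-row "frame" of a mountain; in frame B the camera has strafed left,
# shifting everything one column.
frame_a = ["...M..",
           "..MMM.",
           ".MMMMM",
           "MMMMMM"]
frame_b = [row[1:] + "." for row in frame_a]

torn = torn_image(frame_a, frame_b, 2)
# The top two rows show the old camera position, the bottom two the new
# one: the peak no longer lines up with the base. That's the tear.
for row in torn:
    print(row)
```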
     
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    Vsync introduces its own problems, like input lag, and you still only get 60 frames; it's still choppy in a fast FPS.

    There is still a very nice jump going from 60 to 120 fps.
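    The input-lag side can be sketched too: with vsync on, a frame that finishes just after a refresh sits in the buffer until the next one (a simplified model; real pipelines buffer more than this):

```python
def vsync_extra_lag_ms(frame_done_ms: float, refresh_hz: float) -> float:
    """With vsync on, a finished frame waits for the next refresh
    boundary before it is displayed; that wait is added input lag."""
    interval = 1000.0 / refresh_hz
    return (interval - frame_done_ms % interval) % interval

# A frame that finishes 17 ms after the last refresh tick:
print(round(vsync_extra_lag_ms(17, 60), 1))   # 16.3 ms extra wait at 60Hz
print(round(vsync_extra_lag_ms(17, 120), 1))  # 8.0 ms at 120Hz -- half the penalty
```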
     
  23. Partizan

    Partizan Notebook Deity

    Reputations:
    241
    Messages:
    1,697
    Likes Received:
    0
    Trophy Points:
    55
    Exactly, I play AC Brotherhood at 19fps and it seems fluid to me ;-)
     
  24. chiefalpha

    chiefalpha Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    It would seem that this card from ATI has actually beaten Nvidia's 580M. Some people on various forums, and most importantly people who resell Sager laptops, say that this card is on par with Nvidia's, and that there are some games the 580M beats the 6990M in, and vice versa.

    As a side note, :cool: having one friend from this forum post results with a 580M and another with a 6990M would be much appreciated and in order. It would help us settle this with certainty. :)

    This is a great thing, because it means that Nvidia is making gamers cough up extra cash just to get this performance with...wait for it.......wait for it....... WITH DRIVER SUPPORT! :p

    Now wait a minute, doesn't the 6990M already come with support from the laptop manufacturer that provided you with the card?

    PLUS, the new modder is coming out for ATI, known as the CAT Modder! Yessir, the CAT Modder is the new Mobility Modder, currently in beta testing, and it will prove to be well worth the wait. I modded my laptop with the previous Mobility Modder for my HD2600 and have to say I haven't experienced any problems at all.

    If this new modder comes out of beta testing and is released officially, I believe ATI would be the better choice over Nvidia on price and performance.

    I'm not a fanboy; I was really going to get the 485M until I saw this new card :D

    ALPHA
     
  25. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    It doesn't matter. Brand loyalty is a strong thing, and CUDA and other features are indeed going to persuade users to go Nvidia. You have to remember that things don't always go the way of the lowest price. I do think the 6990M is a superior card to the GTX 580M, but at the high end those kinds of things aren't that important.
     
  26. chiefalpha

    chiefalpha Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    So we are agreed that this card is the she-it? I always wanted an Nvidia card ever since I bought a laptop, but I was stuck with an ATI, and I must say that my card is on par with, if not better than, Nvidia's 8600M. :)

    When I was introduced to overclocking, I got these results with a game known as DMC4: before OC I averaged 32fps; with OC I averaged 42fps. This is with everything on high but AA off, at a max res of 1280x800. :D

    My card is the HD2600 Mobility @ 256 + 1,500 shared memory. OC was @ 750/600. ;)

    In addition, I read on multiple forums at the time that ATI cards have better OC potential than Nvidia cards, because they are generally underclocked.

    All the same, I think the 6990M is the best card currently, all things considered. :D

    ALPHA
     
  27. AlienTroll

    AlienTroll Notebook Evangelist

    Reputations:
    319
    Messages:
    598
    Likes Received:
    0
    Trophy Points:
    30
    So ATi has a similar thing to PhysX and 3D and CUDA?

    COOLIO :D :D :D
     
    Last edited by a moderator: May 8, 2015
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,917
    Trophy Points:
    931
    Well, AMD chips can support 3D tech, but really, who would want that?

    120Hz 2D is far, FAR better.

    As for PhysX (lol?) and CUDA -- the only thing I would use CUDA for is fast video encoding, which you can do on ATI chips too.....
     
  29. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    PhysX alternative: AMD gives away GPU physics tools | thinq_

    And AMD has their HD3D... the Envy 17 had a 3D version with the Radeon 5850, I believe.

    And OpenCL in general is a replacement for CUDA. CUDA ties you to Nvidia... AMD took the "good citizen" route and went with the platform-agnostic language, OpenCL. Nvidia can run it, too. The soft-body physics in 3DMark11 is done with DirectX compute shaders which are also platform-agnostic.

    PhysX, 3D Vision and CUDA are just Nvidia's way of showing you they threw a lot of money at a developer to get an exclusive feature. It's actually nothing AMD can't do technically, and possibly do better.
     
    Last edited by a moderator: May 8, 2015
  30. hotblack_desiato

    hotblack_desiato Notebook Consultant

    Reputations:
    34
    Messages:
    124
    Likes Received:
    0
    Trophy Points:
    30
    I hate proprietary stuff, which is why I hate Nvidia PhysX. Plus it generally does nothing for gameplay at all. So AMD for me!
     