The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Is the GTX 560m good enough or should I go for the GTX 580m?

    Discussion in 'Gaming (Software and Graphics Cards)' started by EvilEvil, Oct 21, 2011.

  1. EvilEvil

    EvilEvil Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I've been thinking of picking up a gaming laptop and can't decide if the GTX 560m is more than enough for games or if I should go all out for the GTX 580m. What do you guys think?
     
  2. Baka

    Baka (・ω・)

    Reputations:
    2,228
    Messages:
    2,111
    Likes Received:
    20
    Trophy Points:
    56
    It depends on what your standards are. If you want to game on the highest possible settings and want it to last 2-3 years at high settings, the 580M is the way to go. The 560M would get you up to medium-ish in 2-3 years, though it can run most games on high or near max right now.
     
  3. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    If you have the option, go for a 6990M over the 580M. While the 580M is *slightly* more powerful, the difference is trivial to the point of just burning money; the only other difference is that you get PhysX and CUDA.
     
  4. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    I have lots of experience with this GPU. It will have a hard time playing at 1080p with the IQ levels turned up in the newer, more graphically intensive triple-A titles. I had to drop The Witcher 2 down to 1600x1024 and do a bunch of tweaks to get it running smoothly, and Crysis 2 DX11 with the ultra texture pack is playable at 900p on high, but still slows down in some places.

    I don't have any of these issues with my current computer with a 6990M. Both games run at 1920x1080... The Witcher 2 has everything on except ubersampling, and Crysis 2 runs at 1920x1080 on Ultra.

    I'd go with the 580M for serious gaming... a good compromise is the 6990M. You can save cash and put it towards other future upgrades.
     
  5. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The 6990M is just as fast, and faster in some cases. Is there a reason you'd skip over it for the outrageously priced 580M?
     
  6. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Well, I've always loved Nvidia more than ATI for a billion reasons: first, driver support and generally "better" performance across new and old titles; an outstanding OpenGL engine; and PhysX and CUDA (PhysX games are just the cherry on the cake; just look at Alice, Batman AA and the upcoming AC).
    I know that most ATI users say PhysX is a gimmick, but this world is free and everyone can choose their own products.
    Since I'm clearly on Nvidia's side, I say go for the GTX 580M; you will not regret it. I also use 3D Vision a lot, and Nvidia's own 3D implementation is awesome; you have to see Crysis 2 in 3D at Ultra, it's just outstanding.
    I also had ATI with CrossFire (5870s), and I had nothing but problems and worse performance in games.
    Since swapping them for my current GTX 570, I've put a stop to the continuous problems and silly hotfixes that fix one thing and break a hundred thousand others every time.
    Just check the Guru3D AMD/ATI forum; sometimes it's embarrassing.
    Oh well.
     
  7. redrazor11

    redrazor11 Formerly waterwizard11

    Reputations:
    771
    Messages:
    1,309
    Likes Received:
    0
    Trophy Points:
    55
    This is the first post I have ever read from Baka where Baka did not self-reference Baka.
     
  8. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    My only beef with PhysX is that it's only used in one or two games per year.
     
  9. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    That's true, and I admire those devs who choose not to implement it, or who implement it as an option at the user's discretion, as happens with all PhysX games (excluding those that are "forced" to use it no matter what, such as SHIFT 1, for example).
    I really like PhysX when it's implemented the way it should be. I loved it in Batman 1 and fell in love with it in Alice. It really adds a new layer of depth to games; all eye-candy goodness.
    If you're like me, a lover of graphics-intensive games and eye candy, PhysX is without doubt a godsend.
    With Batman AC they seem to have raised the bar for PhysX effect quality. Judging by the short video posted on geforce.com, although the game runs the same without PhysX, it really misses a lot, too much for my taste.
    The dollars flying and scattering across the floor in the Catwoman robbery section, or the electricity sparks, fog, volumetric lights and so on, are among the finest PhysX implementations I've seen.
     
  10. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Not a godsend, since Havok and other physics engines can do anything PhysX can do without discriminating against gamers who chose not to pay more for less performance from Nvidia.
     
  11. Ari3sgr3gg0

    Ari3sgr3gg0 Notebook Consultant

    Reputations:
    18
    Messages:
    194
    Likes Received:
    3
    Trophy Points:
    31
    If you can afford the 580M or another powerful option, go for it. That way you don't have to upgrade again so soon and can still enjoy the power of a gaming laptop. Until not too long ago I had a laptop with an 8800M GTX still playing games like Battlefield: Bad Company 2 and such.
     
  12. Dakks

    Dakks Notebook Consultant

    Reputations:
    0
    Messages:
    190
    Likes Received:
    0
    Trophy Points:
    30
    Go for the strongest GPU you can afford; the 560M is comparable to the desktop GTS 450, if that means anything to you.
     
  13. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    There's a catch: Havok runs on the CPU, and although it's not heavy, it has to be implemented by the devs when they optimize their games to use more than one core. That forces developers to raise system requirements a bit.
    Just look at BF3 and BC2. They run fine on dual cores, but the bottleneck is insane since the engine is forced to use one core for general processing and physics simulation at once, with no option on the user's end to disable it.

    PhysX runs in hardware on CUDA cores; the driver simply interfaces it with the libraries the games use. This effectively shifts all the load onto the GPU without slowing down CPU processing, since the CPU only translates the PhysX data handled by the GPU.
    However, PhysX can also run on the CPU if the user wants, meaning it's scalable and entirely a matter of user preference.

    In the end I think PhysX is much more convenient than Havok from the developers' side: they can use whatever effects they want and let the user decide, because they're backed by the hardware and drivers themselves.
    ATI should really implement Havok on their stream processors, making it officially supported in hardware; at that point things will change.
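    To make the CPU-versus-GPU split concrete, here is a rough, purely illustrative CUDA sketch. It is not PhysX's actual API; the Particle struct, the stepOnGpu/stepOnCpu functions and the useGpu toggle are all made up for the example. It just shows the same toy physics step running either as a CUDA kernel (the way PhysX offloads effects work onto the GPU's CUDA cores) or as a plain CPU loop (the fallback mentioned above):

        // Illustrative only: a toy "physics step" that can run on the GPU or the CPU.
        #include <cstdio>
        #include <vector>
        #include <cuda_runtime.h>

        struct Particle { float x, y, z, vx, vy, vz; };

        // GPU path: each CUDA thread integrates one particle for one time step.
        __global__ void stepOnGpu(Particle* p, int n, float dt) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) {
                p[i].vy -= 9.81f * dt;   // simple gravity
                p[i].x  += p[i].vx * dt;
                p[i].y  += p[i].vy * dt;
                p[i].z  += p[i].vz * dt;
            }
        }

        // CPU path: the same integration done serially on the host.
        void stepOnCpu(std::vector<Particle>& p, float dt) {
            for (auto& q : p) {
                q.vy -= 9.81f * dt;
                q.x  += q.vx * dt;
                q.y  += q.vy * dt;
                q.z  += q.vz * dt;
            }
        }

        int main(int argc, char**) {
            const int n = 1 << 16;
            const float dt = 1.0f / 60.0f;
            std::vector<Particle> particles(n, Particle{0, 10, 0, 1, 0, 0});

            bool useGpu = argc < 2;   // stand-in for a "run physics on GPU" toggle
            if (useGpu) {
                Particle* d = nullptr;
                cudaMalloc(&d, n * sizeof(Particle));
                cudaMemcpy(d, particles.data(), n * sizeof(Particle), cudaMemcpyHostToDevice);
                stepOnGpu<<<(n + 255) / 256, 256>>>(d, n, dt);
                cudaMemcpy(particles.data(), d, n * sizeof(Particle), cudaMemcpyDeviceToHost);
                cudaFree(d);
            } else {
                stepOnCpu(particles, dt);
            }
            std::printf("particle 0 y after one step: %f\n", particles[0].y);
            return 0;
        }

    The point is only the shape of the design: the simulation code stays the same, and a single switch decides whether it runs on the GPU or the CPU, which is roughly what the user-facing PhysX GPU/CPU setting amounts to.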
     
  14. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    I just think Havok should rewrite their engine to use OpenCL. That way anyone could use GPU-accelerated physics without promoting a single company.