The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Source? Haven't found anything.
     
  2. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    ^^ This is very similar to what I get... I'm speaking more about the smooth gameplay. I'm seeing 50+ fps on Ultra. It never drops below the mid-40s for me either.

    and I'll quote from another thread by NBR member Long:

    :)

    Bro, I wish there were some set benches. I would love to run them for you and everyone.

    The repeatable benches that we have done numerous times are:

    Vantage
    3dmark11
    Unigine Heaven

    The numbers are off the charts.

    All I can give you right now are gameplay impressions... and the card makes my laptop feel like a gaming desktop. That is all.

    I'll bet you a Starbucks coffee that the 680M comes in September, is a tad faster than the 7970M, and costs $300 more ;)
     
  3. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Yes, and I think Santa is real.
    Do you have any evidence for why you think so? I mean, other than that you like green stuff.
     
  4. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Heh, what source do you have? I already told you guys that a Chinese reseller is saying June at Computex.
     
  5. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Link?


    My sources are my eyes. I have seen Santa with my own eyes. Went to visit him in Finland. :D
     
  6. micahmatthew

    micahmatthew Notebook Deity

    Reputations:
    98
    Messages:
    766
    Likes Received:
    0
    Trophy Points:
    30
    Slickdude, how well do you think BF3 will run on a 1440p monitor on Ultra? You think still a constant 30-plus fps without AA (won't really need it at 1440p)? I want a 1440p or 1600p IPS or OLED 120 Hz screen next year, hehe. I'll drop to High settings if I have to. These screens will be much cheaper next year, and I just got a good job offer, so I plan on getting one.
     
  7. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    Yikes... 1440p? 2560 x 1440? I don't even want to take a stab at it. I think it would be very difficult for any laptop to maintain a constant 30 fps at that resolution. You may be able to do it if you drop the IQ level.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    You might want to look into a dual card machine for that sort of res.
     
  9. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Or play at lower details... I'm sure at medium-high details you will get your 40+ FPS easily.
     
  10. rorkas

    rorkas Notebook Consultant

    Reputations:
    118
    Messages:
    280
    Likes Received:
    0
    Trophy Points:
    30
    Assuming performance drops linearly with resolution, it may get 30 fps with a slight OC, but the minimum would probably dip to ~25 fps. That's all based on Slick's 50 fps average at 1080p.
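The linear-scaling estimate above works out as follows (a rough sketch that assumes frame rate is inversely proportional to pixel count, which real games only approximate):

```python
def scaled_fps(fps: float, from_res: tuple, to_res: tuple) -> float:
    """Scale an FPS figure by the ratio of total pixels rendered."""
    return fps * (from_res[0] * from_res[1]) / (to_res[0] * to_res[1])

# Slick's ~50 fps average (and mid-40s minimum) at 1920x1080,
# projected to 2560x1440:
print(round(scaled_fps(50, (1920, 1080), (2560, 1440)), 1))  # 28.1
print(round(scaled_fps(45, (1920, 1080), (2560, 1440)), 1))  # 25.3
```

That lines up with the ~25 fps minimum estimated above; holding a constant 30 would indeed take a slight OC.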
     
  11. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    @micah with 1440p you won't need AA that much; after all, it matters the most at lower resolutions (the human eye won't perceive the jaggies at that kind of resolution), so I think you will be alright considering the might of the 7970M :)

    Btw, nice-looking lappy, buddy! Enjoy the good times :)
     
  12. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    Slickdude, when you benchmarked BF3, the resolution you were using was 1920 x 1080, right?

    The YouTube video used 1660 x w/e.
     
  13. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    I only ever play at 1920x1080 ;)
     
  14. aerpoblast

    aerpoblast Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Well, according to this article, released just 2 hours ago, the GTX 680M is coming out on June 5th with 744 CUDA cores and 4096 MB of GDDR5 memory at 100 watts.

    It's 37% faster than the GTX 670M, and its 3DMark score is only 4905.
    Not even a full GK104 chip.

    If true, I wonder if this chip would even match up to HD7970M.

    NVIDIA GeForce GTX 680M Detailed and Pictured | VideoCardz.com
     
  15. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    If this is true, then it is a serious fail... no, it does not match a 7970M. This is every Nvidia fanboy's nightmare. This CAN make sense though, given what I have been told by insiders: they are respinning the silicon and it won't come till September (which is what I'm being told). In the meantime, they release a stop-gap measure.

    This rumor about a ~4900 3DMark score has been around for months. For reference, the 7970M scores (depending on CPU) somewhere between 5500 and 5800 in 3DMark11 at stock.
     
  16. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
  17. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    Obviously I did not play all of Metro with Fraps up. I played maybe the last sixth of the game, which includes the big outdoor fight, and I never noticed any drops below 30. Not exactly scientific; I am sure it could have and maybe did drop, but it was not noticeable at all in gameplay, and I can guarantee it was never sustained, because I would have noticed for sure. I posted a screenshot at the main menu that showed FPS; far from perfect, but it is a rendered menu, and I would be interested to know what you get there.

    I am also being pretty conservative with my estimates; as you can see, just in the arena (which is by far not the most challenging scene) I am upwards of 45 FPS. Somewhere like the forest it will drop into the 30s.

    Here are some Witcher 2 screenshots I just took; the settings used are included. I shrank them to try not to destroy the forums; they were originally 1080p, as shown on the settings menu:
    [five attached screenshots; images not preserved in this archive]


    You are correct; last time I saw that I just glanced, saw that only 3 of the games even run below 60 on the 580M, and Skyrim with the newest drivers should be above 60. But yeah, those results, especially BF3, are pretty sweet. Again, that seems well below my card's performance, but I am of course OCing.

    As I have said before, I am OCing: I use a custom vBIOS to run 740 core, stock voltage, stock memory. The 580M is just begging to be OCed, and with a modded vBIOS it is incredibly easy and I never have to mess with it. I use these settings at all times. I realize I am comparing OCed to not OCed, but it is mostly just for my benefit and other people with 580M cards. Basically, in actual use with an easy OC the 580M is extremely competitive in gaming with the 7970M. It gets very good FPS, whereas the 7970M will get over 60 (thus better, by some amount). If you OC the 7970M you will get even further over 60, but unless you just love screen tearing or run a 120 Hz display, over 60 FPS is just a waste.

    My main point has always been that the 7970M is not SO MUCH BETTER that everyone should toss out their old cards and buy one today. It is a nice card, and if you are getting a new computer right now it is a clear choice, but it doesn't do anything that a 580M can't do; there is no game (that I have seen) that is unplayable on the 580M and suddenly butter on the 7970M.
     
  18. rorkas

    rorkas Notebook Consultant

    Reputations:
    118
    Messages:
    280
    Likes Received:
    0
    Trophy Points:
    30

    If this is true, we have to call 911 to Cloudfire's house. He's gonna have a heart attack.
     
  19. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    ... Or he starts saying that the GTX 670 will be the GTX 685M instead. :p
     
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That 3DMark11 score is from the 680M revision #1, overclocked. It was the 768-core version that was tested. Now they are suggesting that Nvidia went down to 744 cores and got even worse performance?

    We all know that Nvidia isn't releasing a flagship GPU that is worse than the 7970M. If they did, and it consumed 100 W like the 580M, that would be an epic fail. The only other reasonable explanation is that this 744-core part is the successor to the GTX 570M, consumes about 75 W, and scores around P4500++, and that they release the GTX 685M in September. But this isn't their flagship GPU. No way.

    They are right about the date though. Computex in June, like I said ;)
     
  21. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    That article is based off the same leaked benchmarks we saw like a month ago.

    If true, then the 680M is a fail; my 580M gets over 4000 with an easy OC, and a card that only adds a few hundred points is a total flop. It will be nice for me though; it would totally validate my decision to wait for the 780M.
     
  22. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    @Yoda, when you said Ultra I thought it included ubersampling; I get 30 fps with the same settings. You really just OC to 740? I game at stock.

    Also, yeah, that 4905 has been around for about 3 months. I don't believe it is true; it was just a fan-made screenie to hype things up until AMD crushed it with 5.6k...

    Also, how the hell can you disable 22 CUDA cores out of 768? They are arranged in groups, and I am sure those groups don't divide that way...

    EDIT: sorry, it was 744, so it may be possible to disable 24 cores, but I don't see it happening...

    Btw, Yoda, do you think we will be able to keep our lappies for an upgrade around the 700M series? (Maybe PCIe 2.0 speeds will seriously bottleneck this time?)
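The group-alignment point can be sanity-checked. Kepler's GK104 groups its CUDA cores into SMX units of 192 cores each, and a cut-down part normally disables whole SMX units, so a plausible core count should be a multiple of 192 (a sketch of that check; the 192-per-SMX figure is the desktop GK104 layout):

```python
SMX_CORES = 192  # CUDA cores per SMX unit on Kepler GK104

def plausible_core_count(n: int) -> bool:
    """True if n is a whole number of SMX units."""
    return n % SMX_CORES == 0

for count in (768, 744):
    print(count, plausible_core_count(count))
# 768 is exactly 4 SMX units; 744 is not a multiple of 192,
# which is why the rumored figure looks suspect.
```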
     
  23. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    See, that is a little sneaky, Yoda... you're trying to pass off your scores like any 580M can do that. When in reality, you are pretty close to the 580M's limit, and your card is crunching away to produce playability while the 7970M just strolls through the park to get what it gets. And why is depth of field disabled in The Witcher? When I turn that off, I get another 5-7 fps.

    I will overclock it to 990/1450 and see what kind of numbers I get. So far, everything reported is bone stock.

    The bottom line... the 580M scores 3200-3400 3DMarks; the 7970M does 5500-5800. I'd say that is a huge difference, but you are blinded by your Nvidia glasses.
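The gap those 3DMark11 figures imply is easy to put a number on (using midpoints of the quoted stock ranges; the result is only as good as the scores above):

```python
score_580m = (3200 + 3400) / 2   # midpoint of the quoted 580M range
score_7970m = (5500 + 5800) / 2  # midpoint of the quoted 7970M range

gain = (score_7970m / score_580m - 1) * 100
print(f"7970M scores about {gain:.0f}% higher")  # ~71%
```

That ~70% stock-vs-stock gap is also roughly consistent with maxheap's "about 50%" estimate later in the thread, once OC headroom on the 580M is factored in.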
     
  24. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    Yeah, not ubersampling; just the preset Ultra spec, and then I bump up the video memory. Yes, just using a flashed vBIOS to always run at 740. You could also use Inspector, of course.
     
  25. GeoCake

    GeoCake http://ted.ph

    Reputations:
    1,491
    Messages:
    1,232
    Likes Received:
    2
    Trophy Points:
    56
    Well, the 7970M allows you to boost your in-game settings in Crysis 2 from Very High (the best a 580M could do; just check out the later levels, which stress the GPU) to Extreme/Ultra. Furthermore, a 7970M can effortlessly max out Crysis 1 and Warhead modded with the Custom Crysis Config (Very High/Enthusiast); the 580M would just lag in this situation (the best it can do is High/Gamer).

    Skyrim with ENBSeries can also be fluidly played at 1080p, which was not possible on the 6990M or 580M. Plus, in Sniper Elite V2, maxed out, the 580M has a little lag spike when you zoom in with the scope, whereas the 7970M is 60 FPS fluid throughout. This is what I have noticed, really (the only games I have been playing quite intensively). For my situation (I like using mods and stuff), the upgrade was well worth it. Bear in mind that the 7970M will run cooler, consume less power, etc., on games that it easily maxes at 60 FPS, meaning quieter fans, etc.

    Anyway, that 680M article seems to be based on VERY old info: the very first leaked benchmark and the card pic that was posted a few days ago.
     
  26. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    Far from the limit; at 0.92 V I can add another 80-100 to the core easily and bump this even higher. If at some point a game won't run on my card, I will just flash to a modded 0.92 V vBIOS and get another big boost. I see no downside to running at 740; with the flashed vBIOS I never even notice or mess with it.

    Cinematic depth of field is off because I don't like mass blur; regular DOF, both in gameplay and cinematics, is on. That is why I posted my settings, though, so you can match them if you desire.

    And really, I am blinded by the performance I get from my card. Yes, objectively the 7970M is a lot better; in real usage I just don't see the advantage FOR ME. I have also said many times I would not give up Nvidia features. I use adaptive vsync in every game I play, I often use ambient occlusion or FXAA, and I love Nvidia-exclusive features. I am not trying to hide that at all; it would take an awful lot to get me to go AMD, simply because of all the stuff I would give up to do it.
     
  27. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Definitely, bro, no one is saying you should give up what you like, or that Nvidia does a poor job with their powerhouse GPUs. But you kinda sounded like the 7970M doesn't improve that much over the 580M. I think (at STOCK) it improves quite a bit, about 50% over the 580M, and I think we have enough benches to validate that point. This doesn't mean we don't like Nvidia; even though I bought a 7970M (and am assuming it is still alive :D), I will buy a 680M just for kicks, because I am an Nvidia fan :)
     
  28. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    744 cores doesn't make any sense. It's surely 768.
     
  29. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    That is why I added the "that I have seen" to my statement. I played all the Crysis games on other machines when they came out. I personally don't like the ENBSeries look; I think ambient occlusion along with some graphics mods from the Workshop gives a better effect, so I had never tried it. I have actually had no trouble with Sniper Elite; it seemed to be very fluid, though I admit to never using Fraps, and I didn't notice any hitching. It is important to note that I normally use triple buffering along with adaptive vsync, so that might fix it. If you have plain vsync, the hitching might be it dropping from 60 to 30 and back, which adaptive vsync fixes.
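The 60-to-30 drop described here can be illustrated with a toy model (an illustration only, not actual driver code): with strict double-buffered vsync on a 60 Hz panel, a frame that misses the ~16.7 ms refresh deadline waits for the next refresh, while adaptive vsync simply presents it late.

```python
import math

REFRESH_MS = 1000 / 60  # one refresh interval at 60 Hz, ~16.67 ms

def displayed_fps(render_ms: float, adaptive: bool = False) -> float:
    """Effective frame rate for a steady per-frame render time."""
    if adaptive and render_ms > REFRESH_MS:
        # Adaptive vsync turns sync off below 60 fps (tearing instead).
        return 1000 / render_ms
    # Strict vsync: the frame waits for the next whole refresh.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000 / (intervals * REFRESH_MS)

print(round(displayed_fps(18.0), 1))                 # 30.0 (snapped)
print(round(displayed_fps(18.0, adaptive=True), 1))  # 55.6 (torn)
```

So a frame taking 18 ms, barely over budget, displays at 30 fps under strict vsync but at ~56 fps with adaptive vsync, which is exactly the hitch-then-recover pattern described.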
     
  30. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    I agree with what you said here. It is a faster card for sure, and a clear choice in a new computer. Just for my circumstances it isn't enough faster to justify getting excited. I want at least double my current performance before I get a new card, and I am hoping the 780m will deliver it.
     
  31. Torment78

    Torment78 Notebook Evangelist

    Reputations:
    261
    Messages:
    555
    Likes Received:
    0
    Trophy Points:
    30
    What driver are you using? I got the 301.24 and can't find the setting for adaptive vsync.
     
  32. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    ^^ At the bottom of Manage 3D Settings.
     
  33. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    Same driver as you. It is under Manage 3D Settings, Program Settings; select the game, and it is at the very bottom under Vertical Sync. You can choose Adaptive, which syncs to 60 FPS, or Adaptive (half refresh rate), which syncs to 30 FPS. I always use regular Adaptive; it seems to work great.
     
  34. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    Booyah!!!!
     
  35. GeoCake

    GeoCake http://ted.ph

    Reputations:
    1,491
    Messages:
    1,232
    Likes Received:
    2
    Trophy Points:
    56
    I'm just stating my experience; for me it was well worth it. The 675M is a beast, but it just didn't cut it for me. It keeps you happy, so cool beans. :)
     
  36. Torment78

    Torment78 Notebook Evangelist

    Reputations:
    261
    Messages:
    555
    Likes Received:
    0
    Trophy Points:
    30
    Thanks m8, nice one.
    I got a 120 Hz screen; is it really needed for me? What do you recommend?
     
  37. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    ;) Hahaha. Interesting.
     
  38. jefflackey

    jefflackey Notebook Evangelist

    Reputations:
    96
    Messages:
    352
    Likes Received:
    38
    Trophy Points:
    41
    When I first started reading this thread, which I found as I was trying to figure out how to configure the P150EM I decided upon as my next notebook (after years of Dell machines, but that is fodder for another thread!), I was surprised at people's passion for "their side" - i.e. NVIDIA vs. AMD.

    Then I remembered way back when - yes, I'm an old fart - and the graphics card wars in the days of 3dfx. ;)

    Anyway, in the FWIW category, I worked closely with a variety of chip designers (NVIDIA, AMD, Intel, IBM, TI, Samsung, etc.) and fabs and foundries over the years in my role as a tech director for a company that provided "stuff" to the chip makers. I've been in the TSMC foundries in Taiwan many times, and I can tell you the French restaurant and wine the fab manager in northern Taiwan prefers. ;) As well as his main goal for the year from his bosses (a 30% cut in overall cost per wafer). I've also dealt with product managers from NVIDIA, AMD, and Intel (and other companies).

    My speculation, and it is only speculation from past experiences:

    1. NVIDIA is hugely important to TSMC. They have, in the front lobby, several plaques and the like from NVIDIA. While they make chips for a lot of people, NVIDIA has a place at the front of the line. And if NVIDIA tells TSMC, "we have a new chip design we need you to make, and we need it made immediately," TSMC will find a way to do it. The pressure they will put on their own people, and on their suppliers, will be immense, but they will do it.

    2. It would not surprise me at all if NVIDIA is actively trying to leak performance rumors around the 680m that will give people pause in terms of buying the 7970m - i.e. influence people to wait and see before buying the 7970m for fear the 680m will come out a month or two later and blow it away. Even if they don't have the chip completely designed yet. They know that if a lot of people buy the 7970m now, those people will be unlikely to buy the 680m a few months later, they are competing for that one pool of customers. So they are incentivized to try to keep as many people as possible from buying 7970ms right now.

    3. There are jobs/careers at stake in NVIDIA right now, depending on not allowing the 7970m to keep the crown of king of the mobile GPUs.

    For those of us who have no vested interest in either company, these are good times to be in the buyer's seat! ;)
     
  39. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    ^^ Indeed! They are sweating their a$$es off to come up with better and better performance; consumers win!! :)
     
  40. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    We don't win anything if the latest rumors are true and the GTX 680M is indeed weaker and more expensive than the 7970M.
     
  41. Zalgradis

    Zalgradis Notebook Consultant

    Reputations:
    45
    Messages:
    169
    Likes Received:
    0
    Trophy Points:
    30
    That post regarding the 680M made me very sad...

    Still, if the 680M is only a few weeks away, as the article suggests, it might be worthwhile for those torn between the two to wait and see what it can do.

    Even if the 680M is, say, 10% slower than the 7970M, it would still be a worthwhile option for those who prefer Nvidia drivers (as long as the price reflects this performance difference, which I somehow doubt it will...).
     
  42. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    There's still hope that it's on par with or a bit above the 7970M. Perhaps those early drivers were (incredibly) buggy. :D
     
  43. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    A slower 680m would be terrible. Late and slower? That would be a massive fail for Nvidia.
     
  44. littleone562

    littleone562 Notebook Deity

    Reputations:
    1,417
    Messages:
    993
    Likes Received:
    59
    Trophy Points:
    66
    Isn't that picture used for the 3DMark score the old one that was leaked?
     
  45. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    Forget about benchmarks, I want to see 7970M numbers for games, especially in Crossfire mode. $20 says AMD's crappy driver support will rear its ugly head as usual.
     
  46. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    If you haven't noticed, CF works wonders now with the HD 7000 series. Often the scaling is 100%.
    And what are these mysterious driver problems? I haven't encountered any with my HD 7970. I think it's more like a washer issue than an actual driver issue for many.
     
  47. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Which is why it will never happen. Not at 100 W.

    Yeah, a week ago or something Nvidia got priority on 28 nm from TSMC because Nvidia has been complaining about low yields. So yeah, I think they pretty much listen to what Nvidia has to say.
    Yields are getting better each week too, so these poor times will end sooner or later.

    Nvidia is losing potential customers every day without the 680M, so I'm pretty sure they are doing everything they can to push it out to the market.
     
  48. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Yes, because NV simply never makes slower cards than AMD.
    Oh wait...
     
  49. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Their philosophy has changed for 2012, but that doesn't mean their driver support has. AMD has been constantly late with game support and fixing issues: Metro/Skyrim/RAGE, etc...

    I'm entertaining the idea of giving AMD one last chance.

    I've heard of the complaints from companies working with nVIDIA and the pressure they put on their affiliates (everyone but eVGA). I've always found nVIDIA to be EXTREMELY quiet about the mobile area. It's always about desktops, desktops, and more desktops. Tesla gets brief fame, but desktop GeForce is their grail, so to speak.

    Regarding your 3rd point, I'd never want to work as a chip engineer. :eek:


    I believe XFX dropped nVIDIA due to that pressure, AFAIK.
     
  50. BlackSabs

    BlackSabs Notebook Consultant

    Reputations:
    177
    Messages:
    278
    Likes Received:
    16
    Trophy Points:
    31
    ...Exactly!
     