The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    That's an article from 2 years ago, and the chart in it is totally wrong. Kepler has less double-precision FP performance per watt than Fermi, nowhere near the 2x/3x improvement projected.
     
  2. Vahlen

    Vahlen Notebook Evangelist

    Reputations:
    243
    Messages:
    323
    Likes Received:
    1
    Trophy Points:
    31
    Ya, AMD's Hybrid CF is a very interesting piece of technology. Hopefully they can get their processor performance up to par, because their iGPU department is looking extremely strong. I would love my next purchase to have the option of running that hybrid setup in a laptop... obviously it's not as optimal for gaming as true Xfire, but it will be a significant boost to the 15" market.
     
  3. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    they (one.de) already have them in stock and configurable on their website. upgrade pricing from 675M is 290€, thus approx. 200-250€ more expensive than the 7970M.

    good find btw! +rep
     
  4. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  5. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    so what are the final rumors about the 680m? what's the rumored 3dmark11 score? (I am not following this thread well anymore :()
     
  6. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    weeelll..... approx. 200-300 $/€ more than 7970M, approx. 10-15% faster than 7970M (in Vantage), supposed to be presented soon during the computex, supposed to be available sometime in july/august.

    did i miss something?
     
  7. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    10 points to jaybee :)
    GTX 680M scored 14% more in Vantage than 7970M
     
  8. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Cloud can you post a few screenies you found in the last couple days?
     
  9. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    oh and we could pretty much say that a 7990M turned from "fabled" to "probable" sometime this fall, right? ;)
     
  10. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    @Cloudfire:
    I read the original text (google translate sucks for someone that knows german :D) and yeah, very probably in a few days we will see the review, and it should arrive next month. But the whole discussion was that one.de often lists things and can't keep the promised delivery date, so that's why I say "probably". :D

    The 7970M is already a monster....what about the upcoming 7990M? :D
    Hard times for Nvidia...
     
  11. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    From Bluemobility, a Polish reseller of Clevo

    [IMG]

    How did you come up with that? First of all, the 7970M isn't even available in a lot of countries yet. Late summer delivery, like I posted earlier.
    Second of all, where on earth will AMD get the extra performance for a 7990M? The 7970M is already at 100W, the max power the MXM slot can deliver. They can't just magically get more performance out of the same power draw. Well, there is one way, through architecture improvement, like the 460M to the 560M (both Fermi), but AMD won't be out with the 8000M series until 2013...

    AMD went all in with the 7970M and crushed everything previously released. Now they can just sit and pray that Nvidia won't release something that will beat their flagship. But either way, the 7970M will sell like crazy because of the cheaper price.

    You AMD fans keep finding excuses for everything
     
  12. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    dude, chill out :)

    im aware of all these points @100W already reached, etc.

    second, im neither a green nor red team fanboi, thank you very much ^^

    thirdly, apparently u dont read the sources u post here very carefully? you urself just quoted the 7990M part here :D

    my guess is an OCed 7970M, and why not? seeing that its easily OCeable up to 20% :) besides, the 7970M is rated at 100W but that doesnt mean it consumes the full 100W. lots of users have already posted that the 7970M consumes significantly less power than the 580M, altho both cards are rated at 100W. add to that the jump from 485M to 580M which also just took 5 months (and also included no real arch improvement but rather just a simple 9% OC) :) thus, a release in september or october wouldnt be that far-fetched ;)

    cheers
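    The overclocking-headroom argument above (a card rated at 100W that actually draws less can absorb a clock bump) can be sketched with the standard CMOS dynamic-power relation P ~ f * V^2. This is only a first-order model, and the wattage figures below are illustrative assumptions, not measurements of any real card:

    ```python
    # Dynamic power in CMOS logic scales roughly with clock frequency
    # times voltage squared: P ~ f * V^2.
    def scaled_power(base_power_w, clock_ratio, voltage_ratio):
        """Estimate power draw after scaling clock and voltage."""
        return base_power_w * clock_ratio * voltage_ratio ** 2

    # Illustrative assumption: a card drawing ~80W of its 100W rating.
    # A 20% core overclock at stock voltage stays inside the MXM budget:
    print(scaled_power(80, 1.20, 1.0))   # ~96W, still under 100W

    # The same overclock with a 5% voltage bump blows past the budget:
    print(scaled_power(80, 1.20, 1.05))  # ~106W, over 100W
    ```

    Real GPUs also have static leakage and memory power, so treat this strictly as a ballpark for why a sub-100W card has room for a clock-only overclock.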
     
  13. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    Starcraft 2 uses the Havok engine for its physics (Havok works with any video card, i.e. ATI, Nvidia, Intel, etc).
    Diablo 3 initially used Havok, but it was replaced with a custom physics engine that works on all platforms.

    I think you are confusing Nvidia PhysX with game physics in general.

    PhysX is Nvidia's implementation of physics and is proprietary, meaning GPU acceleration only works on Nvidia cards. Generally speaking, game devs like to use technology that doesn't require you to have specific hardware... but Nvidia has deep pockets and may "sponsor" a game to use it.
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    LOL, I read 7970M instead of 7990M. Whoopsie. Well, that retailer is guessing; he hasn't seen the 7990M at all.
    There is no way they will release a full 7870. AMD wasn't capable of releasing a full 6870 last generation either. The 6870 was clocked at 900/1050; the 6990M, which was their top mobile part, was at 715/900.
     
  15. DocOccam

    DocOccam Notebook Geek

    Reputations:
    27
    Messages:
    90
    Likes Received:
    0
    Trophy Points:
    15
    PhysX added a few thousand points to my GTX 580 score before I stopped using Vantage in favor of 3DMark11. This 7970m will be my first high performance AMD or ATI product in more than a decade unless nVidia delivers a strong counterattack. If the rumored dates hold for the 680m, it would arrive just in time for school...then I could continue my nV streak: 2 ti, FX 5600 Ultra (what was I thinking?), 6800 GT, 7800 GS, 7950 GT, 8800 GTS, GTX 260, GTX 580.
     
  16. Andycinoz

    Andycinoz Notebook Consultant

    Reputations:
    36
    Messages:
    131
    Likes Received:
    11
    Trophy Points:
    31
    Probably a dumb question, but would a game relying on Physx be completely unplayable on my soon to arrive 7970m, or would it just underperform?

    Really hoping that most major releases won't be exclusive one way or the other - especially GTA5!!
     
  17. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    It would perform better. PhysX is one of the most failed gimmicks in gaming history. Mirror's Edge is probably the only game that needs it.
     
  18. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    Don't think Rockstar would use PhysX, since GTA V will still use the engine from GTA IV, Red Dead Redemption, L.A. Noire and Max Payne 3.
    What we need to worry about here is whether we will be stuck with a horrid console port on PC.

    But for non-Nvidia hardware i recommend to turn on PhysX.
    Its crippled when running on CPU.
     
  19. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    u mean turn off PhysX for non-Nvidia hardware? ^^
     
  20. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    hmm, so it looks like with my overclocks and scores my 7970m is up there with a stock gtx 680m
     
  21. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    which if true is a bit...depressing :(
     
  22. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    What's so depressing? The 680M was going to be a low-end card before AMD's 7970M came out and Nvidia changed their whole graphics plan for the 680M. There's a reason a Kepler GT 650M/660M is close to a 670M or 675M in performance: Nvidia were trying to make as much money as possible, while AMD did what Nvidia should have done and released a top-end card with real performance, instead of a 600M series with minimal performance increases.
     
  23. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    Which is why competition is a good thing.
     
  24. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Why is that depressing?

    What would have been depressing is if the 680M had stayed the weak P4900 like the first sample they had. Not much competition there to drive the performance up on future models. We should be glad they keep competing like Joker says :)

    As for the previous Kepler models, Nvidia should have settled with the GT 650M. It seems like a sweet spot, both in performance (GTX 560M level) and TDP at 45W (compared to the 560M's 75W TDP). I feel that the GTX 660M was a half-as*ed attempt from Nvidia and didn't bring anything new to the table.
    The rest of the 600M series, the 670M and 675M, is pathetic, and I wish Nvidia would stop lying to their customers about top models (renaming) and instead focus on bringing the price down on the 500M GPUs. Look at AMD. They do it the right way.
     
  25. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    depressed? Why? 120+ fps not enough for you in Diablo3? 50+ fps not enough for you in BF3 Ultra?

    the 680m will cost a fortune and availability is still months away. Enjoy your 7970m. It is currently the fastest mobile vid card in the world

    i will confirm or disprove the numbers when i bench the 680m
     
  26. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    You will buy that too? :p
     
  27. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    the simple answer: yes i will.

    the convoluted answer: if i can get one at a reasonable price, yes. What i hear is that they are still at the ES (engineering sample) stage. The early ES cards are essentially impossible to get unless they are stolen or you work for nvidia engineering. When they get to the QS (qualification sample) stage and there are some of those cards floating around with vendors and laptop manufacturers, i will get my hands on one... but like i said earlier, only if the owners of said cards don't gouge on the price ;) I hate paying a fortune for something that they essentially got for free
     
  28. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Niiice. Who is telling you all of this? Hook me up with one please :D :D
     
  29. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    relax guys :-s. It's just the wording that makes it seem (to me) that he was aiming for a high-end desktop card or something and subsequently demeaning the value of the 7970M.

    And your last line doesn't help slick :p
     
  30. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    hahaha, classic slick with his special connections :p :D
     
  31. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    I'll get one as soon as it comes out. Dell have been pushing to replace my system, but I've been holding out for the 680m at the moment. I've already benched the 7970m and shown what it can do, which is amazing (6951 3DMark 11, tess on :D).

    I have high hopes for the 680m but I'm keeping my expectations reserved. I expect it to be good but not amazing, because it will be let down by price. Anyways, I have no intent to be dethroned from my Alienware pedestal :p

    PS. I strongly believe that with those newer numbers, the 680m will be similar to the rumoured 660 Ti with 1152 shaders. I'll be really P'd off if it turns out to be 768 instead.
     
  32. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    Or did i? :p
    TypoCeption
     
  33. w3ak3stl1nk

    w3ak3stl1nk Notebook Consultant

    Reputations:
    3
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    rofl, good one
     
  34. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Same here, but if the 680M ends up being a 768-shader GPU, then at the very minimum its performance would be the GTX 660M's performance x2. That's usually the case when nVidia doubles their shader count (i.e. the GTX 580M being almost double the performance of the GTX 460M).

    To compete against the 7970M while using 768 Kepler shaders, nVidia would have to clock the 680M to something like 950-1000MHz, even more to get it 10% faster. Given Kepler's architecture, this is possible. That, or it actually has more than 768 shaders.
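    The clock estimate above can be reproduced with naive linear scaling (throughput proportional to shaders times clock). The 384 shaders @ 835MHz figure is the GTX 660M-class Kepler spec; the assumed ratio of 7970M throughput to that part is a rough guess of mine, chosen only to illustrate the arithmetic:

    ```python
    # Naive throughput model: score ~ shader_count * core_clock.
    def required_clock_mhz(base_shaders, base_clock_mhz, target_ratio, new_shaders):
        """Clock needed for new_shaders to hit target_ratio times the
        throughput of the base configuration, assuming linear scaling."""
        return base_clock_mhz * target_ratio * base_shaders / new_shaders

    # Assumptions: baseline Kepler part = 384 shaders @ 835MHz;
    # 7970M ~2.1x that throughput (rough guess); target = 10% above 7970M.
    clock = required_clock_mhz(384, 835, 2.1 * 1.10, 768)
    print(round(clock))  # lands in the 950-1000MHz range discussed above
    ```

    Real scores don't scale perfectly linearly (memory bandwidth, ROPs, and CPU limits intervene), so this only shows why a 768-shader part would need roughly a 950-1000MHz core clock to clear the 7970M by 10%.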
     
  35. Vahlen

    Vahlen Notebook Evangelist

    Reputations:
    243
    Messages:
    323
    Likes Received:
    1
    Trophy Points:
    31
    Who's making excuses? AMD obliterated Nvidia this series in both performance and price as far as I am concerned. I mean, come on, Nvidia had to significantly delay the release of the 680m because its performance was pathetic compared to the 7970M. Even if Nvidia's tweaked 680m is more powerful than the 7970M at release, it will have come far too late and at far too high a price.

    I'm also looking forward to seeing how the 680m overclocks. I'm willing to bet it has one of the lowest ceilings we have seen from a card yet, as all this time has probably been spent trying to figure out ways to push performance out of a "low-end" card (which is slightly impressive, I'll give them that).
     
  36. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    If the 680M does turn out to be 10% faster..
    - given a clock somewhere around 950MHz, I would probably buy a P150EM barebone w/ the 680M and a cooling fan, and let the OC begin!
    dear god please make this happen :cool:

    imagine the power... I can't
     
  37. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Guys, I understand defending each side, either green or red.

    But I also want to stop a second and think about how the technology has evolved. Whether you get a 7970m or a 680m, those two cards have evolved to a point where a 5 fps difference doesn't matter anymore, because basically they can output almost any game at 60fps and beyond.

    10-15% faster translates into a bunch of FPS. When the difference hits 10-12fps, that's a thing to keep in mind.

    If you got the 7970m, you got a beast of a card. If you get the 680m, it will probably be a bit faster, but it will be in the same range as the AMD beast, not a huge jump above it.

    Then, as an nvidia fan and user, I could say that their drivers are far better than AMD's, but some experienced AMD user could call BS on me and affirm the same in reverse.

    Either way, enjoy what you have, because there's so much power that a 5-10% speed bump isn't that important.
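    The percentage-to-fps point above is simple arithmetic; the baseline frame rates below are hypothetical examples, not benchmarks:

    ```python
    # A relative performance advantage only turns into a noticeable fps gap
    # at high baseline frame rates.
    def fps_gain(base_fps, advantage_pct):
        """Absolute fps difference implied by a percentage advantage."""
        return base_fps * advantage_pct / 100

    # Hypothetical baselines for a 15% faster card:
    print(fps_gain(40, 15))  # 6.0 fps  - barely noticeable in a heavy game
    print(fps_gain(80, 15))  # 12.0 fps - the double-digit gap that matters
    ```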
     
  38. w3ak3stl1nk

    w3ak3stl1nk Notebook Consultant

    Reputations:
    3
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    besides power numbers... nvidia has better color?
     
  39. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    What!? Please explain yourself.
     
  40. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    He didn't seem sure, hence the question mark afterwards.

    No, nVIDIA doesn't have better colours, nor does its output look better.
     
  41. Nick

    Nick Professor Carnista

    Reputations:
    3,870
    Messages:
    4,089
    Likes Received:
    644
    Trophy Points:
    181
    When I purchase a laptop, I don't care if it has an AMD or Nvidia video card. I've had quite a few systems with AMD/Intel, Nvidia/Intel, Nvidia, and AMD graphics. AMD's method for switching video cards is horrible, and with their drivers it was always harder to install the latest version. That said, I loved the HD 5730 in my Envy 15 and XPS 1647. The Nvidia Optimus in my MSI X460 was much better, and with all of my Nvidia laptops it was easy to install the latest drivers. Plus, isn't AMD stopping support for the 5xxx series and down?
     
  42. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    There are only 2 sides, companies and consumers. Any other division is silly.
     
  43. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    AMD can't make a unified driver for who knows what reason.
    They kept doing this after each refresh. Maybe it's their policy to discontinue legacy GPUs very soon compared to Nvidia, or maybe it's a way to admit their problematic driver development.
    Cutting out old cards means having more resources to work on newer GPUs, but it also keeps them from ever reaching a point where a driver can be called "mature" or "stable".
    Months ago, with Rage and Skyrim, they basically acknowledged that one driver can't contain fixes for both games, lol. Either you use a driver for one game, or for the other. Then some tweaks came to light and things got sorted by the users, and lately by AMD itself.

    What I find funny is that they keep releasing the "fastest" GPU around; each year it's the same story.
    Let's get this straight: the hardware is sublime. It has the numbers, the power, whatever. But it seriously lacks drivers.

    Now, before someone starts flaming me, I just want to explain why I usually say it "lacks" drivers. Obviously it doesn't, because drivers keep being updated, but if you came from Nvidia (and I'm speaking about having Nvidia as my main brand for at least 10 years) you know what I mean.
    Nvidia has issues as well. Their drivers used to kill GPUs. It doesn't always happen, but they can fix one thing and break another. Their GPUs used to die without reason (everyone remembers, right?), and some of them keep on doing that. But in the majority of cases, everything goes like: 1) get a good GPU, 2) install drivers, 3) play the game. In 99% of cases the drivers will perform exactly as they should, even with newer games that haven't been explicitly supported yet.
    This percentage of "success" is by far wider with Nvidia drivers than with AMD's.

    My experience with AMD hasn't been great. The Xfire desktop setup I had (5870) was never pushed to decent levels because of the worst drivers I have ever seen. Back then their habit was to release a driver update after... after who knows. I was left with no new drivers for what, 3 months, while Nvidia users with cheaper cards at the time were getting betas every 2 weeks. And what made me mad is the fact that their cheaper cards used to beat my Xfire, or my single card, in some games where I was supposed to destroy them.

    It's love and hate, I guess. I'm ready, god willing, to grab a 7970m and turn into a red team believer. But seriously, they need to get their things together. Denying their bad habits in driver development is like defying reality.

    I wish every AMD fanboi had at least an Nvidia GPU of the same segment as theirs for AT LEAST 2 months. Then I would like to hear their POV, possibly unbiased.
     
  44. w3ak3stl1nk

    w3ak3stl1nk Notebook Consultant

    Reputations:
    3
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    Thought so, was under the assumption that CCC colors were washed out compared to nvidia. Guess it was a simple mistake. Maybe it was my monitor brightness or maybe it was sleep loss... Oh well, just a passing thought
     
  45. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Oh no no, all hell would break loose! :D
     
  46. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Actually, from my experience with the Xfire setup I had in my previous desktop, I could say quite the opposite. And I'm mainly an nvidia user, although I don't consider myself a fanboy or biased toward them.

    ATI drivers always gave me the best color range, more vivid and brilliant than nvidia's, and I reached a point where I doubted my eyesight.
    Then I did some tests on a calibrated monitor and was impressed that my theories were right, although the gap wasn't as great as I had imagined.

    It's strange to believe, since technically the color output should be the same on the same monitor; but then someone experienced in the movie/photo business began explaining things like color luminance, compression, video quality, signal quality, cable quality, distortion, artifacts and so on.
    I was blown away by all the things that can make the same color in the same palette display so differently.

    Dunno how it has been lately, because I have an nvidia gpu right now, but I tell you: on every driver installation I resort to the digital vibrance slider to get things at least not as WASHED out as they are at stock.

    Just my 2 cents.
     
  47. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    i think he was talking about image quality :)

    Sent from my GT-I9001 using Tapatalk 2
     
  48. w3ak3stl1nk

    w3ak3stl1nk Notebook Consultant

    Reputations:
    3
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    Interesting read... I have no idea what it is now either, as I stopped using ATI when they merged with AMD.
    Nothing against AMD, as they seem to have come around, but back in the day they had me hot. Maybe they've turned over a new leaf...

    Edit: contradictory info here: http://forum.notebookreview.com/ali...e-cpu-heatsink-7970m-crossfire-upgrade-7.html
     
  49. Andycinoz

    Andycinoz Notebook Consultant

    Reputations:
    36
    Messages:
    131
    Likes Received:
    11
    Trophy Points:
    31
    For me, AMD wins this one hands down.

    For the simple reason that they got their hardware to market before Nvidia did.

    I'm no fanboy, although I do have a 5650 card in my current laptop. The reason? When visiting the States before moving here, my old laptop died, I had to get something semi-decent to game on, and Office Depot had an HP in stock with that card.

    In January, I was in a position to buy a replacement laptop, and waited for Ivy Bridge to release, as it seemed sensible to get the latest tech. And AMD came out with the best card on the market, so rather than wait around another 6 months and pay about $200 more for a card that might (we still don't know for sure) beat the 7970M, I dropped my money on what was known and available at the time.

    And I suspect I'm not the only one in this position.
     
  50. PaKii94

    PaKii94 Notebook Virtuoso

    Reputations:
    211
    Messages:
    2,376
    Likes Received:
    4
    Trophy Points:
    56
    i agree sometimes waiting is too much.
     