The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. Arestavo

    Arestavo Notebook Evangelist

    Reputations:
    188
    Messages:
    559
    Likes Received:
    215
    Trophy Points:
    56
    I'm guessing that is when the cards down-clock?
     
  2. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    It's because 3DMark gets the system information before your GPU gets loaded for the tests, which is dumb. It should take the clock info when the GPU and CPU are at full load, so we can see exactly what overclocks are running.
     
  3. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The word is that no one has them until like July.

    Now back to your previously scheduled nonsense and speculation.
     
  4. evoandroidevo

    evoandroidevo Notebook Evangelist

    Reputations:
    0
    Messages:
    321
    Likes Received:
    0
    Trophy Points:
    30
    YAY :D

    Sent From My Rooted EVO 3D
     
  5. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    I really hate to say this, but I realized Nvidia put the P150EM and P170EM in their article for the 680M. I think it also won't be compatible with the HM series :( (for example, see that they cite the M17x, not the R4). Sorry guys :(
     
  6. Abidderman

    Abidderman Notebook Deity

    Reputations:
    376
    Messages:
    734
    Likes Received:
    0
    Trophy Points:
    30
    Wow, I'm very mad right now. I was stupid and went to work.... and it seems I missed quite a bit of drama. I hate when real life interferes with reality internets.

    It is funny how this argument comes up with every card release by the two companies. I saw it when the 5870m came out (vs the 460m, I believe) and when the 485m came out (vs the 6990m, I think). So I can't wait for the
    17650m vs the 999m in a few years. Deja vu is happening all over again (to steal a phrase).

    Both look like they will be absolutely great cards. Unless you absolutely need the absolute best, get the one with either the best price, or the one that is actually available. I have had both AMD and nV over the years, and with the exception of a couple of notable misfires, each company continues to either crank out great cards or refresh other ones. We fight over which is bigger, faster, stronger (did you see the Six Million Dollar Man reference... err, actually, did you GET the reference?) for anywhere from a few to several months, and then do it all over again later in the year. Or early the next year. Same discussions, different people.

    Buy the card you want to buy. Then enjoy it.

    See you all later this year.

    Seriously, thanks to those of you who already have the 7970m for the feedback on it. Looks like a great card. In spite of all the name calling and such, most of us really appreciate what you're letting us know. And nice job on your card, Slick. That thing rocks. Meaker, will be looking forward to your feedback after you're done play.... err, I mean, testing it out. See you on here after the weekend, if you know what I mean. All you people who actually use the hardware, test it and tweak it are why so many of us come to NBR. I always have more respect for, and have learned so much more from, what I get from the users rather than the PR that gets tossed out there. Great job, and sorry for the TL;DR version.
     
  7. Frost451

    Frost451 Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I almost ordered my M17x R4 with a GTX 675M, so I wanted to say thanks to Slick and the rest for educating me. I have been seeing green for so long I almost forgot to do my homework. I couldn't care less if the 680M is faster; I just know that I didn't pay $150 more for a card that is like 50% slower. The 7970m is fast enough to stop thinking about GPUs and just play some games.
     
  8. Tyranids

    Tyranids Notebook Evangelist

    Reputations:
    332
    Messages:
    525
    Likes Received:
    0
    Trophy Points:
    30
    You mean the 7970M? I'm confused :eek: The 680M is $400 more than the 675M.
     
  9. Frost451

    Frost451 Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I was talking about Dell trying to charge more for the GTX 675m; I meant $150, not $300. They are still charging $150 more for the 675m right now over the 7970m.
     
  10. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    It's $150 over the 7970M, not that it changes anything. Still ridiculous pricing.
     
  11. Tyranids

    Tyranids Notebook Evangelist

    Reputations:
    332
    Messages:
    525
    Likes Received:
    0
    Trophy Points:
    30
    Haha wow, what a steal (for them). Good choice going with AMD then.
     
  12. p1n0yBaLLeR

    p1n0yBaLLeR Notebook Consultant

    Reputations:
    67
    Messages:
    260
    Likes Received:
    0
    Trophy Points:
    0
    Apparently XoticPC says they have an ETA for the GTX 680m by the end of June, a $295 upgrade from the 7970m.

    I think I'll wait for 680m benchmarks on games like BF3; 4 GB sounds awesome and good for OC'ing.

    And LoL @ Xotic now selling Alienware for people who can't wait for the 7970m. Good for them.
     
  13. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Of course they only mean the R4, but no one is going to say R4 every time, when it's obvious that a new M17x just came out, so you're referring to that one by default.

    And why would Nvidia put the HMs in an article about brand new graphics with brand new notebooks, when the machines are no longer being sold?

    Chill, fam. You're trying to read into things, but you're reading them wrong. All we can do is sit back and ask Sager to test it for us.
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    [IMG]

    Dirt3 1080p ultra preset 4xMSAA
     
  15. DJStarscream

    DJStarscream Notebook Consultant

    Reputations:
    2
    Messages:
    243
    Likes Received:
    0
    Trophy Points:
    30
    OC'ed right?

    Any chance of FPS numbers in Crysis 1/W/2 and Witcher 2? My 7970M should be with me next week.

    Thanks boss. :)
     
  16. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    Almost 75fps? Ya, I think that's overclocked (but not much).

    Based on Nvidia's own PR results, it will take an overclocked 680m to beat this score.
     
  17. evoandroidevo

    evoandroidevo Notebook Evangelist

    Reputations:
    0
    Messages:
    321
    Likes Received:
    0
    Trophy Points:
    30
    Alright, so I think I'm going to go with the 7970m. I would like to know, on stock clocks, the average idle temperature and the temps when playing heavy games. And would anyone be kind enough to tell me what their average temps are when they OC, and what the OC is, please?

    Sent From My Rooted EVO 3D
     
  18. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
    When you consider that the 675m is a rebranded 580m, which, in turn, is a rebranded card as well, this is hardly a laughing matter. Such a lack of business integrity and ethics really makes me question my decision to go with the 680m.

    Have any of you tried a 670/680 desktop card and can perhaps attest to its FXAA and adaptive v-sync? I've read a positive review on [H]ardOCP which makes me believe the adaptive v-sync alone is almost worth the price of admission. And then there are the superior drivers...?

    Man, I hope the 680m will be worth it.
     
  19. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Metro 2033 1920x1080, very high, 4xAA, 4xAF, tess enabled (DX11)

    900/1575

    Average Framerate: 40.00
    Max. Framerate: 99.34
    Min. Framerate: 9.46

    925/1575

    Average Framerate: 43.24
    Max. Framerate: 88.79 (Frame: 1744)
    Min. Framerate: 10.57 (Frame: 1301)
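
    For what it's worth, the quick percentage math on those two runs (plain Python, and bear in mind a single Metro pass has plenty of run-to-run variance):

        # Scaling between the 900/1575 and 925/1575 runs above
        core_gain = (925 - 900) / 900 * 100        # ~2.8% higher core clock
        avg_gain = (43.24 - 40.00) / 40.00 * 100   # ~8.1% higher average fps
        print(f"core: +{core_gain:.1f}%  avg fps: +{avg_gain:.1f}%")
        # The fps gain outpaces the clock gain, so some of the difference
        # is benchmark noise rather than pure clock scaling.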
     
  20. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    I can bet a few 20s that the 680m will be compatible with the R3..
     
  21. Frost451

    Frost451 Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I tried the adaptive v-sync and FXAA with driver version 301.42 on my GTX 580, and I can tell you there is not much difference versus regular v-sync and regular AA and AF. If you are buying just for these, you will probably be disappointed. These features are supported on any 8-series or later graphics card, so try before you buy.
     
  22. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    How exactly did the max manage to fall 10 FPS? Anyways, nice 3 FPS average gain there.
     
  23. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Max can fluctuate a lot. It's a meaningless number anyway.
     
  24. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Why post it then? I've always wondered that. One would be better off with just the average posted instead of all of it, but I guess it's for the convenience of others.
     
  25. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    You are better off knowing the minimum framerate rather than the average. Nothing sucks more than an average of, say, 45fps with a minimum of 15.

    Anyways, nice numbers there! I can't wait to order mine :D I noticed we have a similar system, Meaker, except I'm running older gen hardware lol, but similar speeds, config etc.

    Sucks that my SATA controller is not as fast :(
     
  26. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    I disagree with this; I think it is a massive improvement over normal vsync. Most of my games run very near or above 60 FPS, and I always use vsync. There is nothing more annoying than having the FPS drop to like 50 and get forced down to 30, causing stuttering and other problems because of the vsync. With adaptive it is smooth all the time.
     
  27. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    @Frost451, I liked adaptive vsync when I was using it, and FXAA is crazy good, are you joking? It is like 4xMSAA without the performance hit (seriously). Because of that, the BF3 performance increase didn't look like much when I upgraded (the other games are nuts better, but in BF3 I needed to force 4xMSAA with the 7970m to get the same quality).
     
  28. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    What? Adaptive Vsync is good, but if you are not running at 60fps, it's essentially turned off.

    It just switches between Vsync on and off, to prevent tearing above the screen refresh rate without the performance hit of dropping from 60fps straight to 30 on a small dip in fps.

    It's actually a pretty simple idea, which makes me wonder why it took a while to be developed... but if you don't have the performance to even reach your display's refresh rate, it's the same as running with vsync off.
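
    Since this keeps tripping people up, here is a minimal sketch of the per-frame decision being described (Python pseudocode, assuming a 60 Hz panel; the real driver logic is of course more involved):

        REFRESH_HZ = 60  # assumed panel refresh rate

        def vsync_this_frame(fps):
            # Adaptive vsync: sync only while the game can keep up with
            # the display; below that, run uncapped instead of dropping
            # straight from 60 to 30 fps.
            return fps >= REFRESH_HZ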
     
  29. arcticjoe

    arcticjoe Notebook Deity

    Reputations:
    66
    Messages:
    877
    Likes Received:
    67
    Trophy Points:
    41
    I really don't like FXAA. Even though it gets rid of jaggies, it's not the same as regular MSAA; the image with FXAA looks less sharp, more blurry.
     
  30. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
    HARDOCP - FXAA Image Quality - NVIDIA's New FXAA Antialiasing Technology

    And then pages 4 and 5.

    The bottom line: the 'blur' is actually how the technology works, and I understand that it shouldn't bother you once you get used to it (can anyone attest to that?)

    When looking at screenshots, it seems that FXAA is much better than 2X MSAA (or none), and nearing 4X MSAA in quality, but not quite as good (i.e. the blur). The way I see it, though, at almost no performance hit, it's a very nice compromise for later on, when games start to be more and more demanding and prevent you from activating MSAA while still being able to play them properly. I have to admit, though, that when the time comes that my laptop can't even run a game with 2X MSAA on, it will be time to get a new one.

    So what's the point of it? Well, I've looked at reviews and screenshots, and high MSAA+FXAA combined looks much crisper than high MSAA alone, without degrading performance. So there's that.

    Regarding the adaptive v-sync, sure, if you can barely run the game at 40-50 FPS then it's completely irrelevant, but if the game constantly runs at 80-90 and dips into 30-50 at times (like Skyrim, for example), then it's a very effective workaround, as far as I'm concerned.
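
    For anyone wondering where the FXAA 'blur' comes from, here is a toy sketch of the core idea: find high-contrast pixels in the luma channel and blend them with their neighbours. This is an illustration only, nowhere near the real FXAA 3.11 shader, and the 0.1 threshold is made up:

        import numpy as np

        def fxaa_like(luma, threshold=0.1):
            # luma: 2D float array in [0, 1]. Detect high-contrast pixels
            # and average them with their neighbours. This smooths jaggies,
            # but it also softens legitimate fine detail.
            out = luma.copy()
            for y in range(1, luma.shape[0] - 1):
                for x in range(1, luma.shape[1] - 1):
                    n, s = luma[y - 1, x], luma[y + 1, x]
                    w, e = luma[y, x - 1], luma[y, x + 1]
                    c = luma[y, x]
                    if max(n, s, w, e, c) - min(n, s, w, e, c) > threshold:
                        out[y, x] = (n + s + w + e + c) / 5.0
            return out

    The blend step is exactly why distant detail gets softer: a post-process filter can't tell an aliased edge from intentional high-frequency texture.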
     
  31. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Yes, Diablo 3 is well above 60 fps on a 580m; that's what we are talking about...
     
  32. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    I can understand that; I was just more or less defending the point that running it on weaker hardware is almost pointless, since if you lack the punch to run at high fps, you will miss the point of adaptive vsync.

    At any rate, adaptive vsync should be a standard... haha.
     
  33. arcticjoe

    arcticjoe Notebook Deity

    Reputations:
    66
    Messages:
    877
    Likes Received:
    67
    Trophy Points:
    41
    It definitely bothered me in BF3; it just seemed like it took out some of the sharp detail, which is pretty useful when trying to do some long distance shooting / spotting. In the end I actually preferred playing with no AA at all rather than using FXAA.
     
  34. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
    Yeah, I've read that complaint quite a lot of times from BF3 players.

    In that regard, if that is indeed the case, then I agree completely that FXAA is pointless for you.

    But I don't play FPS games. At all. I suck at them and I hate them. I like RPGs the most, and these get more and more demanding on the hardware each year. When playing those games, losing a bit of detail here and there in the distance shouldn't be that much of a problem, I think.
     
  35. Frost451

    Frost451 Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I tried both adaptive vsync and FXAA in a lot of the games that I play and I was not impressed, but this is subjective and my point of view. Again, I was simply advising people to test these out before making a buying decision based on just these two features.

    PS: FXAA looks really bad to me in racing and flying games, but that's just my opinion.
     
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
  37. svl7

    svl7 T|I

    Reputations:
    4,719
    Messages:
    3,758
    Likes Received:
    134
    Trophy Points:
    131
    Extremely great sample you got there... Imagine what you could do at 1.225v :D
     
  38. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    If only... you might even brick it (even though I HIGHLY doubt it).

    I could imagine something very close to the 680 desktop version if overclocked correctly for a while (maybe slightly less, that is).

    Edit: okay no, I might have been a little too optimistic there... but the power would be insane :D :D
     
  39. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
    I read today that the 7970m is soldered to the motherboard of the Clevo P170EM, meaning I can't change or upgrade the GPU in the future.
    Is this true? I thought the P170EM was an upgradable laptop.
     
  40. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Where did you read that..?

    No it is not..
     
  41. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
  42. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    That guy doesn't know anything, let alone what he's talking about.

    The 7970m is an MXM card like the Nvidia cards, so yes, you can upgrade to whatever you want anytime.
     
  43. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
  44. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Basically the above.

    Pathetic how smart people think they are once they get hold of an anonymous internet identity........ so sad, false information everywhere.
     
  45. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
    But that's not the P170EM. That's the one where you can put in dual cards and a desktop CPU. Those chassis are discontinued by Clevo, and yet they are still selling it.
     
  46. amirfoox

    amirfoox Notebook Evangelist

    Reputations:
    260
    Messages:
    626
    Likes Received:
    13
    Trophy Points:
    31
    I was referring to the 7970m card itself, not the Clevo, as was this 'jamwllms' in the link you provided, from what I could tell.

    Edit: I'm also undecided between a 680m and a 7970m. I currently pre-ordered a machine with a 680m, though.
     
  47. burnie22

    burnie22 Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Hi everyone... I was originally planning to buy an M17x R4 this week, but this got my interest: http://vr-zone.com/articles/msi-s-g...7970m...-and-trinity-a10-4600m-apu/16190.html
    I wanted to know if the AMD CPU will give any advantage over Intel's... I really like MSI, but I can't wait for the GT70 to get the new 680m... if I had a choice I would pick MSI over Alienware...

    Sent from my GT-I9100 using Tapatalk 2

    Nevermind... I've decided on the 17x.. :)

    Sent from my GT-I9100 using Tapatalk 2
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Pairing the 7970M with the A10-4600M is a total and utter epic fail. It will bottleneck the GPU and deliver crappy FPS in games where the CPU plays a big role, compared to the Sandy Bridge quads.

    In pure computational power the 4600M isn't even as good as the Sandy Bridge dual-core i5s.

    So sad to see MSI cheaping out, greatly hindering "what could have been" just to satisfy some AMD CEO.
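
    The bottleneck argument in one line: the frame rate is set by whichever of the CPU or GPU takes longer per frame. A toy model (the millisecond figures are hypothetical, purely to illustrate):

        def fps(cpu_ms, gpu_ms):
            # The slower stage paces the whole pipeline.
            return 1000.0 / max(cpu_ms, gpu_ms)

        print(fps(cpu_ms=25, gpu_ms=10))   # 40 fps: CPU-bound, the GPU idles
        print(fps(cpu_ms=10, gpu_ms=10))   # 100 fps: balanced pairing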
     
  49. x32993x

    x32993x Notebook Evangelist

    Reputations:
    124
    Messages:
    473
    Likes Received:
    0
    Trophy Points:
    0
    The A10 will only bottleneck benchmarks like 3DMark06.
     
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Anyhow, found this:

    Computex: MSI Showing Off Five G-series Gaming Notebooks | PC Perspective

    I wonder if they tested them with heavy tessellation or with 4/8xAA or something, since the 7970M scores 59.5FPS on normal tessellation with no AA enabled. It's not tessellation disabled with no AA selected, since we know the 680M scores 65.9FPS with that. It could be that the 4600M is dragging the performance down. Or that the 680M runs away once the visual settings are turned up. Shame it doesn't say which settings they used.

    EDIT: Searched around, and the forums say that the Heaven benchmark is almost completely a GPU benchmark and the CPU doesn't matter. And yes, I still think benchmarks matter very little.
     