The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    The GK110 (the true 680) will be the final nail in Tahiti's coffin.
     
  2. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    The GTX 685 isn't expected until the September-October time frame. I wonder if AMD will have anything at all to answer it with.
     
  3. lastshady

    lastshady Notebook Consultant

    Reputations:
    0
    Messages:
    100
    Likes Received:
    5
    Trophy Points:
    31
    Has anyone tried the 7970m in an R2 yet?
     
  4. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    flingin is getting two soon. He will be the first one to put them in the R2.
     
  5. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Given the launch, we will probably see the new 8970.
     
  6. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    I believe it's being expedited.
     
  7. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Really? And you heard this where? Also, what does that have to do with the 680M? There is zero chance of such a chip being any good in a notebook.
     
  8. Torment78

    Torment78 Notebook Evangelist

    Reputations:
    261
    Messages:
    555
    Likes Received:
    0
    Trophy Points:
    30
    I just want the f-ing 680m to come out so I can find out if it's going to be compatible with the NP7280/X7200.
     
  9. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    I said "believe" for various reasons. It's what my intuition tells me and the various strategies nVIDIA has been known to pull. It's also based on the fact that nVIDIA is now getting MORE supply of TMSC wafers?

    AND

    I'm not part of the pseudo-engineering group that believes 1344 CUDA cores is going into a 680M.
     
  10. develoso

    develoso Notebook Enthusiast

    Reputations:
    36
    Messages:
    47
    Likes Received:
    0
    Trophy Points:
    15
    Really, I think speculation here is fine. The thread is named AMD 7970m vs GTX 680m. Nobody knows s**t about the 680m, so ppl can just speculate. It's fine.

    I really believe Nvidia will come up with a faster GTX 680m. It would really look bad for them if they fail to do so. Imagine, months of waiting for a 680m weaker than a 7970m? No sense in that (not discussing how much faster, nor the price).

    And tbh, I do wanna see a powerful GTX 680m, as powerful as they can make it. It puts pressure on the market and makes technology develop more and more. We win. Always.

    Again, I do think the GTX 680m will be faster than the 7970m, but IMO, if Nvidia takes too long to release it, it's already AMD's win.

    We shall see what the 15th brings forth.
     
  11. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Nothing is wrong with speculation. If it were a car 1 vs car 2 thread and someone jumped in saying car 2 will be faster than the speed of light, people would laugh at him.

    The problem is that many people here, even after being shown with facts how false their arguments are, keep spamming the very same false things about the 680m.

    Or you get the dude above you saying 1344 CUDA cores is not possible, as if he knows it. Kepler's low-end 640m DDR3 has as many cores as the high-end GTX 580m; it's a different architecture, and Nvidia can keep that count or reduce it. No one cares, and that was never the argument. But coming here, posting lame stuff, and saying "lol 1344 cores" just shows how bankrupt they are when it comes to adding anything useful to the thread besides 12-year-old trolling.

    We've got kiddies who quote people and, instead of responding with anything that makes sense, respond with "cry me a river" or other garbage nonsense. Believe it or not, some of us check the thread looking for interesting posts, not for stupidity, because we are not 12.
     
  12. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    That story has been around for only a few days and it's a pile of s*** according to some.

    It's possible but it would likely have a low clock speed.

    So far, looking at released competing (and comparable) parts, I see this:
    GK107 / Cape Verde: 384/512 cores, targeted at the same segments => not hard to believe their TDPs would be similar. Stock frequencies are in the same ballpark.
    So, 33% more cores for AMD at comparable clock speeds for similar TDP and performance. It's even said they have a slight edge in that regard, but there is still much speculation.

    I find it hard to believe that a scaling up of both chips would yield significantly different results. Therefore I doubt nvidia could turn a chip with a higher core count (1344 / 1280) into a mobile GPU with clock speed & TDP similar to its counterpart.
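    As a quick illustration of that argument, here is a back-of-the-envelope sketch in Python. The clocks and the 1344-core figure are assumptions pulled from the speculation in this thread, not confirmed specs, and per-core differences between the architectures are ignored, exactly as in the reasoning above.

        # Back-of-the-envelope sketch of the core-count argument above.
        # Clock values are assumptions for illustration, not confirmed specs,
        # and per-core differences between the architectures are ignored.

        def throughput(cores, clock_mhz):
            """Raw shader-throughput proxy: cores * clock."""
            return cores * clock_mhz

        # Entry-level parts compared above (stock clocks in the same ballpark)
        print(512 / 384)                         # ~1.33 -> "33% more cores for AMD"

        # Hypothetical 100 W mobile parts
        hd7970m = throughput(1280, 850)          # 7970M: 1280 cores @ 850 MHz
        gtx680m_rumor = throughput(1344, 700)    # rumored 1344 cores, assumed clock
        print(gtx680m_rumor / hd7970m)           # ~0.86 on these assumed numbers

    On those assumed numbers, a 1344-core mobile Kepler would need clocks close to the 7970M's 850 MHz to pull clearly ahead, which is exactly the clock-and-TDP question being raised here.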
     
  13. develoso

    develoso Notebook Enthusiast

    Reputations:
    36
    Messages:
    47
    Likes Received:
    0
    Trophy Points:
    15
    I just think ppl should calm down...

    . . .
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I've been thinking a little, and if I can speculate freely, this might happen:

    A) Nvidia makes the GTX 680M out of the GTX 670, releases it in July, and calls it a day
    or
    B) Nvidia waits for the GTX 660 Ti to come out and doesn't release the GTX 680M until September, leaving AMD alone on the mobile market for many months. Nvidia doesn't follow what they did with the 500 series, aka downclocking a 170W GPU, but instead does it with a 140-150W GPU
    or
    C) Nvidia makes the GTX 685M out of the GTX 670 and releases it in July, but also makes the GTX 680M out of the GTX 660 Ti and releases it in September, like someone in this thread has heard. With this they would have two new Kepler GPUs to offer:
    One that is the truly high-end Kepler in the mobile department, surpasses the 7970M by a good amount, and matches its power consumption of 100W. And another that is the true successor of the GTX 570M, scores P4600, and is around 75W.
    This would be the ultimate alternative in my opinion :)
     
  15. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    Well, I am not waiting who knows how many months; I am buying my high-end notebook now. So AMD wins.
     
  16. vilmeister

    vilmeister Notebook Enthusiast

    Reputations:
    0
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    AMD wins, GG.
     
  17. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Yes, AMD wins now, nVIDIA wins later. Perhaps a revised Tahiti will stand a chance against a 685M.
     
  18. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    A hypothetical 685M.
     
  19. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Yes. It's all dependent on nVIDIA's yields.
     
  20. p1n0yBaLLeR

    p1n0yBaLLeR Notebook Consultant

    Reputations:
    67
    Messages:
    260
    Likes Received:
    0
    Trophy Points:
    0
    I'll buy the 7970m now rather than wait for the 680m, which is most likely just as fast and twice as expensive as the 7970m. We all know that Nvidia is not an overachiever like AMD.
     
  21. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Well it all started with the HD 3870. Low pricing is genius.
     
  22. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    No it won't. Because:

    1/ Tahiti sucks in efficiency compared with Pitcairn. They'd already have to significantly increase the TDP just to be on par with the 7970M. 100W is already quite high; what next, 130W? Good luck fitting that in current designs, especially anything under 17". Also, the chips would be larger, more complex, and therefore more expensive. This would make no sense at all.

    2/ Be that as it may, if nvidia does indeed outperform AMD's current offering, the difference will most likely be marginal and won't come close to warranting a new mobile GPU before their next architecture upgrade. AMD's 7970M performs just where it should at 100W and is probably quite cheap to manufacture too. I see no reason to think nvidia could get significantly better results in the same TDP envelope, as good as Kepler is.
     
  23. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    How about benchmarks for games on the 7970m?
     
  24. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    That's why I use the word revised. ;)

    I also don't believe the 7970M uses 100W except in FurMark.
     
  25. rorkas

    rorkas Notebook Consultant

    Reputations:
    118
    Messages:
    280
    Likes Received:
    0
    Trophy Points:
    30
    This situation reminds me of a bunch of hungry people looking for food. They have found a nice meal and are about to eat it, when a stranger comes along and tells them he has very reliable (but unconfirmed) information about a different kind of food coming if they wait a little bit longer (or possibly much longer). It may be better (or may not), but most likely there will be less of it (it will be more expensive). What would a hungry man do?
     
  26. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Wow, your knowledge of TSMC's process and Nvidia's transistor design is pretty extraordinary.

    GK110 is not getting "pulled forward"; they can't just rush out such a massive chip that's been in development for 4-5 years.

    They had to cancel GK100 because they could not make it, for crying out loud, so what makes you think they even could if they wanted to? But yeah, let's rush it and get 0.5-1% yields, that sounds like a great option.
     
  27. Nick

    Nick Professor Carnista

    Reputations:
    3,870
    Messages:
    4,089
    Likes Received:
    644
    Trophy Points:
    181
    In this case, we aren't starving, just interested in a meal soon. We know that the different food is coming, just not if it will be better or the same as the current food.
     
  28. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    That's why there's a question mark. You guys and your reading comprehension :rolleyes:. I have other reasons, but this thread is about the 680M/7970M.
     
  29. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    Can we get more 7970m gaming benchmarks?

    Both at stock and OC?

    At least then we'd have something to chew on.
     
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    You're the one bringing up GK110, which will have no impact on the mobile market. You keep offering these vague reasons without backing them up with evidence. So rather than saying you have a belief, you seem to have faith that they are going to do something.
     
  31. dd_wizard

    dd_wizard Newbie

    Reputations:
    23
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    I've been lurking in this thread for a while, and thought it might be time to weigh in. I think Cloudfire is on the right track, but he may well have it backwards. I just checked Newegg for 600 series graphics cards, and got the following results:

    1. Nineteen different 680 cards, all out of stock.

    2. One 690 card, out of stock.

    3. Four different 670 cards, all in stock.

    Apparently, the 1536 bin is empty, but the 1344 bin isn't. I'm guessing the 1152 bin is getting pretty full. There's really no reason to wait for the 660 or 660 Ti to launch, and there's a very good reason to get a 680m out the door. It wouldn't surprise me to see the 680m launch with 1152 cores in July, with the 685m following in September with 1344 cores.

    This round reminds me a lot of the GF100 chip. Many assume it was TDP that limited the initial offerings to 448 cores, but it may well have been yield problems. It certainly looks like Nvidia is finding it very hard to bin fully functional GK104 parts.

    Also, the GK110 will never be a mobile chip for the same reasons Tahiti won't. All those extra transistors needed for GPGPU functionality will make it almost impossible to build a high graphics performance, 100W chip. There is quite a bit of speculation that the GK110 may end up in Quadro and Tesla cards only. This will allow Nvidia to fully differentiate their consumer and professional lines based on hardware, rather than artificial crippling of DP performance.

    dd_wizard
     
  32. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    GK110 will be a GPU with a huge die, like the GTX 580 was, with similar power consumption, and I don't think they will release it before AMD is out with the 8000 series.

    @dd_wizard: That is certainly a very plausible scenario too, although I can't remember Nvidia ever releasing the mobile version first and the desktop version after?

    GTX 670s are available all over the place, and you're right, 1152-core parts are most certainly plentiful too. That makes both of them good candidates for the next 680M. It makes you wonder whether Nvidia goes for the kill now and releases the 1344-core part first and then the 1152, or the other way around like you say. But I do wonder: what AMD card are they going to combat with the 1344-core part in September, if it happens in that order? The 7970M is already 100W, so there is no more room there. AMD can always improve their architecture and introduce the 8000 series, but that isn't out until 2013 as far as I know?

    Nevertheless, two possible scenarios now. Interesting :)
     
  33. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    You responded to me when I responded to someone talking about the 685. I didn't bring anything up.

    With the immense shortage of 690s, the 670 using the 680 PCB (not to mention OC models based on the 670 outperforming 680s), and the 7970 GHz Edition coming, nVIDIA has an opportunity to beat ATI this summer by releasing some form of card with the number 685 on it, okay?
     
  34. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    It won't be this summer; they have no reason to rush it.

    I don't think GK110 will offer any significant gaming performance advantage over the GK114.
     
  35. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    Well, I still want to know what they have to say on May 15.
     
  36. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Just like there was no reason to release a 670 so close to 680 performance? There has to be a strategy behind it.

    No one will care by then; everyone will be playing Diablo 3 -- LOL.
     
  37. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Edit: double post.
     
  38. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    But anyway, from now on we can actually say that laptops are viable for high-end gaming (minus multiple monitors and 4K resolutions). Having 7870- and possibly 670-level GPUs in a notebook is awesome.
     
  39. JupiterW

    JupiterW Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    The GCN architecture is really good for laptops!
     
  40. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    Why are there no gaming benchmarks for the 7970m?
     
  41. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    There is no reason for the GTX 680 to exist at all. It's a useless GPU now that they've released a cheaper, similarly performing GPU.

    AMD can easily fight back with another GPU of their own. They can apply what they learned with Pitcairn and balance a higher-end GPU. But they just don't need to. There is hardly any competition when the competitor has just one available product. The bulk of sales comes from the mid to lower end.

    The same applies to the mobile market. While we love our high-end GPUs, most sales come from the middle ground, which currently seems to favor nvidia.

    In the end, I do think the GTX 680m, or whatever ends up being the highest-end nvidia mobile GPU, can be based on the GTX 660 Ti and still be 10-15% above the HD 7970m at stock. Considering what the GTX 670 can do, I wouldn't be surprised if the GTX 660 Ti is 100W and 20% slower than the GTX 670.
     
  42. misterhobbs

    misterhobbs Notebook Evangelist

    Reputations:
    715
    Messages:
    591
    Likes Received:
    9
    Trophy Points:
    31
    Slickdude did some and probably a few others. They were either posted earlier in this thread or in the Alienware threads.
     
  43. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    Show me the link, please.
     
  44. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    Scaling in 3DMark isn't as good as it is in actual games though. SLI seems to give about a 50-60% higher score in 3DMark Vantage, but 75-100% higher fps in actual games.

    GeForce GTX 570 SLI review
    http://images.anandtech.com/graphs/graph4897/41440.png
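    For what it's worth, here is a tiny Python sketch of what that scaling gap means in practice. The percentages are just the rough ranges quoted above, and the 60 fps starting point is an arbitrary example, not a measurement.

        # Projected multi-GPU results from a single-GPU baseline.
        # Scaling percentages are the rough ranges quoted in this post.

        def scaled(single_gpu_result, scaling_pct):
            """Projected SLI/CF result for a given scaling percentage."""
            return single_gpu_result * (1 + scaling_pct / 100)

        single_fps = 60.0                      # arbitrary single-GPU example
        print(scaled(single_fps, 55))          # ~93  -> 3DMark-Vantage-like scaling
        print(scaled(single_fps, 85))          # ~111 -> typical in-game scaling

    In other words, the Vantage numbers understate how much a second GPU helps in actual games.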
     
  45. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
  46. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    Well, the 6990M scales really well in crossfire so the "actual" Vantage score for the 6990M CF should be about 27000 vs the 7970M's 22000(?). I guess it would be about 75% of the 6990M CF in games. Overclocked, it should be very close or maybe even better in DX11 and tessellation heavy games.

    BTW, your avatar is my Xbox 360 profile pic which I unlocked by beating Ninja Gaiden II on Master Ninja. Hardest thing in video games I've ever done.
     
  47. fantomasz

    fantomasz Notebook Deity

    Reputations:
    147
    Messages:
    1,113
    Likes Received:
    0
    Trophy Points:
    55
    I want to purchase a Clevo 170EM but don't know what to do. Get the 675 now or wait for the 7970? I found this pic online but don't know if it's true.


    [IMG]
     
  48. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
  49. p1n0yBaLLeR

    p1n0yBaLLeR Notebook Consultant

    Reputations:
    67
    Messages:
    260
    Likes Received:
    0
    Trophy Points:
    0
    I should have it in the mail within the next two weeks. The first thing I'll do is play BF3 and see how many fps I get on fully maxed settings. I doubt it'll be more than 40 fps... but I'm hoping the benchmark above isn't lying.
     
  50. p1n0yBaLLeR

    p1n0yBaLLeR Notebook Consultant

    Reputations:
    67
    Messages:
    260
    Likes Received:
    0
    Trophy Points:
    0