The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    Just waiting for the Dual 680M SLI :D
     
  2. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,462
    Likes Received:
    12,847
    Trophy Points:
    931
    The 680 is stronger than the 7970, but... you have to go way, way beyond the normal call of duty to prove that fact. This goes for Vantage and 3DMark11.
    But again, it's just too much work to make that happen. The 7970 can jump out of the gate doing these things up to a point... then of course more in-depth means are needed to go farther.
    As of right now... they are not maxing out the 3GB cards in the desktop world, so 4GB might play out long before the next cards roll in (speculation of course).
     
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Considering that even at 3-monitor resolutions SLI 680s with 4GB see little to no benefit, I doubt even two 680Ms will at 1080p.
     
  4. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    I knew it. Nvidia must have thought "oh s...", the 7970M is 30% more powerful, so they did the same to get ~6000, but now it should be 100W. After all, Kepler is similar to AMD GCN in performance per watt.

    Imagine if AMD weren't around; Nvidia would be selling a crappy P4600-P4900 GTX 680M at a ridiculous price.
     
  5. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    It is not 100W, according to some tech heads in this forum who have studied the MXM card. At least that's what they guess.

     
  6. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    can you give me a link to this?
     
  7. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Magic has not yet supplanted physics.

    You need to give up on your dream. The 75W-rated GPU is already here, and it's called the GTX 660M. They are not going to double its performance at identical consumption.
     
  8. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    I was just about to say... you can't just choose to combine the picture of the early prototype with the supposed 6000+ score of the alleged newer version. That wouldn't make any sense, Cloud :)

    Sent from my GT-I9001 using Tapatalk 2
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yes it does make sense. What I've been trying to explain to you people is that the P4900 score is from the first 680M, NOT the one in the picture. So it might score P6000. :)

    @Kevin: They can make a 75-80W card if they want. We had the 75W 560M and the stronger 80W 570M earlier. Well, not this big a jump in performance, but hey, it might happen. Anyway, I would be satisfied with P4900 too :) Plus, an 80W 680M makes room for a 100W 685M ;)

    Anyway, the people who know their stuff think it's 75-85W. Let's just wait and see if they are right or wrong :)
     
  10. Vahlen

    Vahlen Notebook Evangelist

    Reputations:
    243
    Messages:
    323
    Likes Received:
    1
    Trophy Points:
    31
    Logic does not follow in the footsteps of your conclusions................

    Nvidia is not going to magically pull out a 75W mobile card that is 15-20+% faster than the 7970. I'm sorry, it is just not going to happen; even if they could, they would not be putting 4GB of RAM on it.

    We all want Nvidia to put a monster mobile GPU out, but looking at this from a non-biased logical perspective that just doesn't seem to be in the cards for Nvidia atm.
     
  11. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    Yes, they will probably release something that scores 4800-5200 and later this year release an updated model, like they did with the 480M and then the 485M.
     
  12. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    So there is a chance there is going to be a 685M and maybe an AMD rival 7990M later this year? :D
     
  13. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    Don't know... can say for sure though that something faster will be released at some point in the future :p

    It depends on whether it is needed. If the 480M hadn't been crap, there wouldn't have been a 485M. If the 6970M hadn't been slower than the 580M, arguably there wouldn't have been a 6990M, which offered similar performance to the 580M at a much cheaper price.
     
  14. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    You could bet on this and win a LOT of money. :D

    Kidding, but my real question is whether something better than the 7970M/680M is coming this year, because of course next year there is going to be something better.
     
  15. mahalsk

    mahalsk Notebook Consultant

    Reputations:
    262
    Messages:
    286
    Likes Received:
    6
    Trophy Points:
    31
    Hi guys, just found this on the pcspecialist.uk forum, which is a UK Sager reseller: "We are planning our Q3/Q4 schedule and, as many of you are aware, the 4GB Nvidia GeForce GTX 680M will be released during this period and we expect to receive stock from September onwards, according to the current schedule.

    We are seeing pricing that is approximately £180 more than the 2GB HD 7970M but according to benchmark performance tests it will only be around 10% faster than the 2GB Radeon HD 7970M"
     
  16. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    That's just ridiculous...
     
  17. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    4GB is just ridiculous. :D
     
  18. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Haha. At least it's going to be more powerful then.
    ETA: September and onward? Happy waiting, Nvidia fanboys. :D
     
  19. GaryO

    GaryO Notebook Geek

    Reputations:
    50
    Messages:
    87
    Likes Received:
    0
    Trophy Points:
    15
    So it should be around the same time that AMD sorts out their drivers for the 7970M, then ;) :D
     
  20. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    Seems their crazy marketing is working just fine! :rolleyes: And how is that extra RAM supposed to help performance??
     
  21. fenryr423

    fenryr423 Notebook Evangelist

    Reputations:
    29
    Messages:
    542
    Likes Received:
    0
    Trophy Points:
    30
    You nvidia fanboys are ridiculous... The drivers are not that bad at all... You really need some new ammo
     
  22. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    So let's say most people that want a 7970M have it by July 1; Nvidia fans will end up having to wait 2 to 3 months and pay more for a minimal performance increase.

    Yeah, I think I am happy that I bought my notebook now.
     
  23. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    Didn't the leaked picture and the P4.9k score go hand in hand? So how can that already be the revised GPU model?
     
  24. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    By not bottlenecking something like Crysis 2 + ultra-res texture pack + 4xAA/AF etc., which happens on 1.5GB and 2GB Nvidia cards that CAN run those visual settings. Not arguing, though, against the point that overspeccing on VRAM is rife, esp. on the low-end cards which don't have the grunt to run settings that would fill it half full.

    And Nvidia had a slightly faster card for much-more-than-slightly more $$$ with 580M vs 6970M, while Nvidia does it far more obviously - that's what both sides do most of the time when they have "the fastest".

    And can the AMD fanboys argue that AMD drivers *haven't* deserved their reputation?
     
  25. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The 6990M is only slightly behind the 580M, to the point where the better dual-card scaling of the 6990M puts it on par with or slightly ahead of 580M SLI.

    It's just that the 580M happens to have extra voltage options in the BIOS that let it pull ahead after tweaking.
     
  26. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    Maybe there is some new 23-inch gaming laptop with a 2560x1600 resolution upcoming and only Nvidia knows about it? :D
    Or maybe one with 2 or even 3 screens? lol
     
  27. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    TBH on 17 and 18 inch machines 2560x1600 would be great.

    However, RAM would still not be an issue.
     
  28. AhPek00

    AhPek00 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    0
    Trophy Points:
    5
    A notebook with 1600p resolution is just not possible at the moment, let alone a notebook with 2 or 3 screens :p
    Anyway, a 10% performance increase for £180? Pfft :eek: :rolleyes: :p :D
     
  29. fantabulicius

    fantabulicius Notebook Consultant

    Reputations:
    82
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
  30. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Remember also that, although people call it a "gimmick", Nvidia's own 3DTV Play and 3D Vision can be used with an HDMI output and a compatible HDMI 3D TV set.
    That requires a lot of VRAM. Still, I think 4GB is a bit overkill, but if the 680M turns out to have the same power as a desktop 670, which is only marginally inferior to the 680, the big VRAM would help games such as The Witcher and Crysis even in the near future, plus their 3D Vision feature. Also, Kepler can output to 3 monitors like Eyefinity, but at the same time, along with Nvidia Surround, you can output 3D as well. So it would be Nvidia 3D Surround.
    And that requires a lot of VRAM as well.
    As an example, 2 GTX 580s with at least 1.5GB of RAM are essential for an Nvidia Surround setup.
    Now that Kepler doesn't need SLI for multi-monitor setups, it still requires a good amount of RAM.
     
  31. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
    WOW! :D
    When are we gonna see Laptops with higher resolution than 1080p?

    @AhPek00:
    Just think of a laptop built like a Nintendo 3DS. :D

    @Kingpinzero:
    GTX 670 desktop performance would be GREAT!
     
  32. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Indeed it will. Even if Nvidia comes out with a GTX 660 Ti, which could be used as a base for our "probable" 680M, it would be enough for everything.
    It would mostly compete with the desktop 7870, which is what the 7970M is based on.
    The current state of things on desktops shows that Nvidia has two cards which compete with the AMD 7970: the GTX 680 and 670.
    I really hope they will come out with something decent.
     
  33. Vahlen

    Vahlen Notebook Evangelist

    Reputations:
    243
    Messages:
    323
    Likes Received:
    1
    Trophy Points:
    31
    It's not; Cloud is just, how do you say... floating amongst the "clouds" (chuckles).
     
  34. KCETech1

    KCETech1 Notebook Prophet

    Reputations:
    2,527
    Messages:
    4,112
    Likes Received:
    449
    Trophy Points:
    151
  35. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    2160p would be 4 times the pixels of 1080p.
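
    As a quick arithmetic check of that "4 times" claim, here is a minimal sketch in Python (assuming the standard 16:9 resolutions 1920x1080 and 3840x2160):

        # Pixel-count comparison: 3840x2160 ("2160p") vs 1920x1080 ("1080p")
        pixels_1080p = 1920 * 1080           # 2,073,600 pixels
        pixels_2160p = 3840 * 2160           # 8,294,400 pixels
        print(pixels_2160p / pixels_1080p)   # -> 4.0, i.e. exactly four times the pixels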
     
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I missed this post, Cloud... What? You put 1W in, you get 1W out; law of conservation of energy.
     
  37. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    But not all of it is heat. So what Cloud is saying about a power/heat constant is plausible.
     
  38. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    All of it IS heat. There is no macro level potential, kinetic, chemical, or nuclear energy created. The energy is completely dissipated in resistive heating as currents travel through traces and vias.
     
  39. evoandroidevo

    evoandroidevo Notebook Evangelist

    Reputations:
    0
    Messages:
    321
    Likes Received:
    0
    Trophy Points:
    30
    Ummm are they really going to make this o_O would be epic
     
  40. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    No they didn't. It's all the sites, which are slow and didn't pay attention to what was happening in the Chinese scene. I made the whole Kepler thread; I found a bunch of stuff. Believe me, I followed everything.

    The P4900 score was leaked in early April on a Chinese forum. P4600 was the stock score, P4900 was the overclock score. They couldn't get a GPU-Z screenshot from it, but they noted that it was reported as GK106. Look up the "HURRAY, 600 series not only Fermi" and the M17xR4 speculation threads. I posted the scores there.

    Then the 7970M was tested in this forum in late April followed by the official launch.

    Not until early May was the photo of a guy holding the MXM 680M card posted on the internet. It was someone in the barebone MSI community that got hold of the MXM card. They reported it as a revised 680M, and it was now a GK104 680M. And the score I posted yesterday comes from a Chinese forum where someone has seen the GK104 680M score; it apparently broke P6000 in 3DMark11. Tech people in this forum have studied the MXM card and think it's 75-85W.

    So the P4900 and the MXM picture you see in all the 680M articles right now are not related, at all. They are mixing a GK106 score with a GK104 picture.

    This is what happened, and it's a complete mess since rumor after rumor has followed mobile Kepler. It's all guesswork, based on very little information.
    ----------------------------

    And no, I'm not doing the whole heat/watt discussion again. We had that a month ago and I'm done with it.
     
  41. KCETech1

    KCETech1 Notebook Prophet

    Reputations:
    2,527
    Messages:
    4,112
    Likes Received:
    449
    Trophy Points:
    151
    Chances are the 4K resolutions would come into play: 3840x2160 or 4096x2160, as we're already editing video at these resolutions today and there are lots of cameras and projectors that use them; it's just now coming into consumer displays, though.

    http://en.wikipedia.org/wiki/4K_resolution
     
  42. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Uh, no. Power (e.g. Watts) = Resistance * Current ^2. By virtue of being a closed circuit, there has to be a current flowing out of the CPU (otherwise the CPU wouldn't have power because there would be no current flowing through it, i.e. an open circuit). And by virtue of the fact that CPUs/traces/wires/etc. are not made out of superconductors, there will be resistance. Therefore there is power leaving the CPU. So what you're saying is that the CPU turns all the input energy into heat, and then magically creates more energy to output. Which is clearly untrue.

    The obvious explanation is that the CPU does not "use" all of the energy. Just like an electric motor does not turn all of the input energy into mechanical energy, a CPU does not turn all of the input energy into heat. The heat output will be slightly less than the power going in minus the power going out. And since there will always be power going out (by virtue of being a closed circuit, as explained above), the heat output will never be equal to the power going in.

    Also, even if all the energy going into the CPU did get transformed into some other type of energy, there would be some kinetic energy, manifested as the physical degradation of the CPU. Also, there would be a small amount of EM radiation and sound. So either way, not all of the energy gets turned into heat.

    Well, that's as far as I figure it at least.
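
    For reference, the relation being invoked above is Joule heating: P = I^2 * R, which with Ohm's law (V = I * R) is the same as P = V * I. A minimal worked sketch in Python, with purely illustrative numbers (not measurements of any real GPU or CPU):

        # Joule (resistive) heating in a single trace, illustrative values only
        current_a = 5.0        # assumed current through the trace, in amperes
        resistance_ohm = 0.02  # assumed trace resistance, in ohms
        power_w = current_a ** 2 * resistance_ohm   # P = I^2 * R
        print(power_w)         # -> 0.5 W dissipated as heat in that trace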
     
  43. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    Yes - after four years of education in electrical engineering with an emphasis on high speed circuits, I am fairly well aware of the closed loop nature of circuits.

    Power is at the same time defined by Voltage * Current (assuming zero phase difference between the two). The port at which current leaves the circuit has no energy with respect to the ground of the circuit. That is not to say that you cannot extract more energy if you connect that to a more negative potential, which will indeed cause more energy conversions. When a manufacturer says a chip consumes 100W, it does not mean that it will somehow have 100W pass through it and then return an amount to the network. That's simply how electrical power works.

    Essentially all of the power loss in a chip is due to I^2R heating, as you mentioned. I noted specifically that there is no MACRO kinetic energy (coherent, unidirectional motion of large masses). Random motion by individual atoms in a lattice structure such as that of silicon is characterized as thermal energy. Temperature is a function of the average kinetic energy possessed by the particles.

    Furthermore, electrons traveling in a circuit are not like water rushing out of a pipe. There are two types of currents involved in silicon - diffusion and drift currents. In both cases, at the crystal lattice level the electrons do not in any way travel straight. If you look up the figures of diffusion coefficients or electron mobility and factor in the typical carrier concentration differential and the electric fields established, the net, directional velocities of electrons are on the order of at most centimeters per second. This is because electrons bounce back and forth within the structure, and the NET motion (e.g. 1 cm forward, 0.999 cm backward, 1 cm forward, 0.999 cm backward) is the "current" flow. It's not like 1 kg of water rushes out of a pipe at 1 m/s possessing 0.5 J of macro-level kinetic energy.

    Lastly - yes, the device will cause time-varying EM fields due to switching currents. However - not all transistors switch at the same time, and not all currents flow in the same direction. The net EM field emitted by the chip is usually low in magnitude, as sub-components cancel. As well, in order for an actually significant amount of energy to be transmitted to the environment through electromagnetic coupling, you'd need a certain amount of inductive/capacitive effects between the chip and the environment, which you don't have.
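
    To put a rough number on how slow that net electron motion is, here is a minimal back-of-the-envelope sketch using the drift-velocity relation v_d = I / (n * q * A). The numbers below are assumed for illustration (a plain copper conductor rather than the silicon discussed above), not figures from the post:

        # Electron drift velocity v_d = I / (n * q * A)
        I = 1.0        # assumed current, in amperes
        n = 8.5e28     # free-electron density of copper, per m^3
        q = 1.602e-19  # elementary charge, in coulombs
        A = 1.0e-6     # assumed cross-section of 1 mm^2, in m^2
        v_d = I / (n * q * A)
        print(v_d)     # ~7e-5 m/s, i.e. well under a millimeter per second of net motion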
     
  44. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    I think you misunderstood my initial statement. My statement was that not all of the power going in will get turned into heat. What you seem to be saying is that the power that does get consumed gets turned into heat.

    So using the previous example, if you put 1w in, you will get 1w out in some combination of energy, but not all of the 1w will be heat.
     
  45. Chazy90

    Chazy90 Notebook Evangelist

    Reputations:
    318
    Messages:
    409
    Likes Received:
    1
    Trophy Points:
    31
  46. WaR

    WaR Notebook Virtuoso

    Reputations:
    2,391
    Messages:
    2,128
    Likes Received:
    0
    Trophy Points:
    55
    R3d and min2209, I'm just gonna go ahead and give you two some rep. Nice round you had going there. :cool:
     
  47. KCETech1

    KCETech1 Notebook Prophet

    Reputations:
    2,527
    Messages:
    4,112
    Likes Received:
    449
    Trophy Points:
    151
    The W700ds may be funny, but that unit was a dream for content creators; too bad it never caught on more.
    We still have one at work.
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  49. Zalgradis

    Zalgradis Notebook Consultant

    Reputations:
    45
    Messages:
    169
    Likes Received:
    0
    Trophy Points:
    30
    It is worth it for those of us whom I would not necessarily label "Nvidia fanboys", but who have come to consider the level of compatibility and efficiency of Nvidia drivers as the standard.

    I have nothing against ATI and recognise that the 5870M and 6990M trumped their Nvidia counterparts in terms of price:performance ratio; however, poor driver support (relatively speaking) and the lack of proper vsync and triple buffering support via CCC always led me to opt for the Nvidia counterpart.

    Again, as I currently own a 580M system, although I have been salivating over the 7970M for a good few months now, I would gladly wait until September to pick up a card that may only be 10% faster at a 50% price increase (a 10% overall system price increase for me), but also provides me with peace of mind.
     
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Well, you are not alone. There are many, many people who buy Nvidia year after year because they are satisfied with the driver support, Optimus, stability with their GPUs and the extra performance over AMD. It's really bad though, because it sort of becomes a habit after a while. It's like with a BIOS update: why change when you have been satisfied with Nvidia for so many years? (For those who didn't catch what I meant: they say you shouldn't update the BIOS when there is nothing wrong with your machine.)

    Plus, not everyone (like myself) cares too much about price. At least not $170. That's not even half an SSD.
    But I do really understand those on the opposite end: why pay extra for Nvidia when you are satisfied with AMD?
     