The Notebook Review forums were hosted by TechTarget, who shut down them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    what are those extra features?
     
  2. Roken911

    Roken911 Notebook Consultant

    Reputations:
    51
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    30
    What extra features?

    Features that are proprietary such as CUDA?

    Just so you know, both companies copy each other... just like everyone else.
     
  3. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    yoda is talking about PhysX, CUDA and 3D Vision. All proprietary, and all in danger of being made extinct by open-source tech... well, maybe not 3D Vision, but most people don't like the FPS hit of 3D, especially on a laptop
     
  4. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    PhysX is pretty much dead, nVidia is starting to delegate CUDA/GPGPU stuff to its Quadro line (see how the GTX 680 performs against the 7970 in compute and GPGPU applications), and while nVidia's 3D tech is much more complete than AMD's solution, it still has to compete against standards proposed by Sony, Samsung, Panasonic, etc., all of which have different ways of providing 3D imaging. I doubt nVidia's method would beat out something backed by the likes of Sony or Samsung.
     
  5. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    thought as much, but those are a non-issue for me; thus, not an advantage
     
  6. Roken911

    Roken911 Notebook Consultant

    Reputations:
    51
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    30
    3d is stupid. I personally don't care for 3d. If I wanted things coming out at me I'd put my face in front of a pitching machine.

    This 3D fad will die soon enough.
     
  7. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    ok, that made me laugh out loud
     
  8. develoso

    develoso Notebook Enthusiast

    Reputations:
    36
    Messages:
    47
    Likes Received:
    0
    Trophy Points:
    15
    1st - The question was about the cards having the same price. We don't know how the GTX 680M will perform, so we cannot think of them as similar cards. We just don't know.

    2nd - Like you said, I also believe "Nvidia will hold back the 680m as long as needed to release a comparable product." The problem is, how long will they hold it back? Is it worth waiting? Can you wait? Is it worth waiting months and months for a card that will possibly be just 5%-10% more powerful and possibly more expensive, just for the sake of benchmarks? (Because if your concern is gaming, the 7970M is already a monster.)

    Like I said: today, depending on what you want the GPU for, Nvidia might offer advantages with the CUDA cores, but that is set to change.

    PhysX - You can count the games that use it on your fingers.
    CUDA - If your concern is gaming, it won't make a difference.
    3D Vision - Not enough knowledge to talk about it, but I believe most people won't use it, and for those who will, AMD's solution might be just as good for their real needs.
     
  9. Andycinoz

    Andycinoz Notebook Consultant

    Reputations:
    36
    Messages:
    131
    Likes Received:
    11
    Trophy Points:
    31
    A randomly thought-up pricing question, but if they have had to basically scrap the original plan for the 680M, respin the silicon, and start from scratch, will nVidia NEED to pass some of that cost on to the customer?

    I'm guessing that they might, as I'd imagine it's pretty expensive. I know they're known for being an expensive company anyway; might this mean they'll end up even less competitive on price?
     
  10. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    No, because it's not as if Nvidia is busting out brand new, custom silicon for each mobile card. All they would've done in "scrapping" the original 680M was change which desktop core it came from, then adjust accordingly.

    If it's ultra expensive, it'll be for the same reason it always has been, which is because the people making the decisions are as-.. er, I mean jerks.
     
  11. Zalgradis

    Zalgradis Notebook Consultant

    Reputations:
    45
    Messages:
    169
    Likes Received:
    0
    Trophy Points:
    30
    The only reason I opt for Nvidia over AMD at the moment is that I'm one of those guys who detests horizontal tearing with a passion.

    The ONLY game I have found the Nvidia control panel's force-vsync option to fail with is Starcraft 2... which has native vsync support, so no harm done.

    On the other hand, I am yet to find a game in which AMD CCC's force vsync actually works... OH DEAR.

    With my last AMD GPU, the 5870M, my saving grace was D3DOverrider. However, AMD GPU owners have been complaining that D3DOverrider does not work properly, if at all, with Windows 7 SP1 for some reason, and with no other decent method of forcing vsync in games with no proper/working native vsync support (Metro 2033, Dead Space 1 and 2 [native vsync capped at 30 fps, HURRR], Silent Hill 3, Gears of War, Crysis, and many others) I feel a little anxious about purchasing a 7970M :(

    However, if any AMD owners can confirm D3DOverrider works in Win 7 SP1, or have a viable alternative, I would be over the moon, as I'm ready to buy a 7970M right now o_O;

    Cue Slickdude: "Dan just wait!" :D
     
  12. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I've been using D3DO with Windows 7 since the Beta, and I'm still using it today with SP1, in every game I play. Zero issues.
     
  13. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Seems like the GTX 670 might end up as a notebook GPU after all, if Nvidia wants to go that route.

    The GTX 670 has about the same power consumption as the 7870, but completely destroys it.
    GG 7970M. It was nice to see you. :p

    Review of 670:
    NVIDIA GeForce GTX 670 2 GB Review | techPowerUp

     
  14. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Desktop 670 > Desktop 7970.
    The 670 is a great card, there is no doubt about that.

    In AnandTech - NVIDIA GeForce GTX 670 Review Feat. EVGA: Bringing GK104 Down To $400

    In all the popular games the 670 did better than the 7970 (Skyrim, Batman, Battlefield, etc.)

    The 670 is in fact better than the 680 once you overclock it (overclocked 670 > overclocked 680),
    and at stock speeds there is no "real" difference between the 670 and 680.

    AMD needs to drop the 7970's price to $399 and the 7950's to $299, otherwise they are getting slaughtered.
     
  15. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yeah, it's so sick. And the best part is that it draws less power (under 50W less!!!) than the GTX 560 Ti, which the GTX 580M was based on. But the 670 is 100x better. Man, this could really happen :cool:
     
  16. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Now to see if Nvidia will actually implement it, and when, and for how much :-? If it's reasonable enough (~$600-700), fine; otherwise I would sit this one out :p
     
  17. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    I think you should take a look at average and maximum power consumption. That should be closest to the actual cooling needed, if we draw a parallel to notebook cards.

    Versus the HD 7870 it uses 41W more on average and 18W more at maximum.
    Looks like a notebook card can be made from this. The question is how much the power draw drops as the clocks are cut. :rolleyes:

    Also note that performance per watt is still 10-18% behind the HD 7870. So don't expect a P7000 miracle here. You will be disappointed.

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_670/29.html
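[Editor's note: the perf-per-watt gap claimed above can be sanity-checked in a few lines. The 41W average delta and 115W figure are from the thread; the GTX 670's relative performance (~25% faster) is an assumed placeholder for illustration, not a measured value.]

```python
# Rough perf-per-watt comparison using the deltas quoted in the thread.
# The relative-performance figure is an assumption, not review data.

def perf_per_watt(relative_perf, watts):
    """Performance points per watt of power draw."""
    return relative_perf / watts

# HD 7870 normalized to 100 performance at 115 W average draw;
# GTX 670 assumed ~25% faster, drawing 41 W more on average (per the post).
hd7870 = perf_per_watt(100, 115)
gtx670 = perf_per_watt(125, 115 + 41)

deficit = 1 - gtx670 / hd7870  # fraction the 670 trails in perf/W
print(f"GTX 670 perf/W deficit vs HD 7870: {deficit:.0%}")
```

With these assumed numbers the deficit lands below the 10-18% quoted, which is why the assumed performance figure matters so much to the argument.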
     
  18. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Nvidia is also struggling to get cores at the moment.

    We will have to wait and see what they can actually produce.
     
  19. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Performance per watt is not everything; just look at the same chart and compare the 670 vs the 7970.
    The 670 draws more than 100W less at maximum power consumption, yet it outperforms the 7970 in most games. The 670 is basically a 680 with 5% less power; in some cases the shipped SC version performs better than the 680.

    Anyone who downplays the 670's performance per watt basically loses any credibility.

    If the 680M is a downclocked 670, it will outperform the 7970M easily and will reach P7000, there is no question about it. The real question is by how much and when; if Nvidia takes too long, AMD will have the 8970M out by then.
     
  20. Zalgradis

    Zalgradis Notebook Consultant

    Reputations:
    45
    Messages:
    169
    Likes Received:
    0
    Trophy Points:
    30
    That's good to hear, thank you.
     
  21. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30

    You compare a GPGPU monster card to Kepler? Seriously? You may do that on any desktop forum, but here we are trying to speculate on upcoming GTX 680M performance vs Pitcairn XT. The HD 7970 is a Tahiti core and has absolutely nothing to do with Pitcairn vs Kepler performance on notebooks.
    It's like comparing a Boeing 747 to an F-18, if that example makes more sense to you...
     
  22. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    I've been looking up Tahiti's performance vs. Kepler and it explains a lot. nVIDIA's desktop 680 performs worse than its desktop 580 in some cases.

    Benchmark Results: Sandra 2012 And LuxMark 2.0 : GeForce GTX 690 Review: Testing Nvidia's Sexiest Graphics Card

    GPGPU Monster? More like God.

    Imagine the results in quad CrossFire, all GPUs @ 1.2GHz.
     
  23. KernalPanic

    KernalPanic White Knight

    Reputations:
    2,125
    Messages:
    1,934
    Likes Received:
    130
    Trophy Points:
    81
    Take a step back. You are misinterpreting things here.

    His point is that the desktop 670 performs well above the desktop 7870, which is the basis of the 7970M.

    The 670 desktop might very well fit into the mobile arena much like the desktop 7870 did (indeed, the data shows the two are similar in actual power consumption).

    Yes, much like the desktop 7870 was cut by 15% in clock rate, the desktop 670 will need some cuts to make the power cutoff... but from the looks of things it could theoretically work.

    This speculation is well within the topic.
     
  24. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Off-topic subjects ≠ interpretation

    And back on-topic, I doubt the 680M will have 13xx CUDA cores.


    It would need a beast cooler.
     
  25. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    The 7970M already runs cooler than it needs to, and the 670 runs cool, especially compared to the previous generation. Heat will be the least of their problems once they get the GPU under 100 watts.
     
  26. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    The 7970M is not a 670. Look at the graph and extrapolate it to mobile form. GK104 runs hotter than Pitcairn. The 560 Ti (580M) is not GF110 either; it's GF114, which runs cooler.
     
  27. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Are you reading another thread or something? No one said the 7970M is a 670.

    The 580M is based on the 560 Ti, which runs hotter, uses more watts, and is weaker in performance compared to the 670.

    The Metro chart does not show everything:
    http://images.anandtech.com/graphs/graph4135/35200.png
    http://images.anandtech.com/graphs/graph5818/46462.png

    Under max load the 670 runs as cool as or cooler than the 560 Ti, and both of those charts are for the Nvidia reference design.
    You are ignoring the move to 28nm.

    The reality is Nvidia cut a lot of power consumption from the 680 to the 670 and lost almost no performance.
     
  28. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    zero989, they are on about the desktop version being made for the laptop.
     
  29. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    so, to sum up all available data so far, it's definitely in the realm of possibility that the 680m will be based on the desktop 670. that would indeed be amazing and really pretty much destroy the 7970m. on the other hand, nvidia is not really left with anything else to do, since they're so far behind AMD in their timetable. considering the awesome OC-ability of the 7970m, one could speculate on an OCed version with a "7990m" stamp on it, thus closing the gap to nvidia without robbing them of their obligatory 5% perf. advantage / 50% higher price disadvantage :p

    just my 2 €cents ;)

    Sent from my GT-I9001 using Tapatalk 2
     
  30. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    I have no doubt that they CAN make a GTX 680M from the GTX 670.
    BUT. Don't be naive here.
    They need to cut clocks/voltage a lot, because the GTX 670 is still a hot card and consumes more power than the 7870 does. It's not a linear curve we are talking about here. GPUs are efficient over a fairly small range, and I'm not sure the clocks needed for 100W fit in that range.
    Also, we have no idea how GK104 reacts to such low clocks.
    Besides, the GPU voltage needs to be cut a lot as well. We don't know what the minimum voltage would be for it to run, or how it will scale with clocks.
    It's possible the GPU doesn't like too-low voltages and crashes even at low clocks.
    I'm still very doubtful that an underclocked GTX 670 will be the new GTX 680M. To me it seems too big a task to accomplish.

    I think they will shut down some more cores, and perhaps then it will be able to fit inside the 100W envelope.
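[Editor's note: the clock/voltage scaling being debated here can be sketched with the usual dynamic-power rule, P ≈ k·f·V², ignoring static leakage. The 170W TDP is from the thread; the 915MHz stock clock and all voltage figures below are illustrative assumptions, not Nvidia specs.]

```python
# Back-of-envelope dynamic power scaling: linear in frequency, quadratic
# in voltage. A sketch of the estimate, not a real power model (leakage
# and board power are ignored).

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale dynamic power with frequency (linear) and voltage (quadratic)."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Desktop GTX 670: 170 W TDP at an assumed 915 MHz / 1.175 V stock.
# Would a cut to ~720 MHz at ~1.0 V land near the ~100 W mobile budget?
p_mobile = scaled_power(170, 915, 1.175, 720, 1.0)
print(f"Estimated mobile power: {p_mobile:.0f} W")
```

Under these assumed numbers the cut lands just under 100W, which is the crux of Supranium's doubt: whether GK104 actually runs stably at voltages that low.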
     
  31. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Different coolers & Different benchmarks. I believe FurMark is more demanding than OCCT (could be wrong).


    The only way for a 680M to receive 1344 CUDA cores is if the card were to run at 100W+, though I could be wrong. It would also be the greatest engineering feat in the history of engineering. It suggests that nVIDIA would somehow reprogram their power management so that the card never exceeds MXM 3.0(x) limitations, and perhaps underclock the card in extreme circumstances, as seen in OCCT?

    I'm aware.



    I'm glad someone understands.
     
  32. zham61

    zham61 Notebook Enthusiast

    Reputations:
    1
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    I couldn't wait anymore on the 680M. AAFES had a deal yesterday for 25% off Alienware laptops. I got an M17x R4 with:

    Alienware M17X R4
    Alienware M17X R4 with Soft Touch
    Operating System
    Genuine Windows® 7 Professional, 64bit
    Processor
    3rd Generation Intel® Core™ i7-3820QM (8MB Cache, up to 3.7GHz w/ Turbo Boost 2.0)
    Memory
    32GB Dual Channel DDR3 at 1600MHz (4 DIMMs)
    Keyboard
    English Keyboard
    Display Panel
    17.3-inch WideFHD 1920 x 1080 60Hz WLED
    Video Card
    2GB GDDR5 AMD Radeon™ HD 7970M
    Hard Drive
    500GB 7,200 RPM Storage + 64GB mSATA Boot Drive
    AlienFX
    Mars Red
    Adobe Acrobat Software
    Adobe Acrobat X Reader
    Hinge Up
    Stealth Black with Soft Touch Finish
    Audio
    Creative Sound Blaster Recon3Di with THX TruStudio Pro Software
    Optical Drive
    Slot-Loading Dual Layer Blu-ray Reader (BD-ROM, DVD+-RW, CD-RW)
    Wireless Networking
    Intel® Advanced-N WiFi Link 6250 a/g/n 2x2 MIMO Technology with WiMax and Bluetooth 4.0
    Adapter
    Alienware M17x 240W A/C Adapter
    Documentation
    Alienware Documentation
    Office Productivity Software
    Microsoft® Office Home and Student 2010
    Shipping Material
    Shipping Material - Black
    Security Software
    No Anti-Virus Software Selected
    Additional Software
    Additional Software
    Primary Battery
    90WHr 9-Cell Primary Battery
    Alien Wallpaper
    Alien Red Glyphs
    Hardware Support Services
    2 Year Basic Plan


    $2,145 OUT THE DOOR!!! I couldn't pass that up. It's going for over $3K now.

    I still can't believe they don't have a Blu-ray burner for this thing.
     
  33. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    wow, that indeed is an amazing deal, good job buddy! :)
     
  34. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    This is what the 680M would likely be:

    CUDA cores: 960 or 1152
    700-900MHz @ 0.9v?
    256-bit GDDR5, 128GB/s
    TDP 100W

    I believe it would still be faster than the 7970M.

    Edit: nm, the 7870 is too close to the 7950.
     
  35. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    HSN21, just give it up. There are way too many AMD-biased people in this forum who will never see the light of day even when it's right in front of their damn face.

    This GPU completely devastates the HD 7870, which the 7970M is based on. If Nvidia decides to make a GTX 680M out of this baby, we should expect a downclocked 7970, since the GTX 670 trades blows with it. :cool:
    BUT as a precaution, we don't know if Nvidia can scale this GPU down to a mobile version as well as AMD did without sacrificing more performance.

    FACTS:
    - GTX 580M is a downclocked GTX 560 Ti.
    - GTX 670 draws 56W less than the GTX 560 Ti with the GPU at 100% load. Maximum is tested with FurMark.
    - GTX 670 draws 7W more than the 7870 (which the 7970M is based on) under heavy gaming, i.e. Metro 2033.
    - GTX 670 draws 4W less on average than the GTX 560 Ti while playing Crysis 2 at 1080p on the Extreme profile.

    As for temperature, I really don't care, because Nvidia will downvolt this GPU into the thermal envelope of notebooks anyway, and that will have *NO* performance hit. Anyhow:

    - GTX 670 runs a measly 5 degrees warmer than the GTX 560 Ti (the 670 playing Metro 2033, the GTX 560 Ti playing Crysis).
    http://images.anandtech.com/graphs/graph4135/35199.png
    http://images.anandtech.com/graphs/graph5818/46461.png
    - TDP of the 670 is 170W, TDP of the GTX 560 Ti is 170W. They should get equally hot on average.

    And Supranium, please, 99% of people in this forum wouldn't give a rat's a** about GPGPU performance.
     
  36. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    They don't reach the same temps because the 670 (desktop) has vapor chamber cooling, but I could be wrong. I'm not sure if the 560 had it. It also has a better HSF, in theory (the heatsink AND the fan).

    Fact: the wattage draw in different situations is proven to be higher than the 7870's.

    And it's not bias, it's fact. I've owned more nVIDIA GPUs than AMD ones as well.

    My eyes are set on the GTX 685 (desktop) this year, so I don't see how there's any bias.

    Also, the 3DM11 scores would be much higher than 6.5-7K if the 680M rev 2.0 had 1344 CUDA cores.
     
  37. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I'm hoping that the 680M is a downclocked 670 and so forth, and I do hope it shreds the 7970M. We have to see how the clocks are going to scale since it's a new arch; let's hope it isn't much of a hit.
     
  38. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Exactly. Some are just bringing laughable arguments, like Nvidia having to release a GPU that runs cooler than the 7970M as if that were the standard, or other unrelated nonsense. You posted the facts, which I brought up here before but they ignored. If their argument were true, the 680M would have to be weaker than the 7970M (since to them perf per watt is everything).
     
  39. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    and perf per watt might not be everything, but the higher the TDP and power consumption are, the lower the clocks and voltage are going to have to be.
     
  40. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Double post.
     
  41. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Again, are you reading another thread? Stop twisting facts to fit your agenda. Read the very same post you quoted: I made it clear that both are using the identical Nvidia reference cooler. In fact the 670 ends up running much cooler, and at load it makes less noise than the 7970 or 7870 at idle.

    http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_670_Direct_Cu_II/27.html

    The 670 is one of the coolest-running GPUs ever in its range; on most cards the fan runs at very low speeds. Just because they appear in benchmarks at 70C doesn't mean much, since these GPUs ARE programmed that way (70C is normal, so the fans don't need to kick in faster to cool it. A GPU with fans running at 80% speed at 78C is not identical in heat to a GPU running at 78C with 30% fan speed, since if you manually boost the second card's fan speed to the level of the first, it will end up much cooler.)

    He doesn't "understand", and I explained why.

    OK, so according to you wattage is everything,
    and since the 7970M is based on the 7870, which does a better job, the 680M, if based on Kepler, should therefore be weaker than the 7970M.

    We got it. We laugh at it because it's been disproven plenty of times, and posted, yet you decided to ignore it. Look at the 560 Ti and 580M: performance per watt is not identical, and neither is the 670's and 680's. You fail to realize that your argument is baseless.
     
  42. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Wattage is the highest factor, yes.

    The 580M MATCHED the CUDA core count of the 560 Ti.

    Are you saying the 680M will match the CUDA core count of the 670?

    You seem to be ignoring that.

    670 stock score is 9K in 3DM11.
    7870 stock score is 6.86K in 3DM11.

    The 680M will likely still be faster, but it won't be AS based on the 670 desktop as the 580M was/is on the 560 Ti.

    And your fan % doesn't help your point, because cooling capacity is based on total wattage. The HSF is not the same as the one on the 560 Ti. The complexity has increased with Kepler, though, and so its thermal & power management somehow shine without increasing fan speed.
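[Editor's note: the stock 3DMark 11 scores quoted above imply a sizeable desktop gap; the ratio works out as follows. Scores are the ones stated in the post.]

```python
# Ratio check on the stock 3DMark 11 scores quoted in the post above.
gtx670_score = 9000   # "9K", per the post
hd7870_score = 6860   # "6.86K", per the post

lead = gtx670_score / hd7870_score - 1
print(f"Desktop GTX 670 leads the HD 7870 by {lead:.0%} at stock in 3DM11")
```

That ~30% desktop gap is the margin Zero989 argues would shrink once both cores are cut down to a mobile power budget.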
     
  43. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Nope!
    The GTX 670 has the exact same BLOWER fan type as the GTX 560 Ti. No vapor chamber there.

    Do you really think the official TDP from Nvidia is a lie? TDP of the GTX 670 is 170W; TDP of the GTX 560 Ti is 170W. End of discussion. It is not much hotter, if anything. And just for fair comparison, the 7870 has a TDP of 175W; it should run equally hot too. But like I said earlier, they will undervolt the desktop GPUs to make them cooler anyway, with no performance hit, so the heat is not so important imo.
    http://wccftech.com/wp-content/uploads/2012/03/02a_800x445-635x353.jpg

    Yes, the GTX 670 draws more power than the 7870, I am not arguing there, but listen to this:
    The GTX 560 Ti, which the 580M is based on, draws 159W at peak, i.e. when gaming in extreme mode. The HD 6870, which the 6990M was based on, draws 128W on average. AMD only had to sacrifice 28W; Nvidia had to sacrifice 59W to fit into 100W. The GTX 580M STILL beats the 6990M although Nvidia had to sacrifice more.

    Now we have the GTX 670, which the 680M would be based on, drawing 152W at peak, and the HD 7870, which the 7970M is based on, drawing 115W. AMD only has to sacrifice 15W this time, but Nvidia has to sacrifice 52W.
    Don't you think we should see some similarities this time as well?

    BUT last time, with the 560 Ti and 6870 competing, there was very little performance difference between them. Now we have the GTX 670 and 7870 competing in notebooks, and there is a HUGE performance difference between them. That is why I think Nvidia will crush the 7970M this time.
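[Editor's note: the "sacrifice" arithmetic in the post above, tallied in a few lines. The draw figures are the ones quoted in the post (a mix of peak and average readings); the 100W budget is the thread's assumed mobile envelope, not an official spec.]

```python
# "Wattage sacrifice" needed to bring each desktop core down to a mobile
# card, using the draw figures quoted in the post above.

MOBILE_BUDGET_W = 100  # rough mobile power envelope assumed in the thread

desktop_draw = {
    "GTX 560 Ti -> GTX 580M": 159,  # peak, per the post
    "HD 6870 -> 6990M": 128,        # average, per the post
    "GTX 670 -> GTX 680M?": 152,    # peak, per the post
    "HD 7870 -> 7970M": 115,        # peak, per the post
}

cuts = {pair: w - MOBILE_BUDGET_W for pair, w in desktop_draw.items()}
for pair, cut in cuts.items():
    print(f"{pair}: shave {cut} W ({cut / desktop_draw[pair]:.0%} of desktop draw)")
```

Note the asymmetry the post leans on: Nvidia has to shave roughly a third of the desktop draw both generations, while AMD shaves far less; the disputed question is how much performance that third costs.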
     
  44. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    How many CUDA cores will the 680M have?

    That's my ONLY question. Blower fans don't mean it's the same HSF. The fan on the 560 Ti is in the centre and exhausts into the case; the 670 exhausts outwards. You guys are really odd. The HSFs are different designs and have different cooling capacities.


     
  45. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    1344 cores, like the 670 model.
    I sure hope so :)
     
  46. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Stop trolling and read the link in my previous post.
    The 670 at load is quieter than the 7870 at idle and you still talk heat? Zero reviews complained about the 670's temperature; the 670 is not struggling to keep the GPU below 80C even at minimum fan speed, unlike AMD cards.

    Even if the 7970M were to run at 10C at load, in theory no one cares, since Nvidia has to compete within a limit of roughly below 90C, not 10C, so again all of your 7970M info is not relevant and pure nonsense.

    You were proven wrong on the watt issue, so you moved to "but the temperature!", and you were proven wrong on that too, so we await your next nonsense argument.
     
  47. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    ......

    Then why does it score 6.5K-7K, unless we're getting a 680M rev 3.0?

    If the 680M has 1344 CUDA cores, I will literally post in this thread apologizing, along with an admission that I'm wrong.

    How am I trolling when you guys think different HSFs correlate equally? :/
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Rumors. We don't know if they are true or not. Looking at the 670's performance now, I think it might tip over 7K :)
     
  49. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    You realize that if it actually does have 1344 CUDA cores, using this technology for a higher-end GK110 would yield a Kepler literally unbeatable by anything AMD could offer. AMD would have to use a dual GPU to beat a single-GPU Kepler. It would also suggest Maxwell would literally annihilate AMD without anyone needing to know anything.
     
  50. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    You are, because you are complaining about the heat of the 670 when it's the coolest GPU ever in its range. All benchmarks will show GPUs in the 70-80C range since they are designed to handle that heat; the difference is how hard the fans have to work to keep it in that range. For the 670 you can basically keep them at minimum speed and it won't overheat; if you did that on a 7970 the card would burn.
     