The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  2. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    hmmm looks like the 670 is gonna be like 15% slower than the 680. subtract another 15% to fit the 100W thermal envelope and u could make a first guess as to the perf. of the 680M, what do u think guys?

    sooo...lets see... desktop 680 minus 30% would be pretty much equal to a desktop 7870 according to this overview, thus approximately 5-10% faster than a 7970M at stock clocks. would make sense if nvidia manages to pimp the 680M as well as their mobile cards in the previous years ^^

    cheers
     
  3. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Who told you that 680M is based on GTX670?
    And your calculation is surely incorrect. The GTX670 has about 14% fewer cores. In addition it has a 10% lower GPU clock and a 20% lower memory clock.
    Roughly estimated, it will be at least 25% weaker than the GTX680. Its TDP is 30% lower. Judging that mid-range cards have a slightly better perf-per-watt ratio, i think GTX680 -25% is quite an accurate prediction for the GTX670.
     
  4. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    no one, still pure speculation at this point ^^ it would just make sense TDP-wise (if above mentioned specs hold firm) and perf-wise as well, since a 680M based on a desktop 660 would be too far behind the 7970M in my book ;) nvidia surely wouldnt allow AMD to stay the mobile gpu king this year, they would do their damnedest to at least be on par with them!

    some further confirmation on the specs and a few pics of the desktop 670: (use google translate if needed)

    http://www.computerbase.de/news/2012-05/bilder-einer-msi-geforce-gtx-670-aufgetaucht/

    cheers
     
  5. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Because that makes perfect sense.
    GTX 580M was based on the GTX 560 Ti, a 170W GPU. They managed to take away 70W to fit the mobile version. The 7970M is based on the 7870, which is a 144W GPU. Same story here as well, except now Nvidia doesn't have to sacrifice as much with Kepler since it's more efficient than Fermi: only shave away 50W instead of 70W, if they don't go all in and make an 80W GPU instead. I guess that depends on how good the GTX 670 is compared to the 7870.

    The GTX 680M will be 256-bit like the GTX 580M; the GTX 660 Ti has a 192-bit bus so it doesn't fit that.
     
  6. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Ok. Let's say it's true and the GTX680M will be based on the GTX670.
    Some math.
    Predictably, the reference GTX670 will be around the same TDP as the AMD HD7870. Or at least, even if the numbers don't say so right now, I'm fairly sure that those cards' power consumption is in the same ballpark.
    We know that HD7870 clocks are 1000/1200 and the HD7970M is 850/1200.
    In 3DM11, the 7970M is about 12% slower than the HD7870.
    Now, judging that Nvidia does as good a job as AMD and gets about the same numbers, then we could predict the GTX680M performance like this:
    GTX680 -25% -12%.
    Taking the base numbers from overclockersclub reviews for the HD7870 and GTX680, I can predict that the GTX680M will score around P6200 on 3DM11.
    That will be a fair amount higher than the HD7970M.

    Hopefully we see some more leaks soon.
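
    A minimal Python sketch of the cascade described above; the desktop GTX 680 baseline of ~P9400 is an assumed placeholder (roughly what contemporary 3DMark11 reviews reported), not a figure quoted in this thread:

    # Back-of-the-envelope 3DMark11 estimate following the reasoning above.
    # ASSUMPTION: gtx680_score is a placeholder, not a value from this thread.
    gtx680_score = 9400              # desktop GTX 680, 3DMark11 Performance (assumed)
    desktop_670_penalty = 0.25       # GTX 670 estimated ~25% behind the GTX 680
    mobile_downclock_penalty = 0.12  # 7970M trails the 7870 by ~12%, used as a proxy

    gtx680m_estimate = gtx680_score * (1 - desktop_670_penalty) * (1 - mobile_downclock_penalty)
    print(f"Estimated GTX 680M score: P{gtx680m_estimate:.0f}")  # -> P6204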
     
  7. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    your observations are correct, but u cannot add the specs like that, id rather take the average of the sum, thus (-14%) + (-10%) + (-20%) = -44% / 3 = approx. -15%. would make more sense to give u an overall view of the card. of course ud have to weight it like: cores > GPU clock > Mem clock but that would involve too many variables at this point
    ^^

    edit: funny, we pretty much arrived at the same conclusion with different assumptions :p
     
  8. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30

    No, you can't calculate it like that.
    Perhaps you can like this:

    14% fewer cores should be at least a -10% loss in performance. This is my rough estimate based on many desktop cards.

    However, the 10% lower core and 20% lower memory clocks' performance loss can be calculated like this: (10+20)/2 = 15%.
    This should be fairly accurate, since -20% memory is roughly half the performance hit of the same drop in GPU frequency.

    That is -25% overall, like I said.

    I wish we knew the exact power consumption of the GTX 660M. It would be nice to do some math in comparison to this theory here.
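
    For concreteness, here is a small Python sketch contrasting the two rough estimation methods from the last few posts, using only the percentages quoted above; it is an illustration of the arithmetic, not a benchmark:

    cores_deficit = 0.14       # GTX 670 has ~14% fewer cores than the GTX 680
    core_clock_deficit = 0.10  # ~10% lower GPU clock
    mem_clock_deficit = 0.20   # ~20% lower memory clock

    # jaybee83's approach: plain average of the three deficits
    plain_average = (cores_deficit + core_clock_deficit + mem_clock_deficit) / 3
    print(f"Plain average: -{plain_average:.0%}")  # ~ -15%

    # Supranium's approach: 14% fewer cores -> at least -10% performance,
    # plus the two clock deficits combined as he states: (10 + 20) / 2 = 15%
    cores_hit = 0.10
    clock_hit = (core_clock_deficit + mem_clock_deficit) / 2
    print(f"Weighted estimate: -{cores_hit + clock_hit:.0%}")  # -25%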
     
  9. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    yeah well, u pretty much took it up where i left off @ different weighting of the gpu components ^^
     
  10. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    you guys are arguing over nothing. you should know that.

    you can't assign a single performance impact percentage based on removal or reduction of particular processing elements. the impact will vary dramatically based on usage case.

    arguing over a few percentage points to try and find "the number" is absolutely foolish.
     
  11. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    lets wait and see :)
     
  12. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    No. Not arguing at all. Just trying to peacefully predict the performance of this upcoming card. That's what the forum is for. :)

    And. It's not foolish to make that kind of prediction. I believe it's fairly accurate. Mark my words. We shall see.
     
  13. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    3DMark11 P6200 for GTX 680M? Do you confirm Supranium? :p
     
  14. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    @masterchef: easy tiger ^^ we're not arguing, just playing around with numbers to get a feeling of where we could place the 680M. of course its all still pure speculation at this point, nobody said otherwise :)

    cheers
     
  15. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    I speculate that it's very close to the truth IF the GTX680M is indeed a downclocked GTX670.

    I have my doubts too.

    1. The GT 650M and GTX 660M are weak in my opinion.
    2. Pitcairn has better performance per watt than Kepler.
     
  16. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Unrelated, but interesting nonetheless
    [IMG]
     
  17. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Seems like a bit of a cherry pick to me :p
     
  18. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    Well, considering that even an overclocked HD7970M seems to consume less power than the 580M, it is very possible for a 680M to be based on the GTX670.

    This could lead to an OC'd version renamed HD7990M, considering how easy it is to overclock. It's the only way to remain competitive vs such a powerful GPU as the GTX670.
     
  19. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    It is a cherry pick because Crysis 2 is the opposite.
     
  20. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    Exactly. Very few games, like Crysis 1 and Metro 2033, favor the Radeons. In general the GTX690/SLI 680 wins most rounds, just as the single-card solutions show.
     
  21. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Yeah i did unintentionally :p since i just reached that part of the whole review :p
     
  22. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    We all know what games "matter":

    BF3
    Crysis + Warhead
    Crysis 2
    Metro 2033
    Skyrim
    The Witcher 2

    1. Where did the bolded come from?

    2. AMD will not use a card with an identical core count for the 7990M, if it is to exist. It doesn't fit the company's m.o.
     
  23. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
  24. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    Yeah I know it doesn't fit well with their M.O., but I think it's a bit hard to fit an HD7950 down to mobile, plus it won't bode well vs the HD7970M: the HD7970M is almost a direct translation of an HD7870, whereas an HD7950 would need many more sacrifices to keep consumption and dissipation in check.

    That is, if they even care about competing with a higher end 680m, but at this point I doubt it.

    Thanks, it's interesting to see that my 150w psu might just be enough for my new GPU.
     
  25. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30

    We all know that the GTX680 came out close to 4 months later. Nvidia had enough time to adjust clocks as much as possible.
    We all know that HD7970 reference cards are extremely underclocked and the GTX680 is not.

    There is no doubt that both are great cards. One is no better than the other. Both have their strengths and weaknesses.

    Keep this in mind in your claims ;)

    To topic.

    Having looked at some GTX 660M power consumption posts, I'm afraid that the GTX 680M will not be as good as hoped. With a 75W TDP the 660M is truly weak.
    Hopefully the 256-bit, more-than-triple-CUDA-count GTX 680M will be way more efficient.
     
  26. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Holy hell, the 690 reached higher mem clocks than the 680, freaking 7GHz! That's a real clock of 1750MHz! Silly.
     
  27. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    He's saying the 7970M equipped system draws 40-50W less than the exact same one with the 580M installed. Really?

    I have to question why anyone would believe that is completely accurate.
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Because he is a user (less chance of bias) who used a power meter of +/- 3% accuracy lol.

    I'll be able to double check for you when I get my card and compare it to the 570M when overclocked if you like.
     
  29. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    I never claimed any to be bad cards at all. And yes, one is "better" than the other at particular tasks. Is that difference significant? I don't think so.

    Also, I am an AMD/ATi Fan, so I prefer their products over nvidia most of the time. That doesn't mean I can't see how both offer great products and I use both vendors, for both notebook and desktop offerings. I tend to favor AMD when their performance is similar.

    And I think the GCN architecture is fantastic. Kepler is great but it didn't have much of an impact on me because it's late, hardly available at the moment, and barely faster on reference boards. At least it brought prices down for the competition haha.

    Back on topic: Yeah, the 660M is quite weak for a 75W TDP. Besides the shaders cut down for the GTX670, were there any other cut-downs? ROPs? Texture units? I just hope they don't end up putting a very high core count with very low core clocks just to match a TDP limit.
     
  30. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    But that vast a difference in power draw would've been noticed versus the 6970M or 6990M long ago.
     
  31. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    Unless the 6970M/6990M also consume quite a lot of power in the same inefficient way?

    Who knows? The 580M and HD6990M weren't exactly the pinnacle of power consumption efficiency.
     
  32. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    @long2905:
    You forgot to post these results :p

    [IMG]
    [IMG]
    [IMG]
     
  33. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    And for everyone's attention, the Dell TDP spec for the GTX 660M IS WRONG. The 660M is actually around a 45-50W GPU and a great design by Nvidia.

    There is a guy on this forum who owns the G75 with the 660M and has benchmarked the hell out of it.
    The power draw running 3DMark Vantage is 86 watts. What is drawing power in his notebook at that time is A) his Ivy Bridge quad-core CPU at 35W (he says it's a BestBuy model, and the only model they have there has the 3610QM, which is a 45W quad, but I can't get that to fit with the first picture?), B) his 660M, and C) RAM, screen, fans etc.

    The power draw during non-GPU-intensive tasks is 36.3W.

    Here are the pictures as proof


     
  34. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    power draw != tdp
     
  35. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    Haha, those Batman and Elder Scrolls runs obviously had bugs. It's not uncommon. There is also a bug with Crysis 2 at 2560x1600 res in DX9 for the Crossfire setup which crashes, but it can run at 5K res no problem.

    For example, in Elder Scrolls we also have this:

    [IMG]

    And well, the power consumption figures are completely true. The CrossFire setup consumes much more power, just as it has better idle consumption as well.
     
  36. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    Correct. Power draw and TDP are NOT the same thing. Kepler SHOULD have a lower power draw than any Radeon counterpart right now.
     
  37. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    And here we go again. :rolleyes: Could either of you two provide any evidence that TDP does not equal power consumption with notebook GPUs?

    Are you saying that the GTX 660M draws 45W but magically takes an additional 30W from thin air and outputs that as heat?

    @Ryzeki: Bugs? You want me to post more results where the 690 beats the 7970 CF? There aren't any bugs. Check around with various reviews and you can see more games where the 690 pushes ahead with just as much juice ;)

    Here is a summary if you don't believe me
    [IMG]
    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53901-nvidia-geforce-gtx-690-review-27.html
     
  38. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    it's in the definition of TDP, there is no need to prove anything.
     
  39. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That was a very vague statement.

    Tell me this: how can a GPU output more heat than it draws from the system? Is it teleported from the electric outlet? :p

    I will comment on this: it's the other way around. Power consumption is HIGHER than the TDP, at least with desktop GPUs. Unless I missed something very vital when I went to school :p
     
  40. omnivor

    omnivor Notebook Consultant

    Reputations:
    0
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    If I understand things correctly, the TDP rating is simply the maximum amount of heat the part is designed to dissipate, not how much it actually generates. Obviously, manufacturers would want to design something with room to spare, so, outside of OC'ing, the TDP will likely always be higher than the actual power consumption / heat generation.
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I always thought the 75W figure was off base.

    However, do remember that when you run Vantage the IGP won't be working at full pelt; it will just be outputting the frames from the buffer, so it is likely at its lowest clock, and the CPU won't be drawing anywhere near 35W.

    Also, even at 45W you don't want to make a comparison to the 7970M....
     
  42. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    You surely seem like an nvidia fan fighting back instead of being rational! And I have been saying all along here that the GTX680/690 are better performers than their Radeon counterparts. So chill, dude. I know very well how both perform. I have read countless reviews of both. I love em both!

    Hahaha! Yes, bugs! You know, such as when an SLI/Crossfire game crashes but not on a single card. Crossfire has some bugs with some games in some specific, reproducible scenarios.

    If you look at the graph, when Crossfire performs exactly the same as a single card, or worse yet... worse than a single card, then CrossFire isn't working. Depending on the drivers and site of the review, those results vary. I showed you a run of Skyrim where all cards perform better and aren't constrained.

    Also, TDP is Thermal Design Power. It's typically a design figure for a "worst case scenario" for how much power the component must dissipate. As for actual power consumption, it will on average be much less than the TDP, and sometimes even more for short bursts of time.

    Regardless of the power consumption of a chip, the physical chip has a temperature limit set by its components, hence a max TDP. For laptops it's much less due to constraints in size.

    I assume the 660M is being touted as 75W merely as a mistake, or because people are stupid and assume a higher TDP automatically means better performance? Also, the 660M might have a more aggressive turbo boost. And finally, depending on the game, the 660M might not even break a sweat.

    Also, you have a bunch of GPUs like the 6970M, 6990M, 480M, 485M, 580M, 670M all rated at 100W TDP, but each consumes different power. Applying a very, uhm... general, not-so-specific logic example: would you assume an overclocked 675M consumes more power than a stock 580M? Both have the same TDP.
     
  43. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That guy who tested the G75VW with the 3610QM and GTX 660M got a power consumption reading from the Kill A Watt of 83.4W.

    The 2820QM, which is a 45W TDP Sandy Bridge, peaks at 56W when all cores are in full use. Read desktop GPU reviews and you will find greater power consumption than the TDP. Here is an example: GTX 580, GPU draw is 280W, TDP of the GTX 580 is 244W. The 6970M was a 75W TDP part, but it drew up to 100W from the AC/DC adapter. Power consumption IS higher than TDP (!!!) because an electric component does not output every watt it uses as HEAT. That would be a very bad design.

    That is my point. And since the G75 AVERAGES 83.4W, and the Ivy Bridge probably draws, let's say, 50W thanks to the 22nm benefit, and, like Meaker says, isn't utilized at full capacity when running Vantage, so maybe 30W, the GPU at full speed draws around 53W. The 75W TDP does not fit in my opinion. It should be lower, maybe around 45W.
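
    A quick Python sketch of that subtraction; only the 83.4W Kill A Watt reading is a measurement, the CPU figure is the rough guess from the post, so the result is a ballpark number, not a spec:

    total_wall_draw = 83.4  # Kill A Watt reading while running Vantage (W)
    cpu_guess = 30.0        # Ivy Bridge quad under partial Vantage load (guess from the post)

    # Whatever is left over is attributed to the GTX 660M (screen, RAM, fans
    # and adapter losses are ignored here, as in the post's own estimate).
    gpu_estimate = total_wall_draw - cpu_guess
    print(f"Implied GTX 660M draw: ~{gpu_estimate:.0f} W")  # ~53 W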

    Oh, that was what you meant. Crossfire problems. Yeah, you are right. I thought you meant there was something wrong in general with the game testing. Sorry, now I understand what you mean. :)
     
  44. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    You are assuming that components always use more power than their TDP all the time, which is wrong. As I stated, power consumption can be higher than TDP, but it is generally less (you know, SpeedStep technology, downclocking and numerous power efficiency features). Especially because TDP in laptops corresponds directly to the amount of heat to cool. It is a design figure for what's supposed to be the thermal dissipation of your cooling system.

    Desktop components are a bit more loose with TDP, and are allowed much more headroom in both cooling and power draw. Unless you use a very small case, you should look at power draw for desktop cards instead.

    As far as I know the 75W TDP was made up by Dell, right? When was it officially stated? It would be weird for the 650M to be 40W or something, and the 660M to magically have a much higher TDP while being the exact same card with almost the same clocks and config.
     
  45. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    What else is a chip giving out apart from heat? Noise? Light?

    No, power = heat, everything else is negligible. Please don't start THAT debate again, it's been sealed shut and needs to stay that way.
     
  46. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yes, but if a GPU/CPU or whatever is drawing 100W max, does it output 100W as heat? The answer is "yes" according to many sites that say TDP = power consumption with mobile GPUs. And since Vantage utilizes the GPU to the max and I assume the power consumption of the GPU is around 53W, so should the TDP be. That is what I meant, Ryzeki. There is no downclocking or SpeedStep involved with Vantage, is there? I understand that in a certain scenario, let's say browsing the internet, the GPU downclocks to preserve battery and reduce heat, and then we will see lower power consumption than TDP. That I totally agree on. :)

    Yes, the 75W TDP from Dell is very strange and I find it odd.

    Whatever. I'm going to find out what the POWER CONSUMPTION of the GTX 560M is and compare it with the 660M to see if they improved it. That's what really matters anyway. Screw TDP. But first a movie :p
     
  47. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    I just laughed out loud at his test. Who the hell tests like this??? I mean, come on!
    The only way to test the max power draw of the GPU is to run a stress test (like LinX) on the CPU, write down the power draw, then start a GPU stress test as well and subtract the CPU-only power draw from the combined reading.
    3DMarks are the last thing you want to measure power draw with. Lol.


    Besides, TDP calculations are not identical between Nvidia and AMD.
    Nvidia cards always go above their rated TDP at peak consumption. AMD cards never do, because AMD cards' TDP is calculated from the peak. AMD PowerTune only allows the card to go up to its rated TDP. Never above.
    Manual overclocking is a different story of course.
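
    A sketch of that measurement procedure as outline Python; the read_wall_meter() helper is hypothetical (in practice you would read a Kill A Watt or similar meter by eye at each step), since the thread names no specific toolchain beyond LinX:

    def read_wall_meter() -> float:
        """Hypothetical helper: return the current wall-power reading in watts."""
        raise NotImplementedError("read your power meter manually at this step")

    def estimate_gpu_power() -> float:
        # 1) Load only the CPU (e.g. with LinX) and note the steady-state draw.
        cpu_only_draw = read_wall_meter()

        # 2) Keep the CPU load running, add a GPU stress test on top,
        #    and note the combined draw.
        combined_draw = read_wall_meter()

        # 3) The difference is (roughly) what the GPU adds at full load.
        return combined_draw - cpu_only_draw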
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    He's not a professional reviewer, duh. It was more of a quick test just to see :)
    Vantage is still pretty demanding since we use it to rank GPUs. Not as precise as your way, but a little interesting nonetheless. We get these debates out of it so it was totally worth it :p
    Now to my movie

    So in other words, Nvidia GPU TDP is generally lower than the power consumption and AMD GPU TDP = power consumption? Is this valid for notebook GPUs as well? If so, then the TDP must be lower than 75W.
     
  49. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I just rep'ed you, Ryzeki, Supranium and Meaker (you will have to wait since I've given away too much today) for participating in this discussion. As a pat on the shoulder for having the energy to debate against this stubborn man. After all, I'm just a clueless moron :D :p
     
  50. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,086
    Trophy Points:
    431
    I think the whole deal between TDP of nvidia and AMD was because one of them... calculated TDP of the complete board+RAM, while the other was without RAM or some nonsense like that. I forgot haha.
     