The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia GTX 680MX?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Tyranids, Oct 23, 2012.

  1. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Haha, I am not even an Nvidia fanboy; I would have gotten, and still plan on getting, the HD 7970M. The reason I got this one was a good deal. Other than that, I wouldn't have forked over the money; it is an expensive GPU.

    But yeah, talking about Apple, what the hell? The new iPad 4 is coming out in like 2 weeks, and they released the iPad 3 only 6 months ago. Seriously.

    Anyways, this GPU might not be restricted by the 100W TDP, since it is appearing first in a desktop PC. They use notebook parts to reduce size, keep a slim form factor, and face fewer restrictions. Who knows :p
     
  2. kzk5609

    kzk5609 Notebook Enthusiast

    Reputations:
    0
    Messages:
    23
    Likes Received:
    0
    Trophy Points:
    5
    13544??? That many cores will only be achievable in 20 more years :p Anyway, I'm curious about the FPS it might get in games. Hope someone gets one to try soon :)
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Well I think you can only get it in the iMac for now.
     
  4. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    It's listed as a laptop GPU on the Nvidia website; it says it will be coming to laptops but only links to 680M laptops atm :/
     
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Well it's a "mobile" GPU, not a desktop card.
     
  6. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I expected another GPU to come out soon and cancelled my Alienware just in time... now I can buy two of these puppies.
     
  7. Defengar

    Defengar Notebook Deity

    Reputations:
    250
    Messages:
    810
    Likes Received:
    40
    Trophy Points:
    41
    The problem with this much graphics power is that you will be bottlenecked in an SLI setup unless you get an Extreme 3940XM processor to power your rig.
     
  8. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yes sir, XM is necessary. :)
     
  9. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    This may be for all-in-one systems only, due to the power consumption; it's used more for the super low profile you can get.

    This is a fully fledged 680 core with normal-voltage GDDR5 (as opposed to the low-voltage chips used on the 680M); I would expect the TDP to be in the 110-125W region.
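Meaker's 110-125W estimate lines up with quick arithmetic. A minimal sketch, assuming dynamic power scales roughly linearly with core count at a fixed clock and voltage; the 100W baseline and core counts are the ballpark figures discussed in this thread, not official specs:

```python
# Back-of-the-envelope TDP scaling: assumes power grows linearly with
# active core count at a fixed clock and voltage (a rough approximation).

def scaled_tdp(base_tdp_w, base_cores, new_cores, base_clock_mhz, new_clock_mhz):
    """Scale a baseline TDP by core count and clock using a linear model."""
    return base_tdp_w * (new_cores / base_cores) * (new_clock_mhz / base_clock_mhz)

# GTX 680M: 1344 cores @ 720 MHz, ~100 W (the MXM 3.0b ballpark from the thread)
# GTX 680MX: 1536 cores at the same clock
est = scaled_tdp(100, 1344, 1536, 720, 720)
print(f"Estimated 680MX core power: {est:.0f} W")  # ~114 W for the cores alone
```

With a few extra watts for the normal-voltage GDDR5, that lands squarely in the 110-125W region.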
     
  10. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    That would be unfortunate, I hope you're wrong.
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Not likely to be available; the power consumption is too high.

    Not at all. The i7-2720QM only hit ~50% CPU utilization in highly GPU-demanding games, and this is only 10-15% more powerful.

    ^^^^^ THIS ^^^^^

    People think this will instantly be an MXM card. My bet is that it will NOT. Not without a die shrink.
     
  12. littleone562

    littleone562 Notebook Deity

    Reputations:
    1,417
    Messages:
    993
    Likes Received:
    59
    Trophy Points:
    66
    It will likely be an MXM card; Apple is not going to create different motherboard revisions to accommodate the 680MX when the other options for the 27-inch iMac are a 660M and 675MX. It could possibly surpass 100W via some enhancements on their proprietary board, but who knows.
     
  13. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    If the current MXM revision only allows a maximum of 100W of power delivered to the device, how is it going to support a GPU that consumes more?
     
  14. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
  15. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    That's such a scam -.-
    They slash the vRAM by half



    Expect the unexpected ;)



    I would, as long as it's in a laptop :p



    Stock? o_o
    Does it mean that the 580M OV burnout fiasco will happen again?
     
  16. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Guess you're not reading what others have shown. It won't be in a laptop, it's not even MXM.
     
  17. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    Yeah, this card doesn't show up in any of my beta BIOSes yet, so if it is coming, then not tomorrow... ;)
     
  18. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Isn't that MXM 3.0A or C or something? You know, the type that AMD/Nvidia put their low/mid-range mobile chips on.
     
  19. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Yeah, both those cards are MXM; the small top one is MXM-A and the larger one is MXM-B. Apple uses spec-faithful MXM slots, but their BIOSes can be quite different.
     
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So Nvidia makes a mobile GPU just for Apple?
    The MX series (670MX and 675MX) is widely available for notebooks, but suddenly the 680MX is not?

    hmmmm
     
  21. mattcheau

    mattcheau Notebook Deity

    Reputations:
    1,041
    Messages:
    1,246
    Likes Received:
    74
    Trophy Points:
    66
    Right? Like I said...

    I guess it still remains to be seen whether the 680MX will be available to non-(i)Mac users, though.

    Sent from my PI39100 using Board Express
     
  22. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Why not? Apple has massively huge sales volume.

    And even if the form factor is the same as MXM, that doesn't mean it can't run at a higher voltage if the slot / card is designed to accommodate it. The 680M is already at its power consumption limit when OC'd, so I don't see how this would work.
     
  23. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Well, that brings me back to my original question:
    The max OC we have seen with the 680M: is that the max because of the hardware (the cores cannot go any higher), or because it gets too hot, or because the cards draw 100W?

    If it's the max because the cores cannot go any higher without creating artifacts on the screen, that means the 1536 cores can go higher than a max-OC'd 680M even with a regular MXM module.

    If it's the max because the cores draw 100W, then maybe Apple wants the best they can get out of 100W, because the iMac has a very high-resolution screen. Enter the 680MX: it performs like a 680M with a huge OC, but those are stock clocks for the 680MX. What bothers me with this theory is that Apple was not satisfied with the 650M performance in the MacBook. The solution?
    They increased the core clock from 735MHz to 900MHz on the 650M.

    Why couldn't they do the same with the 680M?

    Well, that leads us to power requirements over 100W...


    Yep, I know Apple is big, but has Nvidia ever created a brand new GPU just for one company?
    Will this lead to other notebook brands ditching MXM and using a different module that can feed the 680MX with more power to use it properly?

    And do you know that the 680M at max OC is 100W max? Have you measured it?
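Cloudfire's 650M comparison can be put in rough numbers. A sketch under the assumption that power grows at least linearly with clock (an optimistic floor, since voltage usually has to rise with clock too); the 100W baseline is this thread's MXM ballpark, not a measured figure:

```python
# Apple shipped the 650M at 900 MHz instead of the stock 735 MHz.
# Applying the same relative clock bump to a ~100 W 680M:
base_tdp_w = 100           # rough 680M / MXM 3.0b ballpark from the thread
clock_ratio = 900 / 735    # the 650M clock increase Apple shipped
print(f"680M with the same bump: at least {base_tdp_w * clock_ratio:.0f} W")
# Linear-in-clock is a lower bound: dynamic power scales with voltage
# squared, and voltage typically has to rise along with the clock.
```

Even the optimistic linear estimate already clears the 100W mark, which is the crux of the over-100W argument.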
     
  24. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Sure why not? It's not really a new GPU, just a variation on an existing one.

    I doubt it. Apple is known to offer proprietary hardware all the time. Perhaps there is a new MXM standard in the works which will accommodate higher power consumption cards for next gen laptops.

    Ask Meaker. He seemed to have run into a power limiting bottleneck even after using his frankenPSU (spliced tip).

    This is all assumption of course, but I would be shocked to see this offered in a regular MXM 3.0b format.
     
  25. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    LOL, yeah, Meaker. I'm pretty sure his insane OC is bigger than the 680MX stock anyway, since he is more or less up to GTX 670 performance lol :p

    Do you remember the world record overclock with the 680M? They used a regular MXM module and got enough power through it. There is no way a stock 680MX is even close to that.
    The question is whether this is real, though, and not some trickery from Wesley.


    [image: screenshot of the 680M world record overclock result]
     
  26. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    Again, Nvidia says clearly the 680MX is coming to laptops; EVERYTHING else here is speculation.
    When you click on the 680MX it says it's available in notebooks, but it only links to MSI/Alienware/etc. laptops with the 680M atm.

    Alienware already cut the price of the 680M.
     
  27. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Just give it some time, people, jeez. The new iMac is not even available yet, and we got the ES 7970M way before they actually became available. There is only so much to be speculated this early.
     
  28. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
    There is nothing to speculate about.
    It's OFFICIAL: the 680MX is coming to laptops; Nvidia lists laptops under the GPU.
    They also say it has the perfect balance between BATTERY LIFE and performance, and I've never heard of a desktop with a "battery life".

    http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx

    1-Listed under the Notebook GPUs category: GeForce > Hardware > Notebook GPUs > GeForce GTX 680MX > Overview
    2-Battery life mentioned, aka laptops; desktops do not have a "battery life"
    3-Lists a bunch of laptops on another page of the same website under the 680MX, but atm only links to laptops that have the 680M on them, here: "GeForce > Hardware > Notebook GPUs > GeForce GTX 680MX > Buy Online"
    http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx/buy-online

    It's coming to laptops, very soon, within a month; anyone who still argues basically has no idea what they are talking about and is making things up.
     
  29. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Nvidia would not make a GPU specifically for Apple... I'm sure Tim Cook had some agreement with Nvidia to get their fastest GPU before anyone else for the new release of the iMac; it's how Apple operates. Have you seen the stock drop lately? It's nearly 100 points! They need something to bring them back, ha ha. I'm sure it will be available soon for Alienware; be patient!

    If what I said is true, Nvidia made a great decision. There are millions of people all over the world who purchase iMacs, and this will definitely help them.
     
  30. hackness

    hackness Notebook Virtuoso

    Reputations:
    1,237
    Messages:
    2,367
    Likes Received:
    427
    Trophy Points:
    101
    Looks like MSI is queuing after Apple, hopefully Clevo joins the queue too..
     
  31. spybenj

    spybenj Notebook Deity

    Reputations:
    284
    Messages:
    1,004
    Likes Received:
    2
    Trophy Points:
    56
    What makes you say that? (That MSI is next?)
     
  32. flubadoo

    flubadoo Notebook Consultant

    Reputations:
    1
    Messages:
    120
    Likes Received:
    0
    Trophy Points:
    30
    Why would they ever make a GPU for Apple?
    Just take a step back and ask yourself: since when did Apple ever invest in super good GPUs? That's right: never.
    And what games, or ANYTHING, on a Mac would utilize such a powerful GPU?

    And having a mobile GPU in an iMac is even stupider, since it's common knowledge that mobile GPUs are much more expensive than a desktop one with the same performance.
     
  33. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    You are right, although the facts speak for themselves... It's weird, yeah.
     
  34. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Not sure what you're trying to say. The specs show that they will be using a 680MX, so it is going in the iMac. Have you even SEEN an iMac? It's obvious why they use mobile components: it's an LCD with the PC components embedded in it. Why would they go with a desktop GPU PCB that's almost as big as the entire iMac mainboard itself? Mobile components consume less power, generate less heat, and take up less space than their desktop counterparts. That's why.

    There's always Boot Camp for Windows too.
     
  35. hackness

    hackness Notebook Virtuoso

    Reputations:
    1,237
    Messages:
    2,367
    Likes Received:
    427
    Trophy Points:
    101
    I was in the buy-online search engine on the GTX 680MX page; right now the laptops listed there come with a GTX 680"M". Unless NVIDIA is just going to copy the old GTX 680M page into the GTX 680MX page, it could be possible.

    GeForce GTX 680MX | Buy Online | GeForce
     
  36. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Not weird at all. Last year's iMac had a 6970M, which was high-end at the time. It's not a stretch to say that the 680MX will be in the new iMac.
     
  37. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I too think this 680MX is going inside a variety of notebooks, not just Apple's. It's not going to use more than 100W, and MXM should work. Overclocking potential vs the 680M, however, is a different story, and I have no idea how that will turn out.

    That said, I know that Kepler is extremely efficient, and we have all seen how underclocked the 680M really is. Everyone pretty much speculated on a 685M. I personally knew a 1536-core part would come (see the 780M speculation thread), but I thought Nvidia would wait until the 780M before releasing it. I guess they had to completely finish the new MX line they introduced. Can't just have a 670MX and 675MX and no 680MX :p

    It's a bit sad that we don't have any reviews of the 680M from somewhere like TechPowerUp, which hooks GPUs up to one of their advanced machines to see exactly what the GPU draws at certain clocks. That would have helped a lot.
     
  38. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I look at the overclocked GTX 580M and can't help but feel like MXM 3.0's 100W limit doesn't exist.
     
  39. littleone562

    littleone562 Notebook Deity

    Reputations:
    1,417
    Messages:
    993
    Likes Received:
    59
    Trophy Points:
    66
    I agree with that haha...
     
  40. hackness

    hackness Notebook Virtuoso

    Reputations:
    1,237
    Messages:
    2,367
    Likes Received:
    427
    Trophy Points:
    101
    At first I thought the purpose of MX was to replace the Fermi GPUs, now the 680MX... Next = 660MX? Haha..
     
  41. spybenj

    spybenj Notebook Deity

    Reputations:
    284
    Messages:
    1,004
    Likes Received:
    2
    Trophy Points:
    56
    Found this:
    NVIDIA GeForce GTX 680MX - Notebookcheck.net Tech
    Hopefully they're wrong, but notebookcheck is pretty reliable.
     
  42. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
  43. spybenj

    spybenj Notebook Deity

    Reputations:
    284
    Messages:
    1,004
    Likes Received:
    2
    Trophy Points:
    56
    It's a prediction, but they have more experience with mobile GPUs than most people on this forum. I don't see anything wrong with assuming that their prediction has a higher chance of being right than most people's on this forum.
     
  44. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Code:
    GeForce GTX 680M SLI  2688 @ 720MHz
    
    GeForce GTX 680MX     1536 @ 720MHz
    
    GeForce GTX 680M      1344 @ 720MHz
    
    Seeing this makes me excited... :D
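Those core counts translate directly into rough single-precision throughput. A sketch assuming Kepler's 2 FLOPs per CUDA core per clock (one fused multiply-add); the 720 MHz clocks are the figures quoted above, not confirmed specs:

```python
# Peak single-precision throughput = cores x clock x FLOPs-per-clock.
def gflops(cores, clock_mhz, flops_per_clock=2):
    """Peak GFLOPS for a GPU with the given shader count and clock."""
    return cores * clock_mhz * flops_per_clock / 1000

for name, cores in [("680M SLI", 2688), ("680MX", 1536), ("680M", 1344)]:
    print(f"GTX {name}: {gflops(cores, 720):.0f} GFLOPS")
# The 680MX lands around 2.2 TFLOPS, roughly 14% above a single 680M.
```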
     
  45. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That got me thinking about the people around here who sell computers...
    They tell people that a notebook with a quad core has 4 processors. They don't know anything about GPUs, but if they did, they would probably have told a customer that a computer with a 680M contains 1344 graphics cards :D
     
  46. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Cloud, a quad core is like having 4 single-core processors all sharing the same memory controller; also, Nvidia calls them CUDA cores anyway.

    Sometimes you think about things in the oddest of ways.
     
  47. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Such an understatement!

    Anyway, I'm glad that Apple got that 680MX. Now I'm wondering when Nvidia is going to revamp MXM, or whether it's even needed.
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I understand what a quad core is, but you can't say it's 4 CPUs when there doesn't exist any CPU with a die with just one core that you can compare against. Especially with the GPU. That's my point. :)

    EDIT: There does exist a Sandy CPU with 1 core (Celeron). Several of them, actually. Well, I'm not buying that 4 Celerons equal 1 Sandy quad core. "Dude, I just bought a 3610QM. It performs like 4 Celerons. Awesome."
    Really?

    Oh well, that was my off-topic of the day. I like being odd though. :p
     
  49. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I do *hope* that it works with MXM 3.0b because that would mean good upgrade options in a year or so when Maxwell rolls around. But I'm also a pessimist and a realist...
     
  50. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Now, in the CPU world, no, a quad is not always 4 times as fast as a single-core CPU, but it is capable of being so under the right conditions. If I had two boxes that you could not look inside, one holding a single-package quad core and the other a motherboard with 4 single cores in separate sockets, you would have to look VERY carefully at performance in some edge cases to tell them apart.

    With graphics, all work is perfectly parallel, so assuming you are graphics-bottlenecked, a chip with twice as many units (cores, ROPs, memory bandwidth) will perform twice as well.
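The "twice as many units, twice the performance" point is Amdahl's law with a parallel fraction of 1. A quick sketch contrasting the graphics case with a typical CPU workload; the 0.8 parallel fraction is an illustrative assumption, not a measurement:

```python
# Amdahl's law: speedup from n units when a fraction p of the work
# is parallelizable and the rest must run serially.
def speedup(n_units, parallel_fraction):
    p = parallel_fraction
    return 1 / ((1 - p) + p / n_units)

print(speedup(2, 1.0))   # 2.0: perfectly parallel, like graphics work
print(speedup(4, 0.8))   # about 2.5: a quad core on a mostly-parallel CPU task
```

This is why doubling GPU shader units scales almost perfectly while a quad-core CPU rarely quadruples game performance.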
     