Haha, I am not even an Nvidia fanboy; I still have plans to get the HD 7970M. The only reason I got this one was a good deal. Other than that, I wouldn't have forked over the money, it is an expensive GPU.
But yeah, talking about Apple, what the hell? The new iPad 4 is coming out in like 2 weeks, and they just released the iPad 3 six months ago. Seriously.
Anyway, this GPU might not be restricted by the 100W TDP, since it is appearing first in a desktop PC. They use notebook parts to reduce size and keep a slim form factor, with fewer restrictions. Who knows!
-
-
Anyway, I'm curious about the FPS it might get in games. Hope someone gets one to try soon.
-
-
-
-
I expected another GPU would come out soon; I cancelled my Alienware just in time... now I can buy two of these puppies.
-
The problem with this much graphics power is that you will be bottlenecked with an SLI setup unless you get an Extreme 3940XM processor to power your rig.
-
Yes sir, XM is necessary.
-
Meaker@Sager Company Representative
This may be for all-in-one systems only, due to the power consumption, and more for the super low profile you can get.
This is a fully fledged 680 core, with normal-voltage GDDR5 (as opposed to the low-voltage chips used on the 680M). I would expect the TDP to be around the 110-125W region. -
That would be unfortunate, I hope you're wrong.
-
People think this will instantly be an MXM card. My bet is that it will NOT. Not without a die shrink. -
-
If the current MXM revision only allows a maximum of 100W of power delivered to the device, how is it going to support a GPU that consumes more?
-
The iMac uses a completely different board design than laptops, it's not MXM.
iMac Intel 27" EMC 2309 and 2374 Teardown - Page 3 - iFixit (step 22)
This is the GPU daughterboard.
This is an MXM board:
-
They slashed the vRAM by half.
Does it mean that the 580M OV burnout fiasco will happen again? -
Guess you're not reading what others have shown. It won't be in a laptop, it's not even MXM.
-
Yeah, this card doesn't show in any of my beta BIOSes yet, so if it is coming, it's not coming tomorrow...
-
Isn't that MXM 3.0a or something? You know, the type that AMD/Nvidia put their low/mid-range mobile chips on.
-
Meaker@Sager Company Representative
Yeah, both those cards are MXM; the small top one is MXM-A and the larger one is MXM-B. Apple uses faithful MXM slots, but their BIOSes can be quite different.
-
So Nvidia makes a mobile GPU just for Apple?
MX series (670MX and 675MX) is widely available for notebooks but suddenly the 680MX is not?
hmmmm -
I guess it's still yet to be seen if the 680MX will be available to non-iMac users, though.
-
And even if the form factor is the same as MXM it doesn't mean it can't run at higher voltage if the slot / card is designed to accommodate it. The 680m is already at its power consumption limit when OC'd, so I don't see how this would work. -
Well that brings me to my original question:
The max OC we have seen with the 680M: is it maxed because of the hardware, i.e. the cores cannot go any higher, or because it runs too hot, or because it draws 100W?
If it's maxed because the cores cannot go any higher without creating artifacts on the screen, that means the 680MX's 1536 cores can go higher than a max-OC'd 680M even with a regular MXM module.
If it's maxed because the cores draw 100W, then maybe Apple wants the best they can get out of 100W, because the iMac has a very high resolution screen. Enter the 680MX. It performs like a 680M with a huge OC, but those are stock clocks for the 680MX. What bothers me with this theory is that Apple was not satisfied with the 650M performance in the MacBook. Solution?
They increased the core clock from 735MHz to 900MHz on the 650M.
Why couldn't they do the same with the 680M?
Well, that leads us to power requirements over 100W...
Will this lead to other notebook brands ditching MXM and using a different module that can feed it more power, to properly use the 680MX?
And, do you know that the 680M at max OC draws 100W? Have you measured it? -
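As a back-of-the-envelope sketch of the question above (a Python toy, assuming graphics throughput scales linearly with cores × clock, using the 1344/1536-core figures quoted in this thread; memory bandwidth and real-world scaling are ignored):

```python
# What 680M core clock would roughly match a stock 680MX?
# Assumption: throughput ~ shader cores * core clock (a simplification).
cores_680m, clock_680m = 1344, 720    # MHz, figures quoted in this thread
cores_680mx, clock_680mx = 1536, 720

throughput_ratio = (cores_680mx * clock_680mx) / (cores_680m * clock_680m)
equivalent_680m_clock = clock_680m * throughput_ratio

print(f"680MX vs 680M throughput ratio: {throughput_ratio:.3f}")   # ~1.143
print(f"680M clock needed to match: {equivalent_680m_clock:.0f} MHz")  # ~823
```

So under that (crude) assumption, a stock 680MX is worth roughly a 680M pushed to ~823MHz, which is why the "huge OC at stock clocks" framing makes sense.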
This is all assumption of course, but I would be shocked to see this offered in a regular MXM 3.0b format. -
LOL, yeah Meaker. I'm pretty sure his insane OC is bigger than 680MX stock anyway, since he is more or less up to GTX 670 performance, lol.
Do you remember the world record overclock with the 680M? They used a regular MXM module and they got enough power through it. There is no way a stock 680MX is even close to that.
The question is whether this is real, though, and not some trickery from Wesley.
-
Again, Nvidia clearly says the 680MX is coming to laptops; EVERYTHING else here is speculation.
When you click on the 680MX it says it's available in notebooks, but atm it only links to MSI/Alienware/etc. laptops with the 680M.
Alienware already cut the price of the 680M -
Just give it some time, people, jeez. The new iMac is not even available yet, and we got the ES 7970M well before those actually became available. There is only so much to be speculated this early.
-
It's OFFICIAL: the 680MX is coming to laptops. Nvidia lists laptops under the GPU.
They also say it has the perfect balance between BATTERY LIFE and performance; never heard of a desktop with a "battery life".
http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx
1 - Listed under the Notebook GPUs category: GeForce > Hardware > Notebook GPUs > GeForce GTX 680MX > Overview
2 - Battery life is mentioned, aka laptops; desktops do not have a "battery life"
3 - Lists a bunch of laptops on another page of the same website under the 680MX, but atm it only links to laptops that have the 680M, here: "GeForce > Hardware > Notebook GPUs > GeForce GTX 680MX > Buy Online"
http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx/buy-online
It's coming to laptops, very soon, within a month. Anyone who still argues basically has no idea what they're talking about and is making things up. -
Nvidia would not make a GPU specifically for Apple... I'm sure Tim Cook had some agreement with Nvidia to get their fastest GPU before anyone else for the new release of the iMac; it's how Apple operates. Have you seen the over-under on the stock drop lately? It's nearly 100 points! They need something to bring them back, ha ha. I'm sure it will be available soon for Alienware, be patient!
If what I said is true, Nvidia made a great decision. There are millions of people all over the world who purchase iMacs, and this will definitely help them. -
Looks like MSI is queuing after Apple, hopefully Clevo joins the queue too..
-
What makes you say that? (That MSI is next.)
-
Why would they ever make a GPU just for Apple?
Just take a step back and ask yourself: since when did Apple ever invest in super good GPUs? That's right, never.
And what games, or ANYTHING on a Mac, would utilize such a powerful GPU?
And having a mobile GPU in an iMac is even stupider, since it's common knowledge that mobile GPUs are much more expensive than a desktop one with the same performance. -
-
There's always boot camp for Windows too. -
GeForce GTX 680MX | Buy Online | GeForce -
-
I also think this 680MX is going into a variety of notebooks, not just Apple. It's not going to use more than 100W, and MXM should work. Overclocking potential vs the 680M, however, that's a different story, and I have no idea how that will turn out.
That said, I know that Kepler is extremely efficient, and we have all seen how underclocked the 680M really is. Everyone pretty much speculated on a 685M. I personally knew a 1536-core part would come (see the 780M speculation thread), but I thought Nvidia would wait until the 780M before releasing it. I guess they just had to finish the new MX line they introduced. Can't have a 670MX and 675MX and no 680MX.
It's a bit sad that we don't have any reviews of the 680M from sites like TechPowerUp, which hook GPUs up to one of their advanced machines to see exactly what the GPU draws at certain clocks. That would have helped a lot. -
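Short of measuring, you can at least ballpark how power moves with clocks. A toy Python sketch using the usual CMOS approximation (dynamic power ∝ frequency × voltage²); the 100W baseline, the 823MHz target, and the voltages here are illustrative guesses, not measured values:

```python
# Rough dynamic-power estimate: P ~ f * V^2 (CMOS approximation).
# All numbers below are illustrative, not measurements.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale a baseline power figure to a new clock and voltage."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# A 680M-style part at 720MHz/0.90V drawing ~100W, pushed to 823MHz
# at the same voltage:
print(f"{scaled_power(100, 720, 0.90, 823, 0.90):.0f} W")
# ...and the same clock with a small voltage bump to 0.95V:
print(f"{scaled_power(100, 720, 0.90, 823, 0.95):.0f} W")
```

Even this crude model shows why a voltage bump hurts much more than a clock bump, and why a >100W TDP for a full 1536-core part is plausible.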
-
-
-
Found this:
NVIDIA GeForce GTX 680MX - Notebookcheck.net Tech
-
Karamazovmm Overthinking? Always!
-
It's a prediction, but they have more experience with mobile GPUs than most people on this forum. I don't see anything wrong with assuming their prediction has a higher chance of being right than most people's here.
-
Code:
GeForce GTX 680M SLI  [U]2688[/U] @ 720MHz
GeForce GTX 680MX     [U]1536[/U] @ 720MHz
GeForce GTX 680M      [U]1344[/U] @ 720MHz
-
That got me thinking about the people around here that sell computers...
They tell people that a notebook with a quad core has 4 processors. They don't know anything about GPUs, but if they did, they would probably have told a customer that a computer with a 680M contains 1344 graphics cards. -
Meaker@Sager Company Representative
Cloud, a quad core is like having 4 single-core processors all sharing the same memory controller; also, Nvidia calls them CUDA cores anyway.
Sometimes you think about things in the oddest of ways. -
Karamazovmm Overthinking? Always!
Anyway, I'm glad that Apple got the 680MX. Now I'm wondering when Nvidia is going to revamp MXM, or if it's even needed. -
EDIT: There does exist a Sandy CPU with 1 core (Celeron). Several of them, actually. Well, I'm not buying that 4 Celerons equal 1 Sandy quad core. "Dude, I just bought a 3610QM. It performs like 4 Celerons. Awesome."
Really?
Oh well, that was my off-topic of the day. I like being odd, though. -
I do *hope* that it works with MXM 3.0b because that would mean good upgrade options in a year or so when Maxwell rolls around. But I'm also a pessimist and a realist...
-
Meaker@Sager Company Representative
With graphics, all work is perfectly parallel, so assuming you are graphics-bottlenecked, a chip with twice as many units (cores, ROPs, memory bandwidth) will perform twice as well.
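That twice-the-units rule can be sketched with the core counts floating around this thread, all at 720MHz (a toy Python example; the SLI figure assumes idealized 2x scaling, which real SLI rarely hits):

```python
# Relative theoretical throughput under the "perfectly parallel"
# assumption: performance ~ shader cores (clock fixed at 720MHz here).
configs = {
    "GTX 680M":     1344,
    "GTX 680MX":    1536,
    "GTX 680M SLI": 2688,  # idealized: real SLI scaling is below 2x
}

baseline = configs["GTX 680M"]
for name, cores in configs.items():
    print(f"{name:13s} {cores / baseline:.2f}x")
```

So on paper the 680MX sits ~14% above a stock 680M, well short of the idealized SLI figure.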
Nvidia GTX 680MX?
Discussion in 'Gaming (Software and Graphics Cards)' started by Tyranids, Oct 23, 2012.