The larger problem there is not the physical slot but rather the firmware compatibility between different brands, i.e. vBIOS/BIOS/EC...
Sent from my Nexus 5 using Tapatalk
-
Don't forget the extra pin on some cards that makes them compatible only with certain machines. But yes, the rest is exactly what you said. Are they still doing that extra-pin thing?
-
Which is exactly why it's not standardized or regulated enough. vBIOS compatibility should be part of the equation. I'm not sure exactly how it works, but MXM use should be licensed to OEMs under an agreement that they closely follow MXM standards. OEMs shouldn't be able to alter the MXM standard and still call it MXM, or even use the connector or form factor. A standard is a standard for a reason.
Take HP, for example. They use MXM but follow none of the protocols for pin-out. Quite a few of their MXM cards have the vBIOS chip located on the laptop's motherboard (not the actual MXM module) and use pins in the MXM slot to interface it with the card, which makes upgrading nearly impossible, even to other HP MXM modules. -
Actually, that's part of the MXM spec: it's up to the vendor to decide where to put the vBIOS. MXM is meant more for easy replacement than for upgrading.
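To make the compatibility point concrete, here's a minimal Python sketch of the vBIOS-placement issue described above. Every class, field, and vendor behavior here is a hypothetical illustration drawn from the discussion; the actual MXM spec defines no such API.

```python
# Hypothetical sketch of why vBIOS placement breaks cross-vendor MXM swaps.
# All names and behaviors are illustrative assumptions, not spec definitions.
from dataclasses import dataclass

@dataclass
class MxmModule:
    vendor: str
    vbios_on_module: bool  # True if the vBIOS chip lives on the card itself

@dataclass
class Laptop:
    vendor: str
    vbios_on_motherboard: bool  # e.g. some HP designs keep the vBIOS on the board

def likely_compatible(card: MxmModule, host: Laptop) -> bool:
    """A card carrying its own vBIOS can, in principle, drop into a slot
    that doesn't expect the vBIOS elsewhere; a host that keeps the vBIOS
    on its motherboard only works with cards designed for that arrangement
    (in practice, usually the same vendor)."""
    if card.vbios_on_module and not host.vbios_on_motherboard:
        return True
    return card.vendor == host.vendor

print(likely_compatible(MxmModule("Clevo", True), Laptop("Clevo", False)))  # True
print(likely_compatible(MxmModule("Clevo", True), Laptop("HP", True)))      # False
```

This is only a toy model of the upgrade headache the posts describe; real compatibility also hinges on pin-out, power limits, and system BIOS whitelists.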
-
-
King of Interns Simply a laptop enthusiast
Great news! I was hoping I'd be able to give the 680M a worthy replacement
-
Well, I know some people who upgraded from Kepler to Maxwell had serious issues because of BIOS incompatibilities. So, as long as they have an updated BIOS, you should be okay. There's also a good chance the 920XM will bottleneck the Pascal 1080M.*
*The last architecture jump we had was 580M to 680M, and performance doubled. 980M to 1080M may be doubled again. -
I thought it was because UEFI is required for Maxwell.
-
King of Interns Simply a laptop enthusiast
Well, that also depends on the laptop's BIOS. Take the trusty M15x: every GPU since the 260M works in it with at least 90% of full functionality. Pretty good leap from MXM 2.1.
I think the emphasis should therefore also fall on laptop manufacturers to make their system BIOS more forgiving towards upgrades. -
King of Interns Simply a laptop enthusiast
I am waiting to see if DX12 lets me squeeze more from the old 920XM. You are of course right, though! Still, I would rather risk it than buy a soldered piece of rubbish. -
Yeah, lucky them, M15x owners; they've had a pretty good run so far
-
Robbo99999 Notebook Prophet
Yep, the 920XM could start to bottleneck. On your last sentence there: also don't forget that 680M to 980M is double performance as well, so 580M to 1080M could be 8 times faster based on what you're saying. -
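The doubling arithmetic in this exchange can be sketched in a few lines of Python. The 2x factors below are the thread's own rough generational estimates, not benchmark results:

```python
# Back-of-the-envelope scaling from the thread: each architecture jump is
# assumed to roughly double performance. These are forum guesses, not data.
jumps = [
    ("580M -> 680M", 2.0),   # Fermi -> Kepler, observed roughly 2x
    ("680M -> 980M", 2.0),   # Kepler -> Maxwell, observed roughly 2x
    ("980M -> 1080M", 2.0),  # Maxwell -> Pascal, speculative
]

total = 1.0
for name, factor in jumps:
    total *= factor
    print(f"{name}: cumulative {total:.0f}x over a 580M")
# With all three doublings compounding, a 1080M would be ~8x a 580M.
```

Compounding is the whole point here: three independent 2x jumps multiply out to 8x, which is where the "8 times faster" figure above comes from.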
Not if it's clocked high enough
-
I don't count 680M to 980M because they're both 28nm.
But yes, it should be a nice increase in performance. NVIDIA is throwing around 10x, and I think in certain areas it will be 10x+ better than Maxwell. However, for raw performance in games, I doubt we'll see more than a 150% increase; most likely less. If we do, I'll be surprised, and very happy.
-
For fixed-function units it isn't too hard to get 10x; going from not having one at all to having one is more like an infinite increase ~~~
Probably just dreaming for the shaders, though
-
Since I just picked up an upgradeable P770DM-G, I've been looking hard into Pascal and MXM 3.0B. From recent information, I believe there's a good chance of that socket surviving another season. This article: http://wccftech.com/nvidia-pascal-gtx-1080-8gb-1070-launching-summer-debut-april-gtc-2016/ , while primarily about the desktop side of things, makes me believe the first "high-end-ish" cards will still use GDDR5/GDDR5X, with the true high-end cards using HBM2, much like AMD did with the 390X/Fury X. The article also mentions the possibility of new notebook GPUs showing up at Computex this year, so my theory is that if GDDR5 is still used on the notebook side, we may have some MXM 3.0B time left. Hopefully. Otherwise I'll have locked myself out of GPU upgradeability. D:
-
What does the memory technology have to do with MXM 3.0B?
You could connect a Radeon Pro Duo via MXM 3.0B if AMD wanted to.
-
Robbo99999 Notebook Prophet
I suppose he's thinking of the interposer size, but I'm thinking MXM 3.0B doesn't stipulate interposer/X-bracket size?
EDIT: I found this document that describes the standard: http://www.mxm-sig.org/file.cfm?doc=36A88883-DFFA-E3E6-16C301BCF5633572
To me it seems the X-bracket/interposer size is stipulated to a fixed size. I only did a quick skim of the document, so I could be wrong. Anyway, it looks to me like they couldn't put the Fury interposer on MXM 3.0B because it's too large (maybe I should go measure the size of the Fury interposer so I can be sure, but I can't be bothered right now!). -
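As a rough sanity check of the size argument above, here's a hedged Python sketch. The board dimensions are the commonly cited MXM 3.0 Type B figures; the X-bracket mounting zone and the Fiji package size are placeholder estimates I've made up for illustration, not values read from the spec:

```python
# Rough fit check: the MXM 3.0B board area itself is probably not the
# constraint for a big HBM package; a fixed X-bracket/mounting zone would be.
# All numbers are estimates/placeholders; verify against the actual spec.
MXM_30B_BOARD_MM = (82, 105)  # commonly cited MXM 3.0 Type B board size (W x L)
XBRACKET_ZONE_MM = (40, 40)   # hypothetical fixed cooler-mount zone (made up)
FIJI_PACKAGE_MM = (55, 55)    # rough guess at a Fiji package with HBM stacks

def fits(package, area):
    """True if the package fits within the given area on both axes."""
    return all(p <= a for p, a in zip(package, area))

print(fits(FIJI_PACKAGE_MM, MXM_30B_BOARD_MM))  # True: board area is fine
print(fits(FIJI_PACKAGE_MM, XBRACKET_ZONE_MM))  # False: a fixed bracket zone isn't
```

If the spec really does fix the X-bracket/interposer footprint, the second check is the one that matters, which would support the "too large" reading above.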
I think GDDR5X would be NVIDIA's next potent mobile option on MXM 3.0B. As for HBM, I don't have any hint that it can make it to MXM unless NVIDIA really wants to go full bore, like they did with the mobile GTX 980...
So we might see HBM in action, and if they really want to max out performance, a new MXM standard would be sweet; the MXM tech is NVIDIA's own, of course, with HBM2.
Even though most notebook GPUs are BGA, Clevo/MSI/NVIDIA went miles ahead with the GTX 980 chipset, and MSI claims upgradeability... so as for MXM dying, I think it will never happen
-
-
Everything eventually changes; it's just the when that we don't know.
-
-
lol
-
-
I was thinking that if the same memory tech is being used, the same GPU layout/socket would be used. Buuut that doesn't make a lot of sense, since there were MXM 3.0B cards using GDDR3 back in ye olde times. Oops. Oh maaan, this standard has been around for 8 years.
I will become the driving market force to ensure MXM never dies. Viva la customization!
-
Let me correct that for you
Viva la MXM and Death to BGA!!!!
End of mxm 3.0b imminent?
Discussion in 'Gaming (Software and Graphics Cards)' started by Dan2015, May 14, 2015.