I've been looking around, and according to Wikipedia (http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module), which I hope is wrong, MXM 3.0b only supports memory buses up to 256-bit. With HBM just around the corner, that could mean MXM 3 is finished, since HBM1 has a 1024-bit bus and HBM2 will be similar. With HBM1 going into AMD's next flagship desktop and, most likely, mobile GPUs, it would mean there are no more GPU upgrades coming and the limit will be the 980M. Don't get me wrong, the 980M isn't a bad card, but it will definitely be outdated very soon, just like in the normal cycle of video cards.
Now, I have very little experience in this area and I hope I'm wrong about this, but it's just something I've pieced together over the last few days.
-
HBM is still a "draft" technology, and AMD knows how to do good product presentations but not good products. They are very good at marketing, keep that in mind. We still don't have any prototypes or tests, and I think that's because this technology is the same kind of crap as the Bulldozer architecture. AMD is already late showing new GPUs because Maxwell simply destroys the R9 200 series Radeons on power, so if they had something really good they would show it.
There are also rumors that the Radeon R9 390X will come in a water-cooled edition only, because HBM produces so much heat...
PS:
According to the specs, MXM 3.0b is a 100W max power draw slot, but the GTX 880M pulls 122W from it and it still works, lol. -
lol sounds like someone has an axe to grind with AMD
HBM is not "draft" technology; otherwise I imagine nVidia wouldn't jump aboard the bandwagon and put HBM2 on Pascal. AMD co-developed HBM with Hynix, so they get one year of exclusivity rights. HBM also presents significant power savings compared to GDDR5; one estimate puts it at 60W for a 280X-class GPU. The 390X is going to use an AIO to avoid repeating the 290X leaf-blower disaster.
As far as MXM goes -- maybe. One thing to keep in mind is that with 2.5D HBM, the footprint of the actual card itself will be much smaller, since the memory chips are now "stacked" and will sit much closer to the GPU die on an interposer. So if there is a significant physical size shrink due to HBM, that could have an impact on things. -
I didn't read the specs, but isn't the 1024-bit HBM bus an internal one, directly between the video memory and the GPU, while the 256-bit external MXM bus is what connects the GPU to the CPU and other components, if required? In that case one shouldn't affect the other?
As already stated, I didn't read the specs and I don't know the details, but I think HBM almost turns video memory into an internal GPU cache (some designs stack the memory modules on top of the GPU die, making it literally one physical chip), and that shouldn't affect MXM, which is basically a standard connector to the outside world, but I could be wrong. On the other hand, MXM will have to be updated at some point to accommodate more powerful GPU designs, possibly higher power draw, etc. Then again, a future MXM could be designed to be backwards compatible, like PCIe itself was on the desktop. Or maybe we should worry that MXM doesn't disappear altogether, like the CPU socket on future Intel mobile CPUs. -
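To put rough numbers on the bus-width point in the two posts above, here is a quick back-of-the-envelope sketch in Python. The per-pin data rates are the commonly quoted ballpark figures for HBM1 and for GDDR5 on a 980M-class card, used as illustrative assumptions rather than spec citations; the takeaway is that the 1024-bit bus lives between the GPU and its memory stacks on the interposer, while the MXM connector only has to carry a PCIe 3.0 x16 link to the host.

```python
# Back-of-the-envelope bandwidth comparison. The per-pin data rates below
# are commonly quoted ballpark figures, used here for illustration only.

def bus_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak bandwidth in GB/s for a memory bus of the given width."""
    return bus_width_bits * data_rate_gbps_per_pin / 8  # bits -> bytes

# HBM1: 1024-bit per stack at an effective 1 Gbps per pin
hbm1_per_stack = bus_bandwidth_gb_s(1024, 1.0)   # 128 GB/s
hbm1_four_stacks = 4 * hbm1_per_stack            # 512 GB/s

# GDDR5 on a 256-bit bus at 7 Gbps per pin (980M-class card)
gddr5_256bit = bus_bandwidth_gb_s(256, 7.0)      # 224 GB/s

# What actually crosses the MXM connector is a PCIe 3.0 x16 link to the
# host, roughly 16 GB/s each way -- the memory bus stays on the module.
pcie3_x16_gb_s = 16.0

print(f"HBM1, one stack       : {hbm1_per_stack:6.1f} GB/s")
print(f"HBM1, four stacks     : {hbm1_four_stacks:6.1f} GB/s")
print(f"GDDR5, 256-bit @ 7Gbps: {gddr5_256bit:6.1f} GB/s")
print(f"PCIe 3.0 x16 to host  : {pcie3_x16_gb_s:6.1f} GB/s")
```

So a wider memory bus by itself doesn't obviously demand a new connector; the "256-bit" figure in the MXM material reads more like a board-routing limit than a limit on the link to the rest of the laptop. -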
Inevitable? Yes, most definitely. The question that remains is, "when?"
It could be next year with Pascal, or it could be in 2018 when Volta comes out. I'd say there's a very high chance Volta will not be MXM. -
MichaelKnight4Christ Notebook Evangelist
The only way to end this madness is for laptops to have external GPU expandability options. That way people could use MXM all they want until it's no longer supported, and then have external graphics to fall back on when that dreadful day comes and MXM 3.0b support ends.
-
-
Or maybe a new revision of the format... Like MXM 3.0c
-
-
While I cannot relate to the excitement about all these TDP-locked, soldered machines floating about, I *DO* understand the principle of an eGPU. In fact, as I said in the other thread about BGA vs PGA, where I mentioned I respect the AW13 more than the Aorus X7 Pro, I would also respect something like a better version of MSI's GS30 which, while thin, doesn't attempt to shove a 980M inside its small chassis and can leave its space dedicated to cooling its CPU, rather than gimping cooling on both CPU and GPU while proclaiming thinness/lightness (like Gigabyte's P34x V3 does... the freaking computer manages to thermally throttle a stock 970M in an air-conditioned room).
The reason I don't exactly respect the GS30 is that its internal layout could have been better. One of its heatpipes runs RIGHT NEXT to an M.2 SSD slot. Like, wat. Those things get hot enough already; you want a hot-as-hell heatpipe next to one? That was just really bad design; I couldn't recommend that thing to someone as-is and tell them to put an M.2 drive in that slot. Terrible. -
Honestly, I fear that with a smaller form factor it'll be even easier to make and justify soldering everything.
-
-
-
So yeah, MSI. Let's hope. -
-
I've thought of them possibly making another version of MXM, and I like that idea. We'll see.
Change is coming, though. That we know to be very likely. -
The physical MXM 3.0 connector is pretty relevant; it's the size of the board that's not. The connector has, get this - 10A @ 19V (190W), 5 DisplayPorts (4 lanes each, up to 2560x1600), LVDS (2 lanes, up to 1920x1200) and CRT (VGA). It's not like you can use all of those with a single laptop, even if you have a docking station. OK, someone might be pushing it hard there, so ditch the LVDS and you'll have space for 2 more DPs (8 lanes), so you can run a video control room from a laptop - lovely. Still, the single biggest hurdle is the size of the board. Because of it, there's literally no space for extra memory channels. Other than getting close to desktop GPU sizes, I see no reason against keeping the same connector and making the board bigger than the connector. Just like it was with the old MXM 2.1 - Type I, II and III boards were wider than the connector. This might be the only benefit of having soldered components - you have all the space you want, since you are dealing with the whole damn motherboard. It's not like anyone would take advantage of that, as it seems, but the option is there.
As for a BGA-to-socket adapter - it can be done, but more likely than not there won't be enough space, as it would add at least 5mm of height.
GPU chips, as well as soldered CPUs, can be changed... it's just a lot more expensive and not home-friendly. You can change a GPU chip from one generation to another, even a few generations later, as long as they keep the same BGA grid and you have access to the required straps (pull-up and pull-down resistors whose task is to tell the board which chip you're running, etc.). But yeah, A LOT MUCH VERY MORE harder indeed, and it's not like we don't have enough incompatibilities as it is, to go adding expensive soldering/desoldering into the mix. -
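On the straps mentioned above, here's a purely illustrative Python sketch of the mechanism: a handful of pins tied high or low through pull-up/pull-down resistors are sampled at power-on, and the resulting bits form an ID the board firmware uses to tell which chip is fitted. The strap map and IDs below are hypothetical, invented for the example; real strap maps are chip-specific and generally undocumented.

```python
# Purely illustrative sketch of board "straps": each strap pin is tied high
# (pull-up) or low (pull-down) with a resistor, and the bits sampled at
# power-on form an ID the firmware uses to identify the fitted chip.
# The strap map below is hypothetical -- real maps are chip-specific.

HYPOTHETICAL_STRAP_MAP = {
    0b0001: "GPU revision A, 4 GB GDDR5",
    0b0010: "GPU revision A, 8 GB GDDR5",
    0b0101: "GPU revision B, 4 GB GDDR5",
}

def read_strap_id(strap_pins):
    """Pack sampled strap pin levels (MSB first) into an integer ID."""
    strap_id = 0
    for level in strap_pins:
        strap_id = (strap_id << 1) | (1 if level else 0)
    return strap_id

def identify_board(strap_pins):
    strap_id = read_strap_id(strap_pins)
    return HYPOTHETICAL_STRAP_MAP.get(strap_id, f"unknown config (0b{strap_id:04b})")

# Swapping the chip for a later one only "just works" if the new chip reads
# the straps the same way -- otherwise the resistors must be reworked too.
print(identify_board([0, 0, 0, 1]))   # -> GPU revision A, 4 GB GDDR5
print(identify_board([1, 1, 1, 1]))   # -> unknown config (0b1111)
```
-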
I really think that replacing your own soldered CPU/GPU is work for professionals; also, there has to be some kind of "soldering machine" to do this work, right? I remember that in old notebooks, when one had issues with the motherboard chipset, the only way to re-solder it was with an infrared soldering machine... which is not exactly user-friendly, lol.
-
Ionising_Radiation Δv = ve*ln(m0/m1)
Oh, all of you guys are talking about MXM when every single reviewer is fapping over the Razer Blade. Which is basically a glorified, overpriced MacBook Pro.
-
I wish Mr. Fox had the model of laptop I do, with his 4930MX and 980Ms, and modified it for cooling, so he could run more benches with current-gen GPUs without it ever throttling, and then literally pick it up, put it in a backpack and walk away. ON. CAMERA. -
Ionising_Radiation Δv = ve*ln(m0/m1)
-
Really strange how the same guys who care about what's inside the big box (down to the very bolt and screw of it) couldn't be bothered to care about the insides of their notebook.
-
It's not even that the Blade itself is a terrible idea... but it's MEANT to be a supplementary machine, and is nowhere near designed to be the "end-all gaming laptop" that people seem to love it for.
The discrepancy between how people treat, what they expect of, and how they baby their desktops versus their laptops is... appalling. -
Ionising_Radiation Δv = ve*ln(m0/m1)
-
It's predictable at this point. And incorrect, too, because the P770ZM blows it out of the water in every single aspect. -
Indeed. I see the Razer Blade as a second laptop for an avid gamer. As a companion to a bigger, thicker, full-service high-end gaming laptop, it makes sense. The Blade performs well enough as a portable gaming machine, even if it is bested by the beasts among gaming laptops.
-
The problem is that they've started shoving bigger/hotter GPUs into it (it began with midrange, GTX x60M GPUs only; now it's GTX x70M), it doesn't cool them well enough, and the price they're asking is WAY overboard. And sadly, people STILL KEEP GIVING THEM THE MONEY. If the Razer Blade were marketed to PC gamers who just want a machine to play some games at their parents' house, or to take the bus to a LAN party, or whatever, and were APPROPRIATELY PRICED, I would have no problems whatsoever. I wouldn't even care if it got as much love as it does right now... because it'd be a machine with a purpose that it fulfills. It's why I respect the AW13: it's priced well, it's got good cooling and a feature set that makes sense. It doesn't try to oversell itself. You know?
But that thing is a waste of money more than anything else, and people are loving it. Way too much. -
-
Some Razer mice are not that bad. But again, "not that bad" is not a high standard.
-
I thought the DeathAdder was quite good and I recommended it for quite some time, until I learned the G402, G502 and Zowie FK1 mice were available in that price range. They're all loads better.
I'm sticking with my G502, but I would like to try the Roccat Kone Optical. I think that's what it's called. -
Ionising_Radiation Δv = ve*ln(m0/m1)
Speaking of mice, can someone recommend a decent wireless mouse that is about the size of the Deathadder/Mamba, i.e. optimised for right-handers? Within the $50 to $100 range is good. I'm not an MMO player, so I don't need tens of buttons on my mouse.
-
What are your performance requirements? If you want to do FPS properly, good wireless mice are basically non-existent.
-
-
The problem isn't the tech, which has been there for many years. It's the market not being big enough for any vendor to care. Vendors simply throw the highest-number-tagged sensor onto a wireless device that is obviously heavily throttled for power saving, and then Ctrl+V their GAMING GRADE WIRELESS TECHNOLOGY 1MS SUPER FAST RESPONSE bulls*it from last gen. They don't really mind, because players who actually care about performance are usually afraid of wireless, and the group who really want a good wireless mouse (and can tell the difference) is just too small.
With the new wave of low-power gaming sensors we might see something good soon. But so far I've not seen anything on the international market. There was a regionally marketed Chinese product that was reasonable, but still not as aggressive as it could be. -
If the G502 or G402 could be incorporated into a wireless format, that'd work amazingly. Those mice are quite good.
-
Only if Logitech is willing to get aggressive on power management.
The last time Logi used the sensor from the G402, they built the G602, which has decent endurance, but its performance is not necessarily better than some of the office-targeted products. Not exactly what we want here.
The new sensor on the G502 is very interesting. It should have much lower power consumption than previous high-end offerings. If Logi is willing to make a G700(s) replacement where the sensor is properly locked at full load, we might see something that's both great performance-wise and not bad endurance-wise. -
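For what it's worth, the endurance side of that trade-off is simple arithmetic. Here is a tiny sketch with completely made-up average current figures (real draw depends on the sensor, polling rate, radio and duty cycling), just to show how directly average draw maps to battery life:

```python
# Trivial battery-life arithmetic for a wireless mouse. The current figures
# are made-up placeholders, NOT measurements -- real draw depends on the
# sensor, polling rate, radio and duty cycling.

BATTERY_MAH = 1000  # assumed capacity; varies a lot between mice

def battery_life_hours(battery_mah, avg_current_ma):
    return battery_mah / avg_current_ma

scenarios = {
    "throttled for power saving (assumed ~5 mA average)": 5.0,
    "full frame rate, 1 kHz polling (assumed ~20 mA average)": 20.0,
}

for name, current_ma in scenarios.items():
    hours = battery_life_hours(BATTERY_MAH, current_ma)
    print(f"{name}: ~{hours:.0f} h (~{hours / 24:.1f} days)")
```
-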
I'm happy with my CM Reaper, lol. Got it on warranty for my old CM Inferno... It's been 4 years now since I bought the original and got the replacement... Hell of a deal!
-
This thread has demiced..
Anyway, I still have a Logitech MX518. -
Speaking of which, this went live today:
http://www.pcgamer.com/the-best-wireless-gaming-mouse/ -
Can you enthusiasts stop complaining? They are killing off MXM for BGA. I mean, they already did.
Sent from my 306SH -
-
Meaker@Sager Company Representative
We have a smaller form factor capable of all the same power delivery and connections. It's called MXM-A.
-
-
It's up to the ODM how it would be designed. As I said, most MXM-B monsters can deliver 190W, but can't handle it *cough*cooling*cough*.
-
Meaker@Sager Company Representative
-
http://www.anandtech.com/show/9266/amd-hbm-deep-dive/4
As stated there and elsewhere, the area used by memory will be significantly smaller than in any previous generation because of the stacking. This means that, so long as the form factor supports the needed power delivery, you may be able to bring more clocks to the GPU, more shaders, etc., while still having it fit in the laptop. Heat is a concern, but at 16nm you will, theoretically, get the benefit of less heat from the die shrink. My concern is that if the interface carrying the data is limited to 256-bit or similar, it may create a bottleneck. As for the size of new desktop cards, they will be about 2/3 the size of current flagship offerings. This should allow something the size of an MXM 3.0b module to be CONSIDERABLY more powerful (so long as the heat from the added performance can be controlled). The article also mentions an IHS on AMD's flagship GPU die to protect the HBM from being "crushed." This may not be as much of an issue in laptops because of lower mounting pressure and the use of more compressible thermal pads, but it could happen. -
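On the clocks-versus-heat point, a first-order sanity check can be done with the usual dynamic-power relation, P_dyn ≈ α·C·V²·f. The sketch below plugs in invented scaling factors for a hypothetical node shrink (they are assumptions, not 16nm process figures) to show how a fixed MXM power budget could translate into either higher clocks or extra thermal headroom:

```python
# First-order dynamic power scaling: P_dyn ~ alpha * C * V^2 * f.
# The scaling factors below are invented for illustration; they are not
# figures for any real GPU or for the 16nm process.

def relative_dynamic_power(cap_scale, voltage_scale, freq_scale):
    """Dynamic power relative to the baseline design (baseline = 1.0)."""
    return cap_scale * voltage_scale ** 2 * freq_scale

CAP_SCALE = 0.60   # assume the shrink cuts switched capacitance to 60%
VOLT_SCALE = 0.90  # assume operating voltage drops to 90% of the old value

# Frequency headroom before dynamic power is back at the old budget
# (e.g. the same MXM power limit):
freq_headroom = 1.0 / (CAP_SCALE * VOLT_SCALE ** 2)
print(f"Same-power frequency headroom: ~{freq_headroom:.2f}x")

# Or keep clocks unchanged and bank the savings as thermal headroom:
power_at_same_clocks = relative_dynamic_power(CAP_SCALE, VOLT_SCALE, 1.0)
print(f"Power at unchanged clocks: ~{power_at_same_clocks:.0%} of the old budget")
```
-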
-
BGA GPUs are as much garbage as the CPUs, even if they don't have as many performance issues. In their self-consuming greed they are cutting off their nose to spite their face. There will always be people who are totally ignorant about what they are spending money on, and those who simply don't care. Those of us who know better, and do care, won't settle for garbage. BGA is not a replacement for MXM... it is an inferior substitute for quality, performance and flexibility. It is designed to make money, and what better way to do that than to block the upgrades that kill new laptop sales. They have made their bed, now I hope they die in it. If all they are going to produce is trash, we don't need them or their trash any more. I hope all of the purveyors of garbage either go bankrupt and shutter their operations, or pull their heads out, straighten up and fly right... whichever comes first... que sera, sera.
-
-
MXM needs another revision. MXM 3.0b was a much-needed improvement over MXM 2.1; it is much more like an actual desktop PCIe slot in your laptop. The standardization and cross-generation/manufacturer/brand compatibility is much better than 2.1, but it's still not where it should be. MXM needs to be as standardized as PCIe, so that all MXM modules and all laptops with an MXM slot are interchangeably compatible, just like desktop PCIe.