I've been looking around, and according to Wikipedia ( http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module ), which I hope is wrong, MXM 3.0b only supports memory buses up to 256-bit. With HBM just around the corner, that could mean MXM 3 is finished, since HBM1 has a 1024-bit bus and HBM2 will be similar. With HBM1 going into AMD's next flagship desktop GPU and most likely its mobile cards too, that would mean no more GPU upgrades, and the limit would be the 980M. Don't get me wrong, the 980M isn't a bad card, but it will definitely be outdated very soon, just like the normal cycle of video cards.
Now, I have very little experience in this area and I hope I'm wrong about this, but it's just something I've put together over the last few days.
-
HBM is still a "draft" technology, and AMD knows how to do good product presentations but not good products. They are very good at marketing, keep that in mind. We still don't have any prototypes or tests, and I think that's because this technology is the same crap as the Bulldozer architecture. AMD is already late showing new GPUs because Maxwell simply destroys the R9 200 series Radeons on power efficiency, so if they had something really good, they would show it.
There are also rumors that the Radeon R9 390X will come in a water-cooled edition only, because HBM puts out so much heat...
PS:
According to the specs, MXM 3.0b is a 100W max power draw slot, but the GTX 880M pulls 122W from it and it still works lol. -
lol sounds like someone has an axe to grind with AMD
HBM is not "draft" technology, otherwise I imagine nVidia wouldn't jump aboard the bandwagon and put HBM2 on Pascal.
AMD co-developed HBM with Hynix, thus they get 1 year of exclusivity rights. HBM also presents significant power savings compared to GDDR5, and one estimate puts it at 60W for a 280X-class GPU. The 390X is going to use an AIO to avoid repeating the 290X leaf blower disaster.
As far as MXM goes -- maybe. One thing to keep in mind is that with 2.5D HBM, the footprint of the actual card itself will be much smaller, since the memory chips are now "stacked", and will sit much closer to the GPU die on an interposer. So if there is a significant physical size shrink due to HBM, that could have an impact on things. -
I didn't read the specs, but isn't the 1024-bit HBM bus an internal bus running directly between the video memory and the GPU, while the 256-bit external MXM bus is what connects the GPU to the CPU and other components, if required? So one shouldn't affect the other?
As already stated, I didn't read the specs and I don't know the details, but I think HBM turns video memory almost into an internal GPU cache (some designs stack memory modules on top of the GPU module, making it literally one physical chip), and that shouldn't affect MXM, which is basically a standard connector to the outside world, but I could be wrong. On the other hand, MXM will have to be updated at some point to accommodate more powerful GPU designs, possibly higher power draw, etc. But then again, a future MXM could be designed to be backwards compatible, like PCIe itself was on the desktop. Or maybe we should worry that MXM doesn't disappear altogether, like the CPU socket on future Intel mobile CPUs. -
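To put rough numbers on that distinction, here's a quick back-of-envelope sketch (the figures are commonly cited HBM1 and PCIe 3.0 numbers, not anything from the MXM spec itself): the 1024-bit-per-stack HBM bus lives entirely between the GPU die and the memory stacks on the interposer, while all the MXM slot has to carry to the rest of the system is the PCIe 3.0 x16 link.

```python
# Rough, assumed figures: HBM1 at 1 Gbps per pin (as on Fiji), PCIe 3.0 at
# 8 GT/s with 128b/130b encoding. The HBM bus never crosses the MXM connector.

def hbm1_bandwidth_gb_s(stacks=4, bus_width_bits=1024, pin_rate_gbps=1.0):
    """Aggregate HBM1 bandwidth in GB/s across all stacks on the interposer."""
    return stacks * bus_width_bits * pin_rate_gbps / 8  # bits -> bytes

def pcie3_x16_bandwidth_gb_s(lanes=16, rate_gt_s=8.0, encoding=128 / 130):
    """PCIe 3.0 x16 bandwidth per direction in GB/s (this is what MXM carries)."""
    return lanes * rate_gt_s * encoding / 8

print(f"HBM1, 4 stacks on the package:  ~{hbm1_bandwidth_gb_s():.0f} GB/s")       # ~512 GB/s
print(f"PCIe 3.0 x16 over the MXM slot: ~{pcie3_x16_bandwidth_gb_s():.2f} GB/s")  # ~15.75 GB/s
```

So even with HBM on board, the traffic that actually crosses the MXM connector is the same x16 link as before; the wide memory bus stays on the package.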
Inevitable? Yes, most definitely. The question that remains is, "when?"
It could be next year with Pascal, or it could be in 2018 when Volta comes out. I'd say there's a very high chance Volta will not be MXM. -
MichaelKnight4Christ Notebook Evangelist
The only way to end this madness is for laptops to have external GPU expandability options. That way people could use MXM all they want until it's no longer supported, and then have external graphics to fall back on for that dreadful day when MXM 3.0b support ends.
-
If you want external GPU options, then fine. But it defeats the entire purpose of having power "on the go", since you'd be carrying around an external monitor and the adapter, and needing THREE plugs (one for the laptop, one for the adapter, one for the monitor). And building a laptop with extremely good internal power and then designing it to attach to external GPUs is all but wasting money. If MXM is killed, the only thing to hope for is a new modular connector type.
-
Or maybe a new revision of the format... Like MXM 3.0c
-
I tend to agree. If they plan to kill it off, for the love of Pete, wait for something worth having to surface first. I cannot relate to the excitement about disposable junk that has its critical components welded to the mobo, or embracing the idea of owning an expensive notebook that has been emasculated. If you are going to use a crippled mobile device and expect to magically transform it into something wicked the moment you anchor it to a desk, then you've got another thing coming... horrible concept in my mind, built entirely around a concept of compromise. Why not just do it right up front? Upgrade an older beast laptop that is built correctly and still runs like a banshee, or build a monster desktop for playing the part of grim reaper. You can use a $299 Android tablet or Chromebook for web browsing, trolling a timeline of meaningless blather on Facepuke and munching on your chum bucket filled with spam. When it breaks, you just toss it into a dumpster and grab another one from Walmart while you are already there to buy more groceries and toiletries.
That makes a lot more sense than tossing the baby out with the bath water. -
Yes indeed. I wouldn't mind an MXM 4.0 or some such thing. Maybe when everyone grabs HBM, as n=1 pointed out, we could get smaller cards too. Hopefully Clevo might make one of those tri-SLI laptops again for you, eh Mr. Fox
While I cannot relate to the excitement about all these TDP-locked, soldered machines floating about, I *DO* understand the principle of an eGPU. In fact, as I said in the other thread about BGA vs PGA, where I mentioned I respected the AW13 more than the Aorus X7 Pro, I would also respect something like a better version of MSI's GS30, which, while thin, doesn't attempt to shove a 980M inside its small chassis and can leave its space dedicated to cooling its CPU, rather than gimping cooling on both CPU and GPU while proclaiming thinness/lightness (like Gigabyte's P34x V3 does... the freaking computer manages to thermally throttle a stock 970M in an air-conditioned room).
The reason I don't exactly respect the GS30 is because its internal layout could have been better. One of its heatpipes runs RIGHT NEXT to an M.2 SSD slot. Like wat. Those things get hot enough already, you want a hot-as-hell heatpipe next to it? That was just really bad design; I couldn't recommend that thing to someone as-is and tell them to put an M.2 drive in that slot. Terrible. -
Honestly, I fear that a smaller form factor will make it even easier to make and justify soldering everything.
-
-
Right, as long as Clevo continues to offer desktop CPU versions as well designed as the P750ZM, we shouldn't have much to worry about. The problem is that they are the *ONLY* OEM to do so. Would be nice to see MSI and/or Alienware step up in this regard. But I've lost a lot of faith in AW considering their latest moves. Someone should be innovative and come up with a BGA socket adapter. That way you can swap your CPU as needed. Still an issue for the GPU though, but as long as MXM or an equivalent survives, there should be options.
-
Well, from certain information whose source I cannot really confirm, Intel is basically planning for desktop-CPU laptops like the ZM series to be the new "high performance notebooks". If MSI would adopt that, it'd be great. Alienware said flat out they do not care for MXM/sockets and are going the eGPU solution route. ASUS... hasn't let people upgrade for years; I really don't know why they're so bloody popular, especially with reviewers.
So yeah, MSI. Let's hope. -
So Intel does want to see more of these?
-
I've thought of them possibly making another version of MXM, and I like that idea. We'll see.
Change is coming, though. That we know to be very likely. -
The physical MXM 3.0 connector is still pretty relevant; it's the size of the board that's not. The connector has, get this - 10A @ 19V (190W), 5 DisplayPorts (4 lanes each, up to 2560x1600), LVDS (2 lanes, up to 1920x1200) and CRT (VGA). It's not like you can use all of those with a single laptop, even if you have a docking station. OK, someone might be pushing it hard there, so ditch the LVDS and you'll have space for 2 more DPs (8 lanes), so you can run a video control room from a laptop, lovely. Still, the single biggest hurdle is the size of the board. Because of it, there's literally no space for extra memory channels. Other than getting close to desktop GPU sizes, I see no reason against keeping the same connector and making the board bigger than the connector. Just like it was with the old MXM 2.1 - Type I, II and III boards were wider than the connector. This might be the only benefit of having soldered components - you have all the space you want, since you are dealing with the whole damn motherboard. Not that anyone seems to take advantage of it, but the option is there.
As for a BGA-to-socket adapter - it can be done, but more likely than not there won't be enough space, as it would add at least 5mm of height.
GPU chips, as well as soldered CPUs, can be changed... it's just a lot more expensive and not home-friendly. You can change a GPU chip from one generation to another, even a few generations later, as long as they keep the same BGA grid and you have access to the required straps (pull-up and pull-down resistors whose job is to tell the board which chip you're running, etc.). But yeah, A LOT MUCH VERY MORE harder indeed, and it's not like we don't have enough incompatibilities as it is, without adding expensive soldering/desoldering into the mix. -
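To put triturbo's connector rating next to the power figures quoted earlier in the thread (the 100W nominal MXM 3.0b budget and the ~122W reported for the GTX 880M), here's a quick arithmetic sketch; the numbers are the ones mentioned in this thread, not taken from the MXM specification document:

```python
# Headroom check using figures quoted in this thread (not the MXM spec):
# 10 A @ 19 V connector rating, 100 W nominal MXM 3.0b budget, ~122 W GTX 880M draw.

connector_limit_w = 10 * 19    # electrical rating of the physical connector -> 190 W
nominal_budget_w = 100         # commonly cited MXM 3.0b design target
gtx_880m_draw_w = 122          # draw reported earlier in the thread

print(f"Connector electrical limit:     {connector_limit_w} W")
print(f"Headroom over the nominal spec: {connector_limit_w - nominal_budget_w} W")
print(f"Headroom over the 880M's draw:  {connector_limit_w - gtx_880m_draw_w} W")
```

In other words, the connector itself has plenty of electrical headroom; as noted above, cooling and board area are the real constraints.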
I really think that replacing your own soldered CPU/GPU is work for professionals; also, doesn't there have to be some kind of "soldering machine" to do this work? I remember that in old notebooks, when one had issues with the motherboard chipset, the only way to re-solder it was using an infrared soldering machine... which is not exactly user-friendly lol
-
Ionising_Radiation Δv = ve*ln(m0/m1)
Oh, all of you guys are talking about MXM when every single reviewer is fapping to the Razer Blade. Which is basically a glorified, overpriced MacBook Pro.
-
Because almost every single reviewer owns a desktop, doesn't bother to care for their laptop, doesn't care about performance control in a laptop, and doesn't care if it overheats on them in Crysis 3 or GTA V, because "well, it's just a laptop".
I wish Mr. Fox had the model of laptop I do, with his 4930MX and 980Ms, and modified it for cooling, so he could run more benches with current-gen GPUs without it ever throttling, and then literally pick it up, put it in a backpack and walk away. ON. CAMERA. -
Ionising_Radiation Δv = ve*ln(m0/m1)
Forget performance control. The Blade has ONE God-forsaken M.2 slot. Even my W230SS has three storage slots, and most newer Clevos have at least four. This is plain BS... And worst of all, even the FHD version is something like $2000. Ain't nobody got the money for that. -
Really strange how the same guys who care about what's inside the big box (down to the very bolt and screw of it) can't be bothered to care about the insides of their notebook.
If you're doing it properly, yes, there is such a machine, and as I said, it's not home-friendly. -
I was actually trying to "not get started" on the Blade. I have so much I could yell about, it's amazing.
It's not even that the blade itself is a terrible idea... but it's MEANT to be a supplementary machine, and is nowhere near designed to be the "end-all gaming laptop" that people seem to love it for.
The discrepancy between how people treat, expect of, and baby their desktops versus their laptops is... appalling. -
Ionising_Radiation Δv = ve*ln(m0/m1)
It's because of the demand for the Blade and enthusiasts who love it without caring for anything else, like cost per performance. I don't remember exactly where, but there was this site that compared several gaming laptops from this year, and the Blade was obviously on top. The P650SE was tucked away in some corner and got a low rating because it doesn't look good enough. What the actual heck? -
And THAT is the whole point for them. "Well I want it to look good and feel good". And PCGamer consistently makes love to ASUS. "Oh we refreshed our best laptop... yeah it's the 17" ASUS again".
It's predictable at this point. And incorrect too, because the P770ZM blows it out of the water in every single aspect. -
Indeed. I see the Razer Blade as a second laptop for an avid gamer. As a companion to a bigger, thicker, full-service, high-end gaming laptop, it makes sense. The Blade performs well enough for a portable gamer, even if it is bested by the beasts of gaming laptops.
-
Right. And that's just fine, if it's doing what it's meant to do.
The problem is that they've started shoving bigger/hotter GPUs into it (it began with midrange GTX x60M GPUs only; now it's GTX x70M) and it doesn't work well enough, and the price they're asking is WAY overboard. And sadly, people STILL KEEP GIVING THEM THE MONEY. If the Razer Blade were marketed to PC gamers who want a machine just to play some games at their parents' house, or to go to a LAN party by taking the bus, or whatever, and were APPROPRIATELY PRICED, I would have no problems whatsoever. I wouldn't even care if it got as much love as it does right now... because it'd be a machine with a purpose that it fulfills. It's why I respect the AW13. It's priced well, it's got good cooling and a feature set that makes sense. It doesn't try to oversell itself. You know?
But that thing is a waste of money more than anything else, and people are loving it. Way too much. -
Razer's products have always been a waste of money: FACT!
-
Some Razer mice are not that bad. But again, "not that bad" is not a high standard.
-
I thought the Deathadder was quite good and I recommended it for quite some time, until I learned that the G402, G502 and Zowie FK1 mice were available in that price range. They're all loads better.
I'm sticking with my G502, but I would like to try the Roccat Kone Optical. I think that's what it's called. -
Ionising_Radiation Δv = ve*ln(m0/m1)
Speaking of mice, can someone recommend a decent wireless mouse that is about the size of the Deathadder/Mamba, i.e. optimised for right-handers? Within the $50 to $100 range is good. I'm not an MMO player, so I don't need tens of buttons on my mouse.
-
What are your performance requirements? If you want to do FPS properly, good wireless mice are basically non-existent.
-
Eh, I wouldn't say that. Good wireless tech has come a very long way. Headsets in particular are fantastic now. As for mice, I don't know of any good wireless ones, but that's because I don't look for them. While wireless headsets are great, wireless mice are not something I really need.
-
The problem isn't the tech, which has been there for many years. It's the market not being big enough for any vendor to care. Vendors simply throw their highest-numbered sensor onto a wireless device that is obviously heavily throttled for power saving, and then Ctrl+V their GAMING GRADE WIRELESS TECHNOLOGY 1MS SUPER FAST RESPONSE bulls*it from last gen. They don't really mind, because players who actually care about performance are usually afraid of wireless, and the group who really wants a good wireless mouse (and can tell the difference) is just too small.
With the new wave of low power consumption gaming sensors we might see something good soon. But so far I've not seen anything on the international market. There was a regionally marketed Chinese product which was reasonable, but still not as aggressive as it could be. -
If the G502 or G402 could be incorporated into a wireless format, that'd work amazingly. Those mice are quite good.
-
Only if Logitech is willing to get aggressive on power management.
The last time Logi used the sensor from the G402, they built the G602, which has decent endurance, but its performance is not necessarily better than some of the office-targeted products. Not exactly what we want here.
The new sensor on the G502 is very interesting. It should have much lower power consumption than previous high-end offerings. If Logi is willing to make a G700(s) replacement where the sensor is properly kept at full performance under load, we might see something that's both performance-wise great and endurance-wise not bad. -
I'm happy with my CM Reaper, lol. Got it on warranty for my old CM Inferno... It's been 4 years now since I bought it and got the replacement... Hell of a deal!
-
This thread has demiced..
Anyway I still have a Logitech MX518. -
Speaking of
This went live today
http://www.pcgamer.com/the-best-wireless-gaming-mouse/ -
Can you enthusiasts stop complaining? They are killing off MXM for BGA. I mean, they already did.
Sent from my 306SH -
But complaining is fun D=
-
Meaker@Sager Company Representative
We have a smaller form factor capable of all the same power delivery and connections. It's called MXM-A.
-
Not according to Wikipedia: http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module#2nd_generation_configurations_.28MXM_3.29
-
It's up to the ODM how it would be designed. As I said, most MXM-B monsters can deliver 190W, but can't handle it *cough*cooling*cough*.
-
Meaker@Sager Company Representative
The slot is the same and can deliver the same amount of power; what the generally recommended power targets for the old standards happen to be does not really matter. -
http://www.anandtech.com/show/9266/amd-hbm-deep-dive/4
As stated here and in other places, the area used by memory will be significantly smaller than in any previous generation because of the stacking. This means that, so long as the form factor supports the needed voltage, you may be able to bring more clocks to the GPU, more shaders, etc., while still having it fit in the laptop. Heat is a concern, but at 16nm you will, theoretically, get the benefit of less heat from the die shrink. My concern is that if the port carrying the data is limited to 256-bit or similar, it may create a bottleneck. As for the size of new desktop cards, they will be about 2/3 the size of current flagship offerings. This should allow something the size of an MXM 3.0b card to be CONSIDERABLY more powerful (so long as the heat from the added performance can be controlled). The article also mentions an IHS on AMD's flagship GPU die to protect the HBM from being "crushed." This may not be as much of an issue in laptops because of lower mounting pressure and the use of more compressible thermal pads, but it could happen. -
We'll complain all we want! If you have a problem, close your ears! I refuse to be fed this second-hand crap... I don't mind BGA GPUs as much as the damn CPUs... The Intel BGA CPUs are useless. However, even on the GPU side, with BGA GPUs the vBIOS is integrated into the system BIOS, which causes problems, so in the end I don't like either of them.
-
BGA GPUs are garbage as much as the CPUs are, even if they don't have as many performance issues. In their self-consuming greed they are cutting off their nose to spite their face. There will always be people that are totally ignorant about what they are spending money on, and those that simply don't care. Those of us that know better and do care won't settle for garbage. BGA is not a replacement for MXM... it is an inferior substitute for quality, performance and flexibility. It is designed to make money, and what better way than to block laptop-sales-killing upgrades. They have made their bed, now I hope they die in it. If all they are going to produce is trash, we don't need them or their trash any more. I hope all of the purveyors of garbage go bankrupt and shutter their operations, or pull their heads out, straighten up and fly right... whichever comes first... que sera, sera.
-
-
MXM needs another revision. MXM 3.0b was a much-needed improvement over MXM 2.1; it is much more like an actual desktop PCIe slot in your laptop. The standardization and cross-generation/manufacturer/brand compatibility is so much better than 2.1, but it's still not where it should be. MXM needs to be as standardized as PCIe, so that all MXM modules and all laptops with an MXM slot are interchangeably compatible, just like desktop PCIe.