Probably because so few laptop buyers choose this junk graphics variant, whatever name it's sold under. It's easier to cheat those customers because they may not have read up on the 180W PSU gate. Few sold = few complaints. Everyone with an Alienware + GTX 980M knows the GTX 980M in the newer AW15/17 is a failure without a 240W PSU.
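Rough power-budget arithmetic shows why (a sketch with assumed nominal TDPs, not official Dell figures):

    # Assumed nominal figures for an AW15/17-class machine with a GTX 980M
    cpu_tdp = 47    # W, typical quad-core mobile i7
    gpu_tdp = 125   # W, commonly cited for the GTX 980M
    system  = 35    # W, assumed: display, drives, RAM, VRM losses

    load = cpu_tdp + gpu_tdp + system   # 207 W combined load
    for psu in (180, 240):
        print(f"{psu} W PSU: {psu - load:+d} W headroom")
    # 180 W PSU: -27 W (deficit -> battery drain or throttling)
    # 240 W PSU: +33 W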
-
woodzstack Alezka Computers, Official Clevo reseller.
Here's the thing: the gaming/laptop enthusiast community has at large wanted bigger and better graphics in laptops. So they had to ditch the 100W limitation, and thus are going to change MXM 3.0b into whatever the next version will be called. With more headroom we could still technically get 75-125W video cards in a new form, but there will at least be more room for them to operate without throttling so badly, hopefully, since they can draw more power.
This is really a change caused in part by this community. Anyone not satisfied really makes you wonder whether Dell and other corporations are wrong in thinking "they do not know what they want."
This is essentially the OPPOSITE of soldered-on garbage. They tried it that way to see if there was a performance gain (efficiency being the better term); however, it resulted in an immediate loss of performance versus the previous generation, so it caused quite a stir and sales were crap! Like any revolution, it takes time to settle in and absorb those changes. It didn't sit well, so here comes the next revolution, until something ties it down.
What we need is a backwards-compatible MXM 3.0b connection capable of 125-175W, and mobile chipsets that can afford that sort of power! Then you must share the CPU and GPU heatsinks, together in one larger heatsink where less ground is wasted in between. This will save room for extra cooling, provided the shared heatsink is roughly a bit larger than the two combined would have been. It will be simpler, more robust, and can share cooling (see the sketch below). Clevo has sort of started to do it, it works amazingly, and they have in fact been able to support desktop CPUs while doing this too!
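A deliberately crude lumped-resistance sketch of why a shared heatsink helps (all numbers assumed for illustration):

    # Model each cooler as one thermal resistance R_th, so delta_T = P * R_th
    ambient, t_max = 30.0, 95.0
    budget = t_max - ambient                 # 65 C of thermal headroom
    r_cpu, r_gpu = 1.0, 0.5                  # C/W, assumed separate heatsinks
    print(budget / r_cpu, budget / r_gpu)    # per-sink ceilings: 65 W and 130 W

    # A shared heatsink pools the fin area (resistances in parallel), so
    # unused CPU headroom becomes available to the GPU:
    r_shared = 1 / (1 / r_cpu + 1 / r_gpu)   # ~0.33 C/W
    cpu_load = 20.0                          # W, light CPU load while gaming
    print(budget / r_shared - cpu_load)      # ~175 W left for the GPU

With separate sinks the GPU is capped at 130 W no matter how idle the CPU is; sharing the metal lets either chip borrow the other's headroom.
-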
If you read recent posts, Dell has said they chose BGA because MXM is not mature enough and hasn't been developed enough to continue to support it. Too expensive blah blah blah BS.
BGA certainly has its place; it allows for thinner/lighter laptops. TBH, I don't see a compelling reason for BGA to even exist, though. If I need a thin, light laptop, I'm not buying a BGA ultrabook; I'm buying a Google Pixel C or Microsoft Surface Book, machines that have enough power for basic office applications, light gaming, and video/web surfing while functioning as a convenient tablet. I personally think Intel ought to scrap BGA altogether and scrap mobile CPUs. They should focus on desktop LGA, the server market, and the mobile (phone/tablet) market. The thin-and-light laptop market is DEAD, nada. I don't see a good reason to buy a MacBook Pro anymore; buy a freaking iPad Pro or MacBook Air.
But I see no advantage or reason to use BGA in desktop-replacement/gaming laptops, unless you are truly vain and need an Aorus or Razer notebook, because having a thin laptop while sitting stationary for hours gaming really matters. Especially when your gaming setup already takes up a large amount of real estate with audio components, peripherals, massive mouse pads, etc. You can tell I think the Aorus and Razer notebooks are idiotic: all that money for something so vain, and thinness shouldn't be a consideration when buying these types of laptops. -
Go on basically ANY other forum. Linus' forums, OCN, /r/pcmr, etc. All the general public cares about is whether it's thin. Nobody cares whether it works properly. As long as it doesn't stutter in games, people really don't care. All the Lenovo people who tell me the laptop is great... I tell them to play a game stressing both CPU and GPU and then tell me how their CPU loses turbo boost. They don't care, because it plays things.
Either people want things too cheap ($1000 or less for essentially i5 + GTX 970 performance) or they want things extremely thin because they can barely lift 2 pounds.
I'm not saying there are no legitimate reasons to want a machine that thin, but the fact remains and will always remain: buy to suit your function. If your main function is "I need it to be lightweight and thin," then you either buy a weaksauce laptop that satisfies that requirement, or you buy something like the GS30 Shadow, which lacks a GPU, and you game at home. If your main function is gaming, then buy something to suit the hardware in the machine. -
I do like thin, if only because it forces companies to optimize the form factor a bit. However, they take it to an extreme, so now they do it just to have the absolute thinnest laptop, like Razer. "I'm as thin as a dime"... "Well, I'm 0.1mm thinner than a dime"... yadda.
Just optimize the form factor. If it has to be 21mm thick instead of 19mm to add adequate cooling, then so be it. Just do it. But don't go 15mm thick just because you can, only to end up with a hot mess. -
The new desktop GTX 980 cards (180W+) require thicker laptops. If laptop brands still want to combine modern thin chassis with high-TDP cards, they will only make a new throttling mess. I think we will see a new split in laptops now. Remember, the AW15 and 17 normally use the same top-of-the-line graphics card from Nvidia; you can't do that anymore with those two models. High-TDP cards for thicker models with better cooling, and low-TDP cards for thinner models. Thank goodness for that.
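Some back-of-the-envelope numbers on why TDP dictates thickness (assumed temperatures, treating the whole cooler as a single thermal resistance):

    # The cooler must satisfy R_th <= (T_junction_max - T_ambient) / P
    t_j_max, t_amb = 93.0, 30.0
    for p in (100, 125, 180):   # W: old MXM cap, 980M-class, desktop-980-class
        print(f"{p:>3} W card needs R_th <= {(t_j_max - t_amb) / p:.2f} C/W")
    # 100 W -> 0.63 C/W, 125 W -> 0.50 C/W, 180 W -> 0.35 C/W

Roughly halving the allowed thermal resistance means something like twice the fin area and airflow, which is exactly what a thin chassis can't provide.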
-
I want to know more about the Asus 3D Vapour cooling system. Is it hype or is it real? If it's real, then maybe we can see future laptops like the Clevo ZM and DM series run cooler and even thinner.
From what I can gather, it would be a heatsink with fluid, but the heat pipes would be fused/part of the heatsink itself. Better distribution and dissipation of heat then.
It is a wonder that it has taken this long for a laptop OEM to implement this.
-
Robbo99999 Notebook Prophet
-
I would like to know also. It seems the heatsink itself has a vapour chamber, but now the heatpipes are part of it: one unit.
-
That would mean a larger amount of liquid and a larger dissipation area for it to condense again. Interesting...
-
Basically a mini all-in-one water cooler. Really just a hybrid of existing laptop cooling and the AIOs in desktops.
-
Are you thinking of the GX700 with its dock, maybe? -
There is a cooling design where you have a big vapor chamber, like the one in the picture, plus heatpipes that transfer the heat to the radiators and a return pipe for the fluid. This one lacks the latter, so in my head there's only one solution: a thicker wick, since the design doesn't need space for the liquid (which usually sits in the heatpipes). This way the wick transfers more fluid, and there's plenty of fluid because it is in the vapor chamber. That's my take on it. Pretty great designs, both the G752 and the GX700; sadly, I think they'll be paired with non-upgradeable components, which is a REAL shame.
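For a sense of why two-phase transport works so well (textbook latent-heat figure and an assumed load, not Asus specs):

    # Each gram of water evaporated at the hot end hauls its latent heat
    # to the condenser end before turning back into liquid.
    h_fg = 2257.0         # J/g, latent heat of vaporization of water (~100 C)
    power = 150.0         # W, assumed combined CPU+GPU heat load
    m_dot = power / h_fg  # g/s of working fluid that must circulate
    print(f"{m_dot * 1000:.0f} mg/s of vapor carries {power:.0f} W")  # ~66 mg/s

The wick's whole job is to pull that condensate back by capillary action, so a thicker wick (as speculated above) raises the flow limit before the chamber dries out.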
-
http://descargitas.com/2015/10/30/analisis-imac-27-pulgadas-finales-2015-i7-6700k-r9-m395x/
http://www.macitynet.it/nuovi-imac-retina-5k-confronto-della-gpu-mac-pro/
Reviews of the Apple R9 M395X (sadly, pictures are scarce). In short, the R9 M395X compares to the GTX 960 / GTX 970M; however, I wonder if this will change once the big new AMD driver comes out. -
-
But there is not some great divide between NV and AMD like there is with Intel and AMD on the CPU side. All it takes is one slip-up from NV, and it's 3-4 years ago again, with AMD as top dog in the graphics space. -
-
I was under the impression that AMD GPUs tend to be far more powerful/capable than Nvidia's in professional software.
Now, if all you are comparing is gaming performance, and efficiency based on that alone, then yes, AMD is definitely lacking, and they haven't released anything new (except rebrands) in mobile, or even on the desktop, sans their Fury line, which actually puts them more or less on par with Maxwell gaming-wise.
However, I was under the impression that a GPU should be evaluated on ALL of its capabilities.
Not just gaming performance.
After all, there are plenty of consumers who use professional software.
I just find it a bit unfair to compare AMD and Nvidia on the gaming side alone and declare which company is better based on that.
GCN as an architecture also seems to age better than Nvidia's architectures do (at least on the desktop end), so AMD seems more focused on the longevity of their products - though if I'm not mistaken, the 7970M experienced a lot of problems and failures, did it not? -
After Fermi, nVidia cards took a performance hit in pro apps and in compute in general. AMD still has the upper hand there, but in pure gaming nVidia is winning. However, AMD's Fiji-based cards are doing well against the big Maxwell GM200 chips. We just need something in the mobile sector, which should happen in 2016.
-
That being said, if AMD's Arctic Islands can crush what we currently have and match Pascal's cards next year while still being called "underperforming," then when DX12 actually does come out and get used a bit, nVidrosoft might find that their "build to suffice" attitude is biting them in the butt =D. -
AMD did mention that Arctic Islands will have twice the performance per watt compared to their Fury line (which is almost neck and neck with Nvidia TDP-wise - a 10W difference in some games); see the quick arithmetic at the end of this post.
I would imagine that a lot of AMD's power draw might come from the hardware that lets them surpass Nvidia in professional software.
Hence the high shader counts, which do not readily translate into gaming performance (but evidently, it's a completely different ballpark in pro software).
As for their drivers... yes, it was pointed out that at DX11 and below, their drivers lack the kind of optimizations Nvidia puts into its own. But at the same time, Nvidia plays dirty by using proprietary GameWorks libraries, while AMD freely released TressFX (which is open source and incidentally BETTER in visual quality and less resource-intensive than Nvidia's equivalent), which Nvidia was then able to optimize for.
Granted, Nvidia has more cash behind it, so they can afford better driver optimization... though after all this time, AMD really has little to no excuse on that end - although the upcoming driver release might change things.
Regarding DX12... it may not be around the corner; however, given the choice between Nvidia and AMD, I'd pick the AMD Fury Nano.
Preferably in laptop form, precisely because it's more DX12-ready, already has HBM, and I usually keep my laptops for extended periods of use (my current Acer is 7 years old, after all).
It's actually sad that AMD isn't pushing the Nano for mobile. It probably can be done, and it would likely give them a needed edge to boost their sales and keep themselves visible in the laptop arena while also matching the desktop 980, if not surpassing it (and it would still offer more than good enough DX11 performance).
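What the claimed 2x perf-per-watt would buy, in napkin math (normalized performance and a Fury X-class 275W TDP assumed for illustration):

    fury_perf, fury_tdp = 1.0, 275.0    # normalized perf, W (assumed Fury X class)
    ppw = fury_perf / fury_tdp
    next_ppw = 2 * ppw                  # AMD's claimed Arctic Islands uplift
    print(f"Fury-class perf at {fury_perf / next_ppw:.0f} W")              # ~138 W
    print(f"{next_ppw * fury_tdp:.1f}x perf in the same {fury_tdp:.0f} W")  # 2.0x

Fury-class performance in a ~140W envelope is exactly what would make a mobile Nano plausible.
-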
AMD has promised so many things so many times... and they haven't delivered since the Athlon 64 on the CPU front. When it comes to video cards, sure, they've traded blows with nVidia... but as @D2 Ultima said, they can't make a driver to save their lives... and they REALLY need to.