As long as there's a continuous trend of people not caring that their thin BGA laptops are soldered, they're going to pull stunts like this until people stand up and say no. No doubt the soldered, thin-laptop market holds the majority of buyers; this is just another way for them to make money off their old tech before giving us anything new in MXM form.
Simply refusing to buy anything other than machines that offer socketed parts is the only way to keep socketed machines supported, IMHO, since these corporations don't seem to listen.
-
Miguel Pereira Notebook Consultant
I don't want to play devil's advocate, but these companies don't care about logic; they care about numbers.
Most likely there is a bigger market for thin BGA than for thick LGA laptops. That's a fact: consumers choose with their wallets, and most have chosen the thin, lower-performance direction.
I actually understand this, even if I'd prefer to get all the performance I should get.
This Max-Q is just a marketing stunt to keep that momentum going.
Sent from my MHA-L29 via Tapatalk -
This is also another way for Nvidia to *control* performance, so that if they run out of architecture improvements like Intel did, they can start controlling clock speeds, throttling, etc. etc. I can see it all happening again. -
So this new MSI GS73 1070 comes with a 180W PSU, instead of the 230W it should have:
GS73VR 7RG STEALTH PRO
https://www.msi.com/Laptop/GS73VR-7RG-Stealth-Pro.html#hero-specification
The worst part is, nowhere in the Overview or Specs does it say anything about Max-Q - which it must be, to only need a 180W PSU...
"Most powerful ultra slim 17.3'' gaming notebook with GeForce® GTX 1070"
"Graphics - GeForce® GTX 1070 with 8GB GDDR5"
It seems MSI doesn't think the name "Max-Q" is such a good selling point. -
Thousandmagister Notebook Consultant
I'm waiting for Volta - a GTX 1150 Ti maybe (the xx50 series doesn't require any power connector).
I thought I would buy a new Clevo laptop next year, but I changed my mind after reading this thread. -
Thousandmagister Notebook Consultant
I already have 2 gaming desktops; do I need another one?
I prefer a laptop over a desktop any day, as I can bring it anywhere I want (Japan to the US, back and forth).
Desktops are boring; I leave both of mine in my storehouse. Why would I want a desktop when I can do exactly the same things on a laptop? CPU overclocking is also possible thanks to Intel XTU. -
As I get older, I find moving from a desktop to a laptop as my daily driver more appealing.
And I'm a desktop enthusiast who has owned cards LN2-benched (and world-record-set) by Kingpin himself, and I interviewed top OC'ers back when it was still cool. I've spent a lot of money in the 'scene', but as I age I just find that laptops suit my use cases more. -
Miguel Pereira Notebook Consultant
There are thousands of 920M "gaming-ready" discrete-GPU laptops sold every day, everywhere. You could do a lot worse than a Max-Q laptop...
This is for the average consumer who doesn't do proper research and is trigger-happy.
They won't stop existing just because we whine.
Sent from my MHA-L29 via Tapatalk -
Gotta get that fat UPS, or just go with this altogether: http://www.ssiportable.com/products/portable-solutions/spark-s24t/ -
I think the BGA hate club talks too much among themselves. Max-Q serves customers with money who want the most powerful small laptop well. That's not stupid.
-
Slim Max-Q-like laptops would be a good idea if they were honestly represented as being built with power-reduced 1070's / 1060's.
They aren't given an honest representation; that's the problem. Max-Q is misleading to unassuming laptop buyers who hear "1080 in a thin laptop" and go nuts over it, paying way more than they need to for 1070-level performance.
Too bad that the new thin 1070 laptops are now shipping with 180W PSUs instead of 230W PSUs and can't reach full performance - even the ones that aren't Max-Q builds.
It's not BGA vs LGA for me, it's lies vs truth - and I, along with a lot of the other people discussing this here, will end up helping all the disillusioned new Max-Q owners trying to figure out why their games don't run at the same speed as on other 1080 laptops - because that's their expectation.
The Razer 1080 owners are already mid-realization that they've been short-changed: a 250W PSU to feed a 1080 and a 7820HK means there isn't enough power to go around to get full performance from either the GPU or the CPU, let alone both at the same time.
The good news is I think it will end up backfiring on Nvidia this time. Razer buyers are like Macintosh buyers: overly enamored with thin, shiny things and not paying attention to performance vs value.
I think Nvidia saw that sucker market and wanted a piece of the action, but this time it's all the slim-laptop makers and a wider market of much more savvy buyers.
It will be interesting to see how it works out. -
Guys! Remember the comment I made about there being more speculation here than in a Flat Earth Society meeting? First wait for a benchmark or some sort of proper test before criticising and roasting the GPU. Sure, from what we see the performance MIGHT be very lacking, but we don't know for sure. The only info we actually have is a vague score from a vague test done on an unreliable test device, published by an unreliable source squared (17,000 in Fire Strike, done on a Predator Triton, given to a website by Asus).
Also, any business's model is making profits, so if Max-Q is this lacking, then even a dumb customer will realise it when he/she sees it. Unbiased review sites will see this flaw as well, leading to lower sales, which in turn will convince Nvidia to get their **** straight - and from the looks of it, this might be the case. -
These are sales-motivating descriptions designed to be enticing, not accurate in the form the buyer expects - invoking the "Yes, I want it" response with different reasons / data than the buyer's high-performance expectations.
The few Max-Q slim laptops with published specifications list PSUs smaller than required for full performance, all about one GPU tier down in output wattage - indicating they are closer to the next tier down in GPU performance than anywhere near what their model number suggests.
180W for the Max-Q 1070 instead of 230W, and 250W for the 1080 instead of 330W.
That's enough information to estimate the possible performance envelope; no magic is going to conjure that much missing wattage out of battery boost assist. It's just not going to deliver the performance users are excited about.
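Here's a rough back-of-the-envelope sketch of that estimate. The CPU package power and system overhead figures below are assumptions for illustration, not published specs:

```python
# Rough PSU power-budget estimate for the machines discussed above.
# CPU package power and system overhead are assumed ballpark figures,
# not manufacturer specs.

def gpu_budget(psu_watts, cpu_watts=45, overhead_watts=35):
    """Watts left for the GPU after the CPU and the rest of the system
    (display, drives, fans, VRM losses) take their share."""
    return psu_watts - cpu_watts - overhead_watts

configs = {
    "Max-Q 1070 (180W PSU)": 180,
    "Full 1070 (230W PSU)": 230,
    "Max-Q 1080 (250W PSU)": 250,
    "Full 1080 (330W PSU)": 330,
}

for name, psu in configs.items():
    print(f"{name}: ~{gpu_budget(psu)}W left for the GPU")
```

Whatever the exact overheads, a 50W-80W smaller PSU has to come out of the GPU's share somewhere.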
And the pricing gouges at the same level as the full 1080 / 1070 models - no reduction in price to match the reduction in performance.
It's just looking bad all the way around. -
Even the benchmarks I have access to are from samples, not production-ready systems.
-
Max-Q is not for me, but I am sure many customers will be happy with it. -
If there was "new magic" from Nvidia for 1080 / 1070 / 1060, it would be in *ALL* the laptops, even the large DTR models - as even those would benefit from reduction in thermals.
Since this isn't a global GPU based improvement - no new process or die - it's all just power and performance reduction smoke and mirrors to sell slim laptops with overpriced underperforming wannabe 1080's and 1070's, and sell them at the same high premium price as the full performance laptops.
The magic is in the misdirection being performed to open your wallet and extract your money.CedricFP, ThePerfectStorm, TBoneSan and 7 others like this. -
-
Or shall we say down the toilet?
-
What would you suggest to a rich guy who wants to game on a laptop no bigger than a MacBook Pro? -
It's just not possible to get a high-performance GPU / CPU into a MacBook Pro-sized sack.
Apple can't do it, nobody can, it's just physically not possible.
Apple used to be a brand I frequented, for like 35 years, until they sold their performance soul for thin, pretty poser boxes.
My last foray was an attempted purchase of a 17" MacBook Pro, top model - best CPU + RAM. I brought it home, built the system up with all my development and management tools, put it to work, and it just ran its little fans at full speed constantly, thermal throttling more than I'd ever seen. I just couldn't use it.
There just isn't enough physical space for the heavy-duty cooling system required for the close-to-100% CPU and GPU load of the work I do. Desktop-replacement laptops are it, so I went with one. Too heavy and noisy. I set up servers, dropped down to a top-of-the-line 18.4" BGA laptop, and I've never been happier with a laptop.
It was huge, heavy, required a huge PSU, but it was quiet under load - and even at the extreme range when running jobs locally it was liveable.
You aren't going to get top performance in an MBP-profile package, or a Razer BP, or a new Max-Q laptop; they're all going to be far busier throttling themselves to stay alive than spending time doing your work.
Get a nice 17.3" 1080 / 1070 MSI GT75VR, or Acer Predator, or Asus G701VI 1080 or Asus G752 1070, and any of those will give you "compact" high performance with adequate cooling, power, and OC headroom (undervolt to drop CPU temps on all).
Those are all BGA-based laptops, and they can OC a 7820HK to 4.2-4.5GHz depending on the silicon lottery (luck of the draw).
Or, if you want the highest performance, up to 1080 SLI + a desktop CPU - get a desktop.
The high-end LGA laptops are, for the most part, extremely *not* ready out of the box - they take a lot of tuning, BIOS fiddling, and waiting for Prema releases for new models - 6+ months before you can get rid of the Clevo BIOS gotchas, mostly power throttling. They're very expensive too, as most people build them up, but base models are competitive with the top BGA models.
What I recommend is to get 2 laptops. This is what I have done for many years.
One is a slim pretty girl you can take out to work and to social gatherings of non-tech people, with enough battery life to be useful on long plane trips and long days in meetings away from power. Basically the dream laptop everyone wants, but wishes also did heavy-duty gaming.
You could stop here and use something like LiquidSky or other remote gaming services - using your slim laptop as a front end for their heavy duty back end servers, but it's not viable for all games - it's getting better but not quite there yet.
And get a killer gaming laptop that you can pull out when all the non-tech people have gone home and plug into AC - all gaming laptops with Nvidia (/ AMD) discrete GPUs need to run on AC for any useful gaming performance.
Then you can have everything you want, but you're not gonna get it in a single slim laptop.
You can't pound 40 lbs of thermal load into a 5 lb sack. -
No one should support Nvidia's new madness with money. Same for the ODMs. If I were that rich guy, I would rather engage more in charity than waste my money on fraud/scams. This should apply to both rich and poor!! At the same time you'd all be helping keep our dear Earth from overflowing with... yeah, you know.
-
i.e. the P950, GS63 and Razer Blade - these are all machines that easily handle a 1060 but not a 1070. From what I can gather, instead of overclocking a 1060 and using disproportionately more power as you run into its limit, it's more effective to take a 1070 and downclock it instead.
I'd be particularly interested to see what voltages the Max-Q chips run at. -
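A rough sketch of why that works: dynamic power scales roughly with core count × V² × frequency, so the voltage bump an overclock needs costs more power than running a wider chip slowly. All the clocks and voltages below are illustrative assumptions, not measured 1060/1070 or Max-Q figures:

```python
# Illustrative dynamic-power comparison: P ~ cores * V^2 * f.
# The voltages and clocks are made-up ballpark numbers, not measured specs.

def rel_power(cores, volts, mhz):
    """Relative dynamic power, proportional to cores * V^2 * frequency."""
    return cores * volts**2 * mhz

def rel_perf(cores, mhz):
    """Crude relative throughput: cores * frequency."""
    return cores * mhz

gpus = {
    "1060 overclocked": dict(cores=1280, volts=1.05, mhz=1900),  # pushed up its V/f curve
    "1070 downclocked": dict(cores=2048, volts=0.80, mhz=1300),  # undervolted, run slow
}

for name, g in gpus.items():
    perf, power = rel_perf(g["cores"], g["mhz"]), rel_power(**g)
    print(f"{name}: perf {perf:.2e}, power {power:.2e}, perf/W {perf / power:.2f}")
```

With these made-up numbers the downclocked 1070 delivers slightly more throughput at roughly two-thirds of the power, which is presumably the whole pitch.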
Ionising_Radiation Δv = ve*ln(m0/m1)
We've agreed on many things, but this Max-Q business just reeks to me of nVidia wanting to get rid of their lousy bins and make a quick buck off them. The Max-Q webpage is so full of marketing smoke, with hardly any real stats or specs (we haven't even got official clock speeds, TDP, etc.), that it truly makes me question the value of it. There are already hints that undervolting Pascal too much may disable a few cores under load. People may end up paying more for less. We're already in a niche, where notebooks cost around $300 – $500 more than their desktop counterparts due to miniaturisation. Do we need yet another worthless premium to pay, just because we're pursuing thin and light?
I believe @Prema has weighed in on the matter, and I'm fairly sure he knows what he's talking about. -
What I am interested in is the technical side of it. There also doesn't seem to be any indication that they're "replacing" existing 1060/1070/1080 models.
The only ones people will get caught up on are the completely new 1080 Max-Q models, where buyers might think they'll get full 1080 performance when they won't. Hell, that already happens now, since the "full" mobile Pascal chips don't actually clock anywhere near as high as most desktop ones anyway (mostly because typical desktop cards are now pre-overclocked).
As I said, if you look at the product lineups, most of them just added a new GTX 1070 version of whatever their thin/light GTX 1060 model was. Some have added a new "thin" 1080 Max-Q model, which will cost a tonne of money and, let's be honest, bothers NONE of us here, because buyers of those models probably don't care that much about that kind of money anyway.
In almost all of those cases, it's not like you were going to get a full-fat 1070 in there anyway, so... -
After reading page after page, I am so confused. What is it that's wrong? I like the fact that it uses less power.
Everywhere else I read that this is an improvement. You think there is no improvement in design and architecture - that they are just BSing with downclocked versions of their cards, when there are already laptops with non-Max-Q GPUs people can buy?
I am buying a new laptop. What is wrong with this new Clevo or the new GS73VR? I just bought a GE72MVR and it uses way too much power. It was falsely advertised as having Thunderbolt 3, plus XoticPC damaged it, so I am returning it.
I would prefer to upgrade parts (LGA), but even when you get MXM there are changes in the architecture that make things incompatible. That is what I read about MSI's issue when they promised upgradeability.
What is so wrong with it being BGA but having Thunderbolt 3? Doesn't that ensure future-proofing?
I read all kinds of negativity and it stresses me out; I have done too much research to buy my stupid laptop, and it is frustrating.
And f*ck this website - it freezes up my phone every time. I just realized it keeps draining my network, and I'm finally not frozen when I turn on airplane mode.
I am getting this new Clevo or the Stealth, because Asus's 120Hz panel is BS with IPS response times around 20 milliseconds. At 120 fps you need a response time of at most 8.3ms to fit 120 frames in 1 second. Clevo's display was last rated at 12ms, but has G-Sync. MSI has the fastest response time, but no G-Sync.
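The refresh-rate arithmetic in that last point, as a quick sketch - the Asus and Clevo response times are the figures cited above, the 120Hz budget is just 1000/120, and the MSI number is a placeholder assumption since no figure was given:

```python
# Frame time vs. panel response time: to keep up with a given refresh rate,
# a panel's pixel response should complete within one frame interval.

def frame_time_ms(fps):
    """Duration of one frame in milliseconds at the given frame rate."""
    return 1000.0 / fps

budget = frame_time_ms(120)  # ~8.33 ms per frame at 120 Hz

# Response times as cited in the post above; the MSI value is an
# assumed placeholder (the post only says it's the fastest).
panels = {"Asus IPS": 20, "Clevo": 12, "MSI": 5}

for panel, response_ms in panels.items():
    verdict = "keeps up" if response_ms <= budget else "too slow - transitions smear across frames"
    print(f"{panel}: {response_ms} ms response vs {budget:.2f} ms frame budget -> {verdict}")
```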
The reason Nvidia pushed out this Max-Q graphics trash!! Market share is one thing, but a -27.8% decline in the discrete GPU market is much worse!!
NVIDIA dGPU Market Share at 72.5%, AMD’s at 27.5% – Enthusiast and High End GPUs Win Last Quarter
"First up we have NVIDIA which saw a market share increase of 2%. This brings NVIDIA dGPU share to 72.5% which is their highest in several months. NVIDIA's dGPU market share stood at 70.5% in the previous quarter but has managed to grow further despite a decline in the overall market. The reason is simply because of NVIDIA's high-end offerings which are detailed in the segment divided market share. NVIDIA saw an overall decline of -27.8% in the discrete GPU market with -23.0% in the notebook and -25.6% in overall PC graphics shipments." -
You may still exchange 100 pages of posts before, during, and after purchase, but they will all be posts of joy.
The Max-Q GPUs are just normal Pascal GPUs, neutered to fit into a smaller thermal/power profile - nothing magical; in fact it's all very straightforward. If you want less heat to dissipate, draw less power and do less work.
The 1080 Max-Q is really the darling of the whole project for Nvidia, because it's the unicorn everyone wants: a 1080 in a less-than-10 lb laptop that provides full performance.
The Max-Q 1070 is just an expensive solution to what is now done just fine by an OC'd 1060. Nothing special here either.
The only special Max-Q product is the 1060 Max-Q. It has the same TDP power profile as a full 1060, but by using constant throttling off demand to reduce load, power draw, and therefore heat generated, the 1060 Max-Q is the sweet spot if you really want to build the smallest laptop that stays quietest under load using the new overall Max-Q treatment.
That's what I'd get if you want the thinnest gaming laptop, balancing the excellent performance of the 1060 with Max-Q power throttling to keep the heat down and therefore the fans quiet. -
I guess the main question you need a straight answer to is what using less power will cost you in terms of performance. Doing more with more always works out better than doing more with less. Less with less also works a lot of the time. Doing more with less is almost always going to be messed up. Good luck getting a straight answer on that, especially from shills professional reviewers. Most of them will be on such a sugar high from drinking the NVIDIA Kool-Aid that everything will come up smelling like roses.
Just brace yourself and don't be disappointed if you take the leap of faith and buy one, only to find that it gets trounced by everything else that hasn't been TGP/TDP-castrated, and still runs hotter than you'd like it to. -
I am unashamedly, unabashedly, and unapologetically after a portable, slim laptop that can give me the best gaming performance possible while not compromising on throttling, longevity, and user experience.
I really do not care for the "hit the gym" crowd that seems to suggest the importance of laptop portability exists on an inverse scale to how ripped you are, which is ridiculous and reductive.
A 1050 Ti almost hits that sweet spot despite the rather surprising lack of options, but if a 1060 Max-Q can sit between the 1060 and the 1050 Ti performance-wise, it's almost perfect *for my particular use case*.
The only issue becomes pricing, and I think this is where the rub is, and where for *me personally* it'll be a non-starter.
With THAT said, it is ridiculous that a 1060 Max-Q is called a "1060" and not a "1055", and the same goes for a 1070 Max-Q being called a "1070" and not a "1065".
I understand there is a rationale for it (same number of cores, same chip, etc.), but from a performance standpoint it only confuses the consumer. Confusing the consumer may make you money, but it will not earn you goodwill and loyalty. -
Now boys, over my journey of the last 6 years in the laptop space, after finally admitting my Sony Vaio with its iGPU was a piece of junk, I've seen so many things - from blinded consumers to how corporations like Intel/Nvidia went from competitive and honest to lying and controlled by greed.
Let's look back to when I first entered the laptop space around 2010-2011. I think that was when Intel came out with Sandy Bridge vs AMD's FX, and we all know how that went. Intel had an amazing CPU at the time; its silicon quality at 32nm was pretty good - you could overclock average chips to 4.7-4.9GHz with decent voltage - and its IPC was extremely high vs the competition.
Of course Intel had forced a socket change and people accepted it. But remember when CPUs ran on a tick/tock cycle of 2 years, and each year had its own first-gen and second-gen chip within either a tick or a tock? At the time the desktop had the 2500K and 2600K, and the mobile space the 2920XM. Intel's old routine was that half a year later they would release a better-binned/optimized chip - in Sandy's era the 2700K and 2960XM. This carried into the Ivy Bridge era, but we started seeing chips unable to clock as high, and people made a convenient excuse for others to accept: that 22nm simply couldn't clock better due to tighter density. In a way it made sense, and we still had the 3920XM and, half a year later, the 3940XM. This carried on until the 4940MX, and TBH the early signs were already there that it was going downhill.
AMD wasn't able to catch up, so Intel had no need to make progress. However, since they already held pretty much the whole market, they simply couldn't increase their profit margin any further, so they found other ways to make additional money - and when a company starts doing that, we're on the path to doom.
Intel does this by closing fabs and laying off employees, and instead of improving their chips they started the *Haswell Refresh*. Technically that was supposed to be the second-half optimization of the original tock generation, but Intel made it last an entire year. That carried on to Broadwell, which came out extremely late, then Skylake - and now the optimization phase lasts not 1 year but 2. And look at just how well the current 14nm overclocks, when people were made to believe 22nm couldn't clock well because of tighter density. Other money-making schemes include controlling CPU frequency by lowering the TDP/current going into the CPU and only allowing turbo to last 20-30 seconds before it throttles; and making BGA chips and forcing OEMs to give in or lose their discounts on batch CPU purchases - every notebook they sell means Intel sells a CPU (no one wants AMD machines). These are their tactics, now including thinner PCBs and non-soldered TIM to reduce the longevity of their CPUs, forcing people to buy new ones sooner.
Nvidia, on the other hand: back with the desktop/laptop 580/580M they had the crown and gave consumers a full-blown GPU chip. But starting from the 680/680M they gave us a cut-down chip, because they were in the lead and AMD could only compete against a cut-down GPU - so they pocketed the extra profit. It explains why the 780/780M could have more GPU cores without them needing to redesign anything - no extra cost spent there. They introduced Maxwell and the 3.5GB on the GTX 970, tried to make extra money with G-Sync and called it a feature, then they slotted in Pascal - which wasn't even on their roadmap several years ago - so they could give us a cut-down version of Volta called Pascal. Since the 900 series they've gone full BGA in high-end notebooks, just like Intel, and now Max-Q, ROFL.
I came across some random guy posting on the internet who stated that the 1070N and 1080N were too hot and throttled, so Nvidia came up with the idea of Max-Q to reduce temperatures - "about time Nvidia did something", he said. Can you believe how stupid the mass consumers are? They honestly believe this is yet another feature/innovation that Nvidia has come up with to target the heating issue while giving them a *true* 1070M and 1080M notebook. It just blows my mind at times. -
The 680MX, conversely, was actually the first generation where the flagship used the same core design as its desktop counterpart: both used GK104 with a 1536-core count. Also worth noting, the desktop 680 was *only* 195W TDP, which is significantly lower than any other flagship before or after it. There was no 250W Kepler chip until the 780.
-
In this case the points I make hold true: people can glorify how innovative Nvidia is with their Max-Q and G-Sync, but in the end the company used shady marketing and got ignorant consumers to fall for more of their crap. These things are intended for further milking, no matter how nicely you word it.
There's FreeSync, and it works just as well as G-Sync; there's no reason Nvidia should charge more for a G-Sync display. And there's no reason Nvidia should charge extra for Max-Q when all they give you is the same hardware, if not a worse performer. -
While it would be nice if they made G-Sync free, having the option to do so is a luxury only a company with that kind of money at their disposal can afford. Not to mention they would somehow have to decouple it from its dependence on the Tegra SoC used as the TCON.
Whether you like it or not, Nvidia dumps massive amounts of money into R&D because they make so much damn money. -
Like it or not - facts, lul.
Max-Q should be renamed mini-milk. -
I wonder if there will be proper standard MXM 3.0b form-factor GTX 1080 Max-Q boards that don't require an extra power connection, which could be used to upgrade older laptops with standard MXM 3.0b slots, like the Alienware M18x-R2 and similar Clevo models. That would be a great application for such parts.
-
Miguel Pereira Notebook Consultant
Any Max-Q GPU will be good enough to play anything if you don't expect ultra 4K gaming. There is a space in the market for a secondary thin grab-and-go laptop - or do you fail to see the logic behind it?
The issue here is the marketing of it, and how they will try to deceive the consumer with branding.
Sent from my MHA-L29 via Tapatalk -
Yay guys! I got some information that directly compares it with the true GTX 1070 and GTX 1080. Apparently Gigabyte has tested the Max-Q GPU and gave a 'conservative' guess that it performs 15% better than a GTX 1070, while consuming less power and running cooler. Add the true GTX 1080 into this equation and you'll see it has 40% better performance than the GTX 1070.
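Taking those two figures at face value, a quick bit of arithmetic puts the 1080 Max-Q at roughly 82% of a true 1080:

```python
# Relative performance implied by the figures above (GTX 1070 = 1.00 baseline).
maxq_1080 = 1.15   # Gigabyte's 'conservative' guess: +15% over a 1070
true_1080 = 1.40   # a full 1080: +40% over a 1070

print(f"1080 Max-Q vs true 1080: {maxq_1080 / true_1080:.0%}")  # ~82%
```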
It is like announcing 'a new Lamborghini is being released', making everyone assume it is going to be as fast as the previous models - but then the customer realises that while the new Lamborghini may be JUST as expensive and look exactly like the older models, it performs like a..... Toyota? -
You can understand why Nvidia wants to keep a check on this, because ultimately, if they do lazy or no validation, someone can buy a G-Sync laptop that doesn't perform well and then blame Nvidia. There are variables in the panels, GPUs, ODM integration and so on to take into account. -