So Max-Q is a scam after all. Man, this is going to be fun, can't wait to see reviewers post this all over the media.
-
Thousandmagister Notebook Consultant
-
Support.2@XOTIC PC Company Representative
-
You can't possibly believe that a 1080 GPU die in place of a 1070 would have an easier time hitting the same performance output as a 1070 given the same power and thermal limits?
That somehow the 1080 is more power- and thermally efficient than a 1070 at the same performance output?
I think that given the same 1070-level power and thermal envelope for both, the performance would be the same: the 1070 in the same laptop as the 1080 Max-Q would give essentially the same performance, as has already been shown in testing.
Also, a 1070 Max-Q model fitted with a full 1060 instead would get the same performance.
It's the software watching the power and thermal limits with appropriate tuning feedback that is doing the work; the GPU dies aren't changed.
Given the same software running it all, with the Max-Q tuning for the same power / thermal limits, watching temperature and fan noise, it would rev things up just fine with the "whole" 1070 on the 1080 Max-Q platform, and the same with the 1060 on the 1070 Max-Q platform.
In real life, a 1080 pulls more power and puts out more heat than a 1070, and a 1070 pulls more power and puts out more heat than a 1060 under the same load.
So far the 1080 Max-Q is matching the full 1070 in performance.
It's a huge waste to put a 1080 GPU in a build and detune it to 1070 performance levels, or a 1070 GPU in a build and detune it to 1060 performance levels; it's all just numbers to let vendors charge and make more $$$.
Too bad they don't consider these things before rushing out and buying them.
-
Meaker@Sager Company Representative
But please, go on.
-
Taking a perfectly good 1080 GPU that could perform at 1080 performance levels, and wasting it down to a 1070 performance level just to fit in a slim case.
When, for a few percentage points less performance - or the same performance with GDDR5X memory (yup, that could happen on the 1070, don't think it can't) - you could use a 1070.
I think if Nvidia had put the same effort into improving the 1060 and 1070 to fit in the slim laptops, it could have done it, adding GDDR5X memory to make it work if that had been necessary, which is doubtful. That would have kept the costs the same as they are now, instead of bumping the price by using more expensive GPUs.
It's a scam, and not a very opaque one at that. It's easy to see once you point someone's nose at it.
That is, unless that someone or their company can profit from selling them...
But please, go on.
-
Meaker@Sager Company Representative
-
Working with the existing GPUs would have been the smart and cost-efficient way to go about reaching that level of performance and user experience.
Wasting more expensive GPU dies just to be able to say it's got a 1080 or a 1070 instead of a 1070 or 1060 respectively is pure marketing BS to extract more money from buyers.
Over-spec'ing 1080 / 1070 GPUs into the solution is not providing better performance or a better user experience; it's purely for mining $$$$'s.
-
Would be nice to see the same tech in desktops, with equal pricing - aka pay 1080 prices for a 1070. I wonder what the desktop camp would say.
-
Meaker@Sager Company Representative
Power goes up with the square of voltage.
Power goes up linearly with frequency.
Power goes up linearly with unit numbers.
2560 shaders at 900 MHz at 0.95 V
1920 shaders at 1000 MHz at 1 V
Performance
2560 x 900 = 2,304,000
1920 x 1000 = 1,920,000
So the 2560-shader card would be around 20% faster. However, it is doing this at 5% less voltage, and since power scales with the square of voltage, that's roughly a 10% power saving per unit of work.
Power
2,304,000 x 0.9025 = 2,079,360
1,920,000 x 1 = 1,920,000
1,920,000 / 1,920,000 = 1 relative performance factor per watt
2,304,000 / 2,079,360 = 1.11 relative performance factor per watt
Hence a larger core is more efficient working at lower clocks. So for a given form factor, using a larger (or more fully enabled) core gives better performance.
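To make that arithmetic easy to poke at, here's a tiny sketch using the same made-up operating points and the simple perf ~ shaders × clock, power ~ shaders × clock × V² model described above (illustrative numbers, not measured data):

```python
# Rough perf/watt comparison of a wide, slow core vs a narrow, fast core.
# Assumes the simple scaling model above: perf ~ shaders * clock,
# power ~ shaders * clock * voltage^2. Operating points are illustrative.

def perf(shaders, clock_mhz):
    return shaders * clock_mhz

def power(shaders, clock_mhz, voltage):
    return shaders * clock_mhz * voltage ** 2

configs = {
    "wide/slow (1080-like die)": (2560, 900, 0.95),
    "narrow/fast (1070-like)":   (1920, 1000, 1.00),
}

for name, (s, c, v) in configs.items():
    p, w = perf(s, c), power(s, c, v)
    print(f"{name}: perf={p:,} power~{w:,.0f} perf/watt={p / w:.2f}")

# wide/slow  : perf=2,304,000  power~2,079,360  perf/watt=1.11
# narrow/fast: perf=1,920,000  power~1,920,000  perf/watt=1.00
```
-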
The only way to really test it out would be to build a Max-Q laptop with both a 1070 and a 1080, tune them for best performance, and see which one performs like a 1070... you let me know how that works out - but I think we already know the answer: both of them.
Until then...
There are already 1070-wielding slim laptops with the same power and thermal limits. Comparing the GL702VM's 1070 running without Max-Q tuning software against the 1080 Max-Q, we already know the gaming performance is the same; the only difference is the huge cost of the Max-Q solution vs. the $1000-cheaper GL702VM.
Given the same Max-Q dynamic tuning for optimizing GPU downtime - the idle time between load power / thermals - I think the GL702VM would run even better than it does today.
Maybe Nvidia will release the driver and firmware Max-Q tuning for all mobile GPUs?
That would be a nice thing for Nvidia to do, don't you think? -
Meaker@Sager Company Representative
-
-
And we haven't talked about overclocking and thermal headroom for the coming thin and flimsy.
For example, the 1080 Max-Q is 90-110 W vs. the working 1080, which is 180-200 W in some models. I understand Pascal does better at the lowest possible temps, and ODMs put in 10-15% headroom on top of the cooling capacity. Will the thin and flimsy machines with the new power-crippled card OC and hold higher clocks the same as the thicker models with the "normal" graphics? See, e.g., 190 W graphics + 47 W BGA will have a thermal headroom of around 23.7-35.5 W, vs. a lot lower for the new crippled machines. I expect the performance will be further crippled vs. normal working graphics. Thin and flimsy isn't nice!! The main reason this bastard exists is that it's pushed out by Nvidia.
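Spelling out that headroom arithmetic (the 190 W / 47 W figures and the 10-15% cooling margin are the assumptions above, not measured data):

```python
# Thermal headroom = cooling margin on top of total component heat output.
# 190 W GPU + 47 W CPU/BGA package are assumed figures, as stated above.
gpu_w, cpu_w = 190, 47
total = gpu_w + cpu_w  # 237 W of heat the cooling system must handle
for margin in (0.10, 0.15):
    print(f"{margin:.0%} margin -> ~{total * margin:.1f} W of thermal headroom")
# 10% margin -> ~23.7 W, 15% margin -> ~35.5 W
```
-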
A laptop designed and built with a 1070 instead of a 1080 would provide the same, or "close enough", 1070 performance as the Max-Q 1080 for $1000 less.
That's all it comes down to, really, making money where there was no money, instead of providing better performance and user experience for the same money.
It's not really a radical idea, and would be much better for Nvidia to have done, considering AMD is sniffing at their heels.
AMD should have a nice clean shot at taking a lot of market share from Nvidia if they keep this up and don't turn themselves around.
-
Since a Max-Q laptop is already "Max-Q"-ing out performance, there will be absolutely no OC'ing a Max-Q laptop.
-
-
1070 desktop GP104 = 1920 shaders, 120 TMUs
1070 mobile GP104 = 2048 shaders, 128 TMUs
How else do you think the mobile 1070 gets anywhere near the desktop version with a 40W lower TDP?
-
And don't forget... the use of weaker components!!
-
The $1000 additional cost to the end user is ridiculous when compared to the same laptop with 1070 performance as already shipping.
The die may cost Nvidia the same, so in that case Nvidia should pass the "no extra cost" on to the user and keep the price the same as a 1070 laptop is today.
At the very least, don't call it a 1080 - call it a 1070 Max-Q - it's got 1070 performance when shoehorned into a laptop far too small for its own good, so its expected performance should be reflected in its name.
The improved cooling of the newer-designed laptops, plus the firmware and software for Max-Q to get finer control over power and thermals, wouldn't be needed if increasing the SM / TMU counts alone were the solution.
The Max-Q tuning is firmware and driver changes that enhance the dynamic tuning to constantly adjust the GPU load on demand, giving it more time to cool down while still responding quickly to additional demand.
It would work the same with a 1070 rather than a 1080; improving power usage and reducing thermal output while maintaining performance in the same power / thermal chassis package, at the same cost to the end user, should have been the goal.
It's basically a better boost with an added "de-boost" (see the rough sketch below), although that's a guess for now, until we can see performance graphs for Max-Q GPUs under varying load to see what Nvidia is varying under what conditions.
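For illustration only, here's the kind of boost / de-boost feedback loop being guessed at - a hypothetical sketch with made-up limits, sensor values, and step sizes, not Nvidia's actual Max-Q algorithm:

```python
# Hypothetical power/thermal/noise-capped boost loop. Everything here
# (caps, sensor reads, clock step) is an assumption for illustration.
import time

POWER_CAP_W = 110       # assumed Max-Q power target
TEMP_CAP_C = 78         # assumed GPU temperature target
NOISE_CAP_DBA = 40      # the ~40 dBA acoustic target Nvidia talks about
CLOCK_STEP_MHZ = 13     # typical Pascal clock granularity

def read_sensors():
    """Placeholder for real telemetry (power draw, temperature, noise)."""
    return {"power_w": 95.0, "temp_c": 72.0, "noise_dba": 38.0}

def tune_step(clock_mhz):
    s = read_sensors()
    over_limit = (s["power_w"] > POWER_CAP_W or
                  s["temp_c"] > TEMP_CAP_C or
                  s["noise_dba"] > NOISE_CAP_DBA)
    # De-boost when any limit is exceeded, otherwise creep the clock back up.
    return clock_mhz - CLOCK_STEP_MHZ if over_limit else clock_mhz + CLOCK_STEP_MHZ

clock = 1300
for _ in range(5):          # in reality this would run continuously
    clock = tune_step(clock)
    time.sleep(0.1)
print(f"clock after a few iterations: {clock} MHz (all assumed limits respected)")
```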
More likely Nvidia will provide a locked down environment to prevent users abusing the hardware.
If that Max-Q software is indeed monitoring the right things, it's already getting the most performance out of the cooling system that it can at a 40 dB noise output level; the extra few FPS at higher noise levels are likely hardly worth it.
Hopefully tuning options might be available on higher spec laptops with cooling better suited to the potential of the GPU at some later release.
If Nvidia had used a 1070/1060 instead of a 1080/1070 for Max-Q tuning in these new slim laptops it might be possible to unlock the limits, but not likely with a 1080/1070 that could easily overpower the cooling and power delivery with the limits removed.
Of course there is that built-in limit set in the firmware to match the lower wattage PSU, Nvidia might feel comfortable relying on that as a limiter and allow some user tuning.
We will know when we see one of the new Max-Q laptops fresh out of production destined for end users.
-
Ionising_Radiation Δv = ve*ln(m0/m1)
But here's the catch: GPUs have got no moving parts, unlike jet engines. They don't exactly need frequent and meticulous maintenance routines. The former are solid-state electronics, and a few decivolts, though they may make a GPU last a year or so longer, would hardly make a difference in the practical life of a GPU, meaning to say it'd be thrown away or replaced long before it ever hit its life span.
I understand your point about the notebook 1070 having some 128 shaders more than the desktop counterpart, with clock frequency and voltage lower. However, we're talking about putting a GTX 1080, which has some 640 more cores than a GTX 1070, into a notebook hardly two centimetres thick.
Fitting such a GPU into a thin-and-light just to downclock and undervolt it with weak fans (more air moved = more noise, and more air moved = more effective cooling, simple as that) is a waste of GPU. Like the reviews have mentioned, it'd end up performing hardly any better than a 1070 anyway. Not likely to have much OC headroom, either, given the thickness of the notebooks in question. Obviously it'd be efficient, because we're taking a high-end GPU and making it perform at mid-range levels of performance.
Going back to the jet engine analogy, it's like bolting GE90s on a B737 or an A320 (not that they'd fit, anyway) and making them run at 60% N1. Sure, we get efficiency. But it's a waste of a jet engine. Performance tiers exist for a reason, and if we start over-engineering everything just for the sake of efficiency, then we have a lot of wasted potential, and a waste of resources that we'd never use, anyway. I've already mentioned that the Pascal GPUs start disabling cores when the voltage begins to run too low. We might see the same case here, where instead of 2560 cores running at 0.95 V and reduced clocks, we may have 2400, 2300, maybe even just 2000 cores running at 0.95 V and reduced clocks. -
RMA isn't nice and costs money!!
-
All 1070s start out as a GP104 core. They get binned and cut down to suit which SKU they're going into. A 1070N (mobile) is not a "bumped up" 1070, more like a "less cut down" 1080. Technically, a 1070 costs more money to create than a 1080 because it's not a fully functioning GP104. The reality is, Nvidia sets the sell prices based on what comes out, not what goes in. If the process were perfect there wouldn't be a 1070; everything would be a 1080.
A) What Nvidia charges for each SKU of the cores. We don't really know what this is. For all we know, the difference between a 1070 GP104 and a 1080 GP104 is almost nothing.
B) What the ODM charges on top of that. Again, we don't really know this either.
Neither of these things implies a "huge" added cost or a problem with the technological concepts here. The huge price tags we're seeing on these new 1080 Max-Q machines are more likely the "wow" price tags you get from Razer and such. The more regular models getting Max-Q variants appear to be priced much more reasonably, as I've documented at least with MSI a few pages back.
The key here is voltage efficiency. As voltage goes up, power consumption (and therefore heat) goes up with the square of the voltage and efficiency plummets. This is a concept everyone really needs to wrap their heads around.
The point (these are just example numbers) with Pascal in general is that the highest efficiency point seems to be around 0.9V -> 1.0V. Clock it appropriately and design the power delivery to work as efficiently as possible in that voltage range and your overall perf/watt goes up. Enough so that you can apparently jam a full GP104 that runs like a 1070 into a 110W power envelope.
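A rough sketch of that trade-off using the same f × V² scaling - the 180 W / 1607 MHz / 1.05 V baseline and the candidate operating points below are assumptions for illustration, not GP104 specifications:

```python
# Illustrative only: how much clock/voltage must drop to fit a full GP104
# into a ~110 W envelope, using the simple power ~ f * V^2 scaling.
base_power, base_clock, base_volt = 180.0, 1607.0, 1.05  # assumed baseline

def scaled_power(clock_mhz, volt):
    return base_power * (clock_mhz / base_clock) * (volt / base_volt) ** 2

for clock, volt in [(1607, 1.05), (1468, 1.00), (1290, 0.90)]:
    print(f"{clock} MHz @ {volt:.2f} V -> ~{scaled_power(clock, volt):.0f} W")

# 1607 MHz @ 1.05 V -> ~180 W
# 1468 MHz @ 1.00 V -> ~149 W
# 1290 MHz @ 0.90 V -> ~106 W
```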
One day we might even get there: cut-down chips would become a thing of the past, because binning would become obsolete.
That implies that there's a significant amount of RPM headroom in the fans. Granted it could come entirely down to firmware/bios limitations, but I have no doubt that a reasonable OC would not be limited due to thermals.
I guess we'll see when people get their hands on them. -
Pro-tip's my sweet patootie.
Nvidia doesn't need to apply a full 1080 to this solution, Nvidia doesn't need to call it a 1080 when it's delivering 1070 performance, and Nvidia doesn't need to charge $1000 more than the laptops providing 1070 performance today.
Nvidia is trying to cash in like Razer has with their 1080 Blade off the innocent dupes that fell for their marketing, paying for a full 1080 but only ending up with 1070 performance, and no CPU OC headroom due to power limits.
End of story.
-
We know the former 980M graphics card had a high failure rate, and why this happened. With cheapo components intended for low load, as stated (Max-Q)... yes, we already have the gold standard for how this will go if overclocking is/was possible. And Max-Q is BGA.
I wonder how cheap it will be, and how easy it will be to opt for a new or used MB.
-
Dialup David Notebook Consultant
When has Nvidia ever let the user have control of their hardware? The only reason overclocking via software works was because of a driver exploit you could use Riva Tuner to manipulate. Nvidia tried to patch that and the PC community damn near burned their brand to the ground until they undid the patch.
Voltage control on mobile is exclusively done via vBIOS modding, as well as TDP unlocking that requires a potentially bricking flash, etc. This is completely the user manually doing all this, no hand holding. I wouldn't call that "allowing that to happen".
In my eyes, Max-Q is just a marketing gimmick to sell gimped and possibly low-spec ASICs that never made the cut for the voltage/clock specification to make it into other platforms (mobile/desktop). So instead they nuke the clock speeds, cut the TDP at least in half, and call it a special new part. This way manufacturers that don't want to spend the R&D dollars actually building a machine that can house/cool/power higher-end cards can just plop in this new "low power" card and keep the standard GTX 1080 or whatever moniker. Then the user, who has no idea whether or not it's a gimped Max-Q variant, spends a fortune on this hardware that can never possibly compare to the equivalent non-gimped GTX 1080, and proceeds to go on these forums just to complain about how terrible the performance is compared to the guy that spent less on a 17.3" 30 lb Sager/Clevo etc.
Don't get me wrong here, I think everyone wants a Razer Blade form factor with a full-TDP GTX 1080 without fusing hydrogen atoms, but using this Max-Gimp marketing trash to push overpowered and underclocked hardware is a load of crap. Hopefully I'm wrong and just barking in the wind, but I'd like to be surprised.
-
Now we just hope AMD releases their Vega GPU in notebooks with full power and a reasonable price - I will stick with AMD forever :/
-
-
Support.2@XOTIC PC Company Representative
-
How is the eGPU market going currently? They seem like a very good option for a GPU upgrade without sacrificing battery (you only game with a mains plug nearby), thermals, or laptop size. The only problems I see are that the Thunderbolt port decreases performance slightly compared to desktops, and that some eGPUs are not very portable... Would the price of a base eGPU enclosure (with no actual GPU) still result in it being more economical and useful than an overpriced and underpowered Max-Q GPU?
And for people who claim to game without a mains plug nearby, the only reason they can do so is because they are playing pinball or something... The battery life for an actual game would be max 2 hours (and that's on a very efficiently designed laptop).
Support.2@XOTIC PC Company Representative
I think the initial eGPUs were timed poorly. They were clearly conceived before the full 10 series closed the performance gap, and once you could get near-desktop performance even in a thicker, heavier laptop, they became less attractive. Right now it seems like nobody knows whether to jump on the bandwagon, hold off for a bit, or skip it entirely. This, coupled with the fact that the people who would benefit most often have older systems lacking the Thunderbolt port necessary to run one (so they'd need to buy an entirely new computer just to be compatible), and that newer low-end systems are still being released without TB, all contributes to confusion and, I imagine, lower sales.
Ionising_Radiation Δv = ve*ln(m0/m1)
With eGPUs, wouldn't the fact that most thin-and-lights have a U-series CPU create a massive bottleneck?
I mean, two cores, four threads, that's hardly better than a Pentium... -
Support.2@XOTIC PC Company Representative
There's a decent number with HQs, shouldn't be a huge problem. But you're right, the U series will be an obstacle. -
Benchmark | 13R3 – GTX1060 dGPU | 13R3 – GTX1060 eGPU | XPS9365 – GTX1060 eGPU (Akitio Node)
Unigine Valley | 2,368 | 2,408 | 2,067
Unigine Heaven | 1,431 | 1,363 | 1,188
Unigine Superposition | 2,085 | 2,073 | 2,164
3DMark Time Spy | 3,537 | 3,677 | 3,507
3DMark Fire Strike | 11,326 | 11,005 | 9,292
Metro Last Light Redux | 83 FPS | 71 FPS | 58 FPS
Rise of the Tomb Raider | 54.28 FPS | 47.83 FPS | 43.13 FPS
Tom Clancy’s The Division | 49.9 FPS | 40.8 FPS | 39.1 FPS
I think it is a good result for a CPU like the Intel Core i7-7Y75 @ 1.3 GHz, and without a plug (only battery).
Support.2@XOTIC PC Company Representative
I still think that would be noticeable in a lot of cases. -
The other gotcha is the new rage for 120 Hz displays on all these smaller, thin TB3 notebooks.
The TB3 bandwidth to the eGPU and back to the internal screen has enough trouble at 60 Hz/FPS; IDK how it's going to react to trying to feed 120 Hz / 120 FPS gaming back to the internal display.
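Some rough numbers on that return path (assuming uncompressed 24-bit 1080p frames - real drivers may compress or use other formats, so treat this as a ballpark only):

```python
# Approximate bandwidth needed to send rendered frames back over TB3 to the
# internal panel. Assumes uncompressed 24-bit 1920x1080 frames.
def return_path_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

for hz in (60, 120):
    print(f"1080p @ {hz} Hz: ~{return_path_gbps(1920, 1080, 24, hz):.1f} Gbit/s")

# 1080p @ 60 Hz  -> ~3.0 Gbit/s
# 1080p @ 120 Hz -> ~6.0 Gbit/s
# TB3 is 40 Gbit/s raw but only ~22 Gbit/s of PCIe data, and that same link
# also carries the textures and commands going *to* the eGPU.
```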
It's always been a good idea to connect the display to the eGPU directly, but with a 120 Hz internal display it's going to be mandatory.
-
I'm also interested in an eGPU solution, to have one ultimate laptop for everything (gaming, work, etc.).
I also think the Surface Book with Performance Base is really close to being the ultimate laptop; unfortunately the design doesn't allow easy maintenance (no removable back cover), so re-pasting and cleaning the heatsink + fan would be a nightmare to DIY.
I don't know why no gaming laptop company makes one like the Surface Book... a gaming laptop with a touchscreen, stylus, and 360-degree hinge.
Imagine... a slim gaming laptop like the MSI GS43VR but with a 360-degree hinge to convert into tablet mode.
You can't have both AAA gaming and tablet in the same laptop.
Even a tiny low end GS43VR is sucking in AC power and blowing out huge amounts of hot air when in heavy gaming - you couldn't hold on to it. And, you couldn't fold the screen over covering the air intakes. And, you'd have an AC cable trailing from the laptop / tablet.
Instead, right now, you could get 2 devices, one built correctly for gaming and one built correctly for tablet use.
Or, you could get one of those new magical Max-Q laptops that transforms into a Sushi Serving Table:
Those are totally awesome.
-
Support.2@XOTIC PC Company Representative
-
-
Ionising_Radiation Δv = ve*ln(m0/m1)
Sticks, now with a built-in battery and proprietary Lightning plug which you can plug into your iPad to power the warmer coil built into the chopsticks. Keep your food warmer as it traverses the thirty centimetres from your plate/bowl to your mouth!
Available at all Apple Stores and the Apple Online Store. The Sticks Manager app is available on the iOS App Store.
-
I'm sorry, I'm not being clear enough.
I didn't mean a convertible laptop with a high-performance GPU in it... it would be a disaster, like you said, with all the heat and power consumption.
What I mean is a Surface Book-like device, but from a gaming laptop company like Asus or MSI or Razer.
So it would have the same features as the Surface Book, but with a removable back cover for easy maintenance.
As you're probably aware, the Surface Pro has been copied a lot by other laptop manufacturers.
But not the Surface Book. Only the Surface Book managed to put a GTX 965M into a 13" convertible laptop.
Other manufacturers' convertible laptops only manage to use integrated GPUs.
The closest 13" laptop with a dGPU is the Alienware 13, but it has the footprint of a 14" laptop and it's also not convertible.
MSI had the GS30 with a 950M, but it's also not convertible. And they also stopped producing 13" gaming laptops (no GS32 successor).
-
We can only dream of a future with Mimosian Antimatter Chopsticks...
-
But there are other gaming laptop makers that are, or soon will be, making 1050 / 1050 Ti, and perhaps even 1030-based laptops.
But even those dGPUs (as well as the 965M) are going to need AC power for full performance; there really aren't any (powerful) dGPUs that run the same on battery as on AC, it's just not possible.
I think I saw one with a "130" GPU mentioned - it was either a typo and they meant 1030 or Nvidia has a new class of 3 digit GPU names. That may have a battery mode that's less power sucky than the higher models, but probably not powerful enough to do much either.
Keep watching for new releases -
-
Support.2@XOTIC PC Company Representative
I'm guessing 1030.
And I don't see a good battery gaming solution any time soon in laptops at least. You're gonna be tied to a wall whether it's a hybrid ultrabook or a massive DTR.
-
Support.2@XOTIC PC Company Representative
-
Can a Gaming Laptop be Thin & Quiet? Explaining NVIDIA Max-Q
-
Miguel Pereira Notebook Consultant
If eGPUs were properly developed, you could have a thin laptop with a 1050 or equivalent and an HQ CPU or Ryzen, and use the eGPU for the heavy lifting.
But eGPUs are not properly developed. They lose too much performance.
Sent from my MHA-L29 using Tapatalk
eGPU is a very attractive option for me, but I'm yet to see it done quite right. I'd want the internal monitor to be able to display 120 Hz, and without a performance hit.
In this instance I would want a good processor too, not a thermal-throttling 2.8 GHz chip that can't feed the GPU.
New Clevos with Max-Q?
Discussion in 'Sager and Clevo' started by pdrogfer, May 30, 2017.