I would be the first to buy... I almost pulled the trigger on a 13.5" Surface Book with the 965M Performance Base. Unfortunately, it lacks a Thunderbolt port.
-
Using less power is a great idea!
I've been living in my car since February; I've got a GTX 1070 laptop (MSI GE72MVR) and a deep-cycle battery (an Optima D27F).
This laptop drains the heck even out of that battery, and I have to recharge quite often.
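Rough math on why it drains so fast; the battery capacity, inverter efficiency and gaming draw below are my assumptions, not measurements:

```python
# Back-of-the-envelope estimate of gaming runtime from a deep-cycle battery.
BATTERY_WH = 66 * 12      # assumed ~66 Ah @ 12 V -> ~792 Wh nominal
INVERTER_EFF = 0.85       # assumed inverter efficiency
GAMING_DRAW_W = 200       # assumed wall draw for a GTX 1070 laptop under load

usable_wh = BATTERY_WH * INVERTER_EFF
print(f"Running it flat: ~{usable_wh / GAMING_DRAW_W:.1f} h of gaming")
print(f"Stopping at 50% depth of discharge (kinder to the battery): "
      f"~{0.5 * usable_wh / GAMING_DRAW_W:.1f} h")
```

That works out to roughly 3.4 hours at full discharge, or about half that if you stop at 50% depth of discharge, which is why the recharges come around so often.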
I would definitely love using less power, and when I'm back home it will save on the bill! -
Don't worry. If they keep on gimping every gen you'll be able to run it off a solar strip soon... like a calculator
Seriously though, I've never noticed any difference on my electricity bills whether it's been a greedy desktop or a puny notebook. I'd rather not game at all than have to play on battery. -
The purpose of the battery isn't to game but to do other stuff, like watching videos (personal opinion). No person in their right mind will want to properly 'game' without a power outlet nearby.
-
I think you're wrong
@TBoneSan
-
Meaker@Sager Company Representative
You can do it, it's just not going to last a huge amount of time. Certainly fine for lighter titles, odd they quote desktop battery life under a gaming title though. -
I used to game on battery on my old M570TU. It lasted about 1 hour. It was fun having people watch you play Tiberium Wars while on the bus from city to city.
-
+ A very nice way to kill your battery long before its time. The battery isn't expensive, so why not?
Hope people buy a spare battery before Dell's premium support expires. If not, Dell can't help you. But there's always eBay or similar places if you really need a new one.
Dealing with Dell after your paid warranty/support has expired can be hard.
-
Meaker@Sager Company Representative
Well, it will use up cycles, yes, but the protections in place should stop it from coming to serious harm. -
People are complaining that their batteries have already reached 10%+ wear after 2-3 months. Wonder why this happens...
Maybe they took the sales sites' advertising seriously.
Oh well.
-
Meaker@Sager Company Representative
There are limits to battery tech still, so it's true you can't have your cake and eat it too.
Like why can't I run full performance on battery and use it like that every day and then not get wear on it?
It depends what you want out of the machine though.
Typically the batteries are not glued, they are replaceable. -
Yeah, it's nice that the storage, Wi-Fi, RAM and battery are still replaceable in notebooks.
-
Holy crap! Mega Koolaid?
-
No doubt that could be fun in some parts of the world, but it's more a novelty to me than something that would dictate the whole premise of my machine.
I get that some students living on campus might need a machine to pull double duty, but when a $300 disposable can do a better job on battery, I'd sooner use that solely for study and something else, uncompromised, for leisure. -
Apparently people here are also forgetting that the 1070 "Mobile" has been using the "Max-Q" concept since release. The mobile version has an extra SM enabled with a lower stock clock in order to run more efficiently compared to the desktop version.
Another thing to consider is that in many cases a "Max-Q" model may get more VRAM and/or more memory bandwidth compared to the existing versions, closing the gap a bit more.
e.g. the P950 would otherwise come with a 6GB 1060, whereas it can now come with an 8GB 1070 with more bandwidth. In the case of the Aorus X5, which already has a 1070, the 1080 Max-Q will get 8GB of GDDR5X.
I'd also like to see what they really mean with the noise limitation. This could be the real kicker that puts everything into perspective.
The number getting thrown around (by Nvidia themselves) is a maximum noise level of 40dB. That's a HUGE change when you consider most of these types of machines are nudging 50dB. Dropping 8-10dB is nothing to sneeze at when you consider it's a logarithmic scale. It doesn't mention whether that covers just GPU fan noise or the CPU as well, though.
To put that in perspective, the previous Aorus X5v6 puts out a rather stupid 53dB (57dB max!) according to Notebookcheck. Even big bulky machines like the P870 kick out 42dB or so. Getting something like the X5 down to 40dB means roughly a 20x drop in sound power, or around 2.5x quieter to the ear, which is massive.
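For anyone who wants to sanity-check those ratios, here's a quick sketch using the standard decibel relations; the "10 dB is roughly twice as loud" rule is a perceptual rule of thumb, not a measurement:

```python
# Convert a drop in sound level (dB) into a sound-power ratio and a rough
# perceived-loudness ratio (rule of thumb: ~10 dB ~= "twice as loud").

def power_ratio(db_drop):
    return 10 ** (db_drop / 10)

def perceived_ratio(db_drop):
    return 2 ** (db_drop / 10)

for before, after in [(53, 40), (57, 40)]:
    drop = before - after
    print(f"{before} dB -> {after} dB: ~{power_ratio(drop):.0f}x less sound power, "
          f"~{perceived_ratio(drop):.1f}x quieter to the ear")
```
-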
Less noise means crippled power, and maybe we will see even more crippled drivers/software from Nvidia. Everything comes at a cost!! You can't get both.
-
Just think about what you've said there, which you've obviously not put any thought into.
Let's just say the 1070 is significantly downclocked. It'll still have to perform better than the 1060 while producing less noise in most cases.
A better example is the MSI GS63.
The current-ish GS63 runs a full GTX 1060, and Notebookcheck records it at 37dB in 3DMark06, 45dB in 3DMark11 and 47dB in The Witcher 3.
The new GS63 with the "crippled" Max-Q 1070 has to do all of that under 40dB, and still be faster than the 1060.
So, care to explain that in a meaningful way? Even in the worst-case scenario, where the Max-Q 1070 performs the same as the standard 1060, the implication is that it still generates less noise (which would imply less heat). -
I left this discussion because it went nowhere, but I want to come back to just ask one thing:
If nVidia had named these Max-Q variants something like 1055, 1065, 1075 (instead of 1060, 1070, 1080), or attached the m designation, or otherwise named it in such a way that the consumer knew it wasn't equivalent to the regular model, would y'all still be mad at nVidia or be more accepting? -
They would still be mad anyway, because we'd find out that it's a 10X0 core that's been downclocked either way. In which case the argument would go "why are they gimped?" or "why isn't my 1065, which is actually a 1070, running at full speed?".
A) Nvidia calls it what it is (i.e. a 1060/1070/1080), attaches some fancy moniker and white-lies its way through it.
B) Or it calls it a 1055/1065/1075, which blatantly lies about what the core is.
There's no "good" way to release a downclocked version of something, but at least option A is technically correct. Most people here don't even care about the technical aspects, or whether the concept of "lower clocks with more cores" even works (which we already know it does with the desktop vs mobile 1070, btw). -
How about improving the cooling and delivering a normally functioning card instead of power-gimped <crippled> cards? A bit more noise, but with better performance. More power gives more heat, but it also ultimately yields higher performance (in every sense of the word).
If nVidia named these Max-Q variants something like you said, 1055, 1065, 1075... they can call them whatever they want, but a lower-performing card should have to cost less. The same way a 1070 costs less than a 1080, and a 1060 costs more than a 1030. -
I think the main issue people take umbrage at is the fact that a 1080MQ is not going to cost proportionally less than a 1080, that proportion being its performance disadvantage.
It's not about whether or not it's a good thing to have these MQ chips; plainly put, it is, as it's allowing, say, Aorus to stuff more power into the same chassis.
My biggest issue, personally, is that the MQ devices probably will sit very unfavourably on the price-performance curve. -
You're not reading me. In fact you haven't understood a damn thing. Most of these machines literally cannot handle the full version of the 1070 or 1080.
Think about it this way....
Hypothetically speaking (the numbers below are just for the example), let's say you're building the thin GS63 and have a power + thermal budget of 100W. You can either:
- Put in a full 1060 (80W TDP) and it clocks normally and runs relatively cool since you have headroom.
- Put in a full 1070 (110W TDP) and it power AND thermal throttles. People will complain it throttles.
- Put in a full 1070, but downclock it to 1100MHz (let's say that brings it to about 100W TDP). It runs within your budget and a bit faster than the 1060. But what to call it...
Option 1 is what we have now (GS63VR 7RF), Option 2 sucks, and Option 3 works but we're gonna have to give it a dodgy name (Max-Q).
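As a rough sanity check on why option 3 still comes out ahead, here's a toy model. The scaling assumptions (performance ~ cores x clock, dynamic power ~ cores x clock^3 with voltage reduced alongside frequency) and every number in it are made up for illustration; Nvidia doesn't publish the actual Max-Q voltage/frequency curves.

```python
# Toy model of fitting "more cores at lower clocks" into a fixed power budget.
# All figures are illustrative, not official specs.

BUDGET_W = 100.0

def perf(cores, clock_ghz):
    # Assume performance scales roughly with cores * clock.
    return cores * clock_ghz

def power(cores, clock_ghz, k):
    # Assume dynamic power scales roughly with cores * clock^3
    # (voltage reduced together with frequency).
    return k * cores * clock_ghz ** 3

# Calibrate k so a hypothetical 1280-core "1060" at 1.7 GHz draws ~80 W.
k = 80.0 / (1280 * 1.7 ** 3)

# A hypothetical 2048-core "1070" at the same clock would blow the budget...
print(f"1070-class part at 1.70 GHz: ~{power(2048, 1.7, k):.0f} W")

# ...so find the highest clock at which it still fits inside 100 W.
fitted_clock = (BUDGET_W / (k * 2048)) ** (1 / 3)
print(f"Fits the 100 W budget at ~{fitted_clock:.2f} GHz")

speedup = perf(2048, fitted_clock) / perf(1280, 1.7)
print(f"...and still delivers ~{speedup:.2f}x the 1060's performance")
```

The real chips obviously downclock further than this toy model suggests (binning, efficiency sweet spots, memory power and so on), but it shows why a power-capped 1070 can still beat an uncapped 1060 inside the same budget.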
So I recently got a hold of some pricing for Australia on MSI models. Absolute pricing may differ, but relative pricing (ie percentage difference) should remain similar between regions:
GT62VR 7RD (GTX1060): $2599
GS63VR 7RF (GTX1060): $2999
GT62VR 7RE (GTX1070): $3099
GS63VR 7RG (GTX1070MaxQ): $3399
All of these are practically identical in other specs, right down to Wi-Fi and warranty (7700HQ / 16GB / 256GB SSD + 1TB HDD).
So let's summarise:
- GT62VR -> GS63VR 1060 models already shows a 15% increase in price. People are willing to pay a 15% premium for the thin GS63 chassis/aesthetics. Absolute difference = $400.
- Upgrading from a 1060 -> 1070 in the GT62VR costs an additional 19%. In absolute terms = $500.
- Upgrading the GS63 from a 1060 -> 1070 Max-Q costs an additional ~13%. In absolute terms that's $400, i.e. 20% less than the same GPU jump in the GT62VR above.
- GT62VR 1070 -> GS63VR Max-Q 1070 shows a ~10% premium, or $300.
- A "hypothetical" GS63 that could fit a FULL 1070 would cost $3568 (2999 * 1.19).
-
That graph is very vague. It doesn't mention the conditions the laptop was used under. It's also very unlikely it's a game, since 99Wh is absolutely **** for 'gaming' (no battery capacity is good for gaming, given the 100Wh limit). That value must have been recorded while the laptop was browsing the internet. Find me one laptop that can properly game for more than 3 hours and I will be impressed.
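Some quick arithmetic on why the ~100Wh carry-on limit makes long on-battery gaming a non-starter (the draw figures below are assumptions, and real laptops usually cap GPU power on battery anyway):

```python
# Upper bound on gaming runtime from battery capacity alone.
CAPACITY_WH = 99                      # near the 100 Wh airline limit

for draw_w in (60, 120, 180):         # assumed average system draw while gaming
    minutes = CAPACITY_WH / draw_w * 60
    print(f"~{draw_w} W average draw -> about {minutes:.0f} minutes")
```

A ~60W average draw would be roughly consistent with the ~100-minute load figure mentioned just below.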
I literally did 20 seconds of searching to prove you wrong (I am a bit disappointed you didn't actually put any effort into refuting me). It does 100 minutes of 'load' tasks (check the Notebookcheck review). -
The real problem with Max-Q (in my eyes) is the pricing. I don't mind that the Max-Q is slightly weaker than the real GTX 1080 (and if you really, really want to game at ultra-high settings, get two GTX 1080s), and I don't really mind the label they put on the GPU. But the laptops seem to be much more expensive than their counterparts. For example, a laptop with the equivalent of a GTX 1070 will cost $3000! And that laptop doesn't even have anything else that seems worth the price (the Predator Triton). A laptop with the equivalent of a GTX 1060 costs $2300 (Zephyrus), while 'ultra thin' counterparts cost less than $2000. The slight performance boost doesn't warrant a $400+ premium. I mean, the Zephyrus has other features like a 120Hz screen, but that still isn't worth the price tag.
-
Put in what fits the cooling, or improve the cooling. If they go for Max-Q, don't charge more than what the performance is equivalent to. Is it similar to a 1070... oh well, then charge 1070 money. No more!! Is it lower, then reduce the price accordingly. I'm not interested in paying the same high prices for less performance, but that's me.
It's Alienware who announced portable gaming. I'm not the one making the graphs or deciding how this is done. Don't shoot the messenger.
-
I did an entire breakdown of how the GS63 MaxQ pricing fits in the 2nd half of my post. Did you even read?!
It clearly explains the pricing and how they (MSI Australia) are charging (IMO) a "reasonable" premium for the performance in that chassis, which is LESS than what a full 1070 would "theoretically" cost. Assuming other brands follow similar pricing structures within their own line-ups, everything should come out fine. -
Assuming is a nice word.
You will see the whole picture when these new machines from all the brands arrive. But people having to pay 1080 prices for a 1% performance gain... oh well. Not me.
And yes, I did read it...
-
Meaker@Sager Company Representative
Presumed performance figures from Notebookcheck... yes, let's take those as gospel.
-
My main concern with the new Clevos isn't Max-Q, it's the crappy IPS screens they are sticking in them. 45% gamut? This is not why people buy an IPS screen.
/end threadjack -
I'm confused, they advertised Max-Q as just a redesign in cooling, not a change in the manufacturing process.
-
Support.2@XOTIC PC Company Representative
Let's be honest, since we're not average consumers here: the average consumer buys IPS because they heard it's better, and that's as deep as it goes. Many would be better off with those 120Hz TN panels but buy IPS anyway. -
Max-Q is the new M... I'm really happy with my normal GTX 1070; what's the point of putting in a 1080 if it will be throttling all the time???
The real change will be Nvidia Volta, not this total marketing BS. Really disappointed by Nvidia this time. -
Maybe the new graphics cards won't throttle all the time? Lower clock speeds will help with that.
Lower clocks = less throttling.
If the ODMs follow Nvidia's spec.
-
Sooo, if it's "pre-throttled" with a low base frequency I would think that would qualify it to be called "throttled 100% of the time", as compared to a full performance non-Max-Q GPU.
The way Max-Q reads, performance *is* perched on the edge of constant throttling, both to meet the 40dB-or-quieter requirement and to keep from thermal throttling. Maybe that acoustic throttling is setting an upper limit on performance to keep the Max-Q from ever showing thermal throttling?
That makes sense, set the fan noise as the throttling factor.
-
I wonder when Intel or AMD will start with similar tech? I'm sure the ODMs would adopt it yesterday.
BTW, clock speeds around base clock aren't throttling.
At least not by Intel's specifications. I'm sure all who buy this will be happy with base clocks.
Who needs performance? It only brings more noise and heat!!
-
Yeah, well, gamers will notice. They are watching like hawks for *more* FPS in the games they know, because they are playing the heck out of them, and they won't be happy not getting what they think they will get.
If the 1080 Max-Q and 1070 Max-Q performance is even a few FPS less than real 1080s and 1070s, for which there are a zillion uploaded examples of performance in every game gamers want, the cow plop is gonna hit the fan, big-time.
If Max-Q doesn't deliver close to 1080/1070 performance, that is going to make a lot of gamer buyers PO'd, and they will be very vocal and expressive about it.
-
Well, all this depends... Average Joes, first-time gaming notebook buyers and former desktop users may think the performance from the new power-gimped cards is normal for laptops!!
Don't underestimate the buyers
-
I'm not talking about the completely clueless; they will be defending their high-dollar flop. I'm talking about the gamers who are taking to heart the literal promise from Nvidia of 1080/1070 performance brought about by Max-Q technology.
If the gamers buying for that purpose, expecting 1080/1070 performance, don't get it, it's going to be a mess for Nvidia to clean up. -
You are the undisputed king of straw-man arguments. I already gave you a bunch of information, so you just go off and find something completely different... zzz
I've already read Prema's post and I have no doubt that what he said is true... for Clevo. If they built a laptop that can handle a full 1070, then bring it down to fit Max-Q for the reasons stated, I think they've made a bad decision. But that has nothing to do with Max-Q itself. Nobody told them they couldn't just release it as is. -
Ionising_Radiation Δv = ve*ln(m0/m1)
As much as I despise nVidia for this shady Max-Q business, I think @quantumslip and @Stooj are right in that we ought to wait for benchmarks first and see whether or not it'll actually be worth the price and the thinness.
-
Since we already have the prototypical Max-Q 1080 embodied in the Razer 1080, I think we've already seen what people will get: performance limited by inadequate power and cooling, an opportunity lost.
I do think it's possible that, given enough honest performance scrutiny from reviewers, the vendors could tune the pricing, dropping it to 1070 levels for the 1080 Max-Q and 1060 levels for the 1070 Max-Q, and that would be a fair trade for like performance.
It still smells like a scam, with a high chance of unrealized anticipated performance by people hearing "1080", all adding up to a flop in the end.
Given how many people were taken in by the Razer 1080, only to realize their error and return it, this could cost vendors a lot of money. -
-
Sorry for being a bit sceptical, but what makes you say that the GTX 1080 in the Razer Blade Pro is anywhere near the new Max-Q GPUs (not in literal performance, since we still don't know the literal performance of Max-Q, but in the other factors)? Just because it is in a very thin laptop doesn't mean they are all the same GPUs, lol. I don't think it is fair to apply that thermal-problem stereotype YET, especially when the evidence points to pretty amazing thermals on the GPU relative to its performance. Also, we are talking about Razer here, the company that to this day still hasn't realised the purpose of cooling in a laptop.
-
I don't have a quote off-hand, but others during the Max-Q roll out pointed to the Razer 1080 as an example of a pre-Max-Q attempt at fitting a 1080 in a slim laptop - Max-Q was suggested as an improvement on that design.
There is sparse data on the Max-Q 1080 performance, but there are 2 benchmarks we can refer to already that show the 1080 Max-Q only 1% and 4% faster than a stock 1070:
Max-Q 1080 1% faster than a stock 1070:
https://www.notebookcheck.net/Analysis-Acer-Triton-700-with-Nvidia-GTX-1080-Max-Q-GPU.214802.0.html
Max-Q 1080 4% faster than a stock 1070:
https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1080-Max-Q-GPU-Benchmarks-and-Specs.224730.0.html
It's likely the Max-Q runs quieter than the Razer 1080, as part of Max-Q's strict qualification requirements is a noise level of 40dB. IDK if that is at idle or under load, but either way it's a lot quieter than a Razer 1080.
Right now as it appears, the Razer 1080 could be a tad faster and a whole lot noisier than a Max-Q 1080, which would be very sad for both
-
I said that I would try and update the above post when I was able to - so here goes! I know a lot of you are waiting for benchmarks and performance figures to come out, but as I mentioned above we're definitely not in a position to do this at the moment. I think the fact that there aren't any official reviews out from any other A-brands either would point to the status at present, which is that the soft launch has occurred at Computex but the hard launch hasn't followed immediately. This is not at all uncommon in the IT industry.
We're not going to be able to release temperature, performance or other figures until we have systems that we are comfortable releasing onto the market - which tbh would be the same for any other product launch or new technology integration we have. I mentioned previously that we currently haven't seen Max-Q chassis which we feel fit in with, or necessarily complement, our current product line, but there's work going on behind the scenes, so I'm not ruling anything out at the moment!
@hmscott (well, and others that are discussing the performance) - bear in mind that Max-Q has to be integrated into different chassis by different manufacturers. It's quite a clear conclusion to draw that the performance, temperatures and noise levels will vary depending on the chassis implementation, and thus you may find that performance differs between chassis manufacturers. -
It looks like pursuing 1080 Max-Q technology wouldn't be worth the effort, as the performance already shown doesn't suggest it will sell:
Max-Q 1080 1% faster than a stock 1070:
https://www.notebookcheck.net/Analysis-Acer-Triton-700-with-Nvidia-GTX-1080-Max-Q-GPU.214802.0.html
Max-Q 1080 4% faster than a stock 1070:
https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1080-Max-Q-GPU-Benchmarks-and-Specs.224730.0.html
Maybe a 1060 Max-Q with the same TDP as a full 1060 will fare better? -
Meaker@Sager Company Representative
This is about getting the most out of a form factor, not hitting performance targets. A 1070 or 1060 with the same restrictions would not do as well.
-
Support.2@XOTIC PC Company Representative
I would hope they notice that their super thin laptop doesn't perform the same as a thick heavy loud laptop and connect the dots. -
Thanks for sharing the benchmark tests! Well, the results sure look awful... I was thinking of buying the approx. $3k (apparently ONLY according to LinusTech) Aorus X5 MD and was sure that a freaking extra $1000 price tag would result in much better performance (midway between a GTX 1070 and a GTX 1080), but now I'm going to reverse that decision and go for the $1000-cheaper Aero 15 with a GTX 1060. There is no use supporting such a s*** GPU/laptop, so I might as well stick with the old GPU... Btw, are all the new laptops getting released going to be equipped with Max-Q GPUs, or only a few (like the Triton, Zephyrus and X5 MD)?
New Clevos with Max-Q?
Discussion in 'Sager and Clevo' started by pdrogfer, May 30, 2017.