I didn't know this until tonight, but AB lets you unlink power and target temps now. I'm not sure how long this has been available. The only catch is that you need to use one of the modern UIs rather than the classics.
-
I will have to look again to be sure. Maybe it's different with desktop GPUs. I don't use the classic UI and I have the latest version, but I haven't been able to unlink them. With the beta Prema Clevo 1080 vBIOS, having the power and temp sliders unlocked is awesome, but having AB link them so they cannot be changed independently sucks. For example, I may want to set my temp limit to 95°C and give priority to power over temps even when playing games at stock clocks, without pushing the TDP to 120%.
Check out the power draw with the little baby beast MSI 1080 in the 16L13... 3DM11, 3DM11 X and FS examples in the thumbnails. And this vBIOS doesn't even have the temp target unlocked, only the power target.
Last edited: Jul 1, 2017 -
-
@Mr. Fox
You might be right about the desktop being different from the mobile.
Just for reference, this is what I'm talking about.
-
http://www.guru3d.com/files-details/msi-afterburner-beta-download.html latest AB is 4.4.0 beta 12
-
Meaker@Sager Company Representative
IIRC Prema did say on the mobile cards this was linked on the firmware side. -
With the alpha vBIOS, un-linking works fine in Inspector & Precision, but not in beta 10 of Afterburner (not sure about v12). They have simply been coded differently. There are also some temperature-related coding errors in Inspector dating back to the GTX7xxM days, which have never been corrected...
Info:
Due to not having the option to remove the access request form on the blog (after setting it private), my inbox has been flooded with hundreds of those requests in just the last few days... no need to do that, as the good old blog has said its goodbyes!
Anyone interested in 'what's next' and wanting to stay posted, please just click my signature instead!
Happy Independence Day Weekend Everyone!
Last edited: Jul 3, 2017 -
If it's not unlinking in the latest release you can tell the author here; he's taking feedback as he goes through it. If the feature broke between 4.3.0 and 4.4.0 then that should count. I would do it for you, but I don't have the vBIOS to reproduce the issue, so I can't.
-
Hi. So I did a thing, wrote an article. You all always said you liked my books, right?
@Mr. Fox I know you've been waiting for this. @don_svetlio you would love this too -
Nice job. I posted a reply to the article.
-
Yup, that's definitely a Mr. Fox comment xD
-
don_svetlio In the Pipe, Five by Five.
Great work @D2 Ultima - I've already posted my reply on NBC
-
Meaker@Sager Company Representative
There is a conversation to be had about prices in general from Nvidia, but this is no different from the mobile card days; it is effectively the successor to the 980M.
You want to say to those who don't want a heavier machine:
"Sorry but this chip is too good for you, it would do better in a larger machine"
And that's just not an attitude I can get behind. -
don_svetlio In the Pipe, Five by Five.
So the solution is to charge 1080 prices and deliver 1070-or-worse performance to those who want a slim machine? I don't think that is fair. Hell, I'd call it disrespectful if anything. A 1080 in a thin-and-light is not going to work, not a full-power one anyway. It's not that we think those chips are too good for them; the problem is that the laws of physics can't be circumvented. -
Meaker@Sager Company Representative
The 1080 performs better per watt; where are you seeing it score lower?
It's the best possible silicon to have at the moment, to not offer it in a form factor where power efficiency is king would be insane IMO.
The 1080 as a whole should not carry the premium it does over the 1070; these chips should cost the same. -
Not sure how good the contact is on the CPU and GPUs, judging by these pictures. I'm planning on putting a shim on the CPU right now, but wasn't sure about the GPUs. During gaming they don't seem to be having any temp issues.
OneDrive album to photos -
don_svetlio In the Pipe, Five by Five.
Did you read the article? Or are you replying without fact-checking anything? MQ GPUs are nothing but undervolted and underclocked normal GPUs. It's not new miraculous tech; it's marketing. And you've bought it hook, line and sinker. -
I don't know what you're reading into on my end, but my entire issue from the get-go has been charging users $1200 for a 1070, and $750+ for whatever the 1070N Max-Q ends up performing like.
I wholeheartedly believe that they could have done it better, or dropped the price of the Max-Q cards to match their performance. Not ONCE did I state, anywhere in that article, that thin notebooks are trash, or should not be bought. I DID say if everyone put the effort they put into getting a 1080NQ to run into other, larger notebooks, we would all be having a much better time right now.
I do not like that they have just made essentially a "1080M" and the price is still what a "1080N" is.
You change the price, you don't oversell the performance, and I am perfectly fine. You optimize what we have and everybody benefits; even better. If a 1070N can do the job, why market it as a 1080N and promise that performance? That's the most anti-consumer practice I can think of, and it's exactly why I wrote this. You have been in this game longer than I have; you know how much names mean to those who aren't in the know. I never once said they could not apply the same technology to an existing, say, 150W 1080N. I said that 110W is a terrible restriction. -
Meaker@Sager Company Representative
I see an opinion piece, not an article, and no hard data to back up the claims.
We are not even selling a 1080 Max-Q at this point, so you can be sure you would be getting the same treatment as from before I posted on behalf of anyone.
Clocks/voltages are dynamic now; you can use MSI Afterburner if you want to limit the clocks in current designs. Due to power circuitry differences (having more phases designed for higher loads) they will never be quite as efficient, but that's the hardware design trade-off you make. -
I don't understand this statement. I have, for as long as I can remember, complained about Clevo's lackluster tolerances on heatsinks, especially on their high-end models. What I said here held true before and still holds now. If Clevo cracked down on heatsink variance and ran a tight ship, and we did not need to modify notebooks ourselves or depend on OEMs like HIDevolution to do it, we'd be so much better off it's not even funny.
If ODMs put the level of work and attention to detail that went into cooling and taming the Zephyrus into other, non-Max-Q notebooks, then literally EVERYBODY would benefit: more sales, less overheating, everyone is happy.
As for the clocks/voltage being dynamic, certainly I can use MSI AB to do so. I mentioned in that opinion piece that this is done, and claimed this is exactly why I find that a much more optimized voltage curve at a lower target is more successful at getting performance within a certain power envelope. If you are claiming no evidence is provided, I can add as much as I like. Take Notebookcheck's own AW17 R1 880M 3DMark Fire Strike benchmark, scoring 5252 points total (no GPU score provided, but with a 1006/6000 single-GPU Fire Strike run of my own scoring over 7000 GPU score in my logs, I can estimate the 993MHz 880M would have landed near 7K GPU score, likely around 6900). Then take the Notebookcheck Zephyrus 3DMark benchmark, scoring 14.2K total Fire Strike with an 18.2K GPU score: it does NOT clear 3x the performance. In fact, HTWingnut's review of the P650HS echoes the same performance values: 18.2K GPU score and 14.2K total score. That Max-Q is not 3 times better. It might be close, but close does not cut it when such bold claims are made. And on top of this, the P650HS might be 7.6mm thicker, but it is only 0.84 pounds heavier, has four drive slots, an unlocked CPU option, and a 120Hz G-Sync panel option.
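To put rough numbers on that "3x faster" claim, here's a trivial sanity check using only the scores quoted in this thread; note that the ~6,900 880M GPU score is an estimate (no official GPU sub-score was published), so this is a back-of-envelope sketch, not hard data:

```python
# Back-of-envelope check of the "3x faster" Max-Q keynote claim,
# using the Fire Strike GPU scores quoted in this thread.
gtx880m_gpu = 6900     # estimated AW17 R1 880M GPU score (poster's estimate)
zephyrus_gpu = 18200   # Zephyrus 1080 Max-Q GPU score (Notebookcheck)

ratio = zephyrus_gpu / gtx880m_gpu
needed_for_3x = 3 * gtx880m_gpu

print(f"actual ratio:  {ratio:.2f}x")      # ~2.64x
print(f"needed for 3x: {needed_for_3x}")   # 20700
```

Even granting that estimate some slack, the Zephyrus GPU score would need to be north of ~20.7K before "3x" would hold up.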
If you require, I can post links to every last bit of this. I do NOT state things as fact without evidence to back them up. I made many suggestions, criticized the ethics of the price, performance and naming scheme, and asked why it was necessary for such superthin notebooks to be the ones to acquire that level of thermal engineering prowess. Razer certainly does not do it, and is laughed at by anyone with half a brain in the industry for it. People don't want to buy Clevo for the same reason: their stupid heatsink lottery.
Once again: my problem is charging users 1080N price for a 1080NQ that performs like an unoptimized 1070N, under the guise of "this is thin, so it's fine". I feel they could have done better by tweaking the 1070N and, since these are soldered, simply flashing the respective vBIOS for that kind of notebook. Then, maybe, they could put a more proper 1080NQ in it later. But they distinctly said they found the exact point where performance drops off and higher clock speeds stop mattering much. This is already debunked; it would have meant the card performing within a couple percent of 1080Ns, and as we have seen, that is not the case. Last edited: Jul 5, 2017 -
Rinse, recycle and repeat. When the marketing gets so far ahead of what the product can deliver, like we're seeing now, it becomes a problem.
This reminds me of 7700Ks and 7700HQs... how a lesser chip is sold for the same or more when they cost virtually the same to make.
The HQ is better because it has an extra letter right? It's 7700 after all... I should call customer support and ask to make sure....
*face palm*
That's the problem we're facing and the industry is eating the majority (consumers) for lunch.
"Not ONCE did I state, anywhere in that article, that thin notebooks are trash, or should not be bought." - This is true...validated and verified.
The anti-consumer practice is the very reason why Razer is getting so much heat. There are a lot of similarities there on the marketing side, but more so with overselling the dream... wait, it's a 1080Q so that's the same as a 1080N, right?
* splash cold water on me to wake me up *
From the data that I've been collecting so far from different sources (who actually have the Max-Q laptops), the GPU is running in the mid-70s (°C) on the 1080Q.
Is it me or have the reviews so far been neglecting temp data on the GPU?
Anyhow, if this is true..... my goodness gracious... mid-70s °C in something that is supposed to be efficient in a thin chassis?
Hmmm... I have a ton of data on the 1070 systems that I've worked on, and the 1070 (~130W) in the 15R3, for example, runs in the high 50s and low 60s (°C)....
So if a 1080Q with a lower TDP (90W-110W max), maximized, optimized, tuned etc... is running in the mid-70s, then........
Hmmmm.....1+2 = 6?
Wait, 1x + 2y = 6 ...ah that makes sense (x=trickery | y=ignorant consumers)
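For a slightly less joke-y version of that math, here's a crude temperature-rise-per-watt comparison using the rough temps and TDPs quoted above. This assumes a ~25°C ambient and takes mid-range values from the ranges mentioned, so treat it as a sketch, not a measurement:

```python
# Crude thermal-headroom comparison from the rough figures quoted above.
# Assumes ~25 C ambient; real cooling behavior is far more complicated.
ambient = 25.0

# (typical load temp in C, approx. board power in W) -- values from this post
gtx1070_15r3 = (60.0, 130.0)   # 1070 in the 15R3, low-60s under load
gtx1080_maxq = (75.0, 110.0)   # 1080 Max-Q, mid-70s under load

def deg_per_watt(temp, watts):
    """Temperature rise over ambient per watt dissipated.
    Lower means the cooling is shedding heat more effectively."""
    return (temp - ambient) / watts

print(f"1070 (15R3): {deg_per_watt(*gtx1070_15r3):.2f} C/W")  # ~0.27
print(f"1080 Max-Q:  {deg_per_watt(*gtx1080_maxq):.2f} C/W")  # ~0.45
```

On these assumed numbers, the "efficient" thin chassis is running noticeably hotter per watt than the thicker 1070 machine, which is the point being made here.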
If MaxQ can really deliver what it claims, then there is a place for it in small laptops like the XPS with the tiny 1050; however, that is a whole different league from gaming laptops.
It wouldn't be such a big deal if it was named properly, well and also if it was priced accordingly. (It's all about the ratio.)
I'll reserve my full opinion on this until I fully test the chips from two different systems, but what doesn't need to be tested is acknowledging the marketing stunt.
Generally speaking, to make a point about the many lies in the industry, it would have been something if they had just spoken the truth, as in...
"...We've found a way to maximize the trickery at its highest efficiency, to sway the consumers even more... ladies and gentlemen... MaxQ (__name the lies here___)"
"....Thank you... thank you for believing in us and giving us your money on the purchase of recycled tech...."
*Everyone claps....*
Hmm...sounds like that company with a bite out of the fruit. (Who needs a headphone jack?)
Last edited: Jul 5, 2017 -
-
I have a hard time getting behind deceptive, overpriced gimmicks drummed up as lame excuses to defraud the people that insist on having something too small, thin and light to offer amazing performance. I understand the reason for the form factor, and it doesn't matter whether I like it or not. I don't see a good reason for Max-Q garbage to exist, other than it is a creative new marketing strategy that is being over-hyped for the express purpose of padding the Green Goblin's bank account by selling the customers in this ultraportable fetish niche something weird that sounds better than it actually is.
Yes-men and Kool-Aid drinkers are the reason we are in such a sorry state. We have Intel and NVIDIA butt-kissers to thank for everything that is wrong with laptops in today's world of technology, and for everything that's only going to get worse. And those who fall for it are being led like lambs to the slaughter. Last edited: Jul 5, 2017 -
-
Falkentyne Notebook Prophet
I would gladly pay @Prema or anyone else $50 if they could somehow certify my GT73VR 1070N vBIOS for flashing with a 150W power limit
I know, I have to buy the programmer and just do it that way
-
So, the admins are going to sanitize your article now, eh? Nice. Wonder what kind of threat from NVIDIA prompted it? I hope that it doesn't get watered down too much. When the truth hurts it should be a call to accountability for the behavior causing it. NVIDIA would be the only beneficiary of any dilution or mincing of critical words.
-
-
Putting a positive spin on something that should only be viewed as objectionable, and reworking the article to remove the negative perspective that was warranted, simply reinforces my longstanding opinion that professional technology review work, with some rare exceptions, is the bastion of shills.
-
Agreed.
The problem is that if someone like NBC lets the truth slide, it will hurt sales for certain people. So... duct-taping it is a solution, I guess. :S -
Nobody did. One of the leaders of NBC outright said Nvidia does not lie, wants that removed from the article, and wants to rework some of what I said (especially in the proper thermal engineering department). Even though, as I responded to Meaker, I can literally prove that the AW17 R1 880M model he held in his hand on stage during the keynote is 1500 3DMark points short of the Zephyrus being "3x faster", AND the GPU score falls short too, needing to be around 20K+ (it is only 18.2K). And the scores of the Zephyrus exactly mimic a P650HS, without any voltage-optimization modification done to the 1070N in that notebook.
So I'm going to have to talk with them, try to prove that what I said was indeed correct, and see what actually needs revising or fine-tuning among the things I cannot prove. Even though, throughout the entire article, I distinctly phrased opinions as "I think" or "this could likely", and left the facts as facts.
That's what's going on.
Also I slept surprisingly well. -
@D2 Ultima
You should have used...."Speculations of course"
Although, I'm not sure how you thought that review was going to fly... their track record speaks for itself. They have never outright badmouthed Nvidia, or Intel for that matter..... Speculations, of course.
LOL.
With what you see when you click on it now, I might as well have my name removed; this isn't anything I spoke about.
Maybe next time I'll start using "speculations, of course" instead of "I think this could work" xD
-
don_svetlio In the Pipe, Five by Five.
Is it just me, or does it look a bit like a Max-Q ad now? ._.
-
The loss of money and of a supply of test products is more important (to the recipients of those things) than accuracy, truth and honesty. And since most of these are privately owned businesses, their opinions are the only ones that matter. They are free to do as they wish and censor whatever information undermines their agenda, or the agenda of the hand that feeds them. Depending on where one is sitting, that can be viewed as smart business or dirty pool. For consumers, the latter explanation is the most accurate.
-
I understand the business aspect of it, but lying is not something I condone, and nVidia has gotten away with it far, far, far too often.
If I cannot get my original message across and this ends up promoting Max-Q, I'll ask for the article to be removed, and if they won't, I'll have my name and all my written text removed, and they can do it on their own.
Nice book
+rep
Btw I put this in here, although you may have seen it before
@Phoenix
You're welcome
Steelseries keyboard can't beat processing power!!
"it is effectively the successor to the 980M" - Yeah, I know. But there's a difference: the 980M and the desktop 980 didn't have the same price tag; the 980 cost more. The new M (Max-Qrippled) version should cost less than the desktop version, same as when the 980 was released!! Last edited: Jul 5, 2017 -
-
That is for sure a part of it. Losing access to beta testing because one badmouthed the supplier... but then, isn't it like that in all situations?
Example: if I started badmouthing your choice of weapons after we had been out at the range on your dime for weeks, it would pretty much cut off my free admission and primary use of your weapons. Why? Because you would feel betrayed. Now wouldn't you?
This has been going on for a very long time, but there are actually ways to present both sides of an argument without alienating oneself in the process. And that seems to be the hard part. -
Had a little benching session with this sleek beauty today:
3DMark Sling Shot Extreme Unlimited
Apple iPad Pro 2017 10.5 : 3902
Samsung Galaxy S8+ : 3739
Samsung Galaxy S8 : 3733
Xiaomi Mi 6 : 3657
HTC U11:
3DMark Sling Shot Extreme
Apple iPad Pro 2017 10.5 : 3638
Samsung Galaxy S8+ : 3417
Samsung Galaxy S8 : 3409
Xiaomi Mi 6 : 3096
HTC U11:
3DMark Sling Shot
NVIDIA Shield Android TV : 4743
Apple iPad Pro 2017 10.5 : 4620
Samsung Galaxy S8+ : 4565
Samsung Galaxy S8 : 4541
Xiaomi Mi 6 : 3844
HTC U11:
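For what it's worth, here's the relative gap between the top two devices in each test, computed straight from the scores listed above (the HTC U11 rows are left out since no scores were posted for it):

```python
# Percentage lead of the iPad Pro 2017 10.5 over the Galaxy S8+ in each
# Sling Shot variant, using only the scores posted above.
scores = {
    "Sling Shot Extreme Unlimited": (3902, 3739),
    "Sling Shot Extreme":           (3638, 3417),
    "Sling Shot":                   (4620, 4565),
}

for test, (ipad, s8_plus) in scores.items():
    lead = (ipad - s8_plus) / s8_plus * 100
    print(f"{test}: iPad leads by {lead:.1f}%")
# -> roughly 4.4%, 6.5% and 1.2% respectively
```

So the iPad's lead is real but small, and shrinks to almost nothing in the plain Sling Shot run.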
-
That's a really good video from Stephen. I thought he did an excellent job on his review. Looks like he thinks the keyboard is fine (I agree) and even went as far as saying it de-throned his GT73VR. Can't hate that.
-
Tried this on my new S8, but it gave me a warning message about heat... Only 2035 on Samsung's new Exynos silicon. Any repaste guides out there?
Looks like the slideshows you got running 3DMark back in the mid-'00s.
Last edited: Jul 6, 2017 -
Does anyone here have an AW17 R1 with an 880M in it, and can run a stock Fire Strike run for me? A really good one: no overclocking, but I'm expecting no card throttling either.
-
Was sitting outside at 4:30am, maybe that played a role in the results.
What about while getting a drink in the freezer?
Clevo Overclocker's Lounge
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Mar 4, 2016.