The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    New Clevos with Max-Q?

    Discussion in 'Sager and Clevo' started by pdrogfer, May 30, 2017.

  1. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    when there's a continuous trend of people not caring about soldered, thin BGA laptops, they are going to pull stunts like these until people stand up and say no to them. no doubt the soldered market and thin laptops have the majority of buyers; this is just more ways for them to make money off their old tech before giving us new ones in MXM form.

    simply don't buy anything other than machines that offer socketed parts. continuing to support socketed machines is the only way imho, since these corporations don't seem to listen
     
    Ashtrix, TBoneSan, DukeCLR and 3 others like this.
  2. Miguel Pereira

    Miguel Pereira Notebook Consultant

    Reputations:
    11
    Messages:
    197
    Likes Received:
    160
    Trophy Points:
    56
    Don't want to be the devil's advocate, but these companies don't care about logic, they care about numbers.

    Most likely there is a bigger market for thin BGA than for thick LGA laptops. This is a fact; consumers choose with their wallets, and most have chosen the thin, lower-performance direction.

    I actually understand this, even if I prefer to get all the performance I should get.

    This Max-Q is just marketing stunt to further keep the momentum.

    Sent from my MHA-L29 via Tapatalk
     
    Ionising_Radiation and hmscott like this.
  3. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    yes, fact is the majority of the market doesn't understand these things and loves thin until they realize it overheats and can't match the performance of socketed parts, or they simply don't understand it at all and think it's fine the way it is.

    this is also another way for nvidia to *control* the performance, so in case they run out of architecture improvements like intel, they'll start controlling clock speeds, throttling, etc. i can see it all happening again.
     
    Ashtrix, DukeCLR and Papusan like this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    So this new MSI GS73 1070 comes with a 180w PSU, instead of the 230w it should have:

    GS73VR 7RG STEALTH PRO
    https://www.msi.com/Laptop/GS73VR-7RG-Stealth-Pro.html#hero-specification

    The worst part is, nowhere in the Overview or Specs does it say anything about Max-Q - which it must be to only need a 180w PSU...

    "Most powerful ultra slim 17.3'' gaming notebook with GeForce® GTX 1070"
    "Graphics - GeForce® GTX 1070 with 8GB GDDR5"

    It seems MSI doesn't think the name "Max-Q" is such a good selling point :)
     
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yes, but it's a deceptive marketing stunt, untruthfully representing the under-driven, under-performing Max-Q 1070 / 1080 GPUs as the real thing, and the overall Max-Q design as allowing full performance in a chassis one-third the thickness.
     
    Last edited: Jun 1, 2017
    atacool3 and DukeCLR like this.
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    Damn, Nvidia and the ODMs already offer soldered trash for use in thin and flimsy laptops. And yet again they push out even more of the same, but this time intended for even thinner JokeBooks. Maybe they should also remember those of us who want a 1080 Ti in laptops? Time for desktop!!
    Or say it like it is: fraud, a scam. Trick people into buying!! And a hell of a lot of people don't see the difference!!
     
    Last edited: Jun 1, 2017
  7. Thousandmagister

    Thousandmagister Notebook Consultant

    Reputations:
    10
    Messages:
    102
    Likes Received:
    65
    Trophy Points:
    41
    Or eGPU. I already have an EXP Beast + a used 8-pin 220W Dell adapter ($65 in total).
    I'm waiting for Volta, a GTX 1150 Ti maybe (the xx50 series doesn't require any power connector).
    I thought I would buy a new Clevo laptop next year, but I changed my mind after reading this thread.
     
  8. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    imho if one were to go eGPU, they may as well just get a powerful desktop and a powerful laptop. you can choose to upgrade a desktop gpu down the road with almost no restrictions, whereas eGPU might be a different matter altogether. it'd be real funny if they try to sell eGPUs and then control us too lolol
     
    DukeCLR likes this.
  9. Thousandmagister

    Thousandmagister Notebook Consultant

    Reputations:
    10
    Messages:
    102
    Likes Received:
    65
    Trophy Points:
    41
    I already have 2 gaming desktops, do I need another one?
    I prefer a laptop over a desktop any day, as I can bring it anywhere I want (Japan to US, back and forth).
    Desktops are boring; I leave both of mine in my storehouse. Why would I want a desktop when I can do exactly the same thing on a laptop? CPU overclocking is also possible thanks to Intel XTU.
     
  10. CedricFP

    CedricFP Notebook Evangelist

    Reputations:
    49
    Messages:
    517
    Likes Received:
    298
    Trophy Points:
    76
    As I get older, I find moving from desktops to laptops as my daily drivers more appealing.

    And I'm a desktop enthusiast who has owned cards LN2-benched (and world-record set) by Kingpin himself, and I've interviewed top overclockers back when it was still cool. I've spent a lot of money in the 'scene', but as I age, I just find that laptops suit my use cases more.
     
    DukeCLR, jaybee83 and atacool3 like this.
  11. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    Nvidia are going to spin it to maximise sales and their brand, and they do it very well.
     
  12. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    Yeah, but this depends on stupid buyers!!!
     
    ole!!! and DukeCLR like this.
  13. DukeCLR

    DukeCLR Notebook Deity

    Reputations:
    218
    Messages:
    1,060
    Likes Received:
    1,165
    Trophy Points:
    181
    Unfortunately there will always be thousands of buyers who don't even come close to doing the research most of the people around here do before buying a laptop, or anything else for that matter. There is that old cliché, "there is a sucker born every second."
     
    Papusan likes this.
  14. Miguel Pereira

    Miguel Pereira Notebook Consultant

    Reputations:
    11
    Messages:
    197
    Likes Received:
    160
    Trophy Points:
    56
    There are thousands of 920M "gaming-ready" discrete-GPU laptops sold every day, everywhere. You could do a lot worse than a Max-Q laptop...

    This is for the average consumer who doesn't do the proper research and is trigger happy.

    They won't stop existing just because we whine.

    Sent from my MHA-L29 via Tapatalk
     
    Ionising_Radiation likes this.
  15. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    i know that feel, i have refused to buy a desktop over the last 5-6 yrs and always gone laptop, but this time around i don't think i can hold out anymore. especially since most laptops out there are a disappointment, intel's new HEDT has 18 cores, and intel will not make a decent-size m.2 Optane SSD anytime soon, i am kinda being forced to go desktop to get what i want.

    gotta get that fat UPS, or just this altogether http://www.ssiportable.com/products/portable-solutions/spark-s24t/
     
  16. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    I expect people ain't so stupid they'd pay Max-Q prices for 920M laptops. Or maybe they are, I don't know.
     
  17. bruno.uy

    bruno.uy Notebook Enthusiast

    Reputations:
    0
    Messages:
    40
    Likes Received:
    24
    Trophy Points:
    16
    I think the BGA hate club talks too much among themselves. Max-Q serves well those customers with money who want the most powerful small laptop. Not stupid.
     
  18. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's not BGA hate for me, I have BGA laptops and think LGA's are overpriced and too noisy - and use servers for compute / render :)

    Slim Max-Q-like laptops would be a good idea if they were honestly represented as being implemented with power-reduced 1070s / 1060s.

    It's not given an honest representation. That's the problem. Max-Q is misleading to unassuming laptop buyers who hear "1080 in a thin laptop" and go nuts over it, paying way more than they need to for a 1070-performance laptop.

    Too bad that now the new thin 1070 laptops are shipping with 180w PSU's instead of 230w PSU's, and can't reach full performance - even if they aren't Max-Q builds.

    It's not BGA vs LGA for me, it's lies vs truth. Myself and a lot of the other people discussing this here are going to be helping out all the disillusioned new Max-Q owners trying to figure out why they can't get their games to run at the same speed as on other 1080 laptops; that's their expectation.

    The Razer 1080 owners are already in mid-realization that they've been short-changed: a 250w PSU to feed a 1080 and a 7820HK, so there's not enough power to go around to get full performance from either the GPU or the CPU, let alone both at the same time.

    The good news is I think it will end up backfiring on Nvidia this time. Razer buyers are like Macintosh buyers, overly enamored with thin shiny things and not paying attention to performance vs value.

    I think Nvidia saw that sucker market and wanted a piece of the action, but this time it's all the slim laptop makers and a wider market of much more savvy buyers.

    It will be interesting to see how it works out :)
     
    Last edited: Jun 1, 2017
    atacool3 likes this.
  19. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Occasionally it backfires on them, I think this has a good chance of being one of those times.
     
    Ionising_Radiation likes this.
  20. atacool3

    atacool3 Notebook Consultant

    Reputations:
    10
    Messages:
    108
    Likes Received:
    99
    Trophy Points:
    41
    Guys! Remember the comment I made about there being more speculation here than in a Flat Earth Society meeting? First wait for a benchmark or some sort of proper test before criticising and roasting the GPU. Sure, from what we've seen the performance MIGHT be very lacking, but we don't know for sure. The only info we actually have is a vague score from a vague test done on an unreliable test device, published by an unreliable source squared (17,000 Fire Strike, done on a Triton Predator, given to a website by Asus).

    Also, any business's model is about getting profits, so if the Max-Q is this lacking, then even a dumb customer will realise it when he/she sees it. Unbiased review sites will see this flaw as well, leading to lower sales, which in turn convinces Nvidia to get their **** straight, and from the looks of it this might be the case.
     
    hmscott likes this.
  21. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's the way it's being described: Max-Q 1080 performance expressed as an increase over the performance of a full 1060, not as a % of a full 1080, which suggests it's not even close.

    These are sales-motivating descriptions designed to be enticing, not accurate in the form the buyer expects, invoking the "Yes, I want it" with different reasons / data than the buyer's high-performance expectations.

    The few Max-Q slim laptops that have published specifications list PSUs smaller than required for full performance, all about one GPU tier down in output wattage, indicating they are closer to the next tier down in GPU performance than anywhere near what their model number suggests.

    180w for the Max-Q 1070 instead of 230w, and 250w for the 1080 instead of 330w.

    That's enough information to estimate the possible performance envelope; no magic is going to make up that much missing wattage from battery boost assist. It's just not going to deliver the performance users are excited about.

    And, the pricing is gouging at the same level as the full 1080 / 1070 models, no reduction in price to match the reduction in performance.

    It's just looking bad all the way around.
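    The PSU argument above amounts to simple arithmetic. A quick sketch in Python, using only the PSU ratings quoted in this post (the deficit is just the gap versus the full-power models' PSUs):

    ```python
    # PSU ratings quoted above: full-power models vs the Max-Q models (watts).
    FULL_PSU = {"1070": 230, "1080": 330}
    MAXQ_PSU = {"1070": 180, "1080": 250}

    def psu_deficit(gpu: str) -> int:
        """Watts of PSU headroom missing versus the full-power model."""
        return FULL_PSU[gpu] - MAXQ_PSU[gpu]

    for gpu in ("1070", "1080"):
        print(f"GTX {gpu} Max-Q ships with {psu_deficit(gpu)} W less PSU headroom")
    ```

    50 W and 80 W less total system budget, respectively, which is roughly the gap between adjacent GPU tiers.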
     
    Last edited: Jun 1, 2017
  22. atacool3

    atacool3 Notebook Consultant

    Reputations:
    10
    Messages:
    108
    Likes Received:
    99
    Trophy Points:
    41
    Fair enough, but you guys might (hopefully) be underestimating Nvidia's magic by a huge margin. Sure, the Fire Strike score (17,000) may be lower than a true GTX 1080's, but it isn't as bad as a 1060! That seems a bit too extreme to my mind. All we can do now is play the waiting game :)
     
    DukeCLR likes this.
  23. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,754
    Likes Received:
    2,197
    Trophy Points:
    181
    Even the benchmarks I have access to are from samples, not production ready systems.
     
    hmscott and DukeCLR like this.
  24. estens

    estens Notebook Consultant

    Reputations:
    13
    Messages:
    108
    Likes Received:
    38
    Trophy Points:
    41
    Max-Q is not for me, but I am sure many customers will be happy with it
     
    quantumslip and DukeCLR like this.
  25. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's the same silicon, it's the same power and thermal budgets, so the only way to reduce the thermal profile in a small box is to reduce the performance.

    If there was "new magic" from Nvidia for 1080 / 1070 / 1060, it would be in *ALL* the laptops, even the large DTR models - as even those would benefit from reduction in thermals.

    Since this isn't a global GPU based improvement - no new process or die - it's all just power and performance reduction smoke and mirrors to sell slim laptops with overpriced underperforming wannabe 1080's and 1070's, and sell them at the same high premium price as the full performance laptops.

    The magic is in the misdirection being performed to open your wallet and extract your money.
     
  26. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,044
    Messages:
    5,346
    Likes Received:
    1,034
    Trophy Points:
    331
    Exactly!
     
  27. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    Maybe the BGA hate clubs don't want to throw their hard-earned money down the drain? :rolleyes: Or shall we say down the toilet?
     
    ThePerfectStorm, DukeCLR and ajc9988 like this.
  28. bruno.uy

    bruno.uy Notebook Enthusiast

    Reputations:
    0
    Messages:
    40
    Likes Received:
    24
    Trophy Points:
    16
    What would you suggest to a rich guy who wants to game on a laptop no bigger than a MacBook Pro?
     
  29. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Fall back on what's already been said + common sense :cool:

    It's just not possible to get the high performance GPU / CPU into a Macbook Pro sized sack.

    Apple can't do it, nobody can, it's just physically not possible.

    Apple used to be a brand I frequented, for like 35 years, then they sold their performance souls for thin and pretty poser boxes.

    My last foray, an attempted purchase, was a 17" MacBook Pro, top model, best CPU + RAM. I brought it home, built the system with all my development and management tools, put it to work, and it just ran its little fans at full speed constantly, thermal throttling more than I'd ever seen. I just couldn't use it.

    There just isn't enough physical space to fit the heavy-duty cooling system required for the close-to-100% CPU and GPU load of the work I do. Desktop replacement laptops were it, so I went with that. Too heavy and noisy. Then I set up servers, dropped down to a top-of-the-line 18.4" BGA laptop, and I've never been happier with a laptop.

    It was huge, heavy, required a huge PSU, but it was quiet under load - and even at the extreme range when running jobs locally it was liveable.

    You aren't going to get top performance in a MBP profile package, or a Razer BP, or a new MAX-Q laptop, they are all going to be far busier trying to throttle themselves to stay alive than spending time doing your work.

    Get a nice 17.3" 1080 / 1070 MSI GT75VR, or Acer Predator, or Asus G701VI 1080 or Asus G752 1070, and any of those will give you "compact" high performance with adequate cooling, power, and OC headroom (undervolt to drop CPU temps on all).

    Those are all BGA based laptops, and can OC a 7820HK to 4.2ghz-4.5ghz depending on the Silicon Lottery (luck of the draw).

    Or, if you want the highest performance, up to 1080 SLI + desktop CPU - get a desktop ;)

    The LGA high-end laptops are extremely *not* out-of-the-box ready for the most part. They take a lot of tuning, BIOS fiddling, and waiting for Prema releases for new models, 6+ months before you can get rid of the Clevo BIOS gotchas (power throttling mostly). These are very expensive too as most people spec them up, but base models are price-competitive with the top BGA models.

    What I recommend is to get 2 laptops. This is what I have done for many years.

    1 is a slim pretty girl you can take out to work, social gatherings of non-tech people, that has enough battery life to be useful on long plane trips and long days in meetings away from power. Basically the dream laptop everyone wants, but wishes also did heavy duty gaming.

    You could stop here and use something like LiquidSky or other remote gaming services - using your slim laptop as a front end for their heavy duty back end servers, but it's not viable for all games - it's getting better but not quite there yet.

    And, get a killer gaming laptop that you can pull out when all the non tech people have gone home, plug it in to AC - all gaming laptops with Nvidia (/ AMD) discrete GPU's need to be run on AC for any useful gaming performance.

    Then you can have all of what you want, but it's not gonna be gotten in a single slim laptop.

    You can't pound 40lbs of thermal load into a 5lb sack :)
     
    Last edited: Jun 1, 2017
    Ramzay, raz8020, dm477 and 4 others like this.
  30. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    No one should support Nvidia's new madness with money. Same for the ODMs. If I was that rich guy, I would rather engage myself more in charity than waste my money on fraud/scam. This should apply to both rich and poor!! At the same time you'll all help keep our dear earth from overflowing with... Yeah, you know.
     
    hmscott likes this.
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,201
    Messages:
    39,332
    Likes Received:
    70,613
    Trophy Points:
    931
    Amen to that.

    Mmm, mmm, mmm, c'mon now! That's right! Preach it, Preacher!!!
     
  32. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Don't take the bait... you'll be here for hours with another 100-page thread of useless back and forth. I've tried it.

    One thing repeatedly brought up is the 1080 Max-Q. But the reality is, the large majority of the market that will be affected by this is the thin models which currently ship with a 1060 and are getting a 1070 MaxQ variant.

    i.e. the P950, GS63 and Razer Blade. These are all machines that easily handle a 1060 but not a 1070. From what I can gather, instead of overclocking a 1060 and using exponentially more power as you run into its limit, it's more effective to take a 1070 and downclock it instead.

    I'd be particularly interested to see what voltages the Max-Q chips run at.
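    A toy model of why that trade can work: dynamic power scales roughly with frequency × voltage², so a wider chip run slower and at lower voltage can match a narrower chip run faster while drawing less power. All the numbers below are made-up illustrations, not real GPU specs:

    ```python
    def rel_power(cores, freq_ghz, volts):
        """Relative dynamic power: cores * f * V^2 (arbitrary units)."""
        return cores * freq_ghz * volts ** 2

    def rel_throughput(cores, freq_ghz):
        """Relative throughput: cores * f (arbitrary units)."""
        return cores * freq_ghz

    # Hypothetical "1060-like" chip pushed hard vs "1070-like" chip downclocked.
    small_oc = dict(cores=1280, freq_ghz=1.9, volts=1.05)
    big_dc   = dict(cores=2048, freq_ghz=1.2, volts=0.85)

    print("throughput:", rel_throughput(small_oc["cores"], small_oc["freq_ghz"]),
          "vs", rel_throughput(big_dc["cores"], big_dc["freq_ghz"]))
    print("power:", rel_power(**small_oc), "vs", rel_power(**big_dc))
    ```

    With these illustrative numbers the downclocked wide chip edges out the overclocked narrow one on throughput while drawing roughly a third less power, which is exactly the pitch behind the Max-Q 1070 in 1060-class chassis.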
     
    Ionising_Radiation and hmscott like this.
  33. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,666
    Trophy Points:
    231
    The fact is that Max-Q is meant to maximise profits for nVidia, while giving a system of less value to consumers.

    We've agreed on many things, but this Max-Q business just reeks to me of nVidia wanting to get rid of their lousy bins and make a quick buck off them. The Max-Q webpage is so full of marketing smoke, and so short on real stats or specs (we haven't even got official clock speeds, TDP, etc.), that it truly makes me question the value of it. There are already hints that undervolting Pascal too much may disable a few cores under load. People may end up paying more for less. We're already in a niche, where notebooks cost around $300 – $500 more than their desktop counterparts due to miniaturisation. Do we need yet another worthless premium to pay, just because we're pursuing thin and light?

    I believe @Prema has weighed in on the matter, and I'm fairly sure he knows what he's talking about.
     
  34. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    I certainly agree that the marketing side is BS. It always is and I expect nothing less of any of the companies if what they're doing is effectively releasing a slower version of existing hardware. They're not going to talk down their own products. Happens all the time so I fully expect most of the people weighing in here can skip right past that.

    What I am interested in, is the technical aspects of it. There also doesn't seem to be any indication that they're "replacing" existing 1060/1070/1080 models.

    I've watched Buildzoid's video, and I wouldn't be surprised if there's a management controller implemented. It makes sense that you would need something like that eventually. It wouldn't surprise me if it has something to do with gating off cores at lower P-states to save power. Pascal cut idle power consumption quite significantly, and interestingly, almost the entire range idles at almost the same level.

    This is where I think everyone is jumping to conclusions. What I'm taking away from it all, is that most of the existing "thin" laptops now have a faster variant. NOT that there's any intention to make existing full 1060/1070/1080 models "slower".

    The only one people will get caught up on is the completely new 1080 Max-Q models, where people might buy in thinking they'll get full 1080 performance when they won't. Hell, that already happens now, since the "full" mobile Pascal chips don't actually clock anywhere near as high as most desktop ones anyway (mostly because typical desktop cards are now pre-overclocked).

    As I said, if you look at the product lineups, most of them just added a new GTX 1070 version of whatever their thin/light GTX 1060 model was. Some have added a new "thin" 1080 Max-Q model, which will cost a tonne of money and, let's be honest, bothers NONE of us here, because buyers of those models probably don't care that much about that kind of money anyway.

    In almost all of those cases, it's not like you were going to get a full fat 1070 in there anyway so...
     
    Ionising_Radiation likes this.
  35. Womb Raider

    Womb Raider Notebook Enthusiast

    Reputations:
    0
    Messages:
    18
    Likes Received:
    4
    Trophy Points:
    6
    After reading page after page, I am so confused. What is it that is wrong? I like the fact that it uses less power.

    Everywhere else I read that this is an improvement. You think there is no improvement in design and architecture? That they are BSing with downclocked versions of their cards, when there are already laptops with non-Max-Q GPUs people can buy?

    I am buying a new laptop. What is wrong with this new Clevo or the new GS73VR? I just bought a GE72MVR and it uses way too much power. It was falsely advertised as having Thunderbolt 3, plus XoticPC damaged it, so I am returning it.

    I would prefer to upgrade parts, LGA, but even when you get MXM there are changes in the architecture that make things incompatible. That is what I read about MSI's issue when they promised upgradeability.
    What is so wrong with it being BGA but having Thunderbolt 3? Doesn't that ensure future-proofing?

    I read all kinds of negatives and it stresses me out; I have done too much research to buy my stupid laptop, it is frustrating.

    And F*ck this website. It freezes up my phone every time, just realized it keeps draining my network and finally I am not frozen when I turn on airplane mode.

    I am getting this new Clevo or the Stealth, because Asus's 120Hz is BS with IPS response times of 20 milliseconds. At 120 fps each frame lasts 8.3 ms, so you need a response time of 8.3 ms at most for 120 frames in 1 second. Clevo's display is rated at 12 ms, but has G-Sync. MSI has the fastest response time, but no G-Sync.
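    As a quick sanity check on the refresh-rate arithmetic above (pure arithmetic, no panel-specific data):

    ```python
    def frame_time_ms(refresh_hz):
        """Duration of one frame in milliseconds."""
        return 1000.0 / refresh_hz

    def max_refresh_hz(response_ms):
        """Highest refresh rate a panel of this response time can fully keep up with."""
        return 1000.0 / response_ms

    print(f"120 Hz -> {frame_time_ms(120):.1f} ms per frame")                 # 8.3 ms
    print(f"12 ms panel keeps up with at most {max_refresh_hz(12):.0f} Hz")   # ~83 Hz
    ```

    So a 20 ms panel can only fully transition at ~50 Hz, and even a 12 ms panel falls short of a true 120 Hz refresh.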
     
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    The reason Nvidia pushes out this Max-Q graphics Trash!! Market share is one thing, but a decline of -27.8% in the discrete GPU market is much worse!!

    NVIDIA dGPU Market Share at 72.5%, AMD's at 27.5% – Enthusiast and High End GPUs Win Last Quarter
    "First up we have NVIDIA which saw a market share increase of 2%. This brings NVIDIA dGPU share to 72.5% which is their highest in several months.
    NVIDIA's dGPU market share stood at 70.5% in the previous quarter but has managed to grow further despite a decline in the overall market. The reason is simply because of NVIDIA's high-end offerings which are detailed in the segment divided market share. NVIDIA saw an overall decline of -27.8% in the discrete GPU market with -23.0% in the notebook and -25.6% in overall PC graphics shipments."
     
  37. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You can wrap up the discussion quickly if you agree to get a high-end Clevo LGA laptop; they've got the range of configs already worked out and the best vendors to get them from, wherever you live in the world, so to be fair, if you are simpatico with their approach, it's a quick trip :)

    You still may exchange 100 pages of posts before, during, and after purchase, but it will all be posts of joy :)
    @bruno.uy too...

    The Max-Q GPU's are just normal Pascal GPU's, but neutered to fit into a smaller thermal power profile, nothing magical, in fact it's all very straightforward. If you want less thermal heat to dissipate, then draw less power and do less work.

    The 1080 Max-Q is really the darling of the whole project for Nvidia, and that's because it's the Unicorn everyone wants. A 1080 in a less than 10lb laptop that provides full performance.

    The Max-Q 1070 is just an expensive solution to what is already done just fine by an OC'd 1060. Nothing special here either.

    The only special Max-Q product is the 1060 Max-Q. It has the same TDP power profile as a full 1060, but by using constant throttling of demand to reduce load, power draw, and therefore heat generated, the 1060 Max-Q is the sweet spot if you really want to build the smallest and quietest-under-load laptop using the new overall Max-Q treatment.

    That's what I'd get if you want the thinnest gaming laptop, balancing excellent performance of the 1060 with Max-Q power throttling to keep the heat down and therefore keep the fans quiet.
     
    Last edited: Jun 1, 2017
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,201
    Messages:
    39,332
    Likes Received:
    70,613
    Trophy Points:
    931
    So, if the goal is something thin and light enough to be kind of cute, if not downright goofy, and 30-40 FPS gaming at medium or low settings sounds exciting, this is probably that really big thing you have been waiting for.

    I guess the main question you need to find a straight answer to is what the use of less power will cost you in terms of performance. Doing more with more always works out better than doing more with less. Less with less also works a lot of the time. Doing more with less is almost always going to be messed up. Good luck getting a straight answer on that, especially from shills professional reviewers. Most of them will be on such a sugar high from drinking the NVIDIA Kool-Aid that everything will come up smelling like roses.

    Just brace yourself and don't be disappointed if you take the leap of faith and buy one, only to find that it gets crushed by everything else that hasn't been TGP/TDP-castrated, and still runs hotter than you'd like.
     
  39. CedricFP

    CedricFP Notebook Evangelist

    Reputations:
    49
    Messages:
    517
    Likes Received:
    298
    Trophy Points:
    76
    Personally, I think this is an underrated point and actually one of the more interesting propositions.

    I am unashamedly, unabashedly, and unapologetically after a portable, slim laptop that can give me the best gaming performance possible while not compromising on throttling, longevity, and user experience.

    I really do not care for the "hit the gym" crowd that seems to suggest laptop portability importance exists on an inverse scale to how ripped you are, which is ridiculous and reductive.

    A 1050ti almost hits that sweet spot despite the rather surprising lack of options, but if a 1060 MAX Q can sit in between the 1060 and 1050ti performance-wise, it's almost perfect *for my particular use-case*.

    The only issue becomes pricing, and I think this is where the rub is, and where for *me personally* it'll be a non-starter.

    With THAT said, it is ridiculous that a 1060 MAX Q is called a "1060" and not a "1055", and same goes for a 1070 MAX Q being called a "1070" and not a "1065".

    I understand there is a rationale for it (same number of cores, same chip, etc.) but from a performance standpoint, it only confuses the consumer. Confusing the consumer may make you money, but it will not earn you good will and loyalty.
     
    raz8020, bradleyjb, DukeCLR and 6 others like this.
  40. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    wow hmscott, this is excellent logic that people need to see. it's not difficult to understand what nvidia is trying to pull here. i can finally say: welcome to our side!

    lmao mr. fox


    now boys, over my last 6 yrs in the laptop industry, after finally admitting my sony vaio with its iGP was a piece of junk, ive seen so many things, from blinded consumers to how corporations like intel/nvidia went from competitive and honest to liars controlled by greed.

    lets look back to when i first entered the laptop space around 2010-2011, which I think was when intel came out with sandy bridge VS AMD's FX, and we all know how that went. Intel had an amazing cpu at the time; its silicon quality at 32nm was pretty good, you could overclock average chips to 4.7-4.9GHz with decent voltage, and the IPC was extremely high vs the competition.

    of course intel had forced a socket change and people accepted it. however, remember when cpus were on a tick/tock cycle of 2 yrs, and each tick or tock had its own first-gen and second-gen chip within the year? at the time the desktop parts were the 2500k and 2600k, and the mobile part was the 2920xm. intel's old routine was to release a better binned/optimized chip half a year later; in sandy's era that was the 2700k and 2960xm. this carried into the ivy bridge era, but we started seeing chips that couldn't clock as high, and people made a convenient excuse that 22nm simply can't clock as well due to its tighter density. in a way it made sense, and we still got the 3920xm and, half a yr later, the 3940xm. this carried onward until the 4940mx, and tbh the early signs were already there that it was going downhill.

    AMD wasn't able to catch up, so intel had no need to make progress. however, since they already held pretty much all the market, they couldn't increase their profit margin anymore, so they found other ways to make additional money. when a company starts to do that, we're on the path of doom.

    intel does this by closing fabs and laying off employees. instead of improving their chip, they started the *haswell refresh*, which technically was supposed to be the second-half optimization of the original tock generation; intel made that last an entire year. this carried onto broadwell, which came out extremely late, then skylake, and now the optimization phase lasts not just 1 yr but 2 yrs. and look at just how well the current 14nm is able to overclock, when people were made to believe 22nm couldn't clock well because of tighter density. other money-making schemes include controlling CPU frequency by lowering the TDP/current going into the CPU and only allowing turbo to last 20-30 seconds before it throttles, and making BGA chips and forcing OEMs to give in or lose discounts on batch CPU purchases, so each notebook they sell means they sell a CPU (no one wants AMD machines). these are their tactics, and now they include thinner PCBs as well as non-soldered TIM to reduce the longevity of their CPUs, forcing people to buy new ones sooner.

    nvidia on the other hand: back with the desktop/laptop 580/580M they had the crown and gave consumers a full blown GPU chip. however, starting from the 680/680M they gave us a cut-down chip, because they were in the lead and AMD could only compete against a cut-down GPU chip, so they sucked in the extra profit. it explains why the 780/780M could have more GPU cores without nvidia needing to redesign anything; no extra cost spent there. they introduced maxwell and the 3.5GB on the GTX 970, tried to make extra money with G-Sync by calling it a feature, then slotted in pascal (which wasn't even on their roadmap several yrs ago) so they could give us a cut-down version of volta called pascal. since the 900 series, the high end in notebooks has gone full BGA just like intel, and now MAX-Q, rofl.

    I came across some random guy posting on the internet who stated that the 1070N and 1080N were too hot and throttled, so nvidia came out with the idea of MAX-Q to reduce the temperature, adding "about time nvidia did something". can you believe how gullible the mass of consumers are? they honestly believe this is yet another feature/innovation that nvidia has come up with to target the heating issue while giving them a *true* 1070m and 1080m notebook. it just blows my mind at times.
     
  41. atacool3

    atacool3 Notebook Consultant

    Reputations:
    10
    Messages:
    108
    Likes Received:
    99
    Trophy Points:
    41
    This comment may seem out of place in this thread (sorry), but you should get the Aero 15. It is light and thin and has a GTX 1060 which is used effectively thanks to its very efficient cooling system. It also has a lot of other selling points I didn't mention, like a bezel-less screen, really good battery life, etc.
     
    Last edited: Jun 2, 2017
  42. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Psst, you've got that all wrong. The 580M was not a full blown chip. It was the lesser GF114 core (ie GTX560) when the flagship desktop was running a GF110 (GTX580). The GF110 was a 240W chip, it sure as hell wasn't going in a laptop.
    The 680MX, conversely, was actually the first generation to use the same core design as the desktop counterpart for the flagship. Both used the GK104 with 1536 core count. Also worth noting, the desktop 680 was *only* 195W TDP which is significantly lower than any other flagship before or after it. There was no 250W chip for Kepler until the 780.

    Nope. The 780M is the same GK104 chip. The desktop 780 was a new GK110 which returned to 250W TDP.

    It is a feature. G-Sync desktop was well into production before VESA fully adopted Adaptive Sync as a feature. Mobile G-Sync is a different kettle of fish. I have no idea just how much, if any, validation there is done on the panels. So whether G-Sync/Mobile should cost extra money is debatable. Personally I think it's one of the greatest developments in the past few decades because it addressed a fundamental flaw in the display output chain.
     
    DukeCLR and Ionising_Radiation like this.
  43. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    hey, nicely stated, you made good points, so it makes sense why i am not so into graphics cards but rather cpu and storage. however, when a person gives a politician money, one can call it bribing or returning what's owed; either way it is money moving from one person to another.

    in this case the points i make hold true. people can glorify how innovative nvidia is with their max-Q and gsync, but in the end the company used shady marketing and got ignorant consumers to fall for more of their crap. these are intended for further milking, no matter how nicely you word it.

    theres freesync and it works just as well as gsync; no reason why nvidia should charge more for the display and gsync. and no reason why nvidia should charge extra for max-Q when all they give you is the same hardware, if not a worse performer.
     
    DukeCLR likes this.
  44. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    G-Sync is still different to FreeSync, at least on desktops. Mobile is almost the same. As I said, they developed it LONG before VESA or AMD got onto it. Nvidia built G-Sync and developed the TCON module for it, are you suggesting the money they put into the research should just be given away for free?

    While it would be nice if they did make G-Sync free, having the option to do so is a luxury only a company with that much money at their disposal can afford. Not to mention they would somehow have to decouple it from its dependence on the Tegra SoC used as the TCON.
    Whether you like it or not, Nvidia dumps massive amounts of money into R&D because they make so much damn money.
     
    tribaljet and Ionising_Radiation like this.
  45. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    the massive amount of money they got from bga soldered junk book :D, like it or not facts lul.

    max-Q should be renamed to mini-milk
     
  46. Glzmo

    Glzmo Notebook Deity

    Reputations:
    476
    Messages:
    822
    Likes Received:
    86
    Trophy Points:
    41
    I wonder if there will be proper standard MXM3.0b form factor GTX 1080 Max-Q boards that don't require an extra power connection, which could be used to upgrade older laptops with standard MXM3.0b slots, like the Alienware M18x-R2 and similar Clevo models. That would be a great application for such parts.
     
    jaybee83 and Starlight5 like this.
  47. Miguel Pereira

    Miguel Pereira Notebook Consultant

    Reputations:
    11
    Messages:
    197
    Likes Received:
    160
    Trophy Points:
    56
    I understand your point of view, but even a gtx1060 gets much more than 30-40fps on high/ultra (not medium or low) in 98% or more of the games that exist if you play at 1080p. Don't be that guy.

    Any Max-Q gpu will be good enough to play anything if you don't expect ultra 4k gaming. There is a space in the market for a secondary thin grab-and-go laptop, or do you fail to see the logic behind it?

    The issue here is the marketing of it, and how they will try to deceive the consumer with branding.

    Sent from my MHA-L29 via Tapatalk
     
    Starlight5 likes this.
  48. atacool3

    atacool3 Notebook Consultant

    Reputations:
    10
    Messages:
    108
    Likes Received:
    99
    Trophy Points:
    41
    Hey guys, I got some information that directly compares it with the true GTX 1070 and GTX 1080. Apparently Gigabyte has tested the Max-Q GPU and gave a 'conservative' estimate that it performs 15% better than a GTX 1070 while consuming less power and running cooler. If you add the true GTX 1080 into this equation, you will see it has 40% better performance than the GTX 1070.

    It is like announcing that a 'new Lamborghini is being released', making everyone assume it is going to be as fast as the previous models, but then the customer realises that while the new Lamborghini may be JUST as expensive and look exactly like the older models, it performs like a..... Toyota?
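    A quick sketch of what those quoted figures imply, normalizing the GTX 1070 to 100 (the normalization is hypothetical; the 15% and 40% uplifts are the numbers quoted above):

```python
# Where does a Max-Q 1080 land between a GTX 1070 and a full GTX 1080,
# using the quoted uplifts (GTX 1070 normalized to 100)?
gtx_1070 = 100.0
gtx_1080_max_q = gtx_1070 * 1.15   # quoted "conservative" Gigabyte estimate
gtx_1080_full = gtx_1070 * 1.40    # quoted full-power GTX 1080 uplift

# Fraction of the 1070 -> full-1080 gap that the Max-Q part closes
gap_closed = (gtx_1080_max_q - gtx_1070) / (gtx_1080_full - gtx_1070)
print(f"Max-Q 1080 closes {gap_closed:.1%} of the 1070->1080 gap")
# 15/40 = 37.5%, i.e. it sits much closer to a 1070 than to a real 1080
```

    In other words, by these numbers the "1080 with Max-Q Design" covers barely a third of the distance to the card whose name it carries, which is exactly the branding complaint being made here.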
     
    hmscott, DukeCLR and Starlight5 like this.
  49. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,754
    Likes Received:
    2,197
    Trophy Points:
    181
    Nvidia requires validation for every panel that is used in a G-Sync laptop; it involves testing by the ODM and also by Nvidia themselves. This costs the ODM and Nvidia time and money. Now, whether Nvidia should charge extra for this is debatable, and as a rep I don't really want to get involved in that discussion. With NVSR (NVidia Smart Refresh) we will have full hardware G-Sync, as opposed to "mobile G-Sync", which up to now has been software based. The T-con in laptops will now control the sync in NVSR, which is why we can now sync at 120Hz. This requires more testing and validation than before, because the T-con is in control now.

    You can understand why Nvidia wants to keep a check on this, because ultimately, with lazy or no validation, someone could buy a G-Sync laptop which doesn't perform well and then blame Nvidia. There are variables in the panels, GPUs, ODM integration and so on to take into account.
     
  50. DukeCLR

    DukeCLR Notebook Deity

    Reputations:
    218
    Messages:
    1,060
    Likes Received:
    1,165
    Trophy Points:
    181
    That's an interesting point. I wonder if that's why the GT73s have G-Sync but it's not listed in the specs. It works, but if they don't claim it, perhaps they don't have to go through the costly validation procedures.
     
    atacool3 and hmscott like this.
← Previous pageNext page →