The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    New Clevos with Max-Q?

    Discussion in 'Sager and Clevo' started by pdrogfer, May 30, 2017.

  1. Thousandmagister

    Thousandmagister Notebook Consultant

    Reputations:
    10
    Messages:
    102
    Likes Received:
    65
    Trophy Points:
    41
    So Max-Q is a scam after all. Man, this is going to be fun, can't wait to see reviewers post this all over the media.
     
  2. atacool3

    atacool3 Notebook Consultant

    Reputations:
    10
    Messages:
    108
    Likes Received:
    99
    Trophy Points:
    41
    Well, most of the reviewers are probably gonna kiss the manufacturers' asses. I don't think I know any actual unbiased (famous) reviewers lol.
     
    temp00876, Papusan and hmscott like this.
  3. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Pretty sure I saw Linus go off on Intel over i9 this weekend, would like to see his opinion on this one.
     
    skandal, Papusan and hmscott like this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Sounds like more Max-Q'ish magical thinking ;)

    You can't possibly believe that a 1080 GPU die in place of a 1070 would somehow make it easier to get the same performance output as a 1070 given the same power and thermal limits?

    That somehow the 1080 is more power and thermally efficient than a 1070 at the same performance output?

    I think that, given the same 1070-level power and thermal envelope for both, the performance would be the same: the 1070 in the same laptop as the 1080 Max-Q would give essentially the same performance, as has already been shown in testing.

    Likewise, a 1070 Max-Q model fitted with a full 1060 would get the same performance.

    It's the software watching the power and thermal limits with appropriate tuning feedback that is doing the work; the GPU dies themselves aren't changed.

    Given the same software running it all - Max-Q tuning for the same power / thermal limits, watching temperature and fan noise - it would rev things up just fine with a "whole" 1070 on the 1080 Max-Q platform, and likewise with the 1060 on the 1070 Max-Q platform.

    "In real life" both the 1080 pulls more power and put's out more heat than a 1070, and a 1070 pulls more power and puts out more heat than a 1060 under the same load.

    So far the 1080 Max-Q is matching the full 1070 in performance.

    It's a huge waste to put a 1080 GPU in a build and detune it to 1070 performance levels, or a 1070 GPU in a build and detune it to 1060 performance levels; it's all just model numbers to let vendors charge and make more $$$.
    Yeah, but by then it might be too late to return it without losing the restocking fee.

    Too bad they don't consider these things before rushing out and buying them :)
     
    Last edited: Jun 6, 2017
    atacool3 likes this.
  5. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    A fully enabled chip is more efficient than a cut-down one, and GDDR5X is more efficient than GDDR5.

    But please, go on ;)
     
    hmscott likes this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, I will: it's a waste of money, a huge waste of money. How's that? :)

    Taking a perfectly good 1080 GPU that could perform at 1080 performance levels, and wasting it down to a 1070 performance level just to fit in a slim case.

    When for a few percentage points less performance, or the same performance with GDDR5X memory - yup, that could happen on the 1070, don't think it can't - you could just use a 1070.

    I think if Nvidia had put the same effort into improving the 1060 and 1070 to fit in slim laptops, it could have done it - adding GDDR5X memory to make it work if that had been necessary, which is doubtful. That would have kept costs the same as they are now, instead of bumping the price by using more expensive GPUs.

    It's a scam, and not a very well-hidden one at that. It's easy for people to see once you point their nose at it.

    That is unless that someone or their company can profit from selling them...

    But please, go on ;)
     
    Last edited: Jun 6, 2017
  7. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    With that logic, why not go for a desktop to fully unleash the 1080, and not "waste" it on a laptop that will only do up to around 2000MHz in games when it could do 2100MHz in a desktop?
     
    hmscott likes this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's not about making the fastest laptop; it's about making the slim form factor laptop perform as well as possible while providing a livable user experience.

    Working with the existing GPUs would have been the smart, cost-efficient way to reach that performance and user experience.

    Wasting more expensive GPU dies to be able to say it's got a 1080 or a 1070, instead of a 1070 or 1060 respectively, is pure marketing BS to extract more money from buyers.

    Over-spec'ing 1080 / 1070 GPUs into the solution doesn't provide better performance or a better user experience; it's purely for mining $$$$.
     
    temp00876 likes this.
  9. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    We're still talking about notebooks :rolleyes: It would be nice to see the same tech in desktops, with equal pricing - aka paying 1080 prices for a 1070. I wonder what the desktop camp would say :D
     
    hmscott and XMG like this.
  10. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    OK, maybe I am treating the below as a given when it's not, but here is an example.

    Power goes up with the square of voltage.
    Power goes up linearly with frequency.
    Power goes up linearly with unit numbers.

    2560 shaders at 900MHz at 0.95V
    1920 shaders at 1000MHz at 1.00V

    Performance
    2560 x 900 = 2,304,000
    1920 x 1000 = 1,920,000

    So the 2560-shader card would be around 20% faster. However, it is doing this at 5% less voltage, which works out to roughly a 10% power saving per shader-clock (0.95² ≈ 0.9025).

    Power
    2,304,000 x 0.9025 = 2,079,360
    1,920,000 x 1 = 1,920,000

    1,920,000 / 1,920,000 = 1.00 relative performance per watt
    2,304,000 / 2,079,360 ≈ 1.11 relative performance per watt

    Hence a larger core is more efficient working at lower clocks. So for a given form factor using a larger (or more enabled core) gives better performance.
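
    Here is a minimal Python sketch of that arithmetic, using only the scaling rules stated above; the shader counts, clocks and voltages are this post's example numbers, not measured values:

        # Sketch of the perf/watt comparison above, assuming power ~ shaders * clock * V^2.
        def perf(shaders, clock_mhz):
            # Relative performance factor: shader count times clock.
            return shaders * clock_mhz

        def power(shaders, clock_mhz, voltage):
            # Relative power factor: linear in shaders and clock, quadratic in voltage.
            return shaders * clock_mhz * voltage ** 2

        big_die = dict(shaders=2560, clock_mhz=900, voltage=0.95)     # wider, slower, lower voltage
        small_die = dict(shaders=1920, clock_mhz=1000, voltage=1.00)  # narrower, faster, higher voltage

        for name, cfg in (("big die", big_die), ("small die", small_die)):
            p = perf(cfg["shaders"], cfg["clock_mhz"])
            w = power(**cfg)
            print(f"{name:9s} perf={p:,} power={w:,.0f} perf/watt={p / w:.3f}")

        # big die   perf=2,304,000 power=2,079,360 perf/watt=1.108
        # small die perf=1,920,000 power=1,920,000 perf/watt=1.000
        # i.e. the wider, lower-clocked die is ~11% more efficient in this model.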
     
    raz8020, bennyg and hmscott like this.
  11. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Right...seems legit.

    The only way to really test it out would be to build a Max-Q laptop with both a 1070 and a 1080, tune them for best performance, and see which one performs like a 1070... you let me know how that works out - but I think we already know the answer: both of them :)

    Until then...

    There are already 1070-wielding slim laptops with the same power and thermal limits - the GL702VM's 1070 running without Max-Q tuning software vs the 1080 Max-Q - and we already know the gaming performance is the same; the only difference is the huge cost of the Max-Q solution vs the $1000-cheaper GL702VM.

    Given the same Max-Q dynamic tuning for optimizing GPU downtime - the idle time between load power / thermals - I think the GL702VM would run even better than it does today.

    Maybe Nvidia will release the driver and firmware Max-Q tuning for all mobile GPUs?

    That would be a nice thing for Nvidia to do, don't you think?
     
  12. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,171
    Likes Received:
    17,882
    Trophy Points:
    931
    It will be controlled via the firmware on the card and the EC; both would need updating.
     
  13. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,754
    Likes Received:
    2,197
    Trophy Points:
    181
    I can't comment on whether those figures will actually be representative of Max-Q when it's available to purchase, but yes, that's indeed the decision - will people pay the premium for the slimmer chassis, or stick with a more affordable but slightly thicker chassis, which will also have more storage options and so on...
     
    Papusan and hmscott like this.
  14. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    And we haven't talked about overclocking and thermal headroom for the coming thin and flimsy.
    For example: the 1080 Max-Q is 90-110W, vs. the fully working 1080 which is 180-200W in some models. I understand Pascal does better at the lowest possible temps, and ODMs put in 10-15% headroom on top of the cooling capacity. Will the thin and flimsy with the new power-crippled card OC and hold higher (OC'd) clocks the same as the thicker models with the "normal" graphics? E.g. 190W graphics + 47W BGA gives a thermal headroom of around 23.7-35.5W, vs. a lot lower for the new crippled machines. I expect the performance will be further crippled vs. normal working graphics. Thin and flimsy isn't nice!! And the main push for this bastard comes from Nvidia.
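
    As a quick check of that headroom arithmetic (a rough sketch; the 190W GPU, 47W CPU and 10-15% cooling margin are the figures assumed in this post, not specs):

        # Rough headroom check; 190 W GPU, 47 W CPU and 10-15% margin are assumptions from the post.
        gpu_w, cpu_w = 190, 47
        load_w = gpu_w + cpu_w                       # 237 W combined load
        for margin in (0.10, 0.15):
            print(f"{margin:.0%} margin -> {load_w * margin:.1f} W of cooling headroom")
        # 10% margin -> 23.7 W of cooling headroom
        # 15% margin -> 35.5 W of cooling headroom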
     
    Last edited: Jun 6, 2017
    hmscott likes this.
  15. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Right, but a newly designed laptop would have both firmware updates and new drivers.

    Designed and built with a 1070 instead of a 1080 would provide the same or "close enough" 1070 performance as Max-Q 1080 for $1000 less :)

    That's all it comes down to, really, making money where there was no money, instead of providing better performance and user experience for the same money.

    It's not really a radical idea, and would be much better for Nvidia to have done, considering AMD is sniffing at their heels.

    AMD should have a nice clean shot at taking a lot of market share from Nvidia if they keep this up and don't turn themselves around.
     
    Last edited: Jun 6, 2017
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The whole point of Max-Q is to dance around the top edge of that model's performance limits.

    Since a Max-Q laptop is already "Max-Q"'ing out performance, there will be absolutely no OC'ing a Max-Q laptop :)
     
    Last edited: Jun 6, 2017
    temp00876 and Papusan like this.
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    If OC'ing were possible, I would expect 35W of cooling headroom to beat the much lower headroom of the new thin and flimsy with Max-Qrippled!! Pascal's max performance depends on the lowest possible temp.
     
    hmscott likes this.
  18. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Protip, the current 1070 already does this. Max-Q "downclocking" is not a new concept and has basically underpinned the entire mobile generation until recently.

    1070 desktop GP104 = 1920 shaders, 120 TMUs
    1070 mobile GP104 = 2048 shaders, 128 TMUs

    How else do you think the mobile 1070 gets anywhere near the desktop version with a 40W lower TDP?

    Depends. Given the 40dB limit, I'm assuming that if that limit is removed you'd end up with quite a lot of headroom in cooling.
     
    hmscott and Ionising_Radiation like this.
  19. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    I expect ODMs have to follow Nvidia's spec for how to cripple the firmware on the card and EC - aka certified for Max-Qrippled. And we all know... ODMs are pretty good at this :rolleyes: And don't forget... the use of weaker components!!
     
    hmscott likes this.
  20. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Gently bumping up the SM / TMU mix is one thing; swapping in a 1080 that costs $$$ more just to get 1070-level performance in a slim laptop is a ludicrous waste of resources.

    The $1000 additional cost to the end user is ridiculous when compared to laptops already shipping with 1070 performance.

    The die may cost the same to Nvidia, so in that case Nvidia should pass the "no cost" on to the user and keep the cost the same as a 1070 laptop is today.

    At the very least, don't call it a 1080 - call it a 1070 Max-Q - it's got 1070 performance when shoehorned into a laptop far too small for its own good, so its expected performance should be reflected in its name.

    The improved cooling of the newly designed laptops, plus the Max-Q firmware and software for finer control over power and thermals, wouldn't be needed if increasing the SM / TMU count alone were the solution.

    The Max-Q tuning is firmware and driver changes that enhance the dynamic tuning, constantly adjusting the GPU load on demand, giving it more time to cool down while still responding quickly to additional demand.

    It would work the same with a 1070 rather than a 1080; improving power usage and reducing thermal output while maintaining performance, in the same power / thermal chassis package, at the same cost to the end user, should have been the goal.

    It's basically a better boost with an added "de-boost", although that's a guess for now, until we can see performance graphs for Max-Q GPUs under varying load to see what Nvidia is varying under what conditions.
    What's the likelihood that Nvidia will allow user control of such things?

    More likely Nvidia will provide a locked down environment to prevent users abusing the hardware.

    If that Max-Q software is indeed monitoring the right things, it's already getting the most performance out of the cooling system while running at a 40dB noise level; the extra few FPS at higher noise levels are likely hardly worth it.

    Hopefully tuning options might be available on higher spec laptops with cooling better suited to the potential of the GPU at some later release.

    If Nvidia had used a 1070/1060 instead of a 1080/1070 for Max-Q tuning in these new slim laptops it might be possible to unlock the limits, but not likely with a 1080/1070 that could easily overpower the cooling and power delivery with the limits removed.

    Of course there is that built-in limit set in the firmware to match the lower wattage PSU, Nvidia might feel comfortable relying on that as a limiter and allow some user tuning.

    We will know when we see one of the new Max-Q laptops fresh out of production destined for end users :)
     
    Last edited: Jun 6, 2017
  21. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,666
    Trophy Points:
    231
    I'd like to put it this way: Max-Q is like de-rating a high-bypass turbofan engine for take-off: bigger engine, turned down to 80 – 90% N1 for efficiency and to keep the engine lasting longer, saving money and fuel in return. Ordinary GPUs (and the power user community here) are like military turbofans and jet fighters: low-bypass, with afterburners (i.e. OC headroom in PC parlance) thrown in to boost performance when needed, where performance is king, rather than efficiency, such as in a dogfight or when attempting to out-fly an AMRAAM or a SAM.

    But here's the catch: GPUs have got no moving parts, unlike jet engines. They don't exactly need frequent and meticulous maintenance routines. They are solid-state electronics, and a few decivolts less, though they may make a GPU last a year or so longer, would hardly make a difference in the practical life of a GPU - meaning it would be thrown away or replaced long before it ever hit its lifespan.

    I understand your point about the notebook 1070 having some 128 shaders more than its desktop counterpart, with clock frequency and voltage lower. However, we're talking about putting a GTX 1080, which has 640 more cores than a GTX 1070, into a notebook hardly two centimetres thick.

    Fitting such a GPU into a thin-and-light just to downclock and undervolt it with weak fans (more air moved = more noise, and more air moved = more effective cooling, simple as that) is a waste of GPU. Like the reviews have mentioned, it'd end up performing hardly any better than a 1070 anyway. Not likely to have much OC headroom, either, given the thickness of the notebooks in question. Obviously it'd be efficient, because we're taking a high-end GPU and making it perform at mid-range levels of performance.

    Going back to the jet engine analogy, it's like bolting GE90s on a B737 or an A320 (not that they'd fit, anyway) and making them run at 60% N1. Sure, we get efficiency. But it's a waste of a jet engine. Performance tiers exist for a reason, and if we start over-engineering everything just for the sake of efficiency, then we have a lot of wasted potential, and a waste of resources that we'd never use, anyway. I've already mentioned that the Pascal GPUs start disabling cores when the voltage begins to run too low. We might see the same case here, where instead of 2560 cores running at 0.95 V and reduced clocks, we may have 2400, 2300, maybe even just 2000 cores running at 0.95 V and reduced clocks.
     
    TBoneSan, temp00876, Papusan and 2 others like this.
  22. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    Yeah, I expect they will prevent you from frying your expensive hardware at all costs!! :cool: RMAs aren't nice and they cost money!!
     
  23. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    That's not how it works.
    All 1070s start out as a GP104 core. They get binned and cut down to suit whichever SKU they're going into. A 1070N (mobile) is not a "bumped up" 1070, more like a "less cut down" 1080. Technically, a 1070 costs more money to create than a 1080 because it's not a fully functioning GP104. The reality is, Nvidia sets the sell prices based on what comes out, not what goes in. If the process were perfect there wouldn't be a 1070; everything would be a 1080.
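
    To illustrate that binning point, here is a toy yield model in Python (purely illustrative: the 2% per-SM defect rate is a made-up number; only the 20-SM / 2560-shader GP104 layout is real):

        # Toy binning model: every die starts as a full GP104 (20 SMs); dies with any
        # defective SM get fused down to a 1070-class part. Defect rate is an assumption.
        import random

        random.seed(0)
        SMS_PER_DIE, DEFECT_RATE, N_DIES = 20, 0.02, 100_000

        full, cut = 0, 0
        for _ in range(N_DIES):
            good_sms = sum(random.random() > DEFECT_RATE for _ in range(SMS_PER_DIE))
            if good_sms == SMS_PER_DIE:
                full += 1    # all SMs work: can be sold as a 1080
            else:
                cut += 1     # fuse off bad (and some good) SMs: 1070-class SKU

        print(f"full dies: {full / N_DIES:.1%}, cut-down dies: {cut / N_DIES:.1%}")
        # With a very mature process the "cut" bucket shrinks toward zero, which is the
        # post's point: a perfect process would leave nothing to sell as a 1070 without
        # deliberately disabling working SMs.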

    The cost to the end user is entirely dictated by:
    A) What Nvidia charges for each SKU of the cores. We don't really know what this is. For all we know the difference between a 1070 GP104 and a 1080 GP104 is almost nothing.
    B) What the ODM charges on top of that. Again, we don't really know this either.

    Neither of these things implies a "huge" added cost or a problem with the technological concepts here. The huge price-tags we're seeing on these new 1080 Max-Q machines are more likely the "wow" price-tags you get from Razer and such. The more regular models getting Max-Q variants appear to be priced much more reasonably, as I've documented at least with MSI a few pages back.

    You're not looking at the bigger picture. It's not just about increasing the SM/TMU count; it's about running as many as possible at the most efficient voltage, and at whatever clock speed that voltage supports.

    The key here is voltage efficiency. As voltage goes up, power consumption (and therefore heat) goes up roughly with the square of the voltage, and efficiency plummets. This is a concept everyone really needs to wrap their heads around.
    The point (these are just example numbers) with Pascal in general is that the highest efficiency point seems to be around 0.9V -> 1.0V. Clock it appropriately and design the power delivery to work as efficiently as possible in that voltage range and your overall perf/watt goes up. Enough so that you can apparently jam a full GP104 that runs like a 1070 into a 110W power envelope.
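
    Putting that in numbers (a tiny sketch using the same power-scales-with-V² assumption as earlier in the thread; the voltage steps are arbitrary examples):

        # Extra power cost of raising voltage at a fixed clock, assuming power ~ V^2.
        base_v = 0.90
        for v in (0.90, 0.95, 1.00, 1.05, 1.10):
            extra_power = (v / base_v) ** 2 - 1
            print(f"{v:.2f} V: {extra_power:+.1%} power vs 0.90 V at the same clock")
        # 0.95 V: +11.4%   1.00 V: +23.5%   1.05 V: +36.1%   1.10 V: +49.4%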

    It's got nothing to do with life-span. It's all about how much voltage is required to switch the transistor.

    Depends on who you are I guess (as far as dictating if it's a "waste" or not). For all we know, the 16nm process has been nailed to such a high degree that nearly every chip comes out capable of running all SMs. For all we know, Nvidia is selling the Max-Q chips cheaper and it's the ODMs piling the extra cost on. If you think about it, if the 16nm GP104 production was perfected, it would actually COST Nvidia money to disable SMs to create the 1070.

    One day we might even get there: cut-down chips would become a thing of the past, because binning would become obsolete.

    This is what I'm really not sure about, purely because of the 40dB noise limit. I think people underestimate just how big the difference is.
    That implies that there's a significant amount of RPM headroom in the fans. Granted it could come entirely down to firmware/bios limitations, but I have no doubt that a reasonable OC would not be limited due to thermals.

    I guess we'll see when people get their hands on them.
     
  24. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, I know how it works @Stooj and I know you know I know that, so knock it off.

    Pro-tip's my sweet patootie. :p

    Nvidia doesn't need to apply a full 1080 to this solution, Nvidia doesn't need to call it a 1080 when it's delivering 1070 performance, and Nvidia doesn't need to charge $1000 more than the laptops providing 1070 performance today.

    Nvidia is trying to cash in, like Razer has with their 1080 Blade, off the innocent dupes that fell for the marketing - paying for a full 1080 but ending up with only 1070 performance, and no CPU OC headroom due to power limits.

    End of story.
     
    Last edited: Jun 6, 2017
  25. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    I keep talking to myself :D
    We know that the former 980M graphics cards had a high failure rate, and why that happened. With cheapo components intended for low load, as stated (Max-Q)... we already have the gold standard for how this will go if overclocking is - or was - possible :) And Max-Q is BGA :eek: I wonder how cheap it will be, and how easy it will be to opt for a new or used MB :oops:
     
    Last edited: Jun 6, 2017
  26. Dialup David

    Dialup David Notebook Consultant

    Reputations:
    112
    Messages:
    249
    Likes Received:
    123
    Trophy Points:
    56
    When has Nvidia ever let the user have control of their hardware? The only reason overclocking via software works is because of a driver exploit that you could manipulate with Riva Tuner. Nvidia tried to patch that, and the PC community damn near burned their brand to the ground until they undid the patch.

    Voltage control on mobile is done exclusively via vBIOS modding, as is TDP unlocking, which requires a potentially bricking flash, etc. The user is doing all of this manually, with no hand-holding. I wouldn't call that "allowing it to happen".

    In my eyes, Max-Q is just a marketing gimmick to sell gimped and possibly low-spec ASICs that never made the voltage/clock cut for the other platforms (mobile/desktop). So instead they nuke the clock speeds, cut the TDP at least in half, and call it a special new part. This way, manufacturers that don't want to spend the R&D dollars actually building a machine that can house/cool/power higher-end cards can just plop in this new "low power" card and keep the standard GTX 1080 (or whatever) moniker. Then the user, who has no idea whether or not it's a gimped Max-Q variant, spends a fortune on hardware that can never possibly compare to the equivalent non-gimped GTX 1080, and proceeds to go on these forums to complain about how terrible the performance is compared to the guy that spent less on a 17.3" 30lb Sager/Clevo, etc.

    Don't get me wrong here, I think everyone wants a Razer Blade form factor with a full-TDP GTX 1080 without fusing hydrogen atoms, but using this Max-Gimp marketing trash to push overpowered and underclocked hardware is a load of crap. Hopefully I'm wrong and just barking in the wind, but I'd like to be surprised.
     
    Last edited: Jun 7, 2017
  27. ElCaptainX

    ElCaptainX Notebook Consultant

    Reputations:
    17
    Messages:
    282
    Likes Received:
    68
    Trophy Points:
    41
    Now we just hope AMD releases their Vega GPU in notebooks at full power and a reasonable price - I will stick with AMD forever :/
     
  28. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Yep, AMD could really throw a spanner in the works if they got off their butt and released an MXM card.
     
    bennyg, ElCaptainX, jaybee83 and 4 others like this.
  29. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I'm not expecting much, but I'm still playing wait-and-see with this tech. I haven't had my hands on a machine with it yet; who knows, maybe I'll be pleasantly surprised.
     
  30. atacool3

    atacool3 Notebook Consultant

    Reputations:
    10
    Messages:
    108
    Likes Received:
    99
    Trophy Points:
    41
    How is the eGPU market going currently? They seem like a very good option for a GPU upgrade without sacrificing battery (you only game with a mains plug nearby anyway), thermals, or laptop size. The only problems I see are that the Thunderbolt port slightly decreases performance compared to desktops, and some eGPUs are not very portable... Would the price of a base eGPU enclosure (with no actual GPU) still make it more economical and useful than an overpriced and underpowered Max-Q GPU?

    And for people who claim to game without a mains plug nearby, the only reason they can do so is that they are playing pinball or something... The battery life for an actual game would be 2 hours max (and that's on a very efficiently designed laptop).
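
    A rough back-of-the-envelope for that battery claim (the battery capacity and draw figures below are illustrative assumptions, not measurements of any particular laptop):

        # Gaming-on-battery estimate: battery energy divided by sustained system draw.
        battery_wh = 99          # roughly the largest battery shipped (airline limit); an assumption
        for draw_w in (50, 80, 120):
            print(f"{draw_w:>3d} W sustained draw -> {battery_wh / draw_w:.1f} h of gaming")
        #  50 W -> 2.0 h    80 W -> 1.2 h    120 W -> 0.8 h
        # and that ignores the clock/TDP limits most dGPUs apply on battery anyway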
     
  31. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    I think the initial eGPUs were timed poorly. They were clearly conceived before the full 10 series narrowed the mobile/desktop performance gap, and being able to get near-desktop performance, even in a thicker, heavier laptop, made them less attractive. Right now it seems like nobody knows whether to jump on the bandwagon, hold off for a bit, or skip it entirely. Add to that the fact that the people who would benefit most tend to have older systems lacking the Thunderbolt port necessary to run one - so they'd need to buy an entirely new computer just to be compatible - and that newer low-end systems are still being released without TB, and it all contributes to confusion and, I imagine, lower sales.
     
    atacool3, TBoneSan and Papusan like this.
  32. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,666
    Trophy Points:
    231
    With eGPUs, wouldn't the fact that most thin-and-lights have a U-series CPU create a massive bottleneck?

    I mean, two cores, four threads, that's hardly better than a Pentium...
     
    TBoneSan, jaybee83 and hmscott like this.
  33. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    There's a decent number with HQs, shouldn't be a huge problem. But you're right, the U series will be an obstacle.
     
  34. sicily428

    sicily428 Donuts!! :)

    Reputations:
    816
    Messages:
    3,610
    Likes Received:
    1,987
    Trophy Points:
    231
    Here are some benchmarks from egpu.io (results listed as 13R3 – GTX 1060 dGPU / 13R3 – GTX 1060 eGPU / XPS9365 – GTX 1060 eGPU (Akitio Node)):

    Unigine Valley: 2,368 / 2,408 / 2,067
    Unigine Heaven: 1,431 / 1,363 / 1,188
    Unigine Superposition: 2,085 / 2,073 / 2,164
    3DMark Time Spy: 3,537 / 3,677 / 3,507
    3DMark Fire Strike: 11,326 / 11,005 / 9,292

    Metro Last Light Redux: 83 FPS / 71 FPS / 58 FPS
    Rise of the Tomb Raider: 54.28 FPS / 47.83 FPS / 43.13 FPS
    Tom Clancy’s The Division: 49.9 FPS / 40.8 FPS / 39.1 FPS

    I think that's a good result for a CPU like the Intel Core i7-7Y75 @ 1.3 GHz, and without the plug (battery only).
     
  35. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    I still think that would be noticeable in a lot of cases.
     
  36. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yup, not a good pairing for eGPU.

    The other gotcha is the new rage for 120Hz displays on all these smaller, thin TB3 notebooks.

    The TB3 bandwidth to the eGPU and back to the internal screen has enough trouble at 60Hz/FPS; IDK how it's going to cope with feeding 120Hz/120FPS gaming back to the internal display.

    It's always been a good idea to connect the display to the eGPU directly, but with a 120Hz internal display it's going to be mandatory.
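
    Some back-of-the-envelope numbers for that bandwidth concern (a sketch; the ~22 Gbit/s usable PCIe-over-TB3 figure is an approximation, and uncompressed frame copies back to the internal display are assumed):

        # How much of the eGPU link the "loopback" to the internal panel eats.
        def loopback_gbit_per_s(width, height, hz, bytes_per_pixel=4):
            # Uncompressed rendered frames copied back across the link to the internal display.
            return width * height * bytes_per_pixel * hz * 8 / 1e9

        usable_link = 22  # Gbit/s, rough usable PCIe bandwidth over TB3 (assumption)
        for hz in (60, 120):
            bw = loopback_gbit_per_s(1920, 1080, hz)
            print(f"1080p @ {hz} Hz loopback ~ {bw:.1f} Gbit/s "
                  f"({bw / usable_link:.0%} of ~{usable_link} Gbit/s usable)")
        # 1080p @ 60 Hz ~ 4.0 Gbit/s (18%); 1080p @ 120 Hz ~ 8.0 Gbit/s (36%),
        # on top of the normal GPU traffic - hence the advice to drive an external
        # display straight from the eGPU.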
     
    TBoneSan likes this.
  37. xor01

    xor01 Notebook Deity

    Reputations:
    178
    Messages:
    752
    Likes Received:
    62
    Trophy Points:
    41
    I'm also interested in an eGPU solution, to have one ultimate laptop for everything (gaming, work, etc.).

    I also think the Surface Book with Performance Base is really close to being the ultimate laptop; unfortunately the design doesn't allow easy maintenance (no removable back cover), so re-pasting and cleaning the heatsink + fan would be a nightmare to DIY.

    I don't know why a gaming laptop company doesn't make one like the Surface Book... a gaming laptop with a touchscreen, stylus, and 360-degree hinge.
    Imagine... a slim gaming laptop like the MSI GS43VR, but with a 360-degree hinge to convert into tablet mode.
     
  38. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Because it's a gaming laptop that needs AC power to power the GPU and high performance cooling to cool the gaming load on the CPU and GPU.

    You can't have both AAA gaming and tablet in the same laptop. :confused:

    Even a tiny low-end GS43VR sucks in AC power and blows out huge amounts of hot air during heavy gaming - you couldn't hold on to it. And you couldn't fold the screen over without covering the air intakes. And you'd have an AC cable trailing from the laptop / tablet.

    Instead, right now, you could get 2 devices, one built correctly for gaming and one built correctly for tablet use.

    Or, you could get one of those new magical Max-Q laptops that transforms into a Sushi Serving Table:
    [attached image]
    Those are totally awesome :rolleyes:
     
    Last edited: Jun 7, 2017
  39. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Note, the chopsticks are sold separately.
     
  40. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You'll want to get the top end chopsticks with 10,000 levels of pressure sensitivity, totally worth the extra $500, for sure :D
     
  41. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,666
    Trophy Points:
    231
    Apple Sticks, now with a built-in battery and a proprietary Lightning plug which you can plug into your iPad to power the warming coil built into the chopsticks. Keep your food warmer as it traverses the thirty centimetres from your plate/bowl to your mouth!

    Available at all Apple Stores and the Apple Online Store. The Sticks Manager app is available on the iOS App Store.
     
  42. xor01

    xor01 Notebook Deity

    Reputations:
    178
    Messages:
    752
    Likes Received:
    62
    Trophy Points:
    41
    LOL... hahaha
    I'm sorry, I wasn't being clear enough.
    I didn't mean a convertible laptop with a high-performance GPU in it... it would be a disaster, like you said, with all the heat and power consumption.
    What I mean is a Surface Book-like device, but from a gaming laptop company like Asus or MSI or Razer.
    So it would have the same features as the Surface Book, but with a removable back cover for easy maintenance.

    As you're probably aware, the Surface Pro has been copied a lot by other laptop manufacturers.
    But not the Surface Book. Only the Surface Book has managed to put a GTX 965M into a 13" convertible laptop.
    Other manufacturers' convertible laptops only manage to use integrated GPUs.

    The closest 13" laptop with a dGPU is the Alienware 13, but it has the footprint of a 14" laptop and it's also not convertible.
    MSI has the GS30 with a 950M, but it's also not convertible. And they've also stopped producing 13" gaming laptops (no GS32 successor).
     
    hmscott likes this.
  43. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    For now that's the top stick.

    We can only dream of a future with Mimosian Antimatter Chopsticks...
     
  44. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    There are Pascal replacements coming along from MS, I am sure - if not now, then next generation.

    But there are other gaming laptop makers that are, or soon will be, making 1050 / 1050 Ti, and perhaps even 1030-based laptops.

    But even those dGPUs (as well as the 965M) are going to need AC power for full performance; there really aren't any (powerful) dGPUs that run the same on battery as on AC, it's just not possible.

    I think I saw one with a "130" GPU mentioned - it was either a typo and they meant 1030, or Nvidia has a new class of 3-digit GPU names. That may have a battery mode that's less power-sucky than the higher models, but it's probably not powerful enough to do much either.

    Keep watching for new releases :)
     
  45. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,824
    Likes Received:
    59,553
    Trophy Points:
    931
    eGPU is similar to TVs' 3D functionality. It died before it was ready for use with computers!! Designed for more milking!!
     
  46. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    I'm guessing 1030.

    And I don't see a good battery gaming solution any time soon in laptops at least. You're gonna be tied to a wall whether it's a hybrid ultrabook or a massive DTR.
     
    hmscott likes this.
  47. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I think if it had been ready and compatible earlier, it would have made a real difference. But as it stands, it was too little, too late. Who knows though, maybe it will become viable again in the future, when more machines have the right combination of hardware.
     
    Papusan and hmscott like this.
  48. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Can a Gaming Laptop be Thin & Quiet? Explaining NVIDIA Max-Q
     
    DukeCLR likes this.
  49. Miguel Pereira

    Miguel Pereira Notebook Consultant

    Reputations:
    11
    Messages:
    197
    Likes Received:
    160
    Trophy Points:
    56
    A laptop with a GTX 1050 has enough punch for any game at medium-to-high settings at 1080p 60 FPS on the go.

    If eGPU were properly developed, you could have a thin laptop with a 1050 or equivalent and an HQ CPU or Ryzen, and use the eGPU for the heavy lifting.

    But eGPU is not properly developed. It loses too much performance.

    Sent from my MHA-L29 via Tapatalk
     
  50. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    eGPU is a very attractive option for me, but I've yet to see it done quite right. I'd want the internal monitor to be able to display 120Hz, and without a performance hit.
    In that case I would want a good processor too - not a thermal-throttling 2.8GHz chip that can't feed the GPU.
     