The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    RTX 3080 trumps 2080ti by a whopping margin

    Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jun 24, 2020.

  1. KING19

    KING19 Notebook Deity

    Reputations:
    358
    Messages:
    1,170
    Likes Received:
    782
    Trophy Points:
    131
    Lol. Certain people have money to burn, especially for only a 10% performance increase over the RTX 3080. :p
     
  2. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    4% more CUDA cores, same as the mobile 3080 scam, but with GDDR6X memory and 320-325W in benchmarks. Yeah, the mobile 3080 looks like a more and more disgusting and obscure graphics card. We may see a 10% uplift with a Super-gen mobile 3080 if that becomes a thing. 880M all over again.

    NVIDIA GeForce RTX 3070 Ti Gaming & Synthetic Performance Benchmarks Leak Out, Up To 10% Faster Than RTX 3070
     
    Clamibot likes this.
  3. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    the problem isn't nvidia....it's the consumer...if people didn't pay extra for 5 percent increases in performance, nvidia would not make them. this is fact, truth and 100 billion percent correct...people are dumb, nvidia not so much, as all that really matters in life is money, health and family
     
    Clamibot likes this.
  4. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    While Dell continues cannibalizing the GPU cores, MSI pushes out new firmware with higher TGP for their GE76 Jokebooks with the 3080 Mobile. Yeah, that's how different the OEMs can be. But 165W is still an awful and disgusting idea by Nvidia and its OEM partners.

    The MSI GE76 Raider runs the GeForce RTX 3080 after a vBIOS update with 165 watts


    The MSI GE76 Raider gets better gaming performance via software update, because the latest vBIOS version increases the TGP of the GeForce RTX 3080 laptop GPU. With 165 watts, the device can compete with the fastest gaming flagships on the market.
     
  5. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,983
    Trophy Points:
    431
    what about clevo, why is clevo following this disgusting practice? only 150w cards in clevo laptops too?
     
    seanwee and Papusan like this.
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Yeah, they're also forced to follow Nvidia’s guidelines. At least they don’t disable CUDA cores :D

    But in short, all Ampere high-end mobile graphics cards are castrated. Give a huge thanks to the thin and slim trend.
     
    ole!!! likes this.
  7. Tenoroon

    Tenoroon Notebook Deity

    Reputations:
    149
    Messages:
    751
    Likes Received:
    589
    Trophy Points:
    106
    I'm hoping for an Ampere mobile Ti/Super refresh that will produce a 200(+) watt 3080 Ti/Super mobile. I'd also hope that if a 3080 Ti/Super mobile is made, that it would use the GA102 die to be as close to the desktop 3080 as possible.
     
  8. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Betting against you is usually the winning play.
     
    seanwee, hfm, Clamibot and 3 others like this.
  9. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    The 3070 peaks at 140W and the 3080 at 165W. That's the deal until (if) they release a Super card. Eluktronics is also giving their 3080 laptops the 165W bump, but I think only the Prometheus is going to be able to use it for more than a Time Spy run; the rest don't have much headroom.
     
  10. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,983
    Trophy Points:
    431
    having a custom/unlocked vbios (if that's a thing) doesn't help?

    it's disappointing though. the 3080 was supposed to be at 200w, which means if there's a Ti or Super mobile card it should be 250w or even 300w. having 200w on a Ti/Super feels way gimped, especially since we're going to pay a premium for it.

    guess it's time to wait for the 4080 or 5080 and see what those bring us. maybe by then AMD will have something competitive and not locked down, to force nvidia out of this dumpster lineup.
     
    Tenoroon and Papusan like this.
  11. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    You need to do a shunt mod. Nvidia locks everything above the max 165W vBIOS for the 3080 mobile. Same for the rest of the graphics SKUs. You can only cross-flash vBIOS, and that is still bound by the max allowed TGP.

    Custom/modded graphics firmware is a thing of the past. Locked down like a 100-year-old virgin.
     
    ole!!! likes this.
  12. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    [image]
     
    Papusan and Clamibot like this.
  13. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Yeah shunt modding is the way to go if the only difference the super version brings is a higher tdp.
     
    Papusan likes this.
  14. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Yet another gaming book that will rely on Dynamic Boost 2.0 to keep power below what it needs.

    Razer updates Blade 17 with faster 130 W GeForce RTX GPUs but with no changes to the 230 W AC adapter notebookcheck.net

    Anyone know if Razer uses battery boost? The Razerbook 17 Pro with the 100W graphics card pulled 223W from the same power adapter...


    I expect you'll now be able to cook eggs on the power adapter. And if Razer uses battery boost, they'll grab more cash from selling more batteries :D

    Major Battery Issues Plague Razer Laptops
     
    Last edited: Jul 14, 2021
    JRE84, Vasudev and Falkentyne like this.
  15. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Yep, not even close to 2080 Ti performance. Nvidia and its partners made a disaster with their castrated TDP and Dynamic Boost 2.0! The thin and skinny shrinking sickness continues.

    MSI GS76 Stealth: Thin chassis comes at the expense of GPU performance notebookcheck.net

    The TGP of current Nvidia graphics cards is a tiresome topic. As in the case of the MSI GS76 Stealth, it can happen that the built-in RTX 3080 runs slower than an RTX 3070 in other laptops.

    86% MSI GS76 Stealth 11UH gaming laptop test: Thin design costs GPU performance

     
    Tyranus07 and Vasudev like this.
  16. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    the title of the thread was about the desktop 3080 beating the desktop 2080 Ti, as there is no 2080 Ti laptop, and it was an early indicator of what we could expect...holy smokes is the desktop far ahead of the laptop variant
     
    Clamibot and Papusan like this.
  17. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    RTX 3080 mobile only trades blows with a 1080 Ti (desktop) or 2080 non super (desktop).
    And seems to fit in some bizarre space between the full 3060 and 3070 cards.
    And I don't know anything about laptops now (I don't even know if the DTR Clevo has a higher TDP 3080 in there or not)
    but I have a feeling if you shunt mod one of these (even if you can find the shunts, and it had better be an MXM card), you'll either blow up the VRMs or the system will throttle because you passed the speed limit and just got a ticket...
     
    Vasudev and JRE84 like this.
  18. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    I've helped several people shunt mod 30 series laptops and all have gone fine. (Granted, I was overseeing the process.)

    Here are some scores from a shunted GP66 with a 3080 mobile
    https://www.reddit.com/r/GamingLapt...p66_11uh_optimized_gaming_synthetic_and_real/

    In fact, I've turned people away from shunt modding MXM cards because they were equipped with anemic 4-5 phase 30A VRMs, which are barely enough for the GPU running at stock limits.
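
    For anyone wondering what the mod actually does electrically: the GPU's controller senses current as a voltage drop across a tiny shunt resistor, and soldering a second resistor in parallel makes it under-read. A minimal sketch of the math, with assumed (not measured) resistor values:

    ```python
    # Shunt-mod arithmetic, illustrative values only: the power controller
    # infers current from the voltage drop across a small sense resistor
    # (V = I * R). A second resistor in parallel lowers the effective
    # resistance, so the controller under-reads current and permits more
    # real power before it starts limiting.

    def parallel(r1: float, r2: float) -> float:
        """Equivalent resistance of two resistors in parallel."""
        return (r1 * r2) / (r1 + r2)

    stock_shunt = 0.005   # ohms; a 5 mOhm sense resistor is assumed here
    added_shunt = 0.005   # ohms; resistor soldered on top of the stock shunt
    vbios_limit = 165.0   # watts; the locked TGP cap discussed above

    effective = parallel(stock_shunt, added_shunt)         # 2.5 mOhm
    real_power = vbios_limit * (stock_shunt / effective)   # sensed 165 W

    print(f"effective shunt: {effective * 1000:.2f} mOhm")
    print(f"real draw at the 165 W reading: {real_power:.0f} W")  # ~330 W
    ```

    An equal-value parallel resistor doubles the real limit, which is usually too aggressive; modders pick larger values for a smaller offset, and that is exactly why the VRM headroom mentioned above matters.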
     
    Last edited: Sep 20, 2021
  19. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Oh you ain't seen nothing yet

    https://www.notebookcheck.net/Maing...thin-size.559365.0.html#toc-energy-management

    I quote:

    We're able to record a maximum draw of 257 W from the medium-sized (~15.3 x 7.4 x 3 cm) 240 W AC adapter when running Prime95 and FurMark simultaneously. Thus, the system could have potentially benefitted from a larger AC adapter with a higher power rating.


    165w 3080 on a 240w adapter is just sad
     
    Papusan likes this.
  20. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    like I don't get it, you post a result with a mobile 3080 scoring 14000 in Time Spy and the desktop 2080 Ti scores 14000 stock, yes I know you can overclock it further but that's not the point...having 2080 Ti power in a laptop is insane. but why do you guys say it's only close to a 2080??

    I have been a member for what seems like forever, and every year or second year it's the same thing: people with laptops try to overclock to see the gains and net performance, then compare them to desktops...if an overclocked laptop scores like a stock desktop 1080, then you are safe and it is common to say it's like either a stock desktop 1080 or an overclocked one.

    nothing has changed except the current gen mobile gap versus desktop being further from zero.....pascal was great..now what we are left with is something that's been the case for like a decade or two..

    like look at the 980m/880m/780m/680m/580m/480m/280m/9800gtx/8800m

    like LMAO guys that's a freaking long time of the same thing...

    it's kinda like 3d.....living for 30-50 years and just after 30 years realizing you see in 3d..lol......eureka the world is 3d....let's discuss...

    unless you were living in a cave it should not be surprising or disappointing
     
  21. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I gave up on trying to overclock the GPU past its original settings, more or less because laptop manufacturers install cooling that's BARELY able to handle stock settings, never mind OC ones.
    My system (in the signature) is an exception, as it seems to have really powerful cooling which keeps the CPU (Ryzen 2700) at 75 degrees C when it's maxed out, and the GPU (desktop-grade Vega 56) at about 65-68 degrees C when it's maxed out - so I have some 'breathing room'... but overclocking my V56 is not simple, as modern drivers aren't exactly kind to OC-ing, so it can just make things less stable - and to be fair, at stock, my V56 is about 5-10% behind the fully TDP-unlocked version - so while I can undervolt AND overclock at the same time, it takes finesse - but I don't think it's worth it, as the GPU is still more than capable of playing all modern games at 1080p and high FPS using 'High' settings (at least).

    If you can both undervolt AND overclock a GPU (or at least undervolt it), then sure, I'd probably suggest it as a viable option to get more efficiency and performance out of it... but more performance is usually usable only if/when you discover your laptop is struggling.

    Undervolting should probably be the de facto option in laptops where the manufacturer hasn't done their job correctly... and modern CPUs (not GPUs) seem to be reaching ridiculous temperatures (about 90 to 95 degrees Celsius) when they're being stressed in gaming (never mind professional software, which can stress a CPU to 100% easily).

    To me, that is unacceptable, because those are borderline temps which will reduce the overall lifespan of other components in the laptop. So the first thing I'd do in such a situation would be to repaste the CPU and GPU. This would probably help in dropping the temps somewhat... but the next step would be undervolting both the CPU and GPU (which can also improve performance, as it allows the hardware to draw less power, reach/maintain its boost clocks for much longer, and drop overall temps further).

    Cooler laptops = longer-lasting systems (if they are maintained), improved efficiency and better/more consistent performance.

    Since most laptops come with 1080p panels, it's unlikely most modern GPUs from mid-range and above will struggle to run modern games at pretty high FPS at that resolution... but overall, OC-ing can only gain you 10-15%... maybe 20% more performance.
    This would be good if you're borderline between unplayable and playable FPS in games... but otherwise, it's too much of a hassle and results in potentially higher thermals for a small improvement that you probably won't even notice.

    I try not to change my system too often (maybe every 3 or 4 years - possibly 5 if I can get away with it).
    The last one I used for 9 years out of necessity (due to having no money to buy a new laptop).

    The one I'm presently using (in signature - and since 2019) should do me just fine for at least a while longer... but because I'm also a student in 3D animation, I MAY (sadly) need to change it for a system that has an RTX GPU for the purpose of hardware acceleration (since most professional software has apparently eschewed AMD GPUs for GPU acceleration and overall support - not because AMD GPUs aren't capable, but because software devs can't be bothered to write the code needed to take advantage of them).
     
    Last edited: Sep 20, 2021
    hertzian56, JRE84 and Papusan like this.
  22. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    It's disappointing because technology is supposed to progress, not regress.

    For the first time ever with Pascal, we finally had laptops that performed exactly the same as their desktop counterparts after about 3 decades of crappy laptops. Things should've stayed that way. That's what we call progress.

    In the past few years, we've seen that progress regress back to the dark days where laptops performed significantly worse than their desktop counterparts. This is stupid, and there is no genuine reason why laptops can't perform the same as their desktop counterparts. We saw it done with Pascal, so there's no reason it couldn't have stayed that way, even with increasing power draws from newer CPUs and GPUs.

    With Pascal and Turing Super, I could justify paying a premium to get desktop performance from a laptop. I cannot justify paying a premium to get worse performance from a laptop than a desktop. This is stupid and unacceptable. Again, there is no valid reason a laptop can't perform just as well as its desktop counterpart. It's perfectly possible, as demonstrated with Pascal 5 years ago.

    Thin and light isn't necessarily the only problem behind this regression. It's more of a lack of innovation than anything else. Regardless, laptops have significantly regressed in the past few years rather than progressed in terms of the performance delta between them and their desktop counterparts.

    Wasn't the goal since the inception of the laptop to have desktop performance on the go? What happened to that? That was the entire driving factor behind Nvidia's mobile GPU marketing since Maxwell. Remember those "closing the gap" messages from them?
     
  23. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Around 100 of the 3080 laptops manage 14000 in Time Spy on the 3DMark leaderboard. And the test isn't the same as games that run for several hours. Don't expect all those top-100 laptops to run stable at their maxed-out OC. The modern, disgusting Dynamic Boost 2.0 feature will cripple CPU or GPU performance if you run other CPU-intensive tasks in the background while gaming. That's the fun of shared heatpipes (a unified heatsink paired with the already mentioned awful and disgusting feature; the sketch below this post shows roughly how that power sharing plays out).

    Highest score: 14754
    Average score: 11965
    Pretty sure they learned the hard way that you'll keep your laptop longer if they offer desktop-class performance. And they have to offer what the OEMs ask for.
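
    To make the shared-budget complaint concrete: Dynamic Boost shifts a fixed platform power envelope between CPU and GPU, so background CPU load eats directly into GPU headroom. A toy model with assumed numbers, not Nvidia's actual algorithm:

    ```python
    # Toy model of CPU/GPU power sharing under a fixed platform budget.
    # All wattages are assumptions for illustration, not Nvidia's real values.

    GPU_BASE, GPU_MAX = 150, 165   # W: base TGP and Dynamic Boost ceiling
    CPU_FLOOR = 35                 # W: minimum the CPU is always granted
    BUDGET = 200                   # W: shared CPU+GPU envelope (assumed)

    def split_budget(cpu_demand_w: float) -> tuple[float, float]:
        """Grant the CPU its demand (within floor/ceiling); GPU gets the rest."""
        cpu = max(CPU_FLOOR, min(cpu_demand_w, BUDGET - GPU_BASE))
        gpu = min(GPU_MAX, BUDGET - cpu)
        return cpu, gpu

    print(split_budget(35))   # light CPU load -> (35, 165): GPU fully boosted
    print(split_budget(60))   # background CPU work -> (50, 150): GPU drops to base
    ```

    With a unified heatsink the same trade-off repeats thermally: heat the CPU dumps into the shared pipes is cooling capacity the GPU no longer has.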
     
    Last edited: Sep 20, 2021
    hertzian56, JRE84 and Clamibot like this.
  24. Tenoroon

    Tenoroon Notebook Deity

    Reputations:
    149
    Messages:
    751
    Likes Received:
    589
    Trophy Points:
    106
    I'm just going to say it, and I will argue this until I die: Pascal was the best GPU generation Nvidia has ever created. Desktop cards are still holding up, the price at the time wasn't too bad or inflated, mobile Pascal was basically the desktop cards, and 1080s/1080 Tis are still holding up very well today.
     
  25. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181

    this!!

    yes pascal was awesome and still is...a 1080 can still max games with high fps

    we need to support the successes and shun the failures....remember a tennis player is only as good as his second serve...

    also with pascal do you guys think it's possible they held back the desktop 1080 and gimped it just to close the gap
     
    Clamibot and hertzian56 like this.
  26. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    yeah, just go desktop if you want top performance. in terms of laptop top performance it's just a tradeoff, unless you can find a manufacturer that will do as much as possible to really put a desktop into a case, like eurocom. They're at the mercy of intel/nvidia etc though. At that point, as far as price goes, it's more than a full desktop system (under normal supply/pricing situations of course), so it wouldn't make any sense unless you have a real need to lug something that powerful around.

    Pascal mobile was really cool, and for anything 3060m and under I'd just buy used 1070/1080 systems instead, unless you can get a new-condition whole system (1660 Ti/2060/3060) at the same or only a marginally higher price. Used 1070/1080 laptops still go for a comparable whole-system price to new 3060/2060 laptops, so nah, I got a new system.
     
    JRE84 and Clamibot like this.
  27. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    My last laptop had a desktop GTX 1080 in it and Cyberpunk kicked sand in its face and walked away with the girl. I liked that lappy and it held up well; I kept it 2 years, which is longer than any other up to now.

    Buy the right laptop and you'll get all 165 watts all the time and despite the waa on this forum the results are spectacular. The 3080 is significantly more potent than the 1080 in a laptop when comparing the best of their type; you gents can dislike it as much as you want and it will be no less true.

    There isn't much of anything about the graphics market I like, what they're doing in design, the misleading marketing, the lack of real competition, I could go on and on. The world keeps spinning though and you make your choices. I'm enjoying the crap out of my 3080 and I'm not apologizing for it.

    One other thing of note: Some of you make it sound like top notch 1080 laptop performance was cheap, it was not. Mine was something like 3500 dollars and that was about par if you wanted the best of breed; a desktop 1080 with all the trimmings. If you were feeling it you could have spent more.
     
    electrosoft and JRE84 like this.
  28. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    yeah I went with the 1060....meh it aged at the same time as the 1080
     
  29. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Problem is that it could have been MUCH MUCH faster if a desktop 3080 chip was used.

    Performance isn't "bad" strictly speaking. It just could and should have been better. If I'm paying xx80 class price, I better be getting xx80 class performance.
     
  30. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Current estimates are that the drop to the 5nm process may produce a "Pascal-like" generational leap for both AMD and Nvidia.

    That's really why I bought this cheap-ish ($899) RTX 3050 Ti laptop to get me by until the RTX 40 series. There was no point in dropping serious money on an RTX 3070/3080 laptop when they weren't even much faster than my 200W RTX 2080. I can live with the 4GB VRAM by reducing settings, it's a low-end card after all.
     
    JRE84 and Clamibot like this.
  31. Cylix101

    Cylix101 Notebook Consultant

    Reputations:
    72
    Messages:
    236
    Likes Received:
    132
    Trophy Points:
    56
    JRE84 likes this.
  32. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    To be fair, RDNA 2 for example already delivered a 50% increase in performance per watt over the previous generation... and RDNA 3 is stated to do the same.
    AMD is also slated to release consumer RDNA 3 GPUs with chiplets late next year (whereas NV will be behind AMD on that front by a year - AMD is also releasing MI200 accelerators for data centers that use MCM late this year)... I was thinking of waiting until late next year (or early 2023) to replace what I have (depending also on whether professional software will include GPU acceleration support for AMD GPUs, because right now that's excessively sparse).


    No chiplets though.
    AMD will also have a big upgrade with RDNA 3, which should bring another 50% increase in performance per watt (pure uArch changes, independent of the manufacturing node) along with chiplets (on consumer GPUs - or at least the upper high-end ones - mid-range ones will apparently still be monolithic).
     
    JRE84 likes this.
  33. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    5-phase 30A VRMs are already a hard physical limit at ~150W (the quick math is sketched below). In that scenario shunting wouldn't accomplish anything except tripping overcurrent protection every time a phase goes over 30A, if there is any. What's the VRM config on BGA systems for the 3080 mobile?
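
    The ~150W figure falls straight out of the phase arithmetic; a back-of-the-envelope sketch, where the ~1V core voltage is an assumption:

    ```python
    # Back-of-the-envelope VRM ceiling: output power is roughly the number of
    # phases times each power stage's rated current times the core voltage.

    phases = 5
    amps_per_phase = 30.0   # A: rated limit of each power stage
    vcore = 1.0             # V: assumed GPU core voltage under load

    ceiling = phases * amps_per_phase * vcore
    print(f"theoretical VRM ceiling: {ceiling:.0f} W")   # 150 W, zero margin
    ```

    Real cores swing roughly between ~0.7V and ~1.1V depending on load and clocks, so the ceiling moves with voltage, but it shows why a 150W-class VRM has no room for a shunted card.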

    The only reason laptops are so far behind desktops is the thin and light trend. It's just not possible to dissipate 320W from a GPU in a laptop that is 5mm thick; also, the battery life would be like 15 minutes. My 3080 Ti is 3 slots tall, that's like 3 thin and light laptops, lol. There used to be a niche for thick desktop replacement laptops. That niche is gone now.

    Dynamic Boost would be a great idea if the laptop GPU could take, let's say, 200W in normal mode and up to 300W with dynamic boost. But as it is now, what does it take? 30W extra from the CPU? To a total of 165W?

    Hopefully with AMD FSR the GTX series will get a nice boost in performance now, since Nvidia DLSS is just for RTX graphics. But to be honest there isn't much of a difference between DLSS and FSR, they look equally good. FSR 2.0 will be even closer to DLSS 2.0.

    What pisses me off is the naming scheme. Why not call the mobile 3080 just a 3060 Ti (which is how it performs), or call it a 3080M? The desktop 3080 and the mobile 3080 are worlds apart in everything.

    Rumors are that the 4090 is going to perform 2x the 3090, at the cost of like 450W TDP vs 350W TDP. Kinda frustrating how every gen desktop GPUs get more and more performance increase over the last gen and laptops get less and less.
     
    JRE84, seanwee, krabman and 3 others like this.
  34. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Diminishing returns over previous generations until a real breakthrough comes, if any. Those max power requirements for desktop cards, wow. At least we can get all sorts of real info from people who took the plunge and from benchmarks. I saw that Samsung, I think, is now mass-manufacturing 4K/90Hz laptop screens; it would be nice to get those across the board at this point, since 4K has been out a long time now. We should have 2K as the new standard in gaming-oriented laptops by now though.

    https://www.notebookcheck.net/Samsu...nd-4K-OLED-displays-for-laptops.562346.0.html
     
    JRE84 likes this.
  35. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    I agree with most of your post but the part quoted above may not be correct: The professor tells me that they'll be back when the semiconductor shortage ends.
     
    JRE84 likes this.
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    Here is one more that comes with too weak a power adapter. Pretty disgusting. And Razer is well known for bloated batteries after some time.

    We're able to record a maximum draw of 235 W from the medium-sized (~17 x 7 x 2.5 cm) 230 W AC adapter. The charging rate will slow significantly when running stressful loads as a result. Yep, forget those 4 extra bins on the CPU.

    https://www.notebookcheck.net/Razer...-130-W-TGP-GeForce-RTX-graphics.561887.0.html
     
    JRE84 and seanwee like this.
  37. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    On the 3080 GP/GE 66/76, the VRMs are 6-phase with 50A power stages; 5 phases on the 3070 models.



    With MUX switches you can have long battery life without compromising performance nowadays. And even current cooling systems in medium-sized laptops (GP66) can handle shunted 200w GPUs like a champ (temps in the mid 60s). The problem is manufacturers/nvidia not putting more faith in laptop cooling systems and cranking up the TDP.



    The best combo nowadays is to bypass the CPU TDP in the BIOS (imon slope + offset) combined with a shunt mod. Then the only limiting factor is how much power the manufacturer lets the laptop draw.



    I'll be upgrading to a "4080" laptop when it comes out, and I've already given up hope that laptops will match desktops for the next few years at least. I believe we're in the power-hungry GPU phase again, ala Fermi, and hopefully Hopper will be what Kepler was to Fermi and bring power draws back in check.

    That said, if I get 4K 120Hz performance in AAA games I'll be satisfied, I suppose. And considering the 4090 is rumored to be 2x the 3090 at just 100w more, there will still be a substantial uplift in performance per watt. Which means at current laptop TDPs, 3090 performance is well within reach.
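
    The rumored numbers cited earlier in the thread bear that out; a quick check:

    ```python
    # Perf-per-watt check on the rumored 4090 vs 3090 figures quoted above:
    # 2x the performance at 450 W vs 350 W.
    perf_ratio = 2.0
    power_ratio = 450 / 350                  # ~1.29x the power
    print(f"perf/W uplift: {perf_ratio / power_ratio:.2f}x")   # ~1.56x
    ```

    Applied at a fixed laptop TGP, a ~1.56x perf/W uplift is what would pull desktop-3090-class performance into reach, which is the reasoning behind the claim above.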
     
    Last edited: Sep 21, 2021
    JRE84 and Clamibot like this.
  38. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    I don't think we will see xx80 desktop graphics cards below 320W anymore. But gaming laptops... they will get thinner, with the cooling cut down to match.
     
    JRE84 likes this.
  39. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    While there will be thinner laptops, it is nearly impossible for medium sized laptops to go away, and those are enough for me. DTR tdp is just a shunt mod away.
     
  40. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76



    I don't know about MUX switches, but if, let's say, a laptop needs 200W total to play games, and the most common batteries are 90Wh, then the battery will last less than half an hour while playing. That is physics and the law of energy conservation. For the battery to last longer, there has to be some kind of power limitation while on battery. If on battery the laptop can play games at 100W, it will last a bit less than 1 hour. But you have to take a performance loss; it can't be any other way.
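
    The runtime arithmetic is simple enough to sketch (the 90% efficiency derating is an assumption for conversion and discharge losses):

    ```python
    # Battery runtime is capacity over draw; losses only make it worse.
    # Wattages mirror the examples in the post above.

    def runtime_minutes(capacity_wh: float, draw_w: float,
                        efficiency: float = 0.9) -> float:
        """Minutes of runtime, derated for assumed conversion losses."""
        return capacity_wh * efficiency / draw_w * 60

    battery = 90.0   # Wh, as in the post
    for draw in (100, 200, 400):
        print(f"{draw:>3} W draw -> {runtime_minutes(battery, draw):.0f} min")
    # 100 W -> ~49 min, 200 W -> ~24 min, 400 W -> ~12 min
    ```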

    About the cooling: GPUs aren't that difficult to cool down; the thing is, a video card like the desktop 3080 needs around 300W at stock settings and can go as high as 350+W while boosting. I don't think a mid-size laptop can cool that much power; a desktop replacement probably can. A desktop replacement laptop would use around 400W total while playing games, but that's less than 15 minutes on a 90Wh battery. Of course I don't care about battery mode, but most people do.
     
  41. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    I don't think he was saying you get magic power with a MUX switch; rather, that you can have power when you need it by switching to dgpu and then switch back to the onboard garbage when you don't need the power and get extended battery life in that mode.
     
    Spartan@HIDevolution and seanwee like this.
  42. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    yes, the problem with it is that most laptops don't have MUX switches. For normal non-gaming or light graphical use most iGPUs are fine and save a lot of power, if battery time is of importance. It's kind of hard to get that information about the MUX though; eluktronics tongfangs tout it on their website, so I would guess most tongfang chassis have it. Mainstream major brands probably don't though.
     
    Spartan@HIDevolution likes this.
  43. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Yeah, but this thread is about laptops with 3080s, not the plodding tedium. Presumably maximizing gaming performance is high on the buying criteria for most 3080 buyers? Getting extra battery life if the need comes up is a nice bonus, even if you don't need it. I fall into that category; I've yet to use the laptop unplugged and have no plans to do so, but it's nice to know I can. I've got a lot of disposable income though, and have multiple laptops for different needs. Most won't have that luxury, but if they come here and peruse these threads they'll know to check for the MUX when buying, if they want it. Along with how much power will be available, whether it can game without throttling... You know, all the stuff you shouldn't have to know but need to if you want to buy without getting burned in the current laptop market.
     
    seanwee likes this.
  44. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    So the battery life while playing is the same with or without a MUX switch. Gaming laptops and batteries are natural enemies. To be honest, it has never crossed my mind to play games while on battery.
     
    Spartan@HIDevolution likes this.
  45. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,722
    Trophy Points:
    931
    What a failure... Nvidia, AMD and the OEMs all enjoy screwing their customers. This applies even to flagships like the 3080 mobile.

    Simple optimization guide (AMD Ryzen)

    A Disable CPU Turbo
    When Turbo-mode is disabled all CPU cores run @ 3.2GHZ which will reduce power consumption & drop the temperatures up to 10C (in performance mode).
    On the plus side of things the power savings on the CPU will be utilized (more) by the Nvidia GPU.
    The RTX3070 mobile chip for example will almost constantly draw 140W while gaming.
    When the CPU turbo is enabled the wattage on the GPU will occasionally drop to the stock 125W and will only boost to 140W under certain conditions.
    So the CPU performance loss will vary per application, in GPU heavy loads for example there will be almost no (noticeable) performance loss because your GPU can draw more wattage.
    http://forum.notebookreview.com/thr...-2021-discussion.836058/page-76#post-11119625
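
    On Windows, the usual way to do what that guide describes is to set the processor performance boost mode to Disabled. A minimal sketch via powercfg (the linked guide may use a different method, e.g. a vendor tool):

    ```python
    # Minimal sketch: toggle CPU turbo/boost on Windows by setting the power
    # plan's "processor performance boost mode" through powercfg.
    # Needs an elevated (administrator) prompt; Windows-only.
    import subprocess

    def set_cpu_boost(enabled: bool) -> None:
        """0 = boost disabled (cores stay at base clock), 2 = aggressive default."""
        mode = "2" if enabled else "0"
        # AC (plugged-in) profile; /setdcvalueindex covers the battery profile.
        subprocess.run(
            ["powercfg", "/setacvalueindex", "scheme_current",
             "sub_processor", "perfboostmode", mode],
            check=True,
        )
        # Re-apply the active scheme so the change takes effect immediately.
        subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

    set_cpu_boost(False)   # trade peak CPU clocks for lower power and temps
    ```

    With boost off, the CPU power saved inside the shared budget is what Dynamic Boost hands back to the GPU, which is why the guide sees the RTX 3070 mobile holding 140W more consistently.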
     
    Spartan@HIDevolution and seanwee like this.
  46. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Mobile Ryzen is a failure to me, when disabling turbo is the most common recommendation to improve the experience.
     
    Spartan@HIDevolution likes this.
  47. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    No, absolutely not. You're exactly right about battery life; so much power only goes so far. The MUX switch allows you to maximize performance, but also to run in hybrid mode and sip power when you're not needing that performance. Run on the dGPU and your gaming time will be just as sweet as plugged in, but very, very short. Realistically, a MUX switch just gives your laptop more capability, extends its range. Yes you can go full blast, but you can also sip battery at need by staying on the embedded graphics adapter. Do that and the lights stay on longer, but gaming performance goes into the toilet. Right now there is no ideal situation, but consider my last laptop: I had only the GTX 1080, there were no embedded graphics. I had the options of full blast or lights out in 15 minutes. The new laptop with the MUX adds the capability of actually getting (somewhat more) time on battery. It's not an option I use, and I do question how many people buy full-size laptops with 3080s with battery life in mind, but as I said, it is nice to know it's there if you need it.

    There is more to it than just that in regards to mobo layouts and lost performance but that's too much typing, you can search it up quickly if you're of a mind.
     
    Spartan@HIDevolution likes this.
  48. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    This is a discussion forum, there is no tedium, sorry, it doesn't work like that, sheesh some people.... Yes, a MUX switch is nice because you can switch manually if you have the option, or have your computer switch automatically to low power or high power depending on the usage profile. It does have its bugs that I've noticed, i.e. in some situations it does not really switch too well, limiting some games etc. when they are not limited with just the dGPU; there is overhead with the iGPU when you are in hybrid mode too, not much but it's there. What would be nice is if a BIOS-based MUX was standard, instead of a lot of hybrid or Optimus systems, to give the user the ultimate decision. You can bypass the iGPU if you output to an external screen though. Gaming on a laptop, I can't see being too battery-time conscious though.
     
  49. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    The plodding tedium I was referring to are the vast majority of laptops which are largely unremarkable.

    I agree with you on the main: A physical MUX with BIOS level integration is the way to go. I have it now and it flat works. Should be standard on every serious gaming laptop until the hardware iterates past the need.

    I edited this post to stay on track rather than the digression I had thrown in.
     
    Last edited: Sep 23, 2021
  50. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    This

    A MUX switch doesn't magically transform your laptop battery into a portable nuclear reactor lol. It just enables you to have unrestricted performance while also having the battery saving capabilities optimus/hybrid mode brings. Battery saving as in the GPU isn't draining power when you don't need it, so you can still have 8-10 hours of battery life when doing light tasks and not 2-3 hours.

    Gaming on battery has always been a no-go. Performance drops significantly, you harm battery lifespan due to the high power draw, and you get an hour of mediocre performance at best.


    For people who actually spend time researching their laptop purchases (i.e. people who actually care about performance), a MUX switch is a must nowadays. A 3080 laptop without one performs like a 3070 laptop with one. And from reading reddit/youtube comments, if a laptop doesn't come with a MUX switch it is immediately disregarded, sometimes unfairly so for lower-priced laptops that can't fit a $50 piece of hardware into the budget.


    Well yes, but it's not that simple. I've done the calculations and a "Max-P" desktop 3080 chip would land around the 200-220w range, right in the domain of medium-sized laptops with proper cooling.

    Fyi, laptop GPU TDPs and desktop GPU TDPs are calculated differently. Where a desktop GPU also factors the power draw of the RGB lights, fans, video output and so on into its TDP, a laptop GPU only counts the GPU die and the VRAM. On top of that, Max-P laptop GPUs typically run at 90% of desktop performance (assuming core count parity) while drawing only 80-85% of the power to do so, since they sit further down the efficiency curve.

    A laptop with a desktop 3080 chip is very possible; that's what made me so frustrated to hear that they were selling 3070 chips as laptop 3080s when they were announced.
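
    One possible reconstruction of that 200-220w estimate, using the ratios stated above; the board-overhead figure is an assumption, not the poster's actual calculation:

    ```python
    # Rough reconstruction of the "Max-P desktop 3080 in a laptop" estimate.
    # The overhead number is assumed for illustration.

    desktop_tgp = 320.0        # W: desktop 3080 board power
    power_scaling = 0.825      # Max-P draws ~80-85% of desktop power (per post)
    board_overhead = 40.0      # W: fans, RGB, conversion losses excluded from
                               # laptop-style TDP accounting (assumed)

    laptop_style_tdp = desktop_tgp * power_scaling - board_overhead
    print(f"laptop-accounting TDP: {laptop_style_tdp:.0f} W")   # ~224 W
    ```

    The result only roughly reproduces the quoted range, but it shows how the different TDP accounting closes most of the apparent gap.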
     