The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How will Ampere scale on laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
  2. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Papusan and hfm like this.
  3. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    https://game-debate.com/news/29159/...-performance-analysis-and-graphics-comparison

    "
    Does DLSS affect image quality?
    DLSS 2.0 does somewhat affect image quality in Death Stranding, as an upscaled image from 1080p or even 1440p won't be nearly as sharp as native 4K. (...)

    DLSS also struggles to properly upscale small particle effects from lower resolutions, resulting in often blurry, less sharp particle effects. In Death Stranding this is most notable when it rains; in our experience Quality mode was understandably a lot better, and the effect didn't detract from our enjoyment of the game. If the strange artifacts and blurring bother you, it might be best to play at native 1440p instead of 4K DLSS, but the bottom line is that this is down to personal preference.
    "

    IMHO it's clear. DLSS trades off image quality for performance. The lower VRAM utilisation can only be achieved by lowering the texture quality; that's CS 101. Devs can do that because the average gamer can't tell the difference. It's also possible that they have to lower texture quality, either because of the amount of VRAM required by DLSS itself, or because it's easier for DLSS to upscale objects with blurry textures, given how it apparently struggles with the fine detail of particle effects.

    Details like that would explain why access to DLSS documentation likely requires an NDA :)
     
    Last edited: Jan 25, 2021
  4. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I think I posted that a while back after watching a comparison video... DLSS 2.0 does for sure look better than native
     
  5. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Yeah, I look better on Snapchat than native too lol, but that doesn't mean that the heavily processed image has the same image quality and detail.
     
  6. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I don't normally argue with you, but Digital Foundry did an 800x zoom on quality, and clearly 4K DLSS 2.0 looked crisper and had comparable image quality; 4K had more definition but looked duller... overall, rep to DLSS 2.0



    This is 540p vs 1080p.

    Imagine 4K... DLSS 2.0 better by a long shot.
     
    Last edited: Jan 25, 2021
  7. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    To be fair, it is "fake" detail that is being generated. But I'd argue that if it looks great, who cares?
     
    hfm and JRE84 like this.
  8. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Yeah, to my eyes 1080p upscaled with DLSS 2.0 looks sharper. One can argue you could sharpen your 4K image, in which case looking better might be in 4K's favor, but performance??? Who cares when they look so close; I'll take the extra 15-30 fps.
     
  9. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    So you get proof VRAM usage drops with DLSS, which is to be expected, then move the goalposts with "it's a one-game sample..."
     
    ars92, electrosoft and NuclearLizard like this.
  10. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    To add more info.. The only game I currently have installed that supports DLSS is Wolfenstein: Youngblood.

    I used my panel's 2560x1600 resolution to check DLSS VRAM usage of the game with various settings from disabled, DLSS Performance, DLSS Balanced and DLSS Quality.

    For some reason the RTSS OSD disappears when I turn OFF DLSS, which is wacky; apparently some bug in RTSS. But between the Performance, Balanced, and Quality DLSS settings, VRAM usage increased by a little less than 100 megabytes per step from Performance to Balanced to Quality on ~high settings.
    [attached screenshot]


    I then realized that the game has its own built-in metrics, so I cranked the quality up to UBER MENCH and took 4 readings there.
    [attached screenshot]

    So that's two games now where using DLSS reduces VRAM usage, and using lower-quality DLSS settings reduces it more than higher-quality DLSS settings. This all makes perfect sense given how it works. I imagine these memory usage numbers would swing even more at 4K.
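    As a back-of-the-envelope check of the mechanism (a sketch, not the game's actual allocation): DLSS renders internally at a reduced resolution, so resolution-sized render targets shrink while the output buffer stays at panel resolution. The per-axis mode scales below are the commonly cited DLSS 2.0 values; the aggregate bytes-per-pixel figure is an assumed placeholder, since real engines allocate many targets (HDR color, depth, normals, motion vectors, history buffers, ...).

        # Rough estimate of resolution-scaled render-target memory per DLSS mode.
        OUTPUT_W, OUTPUT_H = 2560, 1600  # the panel resolution used in this test

        DLSS_SCALE = {          # approximate per-axis internal render scale
            "Native (off)": 1.0,
            "Quality":      2 / 3,
            "Balanced":     0.58,
            "Performance":  0.5,
        }

        BYTES_PER_PIXEL = 100   # assumed aggregate across all scaled targets

        for mode, s in DLSS_SCALE.items():
            w, h = int(OUTPUT_W * s), int(OUTPUT_H * s)
            mb = w * h * BYTES_PER_PIXEL / 2**20
            print(f"{mode:13s} {w:4d}x{h:<4d} ~{mb:4.0f} MB in scaled targets")

    With these assumptions the step between adjacent modes lands in the tens of megabytes, the same order as the readings above; the exact size depends entirely on how many targets the engine ties to render resolution.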
     
    Last edited: Jan 25, 2021
    NuclearLizard likes this.
  11. vala4ev

    vala4ev Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    9
    Trophy Points:
    6
  12. DRevan

    DRevan Notebook Virtuoso

    Reputations:
    1,150
    Messages:
    2,461
    Likes Received:
    1,041
    Trophy Points:
    181
  13. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    The graph is not to scale, which exaggerates the point. The differences are actually not that dramatic: less than 10%.

    DLSS does reduce frame buffer memory requirements if multi-sample antialiasing is enabled, but that wouldn't explain a 400MB drop. Something else must have changed as well, and clearly DLSS has memory requirements of its own at the same time. The fact that Nvidia doesn't share many details of the proprietary implementation is suspect.

    DLSS 2.0 also attempts to sharpen the image to make it look "crisper", which has its pros and cons. Not even Nvidia would claim that the resulting image quality is better than rendering at native resolution, but it's a really cool smart-upscaling trick and works well in practice for gaming purposes. Surely 4K DLSS looks better than FHD, and for most people the question of whether 4K native or DLSS looks better on Ultra settings is moot, because only DLSS is usable.

    Worth noting that Nvidia bet heavily on this; for example, DLSS runs on the Tensor cores that many criticised as wasted die space, so no doubt there is a good number of sponsored reviews out there.
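    For a sense of scale on the multi-sample point (a worked sketch; the 12 bytes per sample for color plus depth is an assumed figure, and real target formats vary), compare a single 4x MSAA target at native 4K with one at the 1080p internal resolution DLSS Performance uses for a 4K output:

        $$ \text{MSAA target size} = W \times H \times N_{\text{samples}} \times B_{\text{sample}} $$
        $$ 3840 \times 2160 \times 4 \times 12\,\mathrm{B} \approx 398\,\mathrm{MB}, \qquad 1920 \times 1080 \times 4 \times 12\,\mathrm{B} \approx 100\,\mathrm{MB} $$

    Whether savings of that size actually materialize depends on how many such targets the engine scales with render resolution.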
     
    Last edited: Jan 26, 2021
  14. vala4ev

    vala4ev Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    9
    Trophy Points:
    6
    [attached screenshots]
     
    Sk0b0ld, ars92 and Papusan like this.
  15. DRevan

    DRevan Notebook Virtuoso

    Reputations:
    1,150
    Messages:
    2,461
    Likes Received:
    1,041
    Trophy Points:
    181
    Thanks!
    That CPU score is strange... in Fire Strike it is better than a stock 9900K, but in Time Spy it is worse. Maybe 3DMark is not patched yet and Time Spy does not utilize 100% of the new AMD CPU?
     
    Last edited: Jan 26, 2021
  16. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Not a fan of critical thinking? Just curious how it works dude, and yes - one data point was not enough.
     
    Last edited: Jan 26, 2021
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    My stock 9900K is above 25K. The processor in that screenshot probably throttles hard in the Time Spy CPU test. It can't keep up after the graphics tests are done because the unified heatsink is already heat-saturated.
     
    Normimb, captn.ko and etern4l like this.
  18. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Of course I changed the scale to make it easier to see. The point is the fact that DLSS uses less VRAM. Period.
     
    seanwee likes this.
  19. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Yes, I am a fan of critical thinking. My favorite podcast is the Skeptics' Guide. But these are just simple facts, given how DLSS works.
     
  20. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Right, the frame buffer savings and texture quality reduction, except apparently DLSS 2.0 uses high quality textures. It's too complicated for me, as long as you understand all the simple facts, I'm happy. You wouldn't be able to reveal more due to the NDA you've signed anyway.
     
  21. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156


    Big improvement at 4K gaming. No improvement at 1080p. Interesting.
     
    Papusan and etern4l like this.
  22. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Ehh? NDA.. no conspiracy theories here man come on.
     
  23. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    1080p is more likely to be CPU bound. Also with all due respect for LTT, let's wait for something more in depth. :)
     
    Last edited: Jan 26, 2021
    hfm, Papusan and Normimb like this.
  24. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    No conspiracy. From what transpires in other fora, DLSS 2.0 is a fairly complicated system, and if you go to the Nvidia website trying to find docs, it looks like major-developer access only, likely protected by an NDA.
     
  25. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Alienware M15/M17 R4 will have the lower TGP config of the 3080.


    [attached screenshot]
     
    Sk0b0ld and etern4l like this.
  26. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Can you ask them about the soldered RAM and WiFi?

    What about Area 51M?
     
    Last edited: Feb 9, 2021
    ars92 likes this.
  27. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    The guy at Dell is completely lost. He confirms soldered RAM, but now he is changing his mind on the TGP. He told me 80-150+ watts. It just proves how difficult it will be for customers to know what they are really purchasing.
    [attached screenshot]
     
    Papusan and etern4l like this.
  28. Sk0b0ld

    Sk0b0ld Notebook Consultant

    Reputations:
    126
    Messages:
    176
    Likes Received:
    220
    Trophy Points:
    56
    [image attachment]
     
    Normimb likes this.
  29. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    He’s just quoting the Nvidia spec sheet lol.
     
    ars92, Papusan, etern4l and 1 other person like this.
  30. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    LOL.
    OK, as I thought, the guy was confused. I insisted and got a better answer: 125 watts for the 8GB and 165 watts for the 16GB.

    [attached screenshot]
     
    Papusan, etern4l and Sk0b0ld like this.
  31. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Hmm, yeah, Ampere TGP can be confusing.
     
    Last edited: Feb 9, 2021
    ars92, seanwee, Normimb and 1 other person like this.
  32. mehtenj94

    mehtenj94 Notebook Consultant

    Reputations:
    32
    Messages:
    105
    Likes Received:
    74
    Trophy Points:
    41
    There is zero chance the m17 R4 has anything less than a 115W 3080. The Time Spy overall score in the TechRadar review is 12100, which I'd assume can only be reached by a high-TGP 3080.
     
  33. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181

     
    Last edited: Jan 26, 2021
    etern4l likes this.
  34. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    He probably mixed up and missed hard as usual. The smaller brother m15 R4 has the more castrated 3080 with 8GB of VRAM. Hence Dell only provides a 240W PSU for the unlocked 10980HK and the 125W 3080 LOOL https://www.dell.com/en-us/shop/gam...-m15-r4-laptop/wnm15r450h?view=configurations

    -----------------------------------------------------------------------

    Performance Test: GeForce RTX 3070 Laptop & RTX 3080 Laptop

    TDP chaos. It was already fairly complicated to assess a gaming notebook's performance due to the various factors involved, such as for example the cooling system. With their current Ampere generation of GPUs, Nvidia upped the ante and made the hunt for the ideal laptop even harder.

    Due to the extremely high variances in TDP, performance between two seemingly identical GPUs can differ drastically. Unfortunately, those differences are often concealed at the time of purchase unless sellers or OEMs fully disclose the GPU’s configured TDP. Given that in the past some OEMs and sellers chose to avoid disclosing the use of Nvidia’s Max-Q variant instead of the regular chip, you will forgive us if we are rather skeptical.

    As you can see from Dell... not a single word telling how castrated the 3080 mobile card in the AW m15 R4 is. Neither on the sales page nor in the spec sheet.
     
    Last edited: Jan 26, 2021
    seanwee, etern4l and Normimb like this.
  35. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    There's plenty of other posts and analysis without needing access to walled docs. Just out of my own curiosity I'll install all the games I own that support DLSS and do this analysis. I'm expecting nothing surprising, but you never know! I'm also expecting rather small but consistently predictable differences at 1600p. 4K is where it should be much more interesting, as DLSS is much more effective at higher resolutions. It did, however, help me in a non-trivial way even at 1080p in Cyberpunk; at 1600p the game was essentially unplayable unless I used it.
     
    etern4l likes this.
  36. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Good observation: a 240W power adapter with the K CPU.

    [attached screenshot]

    What a mess. So the 3070 outperforms the 3080...How confusing.
     
    Papusan and etern4l like this.
  37. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    This could result in some unnecessary confusion.
     
    Last edited: Feb 9, 2021
    Normimb and Papusan like this.
  38. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    2080>3070

    not good
     
    seanwee, Clamibot, Normimb and 2 others like this.
  39. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    As you know.... Dell prefers software-tuned crippling...
    How Dell cripples performance, explained by......

    And I'm sure they will take it to the utter end with the Ampere models. Why provide proper power adapters when you have tech like the below? A 240W adapter is so cute, and it is cheaper. Big win.. For Dell. Not for the customers.

    Dynamic Boost 2.0

    Among the new features Nvidia brings to its 30-series is an improved Dynamic Boost 2.0. Dynamic Boost works by shunting power from the CPU to the GPU when needed, giving the GPU 15 additional watts of power.

    Nvidia raises the AI catchphrase and says Dynamic Boost 2.0 can increase performance by 16 percent. Dynamic Boost 2.0 previously only allowed power to be shifted from the CPU to the GPU, but now it can actually flow back to the CPU from the GPU if the game is CPU-bound rather than GPU-bound. Nvidia said it can also now balance power with the graphics memory.

    On laptops that support it, Dynamic Boost 2.0 will be on all the time, without the option to turn it off, as in the previous version. On older laptops that support Dynamic Boost, you will continue to be able to turn the feature off.

    While Dynamic Boost 2.0 can pretty much be supported by any RTX laptop, it’s up to the laptop maker to enable it. We didn’t get deep into this feature here, but we plan to soon.
    https://www.pcworld.com/article/360...view-the-fastest-laptop-gpu-just-arrived.html
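    Nvidia hasn't published how Dynamic Boost 2.0 decides where the watts go, but the behavior described above amounts to shifting a small shared budget toward whichever side is the bottleneck. A toy sketch of that idea (baselines, thresholds, and step size are all made up for illustration):

        # Toy model of bidirectional CPU<->GPU power shifting a la Dynamic Boost 2.0.
        CPU_BASE_W, GPU_BASE_W = 45, 115   # assumed baseline power limits
        SHARED_BUDGET_W = 15               # the floating budget the article mentions
        STEP_W = 5                         # watts moved per adjustment tick

        def rebalance(cpu_util, gpu_util, gpu_bonus_w):
            """Shift the shared budget toward whichever side is busier."""
            if gpu_util > cpu_util:
                return min(SHARED_BUDGET_W, gpu_bonus_w + STEP_W)
            if cpu_util > gpu_util:
                return max(0, gpu_bonus_w - STEP_W)
            return gpu_bonus_w

        # A GPU-bound frame: the budget flows to the GPU (115W base, up to 130W).
        bonus = rebalance(cpu_util=0.60, gpu_util=0.99, gpu_bonus_w=10)
        print(f"GPU limit: {GPU_BASE_W + bonus}W, "
              f"CPU limit: {CPU_BASE_W + SHARED_BUDGET_W - bonus}W")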


    Welcome to 2021


    Mobile GPUs are now all "laptop GPUs"
    Nvidia is now calling all of the new mobile graphics cards "laptop GPUs". That sounds like a minor aside, but it isn't. Quite the contrary.

    Once in 2015 and back
    As a reminder: Until the end of 2015, Nvidia had marked mobile GPU variants as M versions. This meant, even if it was definitely not clear to all customers, that a GeForce GTX 980M did not come close to a GeForce GTX 980 for desktop PCs.

    In September 2015, with the GeForce GTX 980 without the "M" suffix, Nvidia then presented for the first time a notebook GPU that virtually matched the performance of the desktop version. It was based on the same GPU and used the same memory; only the power consumption was slightly reduced. This was possible because Maxwell could already be run very efficiently and economically on the desktop, facing no direct resistance from AMD.

    With the mobile GeForce GTX 1000 series, Nvidia made this naming change across the whole lineup: because all mobile GPUs were supposed to be at most 10 percent slower than the desktop version, they were named like their desktop counterparts. A new addition were the particularly efficient Max-Q variants, throttled further in power consumption and again supposed to be at most 10 percent slower. This promise also held up in reality.

    The system was also retained in the subsequent RTX-2000 generation, although it was already falling apart, because the gap between GPUs for desktop PCs and notebooks widened.

    Now, with the "Laptop GPU" suffix used throughout the series, Nvidia is effectively returning to the "M" era that preceded the GeForce GTX 980. Why? A look at the key technical data of the new mobile variants makes it clear.

    Why should Nvidia carry on as with Maxwell and Pascal mobile? There is no profit in graphics that give you good-enough performance for more than a year before you need to upgrade your notebook. They have learned their lesson. Desktop-class performance only means a longer lifespan before you need to buy a new laptop again. Less profit in that, both for Nvidia and its partners (OEMs).
     
    Last edited: Jan 26, 2021
    acekard, seanwee, Normimb and 3 others like this.
  40. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    OK, so it sounds like in practice this Dynamic Boost 2.0 will only work if the CPU is lightly loaded, i.e. in old games which don't need Ampere ;)
    The obvious question is how people would know whether this is enabled or not. Does 130W mean 130W, or 115W + 15W? What does 125W mean?
     
    Last edited: Feb 9, 2021
    Normimb likes this.
  41. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Flow X13 disassembled, showing Asus' LM protection solution: a foam dam around the package and thermal paste/goop on the capacitors around the APU die. Looks like that goop saved the ones on the left of the chip from being shorted out. Kinda nasty though. Most people who use LM on GPUs go for nail polish/conformal coating and/or Kapton/electrical tape instead.

    [attached photos]
     
    seanwee and Papusan like this.
  42. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    One of the reviews of an Asus laptop showed it doing the actual 115/130 it claimed in the specs; if I remember correctly it was 114 and 127, with a little on the other side of the decimal point. That one had off, auto, and always-on in settings, but I noticed the benchmarks were all with it always on, so no answer as to what was gained. The benchmarks were solid on that one, although not as impressive as one might hope.
     
    etern4l likes this.
  43. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Nice. As a new owner of an Asus WiFi router which offers bucketloads of configuration options, I am not surprised by the superb configurability.
     
    Last edited: Feb 9, 2021
    Papusan likes this.
  44. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Pay more, get more, I guess, but it's still pretty disingenuous to label ALL of the models with the same numbers. I think a lot of people will just assume it's all the same if the labeling is the same and get ripped off; par for the course. On another note, the NBC/videocardz Fire Strike scores for the 3070 Max-Q are not that impressive: only 30% more than my ~95W 2060 refresh. What does that say for the 3060 laptop GPU's benefit? Not too promising. So a 3060 Max-P would, I'm guessing, equal a 3070 Max-Q, for a benefit of 30% over the 95W 2060 Max-???. A 3060 Max-Q less; I'd guess a 15% improvement over the 95W 2060, hmm. Then again, I probably got ripped off on the 1660 Ti mobile GPU new as well, but I was going from a 980M to a 2060, so it's a huge upgrade for me. And for what I paid on a Black Friday deal, 1660 Ti laptops are as much or more. May as well get the extra 5% and raytracing/DLSS capability.
     
    Last edited: Jan 26, 2021
    etern4l likes this.
  45. Arog

    Arog Notebook Consultant

    Reputations:
    15
    Messages:
    252
    Likes Received:
    44
    Trophy Points:
    41
    I suppose the dynamic boost is impressive and allows more power to go to the GPU when the CPU is not needed, which gets it somewhat closer to desktop 3080 clock speeds. Linus shows a big performance gap at 4K with the 3080 dominating the 2080 Super, but in that benchmark the GPU isn't getting a constant 60fps at 4K. 4K is unnecessary on a 17-inch laptop anyway. I'm still in the all-I-need-is-60fps crowd, though I would have been willing to upgrade for 4K 60fps performance. The issue is that there is no way you will have good thermals and good acoustics when running 4K at 60fps with this generation of GPUs unless the laptop is quite big and hefty.
     
  46. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Depends on the conditions under which this feature still works. Is it still active in open-world, CPU-heavy games like CP2077?

    Linus showed what he was contracted to show; let's wait for more independent reviews.
     
  47. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Similar benefit between the 60/70 tiers when compared with the 20 series in UL: the scores indicate the 2070 Max-Q is about a 12.2% benefit over the 2060 refresh (which I'm assuming is the 115W version, with the old one the 80W version?). I'm assuming a 3060 Max-P would be 15% lower than a 3070 Max-Q, but it seems like that doesn't work out; where would a 3060 Max-Q's benefit over the 2060 refresh come from at all? Another 1660 Ti/2060 case of like 5-10%? Of course at average new pricing they're close, if you compare the low-end $999 3060 laptops with the still-available new 2060 laptops.

    According to UL the 2060 refresh is a ~60% benefit over a 1060 mobile; that's in line for the 2060 refresh/Max-P versus the 3060 Max-P, but imagine getting a 3060 Max-Q: almost no benefit!! Maybe 5-10%, AND access to better DLSS/raytracing and the other new tech. I get 6120, and the mean (I'm assuming) UL score for the 2060 refresh is 6183. I was certainly ripped off with the 2060 from a labeling POV: a 22% difference between desktop and laptop with the same name, ugh.

    Good luck to UL etc. trying to distinguish on an official level between low- and high-powered versions. Of course the scores would indicate it, but how do you put that in the graph? Just one big mean score for all of them: confusion. My head hurts now.

    https://benchmarks.ul.com/compare/b...RE&reverseOrder=true&types=MOBILE&minRating=0
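    The comparisons above all boil down to one ratio; a quick helper using the two scores actually quoted in this post (the 30%-higher score is hypothetical, just a sanity check of the formula):

        # Relative uplift between two benchmark graphics scores, in percent.
        def uplift_pct(new, old):
            return (new / old - 1) * 100

        print(f"2060mRF mean vs. my run: {uplift_pct(6183, 6120):+.1f}%")  # ~ +1.0%
        print(f"Score needed for a +30% card: {6120 * 1.30:.0f}")          # ~ 7956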
     
    Last edited: Jan 26, 2021
  48. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Maybe when he told me that the TGP of the card was 80-150+ watts he was right?
    Maybe he was trying to say: you first get 150+ watts for a few months, while all the reviews are being done..... and then with a magical BIOS/vBIOS update we will reduce your TGP to 80 watts for your best protection. LOL :D :D
     
    ars92 and etern4l like this.
  49. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Their strong financial results suggest that their average customers are happy. It works faster than a Surface Pro, iPad, or a MacBook Air, so what's the problem?
     
    Last edited: Feb 9, 2021
    seanwee and Normimb like this.
  50. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    As a reminder... Do some of you remember the 799MHz throttlegate with the first-gen AW BGA models running a 980M and a 180W PSU? Hybrid battery boost was finally born to replace proper power adapters. Dell tried a second time with the newer models here http://forum.notebookreview.com/thr...iew-by-ultra-male.828258/page-9#post-10895897

    Yeah, Dell shows its beauty :D http://forum.notebookreview.com/thr...ew-by-ultra-male.828258/page-14#post-10929641

    A 240W PSU with a 3080 mobile, even if it's castrated, is darn disgusting when you know it was too weak for previous models.

    Yeah, the new Ampere models will most likely be added to the list... https://www.dell.com/support/kbdoc/...r-battery-drain-while-ac-adapter-is-connected
     
    Last edited: Jan 26, 2021