The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How will Ampere scale on laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Edit: I made an error in the comparison, so I deleted this post. Sorry.
     
    Last edited: Jan 24, 2021
    Papusan likes this.
  2. captn.ko

    captn.ko Notebook Deity

    Reputations:
    337
    Messages:
    981
    Likes Received:
    1,252
    Trophy Points:
    156
    Looks like RT was on. At least that's what he says
    Screenshot_20210124-195015_YouTube.jpg
     
    Normimb and etern4l like this.
  3. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    I was talking about GPU ("the card") undervolting using Afterburner. My remaining points stand I guess :)
    While we are at it, his CPU is slower too.

    My Spanish is a bit rusty, but the title of the YT video is:

    Shadow of the Tomb Raider Benchmark on RTX 3080 Mobile (ULTRA SETTINGS + RAY TRACING)
     
    Normimb and captn.ko like this.
  4. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Yep, ray tracing on would explain a lot. I mean, tbh, for that card it would be ridiculous to NOT have RT on. Good catch.
     
    etern4l likes this.
  5. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    I was wrong. He does have ray tracing on. Sorry @etern4l and @captn.ko, I will fix this post now.
     
    Last edited: Jan 24, 2021
    captn.ko and etern4l like this.
  6. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Isn't this supposed to be an enthusiast community? It seems sitting here comparing desktop cards with massive TGPs to laptop cards where the TGP varies wildly and cooling solutions are far inferior is folly. Why not just compare it straight up to the previous 20 series mobile silicon and figure out if it's worth upgrading from your current 20 series (or even 10 series) or not?

    Comparing against desktop cards with massive coolers and usually triple (at least dual) fans and heatsinks seems... silly, and a false equivalency?

    Even if nVidia is calling them both 3080 aren't we smart enough as enthusiasts to just know that's bollocks and not equivalent and do the mental comparison based on what we know of physics?
     
  7. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    AB-SO-LUTELY! Sorry I missed that detail I just assumed it was all laptop hardware being compared on a laptop site LOL
     
    hfm likes this.
  8. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    The 2080 Super 200w is considered a laptop card when I download the driver from Nvidia. It is also considered a laptop card when I compare it with similar systems in 3DMark. My system is a laptop with a desktop CPU, look at my signature.
    So far we have no information that there will ever be more than 150+ watts (200 watts for laptop) for the 3080. I was trying to compare last year's most powerful laptop card (2080 Super) to the most powerful laptop card we actually have information about, which is the 3080 at 130 watts so far.
    I don't doubt the 3080 laptop will beat all the previous generation cards, but like you I am interested in knowing if it's worth an upgrade $$ for me. So far I am disappointed, but I keep my mind open and look at the results we have right now.
     
    Last edited: Jan 24, 2021
    seanwee, Papusan, hfm and 1 other person like this.
  9. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Everyone makes mistakes.

    Would you mind running it with ray tracing, so we can compare?
     
    hfm, Normimb and etern4l like this.
  10. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Thank you @kevin.
    I am now trying to download the version with the Ray Tracing option. I will inform you of my results. I expect it to be much less than the 123 FPS.
     
    hfm and etern4l like this.
  11. captn.ko

    captn.ko Notebook Deity

    Reputations:
    337
    Messages:
    981
    Likes Received:
    1,252
    Trophy Points:
    156
    Expect 50% of your first run ^^
     
    hfm and etern4l like this.
  12. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    I will post the results, even if it makes my system look bad. I am trying to be as accurate as possible. Like everyone else, I am trying to guess what the new cards are really capable of.
    I will use his exact same settings (but with ray tracing on this time). No overclock, no undervolt.
     
    NuclearLizard, hfm and etern4l like this.
  13. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    IMHO the only disappointment should be with Dell for cynically not providing an upgrade path for users who spent thousands on a brand new 51M. I bet if you could get a GPU upgrade for a grand, and sell the old DGFF card on eBay, you would be much more inclined to look favourably on that proposition. That was the original idea of the 51M anyway (catching up to Clevo), and Dell completely dropped the ball on its execution.

    As for the 3080 200W, we should just wait for an announcement from Alienware. There has been no hint of a new 51M so far, they seem to be referring to the m15/m17 as the flagship in their pre-release marketing materials. The brand has been going downhill for a while, so it wouldn't be too much of a surprise if they cut their once ambitious flagship. Or maybe it's some sort of marketing strategy: lower consumers' expectations so they won't be disappointed with what they get: an ever more limited, non-upgradable "desktop replacement".
     
    Last edited: Jan 24, 2021
    Normimb, Eclipse251 and hfm like this.
  14. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156

    @kevin,@papusan, @Spartan@HIDevolution,@etern4l,@captn.ko,@JRE84, @hfm
    OK guys, earlier I made a big mistake comparing last year's strongest laptop card to this year's, using a different version of Shadow of the Tomb Raider for the benchmark.

    Now I have downloaded the correct version and ran a test, all stock for me (don't know about this guy), with the exact same settings this guy set.
    I have a much stronger CPU (10900K running at 49x), and this guy had an i7 (I think).
    This guy says he has a 130w card. (Is it 115w + 15w turbo like previously mentioned by @etern4l? I think so.)

    This guy's results: 79 FPS (3080 laptop, 130w)
    My results: 82 FPS (2080S laptop, 200w)

    This guy's 3080:

    3080 130 Watts results.png


    My result, 2080S:

    2080 super 200 watts ray tracin on 1  (28).png



    You will notice all settings are the same (I really hope I didn't make any mistake) and that the 3080 is only 4% slower (might be equal after a driver update) compared to the 2080S of last year. For sure the 150+ watts version will beat the 2080S, but not by the big margin we were expecting. I hope this helps the community evaluate the real performance of the 3080 compared to last gen.


    Edit: I just noticed that if you look at the lower right corner, the CPU and GPU scores are separated. The 3080 then beats my 2080S by a few frames.
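The "only 4% slower" figure works out from the two FPS numbers quoted above — a quick sketch in Python, using nothing but the values posted in this thread:

```python
# Relative-performance arithmetic for the two runs quoted above.
# FPS numbers are taken straight from the post; nothing else is measured.
fps_3080_130w = 79   # 3080 laptop @ 130 W (the YouTube run)
fps_2080s_200w = 82  # 2080 Super laptop @ 200 W (my run)

deficit = (fps_2080s_200w - fps_3080_130w) / fps_2080s_200w
print(f"3080 @130W is {deficit:.1%} slower than the 2080S @200W")
# -> about 3.7%, i.e. the "only 4% slower" above
```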
     
    Last edited: Jan 24, 2021
  15. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    My 190W GTX 1080 inside a 15” notebook says hi (desktop GTX 1080 FE is 180W). :hi:

    48AC6327-D3F8-4CE5-934C-C332F5620D52.jpeg
     
    TBoneSan and Papusan like this.
  16. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Many thanks bro.

    OK, so basically a thin and light (2.4kg) laptop with a 115W + maybe 15W chip (it's too bad the guy didn't capture the actual power draw) and a 45W TDP CPU (running at maybe 25W, I guess) is providing the same performance as a previous-gen 185W GPU plus a twice-the-TDP CPU in a desktop replacement. That's extraordinary if you think about it. The 185W/200W 3080 models will blast last gen to smithereens. On top of that, the new 3080 has 16GB of VRAM....
     
    Last edited: Jan 24, 2021
  17. captn.ko

    captn.ko Notebook Deity

    Reputations:
    337
    Messages:
    981
    Likes Received:
    1,252
    Trophy Points:
    156
    @Normimb
    Thx for the test.

    Just my 2 cents
    1. In CPU-limited tests your 5GHz 10-core is way in front of the 10870H. That has an impact on the overall score as well.

    2. The 130w 3080 is as fast as the 200w 2080S. We know that desktop Ampere is nearly as efficient as Turing, but it looks like notebook Ampere is more efficient (same score, but lower power).

    A 150w+ (200w) 3080 should beat the 2080S by 15-20% and could reach 3070/2080 Ti level. That's what I expect.
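Point 2 can be put in perf-per-watt terms — a rough sketch using the FPS figures posted earlier in the thread (79 FPS @ 130 W vs 82 FPS @ 200 W); illustrative only, since these are single benchmark runs:

```python
# Perf-per-watt sketch using the FPS figures posted earlier in the thread.
eff_3080 = 79 / 130    # ~0.61 FPS per watt (130 W mobile Ampere)
eff_2080s = 82 / 200   # ~0.41 FPS per watt (200 W mobile Turing)

gain = eff_3080 / eff_2080s - 1
print(f"mobile Ampere: ~{gain:.0%} more FPS per watt")  # roughly +48%
```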
     
  18. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    You think 50 to 75% more power (seems the baseline here is 115W) will yield 15-20% better performance? Sounds like a very conservative estimate.
     
  19. captn.ko

    captn.ko Notebook Deity

    Reputations:
    337
    Messages:
    981
    Likes Received:
    1,252
    Trophy Points:
    156
    It's the same chip the 3070 desktop has (a few more shaders). So yeah, I expect 3070 level with a higher PL (that's the best case)... more would be unrealistic...
     
    Normimb and etern4l like this.
  20. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Not really. Power scales linearly with clocks but quadratically with voltage, and both rise together near the top of the curve. For example, the 200W 2080/2080S is only 10% faster than the 150W version.
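A toy model of that scaling — dynamic power is roughly P = k · f · V², so squeezing out the last bit of clock costs disproportionate power. All operating points and the constant below are made up purely for illustration:

```python
# Toy dynamic-power model: P = k * f * V^2. Voltage has to rise with
# clocks near the top of the V/f curve, so performance (~f) grows much
# slower than power. All numbers here are hypothetical.
def power_watts(freq_ghz: float, volts: float, k: float = 50.0) -> float:
    return k * freq_ghz * volts**2

p_low = power_watts(1.60, 0.80)   # hypothetical 150 W-class operating point
p_high = power_watts(1.76, 0.88)  # +10% clock needs +10% voltage here

print(p_high / p_low)  # ~1.33: ~33% more power for ~10% more performance
```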
     
    etern4l and Normimb like this.
  21. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Yes, the 3080 200w (if it ever exists) will be a big improvement over the 2080S laptop. Plus, I edited my previous post: if you look at the lower right corner of the benchmark results, you will see that this guy's 3080 GPU actually beats my 2080S GPU by a few frames. If on the other hand there is never a 200+w version of this 3080, and only the 150+w, then I don't expect more than a 20% gain over the 2080S 200 watts.

    Like I mentioned before, I just edited my post; go see the benchmark results and look at the lower right corner. The 3080 beats my 2080S. I agree a 150 watts version will beat my 2080S, but for it to reach the 2080 Ti?... or the 3070... hmm... possible, but I am still sceptical.
     
    Last edited: Jan 24, 2021
  22. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    I guess. This gimped 3080 is running at fairly low clocks of 1600 and 60C, but how much higher could it go at 200W - 1900? 2000? The latter, optimistic case (+25%), on top of the increase in score due to a more powerful CPU, would put 25%+ DTR vs DTR in the realm of possibility. A very respectable improvement, but not Earth-shattering.

    Also, I just noticed this GE66 Raider is a 15 inch laptop LOL. It's something I would consider if Dell fails again with their m15 refresh by soldering RAM (and they almost certainly will). Love that massive 99WHr battery, the same one the better old Alienwares used to fit.

    Overall, given that I'm not clear on the impact of the CPU performance difference on this benchmark, I am going to go with a conservative +30% estimate here (an extra 5% to account for other architectural improvements not taken into account), assuming the refresh stays on 10th gen CPUs. We'll find out in a couple of months, unless Alienware bin Azor's toy out of spite :rolleyes:
     
    Last edited: Jan 24, 2021
  23. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    I'd keep an eye on it. The GE series seems to have double tripod heatsinks. I was looking at its big brother (the GE76) vs the Scar 17, or whichever Asus chassis has the 150w GPU.
     
    etern4l likes this.
  24. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    That doesn't sound good. You mean there are just 3 screws around CPU and 3 around GPU?
     
  25. Sk0b0ld

    Sk0b0ld Notebook Consultant

    Reputations:
    126
    Messages:
    176
    Likes Received:
    220
    Trophy Points:
    56
    I did some tests in SOTTR with my desktop 2080 Super. Just curious to see if the RTX 3080 can compete against a desktop card from the previous generation. The tests were made with the same settings as in the video. The HWinfo logs were made during the benchmark. These are the settings I used:

    Settings.jpg

    Specs:
    CPU: i9-9900KS (8/16)
    GPU: RTX 2080 Super MSI Gaming X Trio

    1st test with my daily UV-settings:
    CPU: Stock @ 5.0 GHz, Vcore 1.221V
    GPU: UV 0.845V @ 1875 MHz
    (GPU Power avg in 2h gaming: 125-130w)
    [​IMG]

    [​IMG]


    2nd test with stock GPU settings:
    [​IMG]

    [​IMG]

    3rd test with slight GPU OC (+75/ +500):
    [​IMG]

    [​IMG]

    Between the UV setting (130w avg) and the OC setting (230w avg) there is only an 8 fps difference.

    RTX 3080 Laptop.jpg
    just for reference
    (GPU bound: 66%)
     
    Last edited: Jan 24, 2021
  26. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    Yeah, just like that Alienware that had it. Probably not the best design; I'd watch it, and I definitely recommend scrutinizing it more.
     
    etern4l likes this.
  27. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I think I'll wait for this roller coaster to stop... tired of "faster than a 2080 Ti", then slower, then faster... bah... waiting till it's released to know.
     
    etern4l and joluke like this.
  28. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Same here
     
    JRE84 and etern4l like this.
  29. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Thanks for the warning - for completeness, which AW model suffers from this, if you remember?
     
  30. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    The 15R3 and 17R4, I think. TBH I haven't followed Alienware so much since they killed the 18... the first time.
     
    etern4l likes this.
  31. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Btw, how do we know the 3080 Max-Q in the GE66 is 130W?
     
    etern4l likes this.
  32. mehtenj94

    mehtenj94 Notebook Consultant

    Reputations:
    32
    Messages:
    105
    Likes Received:
    74
    Trophy Points:
    41
    etern4l likes this.
  33. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Person on Reddit with a GP76 w/ RTX 3070 posted stock Firestrike and Time Spy scores:

    [​IMG]
    [​IMG]

    The GPU scores are pretty much 200W RTX 2080 territory.
     
    etern4l, Papusan and Normimb like this.
  34. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    This guy «Geremoy» on YouTube posted this video on Cyberpunk:
    upload_2021-1-24_21-16-39.png
     
  35. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Indeed it's very close, almost on par with my 2080S. If he is stock, then it's a big improvement over the 2070 laptop of last gen.
     
    Last edited: Jan 24, 2021
  36. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    We all know you can't compare the castrated 3080 mobile with the 3080 desktop card. But a 200W 3080 mobile with maxed-out silicon should easily match the 220W 3070 desktop card. And from before, we have seen that the real mobile cards perform around 20-23% above the power-capped Max-Q cards.
    GeForce RTX 3070 Founders Edition review
    upload_2021-1-25_7-40-50.png
    Add +20% on top and we are where the 3070 desktop cards sit.

     
  37. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Personally I wouldn't mind getting a RTX 3080 200w at all. We'll see when they actually release it for the Clevo X170. I'd love to see it in action.
     
    Papusan and etern4l like this.
  38. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    IMHO inconclusive. He didn't respond to the request for clarification below, or show any power numbers to back this up.

    Bear in mind this is a thin and light. 130W is a lot. Does it have a vapour chamber?
     
    Last edited: Jan 25, 2021
  39. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    A 200w 3080M matching a 3070 desktop is to be expected. Still a disappointment knowing what it could have been. An actual 3080 die would have been able to match a 3070 desktop at 130w and beat it handily at 150w+.
     
  40. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    It's disgusting when you know the jokeman Jensen gave all the love to the desktop cards. He treats laptop users as Nvidia's own milking machine.
    [​IMG]

    upload_2021-1-25_10-16-11.png
    'Giant Step into the Future': NVIDIA CEO Unveils GeForce RTX
     
  41. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    I'm old and indifferent to what might happen in the land of I Wish. If I can get 3070 desktop performance my ching is going to drop.
     
    hfm, NuclearLizard and etern4l like this.
  42. Normimb

    Normimb Notebook Evangelist

    Reputations:
    1,800
    Messages:
    576
    Likes Received:
    1,059
    Trophy Points:
    156
    Hey brother,
    Yup... I didn't say it's for sure a 130w card, but that is the answer this guy «Geremoy» gave when asked. BTW, he is the one providing all the benchmark information on the 3080 we have right now.
    Tomorrow we will not be talking about his results anymore... lol.
    The 3070 laptop results another guy posted are much better. Close to what we were expecting from the RTX 30xx.
     
    etern4l and NuclearLizard like this.
  43. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    Very curious to see whether 16GB will make any difference for RTX and/or higher resolutions.
     
  44. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    From what I remember, things like DLSS apparently chew up VRAM. Also, devs might wet their pants, because that's apparently the #1 thing they ask for.
    Hopefully it gets used and isn't just there to sell.
     
    etern4l likes this.
  45. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    DLSS reduces VRAM usage, since the internal rendering resolution is lower.
     
  46. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    If the game supports it. But that will probably become the standard anyway.
     
  47. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    That sounds incorrect. If you are rendering at 4K, say, then you need a 4K framebuffer - whether you are using DLSS or not. With DLSS on, you need additional memory for:

    * The lower resolution frame buffer used for primary rendering
    * Any memory required by the DLSS algo itself (it might be a fair amount given the massive parallelization likely needed to get it to work as fast as it does).
     
  48. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    8BEBB86D-DC8A-4D2A-A451-01D3A9C8DC65.jpeg
     
    Papusan and hfm like this.
  49. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Interesting, but that's just one game. What likely happens here is that because the rendering occurs at a lower resolution, the engine automatically uses lower resolution textures. In theory, not every game would dumb textures down when rendering at a lower resolution.
    Additionally, DLSS 2.0 will require additional memory since it includes temporal feedback (it needs the history of past frames). In fairness, all these extra frame buffers probably don't impact VRAM utilisation much compared to textures (a 4K buffer takes up around 32MB, a FHD frame about 8MB), so the open question is how much VRAM is needed by the DLSS implementation itself. Sadly, the Green Goblin is not sharing DLSS documentation publicly as far as I can see.
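The ~32MB / ~8MB figures are straightforward back-of-envelope math for an RGBA8 buffer (4 bytes per pixel) — a quick sketch; real engines keep several such buffers (colour, depth, history), so treat this as a lower bound:

```python
# Back-of-envelope frame buffer sizes (RGBA8, 4 bytes per pixel),
# matching the ~32MB / ~8MB figures above. A real renderer holds
# multiple such buffers, so these are lower bounds.
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 2**20

print(f"4K:  {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB
print(f"FHD: {framebuffer_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB
```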
     
    Last edited: Jan 25, 2021
  50. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Lower resolution textures look sharper? :vbconfused:

    Desktop_2021_01_25_11_53_26_290.jpg Desktop_2021_01_25_11_53_35_378.jpg Desktop_2021_01_25_11_53_44_244.jpg
     