The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How will Ampere scale on laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    I'm also running a DT 1080 in my lappy and had been planning on upgrading to Ampere way back when I bought the laptop; it seemed like a fairly reasonable upgrade cycle in terms of performance versus cost. Ambivalence has set in...
     
  2. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Well, my Vega 56 in the PH517-61 is limited to 120W (it can't exceed this), and it's only 5-10% behind (performance-wise) its fully TDP-unlocked desktop equivalent (mine is a desktop-grade chip, just modified for use in a laptop).
    The full desktop Vega 56 has a TDP of 210W (with the ability to draw more than that when fully stressed).

    So you're looking at a reduction in power draw of just over 40%... but Vega and GCN in general were heavily overvolted from the start so that AMD could yield a greater number of functional dies. That's why you could undervolt them severely and play with clocks to retain or increase performance, or lose just a bit of it for a much greater gain in power efficiency (and I suspect the silicon used for the Acer laptops was better binned than most, which allowed for this).

    Not sure if Ampere falls into the category of overvolted GPUs... but it's possible the better-binned silicon will be used for laptops. And to be fair, you don't have to lose a lot of performance by dropping the clocks a lot; in fact, the higher you push the clocks, the more efficiency you lose for very little performance gained.
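    A quick back-of-the-envelope check of those figures (illustrative only, taking the quoted 5-10% performance gap at its midpoint):

```python
# Figures from the post above: desktop Vega 56 at 210 W vs. the
# 120 W-capped laptop variant, assumed to retain 92.5% of desktop
# performance (midpoint of the quoted 5-10% gap).
desktop_power_w = 210.0
laptop_power_w = 120.0
perf_retained = 1.0 - 0.075

power_reduction = 1.0 - laptop_power_w / desktop_power_w  # ~42.9%, "just over 40%"

# Performance per watt, with desktop performance normalized to 1.0:
desktop_perf_per_watt = 1.0 / desktop_power_w
laptop_perf_per_watt = perf_retained / laptop_power_w
efficiency_ratio = laptop_perf_per_watt / desktop_perf_per_watt  # ~1.62x
```

    In other words, on these numbers the 120W cap trades roughly 7% of performance for about 1.6x the performance per watt.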
     
    Vasudev likes this.
  3. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    This probably sounds crazy, but I do expect a 200-watt Max-P laptop 3080 to lose only 5-10% of the performance of its desktop counterpart. Yes, the TGPs on the desktop Ampere cards are really high, but you also have to remember that efficiency goes down the drain the higher you push clock speeds. Take a look at this undervolting analysis performed on the RTX 3080: https://wccftech.com/undervolting-ampere-geforce-rtx-3080-hidden-efficiency-potential/.

    The power draw was decreased from around 310 watts to around 230 watts, and that only reduced the framerate in Forza Horizon 4 by 4 fps: around 139 fps undervolted vs. 143 fps at stock settings. The whole concern about a laptop 3080 performing only as well as a 3060 is blown way out of proportion. The Max-Q variant will probably perform like a 3070, while the Max-P variant will probably see a 5-10% performance loss versus its desktop counterpart.

    Combine the significant efficiency gains from undervolting with the fact that laptop GPUs are usually the better-binned chips, and Nvidia should be able to decrease the TGP a lot without sacrificing much performance. I may sound really optimistic here, but I'm just drawing a conclusion based on all the information I've seen. And no, I did not use that wccftech link as my only source. I've been reading a lot of forum posts about what the Ampere mobile cards could turn out like, plus rumors from other sites (yes, I know rumors aren't always accurate, but that's all we have right now).
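    Running the cited test through the same quick arithmetic (values taken from the wccftech article linked above; a rough sketch, not a benchmark):

```python
# Figures from the cited wccftech undervolting test (desktop RTX 3080,
# Forza Horizon 4): ~310 W / 143 fps stock vs. ~230 W / 139 fps undervolted.
stock_w, stock_fps = 310.0, 143.0
uv_w, uv_fps = 230.0, 139.0

power_saved = 1.0 - uv_w / stock_w                 # ~25.8% less power
perf_lost = 1.0 - uv_fps / stock_fps               # ~2.8% fewer frames
fps_per_watt_gain = (uv_fps / uv_w) / (stock_fps / stock_w) - 1.0  # ~31% better fps/W
```

    So on those numbers, giving up under 3% of the frames buys back about a quarter of the power budget.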
     
  4. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Sure, Ampere has some good undervolting potential. But the rumors are that Nvidia is using the much smaller GA104 rather than the big GA102, a 256-bit bus instead of a 320-bit bus, and slower GDDR6 instead of the faster GDDR6X. That's very different from what happened with Pascal and Turing, where specs were pretty much the same between laptop and desktop versions. The board of the desktop 3080 is too big to fit into a laptop. It's impossible to downgrade all of those things and lose only 5-10% performance. I hope you guys are right and I'm wrong, though.
     
    seanwee likes this.
  5. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Gotcha. I didn't see those rumors about differences in hardware. If that is indeed going to be the case, then that will definitely have a negative impact on performance.
     
    seanwee likes this.
  6. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Vasudev and seanwee like this.
  7. GrandesBollas

    GrandesBollas Notebook Evangelist

    Reputations:
    370
    Messages:
    417
    Likes Received:
    563
    Trophy Points:
    106
    Another data point in the reality show, "Don't Spend Too Much on Your Next Laptop". Didn't Frankie Valli write the song, "Junk is the Word"...

    "I solve my problems and I see the light
    We got a lovin' thing, we gotta feed it right
    There ain't no danger we can go too far
    We start believing now that we can be who we are
    Junk is the word.
    They think our love is just a growing pain
    Why don't they understand? It's just a crying shame
    Their lips are lying only real is real
    We stop the fight right now, we got to be what we feel
    Junk is the word
    Junk is the word, it's the word that you heard
    It's got groove it's got meaning
    Junk is the time, is the place, is the motion
    Junk is the way we are feeling
    We take the pressure and we throw away
    Conventionality belongs to yesterday
    There is a chance that we can make it so far
    We start believing now that we can be who we are
    Junk is the word"
     
  8. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    We won't see RTX 3080 MAX-P?
     
    Last edited: Dec 5, 2020
  9. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    And it seems AMD will try to compete in the mainstream. Sub-100W graphics just means we will see more of the thin and slimy Jokebooks. The trend will only continue.

    AMD Readies Radeon RX 6000M Mobility GPUs Based on Navi 23 & Navi 24, RDNA 2 With Sub-100W TGPs For Gaming Laptops wccftech.com

    We can expect AMD to compete directly with mainstream GeForce RTX 30 mobility GPU in the laptop segment. These models would be fully integrated on Lucienne and Cezanne based Ryzen 5000 gaming notebooks that are hitting shelves in the first half of 2021. What's going to be more interesting is if AMD deploys a small yet still useful amount of Infinity Cache on these GPUs.
     
    jclausius and Vasudev like this.
  10. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Navi 23 and 24? So they're basically replacing the 5600M, 5500M and 5300M. ZZzzzzzz

    I see AMD still has no interest in high-end gaming laptops.

    Every time Nvidia leaves the door open, AMD has a picnic on the lawn.
     
    hfm, Vasudev, joluke and 2 others like this.
  11. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    That's the reason I'm looking at a full AMD CPU+GPU (RDNA2) setup instead of Nvidia. I really want Linux with better battery life, working custom scale factor (aka fractional scaling), and a better out-of-the-box open-source GPU driver. With Nvidia, the TDP/TGP becomes the limiting factor, and Max-P/Q/R/A or whatever is simply a rehashed notebook with castrated firmware.
     
    Deks and joluke like this.
  12. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    And Intel is shooting themselves in the foot with the 8-core limit on 11th gen. Screw them both (Intel and Nvidia)
     
    Deks and Vasudev like this.
  13. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Yeah, somewhat. But Intel's IPC is somewhat lower than Ryzen's.
     
    joluke likes this.
  14. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Well, Navi 22 is also supposed to be featured in the RX 6000M lineup (according to videocardz).
    But yeah... this next generation of discrete GPUs is not looking that good.
     
  15. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    Being honest with myself, I'd have to say that gaming laptops are a whole different world than they were not so long ago. I'm disappointed with what I'm reading about Ampere in the mobile space, no question, and more so because it's screwing up my best guess at a reasonable upgrade path over time. Still glad we made it this far, though; it wasn't long ago that PC laptop gaming was a bad joke even on so-called gaming laptops. I just wish they'd give us a little more respect and name them the way they actually run: as stated above, if the latest and greatest they can get into a lappy is equivalent to a 3060 in the desktop space, I want them to call it that. I can take the idea that they can't deliver the goods just fine, but I hate the disrespect in the marketing.
     
    hertzian56 likes this.
  16. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Agreed. I also find it insulting that so-called gaming laptops have not set a standard of either 1440p or 4K screens. They are still mostly at 1080p but with high refresh rates, which is nice but not that important unless you're really into twitchy online games. With 4K TV prices coming down so much, how hard is it for gaming PCs to do the same? And other than high Hz/fps, what is the point of anything over a 1070/1660 Ti/2060 for 1080p? Marketing is really a nasty, dishonest business.
     
    seanwee likes this.
  17. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    None of these cards can max out Final Fantasy XV or ARK at max details/settings at 1080p, so the more advanced graphics cards get, the better we will be able to play our games. 1080p is fine for laptops IMO. Windows sucks at resolution scaling for non-native resolutions, and I'd rather use FHD with no scaling but with greater color gamut and faster refresh and response times than deal with 4K/QHD scaling issues.
     
    Vasudev likes this.
  18. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Max settings are overrated, as they typically ramp the resolution/precision of various effects to unnoticeably high levels. Not to mention, ARK is one of the most unoptimized games in existence.
     
    Vasudev and seanwee like this.
  19. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Yeah, I feel like it's just diminishing returns. Unoptimized games, I guess, if you want to keep playing those for a long time or squeeze out that last bit of detail that won't be noticed. But buying less for more in order to overpower unoptimized junk seems crazy to me. The power of a 2070+ seems like a huge waste of money and performance at 1080p. I mean, I guess once used prices come down to what the cards I mentioned sell for, sure, but looking at the market, that's not going to happen soon.
     
  20. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Vasudev likes this.
  21. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Wow, that looks totally weak, but at least AMD gave those GPUs lower number designations (6700M, 6600M, 6500M), unlike Nvidia, whose 3080/3070 Max-P/Max-Q naming could lead buyers to believe they're actually getting something as powerful as the desktop counterpart.


    The price of the RTX 3090 is ridiculous, and I'd say it is not the real competitor for the 6900 XT. The real competition will be the 3080 Ti.

    It will be similar to the RTX 3080 vs. 6800 XT, or the RTX 3070 vs. 6800... Nvidia wins hands down. The price is similar and the rasterization performance is similar, but the performance gap in Nvidia's favor in RT is huge, not to mention when the game supports DLSS; plus with Nvidia you get Tensor cores, CUDA, and the streaming suite. Honestly, the only reason to buy AMD this generation is if you have no access to an Nvidia GPU, say because there's no stock of the Nvidia counterpart. AMD had its great chance this year to take market share from Nvidia. AMD's only mission was to have stock available, but they failed at it, lol
     
    seanwee and joluke like this.
  22. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Damn nice. Nvidia rushed out their 3080 desktop cards with the proper GA102 silicon, 320W but a wimpy 10GB. Why give the castrated mobile cards 16GB of vRAM? A bad conscience because they destroyed the mobile flagship with a small 3070 die?

    Why the need for 16GB of vRAM on the weaker mobile card? So they can defend a higher-than-needed price point?

    If Nvidia pushes out a 20GB version of the 3080 after the 10GB screw-up, it will further widen the gap between the mobile and desktop 3080 cards. Aka, the mobile 3080 will get the lesser and cheaper GA104, no GDDR6X, less vRAM, a crippled TGP and crippled clocks. Welcome to the past, but this time with two kinds of M-branded mobile cards (Max-P and Max-Q).

    The gap between mobile and desktop cards was meant to close when they created the 980/1080 N (Max-P), but that has now crashed hard as a failed experiment. Yet their goal has become a success: Max-Q as a second tier was their target all along.

    NVIDIA will equip the mobile version of the GeForce RTX 3080 with 16 GB of VRAM, according to multiple sources.

    The NVIDIA GeForce RTX 3080 mobile outed with 16 GB of VRAM, a GA104 GPU and 6,144 CUDA cores notebookcheck.net

    The first details of the GeForce RTX 3080 mobile have appeared online. Not only do they offer an insight into what to expect from high-end gaming laptops next year, but they also raise the question about the continued existence of merely a 10 GB edition of the desktop variant of the RTX 3080.
     
    Last edited: Dec 6, 2020
    jclausius, joluke and Tyranus07 like this.
  23. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Yeah, the 10 GB on the desktop RTX 3080, to me, has to do with the price of the RTX 3090. The price of the 3090 is twice that of the 3080, and that difference is hard to justify as it is. It would be even harder to justify if the RTX 3080 had 16 or 20 GB of VRAM. The future RTX 3080 Ti will totally cannibalize the sales of the RTX 3090 and make it pointless.

    As for the 16 GB of VRAM on the mobile RTX 3080, that's a cheap move just to be able to charge $1000 and to make it seem like this card is a big "upgrade" over the Max-P/Max-Q RTX 2080, and some will fall into that trap.
     
    jclausius, joluke and Papusan like this.
  24. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    To me it's pretty clear the laptop "3080" is a sign of what's to come on desktops. It's clearly a 3070 Ti put into a laptop.
     
    joluke likes this.
  25. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    900
    Trophy Points:
    131
    It's because the mobile part will have the slower, cheaper GDDR6 rather than GDDR6X, so they throw more of it at the card to make it look like better value.
     
    joluke likes this.
  26. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    Not because of that, but simply because the GA104 does either 8GB or 16GB, so they can't put in 10GB. Which further shows the sheep in wolf's clothing.
     
  27. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    It should surprise nobody that Nvidia will be using GA104 in laptops. They've never used bigger dies in previous generations, so why would they start now, in the least efficient generation since Fermi? It was always GF104, GK104, GM204, GP104, TU104, and now GA104, even when bigger dies were available. The only thing different this time around is a messed-up naming scheme, but Nvidia has never been good with those since dropping the M suffix from mobile cards.
     
    yrekabakery likes this.
  28. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    This. Once it was known that the 3080 would be using the big GA102 die, it was always wishful thinking that it would come to mobile. Even when Nvidia did use the big GF100 on mobile (GTX 480M), it was so neutered that the smaller GF104-based GTX 485M was better.
     
  29. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Both Nvidia and AMD have screwed notebook users. The thin and slimy chassis sickness has finally paid off.

    Why spend the best and most expensive silicon and vRAM on laptops when they can't utilize the advantage of it? And the hardware manufacturers like Nvidia and AMD, and the notebook industry, can still charge a premium because this will be the very best they will offer. Yeah, all we get is castrated hardware at a premium.

    AMD Radeon RX 6000M RDNA 2 Mobility GPUs Based on Navi 22, Navi 23, Navi 24 SKUs Further Detailed – Die Sizes, TGPs, Clock Limits

    This is just our theory but there are possibilities that a binned Navi 22 SKU can be adjusted within the Radeon RX 6800M branding considering the desktop lineup goes all the way up to Radeon RX 6900 series but due to the huge TGP of the Navi 21 GPU, the chip is probably never going to see a mobility variant. The same is the case with the NVIDIA Ampere GA102 GPU which might never get to see a mobility launch due to its large die size and huge power requirements. The GA104 or the rumored GA103 are perfect candidates for a flagship RTX 30 mobility GPU variant.

    AMD Radeon RX 6000 and NVIDIA GeForce RTX 30 supply not to improve until February?

    According to the report, NVIDIA and AMD may have trouble keeping up with the demand for its new graphics cards till at least February. While AMD has not made any public statements in regard to their supply situation, NVIDIA has repeatedly confirmed that global shortages of wafers, substrate, and components are affecting the availability of the Ampere graphics cards.
     
    Last edited: Dec 7, 2020
    unlogic and Tyranus07 like this.
  30. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    The funny thing about that article is that even the 3080 beats the AMD card, and isn't it at around the same price? From the POV of performance/price, the 3090 in that game is atrocious for sure. I just wonder why no 4K or even 2K resolutions were tested; it would give a better idea of what those cards were designed for and will be used for, I think.
     
    joluke likes this.
  31. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Well, my GTX 1080 has finally met its match with Cyberpunk 2077. For all the scrutiny on Ampere cards, I think one big deciding factor has been overlooked: DLSS and the uplift it has on performance. With 2nd-gen tensor cores, the RTX 3080 mobile could theoretically have 2x the framerate of a GTX 1080 in a game that supports it, and it's safe to say that with Cyberpunk the technology has officially broken through its niche barrier and gone mainstream. Could we see MXM Ampere cards pushed out as upgrades for older Clevo notebooks? It's worth upgrading just for DLSS.
     
    seanwee likes this.
  32. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    Turing cards also have DLSS, don't forget. I really don't think the 3000 mobile cards will be much better than the 2000 Super cards. Cyberpunk is also not very well optimized, which is causing issues for people trying to play.
     
  33. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Turing does, but Ampere has 2nd-gen tensor cores, allegedly with 2x the efficiency!
    Aside from some bugs, I've seen no issues with optimization in the game: smooth, even frame times and even usage on the CPU/GPU. It's not trash like, say, Crysis Remastered. I can actually see where all the GPU usage is going; it's really a next-gen game, or at least as close to one as we're going to get right now. The only issue is, a GTX 1080 gets 30-35 fps in crowded areas at medium settings at 1440p, sometimes dropping to 29. For the first time, people without ray tracing GPUs are missing out on visuals too; the game has phenomenal RT features! This game is really made for ultra graphics + ultra RT + DLSS + a 30 fps lock.
     
  34. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    “2x efficiency” maybe with the Tensor cores, but I would not describe Ampere as an otherwise efficient card at all.
     
    seanwee likes this.
  35. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    It's not, but the rasterized performance will at least be as good as Turing's, and DLSS will be better (hopefully!)
    Considering how well Turing does in rasterized games, maybe Ampere won't need to be a whole lot better? I can see devs pushing for DLSS and Nvidia heavily capitalizing on this with better drivers for almost every future release. They could outdo AMD yet, until AMD comes up with its own solution.
     
  36. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
  37. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It's not just Tensor cores/DLSS and RT cores/ray tracing. Even on the regular shaders, Pascal is outdated compared to modern GPU architectures. The GTX 1080 still has a decent amount of brute horsepower, but in CP2077, Turing's (and by extension Ampere's) improved shaders give it a huge ~30% advantage.

    I assume this is because, even compared to other modern games, CP2077 is very heavy on integer math. Unlike Pascal, Turing and Ampere have dedicated integer (INT32) cores in addition to the normal FP32 CUDA cores, so they can execute both floating-point and integer instructions concurrently. Meanwhile, Pascal does integer math on its FP32 cores, and it can only execute one type of instruction at a time. So anytime it needs to do integer work, it has to block the execution of float work, which is a huge inefficiency compared to Turing/Ampere.

    To take an extreme example, the 1080 is between the 2070 and 2070S in Modern Warfare 2 Remastered, a game using an older DX11 engine:

    [benchmark chart: Modern Warfare 2 Remastered]

    However, in CP2077, the 1080 only reaches the level of the 1660 Ti!

    [benchmark chart: Cyberpunk 2077]

    I think going forward, Pascal is only going to fall further and further behind Turing and Ampere in games using modern engines and techniques. It is essentially getting "Kepler'ed" before our eyes: losing ground to newer GPUs not because of malicious gimping on Nvidia's part, but because its architecture is simply less suited to modern titles.
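    The serial-vs-concurrent scheduling difference described above can be sketched as a toy model (an illustration only: the 36-integer-ops-per-100-float-ops workload mix is an assumed figure, and real GPUs overlap work in far more complex ways):

```python
# Toy issue model, not a hardware simulation. Assumption: a Pascal-like
# SM runs integer and float work serially on shared cores, while a
# Turing/Ampere-like SM runs them concurrently on separate pipes.
def shader_time(fp_work: float, int_work: float, concurrent: bool) -> float:
    """Work is in arbitrary 'core-cycles'; lower result is better."""
    if concurrent:
        # Separate INT and FP pipes: time is set by the longer stream.
        return max(fp_work, int_work)
    # Shared pipe: integer ops block float ops, so the times add.
    return fp_work + int_work

# Hypothetical integer-heavy workload: 36 integer ops per 100 float ops.
fp, integer = 100.0, 36.0
pascal_like = shader_time(fp, integer, concurrent=False)  # 136.0
turing_like = shader_time(fp, integer, concurrent=True)   # 100.0
speedup = pascal_like / turing_like                       # 1.36x
```

    On that assumed mix, concurrent issue alone buys roughly a third more throughput, which is in the same ballpark as the ~30% CP2077 gap noted above.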
     
  38. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    NVIDIA GeForce RTX 3070 for laptops spotted videocardz.de

    It is important to note that mobile graphics cards are now offered in many (actually too many) variants. We have full-fledged Max-P models, which are usually reserved for thicker laptop designs, and numerous Max-Q series which often feature TGP (total graphics power) targets varying by 30-40W

    The specifications of the RTX 30 mobile series remain unconfirmed, but it does appear NVIDIA will no longer match its desktop and mobile series with the same GPU. After all, the RTX 3080 desktop launched with a GA102 GPU, and such a big processor is not going to be offered in gaming laptops any time soon. It is expected that the RTX 3080 mobile could feature just as many cores as the desktop RTX 3070 SKU (5888) which is a cut-down variant. This would still leave room for an RTX 30 Mobile SUPER refresh later.


    upload_2020-12-11_20-19-51.png

    The RTX 3070 and RTX 3060 mobile are harder to predict, but one would guess that RTX 3070 mobile could have a similar number of CUDAs to desktop RTX 3060 Ti. According to the leaked benchmark, that card would still be slower than desktop SKU but that is mainly due to the mobile form factor power constraints.

    A long time ago, NVIDIA made the promise of bringing desktop-class performance to laptops... With this, they have failed hard!
     
    Last edited: Dec 11, 2020
    jclausius, seanwee and Tyranus07 like this.
  39. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    788
    Trophy Points:
    131
    Aren't they hitting thermal limits for air cooling on gaming laptops, even with the newest tech? I can understand why they keep the vast majority of gaming laptops at 1080p: less power needed to push that resolution on newer cards, and slimmer laptops. It's annoying to get only 1080p for the money, but at least you can connect to a higher-res monitor and output to that. It also keeps costs down and margins up.
     
  40. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    For how much longer do you suspect Pascal cards are going to be relevant before they become antiques?
     
  41. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    This was a stupid promise to make, as the power, thermal, and space constraints of desktops and laptops will always be very different. The people who believed it are even more stupid.
     
    Aroc and Papusan like this.
  42. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    The 200W 980 Mobile came in ahead of the 980 desktop cards. But it was a single exception, and it will never come back. We are now back to the dark days, but with loads of different crippled mobile cards under the same name branding.
     
  43. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    My guess is when the cross-gen period ends, so in about 2 years.
     
    Aroc likes this.
  44. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    My 1080 DT is surviving in Cyberpunk, although for the first time I wasn't able to play at my native 2K and dropped it down to HD. I had planned on moving to Ampere, but now I'm wondering if I can survive until the next thing. I have a feeling that toward the end I'll be wishing I'd have just bitten the bullet earlier, but if I give in to that thought I'll probably regret not holding out for the next thing. Sigh... It would be nice if you could just have something reliable on which to hang your upgrade roadmap.
     
  45. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Drop the resolution of volumetric effects down to low.
     
  46. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Follow this performance tweak guide: https://docs.google.com/document/d/...CaX_s5PiQ-ucnV-9bbDl7i0tn5muDu-2uGLagDsEe/pub
    Got it from a forum. Won't share the link here else I'd be banned for obvious reasons
     
    krabman likes this.
  47. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Not really. Nvidia was able to make desktop and laptop cards perform at the same level with the Pascal family of cards. It was reasonable to expect Nvidia to continue doing that with their future GPU series.

    What's really stupid is that they actually delivered on this promise with the Pascal family, but then their hard-fought progress regressed with the Turing family. Now it's getting worse with Ampere. Did Nvidia fire all the talented hardware engineers they had and replace them with people who settle for mediocrity rather than strive for excellence? That's what it seems like to me.
     
    seanwee likes this.
  48. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    The mobile 3070 matching a 2080Ti doesn't sound like a horrible thing.
     
  49. seanwee

    seanwee Father of laptop shunt modding

    Reputations:
    671
    Messages:
    1,920
    Likes Received:
    1,111
    Trophy Points:
    181
    The 3070 is fine. The 3080 is the sore spot here.

    Maybe if it's boycotted, then Nvidia will do something.
     
  50. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Highly doubt it
     