The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How will Ampere scale on laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.

  1. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    With the RTX 3080 and 3090 drawing 300-400 watts on 3rd-party boards, will the performance delta between laptops and desktops be even greater than it was with Turing? Assuming Nvidia sticks with the Samsung 8nm node rather than 7nm, it's hard to imagine massive gains, especially since laptops can't accommodate anything more than about 200 watts. That seems to be the upper limit in Alienware/MSI/Clevo machines, because anything higher is very hard to cool.
    It makes sense that Pascal was the generation where laptop and desktop performance were closest, given that the GTX 1080 FE had a 180W TDP.
    My guess is that gains on laptops will be a lot more modest, around 30-40% over Turing, and maybe 50-60% better RTX performance with the 2nd-gen RT cores, but not the 2x claimed on desktops.
    All this is for the 180-200W variants, which are rare because the majority of laptops opt for Max-Q, where gains might be even lower. Ampere might not get enough breathing room in thin-and-light laptops to have a sizeable impact over Turing.
    Considering that Turing still had a manageable TDP, yet some RTX 2080 Max-Q laptops were flat-out beaten by overclocked desktop RTX 2060s, I'm not overly optimistic about the mobile RTX Ampere launch. Maybe Nvidia should go back to the mobility naming scheme rather than mislead people into paying for desktop performance.
    Thoughts?
     
    Last edited: Sep 7, 2020
    Biker Gremling likes this.
  2. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    30-60% is still a good improvement compared with the "Super" upgrade. Also, I wonder whether there are diminishing returns in terms of performance per watt. If so, a GPU running at 400W wouldn't be twice as fast as one running at 200W.
     
  3. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    I think Clevo and Alienware can handle a 300W GPU easily, so I'm optimistic about having 250W+ GPUs in the next gen of gaming laptops.
     
    seanwee likes this.
  4. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    My DM3 can barely cool a 180-watt GTX 1080 (90 degrees and throttling on Kryonaut: 1625-1810 MHz). For power delivery I think gallium nitride (GaN) chargers are the future, but I don't think even the best laptop vapour chambers can dissipate 300W of heat. Unless there is a breakthrough in laptop liquid cooling or some such, I just can't see 300W happening on laptops.
     
    DreDre, BlueBomber and hfm like this.
  5. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    There are already 'diminishing returns' in laptop GPUs on the NV side.
    A month or so ago I saw an article on the price/performance ratio of mobile GPUs and how much performance you actually get relative to desktop GPUs.

    The 'cutoff point' was the RTX 2060/2070 and 5600M/5700M (though AMD wasn't featured in that article - which is strange enough, considering their mobile Navi GPUs have identical specs to their desktop counterparts and only slightly lower clocks and memory bandwidth, which would put them within 5-10% of each other).

    So, with Ampere's very large power requirements on the desktop, it remains to be seen how performance will scale at lower power envelopes.

    Vega 56 initially had very large power demands on desktop, but we know this was because its voltage was set too high from the factory (AMD does this to increase the number of usable dies)... and when Acer put Vega 56 into the PH517-61, they retained 95% of its performance while dropping its TDP to 120W (and the stock-clocked version in my laptop isn't even reaching the full 120W, which means I can easily overclock both core and HBM and gain about 10% performance or more before I hit the 120W limit - that's very efficient).

    Something similar could happen here... however, NV (to my knowledge) isn't known for overvolting their GPUs out of the factory to increase the number of usable dies like AMD does.

    That said, mobile GPUs are usually better binned than desktop versions, so NV will likely just drop clocks and voltages by certain amounts to meet the lower power budgets... not sure how much of a performance drop there will be, but we'll see.
     
    Last edited: Sep 6, 2020
  6. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Aren't tensor and RT cores the 'notoriously power hungry' part?
     
  7. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Basically, 320W can be shrunk down.
     
  8. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    That's kinda weird. Maybe you live in a pretty hot place? Where I live the ambient temperature is somewhere between 20 and 30ºC, and at stock clock speeds my GTX 1080 draws 200+W (I think 210W is the maximum I've seen in my system) and the cooling system keeps the temperature around 80ºC without throttling. Also, based on what people report on the Area-51m R2, that laptop keeps the RTX 2080 Super under 80ºC while drawing 200+W. That's why I'm optimistic. When using SLI in my system at 100% load, both GPUs draw 200+W each (400+W in total) and the system is able to keep the temperature under 85ºC without thermal throttling.
     
    seanwee and DreDre like this.
  9. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    It also comes down to how OEMs implement cooling in laptops.
    Bear in mind that you won't always see top-notch cooling on laptops that use all-AMD hardware, for example (my Acer PH517-61 is an exception)... OEMs prefer to invest more quality into Intel/NV designs for some reason, despite AMD having equally or more efficient hardware.
     
    seanwee likes this.
  10. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Well no, ambient temperatures here are quite pleasant :) You probably have a vapour chamber on your SLI laptop, but even then I'm surprised you can stay at 85ºC on dual cards. From what I've read, dual-1080 laptops could be sold as portable nuclear reactors.
     
  11. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    Yeah, I think the vapor chamber does the trick; the GPU cooling in my laptop is pretty good. It's a sad thing that in the current gen (PS4/Xbox One gen) hardly any game supports SLI, so it's like having your second engine always shut down on a ship. The other sad thing is that there is no easy way to increase the power limit (and hence push the overclock further), which is a shame because the cooling system could take more power (heat to dissipate).
     
  12. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    You mean this?

    http://forum.notebookreview.com/threads/mobile-pascal-tdp-tweaker-update-and-feedback-thread.806161/
     
  13. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    lol yeah, but there is no way I'm going to do that... back in the day it was pretty simple with NVFlash.
     
  14. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    Unless there is a major breakthrough in cooling for laptops, Ampere is going to be lackluster for mobile. It has hefty power requirements, and it will largely be Turing that continues to feature, due to its lower TDP.

    The 200W laptop RTX 2080 should still be among the more powerful variants available. Even if they did shrink the 3070 and cap its TDP, it would be cut down enough to be barely "better" than the 2080. Perhaps the new mobile GPUs will have the same raster performance but just improved RTX.

    I think I am officially done with laptops for high-end gaming. I will only focus on laptops for work and "mild" gaming at best.
     
  15. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    That's exactly how I feel now too. If the mobile/desktop GPU performance gap had continued to shrink like it did with Pascal, or if competition in the mobile space had started to bring down prices, I might want to stick with a laptop for gaming. Neither has happened, and especially now with Ampere being so power hungry, I have little reason to buy a new gaming laptop instead of building a cheaper, more-powerful desktop. Besides, for portability purposes, even Pascal notebooks will still have a fair amount of life left in them for lighter gaming sessions.
     
    ryzeki and DreDre like this.
  16. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I don't know. I went for Pascal's 1060 and an eGPU for future power... losing 30 percent on a 4080 won't be all that bad, and it's cheaper in the long run. Gaming is moving towards 8K in the next decade and I'm more than happy with 1080p... but if this trend continues, I'm also officially done with gaming laptops.
     
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Maybe expect 3070 desktop/2080 Ti performance (the 3080 Max-P will probably beat the 3070 due to its faster vRAM).

    Mobile Max-P graphics have, in short, been pushed back to low-powered M branding with less performance than the desktop cards (my 980N has a higher TGP than Nvidia's desktop 980 and performs better at stock speeds). Nvidia started screwing up their whole mobile graphics lineup when they pushed out Max-Q. We now have two low-powered M cards, one more crippled than the other. Near-equal TGP between mobile Max-P and desktop cards in the same lineup has been thrown into oblivion and will most likely never come back. Thanks Nvidia, you tried but failed.
     
    Last edited: Sep 7, 2020
    Normimb, seanwee and ryzeki like this.
  18. DreDre

    DreDre Notebook Consultant

    Reputations:
    175
    Messages:
    178
    Likes Received:
    275
    Trophy Points:
    76
    I have GTX 1080s in SLI in both my P870TM-G and P870DM3. The only way I was able to tame temperatures in both of them was to use liquid metal. Both have vapor chamber cooling, but without liquid metal, temperatures stayed in the 90s while gaming. I'm confident I am pulling the same wattage you are, but I can't check at this point without a wattage meter.
     
  19. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Two dies are much easier to cool than a single graphics card running 300W+. Nvidia increased the cooling capacity of the Ampere desktop cards vs. last gen, and notebooks keep getting slimmer and thinner, so don't expect miracle cooling in next year's gamingbooks (you can't easily fit a big truck's cooling into a small Mini Cooper) :D
     
    Last edited: Sep 7, 2020
    jclausius likes this.
  20. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    Since the 3070 has a TGP of 220 watts, wouldn't we get near desktop level performance with this card in laptops? I'm assuming the card would be limited to 200 watts in its Max-P variant. That makes the most sense to me.

    The 3080 doesn't even seem worth releasing on laptops since performance would be crippled immensely vs the 3070 or lower. The 3070 and any card below it should be capable of desktop level performance in their respective Max-P variants.

    The 3070 should be the top dog for laptops this generation. Keep performance within 5% of desktop levels with the 3050, 3060, and 3070, and we'll have some wicked awesome laptops. The 3080 should be relegated to desktops only.
     
  21. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    This.

    Pascal did well on mobile because Nvidia had perfected their shader cores over multiple generations. With the fancy Tensor and RT cores taking upwards of 50% of the die space, it's no wonder TDP went up again with Turing.

    Case in point: the laptop GTX 1660 Ti and RTX 2060, not even talking Max-Q here. The 1660 Ti is very close to its desktop counterpart; the 2060 lags by quite a bit. Look at how close the rasterized performance is between these two cards on laptops:

    It all comes down to the exotic 1st/2nd-gen hardware these cards are packing.
     
    Prototime likes this.
  22. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    The 3080 comes with GDDR6X (19 Gbps) while the 3070 has GDDR6 running at 14 Gbps.
    Why? With the 3070 as the top-dog graphics card for notebooks, we would still pay the same high prices but for lower specs. You want to pay the same for less? :D
     
  23. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Yep, laptops are more expensive. It's the price of portability.
    Best max performance -> desktop
    Best performance/price -> desktop
    Portability -> laptop
     
    Normimb, Aroc and Papusan like this.
  24. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    I don't know man, maybe I got lucky with the silicon on both my 1080s. I ran 3DMark in a loop for several minutes and got these temps and power draws:

    [screenshot: 3DMark run showing GPU temperatures and power draw]

    Nothing special on either GPU: Thermal Grizzly paste and generic thermal pads.
     
    Papusan, DreDre and Kunal Shrivastava like this.
  25. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I don't know if I would call it exotic anymore. All the consoles and current-generation GPUs are going to include RT hardware. It's the new normal.
     
    Aroc and etern4l like this.
  26. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,088
    Trophy Points:
    431
    If they do make Max-P models at 200W, then those will surely have near or the same performance as desktop. But we know Nvidia likes their 150W, 115W and lower targets for laptops. Not to mention every manufacturer has been too busy making ever thinner and lower-wattage devices; they are fully unprepared for new high-wattage CPUs and GPUs.

    Heh, we will even have a bit of an issue with NVMe: as it becomes ever more performant it requires better cooling, and those drives will simply not work in the current laptop market haha.

    Could big, strong and bulky laptops make a return? It would be cool if they stepped away from soldered-on CPUs/GPUs haha, a full return to actual portable full-performance PCs.

    I got carried away. They will make a full 3090 GPU core capped at 80W, call it the best ever, run it at 500MHz, and watch it perform like a 2070 on laptops.
     
    Normimb, Prototime and Papusan like this.
  27. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    900
    Trophy Points:
    131
    Obviously not, we will just have thermally throttled CPUs/GPUs and SSDs in devices, but they will have lots of lights on them so it's fine.
     
    DreDre likes this.
  28. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Lol.
    Next-gen consoles will outdo most $3K laptops, what a shame..
     
    etern4l likes this.
  29. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Looks like 200W is the middle way between performance and crippled power limits for the 3080 Max-P. The 3080 Max-Q variants will be heavily crippled. Alienware started down this path last year with Max-P at 150W for thin chassis. I expect more ODMs/OEMs to follow.
    NVIDIA GeForce RTX 3080 cryptocurrency performance videocardz.de
    MSI thinks so. I wonder how hard the SSD throttling will be if you max out the PCIe 4.0 SSDs.

    MSI Prestige and Summit laptops add 11th-gen Tiger Lake, PCIe 4.0 SSDs and more pcworld.com

    MSI's new Summit laptops aim at business, while the Prestige gets colorful. And yes, it's an Oprah "You get PCIe 4.0 SSD too" moment.

    Like MSI’s Stealth 15M, the company will take advantage of the faster PCIe 4.0 in the 11th-gen Tiger Lake by including x4 PCIe 4.0 SSDs in the Summit E15, E14, and B14. In the Summit B15, MSI pairs a PCIe 3.0 NVMe SSD with a PCIe 4.0 SSD.
    The thin and slimy direction can't or won't be changed. All they will do is cap the TGP. Design will always come before functionality.

    See my older post... http://forum.notebookreview.com/threads/nvidia-thread.806608/page-248#post-11045765
     
    Last edited: Sep 15, 2020
    Normimb, DreDre and etern4l like this.
  30. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    225W looks like a sweet spot there for a laptop GPU. The performance drop is pretty small, but the power consumption is a lot lower.
     
  31. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    This has always been the case when mining cryptocurrencies. That's why miners underclock and undervolt their GPUs. But hash rate is not indicative of gaming performance.
     
  32. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    I would rather say 210W: only 1.1% less performance than 225W, which draws 7% more power.
    How much do you expect gaming performance to drop going from 320W down to 210W?
     
    Normimb and etern4l like this.
  33. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    I find those results very interesting. If we can get about 98% of the performance of the card at only 65% of the rated maximum power draw, this means laptops won't be screwed for performance. Of course this is mining and not gaming performance, but I'd expect results to be about the same. For gaming, you can normally get 70% of the performance at half the rated maximum power draw.

    If we extrapolate the results from the crypto mining chart Papusan provided to gaming performance, it seems pretty logical to me that we could get 98% of the performance of a 3080 at 65% of the power draw. This is because the vast majority of games rely exclusively on rasterization performance from the graphics cores. 210 watts is probably enough to run all 8704 cores at almost full throttle and supply minimal power to the raytracing and tensor cores for idling.

    In raytraced games, the performance deficit versus the desktop cards will probably be significant, since you won't be able to fully load all the hardware (graphics cores, raytracing cores, and tensor cores) at the same time within a 210-watt power budget, but performance in games without raytracing should be about the same.
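    As a rough sanity check, here is the perf-per-watt arithmetic behind that extrapolation, using only the approximate figures quoted in this thread (100% of hash rate at 320W, ~99% at 225W, ~98% at 210W) - a back-of-the-envelope sketch, not measured data:

```python
# Perf-per-watt arithmetic for the RTX 3080 mining chart discussed above.
# The hash-rate fractions are the approximate figures quoted in this
# thread, not measurements of our own.

FULL_POWER_W = 320
hash_rate = {320: 1.000, 225: 0.989, 210: 0.978}  # power limit (W) -> fraction of full hash rate

for watts, perf in hash_rate.items():
    power_frac = watts / FULL_POWER_W
    print(f"{watts}W ({power_frac:.0%} power): {perf:.1%} performance, "
          f"{perf / power_frac:.2f}x the perf-per-watt of the 320W config")
```

    On these numbers, the 210W configuration delivers roughly 1.5x the perf-per-watt of stock, which is the crux of the argument above.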
     
    seanwee and JRE84 like this.
  34. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    And that, NBR, is what we need more of - unbelievable. You, sir, made my day when you used the scientific method and advanced deductive reasoning to reach a conclusion. We need more can-dos vs. can-nots.
     
    Clamibot likes this.
  35. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Odd. The source article lacks details. They set the power limit - so what? Would the mining algo even be capable of making this particular graphics card draw more than 225W in the first place?
     
    JRE84 likes this.
  36. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Compute loads like mining don't cause the card to draw that much power because much of the graphics hardware lies dormant. The majority of die space is still dedicated to graphics, not compute, as these are gaming cards after all.
     
    hfm, etern4l and JRE84 like this.
  37. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I was about to say this and happened upon your post finally. :)
     
    etern4l likes this.
  38. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Hmm, I thought the majority of die space is taken up by the SMs, which are used by both compute and graphics tasks.

    Anyway, the crypto thing is just one weak data point. A lot could have gone wrong:
    1) PEBKAC during testing
    2) Driver issues
    3) Crypto software issues
    Etc. Not sure how reliable the source is either.
     
    JRE84 likes this.
  39. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    I was under the impression that the CUDA cores could be used for either graphics or compute workloads.
     
  40. Tyranus07

    Tyranus07 Notebook Evangelist

    Reputations:
    218
    Messages:
    570
    Likes Received:
    331
    Trophy Points:
    76
    I find it pretty interesting that power consumption and performance don't scale 1:1.

    [chart: RTX 2080 Max-Q vs. desktop RTX 2080 benchmark comparison]

    The RTX 2080 Max-Q has 40% of the TDP of the full desktop RTX 2080, but delivers between 67% and 97% of the desktop version's performance depending on the test - 80% on average.

    That makes me think that a 210W 3080 for laptops could achieve over 90% of the performance of the 320W desktop version.
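    For reference, the same perf-per-watt arithmetic applied to those Turing figures - only the 40% TDP ratio and the 67-97% performance range from this post go in, everything else is simple division:

```python
# Perf-per-watt arithmetic for the RTX 2080 Max-Q comparison above.
# The 40% TDP ratio and the 67-97% performance range come from the post;
# treat them as approximate.

tdp_fraction = 0.40  # RTX 2080 Max-Q TDP as a fraction of the desktop card's
perf_fractions = {"worst case": 0.67, "average": 0.80, "best case": 0.97}

for label, perf in perf_fractions.items():
    print(f"{label}: {perf:.0%} performance at {tdp_fraction:.0%} power "
          f"-> {perf / tdp_fraction:.2f}x desktop perf-per-watt")
```

    Even the worst case works out to roughly 1.7x the desktop card's perf-per-watt, and the average to about 2x.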
     
  41. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    The main reason power consumption and frequency don't scale 1:1 is that the voltage required to sustain a given frequency rises steeply as frequency increases.

    The formula for the dynamic power consumption of a chip is P = CV^2f, where P is the power in watts, C is the load capacitance, V is the voltage, and f is the frequency.

    There are other factors that contribute to increasing power consumption at a more-than-linear rate. Since more power is required to run at higher frequencies, more heat is generated, and heat increases electrical resistance. Then there's power leakage at the transistor level.

    There are probably more factors, and they all add up to power consumption increasing at more than a linear rate relative to frequency.
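    A minimal sketch of that formula in action, assuming a made-up voltage/frequency curve where voltage rises roughly linearly with clock speed (real GPU V/f curves vary per chip, and the capacitance constant is arbitrary):

```python
# Minimal sketch of dynamic power P = C * V^2 * f with a hypothetical
# V/f curve. The operating points and the capacitance constant are made
# up purely to illustrate the shape of the curve, not any real GPU.

def dynamic_power(c: float, voltage: float, freq_ghz: float) -> float:
    """Dynamic switching power only: P = C * V^2 * f (ignores leakage)."""
    return c * voltage**2 * freq_ghz

C = 100.0  # arbitrary load-capacitance constant, chosen to give watt-like numbers
points = [(0.70, 1.2), (0.85, 1.6), (1.00, 1.9), (1.10, 2.1)]  # (V, GHz)

base_power = dynamic_power(C, *points[0])
base_clock = points[0][1]
for v, f in points:
    p = dynamic_power(C, v, f)
    print(f"{f:.1f} GHz @ {v:.2f} V -> {p:5.1f} W "
          f"({p / base_power:.2f}x power for {f / base_clock:.2f}x clock)")
```

    Because voltage enters squared, the last few hundred MHz cost far more power than they return in clock speed, which is exactly why capping a 320W card at ~210W loses so little performance.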
     
    Biker Gremling likes this.
  42. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    There will always be some red-herring examples of two different graphics cards yielding about the same performance - it just means that the given use case cannot utilise the faster hardware well. The 67% is what matters: 60% power for 67% performance is close to linear scaling. So with an RTX 3080 mobile at 210W (65% of max power), there are few reasons to believe a miracle will happen and it will deliver 90 or 97% of the full desktop card's performance in relevant applications, i.e. ones that can fully utilise the faster hardware. Yes, the performance gap will probably be smaller in non-RT games. I'd just wait for some TimeSpy benchmark results at different power limits for a more reliable guess on that.
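    For what it's worth, a simple proportional model anchored on the data point quoted above (~60% power for ~67% performance) predicts the following for a 210W card - an assumption-laden sketch, not a benchmark:

```python
# Linear-scaling sanity check using the data point quoted above
# (~60% of desktop power yielding ~67% of desktop performance).
# The proportional model is an assumption, not a measured curve.

ref_power, ref_perf = 0.60, 0.67
target_power = 210 / 320  # ~66% of the desktop RTX 3080's power limit

predicted = ref_perf * (target_power / ref_power)
print(f"Linear model: {target_power:.0%} power -> ~{predicted:.0%} of "
      f"desktop performance (vs. the 90-97% hoped for above)")
```

    That works out to roughly 73% of desktop performance, well short of the 90%+ estimates earlier in the thread.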
     
    Last edited: Sep 16, 2020
    Biker Gremling and JRE84 like this.
  43. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    There's always the possibility that 1080p (or finally 2K) remains the standard for gaming on laptops a while longer because of the screen size, so they can afford to gimp mobile GPUs.
     
    thegreatsquare and JRE84 like this.
  44. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    CPU bottlenecks will be even more obvious at that resolution.
     
  45. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    Precisely, which lowers the incentive even more to upgrade from a 2080 to a 3080 if you're gaming at 1080p on your laptop.
     
  46. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Guys, is this a wishful-thinking RTX 20-series owners' club? Which modern games running at 1080p and Ultra settings are bottlenecked by the CPU?
     
  47. Kunal Shrivastava

    Kunal Shrivastava Notebook Consultant

    Reputations:
    82
    Messages:
    110
    Likes Received:
    91
    Trophy Points:
    41
    Battlefield 5 on anything lower than an i7-8700.
     
  48. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Anything lower than an i7-8700 is not really an option in reasonable gaming laptops.
     
    seanwee likes this.
  49. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    FS2020, Crysis Remastered, Ubisoft open-world games (WD2, AC Origins/Odyssey, FC5), KCD.
     
    Biker Gremling likes this.
  50. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,931
    Messages:
    3,535
    Likes Received:
    3,507
    Trophy Points:
    331
    Well done digging up some examples, but it's not clear whether Crysis Remastered qualifies as a modern game.

    https://www.dsogaming.com/pc-perfor...eaded-cpu-issues-just-like-the-original-game/

    There will always be some games classed as "bottlenecked by the CPU", but in many, if not all, cases (as with Crysis) this is just a legacy engine incapable of utilising modern multicore CPUs. A true bottleneck would entail the game running a modern 6-core+ CPU at full power.
     