With the RTX 3080 and 3090 drawing 300-400 watts on 3rd-party boards, will the performance delta between laptops and desktops be even greater than it was with Turing? Assuming Nvidia sticks with the Samsung 8nm node rather than 7nm, it's hard to imagine massive gains, especially since laptops can't accommodate anything more than 200 watts. That seems to be the upper limit for Alienware/MSI/Clevo, because anything higher is very hard to cool.
It makes sense that Pascal was the generation where laptop and desktop performance were closest, given that the GTX 1080 FE had a 180-watt TDP.
My guess is gains on laptops will be a lot more modest, around 30-40% over Turing, and maybe 50-60% better RTX performance with the 2nd-gen RT cores, but not the 2x claimed on desktops.
All this is for the 180-200W variants, which are rare because the majority of laptops opt for Max-Q, where gains might be even lower. Ampere might not get enough breathing room in thin-and-light laptops to have a sizeable impact over Turing.
Considering that Turing still had a manageable TDP, yet some RTX 2080 Max-Q laptops were flat-out beaten by overclocked desktop RTX 2060s, I'm not overly optimistic about the mobile RTX Ampere launch. Maybe Nvidia should go back to the mobility naming scheme rather than mislead people into paying for desktop performance.
Thoughts?
-
Kunal Shrivastava Notebook Consultant
-
30-60% is still a good improvement compared with the "Super" refresh. Also, I wonder whether there are diminishing returns in performance per watt. If so, a GPU running at 400W wouldn't be twice as fast as one running at 200W.
-
I think Clevo and Alienware can handle a 300W GPU easily, so I'm optimistic about seeing 250W+ GPUs in the next generation of gaming laptops.
-
It's been a month or so, but I saw an article on the price/performance ratio of mobile GPUs and how much performance you actually get relative to desktop GPUs.
The "cutoff point" was the RTX 2060/2070 and the 5600M/5700M (though AMD wasn't featured in that article, which is strange considering their mobile Navi GPUs have identical specs to their desktop counterparts, with only slightly lower clocks and memory bandwidth, which would put them within 5-10% of each other).
So, with very large power requirements on the desktop for Ampere, it remains to be seen how performance will scale with lower power envelopes.
Vega 56 initially had very large power demands on desktop, but we know that was because its voltage was set too high from the factory (AMD does this to increase the number of functional dies)... and when Acer put Vega 56 into the PH517-61, they retained 95% of its performance while dropping its TDP to 120W (the stock-clocked version in my laptop doesn't even reach the full 120W, which means I can easily overclock both core and HBM and gain about 10% or more performance before hitting the 120W limit; that's very efficient).
Something similar could happen here... however, NV (to my knowledge) isn't known for overvolting their GPUs out of the factory to increase the number of functional dies like AMD does.
That said, mobile GPUs are usually better binned than their desktop versions, so NV will likely just drop clocks and voltages by certain amounts to meet the lower power limits... not sure how much of a performance drop there will be, but we'll see.
Last edited: Sep 6, 2020 -
-
Basically, 320W can be shrunk.
-
Bear in mind that you won't always see top-notch cooling on laptops that use all-AMD hardware, for example (my Acer PH517-61 is an exception)... OEMs prefer to invest more quality into Intel/NV designs for some reason, despite AMD having equally or more efficient hardware. -
Kunal Shrivastava Notebook Consultant
You probably have a vapour chamber on your SLI laptop, but even then I'm surprised you can hold 85°C on dual cards. From what I've read, dual-1080 laptops could be sold as portable nuclear reactors.
-
Yeah, I think the vapor chamber does the trick; the GPU cooling system in my laptop is pretty good. The sad thing is that in the current gen (PS4/Xbox One gen) hardly any game supports SLI, so it's like having your second engine always shut down on a ship. The other sad thing is that there's no easy way to raise the power limit (and hence push the overclock further), which is a shame because the cooling system can handle more power (heat to dissipate).
-
-
lol yeah, but there's no way I'm going to do that... back in the day it was pretty simple with NVFlash.
-
Unless there is a major breakthrough in laptop cooling, Ampere is going to be lackluster on mobile. It has hefty power requirements, so Turing will largely continue to feature thanks to its lower TDP.
The 200W laptop RTX 2080 should still be among the more powerful variants available. Even if they did shrink the 3070 and cap its TDP, it would be cut down enough to be barely "better" than the 2080. Perhaps the new mobile GPUs will have the same raster performance but just improved RTX.
I think I am officially done with laptops for high-end gaming. I will only focus on laptops for work and "mild" gaming at best. -
-
I don't know; I went with Pascal's 1060 plus an eGPU for future power. Losing 30 percent on a 4080 won't be all that bad, and it's cheaper in the long run. Gaming is moving toward 8K in the next decade and I'm more than happy with 1080p... but if this trend continues, I'm also officially done with gaming laptops.
-
Mobile graphics as Max-P has, in short, been pushed back to low-powered M branding with less performance than the desktop cards (my 980N has a higher TGP than Nvidia's desktop 980 and performs better at stock speeds). Nvidia started to screw up their whole mobile graphics lineup when they pushed out Max-Q. We now have two low-powered M cards, one more crippled than the other. Near-equal TGP between mobile Max-P and desktop cards of the same lineup has been thrown into oblivion and will most likely never come back. Thanks Nvidia, you tried but failed.
Last edited: Sep 7, 2020 -
-
-
Since the 3070 has a TGP of 220 watts, wouldn't we get near-desktop-level performance with this card in laptops? I'm assuming the card would be limited to 200 watts in its Max-P variant. That makes the most sense to me.
The 3080 doesn't even seem worth releasing on laptops, since its performance would be crippled immensely versus the 3070 or lower. The 3070 and any card below it should be capable of desktop-level performance in their respective Max-P variants.
The 3070 should be the top dog for laptops this generation. Keep performance within 5% of desktop levels with the 3050, 3060, and 3070, and we'll have some wicked awesome laptops. The 3080 should be relegated to desktops only. -
Kunal Shrivastava Notebook Consultant
Pascal did well on mobile because Nvidia had perfected their shader cores over multiple generations. With the fancy Tensor and RT cores taking upwards of 50% of die space, it's no wonder TDP went up again with Turing.
Case in point: the laptop GTX 1660 Ti and RTX 2060, and I'm not even talking Max-Q here. The 1660 Ti is very close to its desktop counterpart; the 2060 lags by quite a bit. Look at how close the rasterized performance is between these two cards on laptops:
It all comes down to the exotic 1st/2nd-gen hardware these cards are packing. -
-
Nothing special on either GPU: Thermal Grizzly paste and generic thermal pads. -
-
Heh, we'll even have issues with NVMe, as the drives become ever more performant and require better cooling; those drives simply won't work in the current laptop market, haha.
Could big, strong, bulky laptops make a return? It would be cool if they stepped away from soldered-on CPUs/GPUs, haha; a full return to actually portable full-performance PCs.
I got carried away. They will make a full 3090 GPU core capped at 80W, call it the best ever, run it at 500MHz, and watch it perform like a 2070 on laptops. -
-
Kunal Shrivastava Notebook Consultant
Lol.
Next-gen consoles will outdo most $3k laptops, what a shame. -
NVIDIA GeForce RTX 3080 cryptocurrency performance videocardz.de
MSI Prestige and Summit laptops add 11th-gen Tiger Lake, PCIe 4.0 SSDs and more pcworld.com
MSI's new Summit laptops aim at business, while the Prestige gets colorful. And yes, it's an Oprah "You get PCIe 4.0 SSD too" moment.
Like MSI’s Stealth 15M, the company will take advantage of the faster PCIe 4.0 in the 11th-gen Tiger Lake by including x4 PCIe 4.0 SSDs in the Summit E15, E14, and B14. In the Summit B15, MSI pairs a PCIe 3.0 NVMe SSD with a PCIe 4.0 SSD.
See my older post... http://forum.notebookreview.com/threads/nvidia-thread.806608/page-248#post-11045765
Last edited: Sep 15, 2020 -
-
-
-
I find those results very interesting. If we can get about 98% of the card's performance at only 65% of its rated maximum power draw, laptops won't be screwed on performance. Of course this is mining performance rather than gaming, but I'd expect the results to be similar. For gaming, you can normally get about 70% of the performance at half the rated maximum power draw.
If we extrapolate the results from the crypto mining chart Papusan provided to gaming performance, it seems plausible that we could get 98% of a 3080's performance at 65% of its power draw. The vast majority of games rely exclusively on rasterization performance from the graphics cores, and 210 watts is probably enough to run all 8704 cores at almost full throttle while supplying minimal power to the idle ray tracing and tensor cores.
In ray-traced games, the performance deficit versus the desktop cards will probably be significant, since you won't be able to fully load all the hardware (graphics cores, ray tracing cores, and tensor cores) at the same time on a 210-watt power budget, but performance in games without ray tracing should be about the same. -
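To make the diminishing-returns point concrete, here's a toy Python sketch. The (320 W, 100%) and (210 W, 98%) points are the figures discussed above; the 160 W point is purely hypothetical, added only to illustrate that perf-per-watt typically peaks well below the maximum power limit:

```python
# Relative performance at different power limits.
# (320 W, 100%) and (210 W, 98%) are from the thread; 160 W is hypothetical.
points = [(320, 100.0), (210, 98.0), (160, 88.0)]

for watts, perf in points:
    print(f"{watts:3d} W -> {perf:5.1f}% perf, {perf / watts:.3f} %-per-watt")

# Efficiency (perf per watt) peaks at the lowest power limit here,
# which is why power-capped laptop parts can stay surprisingly competitive.
most_efficient = max(points, key=lambda p: p[1] / p[0])
print("Most efficient point:", most_efficient)
```

The last ~2% of performance costs over 100 extra watts in this sketch, which is the whole argument for aggressive laptop power caps.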
-
-
-
-
Anyway, the crypto thing is just one weak data point. A lot could have gone wrong:
1) PBUAK during testing
2) Driver issues
3) Crypto software issues
Etc. I'm not sure how reliable the source is either. -
I was under the impression that the CUDA cores could be used for either graphics or compute workloads.
-
I find it pretty interesting that power consumption and performance don't scale 1:1.
The RTX 2080 Max-Q has 40% of the TDP of the full desktop RTX 2080, but delivers between 67% and 97% of the desktop card's performance depending on the test, 80% on average.
That makes me think a 210W laptop 3080 could achieve over 90% of the performance of the 320W desktop version. -
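That estimate can be sanity-checked with a crude power-law model. Assuming relative performance scales as (P/P_max)^alpha, the RTX 2080 Max-Q figures above (40% of the TDP, ~80% of the performance) pin down alpha, and we can then plug in a hypothetical 210 W laptop 3080 against the 320 W desktop card. This is a back-of-the-envelope sketch, not a real scaling law:

```python
import math

# Fit perf = (P / P_max) ** alpha to the RTX 2080 Max-Q data point:
# 40% of the desktop TDP yielded ~80% of the desktop performance.
alpha = math.log(0.80) / math.log(0.40)  # ~0.24

# Predict a hypothetical 210 W laptop 3080 vs. the 320 W desktop card.
predicted = (210 / 320) ** alpha
print(f"alpha = {alpha:.2f}, predicted relative performance = {predicted:.0%}")
```

With these numbers the model lands at roughly 90%, in line with the estimate above.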
The formula for dynamic power consumption of a CPU is P = CV^2·f, where P is the power in watts, C is the load capacitance, V is the voltage, and f is the frequency.
There are other factors that contribute to increasing power consumption at a more than linear rate. Since more power is required to run at higher frequencies, that generates more heat, and heat increases electrical resistance. Then there's power leakage at the transistor level.
There are probably more factors, and they all add up to power consumption increasing at a more-than-linear rate relative to frequency. -
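As a quick illustration of the P = CV^2·f relationship (with made-up capacitance, voltage, and frequency values, since real figures vary per chip): because a frequency reduction usually permits a voltage reduction too, power falls much faster than linearly.

```python
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Dynamic switching power: P = C * V^2 * f, in watts."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Illustrative (made-up) operating points:
base = dynamic_power(1e-9, 1.00, 1.8e9)     # full clocks
scaled = dynamic_power(1e-9, 0.85, 1.44e9)  # -20% clock, -15% voltage

print(f"base: {base:.2f} W, scaled: {scaled:.2f} W")
print(f"power ratio: {scaled / base:.0%}")  # ~58%: a 20% clock cut saves ~42% power
```

The same effect in reverse is why the last few hundred MHz on a desktop card are so expensive in watts.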
Last edited: Sep 16, 2020
-
BrightSmith Notebook Evangelist
There's always the possibility that 1080p remains the standard for laptop gaming a while longer (or finally 2K), given the screen sizes involved, so they can afford to gimp mobile GPUs.
-
-
-
Guys, is this a wishful thinking RTX 20 series owner's club? Which modern games running at 1080p and Ultra settings are bottlenecked by the CPU?
-
-
-
-
https://www.dsogaming.com/pc-perfor...eaded-cpu-issues-just-like-the-original-game/
There will always be some games classed as "CPU-bottlenecked", but in many, if not all, cases (as with Crysis) this is just a legacy engine incapable of utilising modern multicore CPUs. A true bottleneck would entail the game running a modern 6-core+ CPU at full power.
How will Ampere scale on laptops?
Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.