Considering how well Ampere seems to undervolt, I'd expect that to happen. I believe the point where it starts to get really inefficient is above 1850 MHz, and it costs an extra 100 watts to push to 2000 MHz (for the 3080). I'm fine taking a minor (10% or less) performance hit if it cuts the power draw by a third.
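For a rough sense of what that trade-off looks like in perf-per-watt terms, here's a quick illustrative sketch using the ~320W reference TGP quoted later in the thread and the worst-case 10% hit:
```python
# Rough perf-per-watt sketch for the undervolt trade-off described above (illustrative only).
stock_power_w = 320                        # reference RTX 3080 TGP mentioned later in the thread
undervolted_power_w = stock_power_w * 2/3  # "cuts the power draw by a third" -> ~213 W
perf_retained = 0.90                       # assume the full 10% performance hit

gain = (perf_retained / undervolted_power_w) / (1.0 / stock_power_w)
print(f"perf/W after undervolt: {gain:.2f}x stock")  # ~1.35x
```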
-
Remember, a bellwether is an individual who either leads or indicates trends. -
-
Don't some of the custom 3080 desktop cards come with a 380W TGP? 150W cards will have around 2.5 times less TGP. And I'm not so sure the 20GB 3080, "if it happens", will be at 320W. It all boils down to Big Navi. -
That's all assuming I was daft enough to get an Alienware in the first place. -
150W + 15% ≈ 172.5W
200W + 15% = 230W
And you have the unified heatsink: more heat from the GPU will affect the CPU, and so on. Imagine if a desktop graphics card had to share its cooling with the CPU in a desktop. Nasty? -
Heatsinks aren't built based on the TDP requirement. They are built to the best the laptop chassis can fit, then sometimes downgraded and cost-reduced (smaller with fewer heatpipes [Asus Zephyrus], fewer heatpipes [MSI GE, Asus Scar]) depending on the SKU. -
Based on the fact that the TGP of the 3070 is around 220W and the TGP of the 2080 Ti is around 250W, and the performance of the two is about equal, it doesn't really seem like the 3070 is that much more power efficient at the same performance level. Are we really going to see Ampere perform that much better than Turing at the same TDP? It seems like a marginal improvement in efficiency but nothing staggering, unless I'm completely misunderstanding something. Seems like around a 13% power improvement for the same performance level.
The big draw of the Ampere desktop cards seems to be that they are WAY CHEAPER for the same or better performance than Turing; it doesn't really look like the perf/watt is that much better.
EDIT: also, the 2080 Ti is powering 3GB more VRAM. That makes it look even worse, as an 8GB 2080 Ti would shave a small amount off that 250W and probably not see any performance decrease if the memory bus could stay the same width.
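Putting rough numbers on that (a quick illustrative sketch using the approximate TGPs above and assuming roughly equal performance; it lands in the same ~12-14% range depending on which card you take as the baseline):
```python
# Iso-performance power comparison from the approximate TGPs above (illustrative only).
tgp_3070_w = 220
tgp_2080ti_w = 250

print(f"3070 draws ~{(1 - tgp_3070_w / tgp_2080ti_w) * 100:.0f}% less power")     # ~12%
print(f"2080 Ti draws ~{(tgp_2080ti_w / tgp_3070_w - 1) * 100:.0f}% more power")  # ~14%
```
-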
I’ve been saying this since the Ampere presentation. All they did was feed the core more power compared to Turing. The efficiency gains are so small, which means laptops are probably going to see the worst “improvement” in a “generation” yet. Fermi to Kepler, Kepler to Maxwell, and Maxwell to Pascal were huge. Pascal to Turing was meh, and now Turing to Ampere is even more meh when you look at performance vs. power consumption. -
GaN chargers are here, so much more powerful adapters can be made without making them bigger. And pathetic laptop heatsinks need to go; they can be much, much better than what we have now. We are far from any physics limitations, just limitations in innovation. -
Probably why they are chomping at the bit to go to TSMC 7nm to squeeze out a little more power efficiency.
-
More like +20%, increasing to around 25% better power efficiency depending on the resolution.
https://www.igorslab.de/nvidia-gefo...mpere-kann-auch-sparsam-und-knuffig-klein/12/
-
At any rate, TDP for TDP and frame for frame, we may be looking at roughly 30% or so more performance at the exact same TDP, which would be interesting, though I'm not sure it would be enough for people to suddenly dump their Turing notebooks.
The decision is probably clearer on desktop, where TDP doesn't mean as much if people already have the PSU overhead available. I guess if you also had to buy a new PSU, it might make the value proposition a little less interesting. -
Kunal Shrivastava Notebook Consultant
-
Undervolted AMD RX 6800 with 183 W TGP offers clues about an upcoming RX 6800M laptop-grade GPU matching the desktop RTX 2080 Ti
http://forum.notebookreview.com/thr...00-gpus-on-oct-28.834078/page-8#post-11059699
From the link...
We know that Nvidia is already planning to release mobile versions of the RTX 3000 series, but we are not really sure if Team Green can provide decent performance gains over the 2080 series given the limitations imposed by the 8 nm fabrication process from Samsung. AMD could have a big advantage here, but it also needs to be mindful of its pricing schemes.
Maybe we will see a huge failure from both Nvidia and AMD for the coming gamingbooks. Wouldn't it be nice to have yesterday's desktop performance in tomorrow's notebooks, LOL
-
Rumors....
NVIDIA GeForce RTX 3070 / 3080 Mobile for AMD Ryzen 5000H CPUs videocardz.de
3080 Mobile as castrated chips. Back to M-branded chips (not equal to the desktop chips, as Maxwell-Pascal-Turing were), but with the Max-Q stinker as an awful attachment in the M graphics lineup. Now with double M versus the old days. Nice, Nvidia sneaked in Max-Q to increase their already high profit, and to help the notebook manufacturers create Apple-designed gamingbooks.
Exclusive: NVIDIA GeForce RTX 3000 Mobility Series Lineup, Roadmap 1H 2021 And AMD CPU Support! wccftech.com
We have been told that the RTX 3080 and RTX 3070 mobility GPUs will be fast-tracked to land in January 2021, however, the GPUs will initially only be available in ASUS and MSI flavors. HP, Dell, and Lenovo have reportedly decided to wait until April to support these new GPUs as they want to pair it with launches that are happening then.
My source also mentioned that they expect the RTX 30 mobility series to have a very short life cycle (considering there are rumors of NVIDIA planning to shift to 7nm TSMC for its SUPER variants of the RTX 30 lineup).
-
ratchetnclank Notebook Deity
-
Sent from my GM1913 using Tapatalk -
-
-
https://www.reddit.com/r/nvidia/comments/jn878f/small_beginners_undervolting_guide_for_rtx_3070_fe/
This one says that it's around 140W for 1900 MHz:
https://www.reddit.com/r/MSI_Gaming/comments/jkvdwd/msi_rtx_3070_ventus_2x_oc_undervolt_results_wow/
And 175-180W for 1800-1900 MHz:
https://www.reddit.com/r/nvidia/comments/jkdib3/quick_rtx_3070_fe_undervolting_results_85_pl187w/
180W after undervolting for around 1800 MHz; I didn't really expect the GPU to be THAT power hungry, but more or less it seems to be :/
So, considering the 80W rumored on those websites, what are those 3070s going to clock in at, 1200 MHz with maybe a 1300 MHz boost? Or is Nvidia going to use the good 7nm dies (not even sure they make the GA104 on 7nm..) for laptops?..
That, or they're saving the really good dies, but even at 140W for 1900 MHz, how will that end up after being capped to 80W? :/
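For what it's worth, here's a very rough back-of-the-envelope sketch of where an 80W cap might land, starting from the 140W / 1900 MHz result above. The cube-law scaling (dynamic power dominating, voltage roughly linear with clock) is a crude assumption on my part, so treat the result as a ballpark only:
```python
# Crude clock estimate at an 80 W cap (illustrative; assumes P ~ f * V^2 with V ~ f, i.e. P ~ f^3).
ref_power_w = 140      # reported undervolted desktop 3070 draw from the link above
ref_clock_mhz = 1900   # clock at that draw
cap_w = 80             # rumored mobile power limit

est_clock_mhz = ref_clock_mhz * (cap_w / ref_power_w) ** (1 / 3)
print(f"~{est_clock_mhz:.0f} MHz at {cap_w} W")  # roughly 1550-1600 MHz under these assumptions
```
-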
-
Otherwise, yes, I agree they are being misleading and are doing false advertising (and yet, no one is doing anything to stop them). -
-
-
yrekabakery Notebook Virtuoso
-
-
You'll need to wait for the normal mobile 3080 and see how it behaves.. the Max-Q 3080 is basically a 3070 or less..
-
-
HaloGod2012 Notebook Virtuoso
Benchmarks today show the 2080 Super and 3060 Ti being roughly the same and trading slight blows. If the mobile 3080 only does desktop 3060 Ti performance, then I see little reason to upgrade. -
If it does, sucks ass -
-
-
I wouldn't be amazed if the RTX cards for Clevos pushed 250W, to be honest, since we have two bricks for the X170SM-A.
-
The 200W Mobile 3080 (not the crippled Max-Q variant) should easily beat the 2080 Ti.
https://www.kitguru.net/components/...nvidia-rtx-3060-ti-founders-edition-review/4/ -
Kunal Shrivastava Notebook Consultant
-
Kunal Shrivastava Notebook Consultant
@joluke
Bro, how did you get a stable 200mV undervolt on that CPU?! I can't for the life of me push past 120mV on my 9900KF (it crashes in Battlefield). -
The trend for gamingbooks goes the opposite way, bruh. Thinner, slimmer, and disgustingly neat, like Applebooks. See the class average for Gaming.
-
-
-
-
-
See the power consumption of the 3070 vs. the 3060 Ti. It's the opposite of what the specs would suggest.
https://www.guru3d.com/articles_pages/geforce_rtx_3060_ti_founder_edition_review,6.html
Edit: A 200W Mobile 2080 Super scores around 10,800 in Time Spy graphics; the 3060 Ti scores around 12,000. I doubt that a major graphics upgrade will only provide around an 11% increase in graphics performance. Going from Turing to Ampere isn't the same as going from the 2080 to the 2080 Super Mobile.
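A quick check of that gap from the scores quoted above:
```python
# Percentage gap between the Time Spy graphics scores quoted above (illustrative).
ts_2080_super_mobile = 10800  # ~200W Mobile 2080 Super
ts_3060_ti = 12000            # desktop 3060 Ti

print(f"~{(ts_3060_ti / ts_2080_super_mobile - 1) * 100:.0f}% higher")  # ~11%
```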
-
https://www.igorslab.de/en/nvidia-g...way-and-point-on-quiet-path-from-this-claw/9/
The biggest problem to me is the power limit of 200W. It wouldn't be the first time that laptop users get only a minimal performance improvement between generations. I remember the 780M ---> 880M, or the 1080 ---> 2080 (150W).
Ampere is a huge jump in performance, but also a huge jump in power consumption: from 215W for the RTX 2080 to 320W for the RTX 3080. The real jump in performance in laptops might come with the 7nm process in mid-2021. With 7nm, Nvidia would probably be able to get RTX 3070 / RTX 2080 Ti performance at 200W.
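For scale, that generational jump works out to roughly half again as much board power (quick check using the figures above):
```python
# Generational board-power jump from the figures above (illustrative).
tdp_rtx_2080_w = 215
tdp_rtx_3080_w = 320

print(f"~{(tdp_rtx_3080_w / tdp_rtx_2080_w - 1) * 100:.0f}% more board power")  # ~49%
```
-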
To be honest, I'm really considering getting an RTX 2080 for my P775DM3-G and a 9900KF and just investing the rest in a desktop. If it's going to be such a crappy leap in performance (~12%), it's not worth giving so much money for a new laptop. We'll see.
-
This is also from Igor's Lab: an undervolted 3070. Expect that binned silicon for mobile graphics will run even more efficiently... http://forum.notebookreview.com/thr...scale-on-laptops.834039/page-12#post-11054740
Ampere Architecture Whitepaper -
-