Not this bad!
You nailed it. Back to the sins of the old days, now with M-branded crippled cards in all sorts of flavours. Why not just brand all the new mobile cards Max-Q but with different TGPs? Max-P is, in short, gone now that Nvidia has gone back to cut-down silicon. They could have used the same die for desktop and Max-P, but nope. The mobile cards are their milking cow.
It will be fun to see how the mobile 3080 competes with 270W desktop 3070 cards.
https://videocardz.com/newz/emtek-a...080-and-rtx-3070-hv-black-monster-cod-edition
It looks like Nvidia wants to destroy all the mobile cards for this generation.
NVIDIA GeForce RTX 3060 Mobile would actually feature fewer Streaming Multiprocessors than its Turing predecessor. The card is allegedly equipped with 3072 CUDA cores, which means it has 24 SMs, as opposed to the 30 enabled on its Turing predecessor, the RTX 2060 Mobile.
https://videocardz.com/newz/alleged...and-rtx-3060-mobile-gpu-specifications-emerge
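A quick sanity check on those core counts, assuming the usual per-SM CUDA core counts (64 FP32 cores per SM on Turing, 128 on Ampere); the mobile figures are the leaked ones from the article above, not confirmed specs:
[CODE]
# Hypothetical sanity check -- per-SM core counts are the standard
# architectural values; the mobile core counts are the leaked figures.
CORES_PER_SM = {"turing": 64, "ampere": 128}

def sm_count(cuda_cores: int, arch: str) -> int:
    """Number of enabled SMs implied by a CUDA core count."""
    return cuda_cores // CORES_PER_SM[arch]

print(sm_count(1920, "turing"))  # RTX 2060 Mobile -> 30 SMs
print(sm_count(3072, "ampere"))  # rumoured RTX 3060 Mobile -> 24 SMs
[/CODE]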
-
electrosoft Perpetualist Matrixist
Agreed, this is how it was for quite some time (mobile parts were neutered, down-clocked desktop parts) until we had a brief window of rough thermal and power parity that gave us equal or near-desktop levels of performance in our mobile rigs.
With Nvidia pushing Ampere to the edge power-consumption-wise (thanks AMD!), there is no way we're going to cram 350W+ GPU parts into a laptop.
It was kind of to be expected. -
They may be crippled, but won't it still be something like a 40 percent improvement?
-
yrekabakery Notebook Virtuoso
-
That doesn't mean they had to use cut-down silicon for the 3080 flagship, or the slower VRAM chips. Just cap the boost clock, as they have always done. But that wouldn't help boost sales of the next-gen 4xxx mobile cards. Cripple it massively so people really want the next scam. I'm sure they got a lot of love from the notebook manufacturers for the Ampere mobile lineup. Doomed to help increase sales of next-gen gamingbooks.
-
Why so little? I thought the 3080 was like a desktop 2080 Super or Ti in terms of cores and clock speeds.
Also, I should ask what you are basing your estimates on. Could you post a link? 20 percent seems like a bad move for Nvidia, since they are in the market to make money, and if it were only 20 percent I don't see many people jumping ship.
I'm not saying you're wrong, I'm just saying it's a surprisingly poor decision. -
It's not a decision. It's physics. The Ampere architecture is very power hungry, but there are hard thermal and power constraints on what you can fit into a laptop. Things have to be cut in order to meet those requirements. This shows a weakness in Nvidia's engineering; efficiency has been going down the toilet since Pascal.
-
yrekabakery Notebook Virtuoso
The 3080M is using the fully enabled GA104 core (a marginal 2 SMs more than the desktop 3070) but clocked ~10% lower due to the 150W power limit. That would put the 3080M around desktop 2080 Super or 3060 Ti level performance (rough math below). We've already had 200W 2080 and 2080 Super cards in laptops for the past couple of years, which perform identically to the desktop 2080, so a desktop 2080S/3060 Ti level 3080M is hardly an improvement worth getting up for.
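Rough back-of-envelope version of that estimate, treating shader throughput as SM count times clock and ignoring memory bandwidth; the clock values are illustrative assumptions, not measured boost figures:
[CODE]
# Crude scaling estimate: throughput ~ SMs x clock.
# Clock values are assumptions for illustration only.
def relative_throughput(sms, clock_mhz, ref_sms, ref_clock_mhz):
    return (sms * clock_mhz) / (ref_sms * ref_clock_mhz)

desktop_3070 = (46, 1900.0)        # 46 SMs, ~1.9 GHz typical boost (assumed)
mobile_3080  = (48, 1900.0 * 0.9)  # full GA104, ~10% lower clock at 150W (assumed)

ratio = relative_throughput(*mobile_3080, *desktop_3070)
print(f"~{ratio:.2f}x of desktop 3070")  # ~0.94x, roughly 2080 Super / 3060 Ti territory
[/CODE]
-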
Oh, OK, so why on earth is Nvidia doing this? And I thought the Super variants were only 5-10 percent faster.
-
Money; throwing power at it was the cheapest route forward and will cost no sales because they're in a duopoly and sales are guaranteed.
-
yrekabakery Notebook Virtuoso
More like monopoly. AMD hasn't made a mobile GPU worth caring about in almost a decade.
-
I heard the 6900 XT matches and sometimes beats a 3090 at 4K... hmm, probably rigged results, but at least it's something. Also, I was thinking of just doing an eGPU setup, but these cards are next to impossible to buy.
-
electrosoft Perpetualist Matrixist
They had to either severely underclock the 3080 to hit a 150-200W power envelope or use the 3070, which requires less power out of the gate. The 3000 series is power hungry, especially the 3080 and 3090. You're talking about a reduction from ~350W+ to 150-200W for a true mobile 3080. That's pretty substantial. In the end, it may have been a wash benchmark-wise: a heavily capped 3080 vs. a nearly normalized (~10% slower at 150W) 3070 running in a mobile envelope.
Performance-wise, it looks like another big ol' round of meh. The 3080M (aka the artist formerly known as the 3070) will bring 2080 Super to 2080 Ti (ish) level performance, which is not a large step over mobile 2080 Super performance, but it is an upgrade... technically. Then again, mobile 2080 to mobile 2080 Super was severely disappointing too. I do think a 200W version of the 3080M would equal a full-fat stock 3070 or better. Something like that in an X170 (or even an Alienware 51m R2) would be decent.
When I read about shoehorning 3080 hardware into laptops while grossly capping clocks and memory speed, all I see is a small subsection of overclockers bemoaning the fact that they won't be able to take a capped "real" 3080, tinker with it, and shunt it to push it as close to normal desktop speeds as possible. "Don't restrict me in any way, bro..."
As for Nvidia being a money-grubbing corporation... what else is new? Instead of innovating for mobile, they opted to ramp up desktop power requirements for Ampere with no comparable mobile solution besides taking a mid-tier card and relabeling it, old-school Nvidia style. With AMD hot on their heels and a viable threat, they can't hold anything back on the desktop, where they're not constrained, but mobile solutions will suffer if the goal is desktop parity. -
electrosoft Perpetualist Matrixist
The 6900 XT is the real deal for pure rasterization performance, but it really is just a 6800 XT with 8 more CUs enabled, with subpar RT and no DLSS to boot. Overall, averaged across a wide selection of games, I think the 3090 came out ahead by ~4%? The 6800 XT, 3080, 6900 XT, and 3090 are so close in so many games without RT or DLSS enabled that any of them will work. Of course, if RT or DLSS come into play, AMD falls behind quickly.
If you go AMD, the 6800 XT is definitely the sweet spot, but I'd still take a 3080 over a 6800 XT...
...even with 10GB and modern games knocking on that VRAM limit (that one was for you, @yrekabakery!)
-
I wonder how the maxed-out GA104 in the mobile 3080 (200W Max-P) will perform vs. a power-gimped real desktop 3080 card running at 200W.
It's all about RT performance nowadays, as Nvidia said so nicely to Hardware Unboxed,
and not comparable max FPS or high numbers in 3DMark Fire Strike and Time Spy.
-
yrekabakery Notebook Virtuoso
I've come around to the notion that the RX 6800/6900 are overpriced. They simply lack too much in features compared to Ampere. The more RT effects a game employs, the further they fall behind. They don't even match Turing in fully path-traced games like Quake II and Minecraft. The 6800 XT performs half as fast as the lowest RTX card, the 2060, in Minecraft when DLSS is enabled, lol. And that's another thing: without DLSS, AMD falls even further behind in both RT and rasterization.
This is probably why CDPR didn’t enable ray tracing support on AMD GPUs in Cyberpunk. Cyberpunk is the most comprehensive ray-traced AAA title, and requires DLSS for acceptable performance with RT enabled. The performance with RT enabled on AMD would just be too poor and hurt CDPR’s image. There’s no technical reason otherwise that Cyberpunk’s RT shouldn’t work on AMD, since it just uses the universally supported DXR feature of the DX12 API.
Not to mention, AMD can’t compete with the Turing/Ampere NvENC encoder when it comes to image quality at low, livestream-level bitrates. And performance at 4K+ favors Ampere’s float-heavy SM design and much higher memory bandwidth, allowing it to brute force its way past AMD’s Infinity Cache and much lower bandwidth.
16GB of VRAM is nice, but I'd rather have 10GB of much faster memory. I'm of the opinion that the 3080 will hit a bottleneck in raw performance before memory capacity, even in future games. -
Well, Nvidia could have used the GA102 chip at lower clocks, but there wouldn't be any benefit, since they would have to clock the GA102 so low that it would only match a GA104 anyway (rough sketch below). And Nvidia will charge $1200 for a mobile GA104, which brings them more profit than delivering a mobile GA102 at lower clock speeds. Also, GA104 supply seems to be a bit better than GA102 supply, so it was in fact an easy choice for Nvidia to just use the GA104 instead of the GA102.
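As a rough sketch of that "no benefit" point, here is a first-order model where both board power and shader throughput scale with active SMs times clock (it ignores voltage scaling, which in reality gives the wider die a small efficiency edge); SM counts are the desktop 3080's GA102 (68) and the full GA104 (48), and the GA104 clock is an assumed mobile boost:
[CODE]
# First-order model: power and throughput both ~ SMs x clock.
# Purely illustrative -- ignores voltage/frequency curves and memory power.
GA104_SMS, GA102_SMS = 48, 68
ga104_clock = 1700.0  # assumed mobile boost clock at ~150W

# Clock the bigger die could hold under the same modelled power budget
ga102_clock = ga104_clock * GA104_SMS / GA102_SMS

print(f"GA102 clock at equal power: ~{ga102_clock:.0f} MHz")  # ~1200 MHz
print(f"Throughput vs GA104: "
      f"{(GA102_SMS * ga102_clock) / (GA104_SMS * ga104_clock):.2f}x")  # 1.00x -- no gain
[/CODE]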
I'd highly recommend everybody skip this mobile generation and wait for the Super refresh, or whatever name Nvidia gives the 7nm Ampere cards that are expected to release in Q3-Q4 of 2021. At a 150W cooling limit, going from 8nm to 7nm should make an interesting difference in performance.
Interestingly enough, the desktop RTX 3080 has a wonderful performance-per-watt ratio, which is by definition "efficiency". I blame laptop engineers for not being able to find better cooling solutions that would allow laptops to use 300+W GPUs. They can barely handle 150W in the so-called "gaming" laptops, although large Clevo laptops can easily handle 250W GPUs.
Well, to be fair, the RTX 3080M should have a good 40% performance improvement in RT over the mobile 2080 and 2080S, thanks to the new 2nd-gen RT cores, plus improved DLSS from the 3rd-gen Tensor cores. But in regular rasterization the difference should be more like 20-25%.
My friend, I wouldn't touch AMD unless you can't get your hands on an RTX graphics card. The RTX 3080 is the sweet spot of this generation: it has all the performance of the RTX 3090/6900 XT minus 5-10% depending on the resolution, a good price of $700, it beats the hell out of AMD on RT, and you get Tensor cores for AI/DLSS and the broadcasting suite. A really nice product. And if you're willing to spend $1000, just wait for the RTX 3080 Ti, but the performance jump from an RTX 3080 will be something like 5-7%; you'd go for the 20 GB of GDDR6X more than anything else with the RTX 3080 Ti.
Nice video. I wonder what the numbers would have been with a 150W cap. Probably more like a 25% performance decrease at 1080p and a 50% decrease at 1440p. We need better cooling solutions on laptops. That's the key, my friend. -
yrekabakery Notebook Virtuoso
No, they wouldn't have used the GA102 chip anyway, since mobile GeForce has never had a wider than 256-bit bus. -
Does anyone have an ETA for the mobile 3080?
-
Really, there are two issues here preventing the full-fledged 3080 from making it into laptops.
1) There has never been a mobile GPU with a memory bus wider than 256-bit. The challenge of cramming that onto an MXM board or laptop motherboard is immense. This is the same reason laptops never got NVLink support: simply not enough real estate on the PCBs.
When I saw that the 3080 was a GPU with a >256-bit bus, I knew it was never going to be put in a laptop.
2) For nearly a decade now, the average consumer has been buying thin-and-light "gaming" laptops with fervor. They've shown that they don't care about maximum performance, are perfectly happy with the computer being disposable (BGA components), and will pay $2-2.5k+ every year for the newest model.
If that is what the consumer wants, then why would NVIDIA or any other laptop manufacturer undertake the extremely expensive task of trying to cram a GPU with a >256-bit bus into a laptop and then cool it at 300 watts? Consumers have said they don't want that.
I blame the consumer more than the companies. It's not like there weren't other options out there for the past several generations. If the average consumer actually wanted performance, then everyone would be rocking Clevo P870s (or similar) with their 500+ watts of GPU cooling capacity, and thin-and-light "gaming" laptops would be a niche.
-
I found something interesting, but take it with a grain of salt:
NVIDIA GeForce RTX 3080 Mobile GPU - Benchmarks and Specs - NotebookCheck.net Tech
It shows the mobile 3080 above the RTX 6000 (aka 2080 Ti);
it should be close to, if not below, it.
Median: 31,058 in Fire Strike graphics. -
https://www.notebookcheck.net/NVIDI...ional-Graphics-Card-for-Laptops.434515.0.html
They could if they wanted to. And none of them use standard MXM. -
I agree in the main, but I don't know too many people who are buying $2k laptops annually, and thin laptops aren't popular without reason. To put some perspective on it, the average laptop sale price in 2019 was just 630 dollars, and that number has been relatively constant in recent years. There are probably a lot of reasons for that beyond affordability; need comes to mind in the context of this conversation. A lot of people aren't gamers, and many who are PC gamers have a desktop pressed into that duty. Nowadays the only reason the vast majority of people would need a big, heavy laptop is gaming, since almost any modern laptop can make quick work of most other tasks.
Then you've got road warriors; these people need something that is actually portable. It's one thing to schlep an 11 lb DTR out to the car, into the office, back to the car, and home again. It's another thing entirely to carry it through multiple airports while taking 5k, 10k, or even more steps going from point A to B. I can tell you I personally did exactly that with my Clevos for years. It wasn't fun, but I travel all over the world and stay for a month or more, both for work and play. Consider the weights: those DTRs ran about 10 lbs, and with cord and power supply I usually hit about 12 to 15 lbs for the absolute must-have equipment to light that thing off. It's not easy to find a bag that won't fall apart when used for more than getting the laptop to the car and that weighs less than 3 lbs by itself. Throw in some paperwork and random shrapnel like your mouse and it's easy to hit 25 lbs even when being conscious of weight. Nowadays I run an Aorus that weighs just 7 and change for the brick, cord, and laptop itself, and the works hit about 12 to 15. Ten pounds doesn't seem like much on the trip to the car, but it does when your step count starts getting up there, and I have to say I'm not missing that 10 lbs.
Of course, I'm in a very small group of people who would carry a Clevo around the world; most road-warrior types are going to stick to light, and yes, thin. To go back to the masses who are spending that $630 on average: by cost alone they did not buy the laptop for gaming, and since they did not, why would they want a fat laptop?
What I'm trying to point out with my own example is that I'm in a small minority; on this thread that puts me in good company, because no one who comes to this forum and clicks on this thread is going to be anywhere close to an average laptop consumer. Most people would not buy a 10 lb gaming Clevo, ever, even if it were 630 dollars instead of 3500. I want that power as much as anyone here, but considered objectively, I don't have any trouble understanding why thin is here to stay. -
electrosoft Perpetualist Matrixist
For #1, they COULD if they found a viable mass consumer market and a reason to, which leads to #2: there is no widespread reason to do it. I agree with you and have argued that companies will make and sell what... well... sells. Years ago, when the market reached a point where numerous DTRs were available AND thin-and-lights were moving, the numbers quickly skewed to the thin-and-lights. Your average laptop user has zero desire to schlep around a 12 lb+ (with PSU) monstrosity when they can get 60-70% of the performance in a package half the size. I look at the 14" Zephyrus models with 4900H CPUs and 2060-level power, and they match or beat what used to be 10 lb+ DTR laptops just five years ago.
Even amongst Clevos, I am amazed at how light this batch of P750DMs I upgraded and repaired for a customer is compared to my P870TM1, and I'm a big dude who works out, lifts heavy, and backpacks. The P750DMs felt incredibly light. Then I'll pick up the family's 17" HP with its tiny power adapter and it feels like nothing. A 15" Clevo barely fits in the 17" laptop bag for the HP. "Moar powah!" is in direct conflict with "thinner and lighter", and the compromise is relatively powerful, THINNER and LIGHTER gaming laptops that, while they may not pack the cooling and punch of full-on DTRs, more than get the job done and are much easier to move around, let alone carry around. I LOVE my P870TM1, and on big trips/outings it is my road dog (like the P870DM-G before it and the Alienware 18 before that), but there is NO WAY I'm carrying that from site to site, place to place, if I can use a 3-5 lb laptop with killer battery life so I don't even have to worry about a power outlet. I understand why the market is what it is.
Basically, this in spades. I'm fully aware of the forums and hardware enthusiasts I co-mingle with and seek out. I am also fully aware that I am in a very small minority with regard to true market forces and what most consumers want. The bean counters are paid the big bucks to figure out what to sell that average consumers will buy. -
I'm already at 26.3k in a thin-and-light, and this one is most likely a Max-P version, so that's not impressive at all. Very disappointing, actually.
Heck, overclocked GE75s with 2080s are hitting the 31k graphics mark. -
Yeah, I had no idea you could hit 26,000 with a GS75. Wow.
But yeah, this is depressing me, as it's my turn to upgrade. -
Just go for a 3070 laptop. Edit: assuming Nvidia doesn't cuck that one too.
The point of contention here is that the mobile "3080" isn't, and shouldn't be considered, a 3080. -
The technology in the GPU market is evolving so fast that I think it's better to pay for a mid-range card, because a year later the top card on the market gets crushed by the less expensive one.
For example: RTX 2080 Ti vs. RTX 3060 Ti, price vs. performance. -
I am gaming at 1080p, maybe 4K on a TV once in a while. Do you think the 3070 will be enough?
-
Depends on what settings and what FPS you're targeting. I'm doing fine with my GS75 at 1800p + ReShade CAS.
I'd look at desktop 3060 Ti performance for reference.
No idea about future titles though. -
electrosoft Perpetualist Matrixist
Mobile 3080 confirmed specs:
https://videocardz.com/newz/nvidia-...irmed-with-6144-cuda-cores-and-16gb-of-memory
-
yrekabakery Notebook Virtuoso
OpenCL != gaming performance. Compute benchmarks use very little power, hence the 3080MQ is able to boost high and beat out the desktop 3070, since it is the full GA104 chip (6144 CUDA cores vs. 5888). In actual 3D games and graphics benchmarks, the 3080MQ will be heavily power limited, hence much lower clocks and performance (toy model below). This is again illustrated by the 2080MQ being similar to the desktop 2060 in gaming, while in that OpenCL test it even beats the desktop 2070.
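A toy model of that behaviour, assuming dynamic power scales roughly with utilisation times clock cubed (a common rough rule once voltage scaling is folded in); every number here is a placeholder, not a spec:
[CODE]
# Toy model: light compute loads fit under the power cap and boost fully,
# heavy game loads force the clock down. All figures are assumptions.
def sustained_clock(max_boost_mhz, power_limit_w, power_at_max_boost_w, utilisation):
    """Clock the GPU can hold under a power cap, with power ~ utilisation * clock^3."""
    demand_w = power_at_max_boost_w * utilisation
    if demand_w <= power_limit_w:
        return max_boost_mhz
    return max_boost_mhz * (power_limit_w / demand_w) ** (1 / 3)

# Hypothetical 150W part that would draw ~220W fully loaded at max boost
print(sustained_clock(1900, 150, 220, 0.5))  # light OpenCL-style load -> 1900 MHz
print(sustained_clock(1900, 150, 220, 1.0))  # heavy game load -> ~1672 MHz
[/CODE]
-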
Lol, after like three people told me it wasn't a 2080 Ti, now it is? This is great news, and that's the Max-Q version...
Also, it's looking like a 30 percent bump for the full-fat 3080... which is what I expected... I really hope they make it in MXM form, and then I will buy a DTR (MXM form factor); if not, I guess it's the GS67 or GS68 by MSI. Super excited. -
yrekabakery Notebook Virtuoso
As mentioned earlier, there simply isn't enough PCB space on MXM for a >256-bit bus and the associated memory packages and traces. If there ever is a GA102-based mobile 3080/3090/A6000, you can bet your socks it'll be BGA.
-
electrosoft Perpetualist Matrixist
I'm not disagreeing with you. I just posted the specs.
-
This is Zephyrus cooling with the low-power card; I want to see what more serious machines can do with the higher-powered variant. If that gets close to 3070 performance, that's a big jump over my desktop 1080 and gets me into ray tracing. I'm in the market, but I've got to see a real-world difference that tells me $3k was well spent.
-
It's not. As @yrekabakery mentioned, OpenCL cannot be used to estimate gaming performance.
-
Are there any games that use OpenCL? Lol, most of the time benchmark leaks are useless. Mark my words: power = performance. There is absolutely no chance in hell that the 90W 3080MQ will outperform the 220W desktop 3070 in games that use the full power of the graphics card.
-
Nvidia mobile RTX 3080 Max-P versus Max-Q: specs and estimated performance
http://forum.notebookreview.com/thr...scale-on-laptops.834039/page-27#post-11068902 -
In line with what I've been saying all along: closer to an RTX 2080 Super than a 2080 Ti.
-
The least crippled SKU in the 3080 Max-P lineup is expected to max out at 200W. The results above, if correct, seem to be from heavily crippled mobile cards that have been capped hard.
RTX 3080 Max-P
Edit.
It seems that RTX 3080 Mobile will be offered with either 16GB or 8GB memory capacity, which will segment this model even further. In reality, we might see as many as 10 models called RTX 3080 Mobile/Laptop, but each carrying a different clock speed, power limit, or memory capacity.
The beauty. Ten different 3080 Mobile SKUs looks damn disgusting. It looks like we still haven't seen the bottom of the barrel from Nvidia/notebook manufacturers. The downward spiral still hasn't reached the bottom.
https://videocardz.com/newz/nvidia-...ns-emerge-6144-cuda-cores-clocked-at-1245-mhz -
That seems about right to me. The 115W power limit and the even slower 12 Gbps GDDR6 (instead of 14 Gbps) are even worse than I was thinking (quick bandwidth math below).
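Quick bandwidth math for that slower memory, assuming the usual 256-bit bus on GA104:
[CODE]
# Peak memory bandwidth = bus width (bits) x data rate (Gbps) / 8.
# The 256-bit bus is an assumption; 12 vs 14 Gbps are the figures discussed above.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(256, 14))  # 448.0 GB/s at 14 Gbps
print(bandwidth_gbs(256, 12))  # 384.0 GB/s at 12 Gbps, roughly 14% less
[/CODE]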
-
Welp, this generation's mobile GPUs are an epic fail. The 3070 will be the closest to its desktop counterpart, provided it gets a 200-watt power limit in one of its various Max-P configurations.
It appears the gap between the mobile and desktop 3060 may be even worse than the gap between the mobile and desktop 3070, if my statement above pans out. -
So with OpenCL we have hope, but OpenCL is not accurate.
With charts that don't show numbers, we are out of luck.
With the power draw and TDP figures, we know the mobile 3080 will be below the desktop 3070.
What the heck, are we going to see 2080 Ti performance or not? Why is Nvidia ripping off the people who make them rich? Almost every company on the planet goes by a "customer is always right" policy, and Nvidia seems to be doing the opposite: you will buy and you will enjoy it, or nothing; that's it, you must accept it, and they leave us no option. I know AMD isn't the best, but they never did this, and I'm thinking of taking action by buying a console (supporting AMD) and then getting an AMD-only laptop. -
It's certainly disappointing and looks to stay that way barring a big surprise, but we've waited this long, so I figure I might as well see actual numbers at this point before making a decision. Looking like a "survive until the next thing" dynamic to me...
-
NVIDIA Announces GeForce RTX 30 Series for Laptops: Mobile Ampere Available Jan. 26th
http://forum.notebookreview.com/thr...scale-on-laptops.834039/page-36#post-11070515 -
electrosoft Perpetualist Matrixist
Nice updated 4K benchmark test of the 3070, 6800, 6800 XT, 3080, 6900 XT, and 3090.
In 4K DLSS Cyberpunk, even the 3070 beats the 6900 XT:
-
Yeah, but DLSS...
The 6900 XT is the fastest card available if you exclude DLSS.
I'm getting the Alienware m15 R4 with the RTX 3080 (125-140W TDP), so I'm pretty excited. This is an upgrade from my existing laptop with an RTX 2080 Max-Q.
-
At first I thought you were spamming the forums, but it's obvious you just had a lot of input to give. I can't rep you again until I rep three other people, but props to you for very helpful posts. We just might be able to keep NBR alive if we post more interesting content and attract people... NBR is fairly easy to find on Google, and I wish it the best, as it has countless threads that are informative and helpful to the masses.
To @rockelino:
Also, @Papusan, do you know if they are releasing a 200W 3080 for laptops, or will it just be in the form of a 3080 Super Max-Q max-performance ultra-low epic-slow cool version... lol