Darn disgusting. The OEMs play God with the gamers/lapJockeys/customers. And Nvidia happily offers the needed support (guides and tools to cripple its own products). @Mr. Fox @Ashtrix @jc_denton +++
100 W GeForce RTX 3080 vs. 130 W GeForce RTX 3070: Which is the better choice? notebookcheck.net
-
I wonder if the 3080 Super Laptop GPU ends up being the actual GA102 die.
-
True, if I was going to dump that much into an xx70 or xx80 series laptop I'd definitely make sure it had a MUX switch. I think at one time someone tried to keep a list of MUX laptops here somewhere. But sadly there are a lot of people who have no idea about all this and buy 70/80 laptops today; us regulars here are a niche lol
Edit: well the net is a great thing lol. Jarrod's has a small list; looks like of the major brands Lenovo is good for that, outside of Tongfangs. Go Tongfang imo for an almost guaranteed BIOS-based MUX.
https://jarrods.tech/list-of-gaming-laptops-with-mux-switch/
-
Disgusting as always, but when you have a market monopoly you can pretty much do whatever you want.
-
Meaker@Sager Company Representative
It's amazing that with the same chip you can get that much better power efficiency. If your laptop will only handle a 100W TDP, then you can get the same performance as a 130W card from a 100W one.
-
What do you mean? Example... I'm sure I'm not the only one wondering what you mean.
Ok, quick analysis... you mean a desktop 130W into a laptop 100W... if that's it, yes, it's amazing.
Also, I'm not sure which NBR member is doing this, can't remember, but he's holding off till the 4000 series and that is dang smart... I heard it is on a new process and will be twice as power efficient. This could mean small gains if the desktop TDP is like 400W lol, but it's likely we will eventually see another Pascal-esque card/cards. I really hope the 4080 desktop is the same chip as the 4080 mobile, so to speak.
-
Meaker@Sager Company Representative
So the 3080 and 3070 share the same core; the 3070 just has some shaders disabled. The 3080 Max-Q here has given performance pretty close to the 3070 while consuming 30W less power. That's a big efficiency saving of 25% ish. There must be some binning going on there.
If you only have 100W of GPU cooling in the chassis, that is a nice gain.
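To put a quick number on that claim, here is a minimal fps-per-watt sketch. The frame rate is a made-up stand-in (only the ratio matters, since the article found the two cards roughly tied); the wattages are the configured TGPs from the review.

```python
# Rough fps-per-watt comparison for the NotebookCheck matchup. The fps value
# is hypothetical and assumes both cards land at about the same frame rate.

def fps_per_watt(fps: float, watts: float) -> float:
    return fps / watts

FPS = 100.0                          # stand-in frame rate for both cards
e_3080 = fps_per_watt(FPS, 100.0)    # RTX 3080 Max-Q at 100 W
e_3070 = fps_per_watt(FPS, 130.0)    # RTX 3070 at 130 W

print(f"3080 @ 100 W: {e_3080:.2f} fps/W")
print(f"3070 @ 130 W: {e_3070:.2f} fps/W")
print(f"3080 advantage: {(e_3080 / e_3070 - 1):.0%}")   # ~30%
```
-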
Yeah, but something doesn't make sense. Under theoretically ideal conditions the 130W 3070 should outperform the 100W 3080; there shouldn't be any efficiency difference between an SM from a 3090/3080/3070, etc. All 30-series SMs should have the same efficiency (i.e. frames per second per watt). What I think is going on here is that the 3070 laptop has a worse cooling system and so needs more power to push those SM clocks at a higher temperature. The other possibility is that the 3080's peak power is closer to 130W, or the 3070's power limit is really less than 130W. To really know what's going on we need actual measurements: power consumption, temperatures, thermal throttling and power throttling.
-
While I'm skeptical that the 130W 3070 isn't at least slightly faster than the 100W 3080, it makes sense that having more CUDA cores will be more efficient than pushing up the clocks by a corresponding percentage.
To make it simpler to understand, let's equalise the power draw to 130W for example. On the 3070, the same wattage is distributed to fewer cores; therefore, the cores will be pushed to run at higher clocks and higher voltages, while the 3080 with more cores will have less power per core and thus run at a lower clock and voltage. In a perfect world the GPU would scale perfectly with power and voltage, but that is not the case. The power requirement goes up exponentially as the voltage and clock speed increase. Therefore, pushing the voltage and clock speeds higher makes GPUs run less efficiently. This is particularly noticeable in extreme overclocking benchmarks, where the GPUs are consuming double the power but are only 20-30% faster. On the flip side, undervolting can make GPUs more efficient because they sit further down the efficiency curve, so you can retain 90% of the performance at 70% of the power.
That was the original idea behind the "Max-Q" branding: to target the point where power-to-performance stops scaling linearly and starts to require exponentially more power. However, Nvidia discarded the idea later in the life cycle of the 10 series, and ever since then the 20 and 30 series Max-Q cards have just been stuck with arbitrary power limits like 60W for the xx60 series and 80W for the xx70 and xx80 series.
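To put rough numbers on that efficiency-curve argument, here is a toy dynamic-power model (P ≈ C·V²·f). Every voltage and clock below is invented purely for illustration, and it assumes the wider chip's extra shaders make up for its lower clock; real V/f curves come from the card's firmware.

```python
# Toy dynamic-power model: P proportional to V^2 * f. Assume the wider chip
# (more shaders) matches the narrower chip's throughput at a lower clock,
# and that the lower clock permits a lower voltage. All values invented.

def dynamic_power(voltage: float, clock_ghz: float, c: float = 100.0) -> float:
    return c * voltage**2 * clock_ghz

# Narrow chip pushed up the V/f curve to keep pace:
p_narrow = dynamic_power(voltage=0.90, clock_ghz=1.7)   # ~138 units
# Wide chip cruising further down the curve:
p_wide = dynamic_power(voltage=0.75, clock_ghz=1.3)     # ~73 units

print(f"narrow & fast: {p_narrow:.0f}")
print(f"wide & slow  : {p_wide:.0f}")
print(f"ratio        : {p_narrow / p_wide:.2f}x")  # ~1.9x for similar work
```
-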
Nice. The 3080 flagship mobile card can't even push 60 fps in (coming) games. Maybe Nvidia will offer a 3080 Super version to push people over on "new is better"? The 3080 moniker is worthless for gaming books. The 3080 was meant as a 4K card, but the mobile version failed hard. @Mr. Fox @Ashtrix @jc_denton ++++
Far Cry 6: PC graphics performance benchmark review - Game performance guru3d.com
-
Yeah, in that article something is holding back the performance of the 3070 laptop. As you said, probably the laptop is running at higher temperatures and that's why the 3070's performance isn't scaling with power. If the temperature is cool enough, the 3070 should scale with power well enough, especially at 130W, which is low power. The 3070 was designed to run at 200W easily.
The mobile 3080 isn't in that chart, right? You mean the mobile 3080 should land in between the desktop 3070 and the desktop 2080 Super, which would be around 55 fps?
-
A mobile 3080 is a castrated desktop 3070 with a few more cores but a capped TDP, so you should expect around the same 58 fps as in the chart (maybe below). This is for the 150W+ version. The 3070 Ti has the same amount of cores as the mobile flagship but a much higher TDP, and it runs in at 61 fps.
-
Meaker@Sager Company Representative
That's why you power mod it; even fewer shunt resistors than the 1080, so happy days
-
Yep, that will work, but 60 fps is still a hard task to reach. Remember the 3070 Ti with GDDR6X and 290W board power (same amount of cores as the 3080 mobile castrate editions) maxes out at 61 fps. Then you have those so-called 3080 mobiles below that. I wouldn't push those at ±165W in game sessions longer than a few minutes. Yep, the 3080 mobile ain't what you think it is
Not at all (4K cards).
-
Meaker@Sager Company Representative
-
61 vs 59?
What I'm thinking is
01100100011101010110110101100010
and
01100011011011110110111001100111011100100111010101100101011011100111010000100000
0110111001101111
-
Being below 60 fps is dumb, bruh
At least if you have a so-called latest-gen 4K card
-
Uhmmm, it definitely depends. The shunt mod will void your warranty, so it's not ideal. Also, some mobile 3080s have 5 VRM phases of 30A each; in that case you can't do more than 150W. If you go ahead and shunt mod over 150W, it will most likely trigger over-current protection, or even worse, a VRM will die.
-
The real message was it's simple and congruent to dumb... just bored
-
To my knowledge, there is a group of VRM phases for the core. The core usually runs around 1 volt. There is another group of phases for memory that runs a little over 1V, and another group for the MSVDD rail that also runs around 1V. So knowing that Power = Voltage × Current, each phase will deliver around 1V × 30A = 30W. With 5 phases, that all adds up to around 150W, which is about the TGP for the mobile 3080. I think sean said that some BGA laptops run 50A stages, which would definitely allow room for a shunt mod. But again, you have to check which power stages are used in your particular mobile 3080 to be sure whether the shunt mod is worth doing.
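For anyone who wants to plug in their own board's numbers, here is that P = V × I budget as a tiny sketch. The 1V rail and the 30A/50A stage ratings are the example values from this post, not measurements from any specific laptop.

```python
# Per-rail VRM power budget from the post's P = V x I reasoning.

def rail_budget(volts: float, amps_per_stage: float, stages: int) -> float:
    """Theoretical max power a VRM rail can deliver."""
    return volts * amps_per_stage * stages

print(rail_budget(volts=1.0, amps_per_stage=30, stages=5))   # 150.0 W
print(rail_budget(volts=1.0, amps_per_stage=50, stages=5))   # 250.0 W
# 30 A stages leave no shunt-mod headroom at a 150 W TGP;
# 50 A stages are why some boards tolerate it.
```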
-
Oh ok, now I see... good point. You seem knowledgeable on shunts also.
Intelligent and well informed
-
Not if you solder well enough. I successfully RMA'd my GS75 a month back, no questions asked. That is why soldering is still my most recommended way of shunt modding: either carefully bridging the existing shunts, or replacing the existing R005 shunts with R004 or R003 shunts depending on how much power you want.
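The arithmetic behind the R005 → R004/R003 swap, as a sketch: the card computes current from the voltage drop across what it assumes is a 5 mΩ shunt, so a smaller resistor makes it under-read and draw proportionally more real power. The percentages below follow from Ohm's law, not from any measured card.

```python
# Shunt-swap math: the controller reads I = V_drop / R_assumed with
# R_assumed fixed at 5 mOhm (R005), so real power scales by R_old / R_new.

ASSUMED_SHUNT = 0.005   # ohms, what the firmware believes is fitted

def real_power_at_limit(reported_limit_w: float, fitted_shunt: float) -> float:
    return reported_limit_w * ASSUMED_SHUNT / fitted_shunt

for name, ohms in (("R005 (stock)", 0.005), ("R004", 0.004), ("R003", 0.003)):
    w = real_power_at_limit(150, ohms)
    print(f"{name}: 150 W reported -> {w:.0f} W actual")
# R004 gives +25%, R003 gives +67% over the reported limit.
```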
Depends on your card as well. A 3060 may run at 1V, so 150W means 150A, but a 3080 will only run at 0.75V, so 150W means 200A, which will overload the VRMs.
Yes, but actually no. The current rating for VRM MOSFETs represents the peak current output, not the sustained output. And even for the highest-end 80A smart power stages, heat is a big issue past 50A on desktop cards with active VRM cooling, let alone on passively cooled laptop VRMs.
Here is an example graph of power lost as heat vs. current output from the NCP303150 datasheet.
At 50A you're looking at more than 11W of heat output from each MOSFET, which is a crazy amount for something so small.
I'm sure some of you have heard the Alienware A51m horror stories about VRMs burning up.
And that is with the MOSFETs drawing only about 32-33A per stage.
As a rule of thumb, don't exceed 4W of heat output per power stage in laptops with active VRM cooling, and 3W per power stage in laptops with passive VRM cooling.
Heavier games can and will consume more power at a given voltage. So a heavy game may push your VRMs over the limit and trip over-current protection even when you think it's safe, and in the worst-case scenario, have a power spike and burn out the VRMs outright. Always keep a 10-15% safety margin at the very minimum.
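Those rules of thumb fold into a quick sanity check before modding. A minimal sketch; the 0.75V, 5-phase, 50A figures are just this thread's example values, so substitute your own board's numbers.

```python
# Will a target board power overload the core VRM? Uses the thread's
# rule of thumb: keep a 10-15% margin under the per-stage current rating.

def amps_per_stage(board_power_w: float, core_volts: float, stages: int) -> float:
    return board_power_w / core_volts / stages

def within_margin(amps: float, rated_amps: float, margin: float = 0.15) -> bool:
    return amps <= rated_amps * (1.0 - margin)

a = amps_per_stage(board_power_w=200, core_volts=0.75, stages=5)
print(f"{a:.1f} A per stage")                      # 53.3 A
print("safe:", within_margin(a, rated_amps=50))    # False -- back off
```
-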
None of the options are good ones when you stop and think about it...
- Pay too much for something that has been castrated "because it's a laptop"
- Put up with its limitations "because it's a laptop" so you can have a warranty
- Reduce some of its limitations and void the warranty on the castrated product
- Resign yourself to the fact that it will be somewhat less impotent, but still not quite the product that the label it carries suggests that it should be
Maybe the primary value in having it comes from impressing your friends on Facebook. -
Alternatively, put some elbow grease into it and make a good product the best product for yourself. I for one need the portability a laptop offers, and if I don't think a laptop's performance can live up to my expectations even after modding, I'll just wait for a future release.
-
I know, right? I've been waiting since 2015. I haven't spotted a suitable one yet. Thankfully, portability is no longer a need for me. It used to be. If it were, I would be very disappointed with the options available. Or, perhaps it would be more accurate to say the lack thereof.
-
Here's bro @seanwee a few decades forward from now
-
Hopefully not lol. If the 4080 laptops do not at least double the current GPU performance of my machine, even after a shunt mod, I may just switch to game streaming from an SFF 4080 desktop. I've been streaming games to my iPad to make use of its glorious mini-LED display, and it has been very usable.
You can set it to immediately start streaming to your laptop/ultrabook/tablet, and you basically have a wireless e-desktop.
-
Bro Paps... have you tried GeForce Now? If you don't mind 1080p/120, it's really good nowadays
-
I'm not a gamer. No time for that.
It's many years since I stopped with games. I'd rather enjoy benching.
https://hwbot.org/user/papusan/
-
Yeah, I bench a lot. Funny thing is I have a 1650... broke 3300 in Time Spy. But yeah, I'm a bencher also... wish I was young again!! You?
Edit: can you post a bench of your beast 980? -
-
Meaker@Sager Company Representative
Warranty? I don't think any laptop I've owned for more than a week has technically kept its warranty, given what I have done lol.
-
What are you using to stream to your laptop/ultrabook/tablet? I tried Steam Link and Moonlight (which uses Nvidia's GameStream/Shield protocol) and neither of them is good enough for me. Steam Link is "better", but the image at its highest quality is too compressed for my taste, the VSYNC is awfully bad, and it takes a good 10% of GPU usage from the host machine. It's also limited to 120 fps.
GeForce Now is Nvidia's platform where the game runs on Nvidia's servers, right? Can it actually do 1080p/120fps?
You have a point there. Personally, I don't care much about warranty either. I only care whether the manufacturer will send the replacement part to me so I can make the repair myself.
-
It seems Nvidia updated and improved the user experience a long time ago.
1080p 120fps streaming | NVIDIA GeForce Forums -
The support in that thread looks overwhelming, lol
-
Falkentyne Notebook Prophet
Too bad I didn't take a picture of my 3090 FE pushing 600W and running at 83C on the core and 94.9C on the VRMs...
Still works fine.
-
Not everyone is equally lucky
New World GPU Victims Revived! - Overloaded Vcore Power Stages on Nvidia Ampere identified by Youtuber igorslab.de | October 8, 2021
The YouTuber Buildzoid, on his channel Actually Hardcore Overclocking, investigated a defective graphics card from a Twitter follower which allegedly fell victim to the new Amazon MMO New World. After a few hours with the graphics card…
-
Moonlight is by far the better choice. I use HEVC (HDR streaming works very well), unlimited stream bandwidth, and 120Hz, with a LAN connection from the host to the router and a WiFi connection to the tablet. When standing still I do notice the chroma subsampling and poorer gradation, but HDR alone is already far better than my laptop's display. Input lag is also a very zippy 30ms
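For a sense of why that setup holds up, here is the raw-vs-encoded bandwidth arithmetic for a 1080p/120 HEVC stream. The 50 Mbps bitrate is an assumed ballpark for illustration, not a Moonlight default.

```python
# Raw vs. encoded bandwidth for a 1080p/120 stream with 4:2:0 chroma
# subsampling (the subsampling is also why still scenes look softer).

WIDTH, HEIGHT, FPS = 1920, 1080, 120
BYTES_PER_PIXEL = 1.5                 # 8-bit 4:2:0

raw_mbps = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * 8 / 1e6
encoded_mbps = 50                     # assumed HEVC bitrate

print(f"raw video   : {raw_mbps:,.0f} Mbps")   # ~2,986 Mbps
print(f"HEVC stream : {encoded_mbps} Mbps")    # ~60:1 compression
# ~50 Mbps fits easily within gigabit LAN and good 5 GHz WiFi, hence the
# wired-host, wireless-tablet arrangement feeling so responsive.
```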
-
Buildzoid is so underrated
-
This means the laptop gamer jockey will be screwed over even further by Nvidia + the laptop OEMs, and the performance/power gap between laptop and desktop graphics will widen further with the next gen of graphics cards. The vanilla 3080 desktop card's TDP starts from 320W, while the mobile variant of the 3080 is capped castrated at 85W, sailing up to 165W maximum power only with the dreaded Dynamic Boost 2.0 feature. Welcome to the future. Yep, it looks grim for gaming laptops.
PCIe 5.0 High-Power Connector For Next-Gen Graphics Cards To Deliver Up To 600W To GPUs
I haven't been wrong regarding the future for laptops before, soo...
-
After having had G-Sync for 3 years now, I'm so spoiled that I can't tolerate tearing or VSYNC microstuttering. I remember before having G-Sync I was happy with Adaptive VSync; now on my desktop setup I use VSync only. If my system can't hold a solid 144 fps, I change the refresh rate of the monitor to 120Hz; if it can't hold a solid 120 fps, I lower it again to 100Hz, and so on. With Moonlight I got massive VSYNC microstuttering: while the host machine was running the game at a solid 120 fps, I was getting lots of microstuttering, on a fully wired gigabit network. You never experienced something like that?
I hope this gets widely adopted on new video cards. Having three 8-pin connectors is so awkward, and it can only do 450W. One cable at 600W looks like a winner. The FE cards already have a similar connector, but it wasn't widely adopted for some reason.
I don't play MMOs, but no way in hell would I even install this on my Ampere card system until there's some definitive fix.
-
Interesting.
As for mobile, 2x ain't bad... I knew this going forward.
Do I trust him? No... but it might be true; these leakers are rarely wrong.
-
I have heard something like this before. Nvidia's blog touted their 3080 (yep, Nvidia branded it their flagship) as being up to two times as fast as the RTX 2080. You saw what the end result was for the desktop cards. And mobile graphics cards will get lower and lower graphics TDPs (or maybe a Dynamic Boost 3.0 with even more awful power-sharing tech between CPU and GPU: 100W+65W) vs. the desktop versions, because we want thinner and thinner laptops jokebooks.
-
Yeah, I was being sarcastic. Never ceases to amaze me, all the rumors.
-
Nope. I suspect it may have something to do with the device on the receiving end in your case.
-
The laptop in my signature, which is a W10-based machine. It should work fine, but it is what it is, I guess