Dude, it's amazing what advanced filtering can do. It's also extraordinarily easy to construct contrived examples for marketing purposes.
https://www.nvidia.com/en-us/geforce/forums/geforce-graphics-cards/5/301925/dlss-bad-image-quality/
-
yrekabakery Notebook Virtuoso
-
"
Does DLSS affect image quality?
DLSS 2.0 does somewhat affect image quality in Death Stranding, as an upscaled image from 1080p or even 1440p won't be nearly as sharp as native 4K. (...)
DLSS also struggles to properly upscale small particle effects from lower resolutions, resulting in often blurry, not-so-sharp particle effects. In Death Stranding this is most notable when it rains; in our experience Quality mode was understandably a lot better, and the effect didn't detract from our enjoyment of the game. If the strange artifacts and blurring bother you, it might be best to play at native 1440p instead of 4K DLSS, but the bottom line is that this is down to personal preference.
"
IMHO it's clear. DLSS trades off image quality for performance. The lower VRAM utilisation can only be achieved by lowering texture quality; that's CS 101. Devs can do that because the average gamer can't tell the difference. It's also possible that they have to lower texture quality, either because of the amount of VRAM required by DLSS itself, or because it's easier for DLSS to upscale objects with blurry textures, given how it apparently struggles with the fine detail of particle effects.
Details like that would explain why access to DLSS documentation likely requires an NDA.
-
I think I posted that a while back after watching a comparison video... DLSS 2.0 does for sure look better than native.
-
I don't normally argue with you, but Digital Foundry did an 800x zoom on quality, and clearly 4K DLSS 2.0 looked crisper and had comparable image quality; native 4K had more definition but looked duller... overall, rep to DLSS 2.0.
This is 540p vs 1080p.
Imagine 4K... DLSS 2.0 better by a long shot.
-
Yeah, to my eyes 1080p upscaled with DLSS 2.0 looks sharper. One can argue you could sharpen your 4K image, in which case looking better might be in 4K's favor, but performance? Who cares when they look so close; I'll take the extra 15-30 fps.
-
To add more info.. The only game I currently have installed that supports DLSS is Wolfenstein: Youngblood.
I used my panel's 2560x1600 resolution to check the game's DLSS VRAM usage across various settings: disabled, DLSS Performance, DLSS Balanced and DLSS Quality.
For some reason the RTSS OSD disappears when I turn DLSS off, which is wacky; apparently some bug in RTSS. But between the Performance, Balanced and Quality DLSS settings, VRAM usage increased by a little less than 100 megabytes per step going from Performance to Balanced to Quality on ~high settings.
I then realized that the game has its own built-in metrics, so I cranked the quality up to UBER MENCH and took 4 readings there.
So that's two games now where using DLSS reduces VRAM usage, and using lower quality DLSS settings reduces it more than higher quality DLSS settings. This all makes perfect sense given how it works. I imagine these memory usage numbers would swing even more at 4K.
-
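(For context on the VRAM observations above, here is a back-of-the-envelope sketch of the mechanism: with DLSS the game renders most of its per-frame buffers at the internal resolution, which shrinks as you step down from Quality to Performance. The per-axis scale factors are Nvidia's published DLSS 2.x defaults, and the 2560x1600 output matches the panel used in the test above; the 64 bytes of render-target data per pixel is an assumed ballpark for a modern deferred renderer, not a measured figure for Wolfenstein: Youngblood.)

```python
# Rough, illustrative estimate of how DLSS internal resolution affects
# render-target memory. The per-axis render scale factors are Nvidia's
# published DLSS 2.x defaults; BYTES_PER_PIXEL is an assumed ballpark for
# a modern deferred renderer, NOT a measured value for this game.
OUTPUT_W, OUTPUT_H = 2560, 1600   # panel resolution used in the test above
BYTES_PER_PIXEL = 64              # assumption, purely illustrative

DLSS_SCALE = {
    "Native (DLSS off)": 1.0,
    "DLSS Quality":      2 / 3,
    "DLSS Balanced":     0.58,
    "DLSS Performance":  0.5,
}

for mode, scale in DLSS_SCALE.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    mb = w * h * BYTES_PER_PIXEL / 1024 ** 2
    print(f"{mode:<20} renders at {w}x{h} -> ~{mb:.0f} MB of per-frame render targets")
```

(The absolute numbers are not meaningful, but the direction matches the observation: each DLSS step down shrinks the internal render targets while the output buffers and the textures stay the same size, so lower DLSS modes free up somewhat more VRAM.)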
Asus TUF a15 2021 - RTX3070 90W and 5800h
Time Spy - https://www.3dmark.com/spy/17765045
Fire Strike - https://www.3dmark.com/fs/24749182 -
-
DLSS does reduce frame buffer memory requirements if multi-sample antialiasing is enabled, but that wouldn't explain a 400MB drop. Something else must have changed, and clearly, at the same time DLSS has memory requirements of its own. The fact that Nvidia doesn't share many details regarding the proprietary implementation is suspect.
DLSS 2.0 also attempts to sharpen the image to make it look "crisper", which has its pros and cons. Not even Nvidia would claim that the resulting image quality is better than rendering at native resolution, but it's a really cool smart-upscaling trick and works well in practice for gaming purposes. Surely 4K DLSS looks better than FHD, and for most people the question of whether 4K native or DLSS looks better on Ultra settings is moot, because only DLSS is usable.
Worth noting that Nvidia bet heavily on this; for example, DLSS uses those Tensor cores many criticise, so no doubt there is a good number of sponsored reviews out there.
-
That CPU score is strange... in Fire Strike it is better than a stock 9900K, however in Time Spy it is worse... maybe 3DMark is not patched yet and Time Spy does not utilize 100% of the new AMD CPU?
-
Big improvement at 4K gaming. No improvement at 1080p. Interesting.
-
Alienware M15/M17 R4 will have the lower TGP config of the 3080.
-
What about Area 51M?
-
yrekabakery Notebook Virtuoso
-
There is zero chance the M17 R4 has anything less than a 115W 3080. The Time Spy overall score in the TechRadar review is 12100, which I'd assume can only be reached by a high-TGP 3080 GPU.
-
-----------------------------------------------------------------------
Performance Test: GeForce RTX 3070 Laptop & RTX 3080 Laptop
TDP chaos. It was already fairly complicated to assess a gaming notebook's performance due to the various factors involved, such as the cooling system. With their current Ampere generation of GPUs, Nvidia upped the ante and made the hunt for the ideal laptop even harder.
Due to the extremely high variances in TDP, performance between two seemingly identical GPUs can differ drastically. Unfortunately, those differences are often concealed at the time of purchase unless sellers or OEMs fully disclose the GPU's configured TDP. Given that in the past some OEMs and sellers chose to avoid disclosing the use of Nvidia's Max-Q variant instead of the regular chip, you will forgive us if we are rather skeptical.
As you can see from Dell... not a single word telling how castrated the 3080 mobile card is in the AW m15 R4, neither on the sales page nor in the spec sheet.
-
-
How Dell cripples performance, explained by...
And I'm sure they will take it to the utter end with the Ampere models. Why provide proper power adapters when you have tech like the below? A 240W adapter is so cute, and it is cheaper. Big win... for Dell. Not for the customers.
Dynamic Boost 2.0
Among the new features Nvidia brings to its 30-series is an improved Dynamic Boost 2.0. Dynamic Boost works by shunting power from the CPU to the GPU when needed, giving the GPU 15 additional watts of power.
Nvidia invokes the AI catchphrase and says Dynamic Boost 2.0 can increase performance by 16 percent. Dynamic Boost previously only allowed power to be shifted from the CPU to the GPU, but now it can actually flow back to the CPU from the GPU if the game is CPU-bound rather than GPU-bound. Nvidia said it can also now balance power with the graphics memory.
On laptops that support it, Dynamic Boost 2.0 will be on all the time, without the option to turn it off, as in the previous version. On older laptops that support Dynamic Boost, you will continue to be able to turn the feature off.
While Dynamic Boost 2.0 can pretty much be supported by any RTX laptop, it’s up to the laptop maker to enable it. We didn’t get deep into this feature here, but we plan to soon.
https://www.pcworld.com/article/360...view-the-fastest-laptop-gpu-just-arrived.html
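(To make the quoted description a bit more concrete, here is a toy sketch of the power-shifting idea. This is emphatically not Nvidia's actual algorithm, which is proprietary and lives in the driver; the 45W/115W baselines and the 15W shared budget are illustrative numbers of the kind discussed in this thread.)

```python
# Toy illustration of the Dynamic Boost 2.0 idea described above: a shared
# slice of the power budget moves each frame toward whichever processor is
# the bottleneck. NOT Nvidia's implementation; all numbers are invented.
CPU_BASE_W = 45     # assumed CPU power limit
GPU_BASE_W = 115    # assumed GPU baseline TGP
SHARED_W   = 15     # boost headroom that can move between them

def allocate(cpu_util: float, gpu_util: float) -> tuple[int, int]:
    """Return (cpu_watts, gpu_watts), handing the shared budget to the busier chip."""
    if gpu_util >= cpu_util:                     # GPU-bound frame
        return CPU_BASE_W, GPU_BASE_W + SHARED_W
    return CPU_BASE_W + SHARED_W, GPU_BASE_W     # CPU-bound frame

print(allocate(cpu_util=0.55, gpu_util=0.99))    # GPU-bound scene -> (45, 130)
print(allocate(cpu_util=0.97, gpu_util=0.60))    # CPU-bound scene -> (60, 115)
```

(Under a scheme like this, a "130W" spec is ambiguous: it could be a fixed 130W GPU limit, or a 115W baseline that only hits 130W when Dynamic Boost pulls power away from the CPU, which is exactly the question raised a few posts further down.)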
Mobile GPUs are now all "laptop GPUs"
Nvidia is now calling all of the new mobile graphics cards "laptop GPUs". That sounds like a minor aside, but it isn't; quite the contrary.
Once in 2015 and back
As a reminder: until the end of 2015, Nvidia marked mobile GPU variants as "M" versions. This meant, even if it was not clear to all customers, that a GeForce GTX 980M did not come close to a GeForce GTX 980 for desktop PCs.
In September 2015, Nvidia then presented for the first time a notebook GPU, the GeForce GTX 980 without the "M" suffix, that virtually matched the performance of the desktop version. It was based on the same GPU and used the same memory; only the power consumption was slightly reduced. This was possible because Maxwell could already be run very efficiently and economically on the desktop, without direct pressure from AMD.
With the mobile GeForce GTX 1000 series, Nvidia extended this change in nomenclature across the whole lineup: because all mobile GPUs were supposed to be at most 10 percent slower than their desktop counterparts, the mobile GPUs were named like the desktop ones. A new addition were the particularly efficient Max-Q variants, throttled further in power consumption and supposed to be at most another 10 percent slower. This promise also held up in reality.
The scheme was retained in the subsequent RTX 2000 generation, although it was already falling apart because the gap between desktop and notebook GPUs widened.
Now Nvidia is effectively returning to the "M" approach of the generations before the GeForce GTX 980, with the suffix "Laptop GPU" used throughout the series. Why? A look at the key technical data of the new mobile variants makes it clear.
Why should Nvidia go forward as with Maxwell and Pascal mobile? There's no profit in graphics that give you good enough performance for more than a year before you need to upgrade your notebook. They have learned their lesson. Giving it desktop performance would only provide a longer lifespan before you need to buy a new laptop again. Less profit in that, both for Nvidia and its partners (OEMs).
-
The obvious question is how would people know whether this is enabled or not? Does 130W mean 130W, or 115W + 15W? What does 125W mean?
-
yrekabakery Notebook Virtuoso
Flow X13 disassembled, showing Asus' LM protection solution: a foam dam around the package and thermal paste/goop on the capacitors around the APU die. Looks like that goop saved the ones on the left of the chip from being shorted out. Kinda nasty though. Most people who use LM on GPUs go for nail polish/conformal coating and/or Kapton/electrical tape instead.
-
Pay more, get more, I guess, but it's still pretty disingenuous to label ALL of the models with the same numbers. I think a lot of people will just assume it's all the same if the labeling is the same and get ripped off, par for the course. On another note, the NBC/videocardz Fire Strike scores are not that impressive for a 3070 Max-Q: only 30% more than my 2060 ~95W refresh. What does that say for the 3060 laptop GPU benefit? Not too promising. So a 3060 Max-P would, I'm guessing, equal a 3070 Max-Q for a benefit of 30% over the 2060 95W Max-???, and a 3060 Max-Q less, I'd guess a 15% improvement over the 95W 2060, hmm. Then again I probably got ripped off over the 1660 Ti mobile GPU when it was new as well, but I was going from a 980M to a 2060 so it's a huge upgrade for me. And for what I paid on a Black Friday deal, 1660 Ti laptops are as much or more. May as well get the extra 5% and ray tracing/DLSS capability.
-
I suppose the Dynamic Boost is impressive and allows more power to go to the GPU when the CPU is not needed, which gets it somewhat closer to desktop 3080 clock speeds. Linus shows a big performance gap at 4K, with the 3080 dominating the 2080 Super, but in that benchmark the GPU isn't holding a constant 60fps at 4K. 4K is unnecessary on a 17-inch laptop anyway. I'm still in the all-I-need-is-60fps crowd, though, but would have been willing to upgrade for 4K 60fps performance. The issue with that is there is no way you will have good thermals and good acoustics when running 4K at 60fps with this generation of GPUs unless the laptop is quite big and hefty.
-
Linus showed what he was contracted to show; let's wait for more independent reviews. -
Similar benefit between the 60/70 tiers when compared with the 20 series in UL's database; the scores indicate the 2070 Max-Q is about a 12.2% benefit over the 2060 refresh (which I'm assuming is the 115W version, and the old one the 80W version?). I'm assuming a 3060 Max-P would be 15% lower than a 3070 Max-Q, but it seems like that doesn't work; where would the benefit of a 3060 Max-Q over the 2060 refresh come from at all? Another case of 1660 Ti/2060, like 5%-10%? Of course at average new pricing they're close in price if you go for the low-end $999 3060 laptop vs the still-available new 2060 laptops.
According to UL, the 2060 refresh is ~60% benefit over a 1060 mobile; that's in line with the 2060 refresh/Max-P and 3060 Max-P, but imagine getting a 3060 Max-Q, almost no benefit!! I guess maybe 5-10%, AND access to better DLSS/ray tracing and the other new tech? I get 6120, and the mean (I'm assuming) UL score for the 2060 refresh is 6183. I was certainly ripped off with the 2060 from a labeling POV between desktop and laptop: 22% difference, same name, ugh.
Good luck to UL etc. trying to distinguish on an official level between low- and high-powered versions; of course the scores would indicate it, but how do you put that in the graph? Just one big mean score for all of them, confusion. My head hurts now.
https://benchmarks.ul.com/compare/b...RE&reverseOrder=true&types=MOBILE&minRating=0
-
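(For what it's worth, the percentages being eyeballed above all come from the same trivial calculation. Only the 6120 and 6183 graphics scores are actually quoted in the post; any other figure you plug in is an assumption, not an official UL number.)

```python
# Percent benefit of one graphics score over another, as used informally above.
# Only the 6120 / 6183 pair is quoted in the post; other figures in the thread
# are rough assumptions, not official UL numbers.
def benefit(new_score: float, old_score: float) -> float:
    """Percent improvement of new_score over old_score."""
    return (new_score / old_score - 1.0) * 100.0

print(f"2060 refresh mean vs. my 2060: {benefit(6183, 6120):+.1f}%")    # ~+1.0%
print(f"score implied by a 22% gap over 6120: {6120 * 1.22:.0f}")       # ~7466
```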
Maybe he was trying to say that you first get 150+ watts for a few months, and for all the reviews to be made... and then with a magical BIOS/vBIOS update we will reduce your TGP to 80 watts for your best protection. LOL
-
Yeah, Dell shows its beauty: http://forum.notebookreview.com/thr...ew-by-ultra-male.828258/page-14#post-10929641
A 240W PSU with a 3080 mobile, even if it's castrated, is darn disgusting when you know it was already too weak for previous models.
Yeah, the new Ampere models will most likely be added to the list... https://www.dell.com/support/kbdoc/...r-battery-drain-while-ac-adapter-is-connected