Edit: I made an error in the comparison so I deleted this post. Sorry.
-
-
Looks like RT was on. At least that's what he says
-
I was talking about GPU ("the card") undervolting using Afterburner. My remaining points stand I guess
While we are at it, his CPU is slower too.
My Spanish is a bit rusty, but the title of the YT video is:
Shadow of the Tomb Raider Benchmark on RTX 3080 Mobile (ULTRA SETTINGS + RAY TRACING) -
yep ray tracing on would explain a lot, I mean tbh for that card it would be ridiculous to NOT have rt on. Good catch.
etern4l likes this. -
-
Isn't this supposed to be an enthusiast community? It seems sitting here comparing desktop cards with massive TGP's to laptop cards where the TGP varies wildly and cooling solutions are far inferior is folly. Why not just compare it straight up to the previous 20 series mobile silicon and figure out if it's worth upgrading from your current 20 series (or even 10 series) or not?
Comparing desktop cards with massive coolers and usually triple-fan (at least dual-fan) heatsinks seems... silly and a false equivalency?
Even if nVidia is calling them both 3080, aren't we smart enough as enthusiasts to just know that's bollocks and not equivalent, and do the mental comparison based on what we know of physics?
Kunal Shrivastava, alaskajoel, etern4l and 2 others like this. -
AB-SO-LUTELY! Sorry I missed that detail I just assumed it was all laptop hardware being compared on a laptop site LOL
hfm likes this.
-
The 2080 Super 200W is considered a laptop card when I download the driver from Nvidia. It is also considered a laptop card when I compare it with similar systems in 3DMark. My system is a laptop with a desktop CPU; look at my signature.
So far we have no information that there will ever be more than a 150+ watt (200 watts for laptop) version of the 3080. I was trying to compare last year's most powerful laptop card (2080 Super) to the most powerful laptop card we actually have information about, which is the 3080 at 130 watts so far.
I don't doubt the 3080 laptop will beat all the previous generation cards, but like you I am interested in knowing if it's worth the upgrade $$ for me. So far I am disappointed, but I keep my mind open and look at the results we have right now.
Last edited: Jan 24, 2021
seanwee, Papusan, hfm and 1 other person like this. -
-
I will post the results, even if it makes my system look bad. I am trying to be as accurate as possible. Like everyone else I am trying to guess what the new cards are really capable of.
I will use his exact same settings (but with ray tracing on this time). No overclock, no undervolt.
NuclearLizard, hfm and etern4l like this. -
IMHO the only disappointment should be with Dell for cynically not providing an upgrade path for users who spent thousands on a brand new 51M. I bet if you could get a GPU upgrade for a grand, and sell the old DGFF card on eBay, you would be much more inclined to look favourably at that proposition. That was the original idea of the 51M anyway (catching up to Clevo), and Dell completely dropped the ball at its execution.
As for the 3080 200W, we should just wait for an announcement from Alienware. There has been no hint of a new 51M so far, they seem to be referring to the m15/m17 as the flagship in their pre-release marketing materials. The brand has been going downhill for a while, so it wouldn't be too much of a surprise if they cut their once ambitious flagship. Or maybe it's some sort of marketing strategy: lower consumers' expectations so they won't be disappointed with what they get: an ever more limited, non-upgradable "desktop replacement".
Last edited: Jan 24, 2021
Normimb, Eclipse251 and hfm like this. -
@kevin,@papusan, @Spartan@HIDevolution,@etern4l,@captn.ko,@JRE84, @hfm
OK guys, earlier I made a big mistake comparing last year's strongest laptop card to this year's using a different version of Shadow of the Tomb Raider for the benchmark.
Now I have downloaded the correct version and ran a test, all stock for me (don't know about this guy), with the exact same settings this guy set.
I have a much stronger CPU (10900K running at 49x) and this guy has an i7 (I think).
This guy says he has a 130W card (is it a 115W + 15W boost like previously mentioned by @etern4l? I think so).
This guy's results: 79 FPS (3080 laptop 130W)
My results: 82 FPS (2080S laptop 200W)
This guy's 3080:
My result, 2080S:
You will notice all settings are the same (I really hope I didn't make any mistakes) and that the 3080 is only about 4% slower (might be equal after a driver update) compared to the 2080S of last year. For sure the 150+ watt version will beat the 2080S, but not by the big margin we were expecting. I hope this helps the community evaluate the real performance of the 3080 compared to last gen.
Edit: I just noticed that if you look at the lower right corner, the CPU and GPU scores are separated. The 3080 then beats my 2080S by a few frames.
Last edited: Jan 24, 2021
seanwee, Kunal Shrivastava, Sk0b0ld and 1 other person like this. -
yrekabakery Notebook Virtuoso
-
Many thanks bro.
OK, so basically a 115W + maybe 15W chip (it's too bad the guy didn't capture the actual power draw) with a 45W TDP CPU (running at maybe 25W, I guess) in a thin and light (2.4 kg) laptop is providing the same performance as a previous-gen 185W GPU + twice-the-TDP CPU desktop replacement. That's extraordinary if you think about it. The 185W/200W 3080 models will blast last gen to smithereens. On top of that the new 3080 has 16GB of VRAM....
Last edited: Jan 24, 2021
Papusan, Kunal Shrivastava and Normimb like this. -
@Normimb
Thx for the test.
Just my 2 cents
1. In CPU-limited tests your 5 GHz 10-core is way ahead of the 10870H. That has an impact on the overall score as well.
2. The 130W 3080 is as fast as the 200W 2080S. We know that desktop Ampere is nearly as efficient as Turing, but it looks like notebook Ampere is more efficient (same score, but lower power).
A 150W+ (200W) 3080 should beat the 2080S by 15-20% and could reach 3070/2080 Ti level. That's what I expect.
Kunal Shrivastava, etern4l and Normimb like this. -
You think 50 to 75% more power (seems the baseline here is 115W) will yield 15-20% better performance? Sounds like a very conservative estimate.
-
It's the same chip the 3070 desktop has (a few more shaders). So yeah, I expect 3070 level with a higher PL (that's the best case)... more would be unrealistic...
-
yrekabakery Notebook Virtuoso
Not really. Power scales quadratically with clocks/voltage. For example, the 200W 2080/2080S is only 10% faster than the 150W version. -
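To illustrate why, here is a back-of-the-envelope sketch, not measured data: it assumes GPU power roughly follows frequency times voltage squared, with voltage tracking frequency near the top of the curve, so GPU-bound performance scales roughly with the cube root of power. The wattages are the ones discussed above.

```python
# Toy model: P ~ f * V^2 and V ~ f near the top of the clock range, so P ~ f^3.
# Then f ~ P^(1/3): performance (when GPU-bound) grows much slower than power.
def clock_gain(power_ratio, exponent=3.0):
    """Rough relative clock/performance gain for a given power increase."""
    return power_ratio ** (1.0 / exponent) - 1.0

baseline_w = 115  # baseline wattage discussed above
for watts in (130, 150, 165, 200):
    gain = clock_gain(watts / baseline_w) * 100
    print(f"{watts}W vs {baseline_w}W: roughly +{gain:.0f}% clocks (toy model)")

# Roughly: 130W ~ +4%, 150W ~ +9%, 165W ~ +13%, 200W ~ +20%, which lines up
# with the ~10% figure quoted above for the 200W vs 150W 2080/2080S.
```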
Yes, the 3080 200W (if it ever exists) will be a big improvement over the 2080S laptop. Plus, I edited my previous post: if you look at the lower right corner of the benchmark results, you will see that this guy's GPU actually beats my 2080S GPU by a few frames. If on the other hand there is never a 200+W version of this 3080 and only the 150+W, then I don't expect more than a 20% gain over the 2080S 200 watts.
Like I mentioned before, I just edited my post; go see the benchmark results and look at the lower right corner. The 3080 beats my 2080S. I agree a 150 watt version will beat my 2080S, but for it to reach 2080 Ti?... or 3070... hmm... possible, but I am still sceptical.
Last edited: Jan 24, 2021 -
I guess. This gimped 3080 is running at fairly low clocks of 1600 and 60C, but how much higher could it go at 200W: 1900? 2000? The latter, optimistic, case (+25%), on top of the increase in score due to a more powerful CPU, would put 25%+ DTR vs DTR in the realm of possibility. A very respectable improvement, but not Earth-shattering.
Also, I just noticed this GE66 Raider is a 15 inch laptop LOL. It's something I would consider if Dell fails again with their m15 refresh by soldering RAM (and they almost certainly will). Love that massive 99WHr battery, the same one the better old Alienware used to fit.
Overall, given that I'm not clear on the impact of the CPU performance difference on this benchmark, I am going to go with a conservative +30% estimate here (an extra 5% to account for other architectural improvements not taken into account), assuming the refresh will stay on the 10th gen CPUs. We'll find out in a couple of months, unless Alienware bins Azor's toy out of spite.
Last edited: Jan 24, 2021 -
I'd keep an eye on it. The GE series seems to have double tripod heatsinks. I was looking at its big brother (GE76) vs the Scar 17, or whichever chassis has the 150W GPU from Asus.
etern4l likes this.
-
That doesn't sound good. You mean there are just 3 screws around CPU and 3 around GPU?
-
I did some tests in SOTTR with my desktop 2080 Super. Just curious to see if the RTX 3080 can compete against a desktop card from the previous generation. The tests were made with the same settings as in the video. The HWiNFO logs were taken during the benchmark. These are the settings I used:
Specs:
CPU: i9-9900KS (8/16)
GPU: RTX 2080 Super MSI Gaming X Trio
1st test with my daily UV-settings:
CPU: Stock @ 5.0 GHz, Vcore 1.221V
GPU: UV 0.845V @ 1875 MHz
(GPU power avg in 2h gaming: 125-130W)
2nd test with stock GPU settings:
3rd test with slight GPU OC (+75/ +500):
Between the UV setting (130W avg) and the OC setting (230W avg) there is only an 8 fps difference.
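Just to put that in perf-per-watt terms, a rough sketch: the absolute fps value is a placeholder, since only the 8 fps delta and the average wattages are given above.

```python
# Placeholder baseline: the runs above report only the ~8 fps gap and the
# average board power, so BASELINE_FPS is hypothetical, not a measured value.
BASELINE_FPS = 90.0            # hypothetical average fps of the ~130W undervolted run
UV_WATTS, OC_WATTS = 130, 230  # average GPU power from the HWiNFO logs above
oc_fps = BASELINE_FPS + 8      # the reported ~8 fps gained by overclocking

print(f"UV: {BASELINE_FPS / UV_WATTS:.2f} fps/W, OC: {oc_fps / OC_WATTS:.2f} fps/W")
print(f"+{(OC_WATTS / UV_WATTS - 1) * 100:.0f}% power for +{8 / BASELINE_FPS * 100:.0f}% fps")
# With these placeholder numbers: ~77% more power for roughly 9% more fps,
# i.e. the undervolted card is far more efficient per watt.
```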
(just for reference, GPU bound: 66%)
Last edited: Jan 24, 2021
Papusan, Normimb, etern4l and 1 other person like this. -
Yea, just like that Alienware that had it. Probably not the best. I'd watch it, but I definitely recommend scrutinizing it more.
etern4l likes this.
-
I think I'll wait for this roller coaster to stop... tired of "faster than a 2080 Ti", then slower, then faster... bah... waiting till it's released to know.
-
Thanks for the warning - for completeness, which AW model suffers from this, if you remember?
-
The 15R3 and 17R4, I think. TBH I haven't followed Alienware so much since they killed the 18... the first time.
etern4l likes this.
-
Btw, how do we know the 3080 Max-Q in the GE66 is 130W?
etern4l likes this. -
The 115W 3070 is posting 10700 graphics score on Timespy, which is better than the average 2080 Super mobile.
https://reddit.com/r/GamingLaptops/comments/l46dro/_/gkmuofd/?context=1
etern4l likes this. -
Person on Reddit with a GP76 w/ RTX 3070 posted stock Firestrike and Time Spy scores:
The GPU scores are pretty much 200W RTX 2080 territory. -
This guy «Geremoy» on YouTube posted this video on Cyberpunk:
-
Indeed it's very close, almost on par with my 2080S. If he is stock, then it's a big improvement over the 2070 laptop of last gen.
Last edited: Jan 24, 2021
-
We all know you can't compare the castrated 3080 mobile with the 3080 desktop card. But a 200W 3080 mobile with maxed-out silicon should easily match the 220W 3070 desktop card. And from before, we have seen that the real mobile cards perform around 20-23% above the power-capped Max-Q cards.
GeForce RTX 3070 Founders Edition review
Add +20% on top and we are where the 3070 desktop cards sit.
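As a rough sketch of that arithmetic (the graphics score below is a placeholder, not a measured result; only the relative math matters):

```python
# Hypothetical 3DMark-style graphics score for a power-capped 3080 Max-Q run.
maxq_score = 10000   # placeholder value
uplift = 0.20        # ~20-23% gap between capped Max-Q and full-power mobile, per the post above

full_power_estimate = maxq_score * (1 + uplift)
print(f"Estimated full-power (200W) mobile score: ~{full_power_estimate:.0f}")
# If the desktop 3070 scores in that same range, the 200W mobile 3080 should roughly match it.
```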
Clamibot, Spartan@HIDevolution, Normimb and 5 others like this. -
Personally I wouldn't mind getting an RTX 3080 200W at all. We'll see when they actually release it for the Clevo X170. I'd love to see it in action.
-
IMHO inconclusive. He didn't respond to the request for clarification below, or show any power numbers to back this up.
Bear in mind this is a thin and light. 130W is a lot. Does it have a vapour chamber?
Last edited: Jan 25, 2021 -
A 200W 3080M matching a 3070 desktop is to be expected. Still a disappointment knowing what it could have been. An actual 3080 die would have been able to match a 3070 desktop at 130W and beat it handily at 150W+.
Eclipse251, Papusan, Normimb and 1 other person like this.
-
It's disgusting when you know the jokeman Jensen gave all the love to the desktop cards. He treats laptop users as Nvidia's own milking machine.
'Giant Step into the Future': NVIDIA CEO Unveils GeForce RTX
seanwee, Eclipse251, Spartan@HIDevolution and 2 others like this. -
I'm old and indifferent to what might happen in the land of I Wish. If I can get 3070 desktop performance, my chin is going to drop.
hfm, NuclearLizard and etern4l like this. -
Hey brother
Yup... I didn't say it's for sure a 130W card, but that is the answer this guy «Geremoy» gave when asked. BTW, he is the one providing all the benchmark information on the 3080 we have right now.
Tomorrow we will not be talking about his results anymore...lol.
The 3070 laptop results another guy posted are much better. Close to what we were expecting from the RTX 30xx.
etern4l and NuclearLizard like this.
BrightSmith Notebook Evangelist
Very curious to see whether 16GB will make any difference for RTX and/or higher resolutions.
-
From what I remember, things like DLSS apparently chew up VRAM. Also, devs might wet their pants because that's apparently the #1 thing they ask for.
Hopefully it gets used and isn't just there to sell.
etern4l likes this.
yrekabakery Notebook Virtuoso
DLSS reduces VRAM usage, since the internal rendering resolution is lower.
Papusan, hfm, BrightSmith and 1 other person like this. -
BrightSmith Notebook Evangelist
If the game supports it. But that will probably become the standard anyway. -
That sounds incorrect. If you are rendering at 4K, say, then you need a 4K framebuffer - whether you are using DLSS or not. With DLSS on, you need additional memory for:
* The lower resolution frame buffer used for primary rendering
* Any memory required by the DLSS algo itself (it might be a fair amount given the massive parallelization likely needed to get it to work as fast as it does). -
yrekabakery Notebook Virtuoso
-
Interesting, but that's just one game. What likely happens here is that, because the rendering occurs at a lower resolution, the engine automatically uses lower resolution textures. In theory, not every game would dumb textures down when rendering at a lower resolution.
Additionally, DLSS 2.0 will require additional memory since it includes temporal feedback (it needs the history of past frames). In fairness, all these extra frame buffers probably don't impact VRAM utilisation much compared to textures (a 4K buffer takes up around 32MB, an FHD frame about 8MB), so the open question is how much VRAM is needed by the DLSS implementation itself. Sadly, the Green Goblin is not sharing DLSS documentation publicly as far as I can see.
Last edited: Jan 25, 2021
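For reference, the frame buffer arithmetic behind those figures (assuming a plain 4-byte-per-pixel colour buffer; real engines keep several such buffers plus depth and G-buffers, so this is just the per-buffer cost):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Size of a single colour buffer in MiB, assuming e.g. 8-bit RGBA (4 bytes/pixel)."""
    return width * height * bytes_per_pixel / (1024 ** 2)

print(f"4K  (3840x2160): {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB
print(f"QHD (2560x1440): {framebuffer_mib(2560, 1440):.1f} MiB")  # ~14.1 MiB
print(f"FHD (1920x1080): {framebuffer_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB
# So the extra buffers DLSS needs (low-res render target plus history frames)
# are on the order of tens of MiB, small next to typical texture memory.
```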