Edit: I made an error in the comparison, so I deleted this post. Sorry.
-
While we are at it, his CPU is slower too.
Shadow of the Tomb Raider Benchmark on RTX 3080 Mobile (ULTRA SETTINGS + RAY TRACING) -
Yep, ray tracing on would explain a lot. I mean, tbh for that card it would be ridiculous to NOT have RT on. Good catch.
-
Isn't this supposed to be an enthusiast community? It seems sitting here comparing desktop cards with massive TGPs to laptop cards, where the TGP varies wildly and cooling solutions are far inferior, is folly. Why not just compare it straight up to the previous 20-series mobile silicon and figure out whether it's worth upgrading from your current 20-series (or even 10-series) or not?
Comparing against desktop cards with massive coolers and usually triple-fan (at least dual-fan) heatsinks seems... silly and a false equivalency?
Even if NVIDIA is calling them both 3080, aren't we smart enough as enthusiasts to just know that's bollocks and not equivalent, and do the mental comparison based on what we know of physics?
-
So far we have no information that there will ever be more than a 150W+ (200W for laptop) 3080. I was trying to compare last year's most powerful laptop card (2080 Super) to the most powerful laptop card we actually have information about, which so far is the 130W 3080.
I don't doubt the 3080 laptop will beat all the previous-generation cards, but like you I am interested in knowing whether it's worth the upgrade $$ for me. So far I am disappointed, but I keep my mind open and look at the results we have right now.
-
I will use his exact same settings (but with ray tracing on this time). No overclock, no undervolt.
-
As for the 3080 200W, we should just wait for an announcement from Alienware. There has been no hint of a new 51M so far; they seem to be referring to the m15/m17 as the flagship in their pre-release marketing materials. The brand has been going downhill for a while, so it wouldn't be too much of a surprise if they cut their once-ambitious flagship. Or maybe it's some sort of marketing strategy: lower consumers' expectations so they won't be disappointed with what they get: an ever more limited, non-upgradable "desktop replacement".
-
@kevin, @papusan, @Spartan@HIDevolution, @etern4l, @captn.ko, @JRE84, @hfm
OK guys, earlier I made a big mistake comparing last year's strongest laptop card to this year's, using a different version of Shadow of the Tomb Raider for the benchmark.
Now I have downloaded the correct version and ran a test: all stock for me (don't know about this guy) and the exact same settings this guy set.
I have a much stronger CPU (10900K running at 49x) and this guy had an i7 (I think).
This guy says he has a 130W card (is it 115W + 15W turbo, as previously mentioned by @etern4l? I think so).
This guy's results: 79 FPS (3080 laptop, 130W)
My results: 82 FPS (2080S laptop, 200W)
This guy's 3080:
My result, 2080S:
You will notice all settings are the same (I really hope I didn't make any mistake) and that the 3080 is only ~4% slower (might be equal after a driver update) compared to last year's 2080S. For sure the 150W+ version will beat the 2080S, but not by the big margin we were expecting. I hope this helps the community evaluate the real performance of the 3080 compared to last gen.
Edit: just noticed that if you look at the lower right corner, the CPU and GPU scores are separated. The 3080 then beats my 2080S by a few frames.
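For anyone who wants to sanity-check the comparison, here is a minimal sketch of the arithmetic (Python; the 79/82 FPS averages and the 130W/200W TGPs are just the figures quoted in this thread, not independent measurements):

```python
# Relative performance of the two results quoted above.
fps_3080_130w = 79.0   # 3080 laptop @ ~130W (GE66 Raider video)
fps_2080s_200w = 82.0  # 2080S laptop @ 200W (Normimb's run)

deficit = (fps_2080s_200w - fps_3080_130w) / fps_2080s_200w * 100
print(f"3080 @ 130W is {deficit:.1f}% slower")  # ~3.7%, i.e. the ~4% above

# Crude efficiency comparison, using TGP as a stand-in for measured draw.
print(f"3080:  {fps_3080_130w / 130:.2f} FPS/W")   # ~0.61
print(f"2080S: {fps_2080s_200w / 200:.2f} FPS/W")  # ~0.41
```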
yrekabakery Notebook Virtuoso
-
OK, so basically a 115W + maybe 15W chip (it's too bad the guy didn't capture the actual power draw) in a thin and light (2.4kg) laptop with a 45W TDP CPU (running at maybe 25W, I guess) is providing the same performance as a previous-gen desktop replacement with a 185W GPU and a CPU at twice the TDP. That's extraordinary if you think about it. The 185W/200W 3080 models will blast last gen to smithereens. On top of that, the new 3080 has 16GB of VRAM....
-
@Normimb
Thx for the test.
Just my 2 cents
1. In CPU-limited tests your 5GHz 10-core is way ahead of the 10870H. That has an impact on the overall score as well.
2. The 130W 3080 is as fast as the 200W 2080S. We know that desktop Ampere is nearly as efficient as Turing, but it looks like notebook Ampere is more efficient (same score, but lower power).
The 150W+ (200W) 3080 should beat the 2080S by 15-20% and could reach 3070/2080 Ti level. That's what I expect.
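For what it's worth, projecting that expectation onto the numbers above (a sketch; the 15-20% uplift is captn.ko's guess, not a measurement):

```python
# Projecting the expected 150W+ 3080 uplift onto the 82 FPS 2080S result.
fps_2080s = 82.0
for uplift in (0.15, 0.20):
    print(f"+{uplift:.0%}: ~{fps_2080s * (1 + uplift):.0f} FPS")
# +15%: ~94 FPS, +20%: ~98 FPS
```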
-
It's the same chip the desktop 3070 has (a few more shaders). So yeah, I expect 3070 level with a higher PL (that's the best case)... more would be unrealistic...
-
yrekabakery Notebook Virtuoso
-
Also, I just noticed this GE66 Raider is a 15-inch laptop LOL. It's something I would consider if Dell fails again with their m15 refresh by soldering the RAM (and they almost certainly will). Love that massive 99Wh battery, the same one the better old Alienwares used to fit.
Overall, given that I'm not clear on the impact of the CPU performance difference on this benchmark, I am going to go with a conservative +30% estimate here (an extra 5% to account for other architectural improvements not taken into account), assuming the refresh stays on 10th-gen CPUs. We'll find out in a couple of months, unless Alienware bins Azor's toy out of spite.
-
I did some tests in SOTTR with my desktop 2080 Super, just curious to see if the RTX 3080 can compete against a desktop card from the previous generation. The tests were made with the same settings as in the video, and the HWiNFO logs were captured during the benchmark. These are the settings I used:
Specs:
CPU: i9-9900KS (8/16)
GPU: RTX 2080 Super MSI Gaming X Trio
1st test with my daily UV-settings:
CPU: Stock @ 5.0 GHz, Vcore 1.221V
GPU: UV 0.845V @ 1875 MHz
(GPU power avg in 2h gaming: 125-130W)
2nd test with stock GPU settings:
3rd test with slight GPU OC (+75/+500):
Between the UV setting (130W avg) and the OC setting (230W avg) there is only an 8 FPS difference.
Just for reference:
GPU bound: 66%.
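To put the diminishing returns in perspective, a quick sketch using the averages above (assuming the average-power deltas translate directly, which is a simplification):

```python
# Marginal gain from the overclock in the runs above:
# ~100W of extra average GPU power buys only ~8 FPS.
uv_watts, oc_watts, extra_fps = 130, 230, 8
extra_watts = oc_watts - uv_watts
print(f"+{extra_watts}W -> +{extra_fps} FPS "
      f"({extra_fps / extra_watts:.2f} FPS per extra watt)")
```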
-
I think I'll wait for this roller coaster to stop... tired of "faster than a 2080 Ti", then slower, then faster... bah. Waiting till it's released to know.
-
-
Btw, how do we know the 3080 Max-Q in the GE66 is 130W?
-
The 115W 3070 is posting a 10700 graphics score in Time Spy, which is better than the average mobile 2080 Super.
https://reddit.com/r/GamingLaptops/comments/l46dro/_/gkmuofd/?context=1
-
Person on Reddit with a GP76 w/ RTX 3070 posted stock Firestrike and Time Spy scores:
The GPU scores are pretty much 200W RTX 2080 territory.
-
GeForce RTX 3070 Founders Edition review
-
Bear in mind this is a thin-and-light; 130W is a lot. Does it have a vapour chamber?
-
'Giant Step into the Future': NVIDIA CEO Unveils GeForce RTX
-
I'm old and indifferent to what might happen in the land of I Wish. If I can get 3070 desktop performance, my chin is going to drop.
-
Yup... I didn't say it's for sure a 130W card, but that is the answer this guy «Geremoy» gave when asked. BTW, he is the one providing all the benchmark information on the 3080 that we have right now.
Tomorrow we will not be talking about his results anymore... lol.
The 3070 laptop results another guy posted are much better. Close to what we were expecting from the RTX 30xx.
-
BrightSmith Notebook Evangelist
Very curious to see whether 16GB will make any difference for RTX and/or higher resolutions.
-
Hopefully it gets used and isn't just a selling point.
yrekabakery Notebook Virtuoso
DLSS reduces VRAM usage, since the internal rendering resolution is lower.
BrightSmith Notebook Evangelist
-
* The lower resolution frame buffer used for primary rendering
* Any memory required by the DLSS algo itself (it might be a fair amount given the massive parallelization likely needed to get it to work as fast as it does). -
yrekabakery Notebook Virtuoso
-
Additionally, DLSS 2.0 requires extra memory since it includes temporal feedback (it needs the history of past frames). In fairness, all these extra frame buffers probably don't impact VRAM utilisation much compared to textures (a 4K buffer takes up around 32MB, an FHD frame about 8MB), so the open question is how much VRAM is needed by the DLSS implementation itself. Sadly, the Green Goblin is not sharing DLSS documentation publicly as far as I can see.
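For reference, those buffer figures follow directly from width × height × bytes per pixel; a minimal sketch of the arithmetic, assuming an uncompressed 32-bit RGBA colour buffer:

```python
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed RGBA8 colour buffer, in MiB."""
    return width * height * bytes_per_pixel / 2**20

print(f"4K:  {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB (the ~32MB above)
print(f"FHD: {framebuffer_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB (the ~8MB above)
```

DLSS 2.0's temporal feedback then means keeping a history of a few such buffers plus motion vectors, which is still small next to multi-GB texture pools; the unknown remains the algorithm's own working memory.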
yrekabakery Notebook Virtuoso
How will Ampere scale on laptops?
Discussion in 'Gaming (Software and Graphics Cards)' started by Kunal Shrivastava, Sep 6, 2020.