You know "a little bit about computers" so I should trust your judgement?
That memory is shared on the consoles, so a portion of it is already allocated to other system tasks, and there's bus contention for access to it from the CPU. WE DON'T NEED MORE THAN 10GB unless you're planning on trying to run everything at 4K slammed to ULTRA LARGE TEXTURES without DLSS. Don't be suckered into paying Nvidia for another 10GB of memory you don't need unless it'll just make you feel good. I'd rather put that money toward something else (a year of Game Pass and probably two other AAA games on top of it, or $200-400 toward a new laptop). AMD can (RUMOR MILL, THEY HAVEN'T ANNOUNCED ANYTHING) push larger memory pools because they're using cheaper GDDR6, and they're also doing it to try to compete with Nvidia.
By the time 10GB isn't enough anymore we'll all be moving on to the 4000 series nvidia GPUs or 7000 series Radeon GPUs.
I've been watching VRAM consumption while playing a few games at 1600p, and I haven't seen anything use over 4-5GB even with decent texture quality. Don't bother posting videos about Call of Duty using all the VRAM; it's just ALLOCATING it, not actually using it. I think Gamers Nexus or HUB already debunked this recently.
yrekabakery Notebook Virtuoso
Yeah. The old Xbox One X already has 12GB of unified GDDR5, which is more than any gaming card has for its VRAM.
NVIDIA GeForce RTX 3090 Is 10-15% Faster Than The GeForce RTX 3080 On Average in 4K Gaming - Designed As A Prosumer-Class Product, Expect Limited Availability on Launch
https://wccftech.com/nvidia-geforce...-4k-gaming-confirms-limited-supply-at-launch/
NVIDIA has unveiled more official performance metrics for their fastest GeForce RTX 30 series graphics card, the GeForce RTX 3090. According to NVIDIA, the GeForce RTX 3090 is designed specifically for gaming at ultra-high resolutions such as 8K, and it also delivers the fastest graphics performance in professional applications, making it a top-of-the-line prosumer-class product.
In their GeForce blog, NVIDIA states that the GeForce RTX 3090 is termed the "BFGPU" because it's not just a traditional gaming-oriented graphics card. Since its unveiling, the GeForce RTX 3090 has been seen as a Titan replacement, which is evident from its specifications and market positioning.
https://videocardz.com/newz/nvidia-...is-10-to-15-faster-than-rtx-3080-in-4k-gaming
Edit.
Btw, users are reporting crash-to-desktop issues with custom GeForce RTX 3080 models (videocardz.de).
This now clearly widespread, but still unexplained, issue appears in a certain group of games when the boost clock exceeds 2.0 GHz. As soon as the clock speeds reach a certain level, the game crashes to desktop.
Know when to back down and you'll save yourself a lot of humiliation
I have no idea why you're even supporting them so adamantly; you didn't buy a 3090, and there's literally no incentive for you to defend the 3090.
Yeah, as we knew, it's for the people that:
- Need the VRAM. Supposedly DE at 8K was using 16GB of VRAM in Ultra Nightmare, but that's 8K. Only an extreme edge case of people will be using 8K for the next couple of years at least; hardly anyone is even using 4K right now. I was testing DE on my eGPU setup last night running it at 1600p Ultra Nightmare and it was using 4-5GB of VRAM. Content-creator workloads, on the other hand, often eat gobs of VRAM, but those people will have the card pay for itself in time saved on jobs, which means a revenue increase.
- People that need the extra cores because all they care about is benchmark wars.
With such a small performance gain of 10% over the 3080, there's definitely no space for a 3080 Ti though. That settles that speculation.
3080@10GB - $699
3080@20GB - $899
A Ti card smells more like $1,099, or $100 above the previous top dog 2080 Ti. Yeah, Nvidia still has room for better earnings.
It could also be that GDDR6X is just that expensive right now... shrug...
Here's a thought: what if they release a 20GB 3080, but instead of GDDR6X it uses GDDR6 and is priced the same as the regular 3080?
B&H doesn't even list the 3080 as a product any longer; I guess they'll re-list them when they actually get some. Wow.
If you mean when they're actually going to be in stock... shrug... I'm planning on probably not being able to easily get one for 1-3 months.
On average I'd say 90-93% of the performance of the 3080. So if I'm not able to get a 3080 for a good price and I see a sweet deal on a 2080 Ti, I'll just grab a 2080 Ti; performance-wise, at the same power consumption, they're not that different.
electrosoft Perpetualist Matrixist
Check Hardware Unboxed's review for all the details
Spartan@HIDevolution Company Representative
From a friend:
TLDR: The RTX 30XX Series is NOT compatible with the Alienware Graphics Amplifier
I’m happy to be back with a Turing GPU. Picked up a 2080Ti FE last night, it has Samsung memory and appears to be a really new model based on the S/N.
I am shopping for a water block now, going to juice her up to 532 watts for a poor man's RTX 3080!
I have an MSI 2080 Ti Ventus OC. It's a reference PCB. Also have it watercooled.
Think the 300W vBIOS is a nice fit for this card? Also looking into squeezing a little more performance out of it.
Flash to the Galax 380-watt BIOS; it's for the reference design too.
Keep it under 55C and it'll lock down 2.1GHz in any game with just the Galax BIOS. That's around a 25-30% performance uplift over a 2080 Ti running stock.
For going further:
To get past 2,100-2,130MHz you're gonna need to run it even cooler; a good custom loop helps. Use the same Galax 380-watt BIOS, and solder on (2) 8-ohm resistors for the 8-pin connectors. This will reduce the power reading by 40%, allowing you to pull 532 watts.
Then you'll be able to run 2,145-2,175MHz locked. Maybe more if you can keep it cool with a water chiller or something.
2.1GHz locked will match an RTX 3080 though. So that's all ya need really.
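For what it's worth, the numbers above are internally consistent: a 380-watt reported limit that lets you pull 40% more power lands right on the 532 watts mentioned. A quick back-of-the-envelope sketch (the function and variable names are my own, and the 40% figure is taken from the post, not measured):

```python
# Sanity-check the shunt-mod arithmetic from the post above.
# Assumption: the mod scales the card's power *reading* down, so at the
# same reported 380 W limit the real board draw is ~40% higher.

def real_draw(reported_limit_w: float, uplift: float) -> float:
    """Actual board power when the card under-reads by `uplift`."""
    return reported_limit_w * (1.0 + uplift)

galax_limit_w = 380.0  # Galax vBIOS power limit
uplift = 0.40          # "pull 40% more" per the post

print(round(real_draw(galax_limit_w, uplift)))  # → 532, matching the post
```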
I have owned (4) 2080Ti's; they are beastly.
The Galax BIOS helps a lot though. These cards are starving for power even at the 320-watt limit. I just picked up this FE last night, and it's still on the stock air cooler with the Galax 380-watt BIOS. The performance doesn't translate well to gaming though, because the card gets too hot lol.
I am expecting around 17,300-17,400 Timespy graphics once I get it watercooled.
A-bin, that's it. It was the main reason for choosing this model when I got it, a reasonably long time ago.
Meanwhile I was upgrading my loop and opening up the front and top panels on my Phanteks Evolv mATX because it was really struggling for air intake.
Now it's finally ready for the proper vBIOS. Thanks for the refresher!
Where's the best place to source the BIOS? Techpowerup? From that list, would the 10 year be the most appropriate?
I have an EK block on my GPU but skimped on the EK backplate because I don't like the look and figure those are mostly for show. No problems running it with a bare back on this BIOS, right? I think it gets more than adequate airflow over it.
The memory on my 2080Ti is god like so far! I’m around +1,200 and its
Here's the one you need! That's what I am running. Make sure to pull yours off of GPU-Z as a backup just in case.
https://www.techpowerup.com/vgabios/204557/kfa2-rtx2080ti-11264-180910
@Shark00n
You should be all set for 2.1GHz. I wouldn't bother with a backplate; they're only for looks and usually just make the card run hotter anyway.
Watercooled GPUs don't even get hot to the touch on the back of the PCB.
I was running 2,130MHz on a NON-A card previously. It was actually voltage limited. The A variants can run 1.050 to 1.068V, I believe. I dunno, my FE doesn't run cool enough to use either of those lol.
My 2080Ti is at 16,600 now on the memory and climbing, "gaming stable"! Never seen this before. It has some crazy good VRAM.
Is NVFlash (ID mismatch disabled) in Windows the best software to use? Or should I do it in DOS?
Already backed up my original vBIOS. Any other safety steps you recommend? Haven't flashed a vBIOS since the days of the Radeon X850 Pro, I think.
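For reference, the usual NVFlash sequence (Windows or DOS) looks roughly like the sketch below. It's written as a dry-run wrapper that only prints the commands; the filenames are placeholders, and while `--save`, `--protectoff`, and `-6` (override the subsystem-ID mismatch) are the commonly used switches, double-check them against your NVFlash version's help output before flashing anything:

```shell
# Dry-run sketch: with DRYRUN=1 the commands are printed, not executed.
# Filenames are placeholders; verify switches with `nvflash64 --help`.
DRYRUN=1
run() { if [ "$DRYRUN" = "1" ]; then echo "$@"; else "$@"; fi; }

run nvflash64 --save original_backup.rom   # 1) back up the current vBIOS
run nvflash64 --protectoff                 # 2) disable the EEPROM write protect
run nvflash64 -6 galax_380w.rom            # 3) flash, ignoring the ID mismatch
```

Only set DRYRUN=0 (from an elevated prompt on Windows) once you're happy with the backup.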
I’m still using the FE cooler and I managed this.
Also, I too had an X850 Pro back in the day. I flashed it to an X850XT.
Here’s my timespy.
https://www.3dmark.com/3dm/52921404
Thanks.
Yeah, I used the command prompt; everything seemed to work nicely.
But I just realised my card might actually not be flashable.
Although it's an MSI A-bin, I got it in early 2020 and I think the XUSB FW is updated and not compatible. I always get the following message:
Unless anyone knows of a vBIOS with updated XUSB FW and a higher power limit?
Can't seem to find any.
Maybe this one? https://www.techpowerup.com/vgabios/219116/msi-rtx2080ti-11264-200203
40W higher PL than mine. But it's a custom PCB... Any insight on this, @tps3443? Info about this online is limited. Thanks!
I'm surprised it broke 17K. It'll be fully watercooled in a couple of days. I expect closer to 18K in Timespy graphics. It shouldn't have any issues matching an RTX 3080.
Anyways, I play at 2560x1440 165Hz.
I would say go straight for soldering the shunts then. It'll uplift your power by 40%. It is safe, very easy, and you can continue to use your stock BIOS.
My 2080Ti is flashed to the 380-watt Galax BIOS, and I've soldered on the 8-ohm shunts. That's why it's breaking 17K Timespy graphics on just air cooling. But it's sucking down power like nobody's business; it's pulling like 425 watts through a Timespy run. Once I cool it down, it'll help lower that power usage some.
RTX 3080 trumps 2080ti by a whopping margin
Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jun 24, 2020.