You know "a little bit about computers" so I should trust your judgement?
That memory is shared on the consoles, so a portion of it is already allocated to other system tasks, and there is bus contention with the CPU for access to it. WE DON'T NEED MORE THAN 10GB unless you are planning on running everything at 4K slammed to ULTRA LARGE TEXTURES without DLSS. Don't be suckered into paying Nvidia for another 10GB of memory you don't need, unless it'll just make you feel good. I'd rather put that money toward something else: a year of Game Pass and probably two other AAA games on top of it, or $200-400 towards a new laptop. AMD can (RUMOR MILL, THEY HAVEN'T ANNOUNCED ANYTHING) push larger memory pools because they are using cheaper GDDR6. They are also doing it to try to compete with Nvidia.
By the time 10GB isn't enough anymore we'll all be moving on to the 4000 series nvidia GPUs or 7000 series Radeon GPUs.
I've been watching VRAM consumption while playing a few games at 1600p, and I haven't seen anything use over 4-5GB even with decent texture quality. Don't bother posting videos about Call of Duty using all the VRAM; it's just ALLOCATING it, not actually using it. I think Gamers Nexus or HUB debunked this recently.
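For anyone who wants to watch the same numbers themselves, here's a minimal sketch (my own, not from any tool mentioned above) that parses the CSV output of `nvidia-smi`. Note that the driver's `memory.used` figure is itself closer to allocation than to a game's true working set, which is exactly the allocated-vs-used distinction being made here; the sample string stands in for a live call.

```python
# Parse `nvidia-smi` CSV output to log VRAM reported as used.
import csv
import io

def parse_vram(csv_text):
    """Return a list of (used_mib, total_mib) tuples, one per GPU line."""
    rows = []
    for row in csv.reader(io.StringIO(csv_text)):
        used = int(row[0].strip().split()[0])   # e.g. "4875 MiB"
        total = int(row[1].strip().split()[0])  # e.g. "10240 MiB"
        rows.append((used, total))
    return rows

# On a live system you would feed in the output of:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader
sample = "4875 MiB, 10240 MiB"
for used, total in parse_vram(sample):
    print(f"{used} / {total} MiB ({100 * used / total:.0f}%)")
```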
-
Recently I started playing Resident Evil 7 at 1080p. The game has 4 options for textures: low, medium, high, and very high. The crazy thing is that even though the textures from medium to very high are exactly the same resolution, the difference is the number of textures kept in VRAM. Going from very high down to medium, the game takes longer to stream the textures in, and you can actually see them load while playing; I mean the textures go from blurry to clear in front of your eyes. So I had to put textures at least on high to avoid this undesired visual issue, and the game peaks at around 94% of my VRAM (around 7700 MB). I've never seen a game use that much VRAM at 1080p.
-
There are some games that just allocate all the VRAM but never actually use it all. Does the allocation decrease when you lower the texture quality? That's an interesting use case. Do you notice a difference in texture quality when you bump down, setting aside the undesirable LoD pop-in? I've noticed that if I drop textures from whatever ULTRA is to the next quality level down, a large percentage of the time I can't tell the difference.
-
This is a key point that many people overlook. The consoles may have more VRAM, but they don't have separate system RAM like PCs do. We'll have to wait for benchmark numbers to come in to make a proper comparison, but I doubt that games designed for a console with 16GB of unified memory and no separate system RAM will struggle with memory issues on a PC that has 10GB VRAM + 16GB system RAM.
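As a back-of-the-envelope illustration of that point (the ~2.5 GB OS reserve on the Series X is widely reported, but treat the exact numbers as assumptions, not specs):

```python
# Console: one unified pool shared by the OS, CPU-side game data, and GPU data.
console_unified_gb = 16.0
console_os_reserve_gb = 2.5  # widely reported Series X figure; an assumption here
console_game_budget = console_unified_gb - console_os_reserve_gb  # everything fits in this

# PC: split pools, so CPU-side game data doesn't compete for VRAM at all.
pc_vram_gb = 10.0
pc_system_ram_gb = 16.0
pc_total_game_budget = pc_vram_gb + pc_system_ram_gb

print(f"Console budget (GPU + CPU data combined): {console_game_budget} GB")
print(f"PC budget: {pc_vram_gb} GB VRAM + {pc_system_ram_gb} GB RAM = {pc_total_game_budget} GB")
```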
-
yrekabakery Notebook Virtuoso
Yeah. The old Xbox One X already has 12GB of unified GDDR5, which is more than any gaming card has for its VRAM.
-
Wow RTX 2080Ti 11GB is so lucky it was able to keep up with ports from the One X...
-
And what's wrong with my posts? I can't find anything wrong with them; most of them are spot on. I can't say the same for many other posts. So don't drag me in a direction I don't belong.
NVIDIA GeForce RTX 3090 Is 10-15% Faster Than The GeForce RTX 3080 On Average in 4K Gaming - Designed As A Prosumer-Class Product, Expect Limited Availability on Launch
https://wccftech.com/nvidia-geforce...-4k-gaming-confirms-limited-supply-at-launch/
NVIDIA has unveiled more official performance metrics of their fastest GeForce RTX 30 series graphics card, the GeForce RTX 3090. According to NVIDIA, the GeForce RTX 3090 is designed specifically for gaming at ultra-high resolutions such as 8K and also delivers the fastest graphics performance in professional applications making it a top of the line prosumer-class product.
In their GeForce blog, NVIDIA states that the GeForce RTX 3090 is termed as "BFGPU" because it's not just a traditional gaming-oriented graphics card. Since its unveiling, the GeForce RTX 3090 has been seen as a Titan replacement which is evident from its specifications and market positioning.
https://videocardz.com/newz/nvidia-...is-10-to-15-faster-than-rtx-3080-in-4k-gaming
Edit.
Btw, users are reporting crash-to-desktop issues with custom GeForce RTX 3080 models (videocardz.de).
This now clearly widespread, but still unexplained, issue appears in a certain group of games when the boost clock exceeds 2.0 GHz. As soon as the clock speeds reach a certain level, the game crashes to desktop. -
@JRE84 Just looking at the specs you know there's no way the 3090 could be significantly faster than the 3080. There's just no way.
Know when to back down and you'll save yourself a lot of humiliation
I have no idea why you're even supporting them so adamantly. You didn't buy a 3090; there's literally no incentive for you to support it. -
Yeah, as we knew it's for the people that
- Need the VRAM. Supposedly DE at 8K was using 16GB of VRAM in Ultra Nightmare, but that's 8K. Only an extreme edge case of people will be running 8K for the next couple of years at least; hardly anyone is even using 4K right now. I was testing DE on my eGPU setup last night at 1600p Ultra Nightmare and it was using 4-5GB of VRAM. Content creator workloads, on the other hand, often eat gobs of VRAM, but for those people the card will pay for itself in time saved, which means a revenue increase.
- People that need the extra cores because all they care about is benchmark wars.
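The gap between 16GB at 8K and 4-5GB at 1600p mostly tracks pixel count: framebuffers and render targets scale with resolution, while texture pools largely don't. A quick sketch of the arithmetic (illustrative only):

```python
# Per-pixel buffer memory scales linearly with resolution.
def pixels(w, h):
    return w * h

res_1600p = pixels(2560, 1600)   # 4,096,000 px
res_4k    = pixels(3840, 2160)   # 8,294,400 px
res_8k    = pixels(7680, 4320)   # 33,177,600 px

print(f"8K vs 1600p: {res_8k / res_1600p:.1f}x the render-target memory")
print(f"8K vs 4K:    {res_8k / res_4k:.0f}x")
```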
-
With such a small performance gain of 10% over the 3080, there is definitely no space for a 3080 Ti though. That settles that speculation.
-
Oh, don't put it past nvidia to release a 3080Ti anyway. Gotta milk all these potential customers who are still going to whine about getting only 10GB of VRAM.
-
The flagship could be the 3080@20GB card, perhaps with a Super moniker instead of Ti next fall. Nvidia already struggles to make enough cards as it is, but higher revenue may lure them into releasing a 3080 Ti.
3080@10GB - $699
3080@20GB - $899
A Ti card smells more like $1099, i.e. $100 USD above the previous top dog 2080 Ti. Yeah, Nvidia still has room for better earnings.
-
That would be crazy... a possible 3080 Ti beating the 3090. Going by the numerical designation it makes no sense, but anything can happen, and it's the only way that card would sell. A 3080 with 20 GB of VRAM and a 0-5% performance improvement would, to me, be just a gimmick.
-
The only thing I could see them doing eventually is dropping the price of the 3090 and then launching a new Titan class card that can use the certified driver enhancements for workstation tasks.
It could also be that GDDR6X is that expensive right now.. shrug... -
Here's a thought: what if they release a 20GB 3080, but instead of GDDR6X it's using GDDR6, priced the same as the regular 3080?
-
It is true that the 2000 series Super cards were only narrowly faster than their non-super counterparts... SHRUG.. I don't see it being less than $899, who knows. I'm sure nVidia is just waiting to see where AMD prices their cards and how well they compete in performance. It's like a poker game where nVidia is only showing 3 of their 5 or 6 cards.. (3070/3080 "Ti/Super" and 3060 being the "hidden" ones)
-
B&H doesn't even list the 3080 as a product any longer, I guess they will re-list them when they actually get some. Wow.
-
I'm new to desktop GPU buying stuff... how long until we start to see retail price for the RTX 3080?
-
I think retail price is going to be the FE and some select AIB models at $699, then a bunch of AIB models in the 700's and maybe some in the 800's if they think they are adding tons of value like silicon lottery factory overclocks or premium cooling / power delivery. I'm just going to be happy with a run of the mill stockish clock card that has quiet fans. Asus TUF sounds nice.
If you mean when are they going to actually be in stock.. shrug.. I'm planning on probably not being able to easily get one for 1-3 months.. -
I've seen a review where a 2080 Ti power-modded to 320 W is not that far away from a stock RTX 3080.
On average I'd say 90-93% of the 3080's performance. So if I'm not able to get a 3080 for a good price and I see a sweet deal on a 2080 Ti, I'll just grab a 2080 Ti; performance-wise, at the same power consumption, they are not that different. -
I'm going to hold out and wait to get a 3080. I'm not interested in power modding or extreme overclocking. I'm not really unhappy with 2070 performance right now (Cyberpunk isn't out yet so I'll let you know how I feel after that lol) and am OK to wait till I can casually buy the AIB model I'm interested in. Gives time for more reviews to land about fan noise etc. Asus TUF looks good though. Also gives time to see what AMD is going to announce.
-
What makes you go for one brand over another? I've seen that within the same brand there are like 3 different variants of the same RTX 3080, with the only difference being the boost clock. If I'm happy with stock speeds, I'd just go for the cheapest card regardless of brand, no?
-
I was paying attention to fan noise dbA in reviews pretty much. Also the card has to fit in my eGPU, some of the AIB cards exceed the 55mm height limit for the card dimensions Sonnet says will fit.
-
The TUF 3080 is the best-built MSRP card and is actually better than some cards that are above MSRP, so it's a really good buy.
-
electrosoft Perpetualist Matrixist
-
Yeah, but what do you mean by that? What makes that card better built? Does it have better memory chips, better components, a better cooling system, or just a better enclosure?
-
Better VRMs, an all-MLCC capacitor layout, and a dedicated heatsink for the VRAM.
Check Hardware Unboxed's review for all the details
-
Well, after checking the video you posted, I'm sold on the TUF. I hope to be able to get one of these at retail price in the future.
-
I'm thinking January at this point unless nv + samsung have been doing some massive production over the last few weeks.
-
Spartan@HIDevolution Company Representative
From a friend:
TLDR: The RTX 30XX Series is NOT compatible with the Alienware Graphics Amplifier
-
I thought the AGA was just a generic PCIe x4 link; what a shame it apparently isn't. That pretty much kills its upgrade capabilities. It could be months, if ever, before Dell releases a firmware update for the AGA to support the RTX 3000 series. I've seen some guys with the AGA board installed in an ATX chassis with a custom power supply, external display, and external keyboard, and I thought, why not just buy a proper desktop instead of that Frankenstein, lol.
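For context on why an x4 link is such a bottleneck concern in the first place, here's a rough bandwidth sketch, assuming the AGA runs PCIe 3.0 (my assumption; Dell doesn't document the link clearly):

```python
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b line encoding.
def pcie3_gbps(lanes):
    gt_per_s = 8e9           # transfers per second per lane
    encoding = 128 / 130     # usable payload fraction of the line rate
    return lanes * gt_per_s * encoding / 8 / 1e9  # bytes/s -> GB/s

print(f"x4 : {pcie3_gbps(4):.2f} GB/s")   # roughly a quarter of a desktop slot
print(f"x16: {pcie3_gbps(16):.2f} GB/s")
```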
-
Today's question will be... do Dell and Alienware really want to go through another round of bad publicity? Or is the ongoing class action investigation enough for now?
-
Wait a minute, the Area-51m R1 can't upgrade to the 2080 Super? That would be pretty much a scam; Dell created its own graphics form factor precisely so users would get an easier upgrade path. The CPU, OK, I get it, that's how it works: new socket equals no more upgrades. I remember back in the day when Alienware ditched MXM solutions and opted for the AGA as a future-proof solution; I thought it was actually a good idea, though I still preferred a laptop with an internal MXM slot. Maybe it's time for a new AGA with at least PCIe x8 support, and one that doesn't take a performance hit while using the internal display.
-
That is odd... if you're going to bother with all that just build a desktop and sync your files on a NAS or something if you don't want two computers.
-
-
I’m happy to be back with a Turing GPU. Picked up a 2080Ti FE last night, it has Samsung memory and appears to be a really new model based on the S/N.
I am shopping for a water block now, going to juice her up to 532 watts for a poor man's RTX 3080! -
I have a MSI 2080 Ti Ventus OC. It's a reference PCB. Also have it watercooled.
Think the 300W vBIOS is a nice fit for this card? Also looking into squeezing a little more performance out of it -
I have owned a Ventus OC 2080Ti. You’ve got an “A Bin model” so that’s good.
Flash to the Galax 380 watt bios, it is for reference design too.
Keep it under 55C and it'll lock down 2.1GHz in any game, with just the Galax BIOS. That's around a 25-30% performance uplift over a 2080 Ti running stock.
For going further:
To get past 2,100-2,130MHz, you're gonna need to run it even cooler; a good custom loop helps here. Use the same Galax 380 watt BIOS, and solder on two 8-ohm resistors for the 8-pin connectors. This will reduce the power reading by 40%, allowing you to pull 532 watts.
Then you’ll be able to run 2,145-2,175Mhz locked. Maybe more if you could keep it cool with a water chiller or something.
2.1Ghz locked will match a RTX3080 though. So, that’s all ya need really.
I have owned four 2080 Tis; they are beastly.
The Galax BIOS helps a lot though. These cards are starving for power even at the 320 watt limit. I just picked up this FE last night, and it is still on the stock air cooler with the Galax 380 watt BIOS. The performance doesn't translate well to gaming though, because the card gets too hot lol.
I am expecting around 17,300-17,400 Timespy graphics once I get it watercooled.
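Taking the figures in this post at face value, the shunt-mod arithmetic works out like this (a sketch of the poster's numbers, not a spec; the exact under-report factor depends on the resistor values and shunt used):

```python
# A resistor soldered in parallel with the current-sense shunt lowers the
# voltage the power controller sees, so the card under-reports its draw.
bios_limit_w = 380          # Galax 380 W vBIOS limit
under_report_factor = 1.4   # poster's mod: card reads ~40% low (assumption)

# The controller still enforces 380 W *as it sees it*, so real draw is higher.
real_draw_w = bios_limit_w * under_report_factor
print(round(real_draw_w))   # matches the 532 W figure quoted above
```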
-
A-bin, that's it. It was the main reason for choosing this model when I got it, a reasonably long time ago.
Meanwhile I was upgrading my loop and opening up the front and top panels on my Phanteks Evolv mATX because it was really struggling for air intake.
Now it's finally ready to get the proper vBIOS
Thanks for the refresher!
Where's the best place to source the BIOS? Techpowerup? From that list, would the 10 year be the most appropriate?
I have an EK block on my GPU but skimped on the EK backplate because I don't like the look and figure those are mostly for show. No problems running it with a bare back on this BIOS, right? I think it gets more than adequate air-flow over it. -
The memory on my 2080Ti is godlike so far! I'm around +1,200 and its
Here's the one you need! That's what I am running. Make sure to pull yours off of GPU-Z as a backup just in case.
https://www.techpowerup.com/vgabios/204557/kfa2-rtx2080ti-11264-180910 -
@Shark00n
You should be all set for 2.1GHz. I wouldn't bother with a backplate; they're only for looks and usually just make the card run hotter anyway.
Watercooled GPU’s don’t even get hot to the touch on the back of the PCB.
I was running 2,130MHz on a NON-A card previously. It was actually voltage limited. The A variants can run 1.050 to 1.068 V, I believe. I dunno, my FE doesn't run cool enough to sustain either of those lol. -
My 2080Ti is at 16,600 now on the memory and climbing, "gaming stable"! Never seen this before. It has some crazy good VRAM.
-
Going to do the 380W galax bios flash today.
Is NVFlash (ID mismatch disabled) in Windows the best software to use? Or should I do it in DOS?
Already backed up my original vBIOS. Any other safety steps you'd recommend? Haven't flashed a vBIOS since the days of the Radeon X850 Pro, I think.
-
You can flash it directly in Windows using the command prompt. If you have a Founders Edition 2080 Ti, then you must download the proper updated NVFlash for "Mismatched ID"; NVFlash version 5.590 will work fine.
I’m still using the FE cooler and I managed this.
Also, I too had a X850 Pro back in the day. I flashed it to a X850XT.
Here’s my timespy.
https://www.3dmark.com/3dm/52921404? -
Thanks.
Yeah, I used the command prompt; everything seemed to work nicely.
But I just realised my card might actually not be flashable.
Although it's an MSI A-bin card, I got it in early 2020 and I think the XUSB firmware is updated and not compatible. I always get the following message:
So I think it's not possible without specialized tools. Too bad
Unless anyone know of a vBIOS with updated XUSB FW and a higher power limit?
Can't seem to find any.
Maybe this one? https://www.techpowerup.com/vgabios/219116/msi-rtx2080ti-11264-200203
40W higher PL than mine. But it's a custom PCB... Any insight on this @tps3443? Info about this online is limited. Thanks! -
Jesus, that's high... I'm getting 4000 and game on high at 1080p... What are you aiming for, 4K 144Hz lol... Nice setup dude.
-
This card is still air cooled, so I'm surprised it broke 17K. It'll be fully watercooled in a couple days; I expect closer to 18K in Timespy graphics. It shouldn't have any issues matching an RTX 3080.
Anyways, I play at 2560x1440p 165Hz. -
Well your card is not a founders edition, so did you use the normal NVflash?
I would say go straight for soldering the shunts then. It’ll uplift your power by 40%. It is safe, very easy, and you can continue to use your stock bios.
My 2080Ti is flashed to the 380 watt Galax BIOS, and I've soldered on the 8-ohm shunts. That's why it's breaking 17K Timespy graphics on just air cooling. But it's sucking down power like nobody's business; it's pulling like 425 watts through a Timespy run. Once I cool it down, that power usage should drop some.
RTX 3080 trumps 2080ti by a whopping margin
Discussion in 'Gaming (Software and Graphics Cards)' started by JRE84, Jun 24, 2020.