Nvidia is promising to unveil the "biggest breakthroughs in PC gaming since 1999" on September 1 with a GeForce presentation. Presumably Ampere GeForce GPUs. Quite a bold statement. We'll see.
https://www.techradar.com/news/nvid...eptember-1-it-looks-like-ampere-is-on-its-way
-
I'm looking forward to the 3000 series. The first gen of RT was fun to use, and there were some games that showed off hints of what was possible. DLSS 2.0 finally pushed through something that was truly helpful. Nvidia did a good job pushing some of these technologies into the mainstream, even if there were the bumps of being first gen. The 2080 and 2080 Ti, coming up on two years later, are still the fastest consumer-level video cards available. They have still not been challenged. That's not Nvidia's fault.
I can't wait to get my hands on 2nd-gen RT and more performance and features in general. Whether that comes from AMD or Nvidia this time remains to be seen. I'm going to assume I'll be buying a 3080 at some point between September and April, but I'm still waiting for everyone (let's be real: AMD and Nvidia, I'm not trusting Intel to be competitive here yet) to put their cards on the table, and I want to see reviews from trusted sources first.
-
Charles P. Jefferies Lead Moderator Super Moderator
I could not figure out what to do with the posts in this thread so I deleted them all. They weren't relevant to the original post.
As a note if you are going to post a video, you must include your commentary with it. We don't allow video-only posts outside of Off-Topic.
-
GeForce RTX 3090 has been confirmed by Micron with VRAM specs: 12GB GDDR6X "over a 384-bit memory interface at 19-21 Gbps for a total bandwidth of between 912 to 1008 GBps, which would push the RTX 3090 over the 1 TBps milestone."
People are having a lot of fun with the number 21 showing up everywhere: the 21-day countdown to the September 1 announcement, which comes about 21 years after Nvidia's first GeForce GPU. Now it's 21 Gbps and 12 GB of VRAM (21 reversed). People are getting a little carried away with their interpretations, I think. But who knows, given Nvidia's terrible naming schemes, maybe they'll announce an "RTX 21" GPU that's even better than the 3090, just to mess with us.
https://www.tomshardware.com/amp/news/micron-confirms-rtx-3090-will-have-over-1-tbs-gddr6x-bandwidth
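For anyone who wants to check the math on that Micron leak: peak memory bandwidth is just the bus width (converted to bytes) multiplied by the per-pin data rate. A quick sketch, with a function name of my own invention:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored RTX 3090: 384-bit interface, GDDR6X at 19-21 Gbps per pin
print(peak_bandwidth_gbs(384, 19))  # 912.0 GB/s
print(peak_bandwidth_gbs(384, 21))  # 1008.0 GB/s -- past the 1 TB/s milestone
```

Which lines up exactly with the 912-1008 GB/s range in the article.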
Last edited: Aug 16, 2020 -
I heard we would see 4x the ray tracing performance... can anyone confirm?
-
-
Well, we made it. The "biggest breakthrough in PC gaming in 21 years" is clearly referring to "breakthrough prices" for Nvidia, with a $1500 3090 graphics card.
Specs from nvidia for 3090 ($1500), 3080 ($700), 3070 ($500): https://www.anandtech.com/show/1605...re-for-gaming-starting-with-rtx-3080-rtx-3090
A shame that the 3070 is only getting 8GB of the older GDDR6 memory instead of the GDDR6X that the 3080 and 3090 are getting. But despite having 8GB of VRAM instead of 11GB, the other specs suggest the 3070 might beat the performance of the 2080 Ti. I'll be waiting for independent reviews of the cards to see.
Also--not that it bothers me--but unless you want to shell out $3000 for two 3090s, SLI is now officially dead.
Hopefully the improvements to ray tracing support are meaningful.
And TDP is up quite a bit from last gen: 3090=350W, 3080=320W, 3070=220W.
The 3090 and 3080 are coming out this month, the 3070 next month. If we're lucky, we may see 3060s announced before the year's end. I wouldn't count on laptops with Ampere GPUs until next year.
Last edited: Sep 1, 2020 -
saturnotaku Notebook Nobel Laureate
-
Clevo usually gives the GPU about 50W less TDP than its desktop variant, so I am guessing 250-270W in Clevo laptops for a mobile RTX 3080.
The X170's cooling should be enough, and the AW 51m R2 and the ASUS Chimera also have cooling good enough to handle it with good-quality thermal paste.
The GPU core clock will be lower due to power throttling, but it will still easily beat the RTX 2080 for the same price.
-
I was totally expecting to wait, but I also wasn't expecting the 3080 to be only $699... I might have to just jump on it release day.
-
-
And if that 3070 is only $499, there goes the resale value of my 2070 LOL. :shrug: -
Final Fantasy XV eats all 8GB of my 980M at 1080p, and I'm sure it would love to eat more. 8GB was great in 2014, but in 2020 we really need 16GB.
-
Yeah, and Watch Dogs: Legion will eat 15GB on high... at 1080p... I'm preaching to the wrong crowd.
-
yrekabakery Notebook Virtuoso
-
What do you think the trend will be? Over 10GB? Or will they cap at 4GB?
-
-
GrandesBollas Notebook Evangelist
The VRAM question is going to be interesting. I'll need to see benchmarks comparing Nvidia's offerings (the 10GB flavor) with AMD's (16GB). A lack of VRAM should manifest itself in lower FPS, as the GPU will have to fall back on system RAM. I'm not too concerned with today's crop of games, though 4K gaming at the highest settings and textures could challenge performance in VRAM-starved cards. I find it interesting that, looking forward, PS5-compatible games will make use of 16GB of VRAM. Porting such games to PCs with less VRAM than that could be a challenge.
-
Yeah, I got into an argument here over it. In the end no one really knows... it's looking like the trend is going to be above 10GB.
-
-
yrekabakery Notebook Virtuoso
-
GrandesBollas Notebook Evangelist
Good catch! I did find the following Reddit thread:
https://www.reddit.com/r/PS5/comments/g316pf/will_16gb_ram_be_enough/
Specifically, look at this quote:
"Yes, 16GB is not really that much and I actually expected them to put maybe 4 extra gigabytes of slow memory in there, which they didn't. However, there is another major advancement that all the previous consoles didn't have: access to an ultra-fast storage solution. Compressed data can be fetched from the SSD at 9GB/s. This means that they can quickly swap out the assets that are stored in RAM. On PS4 you have 8GB of RAM. But these 8GB have to store a lot of data that is not immediately necessary. For example, if during your game you want to turn around a corner, the PS4 has to have everything that is around that corner stored in RAM as long as you are in that area. The PS5 will be fast enough that it only has to get that data into RAM when you are actually turning around that corner. This means that of these 16GB of RAM, a lot more will be available to game developers for moment-to-moment gameplay than on PS4."
Regardless of how the PS5 divvies up the RAM (CPU/GPU) now, future game developers will continue to expand their waistlines as more features and qualities are added and consumers continue to demand faster rendering. As @ajc9988 has mentioned in another thread, the GPU battle is now beginning, with consumers (and game developers) the eventual winners. All we have to do is wait. -
yrekabakery Notebook Virtuoso
The 3080 will be just fine. The combination of RTX I/O, Microsoft's DirectStorage API, and PCIe 4.0 drives in the PC space will smoke the PS5's SSD in raw throughput.
-
You are always so informative!
I'd never heard of the DirectStorage API. Do you have a source I could read up on?
-
Microsoft to Bring DirectStorage API to Windows in 2021...
DirectStorage is coming to PC - Microsoft Developer Blogs
-
OK, I read that. Basically it handles tens of thousands of I/O requests with far less overhead, reducing lag to and from the SSD... pretty cool. I'd never even heard of it, as my circle of friends never talks tech; I come here for my fix. Thanks a lot @Papusan, you are a golden character.
Also, can someone max out GTA V and see how close we get to 8GB of VRAM usage? I can't seem to top 4GB at 1080p.
Edit: Also, @Papusan, sorry for the ignorant post about you posting fluff. I was drinking that night and regret my words more than a fat kid regrets eating chocolate cake. -
-
Imagine GTA VI.
-
yrekabakery Notebook Virtuoso
-
Godfall needs 12GB memory for 4K gameplay with UltraHD textures (videocardz.de)
Interestingly, Keith Lee revealed that supporting 4X-by-4X UltraHD textures requires 12GB of VRAM. This means that the Radeon RX 6000 series cards, which all feature 16GB of GDDR6 memory along with 128MB of Infinity Cache, should have no issues delivering such high-resolution textures. It may also mean that the NVIDIA GeForce RTX 3080 graphics card, which only has 10GB of VRAM, will not be enough.
Yeah, Nvidia went cheapo on future-proofing gaming with the 3080. Not everyone swaps cards every second year. AMD cards normally perform better as they age, and Nvidia's 8-10GB cards won't exactly close the gap. -
GrandesBollas Notebook Evangelist
There is so much idiocy floating around regarding memory (RAM and VRAM) requirements for upcoming games. Some sites seem to use the terms interchangeably in the same sentence. I'm really waiting to see legitimate gaming benchmarks where we can see the impact of having less VRAM. As @yrekabakery mentioned yesterday, there are a bunch of optimizations that MS has developed to greatly speed data transfer from the SSD to the CPU/GPU.
Higher VRAM needs in the future are a reality, but not many titles today will require it. Aside from Godfall, Nvidia's VRAM disadvantage may not be that concerning. -
-
GrandesBollas Notebook Evangelist
Saw this rumor in my Google feed and thought I'd share: the 3080 Ti may not be cancelled after all. I know WCCF is not the most reliable news site...
https://wccftech.com/nvidia-geforce-rtx-3080-ti-20-gb-graphics-card-specs-leak/
-
-
-
Nvidia to unveil "biggest breakthroughs in PC gaming" in 21 years on Sep 1
Discussion in 'Gaming (Software and Graphics Cards)' started by Prototime, Aug 12, 2020.