The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia to unveil "biggest breakthroughs in PC gaming" in 21 years on Sep 1

    Discussion in 'Gaming (Software and Graphics Cards)' started by Prototime, Aug 12, 2020.

  1. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    Nvidia is promising to unveil the "biggest breakthroughs in PC gaming since 1999" on September 1 with a GeForce presentation. Presumably Ampere GeForce GPUs. Quite a bold statement. We'll see.

    https://www.techradar.com/news/nvid...eptember-1-it-looks-like-ampere-is-on-its-way
     
    hfm and JRE84 like this.
  2. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I'm looking forward to the 3000 series. The first gen of RT was fun to use, and there were some games that showed off hints of what was possible. DLSS 2.0 finally pushed through something that was truly helpful. nvidia did a good job pushing some of these technologies into the mainstream, even with the bumps of being first gen. The 2080 and 2080 Ti are, coming up on two years later, still the fastest consumer-level video cards available. They have still not been challenged. That's not nvidia's fault.

    I can't wait to get my hands on 2nd gen RT and more performance and features in general. Whether that comes from AMD or nVidia this time remains to be seen. I'm going to assume I'll be buying a 3080 at some point between September and April, but I'm still waiting for everyone (let's be real: AMD and nVidia, I'm not trusting Intel to be competitive here yet) to put their cards on the table, and to see reviews from trusted sources first.
     
    Prototime likes this.
  3. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,092
    Trophy Points:
    931
    I could not figure out what to do with the posts in this thread so I deleted them all. They weren't relevant to the original post.

    As a note if you are going to post a video, you must include your commentary with it. We don't allow video-only posts outside of Off-Topic.

    Charles
     
    TR2N, custom90gt, JRE84 and 1 other person like this.
  4. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    GeForce RTX 3090 has been confirmed by Micron with VRAM specs: 12GB GDDR6X "over a 384-bit memory interface at 19-21 Gbps for a total bandwidth of between 912 to 1008 GBps, which would push the RTX 3090 over the 1 TBps milestone."

    People are having a lot of fun with the number "21" showing up everywhere: the 21-day countdown to the September 1 announcement, which comes about 21 years after nvidia's first GeForce GPU. Now it's 21 Gbps and 12 GB of VRAM (21 reversed). People are getting a little carried away with their interpretations, I think. But who knows, given nvidia's terrible naming schemes, maybe they'll announce an "RTX 21" GPU that's even better than the 3090, just to mess with us :rolleyes:

    https://www.tomshardware.com/amp/news/micron-confirms-rtx-3090-will-have-over-1-tbs-gddr6x-bandwidth
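    Micron's range can be sanity-checked by hand: total bandwidth is just bus width times per-pin data rate. A quick sketch, assuming the rumored 384-bit bus and 19-21 Gbps per pin from the article:

```python
# GDDR6X bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
# Figures below are the rumored RTX 3090 specs from the linked article, not confirmed.

def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return total memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

low = memory_bandwidth_gbs(384, 19)   # 912.0 GB/s
high = memory_bandwidth_gbs(384, 21)  # 1008.0 GB/s
print(f"{low:.0f}-{high:.0f} GB/s")   # matches the quoted 912-1008 GB/s range
```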
     
    Last edited: Aug 16, 2020
    JRE84 and hfm like this.
  5. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    I heard we would see 4x the ray tracing performance.....can anyone confirm
     
  6. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    No one is going to be able to confirm anything of that sort until shipping cards are in hands. You might be able to repeat whatever Jensen says in the launch event, but that's not independent verification.
     
    Prototime and saturnotaku like this.
  7. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Yeah that's so true....even the experts are just guessing
     
    hfm likes this.
  8. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    Well, we made it. The "biggest breakthrough in PC gaming in 21 years" is clearly referring to "breakthrough prices" for nvidia, with a $1500 3090 graphics card :eek2: :twitchy:

    Specs from nvidia for 3090 ($1500), 3080 ($700), 3070 ($500): https://www.anandtech.com/show/1605...re-for-gaming-starting-with-rtx-3080-rtx-3090

    A shame that the 3070 is only getting 8GB of the older GDDR6 memory instead of the GDDR6X that the 3080 and 3090 are getting. But besides having 8GB of VRAM instead of 11GB, the other specs suggest that the 3070 might beat the performance of the 2080 Ti. I'll be waiting for independent reviews of the cards to see.

    Also--not that it bothers me--but unless you want to shell out $3000 for two 3090s, SLI is now officially dead.

    Hopefully the improvements to ray tracing support are meaningful.

    And TDP is up quite a bit from last gen: 3090=350W, 3080=320W, 3070=220W.

    The 3090 and 3080 are coming out this month, the 3070 next month. If we're lucky, we may see 3060s announced before the year's end. I wouldn't count on laptops with Ampere GPUs until next year.
     
    Last edited: Sep 1, 2020
  9. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    It is going to take some serious engineering and/or nerfing to get these GPUs into something a laptop can handle.
     
    Vasudev, hfm, Prototime and 1 other person like this.
  10. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    Yup. Remember when Pascal launched, nvidia dropped the "m" from laptop GPUs because they had finally reached near-parity with desktop GPUs? Good times.
     
  11. DRevan

    DRevan Notebook Virtuoso

    Reputations:
    1,150
    Messages:
    2,461
    Likes Received:
    1,041
    Trophy Points:
    181
    Clevo usually gives the GPU about 50W less TDP than its desktop variant, so I am guessing 250-270W in Clevo laptops for the mobile RTX 3080.
    The X170's cooling should be enough, and the AW 51m R2 and Asus Chimera also have good cooling that should be enough to cool it down with good quality thermal paste.
    GPU core clock will be lower due to power throttling, but it will still easily beat the RTX 2080 for the same price.
     
    Prototime likes this.
  12. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I was totally expecting to wait, but I also wasn't expecting the 3080 to be only $699... I might have to just jump on it release day.
     
  13. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    I hear you. The more I think about it, the more I'm considering doing the foolish thing and leaping for a 3070 on release for $499. Even if the reviews prove it doesn't beat the 2080 Ti, I have little doubt it'll blow my 1060 out of the water. Still, at least the wait for the 3070 will force me to hold off until October, when we should hear news about whether the Big Navi lineup may offer a worthwhile alternative.
     
    hfm likes this.
  14. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Man, that 3070 is super enticing at $499... My laptop CPU is already going to have issues trying to push data through TB3 to keep it busy, but I'm looking a little toward the future.

    And if that 3070 is only $499, there goes the resale value of my 2070 LOL. :shrug:
     
    JRE84 and Prototime like this.
  15. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    Final Fantasy XV eats all 8GB of my 980M @ 1080p and I'm sure it would love to eat more. 8GB was great in 2014, but in 2020 we really need 16GB
     
    Vasudev, JRE84 and electrosoft like this.
  16. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    yeah, and Watch Dogs Legion will eat 15GB on high...at 1080p...I'm preaching to the wrong crowd
     
  17. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Allocated != needed
     
    hfm, Prototime and JRE84 like this.
  18. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    What do you think the trend will be? Over 10GB? Or will they cap at 4GB?
     
    Vasudev likes this.
  19. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    OK, any other games that require high VRAM? I almost used 6GB of 8GB on my 980M on ultra in Jedi: Fallen Order, but it was stuttering intermittently.
     
  20. GrandesBollas

    GrandesBollas Notebook Evangelist

    Reputations:
    370
    Messages:
    417
    Likes Received:
    563
    Trophy Points:
    106
    The VRAM question is going to be interesting. I'll need to see benchmarks comparing Nvidia's offerings (10 GB flavor) with AMD's (16 GB). Lack of VRAM should manifest itself in lower FPS as the GPU will have to make use of onboard system RAM. I'm not too concerned with today's crop of games, though 4K gaming with the highest settings and textures could challenge performance on VRAM-starved cards. I find it interesting that, looking forward, PS5-compatible games will make use of 16GB of VRAM. Porting such games to PCs with less VRAM than that could be a challenge.
     
    Vasudev likes this.
  21. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Yeah, I got into an argument here over it; in the end no one really knows... it's looking like the trend is going to be above 10GB
     
  22. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
    GTA V, when you max out everything, can blow past 8GB of VRAM even at 1080p
     
  23. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The next-gen consoles have 16GB of memory shared between CPU and GPU.
     
    ajc9988, JRE84 and GrandesBollas like this.
  24. GrandesBollas

    GrandesBollas Notebook Evangelist

    Reputations:
    370
    Messages:
    417
    Likes Received:
    563
    Trophy Points:
    106
    Good catch! I did find the following reddit article:

    https://www.reddit.com/r/PS5/comments/g316pf/will_16gb_ram_be_enough/

    Specifically, look at this quote:
    "Yes, 16gb is not really that much and I actually expected them to put maybe 4 extra gigabytes of slow memory in there, which they didn't. However, there is another major advancement that all the previous consoles didn't have: access to an ultra fast storing solution. Compressed data can be fetched from the SSD at 9gb/s. This means that they can quickly swap out the assets that are stored in RAM. On PS4 you have 8gb of RAM. But these 8GB have to store a lot of data that are not immediately necessary. For example if during your game you want to turn around a corner, the PS4 has to have everything that is around that corner stored in RAM as long as you are in that area. The PS5 will be fast enough where it only has to get that data into RAM when you are actually turning around that corner. This means that of these 16gb of RAM a lot more will be available to game developers for moment to moment gameplay than on PS4."

    Regardless of how the PS5 divvies up the RAM (CPU/GPU) now, future game developers will continue to expand their waistlines as more features/qualities are added and consumers continue to demand faster rendering. As @ajc9988 has mentioned in another thread, the GPU battle is now beginning, with consumers (and game developers) the eventual winners. All we have to do is wait. ;)
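    The quoted 9 GB/s figure puts the streaming argument in perspective with some back-of-the-envelope math (numbers are from the quoted reddit post, not official specs):

```python
# Rough time to repopulate working memory from storage at a given throughput.
# Illustrative only: real streaming is incremental, not a full-RAM refill.

def refill_time_s(ram_gb: float, throughput_gb_s: float) -> float:
    """Seconds to stream ram_gb of assets at throughput_gb_s."""
    return ram_gb / throughput_gb_s

# PS5: 16 GB unified memory, ~9 GB/s compressed SSD throughput (quoted figure)
print(f"{refill_time_s(16, 9):.1f} s")    # ~1.8 s to turn over all of RAM
# A typical SATA SSD at ~0.5 GB/s would take ~32 s for the same job
print(f"{refill_time_s(16, 0.5):.0f} s")
```

That order-of-magnitude gap is why fast streaming can substitute for some of the RAM headroom older consoles needed.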
     
    Papusan, ajc9988 and JRE84 like this.
  25. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    [image]

    GTA V maxed is causing really high temps and only using 3-4GB of VRAM at 1080p, maxed even with the advanced settings... I don't see this surpassing 8GB even with 64x AA

    As for the other post... the PS5's SSD is so fast and advanced they can use it as quasi-RAM... I think it's looking ugly for the 3080 when new games are in
     
  26. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The 3080 will be just fine. The combination of RTX I/O, the Microsoft DirectStorage API, and PCIe 4.0 drives in the PC space will smoke the PS5's SSD in raw throughput.
     
    Papusan and JRE84 like this.
  27. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    you are always so informative! :)

    Never heard of the DirectStorage API??

    Do you have a source I could read up on?
     
    yrekabakery likes this.
  28. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    GrandesBollas and JRE84 like this.
  29. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181

    OK, I read that. Basically it cuts I/O requests by tens of thousands, reducing lag to and from the SSD... pretty cool... Never even heard of it, as my circle of friends never talk about tech; I come here for my fix... thanks a lot @Papusan, you are a golden character


    And also, can someone max out GTA V and see how close we get to 8GB of VRAM usage? I can't seem to top 4GB at 1080p


    edit also papusan sorry for the ignorant post about you posting fluff I was drinking that night and regret my words more than a fat kid regrets eating chocolate cake
     
    joluke and Papusan like this.
  30. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Ahh.. yes... It took 12GB and GTA V was unplayable.
     
    JRE84 likes this.
  31. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    imagine gta VI
     
  32. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Short summary of RTX I/O here:

     
    Vasudev, Papusan and JRE84 like this.
  33. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Godfall needs 12GB memory for 4K gameplay with UltraHD textures videocardz.de | Today

    Interestingly, Keith Lee revealed that in order to support 4X x 4X UltraHD textures a 12GB VRAM is required. This means that Radeon RX 6000 series, which all feature 16GB GDDR6 memory along with 128MB Infinity Cache should have no issues delivering such high-resolution textures. It may also mean that the NVIDIA GeForce RTX 3080 graphics card, which only has 10GB of VRAM, will not be enough.

    Yeah, Nvidia went cheapo for gaming into the future with the 3080. Not everyone swaps cards every 2nd year. AMD cards normally perform better as they age. And Nvidia's 8-10GB cards won't exactly close the gap.
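    For some context on why ultra-resolution texture packs eat VRAM so fast: uncompressed texture size grows with the square of resolution. A rough sketch (uncompressed RGBA8 for simplicity; real games use block compression, so actual footprints are several times smaller):

```python
def texture_size_mib(width: int, height: int, bytes_per_pixel: int = 4,
                     mipmaps: bool = True) -> float:
    """Approximate VRAM footprint of one texture in MiB.
    A full mipmap chain adds roughly 1/3 on top of the base level."""
    base = width * height * bytes_per_pixel / (1024 ** 2)
    return base * 4 / 3 if mipmaps else base

print(texture_size_mib(2048, 2048, mipmaps=False))  # 16.0 MiB
print(texture_size_mib(4096, 4096, mipmaps=False))  # 64.0 MiB
# Doubling texture resolution quadruples memory use, which is why an
# "UltraHD texture" option alone can push a game past a 10GB card.
```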
     
  34. GrandesBollas

    GrandesBollas Notebook Evangelist

    Reputations:
    370
    Messages:
    417
    Likes Received:
    563
    Trophy Points:
    106
    There is so much idiocy floating around regarding memory (RAM and VRAM) requirements for upcoming games. Some sites seem to use the terms interchangeably in the same sentence. I'm really waiting to see legitimate gaming benchmarks where we can see the impacts/costs of having less VRAM. As @yrekabakery mentioned yesterday, there are a bunch of optimizations that MS has developed to greatly speed data transfer from SSD to the CPU/GPU.

    Higher VRAM needs in the future are a reality. Not many titles today will require it. Aside from Godfall, Nvidia's VRAM disadvantage may not be that concerning.
     
  35. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I wouldn't even call it concerning for Godfall. Play with textures turned down one notch or even two. Hardly anyone can tell the difference. If you play at 1440p probably zero problems there even if you do use crazy texture size. I'd say if you are the type to replace your GPU every year or two, don't worry about it. If you keep it 3-4 years, maybe that's something to consider.
     
  36. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    You're bang on!! :)

    Below are pics of the "Can It Run Crysis?" textures vs. low, and the VRAM difference is huge

    [image]
    [image]


    I cannot see the change, but one is low and one is ultra. Now tell me why exactly people even run all ultra..

    and here is after doing a full system reboot

    [image]
    [image]
     
    Last edited: Nov 4, 2020
    hfm, Aroc and Prototime like this.
  37. GrandesBollas

    GrandesBollas Notebook Evangelist

    Reputations:
    370
    Messages:
    417
    Likes Received:
    563
    Trophy Points:
    106
  38. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    TBoneSan likes this.
  39. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,132
    Likes Received:
    1,567
    Trophy Points:
    181
    This reminds me of Far Cry 4. That game also has a laughably small difference between low and ultra settings. Ultra just makes the game lag with no discernible difference in graphical fidelity.
     
    JRE84 and Prototime like this.
  40. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    A lot of games (aka ports) are like that... it's more of an exception when ultra looks better than medium