The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    "Not Having a Ray Tracing GPU in 2019 is Just Crazy!" - Nvidia CEO

    Discussion in 'Gaming (Software and Graphics Cards)' started by hmscott, Aug 17, 2019.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    tl;dr - Jensen put his foot in his mouth again spewing BS about RTX. Nvidia researcher Morgan McGuire stepped out of the ranks at SIGGRAPH 2019 and predicted that the FIRST ray-tracing-only game won't arrive until 2023 - leaving today's RTX GPUs out of the running - and Assetto Corsa Competizione dumps NVIDIA RTX.

    Played out in painful slow motion, RTX is going nowhere fast - and the "early supporters" are reneging on their commitment to deliver RTX features - releasing without them and with no plan to ever provide them. It's going to be many years, and many generations of GPUs, before 100% full-time ray tracing in games is seen.

    Just ask Nvidia's researcher Morgan McGuire on Twitter:
    Morgan McGiuire #1.jpg

    "Morgan McGuire, one of Nvidia’s top researchers, took to Twitter earlier this week to predict that consumers can expect to see the first must-have game that requires ray tracing by the year 2023.

    The tweet also included a link to publicly available presentation slides by other Nvidia scientists for this year's Siggraph computer graphics conference. Most tantalizingly, the tweet featured an image of a single slide from McGuire's own presentation, "From Raster to Rays in Games," included among those of his Nvidia colleagues on the linked Siggraph page."

    Oh, it looks like [Nvidia?] took down Morgan McGuire's twitter post...how embarrassing for them...here is the text of the tweet as quoted in the video included below:
    Morgan McGiuire Quote.jpg
    And, Scott Herkelman of AMD agrees:
    Scott Herkelman Responds to Morgan McGuire.jpg
    Nvidia's RTX GPUs of today won't even be useful for the ray-tracing-only gameplay Morgan McGuire predicts will arrive by 2023. Only then - years from now - will powerful enough GPUs, and games optimized to run on those far more powerful GPUs, begin to be available.

    Nvidia's top researcher says here in 2019 that it will be 2023 before a ray-tracing-only game is viable, and I think he is being wildly optimistic - that's only 2 generations of GPUs away, and software developers are already burned out on providing useless RTX features - useless compared to spending resources on actual gameplay.

    The first top-tier ray tracing-only game will land in 2023, Nvidia guru says
    By Jonathan Terrasi — Posted on July 30, 2019 2:23PM PST
    https://www.digitaltrends.com/computing/nvidia-scientist-predicts-first-aaa-ray-tracing-game-2023/

    "In practice, however, it is possible to execute a play to cement dominance over a new format too early. AMD’s reluctance to chart a coherent long-term ray tracing strategy may be a sign that Nvidia has initiated just such a premature gambit, especially considering how ruthless and calculated AMD has been as of late — if AMD isn’t concerned about ray tracing, maybe Nvidia is getting ahead of itself."

    Another game developer refuses to waste time on RTX BS, and ships without it, abandoning the sinking ship of RTX:

    Assetto Corsa Competizione Dumps NVIDIA RTX
    by btarunr Friday, August 16th 2019, 01:25 Discuss (59 Comments)
    https://www.techpowerup.com/258340/assetto-corsa-competizione-dumps-nvidia-rtx#comments

    "Assetto Corsa Competizione, the big AAA race simulator slated for a September release, will lack support for NVIDIA RTX real-time raytracing technology, not just at launch, but even in the foreseeable future. The Italian game studio Kunos Simulazioni, in response to a specific question on the Steam Community forums, confirmed that the game will not receive NVIDIA RTX support.

    " Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation, " said the developer, in response to a question about NVIDIA RTX support at launch.

    This is significant, as Assetto Corsa Competizione was one of the posterboys of RTX, and featured in the very first list by NVIDIA, of RTX-ready games under development."

    Vya Domus - "The gist of all this is that developers are starting to realize that "hey this is a lot work and we have zero incentive to do it". That's all."

    Whether or not the rest of the early games promising RTX support go so far as to admit this, none of them want to waste time, money, or resources on RTX, as RTX offers their customers no real value - no improvement in gameplay - and developers know that gameplay is what gamers want to buy: gameplay first, shiny RTX nonsense dead last or not at all.

    "Not Having a Ray Tracing GPU in 2019 is Just Crazy!" - Nvidia CEO
    Boot Sequence
    Published on Aug 16, 2019
    Ahh, real-time ray tracing. The most revolutionary tech to come to PC gaming since the move to 3D graphics. Dare I say it, it's an even bigger deal than 3D. I mean, how dare you tell me that this doesn't look miles better than this. You have ray-traced shadows, global illumination, and even reflections. On top of that, it just works. Seriously, "at this point, it's a foregone conclusion that if you're going to buy a new graphics card, it's going to last you two, three, four years, and to not have ray tracing is just crazy."

    And scene. I don't believe a word of what I just said, but that last bit there - "At this point, it's a foregone conclusion that if you're going to buy a new graphics card, it's going to last you a long time and to not have ray tracing is just crazy." - is a direct quote from Nvidia's CEO Jensen Huang during the Q&A session for their earnings call.

    I mean how gullible does the company think gamers are[?] Sure, Ray tracing adds a new layer to the gaming world and it’s appreciable, but to say that “in 2019 it's crazy not to have a hardware accelerated real time ray tracing card” is the worst marketing ploy I've ever seen. Especially when the market has a completely different focus right now.

    We don't need ray tracing, we need affordable high refresh rate monitors to go with our affordable GPU's that can push these frame rates.

    The worst part is that one of Nvidia's top researchers even tweeted that consumers can expect to see the first game to require ray tracing in 2023. That's a long way away, and AMD's Scott Herkelman even agreed. By then, real-time ray tracing will have evolved, with more things being ray traced in the scene, making cards like the current Super line obsolete in that department. The really odd thing is that this tweet from the researcher is no longer available. Almost like a company wanted to shut him up.
    —————————————————————————————————
    TODAY'S DOSE OF TECH NEWS SOURCES
    —————————————————————————————————
    Nvidia CEO Quote : https://bit.ly/2YTh028
    Asseto Corsa not supporting RTX : https://bit.ly/2Naiden
     
    Last edited: Aug 17, 2019
  2. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    And here I thought I was crazy for wanting more AMD in laptops.
     
    joluke and hmscott like this.
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Nvidia had another terrible quarter, with the Nvidia bulls clutching at straws - all numbers down tremendously year over year. I'll update with a full list of numbers later; here's a short snippet:
    Nvidia 2019 2nd Quarterly Financials - looking bad willis....JPG

    Nvidia got a small upblip by the bulls from the earnings call, but Nvidia's still way down:
    Nvidia down overall the last year - small upblip is nothing.JPG

    On the same earnings call where Jensen made that silly statement, he was trying to push the RTX GPUs - which aren't selling - but what about all of the new Nvidia non-RTX GPUs? Apparently Jensen thinks those are a bad buy / investment for the future...

    Nvidia's CEO calls it "crazy" to buy a GPU without raytracing
    Erm... what about the GTX Turing series?
    Published: 16th August 2019 | Source: Nvidia | Author: Mark Campbell
    https://www.overclock3d.net/news/gp...ls_it_crazy_to_buy_a_gpu_without_raytracing/1

    " Nvidia's CEO calls it "crazy" to buy a GPU without raytracing
    During the company's Q2 FY2020 Financials webcast, Nvidia's CEO Jensen Huang has stated that it's "crazy" to purchase a new graphics card that lacks support for raytracing.

    Here's the full quote; "At this point, it’s a foregone conclusion that if you’re going to buy a new graphics card, it’s going to last you two, three, four years, and to not have ray tracing is just crazy."

    While Jensen's statement makes sense from a futureproofing perspective, it doesn't make sense when you look at Nvidia's current product lineup. Within the past three months, Nvidia has released their GTX 1660 Ti, GTX 1660 and GTX 1650 graphics cards, all of which lack support for RTX raytracing. Yes, Nvidia has, in a roundabout way, said that it is "crazy" to purchase three of its most recent graphics cards.

    With his statement, Jensen is targeting AMD's Radeon RX 5700 series of graphics cards, which currently lack support for hardware-accelerated raytracing. These graphics cards compete with the Geforce RTX and Geforce RTX Super series, all of which support hardware-accelerated raytracing.
    (Jensen doesn't recommend the GTX 1660 Ti)"

    AngryGoldfish - "Maybe I'm reading it wrong, but it sounds like he's trying to get people to buy big Turing because of lack lustre sales and is hoping consumers will not actually 'keep their graphics card for four years' and will instead replace said 2080/2080Ti with a new and shiny one in 2020 or 2021".

    Bridges - "You'd be crazy to invest in a proprietary, poorly supported, overpriced implementation of ray-tracing too. Given that both of the next consoles are all AMD and will support ray-tracing to some degree, the idea that people are going to be playing ray-traced games on their RTX 2070 four years from now is pretty funny."

    Source:
    Nvidia CEO Jensen Huang calls it "crazy" to buy a GPU without raytracing.
    Thread starter IbizaPocholo Start date Yesterday at 9:37 PM
    https://www.neogaf.com/threads/nvid...razy-to-buy-a-gpu-without-raytracing.1497937/

    CRON - "I played Metro Exodus with raytracing enabled and not once during my playthrough did I notice it."

    lol - Jensen Huang calls “crazy” buying a video card without ray tracing
    Discussion in ' The Guru's Pub' started by LIGuitar77, Aug 17, 2019 at 1:29 PM.
    https://forums.guru3d.com/threads/lol-jensen-huang-calls-“crazy”-buying-a-video-card-without-ray-tracing.428077/

    -Tj- "Yeah he saw it doesn't sell so good, so had to make some silly statements..."

    SerotoNiN - "Considering this claim of buying a gpu for the next 2-4 years...ray tracing is so abysmal just in frame rates right now that it would be crazy to buy an RTX card now and think it will run games 2-4 years out. Especially with a new generation of consoles with more demanding graphics a year away. What a stupid comment by a petty man. I don't mind nvidia as a product, but every time they open their mouth, it makes me want to go full on AMD next upgrade. And I just might."
     
    Last edited: Aug 17, 2019
    joluke likes this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Coreteks takes a realistic look at why Nvidia released such a bad product in RTX... going back as far as 3dfx...

    The FUTURE of Graphics
    Nov 19, 2019
    Coreteks


    Aggnog 1 week ago
    "When your whole life flashes before your eyes, how much of it do you want to not have ray tracing?"

    TheSaipher 1 day ago
    "in 2019 we dont have 4k at 144hz but hey RTX is here ...."

    friedkangaroo 1 week ago
    "2080ti here i never used rt in any games i play, wow six games hahaha"
     
    Last edited: Nov 29, 2019
  5. thegreatsquare

    thegreatsquare Notebook Deity

    Reputations:
    135
    Messages:
    1,068
    Likes Received:
    425
    Trophy Points:
    101
    Not as crazy as being the sucker who falls for Nvidia's plea to help clear out their old overstock.

    I'd really like to wait until Hopper, timing wise it will be the first architecture launch after the new-gen console releases and that's the sweetest sweet spot IMO. But while the MCM design isn't new in general, a company's implementation of any new design is often a bit conservative. Plus there could be an Ampere-Super stretching out the timeline. If I chose to go with a midrange Ampere laptop, then another midrange ~2024, that could be smart...

    ...buying 1st gen RTX now isn't smart at all.
     
    Casowen and hmscott like this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Fun and smart - mixed with desire - don't always go together. If there isn't anything else out there, especially in laptops or with Black Friday sale configurations, there's little alternative available; that's unfortunately what we are faced with today (and tomorrow).

    That's why I recommend going for the non-Nvidia / Intel option if you have the choice, because there are still areas where we can't avoid them. But, don't sweat it if you can't find that choice when you need what you need. Fun's fun.

    @coreteks talks about "Dark Silicon" - the wasted silicon space of RTX features sitting between us and what we need: the gaming engine we have relied on for so long. Nvidia screwed us over - and screwed themselves over - in 2019; let's hope they wise up in 2020 before we lose them as a competitor.

    Hopefully Nvidia will give up the RTX farce in 2020 with the new 3000 series, stupid is as stupid does though, Nvidia will probably double down on RTX.

    Nvidia wasting another $B on RTX instead, priceless.
     
    Last edited: Nov 29, 2019
  7. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Your crusade against ray tracing is getting a bit weird.

    You are going to be massively disappointed if you think it is going anywhere, as AMD's high-end RDNA2 GPUs are 100% shipping with dedicated ray-tracing hardware in 2020, and the 2020 PlayStation 5 and Xbox releases are as well. You can guarantee that every 1st-party Sony and Microsoft game will showcase the technology, and 3rd-party devs will follow along so as not to be left behind.

    The amount of games releasing with ray-tracing support is about to explode in a nuclear fashion, whether you like it or not.

     
    Prototime and hfm like this.
  8. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    786
    Trophy Points:
    131
    Yeah, that's a stupid comment when it comes to 2019 - totally unnecessary atm. Eventually, if all the manufacturers put it in, it won't really be the marketing gimmick it is now - more a commodity in a video-game sense. I've had conversations with a few CEOs of mid-size manufacturing companies, and they're total salesman Machiavellians - seeing (high-margin) fantasy, believing it, and imposing it on others one way or another, by hook or by crook. Those types are the ones who get promoted by those like them above. But megacorps have whole wings of programming/brainwashing (aka marketing/promotions depts) to go out and convince the naive, ignorant, and plain stupid that this is "how it is" and how great it all is - mind control. If RT kills performance as far as fps and other settings go, it's a farce in 2019, and that seems to be the case. It's all just about selling the latest thing, not what is best for your case. For average computer use an 8-year-old CPU/GPU would be fine; gaming is a different story if you want the latest games, but seriously, there's an avalanche of games in the "old" or "low spec" category that would keep you busy for years.
     
    hmscott likes this.
  9. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    "Killing FPS" still equates to plenty playable on a lot of titles. Even my lowly 2070 in an eGPU can manage to be just fine on Metro: Exodus with RTX on. This is pretty good for 1st gen.

    Don't mistake this for me thinking every game has to have RT to be good, but it can definitely make an already great game better. You still have the option of turning it off, I just don't get the hate. It's here to stay and will just be a default feature in pretty much every GPU going forward. Well, all but the most budget ones anyway. And if you don't like it, then go into the game options and disable it.
     
    Prototime likes this.
  10. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    786
    Trophy Points:
    131
    Well yeah, I'm not disagreeing with you, but if someone purposely sold a perfectly fine GPU specifically to get an RT one, it's a waste; if many people are early adopters just because, and have the money, then fine. I'm thinking of getting rich selling bridges with RGB LEDs and large RGB fans lol
    Getting back to the point though: RT is totally unnecessary in 2019, and by the time the manufacturers all put it in all their products it's a moot point anyway. Apart from fps, I don't think anyone can claim RT isn't a performance hit for current cards and games. I turn off AA in most games I play because it's a performance hit I don't notice much anyway; I'd rather have better fps, and I assure you it does improve fps. Higher resolutions don't even need AA, but I'm on 1080p or a little lower and fine - doesn't make the games I enjoy less fun.
     
    Prototime, hmscott and Dennismungai like this.
  11. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The "hate" is a reality check. Let's see...how can I "turn it off" and get my money back for the 50% of the GPU I can't use??

    RTX = 50% of silicon real estate wasted 99% of the time when not using those features. Real estate that should have been used to give us what everyone wanted - real 4K @ 144Hz, plus everything else we can think of that will fit in that space. 50%!! of the silicon is now dark - useless 99% of the time.

    RTX is nothing, costing us everything. The main unspoken problem with RTX is the missing performance that we don't even know is missing!! Performance that we should have had implemented with that 50% of the silicon working for us 99% of the time instead of laying dark and unused.

    Someday, maybe, this will all clear up - alternatives will be there, and Nvidia's RTX will seem as silly and "useless" to everyone - like it already is so clearly useless to me and many many others that can easily see through Nvidia's RTX BS for what it is.

    Let's see how the Consoles handle ray-tracing at release, what games actually release on Console with any % of ray-tracing features implemented.

    Marketing will push any little tidbit and amplify it like it's a magical miracle, only for you to find when you get it in your hands that you can't even tell whether it's enabled or not most of the time - except you see 50% lower FPS.
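    The "50% lower FPS" complaint is just reciprocal arithmetic: FPS is 1000 divided by the frame time in milliseconds, so a fixed per-frame ray-tracing cost that doubles frame time halves the frame rate. A tiny Python sketch with made-up numbers (these frame times are hypothetical, not benchmarks):

```python
# Illustration only - the frame times below are hypothetical, not measured.
def fps(frame_time_ms: float) -> float:
    """Frames per second is the reciprocal of the per-frame time."""
    return 1000.0 / frame_time_ms

base = fps(16.7)        # ~60 FPS without ray tracing
halved = fps(16.7 * 2)  # doubling frame time -> ~30 FPS, a 50% drop
```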

    If it goes like it did on PC, ray-tracing games won't get released on Console in 2021 as planned - or they'll ship with minimal, cut-down ray-tracing features. Also like on PC, game developers will straggle off to spend time on what is really important - improving gameplay. Then we'll get our year's worth of 6 new real-time ray-tracing games on Xbox and 6 on PS4 - and you know they'll double-count a single RT game as 2 when the same game comes out on both Consoles (+ PC makes 3) instead of the 1 it is.

    We'll get smoke blown at us from all directions about real-time ray tracing a bunch of times next year leading up to launch - a big huge list of supported games - and it will look like something, just like Nvidia's BS for RTX on PC - only to have 24 games turn into 3, plus 3 more, and nothing else for over a year.

    We would need every game that comes out on Console and PC to have full ray tracing - real-time ray tracing on Console and on low- to mid-level PCs without FPS loss - before you can call it any more than a wisp in the wind, which is where Nvidia is now, and where Consoles will be in a year.

    Nothing is a done deal until it's in every game, runnable on every gaming PC and Console.

    IMHO, until then RT RT is a buncha hooey, and Nvidia is guilty of hucking RTX out way before it was ready - then overcharging for RTX, and continuing to shamelessly promote RTX even though it's a useless BS waste of 50% of our GPU silicon.

    ClownWorks On FPS vs ClownWorks Off FPS, RTX isn't even needed to kill performance...
    DudeRandom84
    "Frame time graph now included as requested! Cards Used - Nvidia Asus GTX 1080 TI Founders Edition CPU - Coffee Lake 8700K CPU Cooler - Corsair H115i Motherboard - ASUS ROG Intel Z370 MAXIMUS X HERO RAM - 16GB 3200MHz Final Fantasy XV Nvidia GameWorks On Vs GameWorks Off GTX 1080 TI 8700k Frame Rate Comparison Benchmark The texture pack is not installed as it will take me a day to download the pack which means no videos on this game for a while.
    https://www.youtube.com/channel/UCCVh...

    Od1sseas 1 year ago
    "-20 FPS GG"

    Sergio Chipo 1 year ago (edited)
    "LOL!! GameWorks Strikes Again.."

    Darkswordz 1 year ago
    "ClownWorks."

    Kenneth Larsen 1 year ago
    "Gimpworks is doing amazing as always"

    1 month ago
    "Why does Nvidia do this? They developed something not just hurting the AMD Radeon competition but their own cards with small visual gains but good stresser. 6gb more memory loaded 3gb more vram and dram. Introduces lag spikes etc."

    Филип Пешић 1 year ago
    "Nice RAM/vRAM consumption on GW On.. Also, frame times are really admirable. :D"

    Moloch 6 months ago
    "Looks clearer with it off. lol."

    Nick Rodrigez 1 year ago
    "Gamenotworks attacks again"

    FaultBat 1 year ago
    " I feel like I must be blind, because while I see the framerate hit with GameWorks, I can't tell what's actually different with the quality."

    And that's the prototypical Nvidia "It just Works - Gameworks" POA laid out and ready for RTX / DLSS to fill in. Same MO, same BS format, you lose 50% performance for no value in return...

    Some do find joy with RTX by bringing back old games with new "shiny" features, as PC game developers trying to earn a living are apparently not interested in spending their resources to put RTX in their new AAA games:

    Quake 2 RTX Upgraded: v1.2 Adds New Ray Tracing Features, Dynamic Res + More!
    Nov 29, 2019
    Digital Foundry
    What? Quake 2 RTX looks even better than it did at launch? Yup, Nvidia's Lightspeed Studios has returned to their remastered version of the 1997 id Software classic and improved visuals even more. Alex Battaglia - inevitably - has the full story.


    RTX = Dancing around in front of mirrors all day??
     
    Last edited: Dec 2, 2019
  12. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Someday, maybe, this will all clear up - alternatives will be there, and Google's Stadia will seem as silly and "useless" to everyone - like it already is so clearly useless to me and many many others that can easily see through Google's Stadia BS for what it is.
     
  13. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Not having a Ray Tracing GPU in 2019 is crazy?
    Considering it's a pretty useless feature that doesn't really improve graphics to a noticeable degree in the first place, I'd rather opt out of RTX as a whole and get AMD, because it supports open-source features that do pretty much the same (are as capable and even MORE flexible than proprietary features).
     
    hmscott likes this.
  14. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    That's an option. 1080Ti, 5700XT, 1660 Ti/Super.. those are all options if you want to opt out of a card with RT features. The choice is there.
     
  15. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Indeed, but why support a company that predominantly pushes closed-source features and already has a majority of the market share in its grasp, when the competitors' GPUs are just as good?
     
    hmscott likes this.
  16. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    If, for your use case, AMD has GPUs that are just as good, then that's what you should get. I also suspect you are using Linux, not Windows, as that would maintain your open-source-or-don't-use-that-product mentality.

    Also, you can use RT on Vulkan with an nVidia GPU.
     
    hmscott and saturnotaku like this.
  17. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    This is the question I'm asking:

    What are you going to do when AMD's new GPUs launch in Q1 of 2020 with ray-tracing hardware? The alternative you think you are going to have doesn't exist.

    AMD has already said it's happening so I just don't get why everyone is glossing over that like Nvidia is going to be the sole proprietor of what you don't want.
     
    sniffin, Prototime and hfm like this.
  18. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    Agreed. As well, the new consoles are including RT acceleration of some sort - the writing is on the wall. I for one am 100% in. I've watched 3D acceleration technology drag itself into the future time and again since the day the first 3dfx Voodoo/Rendition Verite cards hit the market back in 1996. It's been a ride of small iterations, some key milestones, and some misses here and there. Hardware transform and lighting on the GeForce 256 was a high point; I think that was also the first time playing games in 32-bit color was feasible. The milestone of real-time ray-tracing acceleration is the first one I've been excited about for some time. I'm sure we'll see advancements and improvements along the way over the next couple of generations as the dust settles on this technology from a hardware and software standpoint.

    Just like everything else in many different industry segments, there are early adopters who will spend the money to see and play with new features or better performance, and those who don't care that much and will buy more affordable options (the aforementioned 1660 Super/Ti, 1070 Ti/1080 Ti, 5700/5700 XT).
     
    Last edited: Dec 1, 2019
    hmscott, Felix_Argyle and Prototime like this.
  19. Casowen

    Casowen Notebook Evangelist

    Reputations:
    64
    Messages:
    401
    Likes Received:
    108
    Trophy Points:
    56
    I predicted last year that we are a good decade away from ray tracing even being close to viable, even at the current rate of performance increase per generation. And that's all assuming good engine/script optimization, which is not at all a realistic expectation given what developers have to go through. The most convincing game with ray tracing is Minecraft, and even with an RTX Titan it still sinks your fps to sub-10 - and that's a bare, simple game. Any other game, like Warframe, with legitimately convincing, high-volume ray casting would be sunk to fps so low it approaches regular CGI render times. I'm not saying Nvidia couldn't eventually do some legitimately convincing real-time ray casting, but less talk and more show, because 2023 probably won't be that. I could be wrong, but it's all talk regardless, and they are known to lie.

    If Google Stadia gets going with all its server TCUs, it could be a viable way of making high-ray-cast gaming cheap, and way better than what even people with high-end rigs get.
     
    Last edited: Dec 1, 2019
    hmscott likes this.
  20. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    viable?

    It already is
     
    hfm and hmscott like this.
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Funny you mention the 256. I was in a heated ray-tracing discussion on a different forum and one of the guys from Digital Foundry weighed in and brought it up as well.

    Here is what he had to say on the whole RTX debate:

     
    hfm and Prototime like this.
  22. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Honestly, it comes down to idiotic tribalism. As soon as AMD releases RT accelerated hardware, RT will no longer be a “gimmick”. I can’t wait for RDNA2, just so people will stop whining and whining about new graphics technology.

    I mean think about it, people are actually complaining about advances in rendering technology. How stupid is that? It’s as stupid as fanboyism, because that is the source.
     
    hfm likes this.
  23. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Correction: AMD GPUs CAN do ray tracing at the moment (via compute), and they do it better than NV GPUs that don't have specialised hw for ray tracing.
    And again, I don't care about RTX to begin with. It's a non-factor for games and software right now as far as I'm concerned.

    My second point is: why waste money on NV, which supports closed-source features? It's clearly evident that RTX as it currently works wouldn't even execute on AMD GPUs even if they had the specialised hw. It's the same with Hairworks and other closed-source features.
    Hmscott was accurate when he mentioned that 99% of the time the ray-tracing hw on NV GPUs is just lying unused... and it's not particularly useful to enable on their mid-range GPUs.

    AMD may not have RTX at the moment, but when they release GPUs with the feature, they are probably going to implement an open-source version that will be just as capable and probably more flexible than NV's proprietary ray-tracing feature, and that will run on any GPU with those capabilities (from the patent I read on how AMD will go about this implementation, it won't exactly be specialised hw, but a combo of software and hw compute to get the same results with minimal to no performance impact).

    Ray tracing as it's currently used has minimal impact on gaming visuals, and is used in a handful of games. It's equivalent to going from High to Ultra without really noticing any visual improvement while suffering a large performance impact - then again, games are badly optimised on PCs as is, so that could stand to improve too.

    Will I spend money on a ray-tracing-capable GPU at some point in the future? Probably. Mainly because it will be a feature that comes by default in a GPU (as it does now on NV hardware), but I don't particularly care about RTX in its current form.
     
    hmscott likes this.
  24. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Incorrect, for many reasons, including GCN's and RDNA's abysmal cache latencies. Where on earth did you read this information?
     
    hfm likes this.
  25. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It's a figment of his imagination, like many of his assertions of AMD's theoretical prowess which fall flat in real-world examples. Crytek's Neon Noir tech demo is clear enough. In real-time ray tracing utilizing general-purpose compute rather than bespoke dedicated ray tracing hardware, the hierarchy is pretty clear: Turing > Pascal = RDNA > GCN.

    https://www.eurogamer.net/articles/digitalfoundry-2019-crytek-neon-noir-software-ray-tracing-tested

    How is HairWorks an example of that? Not only is it open source, it runs on DX11+ GPUs including AMD's just fine. It's not Nvidia's fault that AMD graphics architectures over the years have historically had weaker geometry engines and tessellation performance.
     
    Last edited by a moderator: Dec 2, 2019
    hfm likes this.
  26. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Not only that, but RTX as used in games goes through DirectX's DXR extensions. If it didn’t execute on AMD GPUs, it would be because AMD didn’t follow the spec.
     
    hfm likes this.
  27. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Games selectively crippled AMD GPUs by ramping tessellation up to ridiculously unnecessary levels like 64x (and it wasn't even running that well on Nvidia hardware to begin with at those settings, because past a certain threshold [aka 16x or even 8x] the visuals didn't change to a degree a player would notice - oh, and at said tessellation threshold, AMD GPUs gave pretty high performance as is).
    GameWorks is also predominantly optimised for NV hardware, not AMD, so AMD cannot optimise for those features easily or at all.

    Also, if I remember correctly, AMD came out with TressFX first... on which Nvidia initially showed garbage performance. However, all it took was a few driver optimisations to run properly on NV hardware (which NV easily implemented because TressFX is open source)... but it still ran a bit slower than on AMD GPUs, because NV gaming GPUs had lower compute performance (and TressFX was predominantly compute-reliant).

    The fact of the matter is that in compute, AMD repeatedly beat Nvidia, but this metric wasn't particularly noted, nor was it relevant to the gaming industry. Why?
    Well, let's see... Nvidia gave large amounts of cash to devs so they would optimise software for its GPUs and proprietary features in the first place (ranging from non-pro to professional software), which further skewed performance metrics.
    And even when AMD had consistently stronger GPUs at its disposal in the past, people still flocked to NV due to mindshare and NV having deeper pockets.

    I won't deny AMD has been using the same uArch for a long time, but when they manage to come up with open-source solutions to the proprietary garbage NV pulls - solutions that end up running the same or better across any hardware - I have to ask myself: why would I want an NV GPU for its closed-source feature when open source works better and is more easily optimised for by any company?

    NV GPUs have their place, but I just don't want to support a company like that at this time (I did before, but not right now).
    Why bother?
    AMD also has FreeSync, and monitors with that capability are far cheaper than G-Sync ones. And no, I don't deny that NV came out with G-Sync first... my point was that the same could be achieved via pre-existing mechanisms inside the display that didn't require a specialised chip which initially only worked with NV GPUs. Now Nvidia is even adapting its G-Sync modules so they will allow AMD GPUs to use them - how 'nice' of NV to try and stop an open-source feature. That still doesn't detract from the fact that G-Sync monitors are more expensive - and again, why pay a premium for something FreeSync does just about as well?

    Also, the Metro developers themselves said that ray tracing doesn't require specialised RT cores to run... this was later proved by Crytek... whereas NV would have you think that you need RT cores to enable ray tracing at all (case in point: 'Control' and other games which enable RTX to such a degree that the visual differences become indistinguishable from the non-RTX option while you're playing).

    Tell that to the Crytek and Metro devs, who demonstrated and said you don't need RT cores to do ray tracing (just some decently powerful compute performance). Oh, and in the Crytek demo, AMD non-RTX GPUs did beat NV non-RTX-capable GPUs, because AMD has more powerful compute hardware at its disposal.
     
    Last edited by a moderator: Dec 3, 2019
    hmscott likes this.
  28. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    786
    Trophy Points:
    131
    Hm, interesting. I had read somewhere that AMD is better at direct compute, but had no idea RT didn't require specialty hardware - not a surprise at all that NV seeks to hide that with its marketing prop wing. IDK, these manufacturers all seem the same to me, so it's just convenience with Nvidia; driver support is #1. Also AMD, at least the older mobile GPUs like the HD 7970M, run way, way hotter: the M4000 in my M4600 runs at 75C while doing CAD etc., and when I had an M6100 in the M6700 it ran at over 90C stock just sitting on the PassMark starting screen - I get nowhere near that with Nvidia GPUs. Current stuff might be better, who knows. Seems like the strategy now is just market confusion with too many products, plus actually downgrading performance and convincing people it's an upgrade, planned obsolescence, etc.
     
  29. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    The M6700 can handle the M5000M/980M, which is typically a 115W card. The M6100 is around 75W depending on the tuning, so 90C seems a bit crazy to me.

    Ran that card in my AW17 R1 for a short while and never broke 60C in SG mode.
     
  30. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    I don't think there is conclusive evidence that NV has been crippling performance on its older GPUs.
    What are you referring to when you mention 'downgrading performance'?

    AMD does offer a viable alternative, and their driver support (at least on desktop) has been impeccable (and thanks to coaxing from people who have AMD hardware, they brought the RIS (Radeon Image Sharpening) feature to virtually all their GPUs).

    Their drivers are arguably better than NV's... and have been for a while (again, for desktops). The UI and features alone trample over NV's implementation, along with a range of customisation options (with Wattman built in, no less) that has been there for years. I don't understand why people still carry old misinformation regarding AMD's drivers - as a previous Nvidia owner, I had my share of problematic drivers on desktop and mobile, so I do not think NV has necessarily been #1 in driver support.

    In the mobile sector, AMD has been abysmally slow in bringing updates to mobile GPUs, but it increased the frequency of its updates at the start of this year and has made it a lot easier to use recent drivers (which is only a plus).
     
  31. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The performance difference in Witcher 3 between HairWorks preset high (visually comparable to 64x tessellation factor in the AMD driver) and low (comparable to 8x) is around 10% on my Pascal GPU, which is hardly a crippling performance loss. Again, it’s not Nvidia’s fault that AMD’s tessellation performance is weak. Put down the tinfoil hat.
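    For context on why tessellation factors get contentious: under D3D11-style triangle-domain tessellation, a per-edge factor of N subdivides a patch into roughly N² triangles, so going from 8x to 64x multiplies the geometry workload by about 64 for a subtle visual change. A back-of-the-envelope sketch (the patch count is a made-up example):

```python
# Rough triangle-count growth under D3D11-style triangle-domain
# tessellation: an edge factor of N yields about N^2 triangles per
# patch, so geometry cost grows quadratically with the factor while
# the visible difference past ~8-16x is small.

def tessellated_triangles(patches, factor):
    """Approximate triangle count for `patches` patches at edge factor `factor`."""
    return patches * factor * factor

patches = 10_000  # hypothetical patch count for a hair/terrain effect
at_8x = tessellated_triangles(patches, 8)    # 640,000 triangles
at_64x = tessellated_triangles(patches, 64)  # 40,960,000 triangles
growth = at_64x / at_8x                      # 64x more geometry to process
```

    The quadratic blow-up is why a GPU with weaker geometry throughput falls off a cliff at 64x while barely noticing 8x.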

    As for the visual difference, I do notice the difference in the roundness of the hair strands between high and low. It’s visually similar to the difference between higher and lower poly count 3D geometry. But as with most discussions centering around the merits of higher visual settings in games, YMMV.

    As verified by independent testers like Battle(non)sense, FreeSync monitors do not work as well as G-Sync monitors with the physical FPGA inside. The refresh rate on FreeSync monitors is not as stable in adjusting itself to match the in-game frame rate, which results in more stutter. This is also apparent when comparing mobile G-Sync, which is based on the same Adaptive-Sync standard as FreeSync, with module-equipped G-Sync monitors side by side. The monitor with the G-Sync module is noticeably smoother.
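    The stutter argument comes down to frame pacing: with a fixed refresh, a frame that misses vsync is held for an extra whole interval, while a variable-refresh display can wait for the frame and show it for exactly as long as it took to render. A toy model of that effect (the frame times are invented for illustration):

```python
import math

# Toy frame-pacing model: with a fixed 60 Hz refresh, each rendered frame
# occupies a whole number of ~16.67 ms vsync intervals, so frame times
# hovering just above the interval turn into visible judder. An ideal
# variable-refresh display shows each frame for its actual render time.

REFRESH_MS = 1000 / 60  # one 60 Hz vsync interval

def fixed_refresh_display_times(frame_times_ms):
    """Each frame is held until the next vsync after its replacement is ready."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in frame_times_ms]

def judder(display_times_ms):
    """Spread between longest and shortest displayed-frame duration (ms)."""
    return max(display_times_ms) - min(display_times_ms)

# Hypothetical frame times around 17-19 ms (just missing 60 FPS):
frames = [17.0, 18.5, 16.2, 19.0, 17.5]
fixed = fixed_refresh_display_times(frames)  # mix of ~16.7 ms and ~33.3 ms holds
adaptive = frames                            # ideal VRR: displayed as rendered
```

    In this sketch the fixed-refresh path alternates between one- and two-interval holds (a ~16.7 ms swing), while the adaptive path's spread is only the natural frame-time variation - which is the smoothness difference reviewers measure between displays.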

    Nvidia did not claim this. In fact, they allow DXR on GPUs without dedicated RT hardware (GTX 1060 and faster) using a compute fallback, but as expected the quality is lower and performance is vastly worse.

    Control has by far the most robust real-time ray tracing implementation to date in a AAA title. This is why cranking all ray tracing settings to the max is so expensive, as it is very much a forward-looking game. You can watch the Digital Foundry video and judge for yourself whether the visual differences are indistinguishable or not:



    This is wrong. AMD’s GPUs lose to their non-RTX Nvidia performance equivalents at all levels. 5700XT/R7 loses to 1080 Ti, V64 loses to 1080, and 580 loses to 1060.

    EB4C73CD-AA15-4B67-9B43-670E97D38F41.jpeg
     
    saturnotaku likes this.
  32. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,299
    Likes Received:
    3,050
    Trophy Points:
    431
    I agree with everything that person said, to a tee. All of it. It's a very well reasoned argument, and they are correct. A lot of features we take for granted, from the birth of these cards in 1996 to now, were spearheaded on new metal where the game devs had to catch up, or we had to go through a couple of gens to get the full scope of what it meant.

    Like I've been saying all along, nVidia KNEW full well people would complain to no end about this, but they had to do it to keep pushing things forward. If AMD is already launching RT features next year, they were well on their way to adding it to their designs as well. It takes MANY YEARS to get this stuff from the design phase to actual shipping product; AMD was working on it a while back too if they are launching in 2020.
     
    Prototime likes this.
  33. hertzian56

    hertzian56 Notebook Deity

    Reputations:
    438
    Messages:
    1,003
    Likes Received:
    786
    Trophy Points:
    131
    IDK, man, maybe they're better now, but when I had an HD 6850 the drivers seemed whack to me and not as well kept up as Nvidia's - that was way back then, though. Thermally, the ones I mentioned stand; they just run way too hot. My Quadro K4000M is an example of a card crippled by NV, though I think they're OK in the newer Quadros - I'm thankfully out of that hole. The latest drivers for my M4000 are pretty old; Win10 handles all that, which is a good thing about Win10, because it was a pain getting drivers for such an old card on Win7. The drivers don't even recognize it as an M4000 - I think it says 'HD 7700M series' or something like that in the AMD utility that Win10 installed automatically. AMD's website is a mess; what a maze to find older drivers, or any drivers at all, when I tried it for my M4000.
     
  34. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    So basically you will accept it when AMD starts selling GPUs with dedicated raytracing hardware built into the die space, only because AMD isn't Nvidia.

    You also falsely believe Nvidia is using some proprietary implementation, when they are just using Microsoft's open DirectX Raytracing API (DXR), which is exactly what AMD is going to do as well.
     
    hfm likes this.