The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia Gameworks strikes again ? (F1 2017 Benchs)

    Discussion in 'Gaming (Software and Graphics Cards)' started by Assembler, Aug 29, 2017.

  1. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41


    As mentioned in the video, F1 2016 and 2017 use the exact same game engine, but the 2017 version is now part of the Nvidia GameWorks program. Just look at those charts, especially the R9 390...

    [images: F1 2016 vs. F1 2017 benchmark charts]
     
    hmscott and Vistar Shook like this.
  2. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Nvidia GameBreaks :D
     
  3. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
  4. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    What am I supposed to be outraged about here? The 390 is the only card I see underperforming.
     
    Vistar Shook likes this.
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    For all us dummies, what the hell am I supposed to be comparing?
     
  6. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Mostly the way GW is designed. It forces unnecessary amounts of AA and Tess for no apparent reason. The default Tess AA level for HW (HairWorks) is 16x, and the difference between that and 8x is non-existent.
    [image: HairWorks 8x vs. 16x comparison]
     
  7. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    You can't see the difference between each level? For 8x vs. 16x, look at the difference in the curve of the top hair strand.
     
    killkenny1 likes this.
  8. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Are you going to notice that when playing the game? So instead of following the story or fighting boss monsters, you sit around looking at Geralt's small curved hair strand and get outraged that it's not smoother to the point where you'd go from 60fps to 40fps just to make it smoother?
     
  9. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    You know I'm right. Stop trying to deflect. Maybe some people have different priorities (eye candy, frame rate, somewhere in-between) so who are you to tell them otherwise?

    Anyway, does F1 2017 even use tessellation? It's not in the graphic settings. I smell an Nvidia bashing thread.
     
  10. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    I can, even during gameplay. Not saying I like how nVidia implemented Hairworks, but the difference is noticeable. At least for my eye.
     
  11. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Tess is baked into quite a few GW features. And no, the difference is not enough to justify the performance hit, especially when there are other rendering methods that don't impact performance nearly as much. A person with a 1080 may not care, but that's 1% of users. About 10% of users have 760/960/280/380-class cards and around 40% have even lower-end hardware. For those people, having 16x AA as the default and having to mod the game files to go lower is a pain, since they don't even get an option to adjust it based on their own hardware. Sure, Witcher 3 got an AA slider after the mass public outrage, but that was implemented by CDPR, not Nvidia, and other GW games still have few or no settings to control AA and Tess.
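    A minimal sketch of the "mod the game files" step described above, for reference. It assumes The Witcher 3 keeps its HairWorks AA level in bin\config\base\rendering.ini under a key named HairWorksAALevel; the install path, key name, and values below are assumptions rather than anything confirmed in this thread, so verify them against your own install and keep a backup.

    # Hedged sketch: lower the HairWorks AA level in The Witcher 3 by rewriting
    # one key in rendering.ini. The path and key name ("HairWorksAALevel") are
    # assumptions -- check your own install before running this.
    from pathlib import Path

    CONFIG = Path(r"C:\Games\The Witcher 3\bin\config\base\rendering.ini")  # assumed install path
    KEY = "HairWorksAALevel"   # assumed key name
    NEW_VALUE = "8"            # lower than the 16x default reported in this thread

    def set_ini_value(path: Path, key: str, value: str) -> None:
        text = path.read_text(encoding="utf-8")
        path.with_name(path.name + ".bak").write_text(text, encoding="utf-8")  # back up the original

        out, found = [], False
        for line in text.splitlines():
            if line.strip().lower().startswith(key.lower() + "="):
                out.append(f"{key}={value}")   # rewrite the existing entry
                found = True
            else:
                out.append(line)
        if not found:
            out.append(f"{key}={value}")       # append the key if it was missing

        path.write_text("\n".join(out) + "\n", encoding="utf-8")

    if __name__ == "__main__":
        set_ini_value(CONFIG, KEY, NEW_VALUE)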
     
  12. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    People with those slower GPUs shouldn't be using GW features anyway. It's like the dolts who jack Ghost Recon: Wildlands to the max and complain about the 10 FPS on their 750 Ti and raise an nGimpia sh*tstorm online. Are we blaming Nvidia for user stupidity now? Are we blaming developers for including halo graphic settings in their games?
     
  13. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Why not, though? TressFX and its derivatives run perfectly fine on even an R7 360/GTX 750 (I've tested; it's definitely usable in TR2013 and RotTR). Fact is, Nvidia have a LOT of resources at their disposal and this just feels like a beta version. They can and should do better. Optimization has always been their hallmark, and I've got no idea why they've gone the route of pushing unoptimized, excessively taxing effects in games when you CAN make them run better, as evidenced by the Witcher 3 drama that went on two years ago.
     
  14. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41
    Not only the 390: in F1 2016 all AMD cards were faster than (or equal to) their Nvidia counterparts (R9 390 and RX 480/580 faster than the GTX 1060, Vega 56 faster than the 1070, Vega 64 equal to the 1080). In F1 2017 the opposite is the case: suddenly all AMD cards perform far worse, while the performance of Nvidia cards is essentially the same (±5%).

    According to some reviews there is a CPU overhead for AMD cards that did not exist in F1 2016. We've seen such things before, and people usually blame the AMD driver for not handling DX11 very well. But now we have two games based on the same engine version, with the only difference being that one uses GameWorks. Doesn't it seem odd that CPU usage is suddenly much higher on AMD cards? I don't think the driver is doing anything different than in F1 2016 that could be the cause.
     
    don_svetlio likes this.
  15. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    And you'd be correct. When a game uses the same engine as an older game, the driver optimizations needed are already mostly in place. And if the only variable in this case is GW, then the most logical conclusion is that the added features are influencing the results.
     
    TBoneSan and Assembler like this.
  16. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41
    Exactly, and we are talking about a CPU overhead on a Core i7-6850K clocked at 4.3 GHz in the case of the benchmarks that CB did.
     
  17. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Then why not talk about Deus Ex: Mankind Divided? MSAA destroys performance, the game guzzles CPU power, and it stutters in the Prague hub due to unoptimized texture streaming (even on consoles). That's a Gaming Evolved title. Because it doesn't fit the narrative and you're just here to bash Nvidia, right? Why not look at the good things that have come from GameWorks, like day-zero game-ready drivers and SLI support?
     
  18. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41
    But at least you can change that. GW effects can rarely be turned off in the video settings, and no one knows what else they do in the background even after being deactivated. There is a good reason why not even game developers get insight into the GW libs; it's like a black box where they can mess around with the system as they please.
     
    don_svetlio and TBoneSan like this.
  19. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    You talking about Deus Ex MD? No you can't change that. MSAA still rekts performance if enabled, it still eats CPU, and Prague still stutters due to texture loading even with the game installed on an SSD and ample RAM. But yeah it's a shining example of optimization because of the Gaming Evolved sticker on the box. :rolleyes:
     
  20. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41
    Never played Deus Ex MD so I can't say for sure, but it seems MSAA can be turned off:

    http://www.pcgamer.com/how-to-get-the-best-performance-out-of-deus-ex-mankind-divided/
     
  21. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
  22. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41
    Those issues do occur on all cards, right? I only remember that DX12 was kind of broken in DS, but that's nothing unusual, as devs are still not familiar with that API. Don't get me wrong, I don't have any problem with a title being part of GameWorks or Gaming Evolved and game developers using some effects that are optimized for AMD or Nvidia cards (as long as they can be deactivated).

    However, I do have a problem with the GameWorks libraries developers have to add to their game even if they don't use any of the effects they provide. F1 2017 is yet another example in the long history of GW where performance drops for no apparent reason (mysteriously, only on GPUs of the competition in this case).
     
    don_svetlio likes this.
  23. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Yeah, the CPU usage and texture-streaming-related stutter are still issues. DX12 is mostly OK now, but can still be hit-or-miss on Nvidia. DX:MD used GPUOpen (AMD's equivalent of GameWorks) for its effects, many of which were broken for months after release. Examples include CHS reducing shadow render distance and not applying to all shadows, higher shadow settings having artifacts, AO flickering/shimmering when enabled with TAA, AO causing black crush that takes over the entire screen, the higher AO setting affecting radius/coverage (AO not applied to things closer to the camera), tessellation causing ghosting on faces with TAA enabled, etc. Some of these have been fixed; others have not, more than a year after release. Not a good look for GPUOpen.

    Just give AMD some more time to optimize its drivers for F1 2017, as the game just came out. We saw the same on Nvidia's side with Gaming Evolved titles like Sleeping Dogs and TR13; those games ran better on AMD cards for a few months (TressFX notably) before Nvidia optimized its drivers.
     
    Assembler likes this.
  24. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Driver work can only go so far, though. The Tahiti-based GPUs don't handle AA nearly as well as Polaris/Vega, hence why performance tanks. It's the same reason AMD's driver force-reduces the Tess level in Witcher 3: 16x tessellation for no good reason on random objects (Crysis 2) is an underhanded GameWorks tactic.
     
    Assembler and TBoneSan like this.
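    To make "force-reduces the Tess" concrete: AMD's driver offers a tessellation override, a user-chosen cap that clamps whatever tessellation factor the application requests. The snippet below only illustrates that clamping idea; it is not actual driver code, and the example numbers are arbitrary.

    # Conceptual illustration only -- not driver code. A user-set cap limits
    # whatever tessellation factor the application asks for.
    from typing import Optional

    def effective_tess_factor(app_requested: int, user_cap: Optional[int]) -> int:
        if user_cap is None:                   # "use application settings"
            return app_requested
        return min(app_requested, user_cap)    # "override application settings"

    print(effective_tess_factor(64, 16))    # game asks for 64x, cap of 16x -> 16x is used
    print(effective_tess_factor(64, None))  # no override -> 64x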
  25. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    While it may not be tessellation specifically, AMD-developed effects have done the same thing in the past, and I gave a very specific example of that with GPUOpen in DX:MD. Also, what about Gaming Evolved titles like Hitman 2016 and AotS that have always run significantly better on AMD cards, is that underhanded too? And before you say async compute, isn't tailoring GameWorks to the strengths of its hardware what Nvidia has done as well? Whodathunkit? Not to mention Hitman: Absolution, which didn't use DX12 or async, also ran better on AMD cards, but I don't remember a huge Internet sh*tstorm about that back in the day.
     
  26. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    There's a difference between leveraging a feature of a GPU and suffocating all GPUs. With async, you get an increase in performance on AMD's hardware due to the simple fact that it was overbuilt to begin with and has a larger compute-focused feature set, so the GPUs perform better. Kepler and Maxwell simply have no support for it, but that doesn't reduce their performance whatsoever. The exact opposite is true of GameWorks, where performance is reduced for no noticeable gains in the vast majority of cases. As for DE:MD, I've not played the game myself and thus cannot comment.

    Also, Hitman: Absolution hasn't been tested on modern cards, and it depends on what hardware it was being run on. If you're comparing a 7870 to a 660 or something of the sort, then it's natural for the 7870 to overtake it, as it's a more powerful card. Sorry, but GW is not something I personally feel like praising, because it's simply a lazy attempt at marketing.
     
  27. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Because you're focusing on only one side. GW and GE both have their good and bad, and you're only focusing on GW's bad. Whether that's due to bias or lack of experience, I don't know. It doesn't help either that Nvidia has far more market share, clout/influence, and money to throw around the game industry, so they have partnerships with a greater number of bigger AAA studios/publishers, and GW appears in far more games than GE does. But yeah, let's just conveniently forget the good that comes out of GW (and GE as well): Nvidia/AMD sending engineering resources and money to help devs optimize their games (sometimes for only one GPU brand), resulting in day-zero game/driver optimizations, bug fixes, and mGPU support. The practice is nothing new at all; developer relations programs existed on both sides for years before GW ever became a hot topic. Nvidia got its foot in the door way earlier (almost a decade) with TWIMTBP, which was later renamed GW, and AMD didn't come out with GE until TWIMTBP was well established in the industry.
     
    Last edited: Aug 30, 2017
  28. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    You may want to go read up on your history. Nvidia have been doing this for years, in some cases pretty blatantly. This particular example is from the 500 vs. 6000 days, when the 500 series cards were notably better at tessellation. Now, normally that wouldn't even be reason to bring anything up, but when you look at the absurd use of it on totally random objects, which most people don't even look at, you get a feeling that there's something else afoot.



    As for TWIMTBP vs (I believe back then it was "The Future is Fusion" as a slogan, though that may be from their CPUs, and there was also "AMD Vision") - in any case, both companies have sponsored games and will continue to do so. And that's a good thing. The bad thing is purposefully adding excessive effects so as to reduce both your own performance and your competitor's performance simply because you get impacted slightly less. I mean, these dumb decisions are the very reason why Nvidia are constantly (and somewhat rightfully) accused of planned obsolescence.
     
  29. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    I get it, Crysis 2 water tessellation, don't need to repeat yourself about something I knew since 2011.

    But let's look at AMD. Like how Global Illumination in DiRT Showdown crippled performance on Nvidia cards because it used a DirectCompute-based scheme developed by AMD (gee, that sounds an awful lot like what happened with TressFX in TR13, doesn't it? :rolleyes:).

    Or how AMD's developer relations program barred Koroush Ghazi of TweakGuides.com from publishing his Far Cry 3 tweak guide after he had already finished it. A guide which would've benefitted all gamers. Only because FC3 was a GE title and he was contracted by Nvidia at the time.

    So let's not pretend both companies don't intentionally sabotage their competition by implementing specific features that don't run well on their competitor's hardware, OK? With Nvidia it's tessellation, with AMD it's compute-based lighting and hair. At the end of the day, what difference does it make?

    Hey maybe it's just me, but I'm sick of the double standard. When Nvidia's developer relations program is mentioned, the reaction is "Nvidia bribes devs!" and "Nvidia cripples the game on other platforms to make themselves look better!" When AMD's developer relations program is mentioned, the reaction is "Gold star for AMD!" and "This is good for the industry!" and "This benefits all gamers!"
     
  30. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    All things that have been fixed by a simple driver update. Let me know when that happens with GameBreaks :)
     
  31. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    And you would know because when was the last time you actually used an AMD GPU...?
     
  32. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Last week? I've got experience with everything from the 5000 series onward.
     
  33. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Then why are you acting like AMD's driver team can't optimize for GameWorks titles? Because they can, aside from hardware limitations like tessellation and multi-res shading, etc. GameWorks was open-sourced in May 2016. If you look at performance of AMD cards on current drivers in controversial GW titles like Witcher 3 and Project CARS, they run really well now compared to Nvidia.
     
  34. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Because, as we've seen from F1, a problem still exists, and as we've seen over the past two years, it's not driver-related; it's usually absurd amounts of Tess or AA being forced on GW features.
     
  35. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Nobody cares about your speculation.
     
  36. Assembler

    Assembler Notebook Consultant

    Reputations:
    49
    Messages:
    110
    Likes Received:
    81
    Trophy Points:
    41
    Was it? I thought they only opened up a few effects and most of the libraries are still closed.
     
    TBoneSan and don_svetlio like this.
  37. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Source code for these GameWorks features is on GitHub:
    The ones without 'public' next to them require you to join the free Nvidia Developer Program to access them on GitHub.