As mentioned in the video, F1 2016 and 2017 use the exact same game engine, but the 2017 version is now part of the Nvidia GameWorks program. Just look at those charts, especially the R9 390...
[benchmark charts: F1 2016 vs F1 2017 performance]
-
don_svetlio In the Pipe, Five by Five.
Nvidia GameBreaks
-
What am I supposed to be outraged about here? The 390 is the only card I see underperforming.
-
For all us dummies, what the hell am I supposed to be comparing?
-
don_svetlio In the Pipe, Five by Five.
Mostly the way GW is designed. It forces unnecessary amounts of AA and tessellation for no apparent reason. The default tessellation factor for HairWorks is 16x, and the visual difference between that and 8x is non-existent.
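To put rough numbers on that (a toy cost model, not taken from the game; the base triangle count is made up for illustration): with uniform tessellation, output triangle count grows roughly with the square of the tessellation factor, so 16x pushes about four times the geometry of 8x for that invisible difference.

```python
def tessellated_triangles(base_triangles: int, factor: int) -> int:
    """Idealized cost model: uniform tessellation subdivides each
    input triangle into roughly factor**2 output triangles."""
    return base_triangles * factor * factor

# Hypothetical base mesh for a hair patch -- the number is made up.
hair_patch = 1_000
at_8x = tessellated_triangles(hair_patch, 8)    # 64,000 triangles
at_16x = tessellated_triangles(hair_patch, 16)  # 256,000 triangles
print(at_16x // at_8x)  # 16x generates 4x the geometry of 8x
```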
-
-
don_svetlio In the Pipe, Five by Five.
Are you going to notice that when playing the game? So instead of following the story or fighting boss monsters, you sit around looking at Geralt's small curved hair strand and get outraged that it's not smoother to the point where you'd go from 60fps to 40fps just to make it smoother?
-
Anyway, does F1 2017 even use tessellation? It's not in the graphics settings. I smell an Nvidia bashing thread. -
According to some reviews there is a CPU overhead for AMD cards that did not exist in F1 2016. We've seen such things before, and people usually blame the AMD driver for not handling DX11 very well. But now we have two games based on the same engine version, with the only difference being that one uses GameWorks. Doesn't it seem odd that CPU usage is suddenly much higher on AMD cards? I do not think the driver does anything differently than in F1 2016 that could be the cause of that. -
Those issues do occur on all cards, right? I only remember that DX12 was kind of broken in DS, but that is nothing unusual as devs are still not familiar with that API. Don't get me wrong, I do not have any problem if a title is part of GameWorks or Gaming Evolved and game developers use some effects that are optimized for AMD or Nvidia cards (as long as they can be deactivated).
However, I do have a problem with the GameWorks libraries developers have to add to their game even if they do not use any of the effects those libraries are supposed to provide. F1 2017 is yet another example in the long history of GW where performance drops for no apparent reason (mysteriously, only on GPUs of the competition in this case). -
Just give AMD some more time to optimize its drivers for F1 2017, as the game just came out. We saw the same on Nvidia's side with Gaming Evolved titles like Sleeping Dogs and TR13: those games ran better on AMD cards for a few months (TressFX, notably) before Nvidia optimized its drivers. -
don_svetlio In the Pipe, Five by Five.
Driver work can only go so far, though. The Tahiti-based GPUs don't handle AA nearly as well as Polaris/Vega, hence why performance tanks. It's the same reason AMD's driver force-reduces tessellation in The Witcher 3: 16x tessellation for no good reason on random objects (as in Crysis 2) is an underhanded GameWorks tactic.
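That driver-side override amounts to clamping whatever factor the application requests to a user-chosen cap. A minimal sketch of the idea (not AMD's actual driver code, just the logic):

```python
def clamp_tess_factor(requested: float, user_cap: float) -> float:
    """Return the tessellation factor actually applied when a
    driver-side cap overrides the application's request."""
    return min(requested, user_cap)

# Game asks for 16x, user caps the driver at 8x:
print(clamp_tess_factor(16.0, 8.0))  # 8.0
# A request below the cap passes through untouched:
print(clamp_tess_factor(4.0, 8.0))   # 4.0
```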
-
-
don_svetlio In the Pipe, Five by Five.
There's a difference between leveraging a feature of a GPU and suffocating all GPUs. With async compute, you get a performance increase on AMD's hardware due to the simple fact that it was overbuilt to begin with and has a larger compute-focused feature set, so the GPUs perform better. Kepler and Maxwell simply have no support for it, but that doesn't reduce their performance whatsoever. The exact opposite is true of GameWorks, where performance is reduced for no noticeable gains in the vast majority of cases. As for DE:MD, I've not played the game myself and thus cannot comment.
Also, Hitman Absolution hasn't been tested on modern cards, and it depends on what hardware it was being run on. If you're comparing a 7870 to a 660 or something of the sort, then it's natural for the 7870 to overtake it, as it's a more powerful card. Sorry, but GW is not something I personally feel like praising because it's simply a lazy attempt at marketing. -
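The async compute argument above can be sketched as a toy timing model (the millisecond figures are invented for illustration, and perfect overlap is an optimistic assumption): hardware with idle compute units can run compute work alongside graphics instead of after it, so the feature can only help, never hurt.

```python
def frame_time(graphics_ms: float, compute_ms: float,
               async_compute: bool) -> float:
    """Toy model of one frame's GPU time.

    Without async compute, the graphics and compute queues run
    back to back; with it, compute fills otherwise-idle units and
    overlaps graphics (perfect overlap assumed)."""
    if async_compute:
        return max(graphics_ms, compute_ms)
    return graphics_ms + compute_ms

# Hypothetical frame: 10 ms of graphics, 4 ms of compute effects.
print(frame_time(10.0, 4.0, async_compute=False))  # 14.0 ms, serial
print(frame_time(10.0, 4.0, async_compute=True))   # 10.0 ms, overlapped
```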
Last edited: Aug 30, 2017
-
don_svetlio In the Pipe, Five by Five.
You may want to read up on your history. Nvidia have been doing this for years, in some cases pretty blatantly. This particular example is from the GTX 500 vs HD 6000 days, when the 500 series cards were notably better at tessellation. Normally that wouldn't even be reason to bring anything up, but when you look at the absurd use of it on totally random objects, which most people don't even look at, you get the feeling that there's something else afoot.
As for TWIMTBP vs (I believe back then it was "The Future is Fusion" as a slogan, though that may be from their CPUs, and there was also "AMD Vision") - in any case, both companies have sponsored games and will continue to. And that's a good thing. The bad thing is purposefully adding excessive effects so as to reduce both your own performance and your competitor's, simply because you get impacted slightly less. I mean, these dumb decisions are the very reason Nvidia are constantly (and somewhat rightfully) accused of planned obsolescence. -
But let's look at AMD. Like how Global Illumination in DiRT Showdown crippled performance on Nvidia cards due to it using a DirectCompute-based scheme developed by AMD (gee, that sounds an awful lot like what happened with TressFX in TR13, doesn't it?).
Or how AMD's developer relations program barred Koroush Ghazi of TweakGuides.com from publishing his Far Cry 3 tweak guide after he had already finished it. A guide which would've benefited all gamers. Only because FC3 was a GE title and he was contracted by Nvidia at the time.
So let's not pretend both companies don't intentionally sabotage their competition by implementing specific features that don't run well on their competitor's hardware, OK? With Nvidia it's tessellation; with AMD it's compute-based lighting and hair. At the end of the day, what difference does it make?
Hey, maybe it's just me, but I'm sick of the double standard. When Nvidia's developer relations program is mentioned, the reaction is "Nvidia bribes devs!" and "Nvidia cripples the game on other platforms to make themselves look better!" When AMD's developer relations program is mentioned, the reaction is "Gold star for AMD!" and "This is good for the industry!" and "This benefits all gamers!" -
don_svetlio In the Pipe, Five by Five.
All things that have been fixed by a simple driver update. Let me know when that happens with GameBreaks
-
-
don_svetlio In the Pipe, Five by Five.
Last week? I've got experience with everything from the 5000 series onward.
-
-
don_svetlio In the Pipe, Five by Five.
Because, as we've seen from F1, a problem still exists, and as we've seen over the past two years, it's not driver-related; it's usually absurd amounts of tessellation or AA being forced on by GW features.
-
-
-
-
Nvidia Gameworks strikes again ? (F1 2017 Benchs)
Discussion in 'Gaming (Software and Graphics Cards)' started by Assembler, Aug 29, 2017.