The embargo on Ashes of the Singularity reviews lifted today, giving us the first benchmarks from a game that will support DX12 at release. And I must say, the results were rather surprising, enough that I checked multiple sites to make sure the first one I saw wasn't an aberration. But the results consistently show major gains over DX11 on AMD's cards, and even-to-slightly-worse performance on NVIDIA's (with one exception).
While it may be that other games give results more similar to the DX11 status quo, and NVIDIA may be able to optimize its drivers to see gains similar to what AMD is seeing (or the one-off GTX 770 result above), it's certainly a much more positive result for AMD than I was expecting. Granted, with the Mantle work it wouldn't have surprised me to see AMD get better news than NVIDIA from DX12, but I wasn't expecting such drastic changes.
It's also probably too early to base GPU choices on DX12, but if the next couple of DX12 benchmarks show similar differences, that could be a significant boost to AMD's fortunes.
-
-
I can't even get dxdiag to tell me I have DX12 capability under Win10 on my 870M... It's possible the support for it is poor right now.
-
DX12 places more of the burden on the game developer to produce better game code to fully utilize the potential of DX12. Given how a majority of PC games are console ports with extremely poor implementations, I don't have high hopes that we will see large performance gains any time soon. If anything, I expect more buggy games with worse performance, since your hardware and drivers matter less.
-
-
King of Interns Simply a laptop enthusiast
I guess AMD, having already developed Mantle, has the edge with driver support.
Nvidia will close the gap in the coming months, I guess. -
moviemarketing Milk Drinker
So...is R9 M390x going to beat 980M/990M in DX12 games?
Notebookcheck rates it as worse than 970M for DX11 titles. -
I think it is what it is. AMD got a little jump on this title from its previous Mantle support and probably caught Nvidia snoozing, since it's not yet a proper release. -
-
Just give them some time to polish the DX12 drivers.
-
But when I look under the Display tab itself and the driver feature support levels:
-
Mine says the same thing in the Display tab (of dxdiag), but it does list DX12 in the Render tab. I am using an older driver version, though...
-
Meaker@Sager Company Representative
If you have WDDM 2.0 then you have DX12.
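For anyone who'd rather check from code than squint at dxdiag tabs, here's a minimal sketch (my own example, assuming the Windows 10 SDK and a DX12-capable driver are installed) that simply asks the runtime whether a DX12 device could be created:

```cpp
// Minimal DX12 support probe. Passing nullptr as the device pointer asks
// "could a device be created at this feature level?" without creating one.
// Build with the Windows 10 SDK; links against d3d12.lib.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    HRESULT hr = D3D12CreateDevice(nullptr,                 // default adapter
                                   D3D_FEATURE_LEVEL_11_0,  // minimum level DX12 accepts
                                   __uuidof(ID3D12Device),
                                   nullptr);                // don't actually create the device
    std::printf(SUCCEEDED(hr) ? "DX12 device available\n"
                              : "No DX12 device available\n");
    return 0;
}
```

On an Optimus laptop this probes the default adapter, which is usually the Intel GPU, so make sure the test runs on the dGPU if that's the one you care about.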
-
This post provides a nice explanation of why we're seeing the results we're seeing. It's a wall of text but a very enlightening read.
http://www.overclock.net/t/1569897/...gularity-dx12-benchmarks/390_30#post_24321843 -
Basically each vendor has a different bottleneck. Now it's the developer's job to optimize their game engine so their game doesn't hit a bottleneck.
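To make that concrete, here's a rough sketch of the kind of per-vendor branching an engine might do (purely my own illustration, not anything from Oxide; the useAsyncCompute flag is invented): read the adapter's PCI vendor ID and pick a submission strategy accordingly.

```cpp
// Hypothetical per-vendor tuning: query the primary adapter's vendor ID via
// DXGI and decide which submission strategy to use. Links against dxgi.lib.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK)  // primary adapter
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);

        bool useAsyncCompute = false;                 // invented engine flag
        switch (desc.VendorId)
        {
        case 0x1002: useAsyncCompute = true;  break;  // AMD: GCN copes well with extra compute queues
        case 0x10DE: useAsyncCompute = false; break;  // NVIDIA (Maxwell era): keep compute on the graphics queue
        default:     break;                           // Intel / unknown: play it safe
        }
        std::wprintf(L"Adapter: %ls, async compute path: %ls\n",
                     desc.Description, useAsyncCompute ? L"on" : L"off");
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```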
-
). If not, it's the same thing as before.
-
Nice link, n = 1. That goes into a lot more detail about the "why" than the mainstream reviews I'd seen. It's interesting that it really does essentially come down to architecture, with GCN being more suited to highly-parallel scheduling. With GCN 1.1/1.2 having notably higher parallelism than 1.0, it would also suggest that as DX12 starts taking off, AMD should probably update its last remaining GCN 1.0 cards in the 300 series with more modern ones (Fury Nano?). But just as interesting is that to really shine in DX12, NVIDIA may need to release new hardware - and if Pascal does launch in Q2 2016, they may well have it out before DX12 is a major factor for most gamers.
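For anyone curious what "highly-parallel scheduling" actually looks like at the API level, here's a bare-bones sketch (my own, not Oxide's code) of the DX12 pattern in question: a second, compute-only queue alongside the normal graphics queue. Whether work on the two queues actually overlaps is up to the hardware and driver, which is exactly where GCN and Maxwell differ.

```cpp
// Creating a graphics queue plus a separate compute-only queue; submitting
// work to both is what "async compute" means in DX12. Links against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The usual graphics ("direct") queue.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    if (FAILED(device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue))))
        return 1;

    // A second, compute-only queue. Command lists submitted here *may* run
    // concurrently with graphics work; whether they really do depends on the
    // GPU's schedulers, not on the API.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    if (FAILED(device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue))))
        return 1;

    return 0;
}
```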
Though TBH, I'm kind of glad that there's a reason for this that could show up in other games as well, and that AMD's advantage in DX12 may last for several months once DX12 is supported in several titles. They could use a few more sales to help maintain competitiveness in the GPU industry. We'll see, but at this point I see any performance-based reason to buy AMD GPUs as likely a good thing in the long-term, at least until the market share evens out somewhat. They can't really afford for Pascal to be as dominant in DX12 as Maxwell has been in DX11, so I'm hoping they did think things out ahead of time in this case. -
-
Optimizations will likely help, but I wouldn't count on them to make or break the end results, because (for the time being) Nvidia's hardware is NOT as good as AMD's at parallelization tasks.
DX12 is already very close to the metal, so how far do you think you can go with optimizations if the hardware is incapable of doing something? -
-
moviemarketing Milk Drinker
After DX12 driver updates for both sides, Fury X and 980 Ti look about even, although I don't see any breakdown of the asynchronous compute portion:
Nvidia does much better in the DX11 benchmark:
http://www.computerbase.de/2015-10/...ashes-of-the-singularity-directx-12-1920-1080 -
moviemarketing Milk Drinker
Interesting to see the latest round of Ashes DX12 benchmarks, comparing performance from mixing your AMD and Nvidia cards together: Red + Green > Green + Red
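That mixed-vendor setup works because DX12's explicit multi-adapter leaves GPU pairing entirely to the engine. A rough sketch of the enumeration step (my own illustration, not Oxide's code) would look something like this: walk every adapter DXGI reports and create an independent D3D12 device on each, vendor be damned.

```cpp
// Enumerate every adapter in the system and create a D3D12 device on each
// one that supports it -- the starting point for explicit multi-adapter.
// Links against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) == S_OK; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            // Each device is fully independent; the engine decides how to
            // split the frame between them (e.g. post-processing on GPU #2).
            std::wprintf(L"DX12 device created on: %ls\n", desc.Description);
        }
    }
    return 0;
}
```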
-
Clearly NVidia needs to optimize drivers for this game. It's one game, not worth worrying about or making any assumptions with.
NVidia's DX11 support is excellent. Remember Dragon Age 2? An unabashedly AMD-optimized game? After a few weeks, NVidia optimized its drivers and helped patch the game until NVidia was outperforming AMD in its own game.
Remember FB3? Every single tech promo and preview of BF4 was with AMD, another shameful AMD-optimized game and Mantle fail. NVidia GPUs, once again after some work, outperform AMD in their own AMD-featured game. -
HAS NO ONE NOTICED THIS DUMB GAME IS AN AMD GAME?!?!?!? The AMD logo is plastered all over their promotional videos and website.
This is a crappy engine developed for AMD Mantle and AMD's crap console APUs. -
AotS is a PC exclusive.
-
-
-
killkenny1 Too weird to live, too rare to die.
-
-
And lastly, a 7970M that burned itself out. WTF, seriously WTF. And an Asus 5870M that refused to downclock, so the memory was constantly running at max, the voltage was fixed, and the GPU was unstable at lower minimum clocks. What a piece of **** that was.
Overall, in the last few years since I said enough was enough: the 675MX, while not as powerful as I'd have liked, gave me few complaints, though Optimus wasn't optimal. The 980M has been a great GPU for me overall. I can't say the same for the AMD cards I've used in 4 other laptops. -
Seriously, why do you think nVidia was able to fix the TressFX performance issue in Tomb Raider within 2 weeks? Because Crystal Dynamics supplied them with the goddamn source code! -
That shows how incredible nVidia is: no one can defy them, developers are forced to bend to their will and hand over the source code.
I'll stick with the evil that gets its way all the time over AMD, which only gives me angina and aneurysms. -
Ol' hulawafu has become a caricature of himself
-
-
-
AMD isn't competitive in mobile graphics and the highest end of desktop graphics, but one tier below the highest end on the desktop side they're still very competitive.
Besides, just because a company offers what you want doesn't mean you have to be a slavering sheep and agree with their clearly anti-competitive practices. -
I'll admit, recently NVidia has been trolling their customers with the GTX 970 RAM issue and with drivers on mobile, but overall I have had a clean experience. It just works. -
It works. And only a slight boost over DX11, maybe 4 FPS? I don't remember. And I see from the Fable DX12 benchmark that NVidia is also lagging there. NVidia definitely needs to start working on DX12. They shouldn't have made the statement that they are confident with DX12. Admitting a problem wouldn't be recommended either; they should have just kept their mouth shut and answered with improvements to their DX12 support.
But no doubt NVidia will have this sorted.
-
moviemarketing Milk Drinker
Here is another video with R9 Fury and GTX 970 asymmetrical SLI:
-
King of Interns Simply a laptop enthusiast
Pretty cool! -
They couldn't get a 60 FPS average with a Fury and a 980, for a game that looks this terrible? The lighting effects are awful, the explosions are underwhelming, and the textures it supposedly renders so well look worse than in most DX9 games, worse than Skyrim's background textures. This is not the DX12 game to be testing our hardware with; clearly this was optimized by gorillas. I know this is only an Alpha, but for a game this terrible to run this horribly, that's pathetic. Two of the best GPUs available, which would probably run BF4 @ 100 FPS @ 4K...
-
DX12 is very much in a beta stage when it comes to game development, so give them a break... it will get better in a few months, or a year or two... -