The embargo on Ashes of the Singularity reviews lifted today, giving us the first benchmarks from a game that will support DX12 when released. And I must say, the results were rather surprising, enough that I checked multiple sites to ensure the first one I saw wasn't an aberration. But the results consistently show major gains over DX11 on AMD's cards, and even-to-slightly-worse results on NVIDIA's (with one exception).
While it may be that other games give results more similar to the DX11 status quo, and NVIDIA may be able to optimize its drivers to see gains similar to what AMD is seeing (or the one-off GTX 770 result above), it's certainly a much more positive result for AMD than I was expecting. Granted, with the Mantle work it wouldn't have surprised me to see AMD get better news than NVIDIA from DX12, but I wasn't expecting such drastic changes.
It's probably too early to base GPU choices on DX12, but if the next couple of benchmarks show similar differences, that could be a significant boost to AMD's fortunes.
-
-
I can't even get dxdiag to tell me I have DX12 capability under Win10 on my 870M... It's possible the support for it is poor right now.
-
DX12 places more burden on the game developer to produce better game code to fully utilize the API's potential. Given that a majority of PC games are console ports with extremely poor implementations, I don't have high hopes that we will see large performance gains any time soon. If anything, I expect more buggy games with worse performance, since your hardware and driver matter less.
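To make that burden concrete, here is a minimal sketch (C++/Direct3D 12; the helper name and parameters are mine, not from any shipping engine) of one responsibility DX12 hands to the game: resource state transitions that the DX11 driver used to track automatically.

#include <d3d12.h>

// Transition a texture from render-target to shader-resource use. Under DX11
// the driver tracked this hazard automatically; under DX12 the game must
// record the barrier itself, or rendering silently breaks.
void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                ID3D12Resource* texture) {
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = texture;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
}

Get one of these wrong and you get corruption or a device hang, which is exactly the kind of bug a sloppy console port will ship with.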
-
While you could be right, it may also work out well, since the Xbox One will also use DX12. It might be in their best interest to do things properly.
-
King of Interns Simply a laptop enthusiast
I guess AMD, having already developed Mantle, has the edge with driver support.
Nvidia will close the gap in the coming months, I guess. -
moviemarketing Milk Drinker
So... is the R9 M390X going to beat the 980M/990M in DX12 games?
Notebookcheck rates it as worse than the 970M for DX11 titles. -
I highly doubt it. I think these pre-beta results aren't really indicative of very much.
I think it is what it is: AMD got a head start on this title from its previous Mantle support and probably caught Nvidia snoozing, since it's not yet a proper release. -
How the hell is that possible? Are you using the latest drivers? I have DX12 on my pc.
-
Yep, my guess too.
Just give them some time to polish the DX12 drivers.
The M390X/970M will be irrelevant by the time DX12 becomes mainstream.
-
Oh.. it definitely says I have DX12 installed:
But when I look under the display itself and the driver feature support levels:
-
Mine says the same thing in the display tab (of dxdiag), but it does list DX12 in the render tab. I am using an older driver version, though...
-
Meaker@Sager Company Representative
If you have WDDM 2.0 then you have DX12.
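If you'd rather check programmatically than squint at dxdiag tabs, here's a minimal sketch (C++, assuming the Windows 10 SDK): D3D12CreateDevice accepts a null output pointer, in which case it only tests whether a DX12 device could be created on the default adapter.

#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // A null device pointer asks D3D12CreateDevice to test support without
    // actually creating a device; 11_0 is the minimum feature level DX12 accepts.
    HRESULT hr = D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                   __uuidof(ID3D12Device), nullptr);
    printf(SUCCEEDED(hr) ? "DX12-capable (WDDM 2.0 driver present)\n"
                         : "No DX12 device available\n");
    return 0;
}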
-
This post provides a nice explanation of why we're seeing the results we're seeing. It's a wall of text but a very enlightening read.
http://www.overclock.net/t/1569897/...gularity-dx12-benchmarks/390_30#post_24321843 -
Basically each vendor has a different bottleneck. Now it's the developer's job to optimize their game engine so their game doesn't hit a bottleneck.
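In practice that can be as blunt as branching on the adapter's PCI vendor ID and picking a tuned path per vendor. A minimal sketch (C++/DXGI; the per-vendor labels are my own guesses at what an engine might do, not anything Oxide has confirmed):

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // PCI vendor IDs: 0x1002 = AMD, 0x10DE = NVIDIA, 0x8086 = Intel.
        const char* path = desc.VendorId == 0x1002 ? "wide async-compute path"
                         : desc.VendorId == 0x10DE ? "serialized compute path"
                         : "generic path";
        printf("Adapter %u: %ls -> %s\n", i, desc.Description, path);
        adapter->Release();
    }
    factory->Release();
    return 0;
}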
-
I've always wondered why console ports perform so poorly. DX12 being able to feed the GPU faster doesn't change the raw processing power of a GPU; it only eliminates a previous bottleneck. If having DX12 makes console ports work as well on PC as on the native platform, then life is great (yeah, I'm dreaming). If not, it's the same thing as before.
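For what "feeding the GPU faster" means in code: DX11 funneled draw submission through a single immediate context, while DX12 lets every core record commands in parallel. A minimal sketch (C++/D3D12; the actual draw recording is elided, and the names are illustrative):

#include <d3d12.h>
#include <thread>
#include <vector>

// Each worker records draws for one chunk of the scene into its own command
// list; recording is free-threaded as long as each thread has its own
// allocator/list pair.
void RecordChunk(ID3D12GraphicsCommandList* list) {
    // ... record this chunk's draw calls here (elided) ...
    list->Close();
}

void RecordFrame(const std::vector<ID3D12GraphicsCommandList*>& lists,
                 ID3D12CommandQueue* queue) {
    std::vector<std::thread> workers;
    for (ID3D12GraphicsCommandList* list : lists)
        workers.emplace_back(RecordChunk, list);
    for (std::thread& w : workers)
        w.join();
    // Submit everything in one batch; ExecuteCommandLists takes the base type.
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}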
-
Nice link, n = 1. That goes into a lot more detail about the "why" than the mainstream reviews I'd seen. It's interesting that it really does essentially come down to architecture, with GCN being more suited to highly parallel scheduling. Since GCN 1.1/1.2 has notably higher parallelism than 1.0, that also suggests that as DX12 starts taking off, AMD should probably replace its last remaining GCN 1.0 cards in the 300 series with more modern ones (Fury Nano?). Just as interesting is that to really shine in DX12, NVIDIA may need to release new hardware; and if Pascal does launch in Q2 2016, they may well have it out before DX12 is a major factor for most gamers.
Though TBH, I'm kind of glad that there's an underlying reason that could show up in other games as well, and that AMD's advantage may last for several months once DX12 is supported in several titles. They could use a few more sales to help maintain competitiveness in the GPU industry. We'll see, but at this point I see any performance-based reason to buy AMD GPUs as likely a good thing in the long term, at least until the market share evens out somewhat. They can't really afford for Pascal to be as dominant in DX12 as Maxwell has been in DX11, so I'm hoping they did scope things out ahead of time in this case. -
That's my problem, I don't see the "render" tab in dxdiag. Perhaps due to Optimus? I see you aren't showing a 2nd display adapter for the integrated GPU.
-
Not quite.
Optimizations will likely help, but I wouldn't count on them to make or break the end results, because (for the time being) Nvidia's hardware is NOT as good as AMD's at parallelization.
DX12 is already very close to the metal, so how far do you think you can get with optimizations if the hardware is incapable of doing something? -
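For reference, the parallelization in question is async compute: submitting work on a separate compute queue and hoping the hardware overlaps it with graphics. A minimal sketch of the DX12 side (C++; whether the queues actually run concurrently is up to the hardware scheduler, which is exactly the AMD/Nvidia difference here):

#include <d3d12.h>

// Create a compute queue alongside the usual direct (graphics) queue. On GPUs
// with independent compute engines (e.g. GCN's ACEs), work submitted here can
// overlap graphics work instead of serializing behind it.
ID3D12CommandQueue* CreateComputeQueue(ID3D12Device* device) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type  = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only engine
    desc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;
    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&desc, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(&queue));
    return queue;  // synchronize against the graphics queue with an ID3D12Fence
}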
I do have an HD 4600 and Optimus on, although I don't think that should show in the dxdiag video and render tabs.
-
moviemarketing Milk Drinker
After DX12 driver updates for both sides, the Fury X and 980 Ti look about even, although I don't see any breakdown of the asynchronous compute portion:
Nvidia does much better in DX11 benchmark:
http://www.computerbase.de/2015-10/...ashes-of-the-singularity-directx-12-1920-1080 -
moviemarketing Milk Drinker
Interesting to see the latest round of Ashes DX12 benchmarks, comparing performance when mixing AMD and Nvidia cards together: Red + Green > Green + Red
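That mixing works because DX12's explicit multi-adapter model simply hands the engine every GPU it can create a device on, and leaves the frame-splitting to the developer. A minimal sketch of the enumeration side (C++; error handling trimmed, and how you divide the work is entirely up to the engine):

#include <d3d12.h>
#include <dxgi.h>
#include <vector>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

// Create one D3D12 device per physical GPU, regardless of vendor; an AMD and
// an NVIDIA card can coexist in the returned list.
std::vector<ID3D12Device*> CreateAllDevices() {
    std::vector<ID3D12Device*> devices;
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return devices;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ID3D12Device* device = nullptr;
        if (SUCCEEDED(D3D12CreateDevice(adapter, D3D_FEATURE_LEVEL_11_0,
                                        __uuidof(ID3D12Device),
                                        reinterpret_cast<void**>(&device))))
            devices.push_back(device);
        adapter->Release();
    }
    factory->Release();
    return devices;
}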
-
Clearly NVidia needs to optimize drivers for this game. It's one game, not worth worrying about or making any assumptions from.
NVidia's DX11 support is excellent. Remember Dragon Age 2? An unabashedly AMD-optimized game? After a few weeks, NVidia optimized its drivers and helped patch the game until it was outperforming AMD in AMD's own game.
Remember FB3? Every single tech promo and preview of BF4 was with AMD, another shamelessly AMD-optimized game and a Mantle fail. Once again, after some work, NVidia GPUs outperform AMD in an AMD-featured game. -
HAS NO ONE NOTICED THIS DUMB GAME IS AN AMD GAME?!?!?!? The AMD logo is plastered all over their promotional videos and website.
This is a crappy engine developed for AMD Mantle and AMD's crap console APUs. -
Mantle is dead.
AotS is a PC exclusive.
-
lol this just proves Gaming Evolved titles run well on both AMD and nVidia hardware, and AMD isn't trying to artificially gimp the competition. Now can you say the same for Gimpworks titles?
-
Proves to me NVidia is better at providing software support to even crush AMD sponsored titles.
-
killkenny1 Too weird to live, too rare to die.
And gimping their own titles with stuff like Game(sorta)works
-
Plus ça change, plus c'est la même chose (the more things change, the more they stay the same).
-
As someone who has been gaming exclusively on laptops, I'll take some evil over frustration and years' worth of anger using AMD. Years of buying a top-tier AMD card, with AMD claiming it was more powerful than NVidia's, only to get crushed in real games. Years of shaders and shadows not being properly rendered; I remember for a while Borderlands shadows weren't rendered properly on AMD mobile. Years of the worst drivers available. Years of absurd amounts of heat. Years of unstable, frustrating Enduro BS.
And lastly, a 7970M that burned itself out. WTF, seriously WTF. An Asus 5870M that refused to downclock, so the memory constantly ran at max, the voltage was fixed, and the GPU was unstable at lower minimum clocks. What a piece of **** that was.
Overall, in the last few years since I said enough was enough: the 675MX, while not as powerful as I'd have liked, gave me few complaints, though Optimus wasn't optimal. The 980M has been a great GPU for me overall. I can't say the same for the AMD cards I've used in 4 other laptops. -
And the reason that's possible is that AMD doesn't use proprietary libraries that are essentially a complete black box, like the Gameworks suite.
Seriously, why do you think nVidia was able to fix the TressFX performance issue in Tomb Raider within 2 weeks? Because Crystal Dynamics supplied them with the goddamn source code! -
That shows how incredible nVidia is: no one can defy them, developers are forced to bend to their will and provide the source code.
I'll stick with this evil that gets its way all the time over an AMD that only gives me anginas and aneurysms. -
Ol' hulawafu has become a caricature of himself
-
Well, with attitudes like this, no wonder nVidia has been treating its customers with the utmost haughty disdain recently.
-
I see no other choice. AMD is not a choice; as far as I'm concerned, the company doesn't exist. They have nothing to offer for desktop, server, mobile devices, or laptop/desktop gaming-grade GPUs. This company has no future on the paths it has chosen. -
-
AMD isn't competitive in mobile graphics and the highest end of desktop graphics, but one tier below the highest end on the desktop side they're still very competitive.
Besides, just because a company offers what you want doesn't mean you have to be a slavering sheep and agree with their clearly anti-competitive practices. -
You're right, but I just don't care enough anymore. In the past I whined and created numerous threads, polls, and petitions, and traded emails with Clevo's and Asus's service departments, and AMD never fixed a damn thing, not one. So I give up; I'll go with whatever NVidia wants to do, as long as it performs well on the hardware I bought. I no longer care if features are detrimental to AMD; as far as I'm concerned, AMD users should just get a 965M/970M then. Even on desktop, the 960/970 perform great and are priced competitively.
I'll admit NVidia has recently been trolling its customers with the GTX 970 RAM issue and its mobile drivers, but overall I have had a clean experience. It just works. -
It works, but it's only a slight boost over DX11, maybe 4 FPS? I don't remember. And I see from the Fable DX12 benchmark that NVidia is lagging there too. NVidia definitely needs to start working on DX12. They shouldn't have made the statement that they are confident with DX12; they should have just kept their mouths shut and answered with improvements to their DX12 support.
But no doubt NVidia will have this sorted.
-
moviemarketing Milk Drinker
Here is another video with an R9 Fury and a GTX 970 in asymmetrical multi-GPU:
-
King of Interns Simply a laptop enthusiast
I wonder what would happen if you put a 970M and an M290X together in a laptop. Would the game use both cards?
Pretty cool! -
They couldn't get a 60 FPS average with a Fury and a 980, for a game that looks this terrible? The lighting effects are awful, the explosions are underwhelming, and the textures it supposedly renders so well look worse than most DX9 games', worse than Skyrim's background textures. This is not the DX12 game to be testing our hardware with; clearly it was optimized by gorillas. I know this is only an alpha, but for a game this ugly to run this horribly is pathetic. Two of the best GPUs available, which would probably run BF4 at 100 FPS at 4K...
-
DX12 is very much in a beta stage when it comes to game development, so give them a break... it will get better in a few months, or a year or two. -
It's Stardock, so it will never be a good game to judge DX12 by. I'm sure they will improve performance, but this will never be a proper benchmark for DX12.