Take a look at this interesting article: AnandTech Portal | Understanding AMD's Mantle: A Low-Level Graphics API for GCN
AMD Mantle is trying to bring the benefits of console-like low-level graphics rendering API to the PC gaming space. What could this potentially mean for future PC games? How much extra performance could developers possibly squeeze out of the PC graphics card as a result of this? Will this mean fragmentation of the graphics APIs used on PCs - with some using Mantle and others using Direct3D? Will this directly harm NVIDIA?
The first game to utilise Mantle will be Battlefield 4.
Discuss!
-
TheBlackIdentity Notebook Evangelist
If this gives a significant performance advantage in BF4 I'll sell my 780 and get an R9 290.
-
It is open source, so I do not see how it would hurt Nvidia, because they could just adopt it themselves if it is that good ...
-
Karamazovmm Overthinking? Always!
I'm wondering if it's actually going to happen, I hope so
-
-
This would REALLY suck for people with external graphics. Nvidia has been the go-to card for eGPUs because of Optimus compression, but if AMD cards get a much bigger boost in performance, it'll be hard to stick with Nvidia.
-
-
In all seriousness though, I completely agree. Mantle is the next 3dfx Glide and we all know how that turned out.
-
I am pretty sure AMD stated that it was open source during the conference.
I will try and find where I heard/saw that to link it.
http://www.tomshardware.com/news/amd-mantle-api-gcn-battlefield-4,24418.html -
It won't hurt nVidia in the sense that they'll no longer be relevant; it simply means we will be seeing games better optimized for AMD hardware than before. The main attraction, though, is the fact that consoles can readily use it.
Hopefully it all means better performance all around. -
Imagine now an AMD 65W-75W APU with optimized gaming performance that will easily compete with an Intel 35W CPU + 75W GPU for gaming. AMD has an end game in mind, and it looks like they are playing their cards right for the first time in a long while. Intel and nVidia will have the independent CPU and GPU crowns, but the all-in-one solution will likely belong to AMD. In the end: less power, less heat, and much cheaper. I used to be a major advocate for Microsoft, but at the rate they've been going lately I'd love to see a Linux and AMD competitor.
-
Karamazovmm Overthinking? Always!
What I never understood is why AMD limits the TDP so much. They could have better iGPU performance; that high-TDP Kabini CPU that appeared is one interesting piece. The CPU processing power is still way below par
-
So for PCs, AMD has 33% of the gaming market according to Steam. Right now a developer can spend all their time optimizing for DirectX or OpenGL. With Mantle, they'd have to optimize for AMD with Mantle and for DirectX/OpenGL as well. So now they have double the work.
As pointed out above, Glide was the same thing for 3dfx cards. In fact, Glide was better than DirectX at the time, because DirectX was so new, buggy, and hard to program for. But developers had to either develop for Glide and limit their market, or double the work and develop for both Glide and DirectX. In the end they chose to develop just for DirectX; even though it wasn't the best, it was still less work for them.
DirectX has advanced a lot since then. And while this says Mantle is open, it is only open in terms of who can use it. Just like CUDA is "open," but I don't see AMD using that. This would only be truly open if Nvidia had an equal say in the language's development.
As I see it, this is good for consoles. But if, or when, they try to port those Mantle console games, it will create more work, and we will see bad graphics ports. I see this as a bad thing for PC gaming, just as bad as Glide was. Things got so much better after Glide died. -
The difference is that if AMD provides developer kits for using Mantle, they will be coding for it anyhow. Not to mention that publishers have ten times the budgets they did 20 years ago when Glide first became available. Sure, the market was different and they would throw money at pretty much anything, but the budgets for the bigger titles were still minuscule compared with today.
-
So many of you are hailing Mantle as the greatest thing since sliced bread, but I'm not convinced. There's just too little information for me to judge whether Mantle is a good thing or a bad thing. So being the natural skeptic that I am, I'm gonna play devil's advocate here.
First of all, you guys need to understand the reason that people like me compare Mantle to Glide. A low-level API is inherently closed and proprietary because of the vast architectural differences between the GPUs made by Nvidia, AMD, and Intel. Glide was superior to D3D and OpenGL back in the day because not only did it enable extra graphical effects, it also performed faster and with more stability. However, this was fantastic only for 3dfx card owners. People who owned other GPUs were left in the dark. It was a good thing for everyone that Glide eventually died and D3D took over as the sole graphics API for PC games. Having developers write games exclusively in D3D improved and matured the API rapidly. And the abstraction provided by a high-level API is also a good thing to ensure compatibility and support across the broad spectrum of PC hardware out there.
But now with Mantle, AMD is essentially reversing all of that. Mantle will, like Glide, again compete with D3D and OpenGL. So while Mantle is good for AMD users and consoles, what does it mean for the majority of PC gamers who are Nvidia or Intel users? If Mantle provides significant improvements for AMD hardware and gains enough traction in the market that it tips market share in AMD's favor, will Nvidia and Intel sit idly by and let that happen? Rumor is that Mantle is "open," but what does that mean? Does "open" really mean anything at all? I mean, we all know how architecturally different the GPUs from the three big players are. Do you seriously think Nvidia is going to devote a huge amount of resources to reverse-engineer Mantle to work on their specific GPU architecture? Why not just develop their own proprietary low-level API? Ditto for Intel. Is this kind of fragmentation even a good thing? -
Yes, times have changed, so who knows what will happen. -
The average consumer knows what a hot/noisy machine is, but TDP and performance? What the hell are those?
Also, higher TDP = larger adapter, and possibly larger cooling/fan. Some people swear when they see a brick-sized adapter. -
-
davidricardo86 Notebook Deity
With Mantle being geared towards AMD's GCN architecture, would it be safe to say that it could also bring improvements such as longer battery life for mobile GCN-equipped devices like my V5-122P? Correct me if I'm wrong.
In regards to the CPU, one of the slides mentions "Very low overhead rendering, loading & streaming." As for the GPU, "Lots of low-level optimizations made possible." Sure, using Mantle for compute performance on GCN-equipped gaming machines is a given, but what about efficiency and, say, battery life in mobile devices?
Could this help those using "professional-level" AMD GCN GPUs too? -
It's not about compute, it's about rendering. With a low-level API, you need almost 10 times less CPU time than with DirectX.
DirectX is, compared to consoles, the biggest bottleneck you can imagine for video games. On a gaming PC we have so much firepower, but it's restrained by a 10x ratio. The biggest example: for the same rendered scene, a console can execute almost 20k draw calls, while on PC you can't get above 2k-3k, all because of DirectX (and only because of DX).
With a low-level API, all those stupid bottlenecks are gone. You can do whatever you want with draw calls, memory buffering, and shaders, and in a much more efficient way.
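To put some toy numbers behind the draw-call argument above, here is a minimal C++ sketch. This is not real Mantle or DirectX code; the cost figures, types, and function names are all invented for illustration. The point is the shape of the cost: a high-level API pays driver validation on every single draw call, every frame, while a low-level API lets you pay that cost once while recording a command buffer and then resubmit the buffer cheaply.

```cpp
// Toy model of per-draw-call driver overhead vs. pre-recorded command buffers.
// Illustrative only: all names and costs are made up.
#include <cstdio>
#include <vector>

struct DrawCmd { int meshId; int materialId; };

// "High-level API" style: the driver re-validates state on every call.
void drawImmediate(const DrawCmd& c, long& validationWork) {
    (void)c;
    validationWork += 100;  // pretend each call costs 100 units of validation
}

// "Low-level API" style: validate once at record time, replay cheaply.
struct CommandBuffer {
    std::vector<DrawCmd> cmds;
    void record(const DrawCmd& c, long& validationWork) {
        validationWork += 100;  // validation paid once, when recording
        cmds.push_back(c);
    }
    void submit(long& validationWork) {
        validationWork += 1;    // one cheap submission for the whole batch
    }
};

int main() {
    const int frames = 60, drawsPerFrame = 20000;
    long highLevel = 0, lowLevel = 0;

    // High-level path: full validation cost on every draw, every frame.
    for (int f = 0; f < frames; ++f)
        for (int d = 0; d < drawsPerFrame; ++d)
            drawImmediate({d, d % 8}, highLevel);

    // Low-level path: record once, then just resubmit each frame.
    CommandBuffer cb;
    for (int d = 0; d < drawsPerFrame; ++d)
        cb.record({d, d % 8}, lowLevel);
    for (int f = 0; f < frames; ++f)
        cb.submit(lowLevel);

    std::printf("per-call validation:  %ld units\n", highLevel);
    std::printf("record-once + replay: %ld units\n", lowLevel);
}
```

Whether real drivers hit exactly a 10x ratio is anyone's guess, but this is the mechanism behind the 20k-vs-2k draw call numbers quoted above.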
So for the sake of video games' future, and as gamers, we really should pray that Mantle has huge commercial success, even if it means having to give up Nvidia.
And in a few years I hope we will get unified memory as well, which AMD is working on. It's the second-biggest bottleneck on gaming PCs.
AMD won a great war against Nvidia by getting the ENTIRE console market. Mantle is almost the same API as the low-level API in the Xbox One, which in fact is also really close to the PS4's low-level API as well. Any rendering code on console can be ported to PC really easily; only memory assets need to be rewritten on PC.
Honestly, Mantle is really the best thing any PC gamer could wish for. Now it's up to devs to use it and optimize as much as they can, even if it means giving up on DX optimization. Because yes, without optimization it won't change anything compared to a really well-optimized DX game.
And by the way, Mantle is NOT DirectX or OpenGL; it's another API. And the situation is really different from 3dfx's Glide. Glide went down because they couldn't and didn't follow up with innovation. In 1997 they had a big advantage, which they didn't build on: Glide never went above 16-bit color on screen, while DX introduced 32-bit, for textures as well. And Glide wasn't inside every console you could buy. -
This entire thing is a push to create something certain people at Beyond3D wish for, and what certain useless hardware "experts" insist is essential for a console. "Low-level hacks!" It's necessary to get "performance" on "consoles".
Same as every time. It's going to be talked up by said sources, such as Leadbetter on Eurogamer. I can practically hear him making a fool of himself already. It's going to generate lots of "good" "will" from various media outlets. And then no one is going to use it for anything whatsoever, apart from trying to shave off init times for badly written DirectX functions. -
It's faster for drawing a 3D scene. No more DirectX "layer" for the CPU to handle => much more CPU time => much more available power for AI/physics.
And saying it's essential for consoles doesn't mean it isn't for PC. Yes, we have much more power in our high-end gaming laptops, so devs don't need good optimization to get good visuals.
I agree with what you say. It's up to devs to use this in PC gaming ports. Will they use Mantle? We'll see. But unlike Glide, they will already be using it on the X1. It's not like they have to start from scratch on PC. It's a fact.
But once again, as a gamer (I don't give a **** about the AMD/Nvidia fanboy war), I just want this to come to PC games. I don't want to see poorly optimized games on PC anymore. -
Also, what is "a 3D scene", right? Remember the hoopla about deferred rendering a while back? You had serious and very knowledgeable people in the industry saying that deferred rendering was a sham. That it increased the running time of any given function for no purpose. That it was possible to take any deferred rendering engine and optimize it with a direct (forward) one.
That was just people talking out of their asses, and they were all proven extremely wrong. Any number of people would have been able to explain it perfectly, and demonstrate specifically what they had accomplished with it, while any attempt to replicate those effects directly would have failed. Even on a completely standard single-core system, it was possible to run per-object passes without the running times increasing linearly. That was the theoretical justification for it as well, and it was absolutely sound.
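For anyone who missed that hoopla: here is a back-of-the-envelope C++ sketch of why deferred rendering changes the cost model. All the numbers are invented, and a real engine rasterizes into G-buffer textures rather than counting steps, but the arithmetic is the point: forward shading pays roughly fragments x lights, while deferred pays one geometry pass plus each light touching only its own screen region.

```cpp
// Back-of-the-envelope cost model: forward vs. deferred shading.
// Illustrative only: all counts are invented.
#include <cstdio>

int main() {
    const long fragments     = 1000000; // visible fragments on screen
    const long lights        = 100;     // dynamic lights in the scene
    const long fragsPerLight = 20000;   // screen area each light actually touches

    // Forward (multi-pass): every visible fragment is shaded once per light.
    const long forwardWork = fragments * lights;

    // Deferred: one geometry pass writes position/normal/albedo to a G-buffer,
    // then each light shades only the fragments inside its screen-space region.
    const long deferredWork = fragments + lights * fragsPerLight;

    std::printf("forward shading steps:  %ld\n", forwardWork);
    std::printf("deferred shading steps: %ld\n", deferredWork);
}
```

That sub-linear scaling in the number of lights/passes is exactly the kind of win the naysayers said was impossible.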
But no one in the games media actually admitted that or wrote anything to promote the successful solutions that came from that concept (such as anything from extremely quick pseudo-3D environments in Unity, to very clever overlay solutions).
And we're stuck with people still believing that as long as you have access to the "low level", or "the metal" as certain people insist on saying, what you automatically get is fantastically optimized games. It doesn't work like that. It never did.
What you can do with low-level implementations is create, if not assembly routines (very rarely will anyone actually do that), then certain high-level-language routines that exploit a particular architecture. This is of course extremely useful if you have almost no RAM, or a chip whose main RAM area can literally only hold a handful of references at a time. Then certain tricks will allow you to fit in more functions than before. This was useful when RAM was measured in KB, clocks were measured in hours, and stores had physical addresses at all times.
But on anything made since 1998, you won't benefit from doing this over a competent implementation at the high level. Anyone claiming differently is fooling themselves. I heard the same thing from a bunch of people who were involved with Cell processor development early on. They believed that if only you could write routines that completed and had system control at the low level, you could easily squeeze out performance that wasn't available otherwise. While what it really does is just create unscheduled program locks that can only ever fit in one particular software construction, which then cannot be changed. You can optimize that function that runs autonomously, but it will require hooks and saved memory addresses that block other functions.
See, even on standard 64-bit x86 architectures with a single bus to a graphics array, all of the calls that you actually make to those graphics cores in the first place, from the high level, will execute individual memory placements and math operations faster than a low-level optimisation. Meanwhile, any amount of "trickery" to do, say, lighting halos on objects, etc. - all of that can be done indirectly with functions that suddenly are not hardware-dependent, but instead just require such and such an amount of clock cycles given such and such a main-routine response.
So how people like Leadbetter at Eurogamer, or whoever else is running around going off about "the Cell processor cannot prepare a 16-bit memory operation as fast as a true x86 system can, according to my obnoxiously stupid paper that I actually put out under my own name", keep on not getting nailed on this crap, I don't understand. But there you go. Lots of things about the games "industry" that I don't get.
The thing is, there's a reason why you won't see the most high-profile games developers talking about coding strategies at GDC. It used to be different. You used to have developers such as... you know, Particle Systems, or even id, Psygnosis, etc., with clever things to say about coding practices. Whereas nowadays, if one of the DICE developers gives a talk about how to create high-level routines for graphics and game logic that can be implemented successfully on hardware with such and such an instruction-level capability, there's a presumption that it's just a marketing ploy. "DICE are being flagrant, trying to make themselves look like experts." No one with access listens to it, while people truly believe that no good solution can exist unless it was created by a large company, with a proprietary solution stamped on it, that costs thousands of dollars to buy.
Then they're on board. And you have people arguing that the Unreal 3 engine actually, automatically, gives you better performance, better graphics, and better storytelling, and that it cures cancer - rather than just a quicker product-to-production time.
Another thing: this is not like Glide at all. Glide was a selection of "known" routines that were optimized to run only on Voodoo cards. It allowed games to use shortcuts for routines massively more expensive than what similar graphics cards could do in OpenGL. So pushing out the Voodoo cards and letting developers use Glide for certain lighting effects and math operations suddenly made a range of new functions possible on personal computers you could actually buy for money.
While Mantle would be a way to let protected SDKs (such as DirectX for the Xbone, or Sony's a**ho** SDKs) allow access to specific low-level routines. Like I said, this is completely useless, and only there to make consoles seem more "PC-like". But you will, without any doubt, be able to read about some gnome on Beyond3D "proving" that this is extremely helpful for the next couple of months. While Leadbetter at Eurogamer will come up with a ********* narrative about how it increases the performance of "so far believed inferior console hardware" tenfold. Just wait and see. -
..so.. did you actually spend time on making that picture - or did you just copy it in from somewhere else?
-
-
-
I didn't know the Hulk had a sense of humor.
Consoles sure could use some optimizing... Here is a gif of Grand Theft Auto V
http://i.imgur.com/XF7hIdk.gif -
-
So basically, what we're all saying here is that Mantle is a separate API like DirectX and OpenGL. This means it is likely to only run on AMD cards. This means games that take advantage of it must either forego one (or both) of the two prior APIs and simply make a Mantle API based game which nvidia and intel cards cannot access, OR they code more than one render method. I remember old games having multiple render methods; UT '99 is one that comes to mind instantly. But I don't think that with a publisher-driven, time-constrained release cycle like we uhh... "enjoy"... today, the developers who have the resources and capabilities to make two optimized API versions of their game will even get the chance to do it.
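For what it's worth, the "multiple render methods" approach from the UT '99 era still looks roughly like this: a hypothetical C++ sketch of a backend abstraction (all class and function names are made up, and the Mantle backend is pure speculation):

```cpp
// Hypothetical sketch of an engine shipping multiple render backends behind
// one interface, selected at startup - the way UT '99-era games offered
// Glide/D3D/software renderers. Names are invented for illustration.
#include <cstdio>
#include <memory>
#include <string>

struct Renderer {                        // engine-facing abstraction
    virtual ~Renderer() = default;
    virtual void drawFrame() = 0;
};

struct D3DRenderer : Renderer {
    void drawFrame() override { std::puts("drawing via the Direct3D path"); }
};

struct MantleRenderer : Renderer {       // speculative Mantle backend
    void drawFrame() override { std::puts("drawing via the Mantle path"); }
};

// Pick a backend from config; fall back to the path every card supports.
std::unique_ptr<Renderer> makeRenderer(const std::string& api) {
    if (api == "mantle") return std::make_unique<MantleRenderer>();
    return std::make_unique<D3DRenderer>();
}

int main() {
    auto r = makeRenderer("mantle");     // e.g. read from an .ini setting
    r->drawFrame();
}
```

The abstraction itself is cheap; the expensive part the post describes is keeping two optimized paths behind it.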
That being said, it'd simply be nice if Mantle was just a way for AMD cards to forego the draw-call inefficiencies of most PCs and simply, say... run in the background or something under DirectX or OpenGL, working with those pre-existing APIs to supremely optimize and draw out power from the cards that would not be there as of right now. But only time will really tell, I suppose. I just would like PC gaming to move forward well. -
I think the fact that a developer can code a game for PS4, Xbone, and AMD PC all at once is great.
Also, this allows x86 tablets using AMD APUs to put out better 3D performance while requiring less CPU muscle. -
Most games don't need a fast CPU; the CPU is not the bottleneck. So making a new API for less CPU overhead doesn't seem to offer much. Also, they reduced quite a bit of the overhead with DirectX 10, so there isn't a lot of room for improvement.
I hold more hope for SteamOS to improve PC gaming than I do for this. -
davidricardo86 Notebook Deity
In their case especially, the CPU IS the bottleneck. Just saying.
AMD is in an advantageous position with regards to the Xbone and PS4. It'd be foolish not to pursue the Mantle project, I think. I cannot wait to see what they show at the Developer Summit in November. -
-
I'm very interested to see where this goes. Things look promising, and it's not just AMD that stands to gain here either. The party that stands to gain the most here is clearly Steam. Nvidia has also been reported to be assisting SteamOS development, and I doubt they are sitting idly by. If things pan out, the consoles and MS are the ones getting the big finger.
It's not like this is half-baked either. BF4 (more so Frostbite 3) will implement it; I guess we'll be able to see if it's all it's cracked up to be in the December update.
Nice read about it here -
I fail to see how/why the consoles are getting the big finger? They will definitely benefit from this, even if it is alongside a Steambox...
-
On the other hand, I hope AMD doesn't turn into a GPU monopoly, since they cornered the consoles, and those games carry over to the PC... -
-
I totally agree that PC is the way to go; I even skipped last-gen consoles entirely. I am a PC gamer, and if Mantle is all it is said to be then we will definitely see a boost from it. But saying next gen will not benefit from this is nonsense; if Mantle did not exist, the gap between console and PC would still be there. The fact is Mantle could make things better on every platform, be it console, PC, or Steambox, and thus we would all benefit from it.
-
After seeing how BF4 punishes the CPU like crazy, I'm curious to see how it is implemented on consoles. This game requires some serious CPU horsepower, and I'm not sure how an AMD CPU is gonna cut the mustard.
-
5 vs 5 servers at capped 30 fps
-
-
Yay, devs now developing for DirectX, OpenGL, console optimizations, and Mantle. And then likely whatever Nvidia responds with. Sweet...
I'd rather have Microsoft stop being lazy with DirectX and give devs more access to hardware and less API overhead. -
Meaning only two APIs used: DX and the console APIs. The same as in the past. -
It's a nice concept, but frankly I think it's stupid having DX, OpenGL, whatever API PlayStation uses, whatever Nintendo uses, and now Mantle. It's no wonder games run like garbage compared to the hardware they're running on.
Also, Mantle for consoles? A console has one model, one set of hardware. Devs already have hardware access on consoles; they are already developing to the metal there. Mantle for consoles? Waste of time to me.
It's PC that needs it. That's why I said I much prefer MS fixing DirectX to give devs more hardware access and less overhead. We are using hardware that is substantially more powerful, efficient, newer tech, etc. But 2013 games don't look any better, or to me run any better, than when I was using a 5870M 3+ years ago.
Yay for Mantle... Whatever. -
I agree with your last point about fragmentation, but you're wrong about games running like garbage because of Mantle. The whole point of Mantle is to provide a great performance boost over high-level APIs like D3D and OpenGL, if only for AMD graphics cards. -
Microsoft doesn't have to support it. They already have low-level hardware support. They don't need Mantle. Why would they want to support Mantle when they control game development on the Xbox and PC with DirectX? If I were Microsoft, I wouldn't.
-
Consoles have always relied on low-level APIs to squeeze the most performance out of their weak hardware. You'd be daft to think that the Xbox One won't run games like poo if it relied just on DirectX 11.2. And how does Microsoft already have "low-level hardware support"? Last I checked, the Xbox One is running a semi-custom AMD APU. Only AMD has this level of low-level access, because it's their own hardware they're programming for. Mantle is not about Microsoft; it's about achieving parity between PC and console development, something developers have been asking for for years and which wasn't possible before the consoles became x86. It should decrease development costs and make games much easier to port between platforms, as well as increase performance for AMD GPU owners.
-
"AMD has an interesting opportunity with Mantle because of their dual console wins, but I doubt Sony and MS will be very helpful," tweeted John Carmack, before adding, "Considering the boost Mantle could give to a Steambox, MS and Sony may wind up being downright hostile to it."