Well then he has no excuse. It is a ridiculous direction he wants to take things. If all his experience tells him "let's toss away 20 years of progress because I can't get the code to run fast enough", well, good for him. I'm sure AMD can help us all by writing a driver of their own that bypasses OpenGL and DirectX, yes? And that this driver will then be picked up by the top 10 or 20 developers around the world, yes?
Don't hold your breath for that one.
-
And what most people don't know is: GPUs have chip-level defects. The driver can correct those defects at the software level, which makes them transparent to most devs.
Console chips have the same defects, but since the chip never changes, devs know what "not to do" with console chips.
With such a gigantic variety of configurations in the PC world, I can't imagine anyone wanting to get rid of a standard API. -
C/C++ are both compiled languages. It makes no sense to say that it should "work on the GPU, natively". C/C++ both need to be compiled into assembly before it will run on anything. A compiler translates the language to the native assembly for the hardware (in the form of a binary or executable) which is then run.
The point of an API is to allow you to code in C/C++ and have your code work on a GPU without specifying the exact architecture that you're working with. Now, I can see someone arguing that DirectX is a poor API - I don't know enough about it myself to make an informed decision - but to say that we should get rid of APIs for graphics programming, I just don't see how that's possible.
In short, most games are already coded in C/C++. The only way that same code can run on every GPU configuration out there is through a unified API.
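To make that concrete, here's a bare-bones OpenGL sketch (my own illustration, nothing from the article): the exact same code runs on an NVIDIA, AMD, or Intel GPU, because the driver, not the game, translates it into each chip's native command format.

```cpp
// Sketch only: classic fixed-function-era OpenGL calls.
// The game describes WHAT to draw; the vendor's driver decides HOW
// the hardware actually executes it.
#include <GL/gl.h>

void drawMesh(const float* vertices, int vertexCount)
{
    glVertexPointer(3, GL_FLOAT, 0, vertices);   // describe vertex layout
    glEnableClientState(GL_VERTEX_ARRAY);
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);  // driver turns this into
                                                 // chip-specific commands
    glDisableClientState(GL_VERTEX_ARRAY);
}
```

Ditch the API and somebody has to write that translation layer themselves, once per GPU architecture.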
(And by the way, what the heck is "rendering code"?) -
-
H.A.L. 9000 Occam's Chainsaw
I still don't see it happening in the immediate future. -
Interesting article. It would be nice if, as pointed out in the article, a robust yet flexible enough API could be developed. However, I could not understand this statement:
"On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call. "
Do you guys know how that is possible? Do consoles have some 'magical tricks' behind the scenes, or is some information missing (like 10-20k before shaders and/or lighting, etc.)?
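For reference, here's a rough sketch (all helper names are made up by me) of what I understand "a separate batch for each draw call" to mean: every draw is a separate trip through the API and driver, and that overhead is what caps the count.

```cpp
#include <vector>

// Hypothetical stand-ins for mesh data and driver entry points,
// purely to illustrate the cost model, not any real API.
struct Mesh { /* vertex/index buffers would live here */ };
struct Chunk { Mesh mesh; float transform[16]; };

void setTransform(const float* m) { /* state change: a trip into the driver */ }
void drawCall(const Mesh& mesh)   { /* draw: another trip into the driver */ }
Mesh mergeChunks(const std::vector<Chunk>& chunks) { return Mesh(); }

// Naive: every chunk pays the per-draw-call overhead, which is what
// supposedly limits a PC to ~2-3,000 chunks per frame.
void drawNaive(const std::vector<Chunk>& chunks)
{
    for (size_t i = 0; i < chunks.size(); ++i) {
        setTransform(chunks[i].transform);
        drawCall(chunks[i].mesh);
    }
}

// Batched: merge chunks that share state so the overhead is paid once.
void drawBatched(const std::vector<Chunk>& chunks)
{
    Mesh merged = mergeChunks(chunks);  // ideally done once, at load time
    drawCall(merged);                   // a single trip through the driver
}
```

If consoles really let devs write command buffers more directly, that per-call cost would be much smaller there - is that the whole story? -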
-
Thanks. -
We need to back up and rethink how we program 3D graphics, back to before we started down and got stuck on the polygon road. We need to go back to the common denominator that existed before the polygon-pushing technology of the GPU: the CPU. We have now been given a different method of doing 3D graphics that doesn't require the complications of different GPU architectures and system setups. This new way is a promising 3D engine called the Unlimited Detail point cloud algorithm.
We just need the videogame industry not to be lazy and resist learning a new way of programming 3D videogames. You see, that catch phrase Mr. Huddy used, the ability to "program direct to metal", was also used back in 1994 when both the Sega Saturn and the Sony PlayStation 1 were released.
To those who aren't old enough to have followed the next-generation videogame console wars of 1994, I'll give you a brief account of what took place and how we got into the situation we're in today...
In 1994, the GPU was in its infancy, and along with the PC, this brand new technology would also be used in the very first so-called "next generation" of videogame consoles, more specifically in the Sony PlayStation 1, as Sega instead opted to use 2 powerful RISC CPUs aided by 2 VDPs.
This decision on Sega's part to use an architecture based on supercomputing and massively parallel processing was to be the downfall of Sega's 32-bit videogame console.
The Sega Saturn's architecture proved to be "ahead of its time", since game developers had to relearn how to program using 4 processors running in parallel in order to achieve substantial results in their games. This seemed too complicated for the lazy development community, and so the game industry instead gave its full support to the much simpler development platform of the Sony PlayStation 1 architecture, which set the CPU-plus-GPU standard that we still follow today.
The marketing for the PlayStation won the hearts and minds of the average gamer, who didn't really understand the supercomputing and parallel processing offered in the Sega Saturn. The PlayStation, having hit critical mass and taken over the majority of the videogame console market, drowned out the potential of the Sega Saturn, so all who were left to develop games for the Saturn were Sega's in-house development teams.
Luckily Sega had the best development teams in the world, such as AM2 and Sonic Team, who vowed to show the true potential of the Sega system by programming "directly to metal", and they came out with stunning videogames with graphics that could not be replicated on the PlayStation. If history had gone differently and other game developers hadn't abandoned the Saturn, but had instead taken a shot and put effort into learning how to program "directly to metal" on the Sega system, we would probably have many different and innovative methods of programming 3D videogames today, especially now that Intel and AMD have followed Sega's lead by implementing parallel processing in their new lines of dual- and quad-core CPUs.
The only way now, besides what Mr. Huddy is proposing, is to use a system that doesn't prefer any one GPU over another, but instead utilizes what is most common and standard in every PC: the CPU. My understanding is that the Unlimited Detail point cloud algorithm is this savior system for all of us PC gamers. -
1) There is a massive difference between PS3 and PC output: resolution (720p vs 1080p with AA and goodies). And if you compare 720p to 720p, PC framerate (assuming you have a $1500 desktop) will be 4-5x the console's.
2) The 360 and PS3 both use APIs.
3) Imagine the stability issues if we got rid of PC APIs... the average game programmer doesn't even work with the type of logic/code that would be needed to program for low-level processing. It would hugely increase game costs by requiring bigger teams.
4) The cost argument? Really? Again? I spent $1300 on my machine (see sig). The GAMING components on my machine cost about as much as a PS3. Those people with a PS3 also have a PC (probably without a great graphics card or a fast CPU). If they had NOT bought a PS3, but added the cost of one to the PC they had... they would have a gaming machine.
Summary:
There is no argument here. Saying that a ton of console fanbois can't see the difference between PC and console means nothing. I can. Anybody with objectivity and a trained eye can. Console lovers play consoles because they are probably not too interested in display resolution, framerate, AA, AF, tessellation, driver tweaking, overclocking, etc. Any survey of the ability of console gamers to tell the difference between console and PC would be inherently biased, not only because they like their console, but mostly because their demographic is substantially more n00b than ours. -
masterchef341 The guy from The Notebook
consoles and PCs are using the same types of hardware. the xbox 360 is using a triple core powerpc processor (a normal processor) and an ATI x1900 class GPU (a normal gpu)
none of this has anything to do with CPU rendering, or Unlimited Detail (that's a company that is still seeking funding to work on a CPU rendering algorithm and engine)
this is an issue about hardware optimization, whether the DX API is getting in the way of performance. -
This is where we took a wrong turn in PC 3D graphics history and got stuck on this enclosed API/DX/GPU technology road.
But now we've come to a fork in the road that points in a new, innovative direction. Do we choose to ignore this turn-off (Unlimited Detail) and keep going down the same problematic road (API/DX/GPU upgrades)? -
Sure, you can improve performance without DirectX, but it would segregate the market. Not to mention that if the GPU architecture changed, you would have to write emulators for backwards compatibility, and that would probably reduce performance more than DirectX does.
Look at 3Dfx. Their Voodoo chipset (in cards like Diamond's Monster 3D) was the first true consumer 3D accelerator and used the Glide API, pretty much proprietary to 3Dfx - and even that was still an API, not direct-to-metal coding. Direct to metal means assembly language, the most ridiculously difficult way to program. I did some assembly programming in the wee years of the Commodore 64, Tandy Color Computer, and Commodore Amiga. That was with very simplistic hardware, and it was a pita to program even then. I can't imagine what it'd be like now.
Unlimited Detail rendering is problematic too, in that it has zero development muscle or backing from any major corporation. I don't see Unlimited Detail's tech coming to fruition unless they can prove it's more efficient than DirectX and reduces the workload on programmers. I don't see that happening. -
masterchef341 The guy from The Notebook
It's frustrating because you don't understand computer graphics, and you completely fell for Euclideon's "unlimited detail" speech, which is straight-up advertising, all because you are vulnerable, all because you are bothered that your GPU can't run Battlefield: Bad Company 2 at 32x AA. It's nuts. It doesn't make any sense to keep pushing someone else's private, unreleased, unfunded, maybe-one-day-could-be technology so hard. -
Yeah, let's stop while this thread is still open. Having it closed because you can't stop pontificating about polygons and unlimited detail would be disappointing.
-
-
jacob808: Do yourself a favor and stop interjecting in technical threads. You have no clue what you're talking about. You're like the guy going into an automobile forum and insisting that everyone should switch to spaceships. A lot of us here have actually dabbled in DirectX/OpenGL. You playing games does not equal a developer making games.
-
PUBLIC SERVICE ANNOUNCEMENT:
The ignore feature is there just for these sorts of emergencies.
Use it. It's free. It's fun. And most importantly, it preserves sanity. -
masterchef341 The guy from The Notebook
must... find... ignore button... where is it?
-
It's under the control panel; one of the options on the left sidebar has the buddy/ignore list. It's a simple matter of adding the username in the applicable area.
Back on topic: I don't really get the whole point behind the AMD manager's argument. There are only so many ways you can program games for PCs, and DirectX and OpenGL are the APIs most people are familiar with. If PC gaming quality is really so poor, then it's quite possible that developers are being constrained by something other than the tools they have to work with. -
Look at recent PC games. How many of them are being developed from the ground up on PC using D3D10+?
Probably fewer than 3, or even 0.
Most recent PC games are console ports which use the old D3D9. And we all know a post-release D3D10+ patch doesn't mean better performance and/or image quality.
So it's the developers who chose to develop on D3D9, and then they complain about too much D3D overhead on the PC platform?
And even with console ports we see a lot of bugs and performance issues on PC, whose hardware is supposed to be at least 5x better than the current-gen consoles.
If those devs can't even get D3D9 PC ports right, how would you expect them to get code-to-metal right?
I seriously don't get this "blame DirectX" argument. -
-
Speaking from the outside, I'm not privy to all of the factors taken into consideration when developing and publishing games. What I do know is that blaming the current state of the PC gaming industry on the development tools is like putting the cart before the horse. -
Apologies if I am being too repetitive. -
I think we're forgetting that most games are developed on an engine (Unreal, Gamebryo, CryEngine) and not straight DirectX. That's another level of indirection to consider.
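Roughly, the layering looks like this (a minimal sketch with made-up names, just to show the idea):

```cpp
// Gameplay code talks to the engine's renderer interface; only the
// engine's backends ever touch D3D or OpenGL directly.
struct Mesh { /* geometry handle */ };

class Renderer {
public:
    virtual ~Renderer() {}
    virtual void drawMesh(const Mesh& mesh) = 0;
};

class D3D9Renderer : public Renderer {
public:
    virtual void drawMesh(const Mesh& mesh) { /* IDirect3DDevice9 calls */ }
};

class GLRenderer : public Renderer {
public:
    virtual void drawMesh(const Mesh& mesh) { /* glDrawElements etc. */ }
};
```

So any per-call API overhead sits two layers below the game code, and the engine decides how well (or badly) draw calls get batched.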
-
Good point, although I'm not sure if the passivity involved is simply a matter of developers deciding to use a cost-effective solution that's already available, rather than spending the time and resources to build an engine from scratch.
Edit: Looking at the reply below mine, perhaps the AMD manager should be blaming the overreliance on established game engines in the PC gaming scene at present rather than the base APIs they were originally built from. -
-
Tripwire Interactive is the only PC-exclusive FPS developer I know of; their games run on the UE2.5 engine, and their next game, ROH2, will be on UE3.
CD Projekt created The Witcher on the Aurora Engine, with the renderer made from the ground up. CD Projekt has already released a great video from the developers of their next engine for The Witcher 2. Does it count if CD Projekt made an engine exclusively for the PC? Maybe AMD will be proven wrong if CD Projekt's DX11 engine, made exclusively for the PC, destroys the performance of the console-port engines?