PC hardware is dozens of times better than console hardware. But console fanboys can continue to frustrate any conversation with a PC gamer with the fact that BC2 in DX9, running on the PS3's 24-core G70 (7800) Nvidia GPU, and BC2 in DX11 on a 512-core GTX 580 behemoth on the PC look almost the same.
Shouldn't graphics on the PC demolish the console versions if the hardware is substantially better?
AMD claims it's the DirectX API that hampers the hardware and prevents game developers from truly taking advantage of it.
Personally, I think it's about $$$: it's far cheaper to just port a game from console to PC knowing that PC gamers will buy it anyway. AMD and I may disagree over this, but somehow I think most people would agree with me. Publishers and game developers care far more about shareholders and investors than they do about gamers.
Farewell to DirectX? | bit-tech.net
Of course, what AMD is suggesting means that developers would then be tailoring game development to specific hardware. With Nvidia being the king of proprietary technology, and the most effective at tailoring game development to its hardware, it is surprising that AMD would look so unfavorably on the API. The API is what ensures that whatever hardware a PC gamer chooses, the game will run on it.
Unless AMD and Nvidia both agree to make identically architected, general-purpose shader units, I really don't see the point in discussing getting rid of the DirectX API. As it is now, even with DirectX 11, some games look different on AMD than on Nvidia: more smoke in COD BLOPS on Nvidia, flickering shadows on AMD in Bulletstorm, etc. The differences in architecture already have serious negative consequences in current games.
As it is, game developers are too lazy to optimize games for the PC after porting them from console, so how does AMD expect developers to ensure that their direct-to-metal games run flawlessly on all AMD hardware (HD 6xxx, HD 5xxx, HD 4xxx) and on Nvidia's 5xx, 4xx, 3xx, and 2xx? Within just the few years that the current consoles have been around, architecture design has changed considerably in both camps, AMD and Nvidia. I think it's ludicrous.
What I suggest instead is that game developers, Nvidia, and AMD pressure Microsoft to rewrite DirectX to let game developers do what they want, now that both AMD and Nvidia have programmable general-purpose processing units. This also means that game developers need to finally say: sorry, we will no longer support older GPUs; you must have Stream or CUDA processors to run our games. Considering how cheap the HD 5770 and the GTS 450/GTX 460 are now, I don't think this is an issue.
-
Heh, noobs. Does AMD feel butt-hurt? lol
Write your own game engine from the hardware up like they did in the DOS age, or [pipe down] and take whatever M$ gives you. LOL
No one forces game developers to use DX... -
As it is, neither AMD, Nvidia, nor Microsoft can offer gamers a better experience with DX11 and drastically superior hardware. The PlayStation 3 continues to get great exclusive games, and even with DICE's flagship BC2, with all the advertising of special PC treatment and DX11 hype, few people can tell the difference between the PS3 and a DX11 PC running a GTX 580. -
I can tell the difference between BC2 on the PS3 and on my laptop. When you play them side by side, the PS3 version doesn't look as good and the framerate is lower.
-
DirectX is good as it is now; there won't be a better solution than this standard API.
Also, it's funny to see some highly placed executive comparing those ridiculously low-resolution console games to a much higher-resolution PC game.
/s I'm sure he knows what he's talking about /s -
-
This isn't a debate; just because a few PC gamers here are more discerning doesn't mean the MILLIONS who play on the 360 and PS3 find the differences noticeable enough to buy a PC or upgrade their rig. -
And on the topic: I love the way your logic works. Also, the fact is it's not 720(p/i) vs. 720(p/i). Many PS3/Xbox 360 games don't output at 720p; as a matter of fact, you can't find many games with true 720p rendering. Most of them render at a lower resolution, like 540i, and are upscaled.
Anyway. -
Megacharge Custom User Title
Thanks. -
But here you go, just some examples:
Playstatic Playstation 3 resolution mystery revealed -
I have a PC, and I do game at 1080p with high settings. So it's not just my logic, as you imply. -
I would think that after some 3 years, most people would at least know the topic. But I am wrong. -
I don't really care if you decide to use Google or realize where you are posting. Stay on topic: the topic is AMD claiming that game developers believe the DirectX API is the reason game quality on the PC is lacking. If you think the difference is noticeable and sufficient, then sadly I, along with MILLIONS of PC gamers, have to disagree with you. That's the basic premise of the complaints about console ports: DX9 games that look nearly identical to their console counterparts.
Also, my logic was more about how feasible it would be to do what AMD suggests, the kind of cooperation it would take from AMD and Nvidia (which I think is not feasible), and the amount of work game developers would have to put in to make their games compatible with opposing architectures. I question how legitimate AMD's claim is that game developers are blaming the DX API for their lack of results on the PC. That's what my logic was about. -
There are big differences in choosing, e.g., a GeForce GTX 580 SLI setup or a GTX 460.
Do you know what API you code against on the Xbox 360? On the PS3?
Do you still remember how, some 20 years ago, you had to buy hardware according to the games?
I will stop here. It seems nobody here wants an informed discussion. -
Why have a discussion if you didn't even read the article and the interview that this topic is based on? -
masterchef341 The guy from The Notebook
Game programmers can't reinvent the wheel every time they release a game. DirectX is here to stay. They also use it and OpenGL on the consoles, so I don't get the AMD guy's point.
Yes, if you had infinite development time, you could optimize all your code for the hardware and extract all the performance you can out of it. Obviously no one has that. -
The PlayStation 3 was released in 2006. AMD introduced the stream processor in 2006. Calling security to escort me out of the building for something game developers do all the time... right.
DICE has already done this with Battlefield 3: DX9 is not supported on the PC. And while the game supports DX10, it will definitely emphasize DX11, which requires at minimum an HD 5770/GTS 450/GTX 460. And unless you have at least a GTX 260 or HD 4870, you probably won't be playing BF3 even in DX10. Right, my idea is just insane! Call security and escort DICE out of the building now!
Also, it's obvious you did not read the article either. Why bother having a discussion if you didn't read it? If you had, you would have known that the reason for getting rid of the API and allowing game developers direct-to-metal development was Nvidia's and AMD's programmable shader units! -
No disrespect to the fine and intelligent community here at NBR, but I'm more inclined to believe the top guy at AMD over most of the people here. I own a PS3, my brother owns an Xbox 360, and I have played the Crysis 2 demo on every platform. Sure, the PC version is slightly, but noticeably, better looking than the consoles', but it is nowhere near what it should be given the magnitude of the difference in hardware power. We get a 10-30% (go ahead and be extreme and say 100%) visual improvement over consoles when the hardware is over 2133% more powerful.
Honestly though, if the consoles hadn't decided to use DirectX, would PC gaming today even exist? -
Web gaming.
-
Crysis 2 was built with the consoles in mind first, and as a result it looks worse than Crysis 1 on very high settings. I'm not even sure Crysis 2 will be as tweakable as Crysis 1 is. I am not impressed with Crysis 2's graphics at all; it is really noticeable that it was built for consoles, hence the small levels and restricted freedom of movement.
BC2 looks far better on a PC than on the console, even at the same low 720p resolution. We are talking DX11 and DX10 effects here over its console counterpart. Throw in 16x AF and some AA and you leave the console behind, like you always do with multiplats. Just look at Bulletstorm: you don't have the high-quality textures on the consoles, nor the high-quality post-processing.
Bad Company 2 runs with medium textures on the consoles. The high-res textures look twice as good as the consoles' medium settings. -
For me, I'd say DirectX is good enough. The quality is great and seriously way better than consoles any day. You could say that cost-wise it's not worth it, but I'd rather lug a laptop around than an Xbox + TV.
-
I'm amazed no one is actually addressing what the article was talking about.
The issue wasn't 16x AF, or 8x AA, or whether you prefer to carry a laptop instead of a 360 and a TV.
Anyway, there is no point to this conversation if no one actually reads what AMD has to say about the state of PC game development versus the level of hardware that is available. -
Well, I think when a new console generation gets released, people will really use their GPUs 100%. As it is now, I think it's pretty good anyway; my laptop can still run 99% of games at 1920x1200, even more than 3 years after I bought it. So I am all happy.
-
Games on the PC DO look better, no?
Outside of ports from consoles, of course, and even then you can bump up the resolution as well as play at 60 fps. -
masterchef341 The guy from The Notebook
I read it. The problem is with the AMD head's expectation. Rendering at high resolution has an approximately linear computational cost, but it doesn't increase perceived quality linearly.
In other words, playing a game at 1920x1080 vs. 1280x720 takes roughly double the computational power to render at an equivalent frame rate, all else being equal. However, you would be hard pressed to say the 1920x1080 game looks TWICE as good (keep in mind I'm talking about running both at native resolution on otherwise equivalent screens). It would look almost exactly the same: the 1080p image would be crisper and slightly better defined, but the difference is subtle. That doesn't change the fact that it takes 2x the power to do it.
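For concreteness, here's the raw pixel arithmetic behind that claim; a quick back-of-the-envelope sketch, nothing vendor-specific:

```cpp
#include <iostream>

int main() {
    // Pixels shaded per frame at each resolution.
    const long p720  = 1280L * 720;   //   921,600 pixels
    const long p1080 = 1920L * 1080;  // 2,073,600 pixels

    // At a fixed frame rate, fill cost scales roughly linearly with pixel count.
    std::cout << "1080p / 720p pixel ratio: "
              << static_cast<double>(p1080) / p720 << "\n";  // prints 2.25
    return 0;
}
```

So 1080p is actually 2.25x the pixels of 720p: more than double the work for a difference most viewers would call subtle.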
Then factor in that most people play console games on relatively large screens from a distance, vs. up close on smaller screens for the PC. That has a huge effect on perceived quality, especially compared to a 2x resolution bump. There are other factors at play here too, and basically, the particular AMD guy who made that quote doesn't know which way is up.
I also don't think an average gaming PC is 20x+ as powerful as a console. Basically we are dealing with a 3 GHz triple-core IBM CPU in the Xbox, and a specialized processor in the PS3 that is a little hard to compare, but let's ignore the PS3 for a minute. If we are calling ourselves 20 times as powerful as an Xbox, that would mean a 60-core 3 GHz IBM processor and a render farm of twenty 7800 GTs. That is some crazy spec nonsense. I would say that in graphics, your average gaming PC with an 8800 GT, a GTX 460, a 5850, or something in that range is probably 4-5 times as powerful as the box. The CPU is also maybe 4 times as powerful. Regardless, the point is that if you expect a linear quality increase from a linear power increase, you have already jumped off the deep end. -
To summarize, what Huddy said means either:
1. He doesn't know what he's talking about.
2. There's some kind of agenda behind this.
I tend to believe it's #2. -
In response to what masterchef341 said about resolution: the odd thing about this is that consoles tend to be played on screens much larger than a PC screen. Granted, with a PC you're a lot closer to the screen, but 720p on a 42" screen looks horrid compared to 1080p on a 24" monitor.
-
First off, how did this thread become a PC vs. console flamewar?
Now on topic:
APIs exist for a reason. An API is a uniform way for a developer to take advantage of hardware without knowing the specifics of that hardware. While it's true that programming "direct-to-metal" would yield performance gains, it is also true that programming in assembly would probably result in a faster program than choosing C++ or C#. But there's a reason no one writes programs in assembly. First, it's extremely laborious: you need to know your hardware intimately to be successful. Second, there's the lack of portability: at that low a level, a program written for one piece of hardware has no guarantee of working on different hardware. I definitely do not want to make a game once for your Radeon HD 6870 and have to make it again for your GTX 460.
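To illustrate what the abstraction buys you, here's a minimal sketch in C++ (the class names are hypothetical, not any real driver interface): the game codes against one interface, and each vendor supplies its own implementation underneath.

```cpp
#include <iostream>
#include <memory>

// What the game programmer codes against: one uniform interface.
class GpuDevice {
public:
    virtual ~GpuDevice() = default;
    virtual void drawTriangles(int count) = 0;
};

// Hypothetical vendor back ends; in reality this logic lives in the driver.
class RadeonDevice : public GpuDevice {
public:
    void drawTriangles(int count) override {
        std::cout << "Radeon path: drawing " << count << " triangles\n";
    }
};

class GeForceDevice : public GpuDevice {
public:
    void drawTriangles(int count) override {
        std::cout << "GeForce path: drawing " << count << " triangles\n";
    }
};

int main() {
    // The game neither knows nor cares which GPU is underneath.
    std::unique_ptr<GpuDevice> gpu = std::make_unique<RadeonDevice>();
    gpu->drawTriangles(1000000);
    return 0;
}
```

Going "direct-to-metal" means deleting that interface and writing the Radeon path and the GeForce path yourself, once per architecture you intend to support.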
APIs exist in all areas of programming. It's the only way developers can focus on their programs instead of learning a brand new strategy for every piece of hardware that comes out. -
masterchef341 The guy from The Notebook
The argument goes:
1. a normal gaming PC has maybe about 10x the horsepower of a console
2. when we take the same game and render it at a higher resolution on the PC, it doesn't even look 10 times as good
3. therefore, DirectX is evil? (no comment on OpenGL, which ultimately provides comparable functionality)
The problems with this logic:
- a lot of console games are rendered at 720p (about half the resolution) or less and run at half (or less) the frame rate. That accounts for at least a 4x performance difference right off the bat, ignoring the additional effects, post-processing, and texture resolution or detail options that may exist only in the PC version.
- taking a game and doubling the resolution doesn't make it look twice as good. Everything will be slightly sharper and more detailed, but the difference is subtle. That doesn't change the fact that it takes 2x the power to render double the res (1920x1080 is about double 1280x720).
- even for PC-only games, you aren't going to push 10 times ahead in observed graphical quality with 10x the hardware power. There are serious diminishing returns in resolution, special effects, etc. as you increase hardware power. And even though the hardware might be 10x as powerful, the target resolution is at least double and the target frame rate for high-end hardware might also be double. That leaves you with maybe 3x the power of the console. What do you do with that? Increase the draw distance of a console game by 1.7x and you are out of performance room. Add a few extra lights to a scene and you are definitely out. And a few extra lights won't make the game look twice as good; neither will a slightly larger draw distance. The disproportionate expectation is the only issue here.
And the biggest problem?
- consoles also use APIs, including a variant of DX9 on the Xbox and OpenGL on the PS3 -
-
Kade Storm The Devil's Advocate
Consoles do last longer, with subtle but steady improvements in software.
-
One of the differences between consoles and computers regarding how games are made is this:
With consoles, developers focus on increasing performance and getting the most out of the hardware, pushing it exclusively for that fixed combination of components. Thus everyone has their own code and engine, and there is a vast difference in quality between games.
On the PC, hardware is not a constant; software becomes the "constant." So you develop an engine and ensure it can be played on a vast combination of hardware, with different architectures even within the same vendor (generational architecture changes), etc. Developers end up optimizing for running on several machines instead of tailoring to a single combination of components and squeezing the best performance out of it.
Frankly, I am impressed with the visuals the PS3 and Xbox 360 can muster with their limited hardware, so I can only imagine what the next gen can bring, considering that the hardware will be massively more powerful.
Unless hardware components become standardized, a console-like approach won't work on the PC, and thus it will remain in the "poor quality" area. Of course, with standardized hardware, AMD and Nvidia wouldn't exactly exist. -
On consoles you can do direct-to-metal development with 20,000 draw calls, vs. 3,000 on the PC because of the API overhead. The whole article discusses how the DirectX API has a massive performance overhead, and how DX11 multi-threading still improves that by only a factor of 2.
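To make those numbers concrete, here's a toy model of per-call overhead; the microsecond costs are assumptions for illustration, not measurements from the article:

```cpp
#include <iostream>

int main() {
    // Suppose each draw call costs ~10 microseconds of CPU time going
    // through the API/driver stack, versus ~1.5 microseconds when the
    // engine talks to the hardware directly (illustrative numbers only).
    const double usPerCallApi   = 10.0;
    const double usPerCallMetal = 1.5;
    const double frameBudgetUs  = 33333.0;  // one frame at 30 fps

    std::cout << "Max draw calls/frame through the API: "
              << static_cast<int>(frameBudgetUs / usPerCallApi) << "\n";   // ~3,333
    std::cout << "Max draw calls/frame direct-to-metal: "
              << static_cast<int>(frameBudgetUs / usPerCallMetal) << "\n"; // ~22,222
    return 0;
}
```

With per-call costs in that ballpark, you land right around the 3,000 vs. 20,000 figures the article quotes: the GPU isn't the limit, the CPU-side cost of each API call is.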
Read the article, all your points were addressed as Lithus said. -
Don't they already have cross-platform software for coding on the Xbox 360 and PS3? I mean, this would require 100% unique code for both, which I'm sure developers wouldn't want to write.
-
As far as I'm concerned, this person must be pretty young, and quite naive as a manager. We had exactly what he proposes 20 years ago, and it was a disaster. Every card had different code written for it, and manufacturers had their "camp" of developers who wrote only for certain cards. Every new generation of chips invalidated the software that came before it, and every consumer moaned about whether they'd picked the "right" hardware. Of course AMD would love to return to this, just like Nvidia, so they could battle it out, exclude competitors other than the two of them, and lock consumers into their hardware via the software route.
Throwing away several decades of development is not the right way to go. AMD can't write drivers without bugs as it is; can you imagine them trying to get all their cards right on every version of software written for them? Heck, you'd end up with a driver for each program out there, no more one driver yielding improvements for all software.
The goal should be to improve the DirectX driver and the resulting compilers (compilers are likely much of the issue here) so that standard routines get closer to to-the-metal results. Standardized interfaces assure everyone that they can run a program with much less regard for the hardware they bought, and they encourage reasonable hardware competition without the proprietary garbage that is a plague on this industry. If you can't gain enough performance from the standard, you improve the standard; you don't ditch it in favor of anarchic software standards. Comparing the wide hardware industry to a couple of locked-in, cheap-as-you-can-get consoles, and assuming the same standards apply, is plain ignorant for an industry leader. -
@FXi:
The dude you are talking about has been coding and involved in game development since the early 1980s. He has been influential in the development of DirectX since 1996, first with 3DLabs, then for 4 years as developer relations manager at Nvidia, and then at ATi for 6 years. He was headhunted by both Nvidia and ATi; they asked him to work for them, not the other way around. I guess you would then say Nvidia and ATi are young and naive also?
It's his job to be technically proficient enough to tell developers everything ATi and DirectX 11 have to offer, in fine detail.
He is neither young nor naive. If anyone knows what he is talking about, it would be Richard Huddy. Whether you agree with him or not is a different story.
But he does seem contradictory, since he spent almost 2 years shouting about how exciting and awesome DirectX 11 is, so I am surprised that he is being so public with this. After almost 2 decades of hand-in-hand relations with MS and DirectX, this is certainly surprising.
But then, if any game developer wants cooperation from AMD, the person they talk to is Richard Huddy. So I doubt he's lying when he says game developers have asked him to make the API go away. I also doubt DirectX is going away, but it will be interesting to see what its future looks like if game developers truly do want to get rid of the overhead costs of DX11. -
I'm pretty sure everyone is reading this wrong. The AMD manager is not asking for assembly-level coding; instead he wants C and C++ to run natively on the GPU. Of course this would require building the rendering code for each GPU architecture, but I think that is a low price to pay (in fact, the rendering code could be compiled the first time it runs on each system, so that the source itself would be sent to the GPU and compiled there rather than shipping a compiled binary). I'm pretty sure this is the way things will go anyway. Coding close to the assembly level in a common language across multiple GPU architectures would improve performance and retain the compatibility we have now. There would be no "API" per se, but rather a system for getting your code to run on massively parallel architectures.
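Something like that compile-on-first-run model already exists for shaders: GLSL source ships with the game, and the driver compiles it at run time for whatever GPU is installed. A minimal sketch in C++ (assumes a GL context and function loader are already set up; error handling trimmed):

```cpp
#include <GL/gl.h>  // core OpenGL declarations; context/loader setup omitted

// The shader ships as plain source text, not as a compiled binary.
const char* kVertexSrc =
    "#version 330 core\n"
    "layout(location = 0) in vec3 pos;\n"
    "void main() { gl_Position = vec4(pos, 1.0); }\n";

GLuint compileVertexShader() {
    // The driver compiles this for the GPU actually present, so the same
    // source runs on AMD, Nvidia, or Intel hardware.
    GLuint shader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(shader, 1, &kVertexSrc, nullptr);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    return ok == GL_TRUE ? shader : 0;  // 0 signals a compile failure
}
```

The idea above would extend that model from individual shaders to the whole rendering codebase.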
-
Given the financial and time constraints (and lazy programmers), do you seriously believe developing "to-the-metal" is the way? And that a typical PC game dev would have fewer problems?
Don't forget the API is another thing that makes our systems more stable: when a game crashes, the crash doesn't bring down the whole system. Remember games from the MS-DOS era?
Also, don't forget that more and more architectures are coming our way. Windows 8 can run on ARM. Are you sure it's a low price to pay to program for each architecture?
It is not DirectX that causes poor PC quality. It's the finances (read: shareholders, profits, etc.) that cause poor PC quality.
P.S. With DirectX 11 you can do MUCH MORE than you think if you build a game from the ground up with it.