The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    AMD Blames DirectX for Poor PC Quality

    Discussion in 'Gaming (Software and Graphics Cards)' started by mushishi, Mar 18, 2011.

  1. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    PC hardware is dozens of times better than consoles. But console fanboys can still frustrate any conversation with a PC gamer by pointing out that BC2 in DX9, running on the PS3's 24-core G70 (7800) Nvidia GPU, and BC2 in DX11 on a 512-core GTX 580 behemoth look almost the same.

    Shouldn't graphics on the PC demolish the consoles if the hardware is substantially better?

    AMD claims it's the DirectX API that hampers the hardware and prevents game developers from truly taking advantage of it.

    Personally, I think it's about $$$: it's far cheaper to just port a game from console to PC knowing that PC gamers will buy it anyway. AMD and I may disagree over this, but somehow I think most people would agree with me. Publishers and game developers care far more about shareholders and investors than they do about the gamers.

    Farewell to DirectX? | bit-tech.net

    Of course, what AMD is suggesting means that developers would then be tailoring game development to specific hardware. With Nvidia being the king of proprietary technology and the most effective at getting games tailored to its hardware, it is surprising that AMD would look so unfavorably on the API. The API is what ensures that whatever hardware a PC gamer chooses, the game will run on it.

    Unless AMD and Nvidia both agree to build general-purpose shader units with the same architectural design, I really don't see the point in discussing getting rid of the DirectX API. As it is now, even with DirectX 11, some games look different on AMD than on Nvidia: more smoke in COD BLOPS on Nvidia, flickering shadows on AMD in Bulletstorm, etc. The differences in architecture already have serious negative consequences in current games.

    As it is, game developers are too lazy to optimize games for PC after porting them from console, so how does AMD expect developers to ensure that their direct-to-metal games run flawlessly on all AMD hardware (HD 6xxx, HD 5xxx, HD 4xxx) and on Nvidia's 5xx, 4xx, 3xx, and 2xx? Within just the few years the current consoles have been around, architecture design has changed considerably in both camps, AMD and Nvidia. I think it's ludicrous.

    What I suggest instead is that game developers, Nvidia, and AMD pressure Microsoft to rewrite DirectX to let developers do what they want, now that both AMD and Nvidia have programmable general-purpose processing units. This also means that game developers finally need to say: sorry, we will no longer support older GPUs; you must have Stream or CUDA processors to run our games. Considering how cheap the HD 5770 and GTS 450/GTX 460 are now, I don't think this is an issue.
     
  2. sama98b

    sama98b Notebook Evangelist

    Reputations:
    40
    Messages:
    435
    Likes Received:
    0
    Trophy Points:
    30
    I love how they compare the PC at max resolution and highest quality to the console's defaults :p he he, noobs

    Does AMD feel butthurt? lol :p

    Write your own game engine from the hardware up like they did in the DOS age... or [pipe down] and take whatever M$ gives you. LOL
    No one forces game developers to use DX...
     
  3. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    Of course AMD and Nvidia are both butthurt. Microsoft should care too.

    As it is, neither AMD, Nvidia, nor Microsoft can offer gamers a better experience even with DX11 and drastically superior hardware. The PlayStation 3 continues to have great exclusive games, and even with DICE's flagship BC2, with all the advertising of special PC treatment and DX11 hype, few people can tell the difference between the PS3 version and the DX11 PC version running on a GTX 580.
     
  4. xxERIKxx

    xxERIKxx Notebook Deity

    Reputations:
    159
    Messages:
    1,488
    Likes Received:
    0
    Trophy Points:
    55
    I can tell the difference between BC2 on the PS3 and on my laptop. When you play them side by side, the PS3 doesn't look as good and the framerate is lower.
     
  5. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    DirectX is good as it is now. There isn't going to be a better solution than a standard API.

    Also, it's funny to see some highly placed employee comparing ridiculously low-resolution console games to much higher-resolution PC games.


    /s I'm sure he knows what he's talking about /s
     
  6. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    You would be among the few.
     
  7. xxERIKxx

    xxERIKxx Notebook Deity

    Reputations:
    159
    Messages:
    1,488
    Likes Received:
    0
    Trophy Points:
    55
    Me and all my friends who watched it side by side.
     
  8. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    Do you actually know the final output resolution of the PS3? Of the Xbox 360?
     
  9. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    What are you yelling at me for? The fact is, at 720 vs. 720, most gamers can't tell the difference between PC and console. There were articles upon articles when BC2 was released; it was a huge issue that the PS3 looked nearly as good as the PC.

    This isn't a debate. Just because a few PC gamers here are more discerning doesn't mean the MILLIONS who play on the 360 and PS3 find the differences noticeable enough to buy a PC or upgrade their rig.
     
  10. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    I am not yelling at you. If you think a properly capitalized sentence equals yelling, you might want to loosen up.

    And on the topic: I love the way your logic works. Also, the fact is it's not 720(p/i) vs. 720(p/i). Many PS3/Xbox 360 games don't output at 720p; as a matter of fact, you can't find many games with true 720p rendering. Most of them render at a low resolution like 540i and are upscaled.

    Anyway.
     
  11. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    Not calling you a liar or anything, so don't take this the wrong way, but some links would be nice, and very informative.

    Thanks.
     
  12. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    A simple Google search will give you results.

    But here you are, just some examples:

    Playstatic Playstation 3 resolution mystery revealed
     
  13. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    And none of this matters at all. Absolutely none of it. PlayStation 3 users still don't see any compelling reason to rush to the store and spend $1,500 on a gaming PC rig, and neither do I. And it's not my reasoning; I'm not sure how you came to that conclusion. Look at the statements from AMD and the game developers. The logic is there. Look at AMD's and Nvidia's sales: the fact is, the majority of AMD's revenue is not from gamers but from mobile sales and from the cheap computers you buy at Best Buy with cheap AMD graphics in them. AMD even has an OEM-only lineup of GPUs specifically for that now.

    I have a PC and I do game at 1080p with high settings, so it's not just my logic, as you imply.
     
  14. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    It's not that I couldn't look for it; it's more that if you're going to make that statement, you should back it up for the benefit of the thread. Thanks for adding a link for everyone.
     
  15. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    That has been discussed since 2008.

    I would think that after some 3 years most people would at least know the topic. But I am wrong.
     
  16. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    I have a PS3 collecting dust on my TV table. The only time I use it is once in a while when my girlfriend and I play the odd game or watch movies on it. I prefer gaming on my PC.
     
  17. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    Show me the components for a $1,500 gaming rig.
     
  18. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    He said a $1,500 gaming rig, not $1,500 components. RCFTW
     
  19. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    As you said above, use Google. ROFL, sorry, but that's hilarious. Plenty of NBR members here have gaming machines that exceed twice that $1,500 price tag. Hilarious.

    I don't really care if you decide to use Google or realize where you are posting. Stay on topic: the topic is AMD claiming that game developers believe the DirectX API is the reason game quality on PC is lacking. If you think the difference is noticeable and sufficient, then sadly I, along with MILLIONS of PC gamers, have to disagree with you. That's the basic premise of the complaints about console ports... DX9 games that look nearly identical to their console counterparts.

    Also, my logic was more about how feasible it would be to do what AMD suggests, the kind of cooperation it would take from AMD and Nvidia (which I think is not feasible), and the work game developers would have to put in to make their games compatible with competing architectures. I question how legitimate AMD's claim is that game developers are blaming the DX API for their lack of results on PC. That's what my logic was about.
     
  20. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    A gaming rig has components. Maybe I wasn't clear: I was asking what components he'd choose for the $1,500 gaming rig.

    There's a big difference in the choice, e.g., GeForce GTX 580 SLI versus a GTX 460.

    Thanks for making claims like that without backing them up.

    Do you know what API you code against for the Xbox 360? For the PS3?

    Do you still remember how, some 20 years ago, you had to buy hardware according to the games?

    I will stop here. It seems nobody here wants an informed discussion.
     
  21. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    It seems you didn't even read the article, so it doesn't matter, since the article addresses your complaint. If you had read the article, you would have known that.

    Why have a discussion if you didn't even read the article and the interview this topic is based on?
     
  22. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Game programmers can't reinvent the wheel every time they release a game. DirectX is here to stay. They also use it (and OpenGL) on the consoles, so I don't get the AMD guy's point.

    Yes, if you have infinite development time, you can optimize all your code for the hardware and extract all the performance you can out of it. Obviously, no one has that.
     
  23. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    You do that as a game developer, and a publisher would laugh you out of the office after calling security to escort you from the building.
     
  24. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    Umm, it happens every generation. That's why there is always a minimum requirement on the box, on the website, and so on. And of course, Killzone 3 was made to run on the PlayStation 2, right?

    The PlayStation 3 released in 2006. AMD introduced stream processors in 2006. Calling security to escort someone out of the building for something game developers do all the time... right.

    DICE already did this with Battlefield 3. DX9 is not supported on PC, and while the game supports DX10, it will definitely emphasize DX11, which requires at minimum an HD 5770/GTS 450/GTX 460. And unless you have at least a GTX 260 or HD 4870, you probably won't be playing BF3 even on DX10. Right, my idea is just insane! Call security and escort DICE out of the building now!

    Also, it's obvious you did not read the article either. Why bother having a discussion if you didn't read it? If you had, you would have known that the reason for getting rid of the API and allowing game developers to go direct-to-metal is Nvidia's and AMD's programmable shader units!
     
  25. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    No disrespect to the fine and intelligent community here at NBR, but I'm more inclined to believe the top guy at AMD over most of the people here. I own a PS3, my brother owns an Xbox 360, and I have played the Crysis 2 demo on every platform. Sure, the PC is slightly, but noticeably, better looking than the consoles, but it is nowhere near what it should be for the magnitude of difference in hardware power. We get a 10-30% (go ahead and be extreme and say 100%) visual improvement over consoles when the hardware is over 2133% more powerful.

    Honestly though, if the consoles hadn't decided to use DirectX, would PC gaming even exist today?
     
  26. Bullit

    Bullit Notebook Deity

    Reputations:
    122
    Messages:
    864
    Likes Received:
    9
    Trophy Points:
    31
    Web gaming.
     
  27. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Crysis 2 was built with the consoles in mind first, so it looks worse than Crysis 1 on very high settings. I'm not even sure Crysis 2 will be as tweakable as Crysis 1 is. I am not impressed with Crysis 2's graphics at all; it is really noticeable that it was built for consoles, hence the small levels and the restricted freedom of movement.

    BC2 looks far better on a PC than on the console, even at the same low 720p resolution. We are talking DX11 and DX10 effects here over its console counterpart. Throw in 16x AF and some AA and you leave the console behind, like you always do with multiplats. Just look at Bulletstorm: you don't have the high-quality textures on the consoles, nor the high-quality post-processing.

    Bad Company 2 runs with medium textures on the consoles. High-res textures look twice as good as the consoles' medium settings.
     
  28. TomJG90

    TomJG90 Notebook Evangelist

    Reputations:
    46
    Messages:
    425
    Likes Received:
    0
    Trophy Points:
    30
    For me, I'd say DirectX is good enough. The quality is great and seriously way better than the consoles any day. You could say cost-wise it's not worth it, but I'd rather lug a laptop around than an Xbox + TV :D
     
  29. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    I'm amazed no one is actually on topic about what the article was discussing.

    The issue here wasn't about 16x AF, or 8x AA, or whether you prefer to carry your laptop instead of a 360 and a TV.

    Anyway, there is no point to this conversation if no one actually reads what AMD has to say about the state of PC game development versus the level of hardware that is available.
     
  30. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Well, I think when a new console generation gets released, people will really use their GPUs 100%. As it is now, I think it's pretty good anyway; my laptop can still run 99% of games at 1920x1200 even more than 3 years after I bought it. So I am all happy.
     
  31. Hungry Man

    Hungry Man Notebook Virtuoso

    Reputations:
    661
    Messages:
    2,348
    Likes Received:
    0
    Trophy Points:
    55
    Games on the PC DO look better.

    Outside of ports from consoles, of course, and even then you can bump up the resolution as well as play at 60 fps.
     
  32. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    I read it. The problem is with the AMD head's expectation. Rendering at high resolution has approximately linear computational cost but doesn't increase perceived quality linearly.

    In other words, playing a game at 1920x1080 vs. 1280x720 takes roughly double the computational power to render at an equivalent frame rate, all else being equal. However, you would be hard-pressed to say the 1920x1080 game looks TWICE as good (keep in mind I'm talking about running both at native resolution on otherwise equivalent screens). It would look almost exactly the same. The 1080 image would look crisper and slightly better defined, but the difference is subtle. That doesn't change the fact that it takes 2x the power to do it.

    Then factor in that most people play console games on relatively large screens from a distance vs. up close on smaller screens. That has a huge effect on perceived quality, especially compared to a 2x resolution bump. There are other factors at play here too, and basically, the particular AMD guy who made that quote doesn't know which way is up.

    I also don't think an average gaming PC is 20x+ as powerful as a console. Basically we are dealing with a 3 GHz triple-core IBM processor in the Xbox, and a specialized processor in the PS3 that is a little hard to compare, but let's ignore the PS3 for a minute. If we are calling ourselves 20 times as powerful as an Xbox, that would mean a 60-core 3 GHz IBM processor. That would mean a 20x 7800 GT render farm. That is some crazy spec nonsense. I would say that in graphics, your average gaming PC with an 8800 GT, a GTX 460, a 5850, or something in that range is probably 4-5 times as powerful as the box. The CPU is also maybe 4 times as powerful as the box's. Regardless, the point is that if you expect a linear quality increase with a linear power increase, you have already jumped off the deep end.
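
    (A quick sketch of the pixel arithmetic behind the "roughly double" figure above; a minimal C++ snippet, nothing more. The actual ratio works out to about 2.25x.)

        #include <cstdio>

        int main() {
            // Pixel counts for the two resolutions compared above.
            const long px720  = 1280L * 720;   //   921,600 pixels
            const long px1080 = 1920L * 1080;  // 2,073,600 pixels

            // Rendering cost scales roughly with pixel count (all else equal),
            // so 1080p costs ~2.25x the per-frame shading work of 720p, while
            // the perceived improvement is nowhere near 2.25x.
            std::printf("720p  pixels: %ld\n", px720);
            std::printf("1080p pixels: %ld\n", px1080);
            std::printf("ratio: %.2f\n", static_cast<double>(px1080) / px720);
            return 0;
        }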
     
  33. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    To summarize, what Huddy said means one of two things:

    1. He doesn't know what he's talking about.
    2. There's some kind of agenda behind this.

    I tend to believe it's #2.
     
  34. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    In response to what masterchef341 stated about resolution: the whole odd thing about this is that consoles tend to be played on screens much larger than a PC screen. Albeit on a PC you sit a lot closer to the screen, 720p on a 42" screen looks horrid compared to 1080p on a 24" monitor.
     
  35. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    First off, how did this thread become a PC v. Console flamewar?

    Now on topic:

    APIs exist for a reason. An API is a uniform way for a developer to take advantage of hardware without knowing the specifics of that hardware. While it's true that programming "direct-to-metal" would result in performance gains, it is also true that programming in assembly would probably result in a faster program than if you chose C++ or C#. But there's a reason no one writes programs in assembly. First, it's extremely laborious: you need to know your hardware intimately to be successful. Second, there's the lack of portability: at that low a level, a program written for one piece of hardware has no guarantee of working on different hardware. I definitely do not want to make a game once for your Radeon HD 6870 and have to make it again for your GTX 460.

    APIs exist in all areas of programming. It's the only way developers can focus on their programs instead of learning a brand new strategy for every piece of hardware that comes out.
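
    (To put Lithus's point in code: a minimal, hypothetical C++ sketch of an API as "a uniform way to take advantage of hardware without knowing the specifics." The class and function names below are invented for illustration; they are not real Direct3D or driver code.)

        #include <cstdio>
        #include <memory>

        // The game codes against one abstract interface (the "API"); each
        // vendor/driver supplies its own implementation behind it.
        struct Renderer {
            virtual ~Renderer() = default;
            virtual void drawTriangles(int count) = 0;  // the uniform entry point
        };

        // Each backend is free to do hardware-specific work internally.
        struct RadeonBackend : Renderer {
            void drawTriangles(int count) override { std::printf("Radeon path: %d tris\n", count); }
        };
        struct GeForceBackend : Renderer {
            void drawTriangles(int count) override { std::printf("GeForce path: %d tris\n", count); }
        };

        // The game itself never needs to know which card is installed.
        void renderFrame(Renderer& r) {
            r.drawTriangles(50000);
        }

        int main() {
            std::unique_ptr<Renderer> gpu = std::make_unique<RadeonBackend>();
            renderFrame(*gpu);        // same game code...
            GeForceBackend other;
            renderFrame(other);       // ...runs unchanged on different hardware
            return 0;
        }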
     
  36. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    A predictable outcome of sentimentally driven allegiances that are exploited by each side for self-amusement. Among many other reasons.

    Well put, sir.
     
  37. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    Amazing, someone on topic. Yeah, that's what I thought also: AMD's game-developer-relations man, Huddy, seems to live in a fantasy world, unless Nvidia and AMD are planning on standardizing programmable shader units like AMD/Intel did with x86. Nvidia would probably be very upset about a proposal like that.
     
  38. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    I wasn't really flame-warring console vs. PC. The only reason I even brought it up was because of the AMD guy's painfully flawed logic:

    1. A normal gaming PC maybe has about 10x the horsepower of a console.
    2. When we take the same game and render it at high resolution on the PC, it doesn't even look 10 times as good.
    3. Therefore, DirectX is evil? (No comment on OpenGL, which ultimately provides comparable functionality.)

    problems with this logic:

    - A lot of console games are rendered at 720p (about half) or lower resolution and run at half (or less) the frame rate. That accounts for at least a 4x performance difference right off the bat, ignoring additional effects, post-processing, or texture resolution and detail options that may exist only in the PC version.
    - Taking a game and then doubling the resolution doesn't make it look twice as good. Everything will be slightly sharper and more detailed, but the difference is subtle. That doesn't change the fact that it takes 2x the power to render double the resolution (1920x1080 is about double 1280x720).
    - Even for PC-only games, you aren't going to push 10 times ahead in observed graphical quality with 10x the hardware power. There are serious diminishing returns with regard to resolution, special effects, etc. as you increase hardware power. And even though the hardware might be 10x as powerful, the target resolution is at least double and the target frame rate for high-end hardware might also be double. That leaves you with maybe 3x the power of the console. What do you do with that? Increase the draw distance of a console game by 1.7x and you are out of performance room. Add a few extra lights to a scene and you are definitely out. And a few extra lights won't make the game look twice as good. Neither will a slightly larger draw distance. The disproportionate expectation is the only issue here.

    And the biggest problem?

    - Consoles also use APIs, including a variant of DX9 on the Xbox and OpenGL on the PS3.
     
  39. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    That's addressed in the article:

    With a console, you're guaranteed 5+ years of the same hardware, whereas on computers you might get 6 months.
     
  40. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Consoles do last longer with subtle but steady improvements in software.
     
  41. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    One of the differences between consoles and computers in how games are made is this:

    With consoles, developers focus on increasing performance and getting the most out of the hardware, pushing it exclusively for that fixed combination of components. Thus, everyone has their own code and engine, and there is a vast difference in quality between games.

    On PC, hardware is not a constant; software becomes the "constant." So you develop an engine and ensure it can be played on a vast combination of hardware, with different architectures even within the same vendor (generational architecture changes), etc. Developers end up optimizing for running on many machines instead of tailoring the game to a single combination of components and squeezing the best performance out of it.

    Frankly, I am impressed with the visuals the PS3 and Xbox 360 can muster with their limited hardware. So I can only imagine what the next gen can bring, considering that the hardware is massively more powerful.

    Unless hardware components become standardized, a console-like approach won't work on PC, and thus it will remain in the "poor quality" area. Of course, with standardized hardware... AMD and Nvidia wouldn't exactly exist.
     
  42. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    What about the other problems? What about the fact that PC developers can also do as much hardware optimization as they want? Yes, with the PC there are a lot of platforms to support, but that is inherent to the nature of the PC. If it were a single platform, it would be a console. How does any of this reflect poorly on the availability of a graphics API?
     
  43. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    Are you sure you read the article? Because you couldn't have if you're asking this.

    On consoles you can do direct-to-metal development with 20,000 draw calls, versus 3,000 on the PC because of the API overhead. The whole article discusses how the DirectX API carries a massive performance overhead and how DX11 multi-threading still only improves that by a factor of 2.

    Read the article, all your points were addressed as Lithus said.
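
    (A toy model of the draw-call argument, for anyone who skipped the article: the per-call costs below are invented numbers, not measurements, and the functions are hypothetical. The point is only that a fixed API cost per draw call caps how many calls you can afford per frame, which is why engines batch or instance draws on PC.)

        #include <cstdio>

        // Assumed, invented costs purely for illustration.
        const double kApiCostPerDrawMs   = 0.010; // fixed CPU-side API/driver cost per draw call
        const double kGpuCostPerObjectMs = 0.002; // GPU-side cost per object drawn

        // One draw call per object: the API overhead scales with object count.
        double frameTimeIndividualDraws(int objects) {
            return objects * (kApiCostPerDrawMs + kGpuCostPerObjectMs);
        }

        // One batched/instanced draw call for all objects: the API overhead is paid once.
        double frameTimeBatched(int objects) {
            return kApiCostPerDrawMs + objects * kGpuCostPerObjectMs;
        }

        int main() {
            const int counts[] = {3000, 20000};
            for (int n : counts) {
                std::printf("%6d objects: %7.1f ms per-object draws, %7.1f ms batched\n",
                            n, frameTimeIndividualDraws(n), frameTimeBatched(n));
            }
            return 0;
        }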
     
  44. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Don't they already have cross-platform software anyhow in order to code for the Xbox 360 and PS3? I mean, this would require 100% unique code for both, which I'm sure developers wouldn't want to write.
     
  45. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    Maybe, but it isn't needed. The PS3 is not DX9, yet UE3 runs on both the PS3 and the 360. UE3 even runs on the iPhone... The engine doesn't have to conform to just one API across all platforms.
     
  46. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    As far as I'm concerned, this person has to be pretty young and quite naive as a manager. We had exactly what he proposes 20 years ago, and it was a disaster. Every card had different code written for it, and manufacturers had their "camp" of developers who wrote only for certain cards. Every new generation of chips invalidated the software that came before it, and every consumer moaned about whether they'd picked the "right" hardware or not. Of course AMD would love to return to this, just like Nvidia, so they could battle it out, exclude competitors other than the two of them, and lock consumers into their hardware via the software route.

    Throwing away several decades of development is not the right way to go. AMD can't write drivers without bugs as it is; can you imagine trying to get all their cards right on every version of software written for them? Heck, you'd have drivers for each program out there, with no more single driver yielding improvements for all software.

    The goal "should" be to improve the DirectX driver and the resulting compilers (compilers are likely much of the issue here) so that you get closer-to-the-metal results from standard routines. Using standardized interfaces assures everyone they can run a program with much less regard for the hardware they bought, and encourages reasonable hardware competition without the proprietary garbage that is a plague on the industry. If you can't gain enough performance from the "standard," you improve the standard, not ditch it in favor of anarchic software standards. Comparing the wide hardware industry to a couple of locked-in, cheap-as-you-can-get consoles and assuming the same standards apply is plain ignorant for an industry leader.
     
  47. mushishi

    mushishi Notebook Consultant

    Reputations:
    137
    Messages:
    238
    Likes Received:
    0
    Trophy Points:
    0
    @FXi:

    The dude you are talking about is someone who has been coding and involved in game development since the early 1980s. He's been influential in the development of DirectX since 1996, first with 3DLabs, then for 4 years as the developer relations manager with Nvidia, and then with ATi for 6 years. He was headhunted by both Nvidia and ATi; they asked him to work for them, not the other way around. I guess you would then be saying Nvidia and ATi are young and naive also?

    It's his job to be technically proficient enough to tell developers everything ATi and DirectX 11 have to offer, in fine detail.

    He is neither young nor naive :D If anyone knows what he is talking about, it would be Richard Huddy. Whether you agree with him or not is a different story.

    But he does seem contradictory, since he spent almost 2 years screaming "be excited for DirectX 11" and "DirectX 11 is awesome," etc. So I am surprised that he is so public with this. After almost 2 decades of hand-in-hand relations with MS and DirectX, this certainly is surprising.

    But then, if any game developer wants cooperation from AMD, the person they talk to is Richard Huddy. So I doubt he's lying when he says game developers have asked him to make the API go away. But I doubt DirectX is going away. It will be interesting to see what the future of DirectX will be if game developers truly do want to get rid of the overhead costs of DX11.
     
  48. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Yes, but opinions and views change. People can get 'wiser'. While I like some degree of consistency myself, I also find that it can sometimes imply narrow-mindedness on the part of certain people who just stick with a certain argument simply for the sake of continuity.
     
  49. gdansk

    gdansk Notebook Deity

    Reputations:
    325
    Messages:
    728
    Likes Received:
    42
    Trophy Points:
    41
    I'm pretty sure everyone is reading this wrong. The AMD manager is not asking for assembly-level coding; instead, he wants C and C++ to run natively on the GPU. Of course, this would require building the rendering code for each GPU architecture, but I think that is a low price to pay (in fact, the rendering code could be compiled the first time it runs on each system, so that the source itself would be sent to the GPU and compiled there rather than shipping a compiled binary). I'm pretty sure this is the way things will go anyway... Coding close to the assembly level in a common language across multiple GPU architectures would improve performance and retain the compatibility we have now. There would be no "API" per se, but rather a system for getting your code to run on massively parallel architectures.
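
    (What gdansk describes, compiling the GPU code the first time it runs on each system, is roughly what OpenCL, which was already shipping at the time, does for compute kernels: the program carries C-like source and the vendor's driver compiles it for whatever GPU is installed. A minimal sketch with error handling mostly omitted; it only builds a kernel and doesn't render anything.)

        #define CL_TARGET_OPENCL_VERSION 120
        #include <CL/cl.h>
        #include <cstdio>

        // The kernel ships as source text; the driver compiles it at run time
        // for whichever GPU architecture is actually installed.
        static const char* kSource =
            "__kernel void scale(__global float* data, float factor) {\n"
            "    size_t i = get_global_id(0);\n"
            "    data[i] *= factor;\n"
            "}\n";

        int main() {
            cl_platform_id platform;
            cl_device_id device;
            clGetPlatformIDs(1, &platform, nullptr);
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

            cl_int err;
            cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);

            // "Compiled the first time it runs on each system": source in, binary out,
            // matched to this specific GPU by the compiler inside the vendor's driver.
            cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, &err);
            err = clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
            if (err != CL_SUCCESS) { std::printf("kernel build failed\n"); return 1; }

            cl_kernel kernel = clCreateKernel(prog, "scale", &err);
            std::printf("kernel compiled for the installed GPU: %s\n", kernel ? "ok" : "failed");

            clReleaseKernel(kernel);
            clReleaseProgram(prog);
            clReleaseContext(ctx);
            return 0;
        }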
     
  50. Richteralan

    Richteralan Notebook Evangelist

    Reputations:
    20
    Messages:
    416
    Likes Received:
    11
    Trophy Points:
    31
    A typical PC game dev nowadays has problems even with DirectX, and it is already quite a unified solution for different GPU architectures.
    Given the financial and time constraints (and lazy programmers), do you seriously believe developing "to-the-metal" is the way to go? And that a typical PC game dev would have fewer problems?

    Don't forget the API is also what keeps our systems more stable when a game crashes: the crash doesn't bring down the whole system. Remember games from the MS-DOS era?

    Also, don't forget more and more architectures are coming our way. Windows 8 can run on ARM. Are you sure it's a low price to pay to program for each architecture?

    It is not DirectX that causes poor PC quality. It's the financial side (read: shareholders, profits, etc.) that causes poor PC quality.

    P.S. With DirectX 11 you can do MUCH MORE than you think if you build a game from the ground up with it.
     