What's the reason? From my perspective it hinders performance and doesn't look much better than DX9, but I'm on an 8600M GT... However, I hear this being said by people with much more powerful DX10 hardware, GTX 260s etc. Why does DX10 suck?
-
A lot of games took advantage of it; CoH, Halo 2, Crysis, and Far Cry are the ones off the top of my head. Plus it's only two to three years old.
-
Well, on the user end, you pretty much said it: it sucks because it's basically a performance hit with little gain.
As for the game developers (the real people for whom DirectX is updated), they never really adopted DirectX 10 because the gains weren't significant enough over DirectX 9. Also factor in that at the time DirectX 10 was exclusive to Vista (and we all know how well the Vista launch went), so lots of developers were still catering to XP users by making backward-compatible DX9 games (which arguably looked just as good as DX10 games).
DirectX 11 is basically being called "what DirectX 10 was supposed to be", bringing tessellation and other technologies that give developers more flexibility in their rendering (there's a rough sketch of the feature-level idea below).
ATI technically had the hardware to provide some of the DirectX 11 features in their HD 4xxx series, which is why they were the first to release full-fledged DX11 GPUs. Last I heard, Nvidia doesn't want to implement DX11 in their next generation of GPUs. -
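To make that "flexibility" point a bit more concrete, here is a minimal C++ sketch of Direct3D 11's feature-level system: one code path can ask for DX11 and fall back to DX10.1-, DX10-, or even DX9-class hardware, which is exactly what the all-or-nothing DX10 requirement never offered. The helper name CreateDeviceWithFallback is just illustrative; the calls themselves are the standard D3D11 ones.

```cpp
// Minimal sketch, assuming the Windows SDK D3D11 headers and library are available.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

bool CreateDeviceWithFallback(ID3D11Device** device, ID3D11DeviceContext** context)
{
    // Ask for the best level first; the runtime picks the highest one the GPU supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11: tessellation, compute shaders
        D3D_FEATURE_LEVEL_10_1,   // DX10.1-class hardware
        D3D_FEATURE_LEVEL_10_0,   // DX10-class hardware
        D3D_FEATURE_LEVEL_9_3     // DX9-class hardware
    };
    D3D_FEATURE_LEVEL obtained = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, ARRAYSIZE(requested), D3D11_SDK_VERSION,
        device, &obtained, context);

    return SUCCEEDED(hr);
}
```

On DX10-class cards like the 8600M GT or GTX 260 mentioned above, a call like this would simply come back with the 10_0 level and the renderer would skip the DX11-only effects.
-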
One main reason is that it did not work on XP, and Vista had a very slow start and bad press prior to the service packs.
-
masterchef341 The guy from The Notebook
The biggest reason was Vista exclusivity.
That gave developers almost no incentive to use DX10 features, because:
1. Most of the user base didn't have DX10 hardware.
2. Of those that did, many couldn't use the DX10 features anyway because they were running XP.
3. The Xbox 360 uses DX9, so cross-platform games coming to the PC from the Xbox (or developed at the same time) would require additional developer work to implement DX10 features just for the PC version, and just for the few PC users who could actually use them.
I for one was basically using XP until Windows 7 hit RTM... -
Masterchef said it all, basically.
DX10 failed because of Vista.
The ONLY game that actually saw an improvement in graphics and performance was Hellgate: London. Under DX10 the animations were smoother and the game played much better. -
masterchef341 The guy from The Notebook
-
Haha, I know, and that was just personal experience for me. Under XP the game was darn near unplayable: monsters rarely fell when you killed them, sometimes your weapon didn't show the firing animation, and arms spazzed out. Under Vista and DX10, these problems mostly went away.
-
Kade Storm The Devil's Advocate
Lost Planet was a MUCH better example of DX10 features.
-
I do like the full object motion blur in Crysis in DX10, though someone has told me it's also possible in DX9. I know some features can be hacked into DX9, but I don't know about full object motion blur.
-
Kade Storm The Devil's Advocate
One can implement full object motion blur. Quite a few DX9 games use this feature; F.E.A.R. 2, Gears of War, and Clive Barker's Jericho come to mind. Even Lost Planet in DX9 mode did a good job with the motion blur.
However, with regards to Crysis, the very detailed textures and shaders in that game caused the visuals to look glitchy when full object motion blur was implemented in DX9 (see the blurry models and aliens). It wasn't the full object motion blur itself that worked differently, but rather the shaders and textures that reacted badly to the feature. This is where a certain DX10 niche feature came into play.
From what I've read, full object motion blur in Crysis is best implemented through the DX10 geometry shaders, which don't cause the textures to glitch when the blur is applied. That is the feature that made full object motion blur work correctly the way it did in DX10 for Crysis (see the rough sketch below). -
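As an engine-agnostic sketch of what "full object" (velocity-based) motion blur involves, regardless of DX9 vs. DX10: each object keeps its previous frame's transform so the shader can derive a per-pixel screen-space velocity and blur along it. The names below are hypothetical, not from CryEngine or any other real engine.

```cpp
// Illustrative only: the per-object bookkeeping behind velocity-based
// ("full object") motion blur. Struct and function names are made up.
struct Matrix4x4 { float m[16]; };          // placeholder 4x4 transform

struct ObjectMotionState {
    Matrix4x4 currWorldViewProj;            // transform used to render this frame
    Matrix4x4 prevWorldViewProj;            // transform used last frame
};

// Call once per object per frame, before filling its shader constants.
// The shader projects each point with both matrices; the difference between
// the two screen positions is the velocity vector it blurs along.
void UpdateMotionState(ObjectMotionState& s, const Matrix4x4& newWorldViewProj)
{
    s.prevWorldViewProj = s.currWorldViewProj;   // keep last frame's transform
    s.currWorldViewProj = newWorldViewProj;      // adopt this frame's transform
}
```

The DX10 geometry-shader angle described above is about where and how that velocity gets generated, not about the basic idea, which is why DX9 titles like F.E.A.R. 2 and Gears of War could do it too.
-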
masterchef341 The guy from The Notebook
In the end, Crysis DX9 and DX10 were almost the same visually, with DX10 having a slight edge and a massive performance hit.
But that is just one example... -
Let's just let Vista and DirectX 10 rest in peace. They were both botched launches by Microsoft, and Microsoft admitted it. Requiring Vista for DirectX 10 was the final nail in the coffin. It's too bad, too, because I had hope that Microsoft's "Games for Windows" initiative would create much-needed standards for system requirements, but all is lost for the moment.
-
I can't believe no one has mentioned DX10 in CoH. It was like night and day. Also Halo 2.
Oh well... -
I don't think Halo 2 had a DX10 option; it just required Vista to play.
-
Halo 2 was definitely not DX10. The original engine was DX9; I doubt they would have bothered to port it and then update the engine entirely.
-
Call of Juarez was amazing in DX10 and really used DX10, unlike Crysis, which just faked DX10 (this is why it was "unlockable" in DX9). I still buy a DX10 game over a DX9 game. A few I am playing or replaying right now are Red Faction: Guerrilla, H.A.W.X., and S.T.A.L.K.E.R.: Clear Sky.
As a side note, Halo 2 was not DX10, just a Vista exclusive used to launch the OS. Horrible choice: Halo 2 looked like garbage and was nothing more than a trashy straight-from-console port. -
True, you can hack Halo 2 to install on XP. Not that you would, because it's a horrible port.
-
ViciousXUSMC Master Viking NBR Reviewer
DX10 was not often used, and when it was used it was misused, so it really didn't have a chance.
-
masterchef341 The guy from The Notebook
Yeah, Halo 2 was just "Vista only", but that was a totally artificial limitation that had nothing to do with DX10.
-
-
Kade Storm The Devil's Advocate
Such as? Unless I stilled the screen and stared at the details, most of the very fine "niceness" was missed, and it could easily be done in DX9 anyway.
True, proper DX10 execution is actually supposed to bring better performance. -
Tinderbox (UK) BAKED BEAN KING
So why did they bring out 10.1 and not wait for 11? What does the .1 do for you?
-
Kade Storm The Devil's Advocate
10.1 is capable of doing certain things that only ATI cards could do at the time: supposedly more efficient, almost performance-hit-free anti-aliasing, along with HDAO. It had potential, but was hardly used in any game.
Refer to Assassin's Creed. The first version supported DX10.1, and ATI owners could run the game efficiently with anti-aliasing. Of course, they then removed this pathway (see the sketch below). -
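For what the ".1" means at the code level: a renderer has to ask for a 10.1 device explicitly and fall back to plain 10.0 if the card (at the time, basically anything non-ATI) can't provide it. A minimal sketch, assuming the DirectX SDK's d3d10_1 headers; the function name TryCreate10_1Device is made up for illustration.

```cpp
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* TryCreate10_1Device()
{
    ID3D10Device1* device = nullptr;

    // First try 10.1; if the GPU/driver can't do it, retry as a 10.0-level
    // device exposed through the same 10.1 interface.
    const D3D10_FEATURE_LEVEL1 levels[] = { D3D10_FEATURE_LEVEL_10_1, D3D10_FEATURE_LEVEL_10_0 };
    for (D3D10_FEATURE_LEVEL1 level : levels)
    {
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr,
                                         0, level, D3D10_1_SDK_VERSION, &device)))
            return device;
    }
    return nullptr;   // no 10.x hardware available
}
```

A 10.1 device is what reportedly enabled the cheaper anti-aliasing path (reading the multisampled depth buffer directly) that the first version of Assassin's Creed shipped with.
-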
thinkpad knows best Notebook Deity
Going by this information, DX10 theoretically being able to deliver all the perks without a performance hit would have been great in GTA 4, and Rockstar could have made that game easy to run. It's a shame they coded the game as poorly as they did.
-
DX10 didn't take off for the same reasons Vista didn't take off. Vista did nothing for the customer and everything for the corporation. Too much DRM, too much protectionism by the corporation, and a whole lot of fail for the customer.
I run Vista x64 on my main box, and I'm not a bitter customer - I'm just one that has a lot of scorn for companies that are willing to sacrifice their customers on the altar of license profits. Vista failed because it was too greedy. Here's hoping that 7 doesn't take that same path. -
It's really too hard to pick just one reason for the DX10 fail.
Vista
Consoles don't use DX10 and are becoming more like computers with each new generation
Developers are too lazy to implement it correctly
Games that did use it were often hit with a massive performance drop (Crysis, DMC4, etc.)
Nvidia made a big mistake by releasing low-end GPUs that couldn't handle it while claiming they could
DX10.1 is what DX10 should have been out of the door, but it was killed by Nvidia's "The Way It's Meant to Be Played" campaign, resulting in games patching it out
The list really just goes on and on. There are a few games, like Lost Planet and Hellgate: London, that implemented it correctly, though. Those games can be run in DX10 mode, giving the performance boost it should, and then they let you choose for yourself whether to turn on the extra DX10 features like fur and motion blur. Several games don't give you that option and force the DX10 extras down your throat, causing performance drops. -
davepermen Notebook Nobel Laureate
This, again, has now happened with DX11, which again gives developers new ways to code with the library, enhancing the performance, stability, and visual richness of games.
Why it never looked much better is simple: the Xbox 360 and PS3 are DX9-level hardware and are major selling points, so games get designed for that kind of hardware. More power doesn't add more visual detail when the artists didn't create it.
Why did it never really take off? Well, it did, just as DX11 is now. It's just a library to code with; there's no "wow, it's here!!" moment or whatever. It's just the new generation, nothing special. But obviously, as DX10 was only ever available on Vista and later, it never got that much of a user base, since "Vista never took off". -
Kade Storm The Devil's Advocate
Do you know what's really funny? Those old, outdated consoles, with the right effort, seem to be putting out some of the best visuals this generation. Lawl.
Edit:
And the DX10 fur and motion blur in Lost Planet, along with the volumetric smoke in Hellgate: London, did show improvements in not just performance but looks.
It's just that people are caught up in this ill-conceived notion that new APIs will somehow mean instant, ground-breaking, ray-trace-level photorealism or something. The companies and the media take the blame for that one, because I remember the original images touted by certain press outlets regarding DX10 textures and visuals (facial textures of Crysis vs. Halo), which could actually be done in DX9 as well. -
-.-; Who cares how the Koreans squint at you as you blow them away??? Just like DiRT 2: no one will care how much more realistic the flags look when they wave in the wind... There are certainly some nice effects, but it comes down to the developer's implementation. The faces in Mass Effect looked 10x better than in Crysis, and that probably just has to do with you actually getting to look at them when you make chat selections.
-
I agree that DX10 was partly hampered by being released with Vista and only with Vista, but that is not to say that it "never took off." DX10 is significant in many ways.
First, I would point out that DX10 was not a merely evolutionary step for the graphics API. In fact, it was quite major: it imposed completely different hardware requirements than DX9 and the versions that preceded it, and it introduced a number of new rendering techniques that offered far more potential than DX9 (one concrete example is sketched below). The fact that these weren't tangible eye candy for gaming audiences meant that DX10 was quickly dismissed as an MS gimmick. If you read the technical details of DX10, you'll quickly learn that it is a significant improvement, just not one that had an immediate, mind-blowing effect on developers and the gaming community.
Secondly, DirectX 10 does not degrade performance significantly. Certainly, performance regressions occurred in major titles such as Crysis, but these were minor (i.e., a 3-5% decrease in FPS) and almost certainly attributable to Vista's mind-numbing sluggishness. This ties back to the point that DX10 wasn't fully appreciated at the time of release. With Windows 7, it is clear that DX10 itself was not the cause of the missing performance improvements.
Those are essentially my views on DX10's stumbling blocks. -
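One concrete example of that "completely different" design, for anyone curious: DX10 replaced DX9's pile of individual SetRenderState calls with immutable state objects that are created and validated once, then bound cheaply each frame. A minimal sketch against the D3D10 API, assuming a valid ID3D10Device* created elsewhere:

```cpp
// Sketch of DX10-style immutable state objects vs. DX9's per-call render states.
#include <d3d10.h>
#pragma comment(lib, "d3d10.lib")

ID3D10RasterizerState* CreateWireframeState(ID3D10Device* device)
{
    D3D10_RASTERIZER_DESC desc = {};      // whole rasterizer state described up front
    desc.FillMode        = D3D10_FILL_WIREFRAME;
    desc.CullMode        = D3D10_CULL_BACK;
    desc.DepthClipEnable = TRUE;

    ID3D10RasterizerState* state = nullptr;
    device->CreateRasterizerState(&desc, &state);   // validated once, at creation time
    return state;                                    // later bound with device->RSSetState(state)
}
```

The DX9 equivalent was a series of SetRenderState(D3DRS_FILLMODE, ...) style calls re-validated inside the draw loop; moving that validation to creation time is exactly the kind of under-the-hood change that never showed up as eye candy.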
-
I also thought that DX10 was sabotaged by Nvidia when they didn't want certain features of it... but anyway, I see no difference between DX9 and 10... they look the same...
-
Kade Storm The Devil's Advocate
But, just for the sake of argument, I don't remember the "failbox" releasing games like Uncharted 2, Killzone 2, Heavy Rain, or GT5 Prologue. My leaning goes towards some of the positive exploitation of the PS3 hardware.
Granted, they are lower-res than your 1080p hypothesis, but they still implement some nice effects, with efficient rendering, in a way that would bring the most out of PC hardware. (E.g. Killzone 2 was doing full object motion blur, which actually looked better than Crysis' implementation, along with deferred rendering and 2xQAA, and it did not "need" DX10 or some fancy card to pull this off. There, using old hardware and a limited API, we have one game doing two distinct things found in two different premium, DX10-hardware-thrashing games: S.T.A.L.K.E.R.: Clear Sky and Crysis.) You see, these visually amped-up games did not need even a single 8-series card or a so-called "this is the new life API powered by a fusion engine" to show their stuff. Even "teh cell" has its limits, and is -very- limited compared to 2007 PC hardware.
That right there is the problem in perception. PC gamers buy hardware almost three times as powerful, and priced accordingly, to play games that look slightly better, but not leaps better for the price and hardware. Then there are the exclusives, and quite a few other games, that have glitches on that very hardware or don't even run properly, so you have to get something even more powerful to compensate for those games' shortcomings. Essentially, it's newer hardware and APIs to produce stunning visuals, and none of it gets pushed to the max; within 90 days there's a new card and no reason to push the old hardware. Meanwhile, consoles do get pushed to the max, and that is how you end up with machines almost four times as powerful and expensive producing results approximately 1.5x those of the supposedly much-inferior console.