As expected, they are making pretty much the same promises to consumers that they made about two years ago...
http://www.pcgameshardware.com/aid,...ster-than-DX101-due-to-Local-Data-Share/News/
We were told that DirectX 10 would yield better graphics without a performance cost (due to the new GPU architecture and the way DX10 was supposedly going to use it more efficiently than DX9).
What did it prove to be? Most of us know... A sometimes costly decrease in FPS for some slightly better lighting effects... And supposedly DX10 was efficient...
Now we are close to the introduction of DX11, and similar words of optimism are being heard. It's confirmed that both Vista and Win7 will support DX11 and that DX10 cards will be compatible.
ATi is going to release the first DX11 cards by the end of this year
http://www.tomshardware.com/news/ati-dx11-40nm-gpu,6462.html
Since new cards will be introduced, our current DX10 cards must differ physically, and will therefore need some kind of software layer to "simulate" DX11, so what performance gain are we going to get without changing our cards? Most probably we will sacrifice performance... And if I'm not mistaken, this is indirectly confirmed by the interview in the first link (the Local Data Share feature of ATI's RVxxx chips, which apparently is not present in our current cards), so we are likely to sacrifice performance again...
If you feel like what I've written here is wrong, feel free to oppose my view of the subject, but without turning this thread into a battlefield!
-
-
masterchef341 The guy from The Notebook
most likely, DX11 simply won't run on DX10 hardware.
this has been true for every other version of DX, i don't see why this would be any different. even if it did run, it would be through emulation (there is general-purpose GPU software these days), and in that case the performance cost would be massive.
combine that with the fact that no one owns DX11 hardware: no game developers are going to be rushing to support it (especially not exclusively!). It isn't like the day DX11 is released everyone is going to be trashing their ATI 4000 and GTX 200 systems.
When DX11 comes out, DX10 hardware will finally have propagated through the PC gaming community. Anyone can get a card capable of running DX10 fluently for under $100. We are basically 3 generations of hardware into DX10. Maybe two, depending on how you count.
Also, the consoles are driving a lot of this. Cross-platform games aren't made using DX10 for the Xbox, so there is really no point in making them DX10-only for the PC. DX10 will probably remain just an additional option in games, as will DX11, until the next series of consoles comes out. -
Honestly, I'm more concerned with how it will compare to the current DX9 games...
-
It'll be a while before anyone can form some good opinions, because the technology itself has to find its way into the hands of the developers and the masses... until then, I'll stay on the skeptical side (history will repeat itself...) -
ViciousXUSMC Master Viking NBR Reviewer
DX10 was more efficient than DX9; the original goal of DX10 was to allow the same graphics DX9 produces with less system power.
This is 100% true when DX10 was properly used. However, 99.9% of game developers, instead of keeping the DX9 and DX10 content the same, ADDED more things to the DX10 version of the game that made it harder to render, or did a very poor implementation of DX10.
I guess they figured if you have DX10 you have a high-end computer, so they may as well push the envelope and take advantage of some of the features DX10 can do that DX9 can not, and they had to stay competitive in the game market. If one game had super flashy DX10 graphics posted all over the web and another wanted to do the more sensible thing and just clone the DX9 content into DX10 for better performance, you know people are going to go for the better graphics. I mean, Crysis is the glowing example of this. -
I agree with you, many devs always want to bring the most advanced graphics at any cost: Crysis (poorly coded). -
If they used the same graphics for DX10 as for DX9, it would be more efficient; the developers just feel the need to have the "best looking game" when really it doesn't look any different from the DX9 version and has worse framerates.
-
masterchef341 The guy from The Notebook
there is also at least one example where the performance is MARGINALLY better in dx10 with the same visuals...
so, whatever. -
ViciousXUSMC Master Viking NBR Reviewer
DX10.1 was the real optimized version. Assassin's Creed was something like 20-30% faster with DX10.1
-
-
ViciousXUSMC Master Viking NBR Reviewer
Yep, because only ATI cards supported it and AC was an Nvidia-sponsored game. Not hard to connect the dots there, especially if you read some of the interviews with the game team that made AC. -
Nothing benefits the consumer more than pure, honest competition. Thanks Nvidia.
-
There will be zero difference with DirectX 11 unless some revolutionary change happens in the GPU industry and game developers suddenly quit coding for DX9.
-
I won't lie. I watched the DX9 vs. DX10 war from the sidelines. I'm still using XP, but plan to move to Win7 when it shows up.
My understanding was that Microsoft was using DX10 to entice people to move to Vista; however, like you all said, it didn't really deliver the way it was supposed to.
I would expect something similar to get users to move to Win7. Just depends on whether developers capitalize (properly?).
--------
I was sort of hoping we'd see some sort of unified API by now. I know that ATI and Intel (and I believe Nvidia) were in talks about developing a unified API that would help standardize things and make it easier on developers; however, I haven't heard anything about it for a few months. -
Back in the days before DirectX became industry standard, there were a number of competing APIs. Not all cards supported all APIs, and not all games supported all cards. Developers had to rewrite entire engines for all the different APIs out there - just to get their game to work on everything. Ah, the good old days of Voodoo and exclusive features that required Glide.
Nowadays, a developer can just pick up DirectX and pretty much be guaranteed it'll work on everything of relevance. -
or just go opengl and be cross-platform
-
I think it will all come down to what the 'next-gen' consoles will use. It will either be DX11 or DX12. At least on MS platforms. Then every game will be coded natively and won't require a patch so performance will improve compared to today's games running in DX10 modes.
Soooooo... 2012 maybe? If Windows 8 launches before the next consoles, we may see a move to DX12 instead. Otherwise DX11 will be the true successor to DX9. Looking forward to it. -
I chose #2 because I don't think it will run best on dx10 cards.
but who knows -
This is just the same pre-launch hype that companies spew before the arrival of any major release. I have never seen a DirectX update that significantly increased performance for past games (I started with DX 5). They are meant to introduce new features like HDR or bump mapping, usually at a steep performance hit until the technology has had time to mature.
-
davepermen Notebook Nobel Laureate
dx10 is much better than dx9 by design. that doesn't mean gamedevs have really used that new design to make "better games", sadly.
but, having worked with both in theory and in practice, dx10 is (or can be) much faster than dx9 for the same result.
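to make that concrete, here's a rough, untested sketch (my own illustration, not from any shipping engine; the function names, buffers and numbers are made up, and all the device/buffer/state-object creation is left out) of where the per-draw overhead difference comes from: in dx9 every render state and shader constant is set, and re-validated by the driver, call by call on each draw, while in dx10 constants live in a constant buffer and fixed-function state lives in prebuilt state objects that are validated once at creation and simply bound per draw.

```cpp
#include <d3d9.h>
#include <d3d10.h>

// dx9 style: each constant and render state is an individual, driver-validated call per draw.
void DrawD3D9(IDirect3DDevice9* dev, const float worldViewProj[16])
{
    dev->SetVertexShaderConstantF(0, worldViewProj, 4);   // 4x4 matrix = 4 float4 registers
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
    dev->SetRenderState(D3DRS_ZENABLE, TRUE);
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, 100, 0, 50);
}

// dx10 style: constants sit in a constant buffer, blend/depth state in immutable
// state objects created (and validated) up front; per draw you just update and bind.
void DrawD3D10(ID3D10Device* dev, ID3D10Buffer* cbPerObject,
               ID3D10BlendState* opaqueBlend, ID3D10DepthStencilState* depthOn,
               const float worldViewProj[16])
{
    dev->UpdateSubresource(cbPerObject, 0, NULL, worldViewProj, 0, 0);
    dev->VSSetConstantBuffers(0, 1, &cbPerObject);

    const FLOAT blendFactor[4] = { 0, 0, 0, 0 };
    dev->OMSetBlendState(opaqueBlend, blendFactor, 0xffffffff);
    dev->OMSetDepthStencilState(depthOn, 0);

    dev->DrawIndexed(150, 0, 0);
}
```

same triangles either way; the dx10 path just moves the validation work out of the inner loop, which is where the "same graphics, less cpu overhead" claim comes from.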
anything else is marketing-driven crap. mainly the consoles-first, pc-afterwards thing. games built around dx10 are very bad for consoles, as consoles have a different architecture from the bottom up. so it would mean coding two different games in parallel, or making the dx10 game run badly on consoles.
obviously, the consoles can't be upgraded, so devs code it well for consoles first and badly for pcs.
btw, crysis is quite well coded. the visual difference between the dx9 path and the dx10-only path is huge imho (but i care for details), and the visual complexity is immense => a huge resource eater.
dx11 will continue this: making a very well designed api for very fast games possible. but will gamedevs use it? we don't know. i guess most won't. engine developers, maybe, will (valve with source? i sure hope so).
-
The issue with compatibility is not restricted to consoles. The problem is that on the PC, DX10 requires both Windows Vista and a compatible GPU. The result of this is that even in June 2009 (i.e. less than half a year before the release of Windows 7 with DX11), fewer than 30% of all systems in the Steam Hardware Survey are DX10 compatible.
This makes a DX10-only game completely absurd -- the consoles can't run it and the overwhelming majority of PCs can't run it either. The most that developers making what are fundamentally DX9 games can be bothered to do is add some superficial graphical differences (which is what you got in Crysis).
Whether DX11 repeats the story of its predecessor depends on how Windows 7 fares and whether or not it is ultimately adopted by the new generation of consoles. -