The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Some myths about DX10

    Discussion in 'Gaming (Software and Graphics Cards)' started by Moidock, Feb 5, 2008.

  1. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    (Don't know if this subject has been discussed here before, so feel free to move, merge and/or delete if necessary)

    DX9 graphics cards can or will be hacked to run DX10:

    No, it is not possible to hack a DX9 card to run DX10 games. It is not simply a matter of hacking drivers or flashing a card's BIOS with another one; video cards that support DX10 have additional hardware and processing routines in their GPUs and components that are required to run DX10 games.

    In short, a video card that is DX10 compatible HAS various additions at the hardware level to run these programs. No amount of driver hacks or driver settings will allow an older DX9 card to “magically” run DX10.

    Various DX9 cards have hidden support for DX10, or some current DX9 cards will have drivers to support DX10:

    This is a common myth surrounding some of the last DX9-only cards like the Nvidia 7xxx and ATI X1xxx cards or earlier. As mentioned before, for a video card to support DX10 it HAS to have DX10 support at the hardware level. Neither Nvidia nor ATI is working on developing drivers that will allow older DX9 cards to run DX10.

    The ONLY video cards currently available on the market that support DX10 are the Nvidia 8xxx and the ATI HD 2xxx/3xxx video cards. Neither Nvidia nor ATI has confirmed the existence of DX10 support on earlier cards.

    DX10 cards are always faster than DX9 cards:

    While the top-of-the-line Nvidia 8800 and ATI R600 video cards are faster than the latest DX9 cards (the Nvidia 7950, for example), this does not necessarily mean that all DX10 cards are faster than their DX9 counterparts. The Nvidia 8800 and ATI HD3000 are completely separate lines and are meant to be the TOP-performing cards.

    Although DX10 can make things look better, that does not mean that every DX10 card is faster.

    For example:

    Although one would think that an 8400 card would kick the snot out of a 7900 card, this is not true. In the Nvidia numbering scheme, the second digit is what tells you about the card's performance:

    Low/basic performance series:
    x100
    x200
    x300

    Medium/medium-high performance:
    x400
    x500
    x600
    x700

    High performance:
    x800

    So although 8400 cards are DX10 cards, they are not superior to all 7x00 series cards.

    To get more frames per second, higher resolutions or higher graphics settings, the user will need to switch to a top-of-the-line card.
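    The tier logic above can be sketched as a toy helper. The function name is made up for illustration, the boundaries follow the tiers listed in this post, and x900 is treated as high performance as a natural extension of the table; this is not any official Nvidia scheme:

```python
def performance_tier(model: int) -> str:
    """Classify a GeForce model number by its second digit,
    following the tiers listed in the post above (illustrative only)."""
    second_digit = (model // 100) % 10  # e.g. 8400 -> 4, 7900 -> 9
    if second_digit <= 3:
        return "low/basic"
    if second_digit <= 7:
        return "medium/medium-high"
    return "high"

# A DX10 entry-level card does not outrank a DX9 high-end card:
print(performance_tier(8400))  # medium/medium-high
print(performance_tier(7900))  # high
```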

    DirectX 10 will be ported to Windows XP:

    This is another common myth. DirectX 10 only works with Vista, since several of the features and programming routines DX10 uses are built into Vista itself; Vista uses a completely different driver structure and codebase compared to older Windows offerings. Microsoft itself has no plans to port DX10 to Windows XP because of this.

    DirectX 10 is not "backwards compatible," and your current library of 3D games won't work on Vista:

    Vista includes DX9 in its libraries as well, and it is used when needed, so DX9 games run without problems. DX10 runs separately, and only DX10 games can use it.

    Some older games may require some tweaks or patches to run or install under Vista though.

    Gamers must upgrade to Vista to play any PC games released post-Vista:

    A large number of users around the world still have not upgraded, or will not upgrade, their systems for quite some time. While there are some exclusive DX10-only launches, video game makers are aware that a large base of users will be running their current systems for quite some time or are waiting for the initial kinks and bugs to be worked out. For this reason, the large majority of video game makers will be releasing games that are both DX9 and DX10 compatible for some time.

    I hope this helps.
     
  2. Doxie

    Doxie Notebook Consultant

    Reputations:
    63
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    well, DX9.0c is still going strong.

    the only "DX10" bit of Crysis is blocked for DX9 users. by adding simple console variables to Crysis's config you can run the game in 'very high' mode, something that is "exclusive" to the DX10 version of Crysis.

    Crytek simply imposes an artificial limitation on the game; the two versions look near damn identical, and DX9 runs better to boot.

    DX10 may be new, but DX9.0c still has some life in it yet :p
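    For reference, the tweak Doxie describes looked roughly like the fragment below in a user's config file. The specific cvar names and the "4 = Very High" value are recalled from community configs of the era and should be treated as illustrative, not authoritative:

```ini
con_restricted = 0
sys_spec_ObjectDetail = 4
sys_spec_Shading = 4
sys_spec_PostProcessing = 4
sys_spec_VolumetricEffects = 4
```

    `con_restricted = 0` unlocks protected console variables; the `sys_spec_*` group then forces individual quality categories to the level the menu reserves for DX10.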
     
  3. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    And will for a very long time.

    DX10 is still an infant; it's going to be years until there are actually DX10-only games that actually LOOK good and RUN smoothly.
     
  4. Doxie

    Doxie Notebook Consultant

    Reputations:
    63
    Messages:
    185
    Likes Received:
    0
    Trophy Points:
    30
    and hardware that can run the thing properly :p
     
  5. metaldeath

    metaldeath Notebook Consultant

    Reputations:
    14
    Messages:
    224
    Likes Received:
    0
    Trophy Points:
    30
    That's not true. I run Halo 2 and Shadowrun on XP with DX9 :D with some patches, so I believe that DX10 is only Microsoft's newest way to force users to play games only on its latest OS...

    http://www.gamersquad.com/category/PC/Hackers-crack-Vista-only-game-code-for-Halo-2-and-Shadowrun/

    "Reports have arisen stating that the crack fix for the two games only requires a simple overwrite of a few files. Once successfully completed, the games will run in Windows XP even if users have only got DirectX 9 installed. Naughty, naughty!"
     
  6. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Er, no. DX10 is not, has never been, and will never be, backwards compatible.
    Instead, Vista comes with a port of DX9.0c, which takes care of the need for backwards compatibility. But DX10 can *only* run DX10 games.

    :)

    Those are not DX10 games. They run DX9 on Vista. (Which there's no technical reason for. So disabling the OS check makes them run on XP)
     
  7. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, I know. Vista includes both DX9 and DX10, the main point I was trying to get across is that you can still run DX9 games on Vista.

    My grammar sucks at times. :p

    Those games were originally being developed for DX9 and XP. M$ is pushing Vista vigorously, and as part of that strategy some game makers were forced into making these highly anticipated games Vista-only, and M$ spread the news that the decision was made because the games would look better now that they were going to make use of DX10.

    Of course, they didn't have anything DX10 in them, and proof of it is the hacks that just disable the OS check to run on XP without any enhancements to DX9; the hacks do not "convert" these games into DX9 games. M$ thought that for some reason this would make gamers take the plunge en masse and switch to Vista just to play their games.

    Curious h4x0rs got the best of M$ and uncovered the truth.
     
  8. MegaBUD

    MegaBUD Notebook Evangelist

    Reputations:
    45
    Messages:
    670
    Likes Received:
    3
    Trophy Points:
    31
    Vista will die like ME did... so stay with XP and wait for the next OS...

    anyway, gaming on XP is a lot faster...

    but yeah, DX10 cards are faster than DX9 cards...

    lots of people will stay on XP... so they probably won't make Vista-only games... except maybe Microsoft games...
     
  9. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,036
    Trophy Points:
    331
    The way you worded it was fine. I got it right away since your post was addressing misconceptions about Vista and DX10. It's a very good post by the way. Hopefully it will open up some good dialogue or at least be a reference for those confused or worried about switching to Vista.
     
  10. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    No, but there are third parties working on a DirectX 10 "wrapper" that will run games requiring DX10 by interpreting the DX10 code with DirectX 9 calls. DX9 is capable of displaying DX10 features; it just requires more CPU and GPU horsepower to do so.
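    As a rough illustration of what such a wrapper has to do, here is a toy sketch: a call written against a hypothetical newer-API interface is emulated on the CPU and forwarded as a plain older-API draw. All class and method names are made up for illustration; real Direct3D wrappers work at the COM/driver level, not in Python.

```python
# Toy model of a DX10->DX9 "wrapper": emulate a newer-API feature
# (a geometry shader that amplifies vertices) on the CPU, then forward
# the result to an older API that lacks the feature. Hypothetical names.

class DX9Device:
    """Stand-in for an older API with no geometry-shader stage."""
    def __init__(self):
        self.log = []

    def draw(self, vertices):
        self.log.append(f"dx9 draw: {len(vertices)} vertices")

class DX10ToDX9Wrapper:
    """Translation layer: the geometry-shader step runs on the CPU
    (naive vertex duplication here), which is where the extra CPU/GPU
    cost mentioned in the post comes from."""
    def __init__(self, backend):
        self.backend = backend

    def draw_with_geometry_shader(self, vertices, amplify=2):
        expanded = [v for v in vertices for _ in range(amplify)]  # CPU-side work
        self.backend.draw(expanded)

dev = DX9Device()
DX10ToDX9Wrapper(dev).draw_with_geometry_shader([(0, 0), (1, 0), (0, 1)])
print(dev.log[0])  # dx9 draw: 6 vertices
```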
     
  11. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, heard about it too. The problem is that since it will consume more CPU and GPU power to interpret the DX10 calls, performance will suffer drastically, so don't expect to keep high resolutions or high framerates if you're interested in those.

    Still, a good alternative for those not willing to switch to Vista by any means.
     
  12. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    You should edit your post to reflect it though. Avoids confusion in the future. :)
    (Saying that you can run DX9 games on Vista is fine)

    Working on, but nothing usable yet. So as far as the "myth" that you can hack DX10 games to run on XP, it's irrelevant, because you still can't.
     
  13. KernalPanic

    KernalPanic White Knight

    Reputations:
    2,125
    Messages:
    1,934
    Likes Received:
    130
    Trophy Points:
    81
    This is false. 7600s are pretty well schooled by 8600s, by 30% or more, even in standard DX9 games.
    This difference is sometimes up to THREE TIMES in actual frame rates in shader-intensive games like Oblivion.

    Don't believe me? Go to Tom's VGA comparisons and look for yourself.
    The newer the game, the more likely it will be shader-intensive like Oblivion.

    Note that the 128-bit memory bus of the 7600 and 8600 series cards starts choking performance above 1280x1024 or so...

    To be entirely fair, the 8600 series is a newer card with a better architecture.
    The 8600M has a fillrate of 7600 MT/s compared to the 6000 MT/s of the Go 7600... the 8600GT with GDDR3 has 22.4 GB/s of memory bandwidth as well, compared to the 19.2 GB/s of the Go 7600GT.
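    Those spec-sheet gaps can be checked with quick arithmetic (the inputs are the figures quoted in this post):

```python
# Relative gains implied by the quoted reference-design figures.
fillrate_8600m, fillrate_go7600 = 7600.0, 6000.0   # MT/s
bw_8600gt, bw_go7600gt = 22.4, 19.2                # GB/s

fill_gain = (fillrate_8600m / fillrate_go7600 - 1) * 100
bw_gain = (bw_8600gt / bw_go7600gt - 1) * 100
print(f"fillrate: +{fill_gain:.1f}%, bandwidth: +{bw_gain:.1f}%")
# → fillrate: +26.7%, bandwidth: +16.7%
```

    That works out to roughly the 26% fillrate and 17% bandwidth advantages cited later in the thread.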

    To say the two cards are "pretty much the same except DX10 mode" is quite frankly speaking from the wrong orifice.
     
  14. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Yes, but I'm referring to cards in the same circumstances. For example, of course, a 7600 card with DDR2 memory isn't going to stand up to an 8600 with DDR3 memory.

    An 8600 card with DDR2 memory isn't going to perform much better than a 7600 with DDR2. A GT card does not stand up to a GTS card either; there is still not a great difference between them.

    Yes, I'm aware of Tom's Hardware and other sites. There are various types of cards and manufacturers make custom versions (some with faster/slower GPU, DDR2 or DDR3 memory, higher/lower data bandwidth) so I'm taking that into account for cards in the same circumstances.

    The most I've seen between, say, a 7600 GT with DDR3 and an 8600 with DDR3, a higher-clocked GPU and higher fill rates, is something like 13-20 frames per second, so the gap between the 7600 and 8600 is not really impressive compared to the 40-50+ extra frames per second an 8800 would give.

    Unless benchmark numbers really matter to you, 15-20 more frames per second is not something a regular gamer would need, unless his card is too stressed out already and impedes him from enjoying the game properly.

    It's not like I killed your puppy and stomped on it.
     
  15. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    Well, it will matter depending on the base of the difference. Let me clarify the point: if your base is around 60 fps, then 15-20 will not matter a lot to many people. But if your base is anywhere between 10-30 fps, then that difference is a lot.
     
  16. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Yep, that's the point I was trying to make. If your game runs smoothly, a 15-20 frame per second difference won't matter to a regular gamer.

    30+ frames make for a large difference if you are really into numbers or need that 30+ frames per second difference to play.
     
  17. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Just thought I'd let you know you should edit your post since the RXXX series is already out.
     
  18. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Done. Thx. :D
     
  19. metaldeath

    metaldeath Notebook Consultant

    Reputations:
    14
    Messages:
    224
    Likes Received:
    0
    Trophy Points:
    30
    15-20 fps isn't unplayable if it's constant; the problem is when you're dropping from 30-50 down to 10, like Oblivion (indoor/outdoor).

    At this point there aren't full DX10 games, and they won't arrive any time soon... game companies aren't Microsoft, which takes its profits from licensing software...
    For ATI/Nvidia the enthusiast segment isn't the big deal; the casual/mid gamer is the main target that covers the production of games and video cards... consider that an old Pentium 4 2800 can run Call of Duty 4, Gears of War or UT3 well, and for some people that seemed impossible
     
  20. vshade

    vshade Notebook Evangelist

    Reputations:
    35
    Messages:
    340
    Likes Received:
    0
    Trophy Points:
    30
    What is this Radeon Rxxx??
    Aren't the DX10-compatible Radeons the HD 2xxx and HD 3xxx?

    If you are talking about the codenames, then every Radeon card is an Rxxx; the R300 is known as the Radeon 9700, the R200 is the Radeon 8500, and the R580 is the X1900, and none of these support DX10
     
  21. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    A 15-20 fps difference absolutely makes a difference to a regular gamer, and it makes a difference because it raises the low end of the framerate. If the minimum fps is over 60, then yes, it doesn't matter, but in any other circumstances it's going to be relevant and in some cases even crucial, meaning the difference between resolutions and detail settings.

    The shader-heavy 8000 series will perform on par with, usually a hair better than, its predecessors in older games, but in newer ones they'll blow by, and that's why it's misleading to suggest a minimal difference between the two generations.
     
  22. KernalPanic

    KernalPanic White Knight

    Reputations:
    2,125
    Messages:
    1,934
    Likes Received:
    130
    Trophy Points:
    81
    Please refer to my previous post... I compared the 7600GT DDR3 to the 8600GT with DDR3...

    30-200% better ranges from a decent chunk better to worlds apart...

    Even 30% is quite noticeable... if the 7600 is at 30fps, the 8600 is at 39fps.
    Trust me, that is a pretty noticeable chunk especially when we are talking minimums.

    The 200% better of Oblivion makes it playable at lower-mid resolutions, the 7600 is hurting in comparison.

    I quoted the actual stats from Nvidia's site... the 8600GT (DDR3)'s hardware has quite literally 26% better fillrate and 17% more memory bandwidth than the 7600GT (DDR3). While certain cards vary, the BASELINE card is better by a significant enough chunk to completely destroy your "basically the same" argument.


    The 7600 in its day, and the 8600 now, are perfectly capable of reasonable framerates at resolutions that aren't insane. If you have the lower GPUs, play at lower resolutions and with a few details off.
    The 8600GT will STILL perform significantly better than the 7600GT because its technology is superior.

    The 8600 and the 7600 are not "basically the same", unless you are one of the people who think anything below **insert the best card on the market in SLI here** is the same... The numbers and benchmarks say quite clearly otherwise.

    This is a cop-out. Be a man and admit you are wrong.
    30% is a significant performance boost... no, the 8600 is not an 8800... but it's quite a bit better than a 7600. At 7600 or 8600 performance levels, 30% can mean a lot... like, for instance, whether you can play the game or not.
    200% better, like Oblivion, is the difference between a slideshow and a game.

    No, but you did say something pretty silly, and it's time to admit you were wrong.
     
  23. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,036
    Trophy Points:
    331
    Be careful with Nvidia's stats. They're mostly for marketing purposes.
    Better to quote third-party real-world performance stats, especially given how certain drivers can make a difference.
     
  24. lokster

    lokster Notebook Deity

    Reputations:
    63
    Messages:
    1,046
    Likes Received:
    0
    Trophy Points:
    55
    thanks for this useful post, which should clear up some of the doubts that people have about Vista, DX10 and DX9. Vista rulez :p

    i can attest that most of my pre-2005 games work. they're the only ones that run really great on my IGP ^^
     
  25. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, the HD 2000s/3000s are the ones that are compatible with DX10; those are based on the R600, R700 and R800 chips.

    Sorry about that, wrote that about a month or two ago

    Thx for pointing it out. ;)
     
  26. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Indeed, but 9 frames per second isn't going to make a huge difference in real-world gaming; if you turn up the resolution and other graphics options, the difference will be even less.

    It only matters in numbers.

    Whoop de doo. Link

    I see you need to read Tom's Hardware more often, a nine-eleven frame difference is definitely 200% better. You sure you read Tom's Hardware as you claim?

    Yup, coming from Nvidia or ATI. The PS2 is capable of Toy Story like graphics.

    Link.

    Hmm... 38 frames per second is definitely 200% better. Wait a sec... a 6800GT is on par with an 8600GT, within just 9 frames?

    I'm not talking dual cards here. 200% better seems a bit steep to me.

    30% is a significant boost indeed, but nothing that will mean the difference between playable and unplayable.

    Are you sure you don't need to take your Zoloft right now? Your OCD is kicking in.

    I didn't attack you in the first place, I didn't cause your universe to come to a halt did I?
     
  27. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    It's cool, man, please feel free to ignore me, it's not like I know what I'm talking about or anything.

    But this should clear things up. And it bears keeping in mind that a Go 7600 is substantially more crippled compared to its desktop counterpart (8 pipelines vs. 12 in addition to much lower clock speeds) than an 8600M GT is.

    10fps matters big time when it's the difference between playable and unplayable at a given resolution, and on mid range cards that don't have power to spare, that case occurs far more often than not.

    Don't bull**** a bull****er. Your post is wrong in this regard and you should correct it. It's such a minuscule thing that I honestly don't know why you're unwilling to take this one on the chin.
     
  28. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Yes, I saw it. What I meant is that there is no substantial difference between the two video cards, as lots of people claim, and I'm basically trying to clear that up. I deal with a lot of people who think that they will get a substantial difference between the two cards (40+ frames), when it is really something like 15-20.

    Perhaps I worded it wrong. I am willing to correct this for the better of the place and am open to suggestions on how to do so. Just point out things nicely.

    I know when I'm wrong, and if proven wrong then I take it, simple. You know there's a correct (polite) way of saying things and helping correct them.

    Suggesting that I'm using the wrong hole to talk is just a way to push the wrong buttons.
     
  29. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    I don't even understand the point of this thread. These topics have been mentioned and beaten to death.

    And the 8600M performs on par with a 7900GS, so yes, I would consider it a step up from the 7600.
     
  30. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    I think the root of the discussion lies in the conception of the regular/casual player. In your posts you kept mentioning that anything that gives a 10-20 fps increase is a minor boost.

    Well, first of all I deduce that for you the casual/regular player plays at 40+ fps. Remember that not everyone has a top GPU (8800M or Go 7950 or the Radeon equivalent), so for the other people (having mid-range to low-range GPUs) who also play casually/regularly, any increase will be very significant, as Pulp mentioned and I mentioned before. Why? Simply because with today's game requirements (Crysis, World in Conflict, etc.), achieving more than 40 fps with good graphics (high settings) is not possible, and most people like to keep the balance between playable (good fps) and eye candy (high settings); therefore even a 10 fps increase becomes very valuable.

    Am I right in my assumption or were you thinking something else?
     
  31. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    You're right. :)
     
  32. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    DX9 wrapper?

    An OGL wrapper is OK but slow; a DX9 wrapper would be beyond inefficient.

    Multi-pass, inefficient loops with writes to system RAM. What the heck is the point? :confused:

    It would be like using 3 passes with dithering to emulate FP OpenEXR-style HDR on an X800, or using 2 ROP loops to allow FP HDR + AA on a GF7900.

    No point; it's like saying you can run 'D3D 10.0' on a CPU. You CAN, but why would you want to!?!

    I still don't understand the point of a wrapper other than for the Linux and Apple crowd, where there is no other option, but a DX9 wrapper doesn't have that driving force.
     
  33. bmwrob

    bmwrob Notebook Virtuoso

    Reputations:
    4,591
    Messages:
    2,128
    Likes Received:
    0
    Trophy Points:
    55
    Moidock, I see that my post was deleted by yet another overzealous mod - seriously, you guys are too much with your deletions! Do you (the deletion Grand-Master) actually think my post was serious? You missed the big, yellow smilie which looked something like this: [IMG]

    I know Moidock didn't complain. What's wrong with you? People joke with each other, and post far . . . stronger posts than mine every damn day - and their remarks remain posted. Your day must be very slow and boring.


    Edit: can't spell
     
  34. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Late night posting Rob? ;)
     
  35. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    This is like watching children discuss economics or geo-politics. :rolleyes:

    Check the next link, where they enable the shader-intensive HDR, and you'll see the 8600GT outperforms the 7600GT 18 fps to 10 fps, which while not 200% is still significantly better. And like Kernal mentioned, the most important thing will be the minimum fps, which will likely plummet much further on the GF7600 while it keeps a similar top-end fps number in the easier segments. Look at that min fps and you're likely to see a larger delta than you see in the averages.

    But the main thing is to not focus on the DX9 or DX10 component; it's the shader composition and count that matter. The X1800 and X1900 have the same D3D support, but the X1900 has 3 times as many pixel shader ALUs as the X1800, which helps it in pixel-shader-intensive apps like Oblivion. Likewise, it's the GF8600's greater number of shaders and more flexible shader composition that give it the leg up, not which version of DX it supports.


    You're talking about DirectX support by linking to an OGL game? And also using a game designed for the GeForce FX to compare the differences between cards that both support much higher supersets of OGL extensions.

    However the difference between 10 and 18 fps would be the difference between playable and unplayable in Oblivion.

    I won't bother with the attacks, but to say that the GF7600 and GF8600 are the same, or even very close, is a mistake IMO, or at least an oversimplification. And the reasons behind the differences are more related to architectural composition, with regard to shader counts and texture support, than to the DX level supported.
     
  36. bmwrob

    bmwrob Notebook Virtuoso

    Reputations:
    4,591
    Messages:
    2,128
    Likes Received:
    0
    Trophy Points:
    55
    Hey, Moidock, glad you made it over here. No doubt I'll be going on vacation when the deletion-happy mod sees my last post, in which case, I should say while I still can, that it's good to see you again. Hope you enjoy your stay here, but be careful with your jokes. :rolleyes:
     
  37. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Ok, does it really matter whether the 8600 is faster than 7600?
    The original point was valid enough, a DX10 card is not *automatically* faster than a DX9 card. If you don't like the _600 cards comparison, we can use 8400 vs 7900. Voila, a DX9 card that beats a DX10 card.

    It was a valid point, that you shouldn't automatically assume that the card is the fastest on the market, just because it says DX10 on the box.

    Now quit bickering, you babies. (And especially, stop using Toms Hardware as some kind of reliable source to back up your claims. ;))
     
  38. mujtaba

    mujtaba ZzzZzz Super Moderator

    Reputations:
    4,242
    Messages:
    3,088
    Likes Received:
    514
    Trophy Points:
    181
    Well, I was subscribed to the Alky project, and it ended with folks reaching the conclusion that it was impossible to do a proper emulation of DirectX 10, at least with their approach.
    [The new geometry shaders would probably require you to re-implement all the vertex and geometry shading on the CPU]
     
  39. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Yeah, an OpenGL wrapper is the only one that'd be at all practical (because it at least exposes geometry shaders and the other DX10 capabilities, which DX9 doesn't).
    But it'd still be an awful lot of work, and as it stands right now, DX10 games cannot run anywhere other than on Vista, with a DX10 GPU.
     
  40. KernalPanic

    KernalPanic White Knight

    Reputations:
    2,125
    Messages:
    1,934
    Likes Received:
    130
    Trophy Points:
    81
    Maybe you should try playing Oblivion or perhaps something like Bioshock on a 8600m GT DDR3 and a 7600GT Go.
    I have very little doubt that the difference will be quite obvious and not just in "numbers" as you claim.

    The difference between 30fps avg and 39fps avg is QUITE noticeable for pretty much everyone.

    I noted a range... one of the tests (1280x1024) showed 6 fps for the 7600 and 18 fps for the 8600... I suppose I might have rounded, since he lists decimal places, but quite frankly, are you telling me you cannot tell the difference between 6-7 fps and 18 fps?


    Only I quoted specific data that wasn't from their marketing people.
    It's from the actual fillrate and memory bandwidth of their cards, taken from the reference design.

    I think I may have overstepped your capabilities by mentioning things like fillrate and memory bandwidth. Perhaps you should try some reading so you understand.

    I said from 30-200%... please actually read the post next time.
    The 6800 is an enthusiast board, and in that example it can get close... but in other examples it gets blown away... note that the 6800 is not the 7600.

    Others have already killed this... a 30 fps average means your minimum is closing in on visibly choppy. A 39 fps average more than likely brings your minimum frame rate into the "smooth" zone.


    Let's talk about what actually happened...

    I pointed out you were wrong and showed why I believed this using the stats of the cards, the results of benchmarks done in comparison of the cards and just plain logic.

    I asked you to amend your statement and admit you were wrong because your post is misleading.

    What I attacked was your false statement which is right here in front of everyone.

    You tried to attack me personally... You have no idea what drugs I take or what medical conditions I have or do not have.


    Let's step back and take a look at where we are.

    This board is about educating people about laptops and, by proxy, computers in general. Your post is provably incorrect, and you have been admonished by more than one poster for your conclusion about the 7600 and 8600.

    The point that "being a DX10 card does not necessarily mean it is faster" is actually correct in some cases... but your example is quite frankly false.

    Maybe simply admitting that statement to be false and editing the posts to say "for instance, 8400 cards are DX10 cards but not superior to all 7x00 series cards", or perhaps mentioning the basics of Nvidia's naming scheme, would be a better solution instead of making up something about someone you don't know?
     
  41. Otter

    Otter Notebook Consultant

    Reputations:
    85
    Messages:
    105
    Likes Received:
    0
    Trophy Points:
    30
    DX10 is a farce intended to add value to Vista.

    There is nothing in DX10 that is not achievable by DX9 hardware. It just requires software to map those instructions to the existing hardware.

    DX10 runs fine on Windows XP; Microsoft simply chooses not to release it.

    The hype that DX10 requires Vista exists solely because MS will not release a driver for XP.

    Having worked with Development copies of Longhorn and Vista, I guarantee you that DX10 is 100% a business decision and has nothing to do with graphics.

    That being said, I don't think MS is ever going to allow DX10 to run on XP, but I also don't think game studios are going to move to Vista for another 2 years, and I think OEM XP will be sold for another 1-2 years.
     
  42. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Not quite true. There are some hardware capabilities that exist in DX10 hardware but not in DX9 hardware. Geometry shaders are the typical example, but there are others (integer support on the GPU, and longer shaders, for example).

    So no, there are things in DX10 that are not achievable by DX9 hardware. Of course, they could be software emulated... if you don't mind lowering your performance by 98%.

    Perhaps it does, perhaps it doesn't. Of course it could run just fine on XP, if it was ported, but unless you have some Microsoft insider information, you don't know that such a port currently exists. You're right though, that Microsoft chooses not to release it.

    How are development copies of Longhorn and Vista related to this?
    There are several graphics-related improvements in DX10, as I mentioned above. It is a business decision not to enable these things on XP, yes, but the existence of DX10 in the first place is obviously because of the graphics features it enables.
     
  43. Moidock

    Moidock Notebook Consultant

    Reputations:
    527
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    Finally, a nicer, more polite response. I was simply pushing your buttons: you could have pointed out any corrections you had in a more polite way instead of saying I was talking through the wrong hole. If somebody is an ass to me, then I'm an ass to them.

    We're not all telepathically connected to know what others do. I'm not gonna shoot myself over somebody in a forum who says that I suck. I will correct it when I get back from work.

    Forcing me to acknowledge that I'm wrong isn't going to get you a prize. If you're ever wrong, I wonder how long it would take for you to admit it.

    I never said my word is what runs the industry. If somebody has any more corrections or suggestions to make please do so and I will post them right up. :D
     
  44. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    I'm pretty sure this is an urban legend that's largely gained strength on the backs of DX9 games that were artificially tied to DX10. And that makes sense, given how we have virtually no ACTUAL DX10 games.
     
  45. djembe

    djembe drum while you work

    Reputations:
    1,064
    Messages:
    1,455
    Likes Received:
    203
    Trophy Points:
    81
    we're not? :eek: Rats! *retries telepathic omniscience*
     
  46. Otter

    Otter Notebook Consultant

    Reputations:
    85
    Messages:
    105
    Likes Received:
    0
    Trophy Points:
    30
    That isn't true. Dedicated 'shader' units were deprecated several hardware generations ago (with the G80 cores); shaders were generalized into stream processors, which are basically high-speed, specialized math processors. You write little 'programs' that you can then apply to whatever you want. A very common use is to apply effects to pixels, hence a 'pixel shader', but in reality they all use the exact same hardware; the drivers just simplify their use.

    What is a geometry shader? It is still just a 'shader' program that runs on one of these stream processors. There is no hardware difference. The only difference is the API used to write that program; DX10 provides a much-needed update to the API to take advantage of the more advanced stream processor features that are present.

    I do suspect that hardware differences may exist because Vista wants to 'protect' each pixel, so yes, there may be additional security devices on the chip, but nothing related to graphics. It's all just business posturing.

    As for DX10 on XP, there is no myth: several universities had it working fine before Microsoft threatened them out of existence (it violated the beta testing agreement, and the schools would face multi-million dollar lawsuits plus pulled funding if they continued).
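    The "a shader is just a small program run over many data elements" idea above can be sketched in plain software. This toy Python 'pixel shader' is purely illustrative (the function name, gain value, and pixels are invented for the example) and deliberately sidesteps the hardware-capability question the other posters dispute:

    ```python
    def brightness_shader(pixel, gain=1.5):
        # A toy 'pixel shader': a pure function applied independently
        # to each pixel, the way a stream processor runs one small
        # program over many data elements.
        r, g, b = pixel
        clamp = lambda v: min(255, int(v))  # keep channels in 0-255
        return (clamp(r * gain), clamp(g * gain), clamp(b * gain))

    # A tiny 'framebuffer' of RGB pixels; the shader maps over it.
    framebuffer = [(10, 20, 30), (200, 200, 200)]
    shaded = [brightness_shader(p) for p in framebuffer]
    print(shaded)  # [(15, 30, 45), (255, 255, 255)]
    ```

    Whether every DX10 feature (geometry shaders, GPU integer ops, longer programs) really maps onto DX9-era silicon this cleanly is exactly the point contested in the surrounding posts.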
     
  47. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    This entire thread is rubbish.

    The 8 series vs the 7 series is a performance increase of up to 300% in some games; the added DX10 capability is NOT why most people buy them.

    A 7600GT vs an EQUALLY CLOCKED 8600GT still shows a massive difference.
    Even in older games like CS:S with HDR, the FPS difference is about 40-150%+ for the 8 series over the 7 series, and CS:S does NOT use DX10.
    The same is true throughout the ENTIRE 8 series and 7 series cards and their AMD counterparts. An 8300GT is about 100-300% faster than a 7300GT in shader-intensive games.

    DX10 uses a MUCH different coding method than DX9 and works exclusively with Vista; Microsoft has made this very clear. Any claim of DX10 on XP is completely false, and attempting to run a game in DX10 mode on XP will fail, as all newer games with DX10 auto-detect your OS and know what to allow and what to disallow.

    If someone DID happen to create DX10 for XP, they would be required to completely modify the entire OS, which defeats the point, since you would have to run an illegal version of Windows XP to use it.
     
  48. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    I actually don't believe this, and the reason is because simply put, we haven't seen it. Something like that, especially spread out over several universities as you claim, would be bound to get leaked somewhere.

    I mean, come on. After the Half-Life 2 source leak and the Doom 3 alpha leak, and given how freaking easy it was to get super early beta versions of Vista, you expect us to believe that something as sought after as DirectX 10 for XP existed at one point but had its strings cut? Isn't that awfully clandestine? If DX10 for XP existed, we'd have it. It would be all over The Pirate Bay and the front page of Slashdot.
     
  49. Otter

    Otter Notebook Consultant

    Reputations:
    85
    Messages:
    105
    Likes Received:
    0
    Trophy Points:
    30
    Suit yourself, but there is absolutely no hardware difference between DX10 and DX9. It is for this exact reason that you can put a DX10 card into a Windows XP computer and go about your merry business. Are you seeing all of the bells and whistles? Nope. Why? Because the software has not been adapted for XP: IT IS ALL SOFTWARE.

    DX10 is used by Vista, and the method it uses to draw windows and scroll bars is distinctly different from DX9's, but again this is a software issue; it deals with Windows Presentation Foundation. There is no reason DX10 cannot run on XP; all that was required was to remove the WPF dependency, which is only a small portion of the library.

    Yes, it is sad that many things in the world are killed, but Vista is worth billions to MS, and they will ruin anyone who attempts to get in their way. In research, projects are killed overnight with no reason given or questions asked; it is just how it works.
     
  50. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    I wouldn't argue that DX10 is artificially limited to Vista; it sure as hell wouldn't be the first time.

    I will argue that DX9 and DX10 class hardware is fundamentally different, though, and your argument that you can put a DX10 card into a machine running XP just fine is fundamentally flawed: hardware doesn't break compatibility. This is like saying you can't put a GeForce 2 in an XP machine running DirectX 9.0c because it doesn't support anything higher than DX7 in hardware.

    New hardware just adds new features; in the case of DX10 class hardware, the unified shader units are vastly more programmable than Shader Model 3 parts. Suggesting geometry shaders could run on DX9 class hardware ignores the fact that that hardware had separate pixel and vertex shader units, so a geometry shader would have required its own dedicated unit, which those parts simply don't have.

    What you wind up with when running DX10 class hardware is a GPU that can dynamically allocate shader resources where needed, but is also profoundly useful as a GPGPU.

    Again, I don't disagree that DX10 is artificially limited to Vista. I DO disagree that there is no hardware difference between DX10 and DX9, because an explicit hardware difference is called for in the specification.
     
 Next page →