The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    DX10 NOT Obsolete

    Discussion in 'Gaming (Software and Graphics Cards)' started by The Forerunner, Aug 17, 2007.

  1. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
  2. Necromas

    Necromas Notebook Deity

    Reputations:
    198
    Messages:
    730
    Likes Received:
    0
    Trophy Points:
    30
    A little late on the news there, but I guess the other topics dropped off the front page.
     
  3. adinu

    adinu I pwn teh n00bs.

    Reputations:
    489
    Messages:
    2,842
    Likes Received:
    0
    Trophy Points:
    55
    The only thing DX10.1 provided was the use of 4xAA without a performance loss. But seeing as any recent game can already be set to run 4xAA, it doesn't really matter.
     
  4. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    There you go, that's what everyone was complaining about.
     
  5. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Yeah, I was just trying to reassure the people who were worrying; no one mentioned these official statements from Microsoft in the other threads.
     
  6. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    But it was the same with DX9. And DX8. And DX10.1 is backwards compatible with DX10, so if your card doesn't support 10.1, it'll just use the DX10 features.
     
  7. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Nah, just in response to the comments claiming that your 8-series card is already useless, which some people actually believed.
     
  8. Necromas

    Necromas Notebook Deity

    Reputations:
    198
    Messages:
    730
    Likes Received:
    0
    Trophy Points:
    30
    Long story short: your DX10 cards are still just as good as they were, and you won't be seeing any incompatibility error messages anytime soon.
     
  9. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Thank you.
    I think my head was about to explode from people whining that they were so unfortunate to have the best GPU on the market, because it wouldn't last forever...

    Thank you for this pocket of sanity.
    If anyone still feels unfortunate to have an 8800, PM me and I'll give you my address. You may mail it to me, and I'll send you my 6800 in return. ;)
     
  10. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Lol. :D Yeah you obsolete claimers!
     
  11. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    I want an 8800.
     
  12. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    I am glad I didn't have the money for an 8800GTX though; from what I can gather from another forum, a 9800GTX is due near the end of the year and will be 2x stronger than the 8800GTX for the same price.

    We shall wait and see :p My C90 took my desktop upgrade funds, and I'm so busy these days I haven't played a game on my desktop in a good 4 months anyway... Oh, how I miss my Battlefield 2142.
     
  13. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    I'll likely get a 9900GTS or something equivalent when I buy a desktop (late 2008 or early 2009).
     
  14. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    How about a 4MB 3D accelerator? It's an artifact for sure :rolleyes:
     
  15. Necromas

    Necromas Notebook Deity

    Reputations:
    198
    Messages:
    730
    Likes Received:
    0
    Trophy Points:
    30
    Hasn't technology always worked this way?
     
  16. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Yes.
    And my prediction is that next year, another GPU will be launched which is roughly twice as fast as a 9800. And the year after... well, you get the idea.

    In other words, nothing has changed. GPUs have roughly doubled in speed every year since, oh, I don't know, the Voodoo 2, possibly.
     
  17. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Well, the speedup might be bigger if they decide to go multicore.
     
  18. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Jalf, since my 8800GTX is obsolete, I can send you mine ;)
     
  19. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    GPUs are already essentially multicore, and they have been for a long time.
     
  20. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    In what way?
     
  21. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Tons of processors on a single chip. ATI uses something like 100+ processors on a single chip for shading; nVidia uses something like 30 more powerful (individually) processors.

    GPU architecture is inherently "multicore" in the processor sense of the word, and it's been that way for a long time.
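
    (A rough illustration of that point, as a sketch rather than any vendor's actual hardware: per-pixel shading work is independent, so it splits naturally across however many small processors a chip provides. The toy shade() function below is made up for the example.)

    # Rough sketch: shading is "embarrassingly parallel" because every pixel
    # can be computed independently, so the work maps naturally onto dozens
    # or hundreds of small processors.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 320, 240

    def shade(pixel):
        # Toy "shader": a made-up gradient standing in for real per-pixel math.
        x, y = pixel
        return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

    if __name__ == "__main__":
        pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
        with Pool() as pool:                       # split across available cores
            framebuffer = pool.map(shade, pixels)  # evaluation order doesn't matter
        print(len(framebuffer), "pixels shaded independently")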
     
  22. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Well, think about having two of the chips. It should be MUCH more effective than SLI or Crossfire, if designed properly. And they would have a use for putting 512 MB of RAM on a mid-range card, although it would need a 256-bit bus.
     
  23. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    lol.

    Well, let me ask you to clarify: are you referring to desktops or laptops?
     
  24. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Well, desktops, actually. Heat issues and whatnot. It should still be less heat than two cards in SLI, so gaming DTR laptops could likely use them as well.
     
  25. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    And much less efficient than what they're doing now.
    So no, don't count on it.

    Why? We're talking about two GPUs on one card. And since the GPUs themselves generate almost all the heat, you'd see pretty much twice the amount of heat with two GPUs on a card.

    As masterchef341 said, GPUs are already, and have been for a decade, more "multicore" than CPUs could ever dream of.

    The multicore implementation you see on CPUs is not efficient. It's the best that can be done with CPU workloads that are not particularly parallelizable.

    For GPUs, why settle for this second-best solution when you have the far more efficient one at hand, and have been using it for years?
     
  26. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Yeah, OK. I just went with the simplistic "more is better" way of thinking.
     
  27. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    The future is perhaps "Fusion" technology. It makes sense, but let's just see how well it gets implemented.
     
  28. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    And GPGPUs.
     
  29. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Yeah, I was going to explain this with a story.

    First (not part of the story): the GPU workload happens to be very parallel by nature. That's why they jumped the gun and started parallelizing everything internally, whereas CPU work is very linear. That is why CPUs are only now starting to become multicore (and it's a huge hurdle for developers to take advantage of it).

    So, back to the story.

    Think of the GPU as a car circa 1890. The car has a long way to go, upwards and onwards, in its internal design. We want to be as efficient as possible in getting passengers to their destination, making them happy, that sort of thing.

    We have two choices to increase the power of the car. We can strap multiple engines to the car... 2 engines, maybe 4, or even 8.

    Or we could just refine the engine itself.

    If you want to put two chips on the card, it's sort of like two engines in a car. It could work, but there is a serious limit on your return (a similar problem to SLI).

    Let's just say SLI and multi-chip cards are equivalent. Who would buy 10x 8800GTXs for future-proof rendering when DX11 is right around the corner and chips double in power every year? Literally, in 3 or 4 years your 10x 8800GTX render-farm madness will be eclipsed by a single card (a quick sanity check of that figure is sketched below).

    They still have a lot of room to improve the internal design. That's why they do it. When that is no longer feasible (we are approaching the point where you can't manufacture on a smaller process size anymore with conventional methods; I forget when that is supposed to hit... a few years, maybe), they will probably do another revision or two of the design, and then come up with another means of increasing performance.
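
    (A quick back-of-the-envelope check of that "3 or 4 years" figure, assuming the thread's own rule of thumb that single-GPU performance doubles every year; the numbers are illustrative, not a forecast.)

    # How long until one card eclipses a hypothetical 10x 8800 GTX setup,
    # assuming performance doubles every year (the rule of thumb used above)?
    target = 10.0   # 10 cards' worth of today's performance
    single = 1.0    # one current-generation card
    years = 0
    while single < target:
        single *= 2.0   # one yearly doubling
        years += 1
    print(years, "years")   # prints 4, since 2**4 = 16 > 10 while 2**3 = 8 falls short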
     
  30. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    The future of what, though?
    It'll be a *very* long time before this makes sense for high-end systems.

    In low-power or simply cheap systems, it's a brilliant idea, and AMD is going to provide a kick-ass product when it comes out.

    The reason is simple: GPUs are friggin' huge, and they need a ton of memory bandwidth. On the other hand, they're not that reliant on a fast bus directly to the GPU.

    Now, if you were to merge a GPU and CPU onto a single die, here's what would change compared to the traditional setup:
    - Communication between CPU and GPU would get far faster (obviously, since they're on the same die). But we just said this isn't so important.

    - Memory bandwidth would decrease. (Not only would it have to be shared with the CPU, but CPUs also typically use a vastly slower memory bus. You never see 512-bit buses between system RAM and the CPU, and you don't see 2GHz DDR3 on system memory either, while that's pretty much what we get on a high-end GPU.) And oops, didn't we just say GPUs are very reliant on memory bandwidth? (Some rough numbers are sketched below.)

    - The number of transistors dedicated to the GPU would go down. (Simple: at the moment, GPUs contain something like twice as many transistors as CPUs. There's a practical limit to how big dies can be made, and GPUs are hovering right at that limit, probably a bit above. That is, it'd be impossible to make GPUs at "normal" CPU-like prices, or in "normal" CPU-like quantities; they're just too big.) Now, if they had to be integrated into a CPU, the die couldn't really afford to grow beyond what a high-end GPU is today (and it'd probably have to shrink, even), and it would have to share that space with the CPU.

    And of course, it'd be rather tricky to upgrade your GPU without the CPU... ;)
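
    (To put rough numbers on the bandwidth point above: memory bandwidth is just bus width times effective transfer rate. The figures below are ballpark examples for the era, not exact specs for any particular part.)

    # Ballpark memory-bandwidth comparison (illustrative numbers only):
    # bandwidth in GB/s = (bus width in bits / 8) * effective transfer rate in GT/s

    def bandwidth_gb_s(bus_bits, gtransfers_per_s):
        return (bus_bits / 8) * gtransfers_per_s

    # High-end discrete GPU of the time: wide dedicated bus, fast GDDR.
    gpu = bandwidth_gb_s(bus_bits=384, gtransfers_per_s=1.8)      # ~86 GB/s

    # Typical dual-channel system memory: 128 bits effective, DDR2-800.
    system = bandwidth_gb_s(bus_bits=128, gtransfers_per_s=0.8)   # ~13 GB/s

    print(f"discrete GPU: ~{gpu:.0f} GB/s vs shared system RAM: ~{system:.0f} GB/s")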
     
  31. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    That makes sense. Thanks for the explanations, Jalf and Masterchef.
     
  32. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    I meant farther down the road, in the distant future. Of course Fusion won't be able to put up a fight against discrete GPUs right away; it's a fairly new concept. It is a start, though.

    With Fusion you're shifting the graphics from the northbridge to the CPU, thereby increasing the speed between the core and the memory, which also offers lower power consumption.

    Also, the integrated graphics will be able to access memory directly through the processor rather than through the northbridge, which will give it lower latencies and thus more gaming power.

    For now it's a low-end gaming solution which will basically give you more performance while drawing less energy. That's what I meant by it perhaps being a future solution; we will see where it goes from there.
     
  33. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Agreed. Of course, some 10+ years from now, it might be the standard way to build GPUs (the same way the floating-point coprocessor is integrated into the CPU these days).

    For now, it's definitely a promising replacement for integrated graphics at least, but it won't stand up to discrete GPUs.
     
  34. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Will it be cheaper than making an integrated GPU?
     
  35. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Most likely not. It should be better than the available integrated solutions in terms of power consumption, and the potential for better performance than integrated parts is there. Dedicated will still be the way to go for quite a long time. Years down the road, the Fusion idea will be practical in the high end, as stated above.
     
  36. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    I'd expect so, yes.
    The CPU core might get slightly bigger (but not by much, since some functionality can be shared between the CPU and GPU parts), but on the other hand, you can remove a chip and a *lot* of wiring from the motherboard (both of which are expensive).

    Also, the GPU will be able to use the same memory controller as the CPU, another nice saving. And it won't need dedicated memory or complex shared-memory solutions at all. All in all, the motherboard will become a lot cheaper (as cheap as mobos without integrated graphics).

    The only possible downside I can see is that the extra complexity on the CPU might drive its cost up a bit, but I think that should be a very minor difference.

    Of course, this is just my guess... We'll see when it actually comes out :D
     
  37. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    I wonder if AMD will manage to take back the performance throne in both the CPU and GPU businesses.
     
  38. knightingmagic

    knightingmagic Notebook Deity

    Reputations:
    144
    Messages:
    1,194
    Likes Received:
    0
    Trophy Points:
    55
    I wish we just had two chips: the CPU and the general acceleration unit which does visuals, sound, file caching, and physics.
     
  39. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Let's just hope they haven't fallen so far behind in the cycle that they can't catch up and will always stay a step behind. I can't wait till Intel joins the discrete GPU market next year; then the competition heats up.
     
  40. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Yup. Intel needs competition. Otherwise we get a new P4 on our hands. And since nVidia is getting less competition from ATI, Intel will stir up the market.
     
  41. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Nvidia-Intel merger anyone? Or perhaps the less popular Nvidia-IBM merger?
     
  42. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Won't be allowed. Anti-trust laws.
     
  43. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Well, nVidia-IBM rumors have been looming for a bit now. How so, though? It's not really monopolization, and it's not like any real predatory pricing could occur between them.
     
  44. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Not between them. But Intel-nVidia is definitely anti-trust worthy.
     
  45. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    If DAAMIT can merge, why not Intel-nVidia?
     
  46. System64

    System64 Windows 7 x64

    Reputations:
    94
    Messages:
    1,318
    Likes Received:
    0
    Trophy Points:
    55
    That may have happened to AMD and ATI; they received a letter regarding an anti-trust possibility.
     
  47. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    Because today there are three graphics vendors: ATI (AMD), nVidia, and Intel. If the two largest, Intel and nVidia, were to merge, they would have an overly dominating share of the GPU market.

    AMD and ATI weren't really in the same market.
     
  48. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Ah that makes sense. Fine nvidia-ibm then!
     
  49. Scavar

    Scavar Notebook Evangelist

    Reputations:
    50
    Messages:
    498
    Likes Received:
    0
    Trophy Points:
    30
    I've heard the nVidia-IBM rumors, but I highly doubt it. It's clear that the DAAMIT merger hurt them for a bit, and nVidia doesn't really need to merge. They already have excellent manufacturing plants, they're doing well money-wise on their own, and, well, I don't know that IBM is all that appealing.

    Though a Cell PC processor might be interesting with a 9800... IBM was one of the companies that worked on that with Sony, wasn't it?
     
  50. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Yeah, IBM worked on the Cell with Sony. And yeah, I was just kidding around about the merger; nVidia is standing strong on their own two feet right now.
     