The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    GPU's that help with physics???

    Discussion in 'Gaming (Software and Graphics Cards)' started by londez, Aug 4, 2006.

  1. londez

    londez Notebook Evangelist

    Reputations:
    114
    Messages:
    602
    Likes Received:
    8
    Trophy Points:
    31
    Does anybody know if ATI or Nvidia have plans to release GPUs that have extra processing components for handling physics??? That would be really nice for games like Crysis.
     
  2. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    There are already separate physics-handling cards.
     
  3. Phil17

    Phil17 Notebook Consultant

    Reputations:
    13
    Messages:
    292
    Likes Received:
    0
    Trophy Points:
    30
    Yes, but they are not from Nvidia or ATI. They are separate cards that only enhance the physics effects, and they are only available for desktops at the moment.
     
  4. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    That's what I meant.
     
  5. TwilightVampire

    TwilightVampire Notebook Deity

    Reputations:
    362
    Messages:
    1,376
    Likes Received:
    0
    Trophy Points:
    55
    Nvidia and ATI are graphics card companies, not physics card companies. They won't be making a physics card. A graphics card doesn't even have an effect on in-game physics. That's pretty much down to your CPU (or the physics card, if supported and present).
     
  6. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Both Nvidia and ATI are working on some physics support, yes. But it'll be limited. As said above, the graphics card is really bad at affecting gameplay, so it'll most likely just be physics-based eye candy they support.
     
  7. londez

    londez Notebook Evangelist

    Reputations:
    114
    Messages:
    602
    Likes Received:
    8
    Trophy Points:
    31
    "A graphics card doesn't even have an effect on in-game physics."

    First off, that statement is completely false:

    " In a system with multiple NVIDIA GPUs, one GPU can be dedicated to physics. In fact there are no technical impediments to multiple cards or even unmatched configurations. All of these configurations work already on NVIDIA: Single GPU, SLI, Mixed GPUs (i.e., not SLI), Multi-GPU Cards, Multi-GPU card plus unmatched GPU for physics.
    "

    http://enthusiast.hardocp.com/article.html?art=MTA5NywxLCxoZW50aHVzaWFzdA==

    Both ATI and Nvidia have run tests in which they were able to dedicate the extra GPUs in CrossFire and SLI systems to physics processing.


    Second, I was just wondering if it is at all possible/beneficial to add a second processing component to a single GPU card to help out with physics.
     
  8. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206

    owned........
     
  9. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Possible, yes. Beneficial? Not really.

    The reason they're working on GPU physics is that the GPU is already pretty well suited for it. That's the only reason. It's a "free ride", so to speak.

    But it's not worth designing logic specifically for this. If you do that, then it'd be more efficient to offload it to a separate chip/card, so it doesn't have to share bus and memory access with the GPU.
     
  10. gethin

    gethin Notebook Evangelist

    Reputations:
    2
    Messages:
    401
    Likes Received:
    0
    Trophy Points:
    30

    Actually, your statement isn't entirely correct - this is a work in progress; currently this cannot be done (neither ATI nor Nvidia have released their physics-handling protocols yet). So for the time being, GPUs do not have any effect on physics. Besides, a much more useful way of handling physics would be to let the second CPU core do it.
     
  11. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Not necessarily. CPUs are hopelessly slow at this stuff. A GPU can run circles around them in these sorts of tasks. If more performance is needed, either the GPU or a dedicated physics chip will have to step in.
    The CPU does have the advantage that it's, well, the CPU, so it's easier to exchange data with the rest of the game, which is a weak point of the GPU. That's why I said above that GPU physics will probably just end up as mostly eye candy: add physics to stuff you can't interact with, so it doesn't affect gameplay, just looks nice.
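    The "eye candy" physics Jalf describes is what's called data-parallel work: every particle's update reads only that particle's own state, so all the updates are independent and can run at once - on a GPU, one thread per particle. A minimal sketch (all names hypothetical, plain Python standing in for what a per-thread GPGPU kernel would do):

    ```python
    # Illustrative sketch, not vendor code: each particle's update depends
    # only on its own position/velocity, so every iteration below is
    # independent work -- exactly the shape a GPU parallelizes well.

    def step_particles(positions, velocities, dt, gravity=-9.81):
        """One semi-implicit Euler step, applied independently to each particle."""
        new_vel = [(vx, vy + gravity * dt) for (vx, vy) in velocities]
        new_pos = [(x + vx * dt, y + vy * dt)
                   for (x, y), (vx, vy) in zip(positions, new_vel)]
        return new_pos, new_vel

    # 10,000 debris particles launched at 10 m/s upward, one 10 ms step
    pos = [(0.0, 0.0)] * 10_000
    vel = [(1.0, 10.0)] * 10_000
    pos, vel = step_particles(pos, vel, dt=0.01)
    ```

    The catch, as noted above, is getting the results back off the card: if gameplay code on the CPU needs each particle's position every frame, the readback traffic eats the gain, which is why non-interactive debris is the natural fit.
    
    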
     
  12. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,914
    Trophy Points:
    931
    Not really; a CPU is much better at implementing more long-term, matrix-based physics. The reason we are seeing physics on the GPU is that the chips have only recently become highly programmable, allowing you to change their function.
     
  13. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    And because the GPU can perform roughly 20 times as much work on such parallel data as a CPU can.
     
  14. escapedturkey

    escapedturkey Notebook Guru

    Reputations:
    1
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I hope the second core of the new CPUs will be used more efficiently for such things as physics -- i.e. perhaps Havok will enhance their product to utilize the second core better. I know it's not as good as a dedicated physics component, but it sure is an economical way to get enhanced gameplay. :)
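    The structure escapedturkey is hoping for can be sketched in miniature: the main thread hands physics jobs to a worker that could run on the second core, then carries on with rendering while the worker integrates. A hedged sketch (hypothetical job format, not Havok's actual API), using Python's standard threading and queue modules:

    ```python
    # Sketch of second-core physics offload: a worker thread integrates
    # bodies while the main (render) thread stays free. Job/result formats
    # are made up for illustration.
    import threading
    import queue

    physics_jobs = queue.Queue()
    physics_results = queue.Queue()

    def physics_worker():
        """Runs on the 'second core': integrate each body handed to it."""
        while True:
            job = physics_jobs.get()
            if job is None:          # shutdown sentinel
                break
            body_id, y, vy, dt = job
            vy += -9.81 * dt         # apply gravity
            y += vy * dt             # semi-implicit Euler position update
            physics_results.put((body_id, y, vy))

    worker = threading.Thread(target=physics_worker)
    worker.start()

    # Main thread: hand off one body (100 m up, at rest, one 60 Hz tick),
    # then it would be free to render while the worker computes.
    physics_jobs.put((0, 100.0, 0.0, 0.016))
    physics_jobs.put(None)
    worker.join()
    body_id, y, vy = physics_results.get()
    ```

    The queues are the synchronization point: the renderer only ever sees completed states, so the two cores never fight over a body mid-update.
    
    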
     
  15. gethin

    gethin Notebook Evangelist

    Reputations:
    2
    Messages:
    401
    Likes Received:
    0
    Trophy Points:
    30
    Go and read the new report on physics at Tom's Hardware. It will show you that GPUs are good at basic physics, but aren't as flexible or as powerful as a CPU for working out complicated physics.
     
  16. jeffmd

    jeffmd Notebook Evangelist

    Reputations:
    65
    Messages:
    554
    Likes Received:
    20
    Trophy Points:
    31
    What is complicated physics vs. simple physics? These are games, not science projects. We don't exactly need Big Blue here. ;)
     
  17. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Ah, you've got to love it when people use an article to back up a claim without linking to the friggin' thing... ;)

    Which just means one thing: at the moment, GPUs aren't as flexible as CPUs. We knew that. But they're a lot faster at what they *can* do. And GPUs are only going to get more flexible over time. In a few years, they'll be able to handle much more complex physics.

    Anyway, I found the article, and I assume you're talking about these quotes:
    I don't know what he's been smoking, but the CPU is not the "master of matrix operations" *at all* (it's quite inefficient at them), and he seems to believe that dual-core allows the CPU to close the performance gap with the GPU. Well, it won't. Once we get 30-core CPUs, they'll have *roughly* the same raw performance as *current* GPUs.

    But yes, CPUs are a lot more flexible, and his point of view concerns scientific simulations, where performance is not the issue and some problems are hard or impossible to solve on current GPUs. But moving them to the CPU isn't a viable solution for gaming, where you actually need decent performance.

    Also, LOL, one of the guys they interviewed in the article (Kenny Erleben) is a professor at my university. I followed one of his classes last semester... Cool guy. :D