The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Physics card - worth it?

    Discussion in 'Gaming (Software and Graphics Cards)' started by SGKoneko, Mar 8, 2008.

  1. SGKoneko

    SGKoneko Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I'm seeing *some* laptops with physics cards available.
    The Sager that I'm looking at does not offer it.
    The cards are like $150.
    I'm just wondering if it really matters that much. I know it's very useful for games like the new Unreal Tournament, but just how much of an advantage is it?
     
  2. Crimsonman

    Crimsonman Ex NBR member :cry:

    Reputations:
    1,769
    Messages:
    2,650
    Likes Received:
    0
    Trophy Points:
    55
    Really, it makes no difference. There are so few games that can actually use one well that it's not even worth it.
     
  3. revoletion

    revoletion Notebook Consultant

    Reputations:
    6
    Messages:
    237
    Likes Received:
    0
    Trophy Points:
    30
    It only matters if the game supports it. I personally haven't seen many games that do.

    It's one of those things that only games made for super high-end PCs usually have, and laptops are usually not high-end PCs. Physics cards haven't seen much popularity, so not many games use them. For $150 I would get a better graphics card or a processor upgrade. CPUs are needed for much more than physics cards are.
     
  4. Shimon

    Shimon Notebook Geek

    Reputations:
    3
    Messages:
    75
    Likes Received:
    0
    Trophy Points:
    15
    I found this, but I don't see any official confirmation released by Nvidia.
     
  5. Pai

    Pai Notebook Evangelist

    Reputations:
    464
    Messages:
    657
    Likes Received:
    0
    Trophy Points:
    30
    I agree, you are probably better off just upgrading that CPU, assuming it can be upgraded and there are still higher models that you may upgrade to.
     
  6. Gophn

    Gophn NBR Resident Assistant

    Reputations:
    4,843
    Messages:
    15,707
    Likes Received:
    3
    Trophy Points:
    456
    nope, PhysX has now been phased out... since Nvidia acquired Ageia.

    Only a handful of games even showed any form of improvement with PhysX:
    - City of Heroes/Villains
    - Ghost Recon Advanced Warfighter 1 & 2
    - Unreal Tournament 3

    But having a quad-core will make up for it, since some of these games are programmed to use an extra core for physics calculations.
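
    To illustrate what "an extra core for physics" means in practice, here is a minimal C++ sketch of a game loop that runs its physics step on a dedicated worker thread while the main thread handles everything else. All names and timings are hypothetical, for illustration only, not taken from any of the games above.

        // Minimal sketch: offloading the physics step to a second core.
        // physicsStep() is a hypothetical stand-in for real simulation work.
        #include <atomic>
        #include <chrono>
        #include <iostream>
        #include <thread>

        std::atomic<bool> running{true};
        std::atomic<int>  physicsFrames{0};

        void physicsStep() {
            // Stand-in for collision detection, integration, etc.
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
            physicsFrames.fetch_add(1);
        }

        int main() {
            // The physics loop lives on its own thread, so a spare core
            // absorbs the work a dedicated PPU would otherwise do.
            std::thread physics([] {
                while (running.load()) physicsStep();
            });

            // Main thread stands in for the render/input loop.
            std::this_thread::sleep_for(std::chrono::seconds(1));
            running.store(false);
            physics.join();

            std::cout << "physics frames simulated: " << physicsFrames.load() << "\n";
        }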
     
  7. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Nvidia cards with stream processors will be able to run PhysX through their CUDA program... so there is no need to buy one... when CUDA PhysX is out, just get an SLI setup.
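
    For a sense of why stream processors fit this job: a physics step is often the same small update applied independently to thousands of bodies, which is exactly the shape of work CUDA spreads across a GPU. Here is a minimal C++ sketch of such an update as a plain CPU loop; in a CUDA kernel, each iteration would become its own GPU thread. The particle setup is invented for illustration.

        // Minimal sketch of a data-parallel physics update (hypothetical data).
        // In CUDA, the loop body would become a kernel with one GPU thread
        // per particle; here it is an ordinary CPU loop for clarity.
        #include <cstdio>
        #include <vector>

        struct Particle { float x, y, vx, vy; };

        // Advance every particle one Euler step under gravity.
        void integrate(std::vector<Particle>& ps, float dt) {
            const float g = -9.81f;            // gravity, m/s^2
            for (auto& p : ps) {               // each update is independent
                p.vy += g * dt;
                p.x  += p.vx * dt;
                p.y  += p.vy * dt;
            }
        }

        int main() {
            std::vector<Particle> debris(10000, {0.f, 100.f, 5.f, 0.f});
            for (int step = 0; step < 60; ++step)   // one second at 60 Hz
                integrate(debris, 1.f / 60.f);
            std::printf("first particle: x=%.1f  y=%.1f\n",
                        debris[0].x, debris[0].y);
        }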
     
  8. knightingmagic

    knightingmagic Notebook Deity

    Reputations:
    144
    Messages:
    1,194
    Likes Received:
    0
    Trophy Points:
    55
    Absolutely not.
     
  9. Xirurg

    Xirurg ORLY???

    Reputations:
    3,189
    Messages:
    7,375
    Likes Received:
    3
    Trophy Points:
    206
    *off topic*
    any news about the 9800GX2?
     
  10. JCMS

    JCMS Notebook Prophet

    Reputations:
    455
    Messages:
    4,674
    Likes Received:
    0
    Trophy Points:
    105

    Although it is not THE PhysX, every DX10 card has a physics accelerator, since it is a new function of DX10 to use the GPU for physics rendering. I know that I can turn PhysX on in UT3 at least.
     
  11. Beatsiz

    Beatsiz Life Enthusiast

    Reputations:
    95
    Messages:
    1,411
    Likes Received:
    0
    Trophy Points:
    55
    PhysX Ageia X100 is unfortunately the latest and probably last physics card...

    BUT... however... advanced users may need it or so... it was just advertising for the XPS M1730... it got me, so yeah ;)

    DX10 cards can run the physics in their shaders or whatsoever...

    And processors nowadays can handle physics just fine... since C2D is now standard... and quad mobile is right around the corner... power power power takes the physics, hooray :D
     
  12. gspot333

    gspot333 Notebook Consultant

    Reputations:
    1
    Messages:
    158
    Likes Received:
    0
    Trophy Points:
    30
    NOT WORTH IT, better off upgrading CPU, GPU, and RAM
     
  13. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Not worth it, since only three games benefit from the hardware physics card, and as others have stated, Nvidia bought out the company making the physics card so that they could offer software/driver support for the function on Nvidia 8-series cards.
     
  14. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    Yeah, exactly! When nVidia releases those drivers for the current 8 series, the PhysX card will just be software to you ;) (as long as you have an 8-series GPU).
     
  15. MrFong

    MrFong Notebook Evangelist

    Reputations:
    57
    Messages:
    654
    Likes Received:
    0
    Trophy Points:
    30
    Like everyone else in this thread, I'm gonna go ahead and say it's absolutely not worth the money.
     
  16. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    DX10 does not have 'DirectPhysics', the proposed physics API; any DX9 card has been able to do physics calculations since the R9700 days.

    As for PhysX, even the games that did support it are limited; UT3's support is a tiny add-on segment. A more global API or push is required for stronger support throughout; hopefully that will make it worth programming for. And while nV bought the PhysX/Novodex engine, it doesn't power enough games to matter just yet. But the future is brighter than when it was just under Ageia, with Havok dragging their heels on FX.

    Right now PhysX hardware is close to useless and in a short time it will simply be taking up CPU cycles and power.

    Better to get more GPU power, which has the flexibility of either giving you great graphics features or some physics assist, without locking you into a product with no other utility.
     
  17. Iceman0124

    Iceman0124 More news from nowhere

    Reputations:
    1,133
    Messages:
    3,548
    Likes Received:
    0
    Trophy Points:
    105
    The idea was sound, but the implementation was horrid and the initial asking price was way too high. If it had come out around $75 US, and five top-tier games from the last three years had strong support for it, we'd all have some sort of physics add-on. Instead it came out stupidly expensive, with no real use whatsoever for the absurd price except the hope that it would wind up getting support, which it didn't, because no one bought into it. Software developers aren't going to spend a whole lot of time and effort on a sure-fire maybe; look how long it takes to get titles that truly take advantage of the latest video cards, and when titles do take advantage of the hardware, the initial versions of said hardware usually aren't powerful enough anyway, and refresh or even next-gen parts are out or about to be.
     
  18. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    It's been around for years and barely any game supports it. That (to me) means it's worthless.
     
  19. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,084
    Trophy Points:
    931
    These were a gimmick from the start, and nothing more. Even on paper . . . do we really need a separate piece of hardware to calculate physics? No. Nvidia is going to do it right on the graphics card:
    http://techreport.com/discussions.x/14147
    I always found it hard to believe that a Core 2 Duo couldn't do the physics that 'required' a PhysX chip. :rolleyes:
     
  20. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Well, it could do more than even a few Core Quads in certain situations, but the thing is that increasing the physics load also increases what you do with that information, and thus still stresses the CPU. And really, the most valuable part is the non-visual, non-shiny stuff, like realistic bullet drop or realistic explosion calculation: is it really appreciably better than a shortcut parabolic equation that doesn't account for wind, true gravity, deflection, interactions, etc.? Do you need 100-stage building collapses to make it closer to realistic, or 10 to make it look just better than a pre-determined explosion effect?
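
    To make the bullet-drop point concrete, here is a minimal C++ sketch comparing the shortcut parabola against a stepped integration with simple air drag. The muzzle velocity and drag constant are invented numbers, purely for illustration:

        // Minimal sketch: "shortcut" parabolic drop vs. integration with drag.
        // v0 and k are made-up values, not real ballistics data.
        #include <cstdio>

        int main() {
            const double g = 9.81;      // gravity, m/s^2
            const double v0 = 400.0;    // hypothetical muzzle velocity, m/s
            const double range = 400.0; // distance to target, m
            const double k = 0.3;       // made-up linear drag constant, 1/s

            // Shortcut: constant speed, no drag -> a simple parabola.
            double tFlat = range / v0;
            double dropFlat = 0.5 * g * tFlat * tFlat;

            // Stepped version: drag slows the bullet, so gravity acts longer.
            double x = 0, y = 0, vx = v0, vy = 0;
            const double dt = 1e-4;
            while (x < range) {
                vx += -k * vx * dt;
                vy += (-g - k * vy) * dt;
                x  += vx * dt;
                y  += vy * dt;
            }

            std::printf("shortcut drop:  %.2f m\n", dropFlat);
            std::printf("with drag drop: %.2f m\n", -y);
        }

    Even this toy version lands the round noticeably lower than the shortcut predicts, which is the kind of gap dedicated physics hardware was pitched to close.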

    Even if it were a bit cheaper, the problem remains: even if you pay just $75 for it, will you get much benefit if it's wicked awesome at physics but you essentially still need a fast CPU to make those calculations more integral? And in the games it doesn't work in, it's a $75 IR lightbulb, consuming a low level of energy and radiating a little bit of heat.

    At least for graphics cards, either shader units or extra cards can find additional utility when not being used for physics; spending the money to go from a GF8600 to a GF8800, be it mobile or desktop, means you have access to more global power, just like Intel's Havok solution will mean your access to more generalized computing power will also have benefits elsewhere.

    The dedicated PPU was doomed to fail from the start unless it got to the level of being a motherboard-integrated part, with stronger adoption and the ability to show off a killer app.

    Now the question is who will open up their physics sandbox first to the greater community to try and make it the standard. Intel/Havok have more clout and power, so if things stay the same they will likely still dominate and be the de facto standard to come. However, if nVidia opens up their physics acceleration beyond just their own CUDA and their own hardware, then they could challenge the normal way of doing things and have a better shot at becoming the standard way for devs to do things.
    Right now CPUs have the advantage of being both easier to code for and the de facto standard; remaining a closed system won't help change that.