I'm seeing *some* laptops with physics cards available.
The Sager that I'm looking at does not offer it.
The cards are like $150.
I'm just wondering if it really matters that much. I know it's very useful for games like the new Unreal Tournament, but just how much of an advantage is it?
-
Crimsonman Ex NBR member :cry:
Really, it makes no difference. There are so few games that can actually use one well that it's not even worth it.
-
It only matters if the game supports it. I personally haven't seen many games that do.
It's one of those things that only games made for super high-end PCs usually have, and laptops are usually not high-end PCs. Physics cards haven't seen much popularity, so not many games use them. For $150 I would get a better graphics card or a processor upgrade. CPUs are needed for much more than physics cards -
I found this, though I don't see any official confirmation released by Nvidia.
-
Nope, PhysX cards have now been phased out... since Nvidia acquired Ageia.
Only a handful of games even showed any form of improvement with PhysX:
- City of Heroes/Villains
- Ghost Recon Advanced Warfighter 1 & 2
- Unreal Tournament 3
But having a quad-core will make up for it, since some of these games are programmed to use an extra core for physics calculations.
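For what it's worth, pushing physics onto a spare core mostly just means running the simulation step on its own thread while the render loop keeps going. A bare-bones C++ sketch of the idea (the numbers and structure are made up for illustration, not code from any of those games):

```
// Hypothetical sketch: step a toy physics simulation on its own thread
// while the main thread keeps "rendering". Not from any real engine.
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

struct Particle { float x, y, vx, vy; };

int main() {
    std::vector<Particle> particles(10000, {0.0f, 100.0f, 5.0f, 0.0f});
    std::mutex m;
    bool running = true;   // guarded by m

    // Physics worker: fixed ~60 Hz timestep, simple Euler integration.
    std::thread physics([&] {
        const float dt = 1.0f / 60.0f;
        for (;;) {
            {
                std::lock_guard<std::mutex> lock(m);
                if (!running) break;
                for (auto& p : particles) {
                    p.vy -= 9.81f * dt;  // gravity
                    p.x  += p.vx * dt;
                    p.y  += p.vy * dt;
                }
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    });

    // Main thread: pretend this is the render loop reading the latest state.
    for (int frame = 0; frame < 60; ++frame) {
        {
            std::lock_guard<std::mutex> lock(m);
            std::printf("frame %d: particle y = %.2f\n", frame, particles[0].y);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    {
        std::lock_guard<std::mutex> lock(m);
        running = false;
    }
    physics.join();
    return 0;
}
```

On a quad-core that worker just lands on an otherwise idle core, which is basically the "free" physics the quad-core argument is about.
-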
Nvidia cards with stream processors will be able to run PhysX through their CUDA platform.. so there is no need to buy one.. when CUDA PhysX is out, just get SLI.
-
Absolutely not.
-
*off topic*
Any news about the 9800GX2? -
Although it is not THE PhysX, every DX10 card has a physics accelerator, since using the GPU for physics rendering is a new function of DX10. I know that I can turn PhysX on in UT3 at least. -
The Ageia PhysX 100 is unfortunately the latest and probably the last physics card...
BUT... advanced users may need it or so... it was really just advertising for the XPS M1730... it got me, so yeah.
DX10 cards can run the physics in their shaders or whatnot...
And processors nowadays can handle physics just fine... since C2D is now standard... and mobile quads are right around the corner... power power power takes the physics, hooray -
NOT WORTH IT, better off upgrading CPU, GPU, and RAM
-
Not worth it, since only three games benefit from the hardware physics card, and as others have stated, Nvidia bought out the company making the physics card so that they could offer software/driver support for the same function on Nvidia 8-series cards.
-
(as long as you have an 8 series GPU).
-
Like everyone else in this thread, I'm gonna go ahead and say it's absolutely not worth the money.
-
TheGreatGrapeApe Notebook Evangelist
As for PhysX, even the games that did support it are limited; in UT3 it's just a tiny add-on segment. A more global API or push is required for stronger support throughout; hopefully that will make it worth programming for. And while nV bought the PhysX/Novodex engine, it doesn't power enough games to matter just yet. But the future is brighter than when it was just under Ageia, with Havok dragging their heels on FX.
Right now PhysX hardware is close to useless, and in a short time it will simply be taking up CPU cycles and power.
Better to get more GPU power, which has the flexibility of either giving you great graphics features or some physics assist, without locking you into a product with no other utility. -
Iceman0124 More news from nowhere
The idea was sound, but the implementation was horrid and the initial asking price was way too high. If it had come out around $75 US, and 5 top-tier games from the last 3 years had strong support for it, we'd all have some sort of physics add-on. But it came out stupidly expensive, with no real use whatsoever for the absurd price except the hope that it would wind up getting support, which it didn't because no one bought into it. Software developers aren't going to spend a whole lot of time and effort on a sure-fire maybe; look how long it takes to get titles that truly take advantage of the latest video cards. And when titles do take advantage of the hardware, the initial versions of said hardware usually aren't powerful enough anyway, and refresh or even next-gen parts are out or about to be.
-
It's been around for years and barely any game supports it. That (to me) means it's worthless.
-
Charles P. Jefferies Lead Moderator Super Moderator
These were a gimmick from the start, and nothing more. Even on paper... do we really need a separate piece of hardware to calculate physics? No. Nvidia is going to do it right on the graphics card:
http://techreport.com/discussions.x/14147
I always found it hard to believe that a Core 2 Duo couldn't do the physics that 'required' a PhysX chip. -
TheGreatGrapeApe Notebook Evangelist
Well, it could do more than even a few Core Quads in certain situations, but the thing is that increasing the physics load also increases what you do with that information, and thus still stresses a CPU. And really the most valuable part is the non-visual, non-shiny stuff. Take realistic bullet drop or realistic explosion calculation: is it really appreciably better than a shortcut parabolic equation that doesn't account for wind, true gravity, deflection, interactions, etc.? Do you need a 100-stage building collapse to make it closer to realistic, or 10 stages to make it look just better than a pre-determined explosion effect?
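To put rough numbers on that, here's a quick back-of-the-envelope C++ comparison (all the constants are invented for illustration, not from any real ballistics model or game): the shortcut parabola versus a simple Euler integration that at least adds drag and a crosswind.

```
// Rough illustration with made-up constants: how much does a "fuller" model
// change a bullet's point of impact versus the shortcut parabola that
// ignores drag and wind?
#include <cstdio>

int main() {
    const float g = 9.81f;       // gravity, m/s^2
    const float v0 = 900.0f;     // muzzle velocity, m/s
    const float range = 500.0f;  // target distance, m
    const float k = 0.0005f;     // drag coefficient per metre (hypothetical)
    const float wind = 5.0f;     // crosswind, m/s

    // Shortcut: constant velocity, pure parabola.
    float t_simple = range / v0;
    float drop_simple = 0.5f * g * t_simple * t_simple;

    // "Realistic": Euler integration with speed-dependent drag and wind push.
    float x = 0.0f, y = 0.0f, z = 0.0f;   // downrange, vertical, windage
    float vx = v0, vy = 0.0f, vz = 0.0f;
    const float dt = 0.001f;
    float t = 0.0f;
    while (x < range) {
        vx -= k * vx * vx * dt;             // drag slows the bullet
        vy -= g * dt;                       // gravity pulls it down
        vz += k * vx * (wind - vz) * dt;    // crosswind drags it sideways
        x += vx * dt;  y += vy * dt;  z += vz * dt;
        t += dt;
    }

    std::printf("shortcut:  %.0f ms flight, %.2f m drop, 0.00 m windage\n",
                t_simple * 1000.0f, drop_simple);
    std::printf("with drag: %.0f ms flight, %.2f m drop, %.2f m windage\n",
                t * 1000.0f, -y, z);
    return 0;
}
```

The exact numbers don't matter; the point is that the fancier answer shifts the impact point somewhat, and whether that difference is worth dedicated hardware is the whole question.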
Even if it were a bit cheaper, the problem remains: even if you pay just $75 for it, will you get much benefit if it's wicked awesome at physics but you essentially still need a fast CPU to make those calculations more integral to the game? And then in the games it doesn't work in, it's a $75 IR lightbulb consuming a low level of energy and radiating a little bit of heat.
At least for graphics cards, either shader units or extra cards can find additional utility when not being used for physics, and spending the money to go from a GF8600 to a GF8800, be it mobile or desktop, means you have access to more global power, just as Intel's Havok solution will mean your access to more generalized computing power also has benefits elsewhere.
The dedicated PPU was doomed to fail from the start unless it got to the level of being a MoBo-integrated part, and adoption and the ability to show off a killer app were stronger.
Now the question is who will open up their physics sandbox first to the greater community to try and make it the standard. Intel/Havok have more clout and power, so if things stay the same they will likely still dominate and be the de facto standard to come. However, if nVidia opens up their physics acceleration beyond just their own CUDA and their own hardware, then they could challenge the normal way of doing things and have a better shot at becoming the standard way for devs to do things.
Right now CPUs have the advantage of being both easier to code for and the de facto standard; remaining a closed system won't help change that.
Physics card - worth it?
Discussion in 'Gaming (Software and Graphics Cards)' started by SGKoneko, Mar 8, 2008.