Great news, gamers!
Nvidia buys Ageia, makers of the Ageia PhysX PPU.
Nvidia's reasoning for buying them?
Check the links for further details:
http://www.fpslabs.com/news/latest/confirmed-nvidia-acquires-ageia
http://www.nvidia.com/object/io_1202161567170.html
edit: The quote has now been put in context; ignore the next paragraph. I still don't see why, if nVIDIA thought they could integrate physics into the GPU better, thus eliminating physics accelerators, they didn't just hire a few engineers instead of buying a whole company. True, there's value in the company, but enough to be worth buying it outright? If nVIDIA delivered a better integrated solution, Ageia would have dissolved anyway.
Whoa! You got that quote way out of context. That quote was by Gabe Newell, co-founder of Valve Software. And though the article states Valve has a close relationship with nVIDIA, it in no way says that this is the view of nVIDIA or of anybody working for nVIDIA.
Rather, the article is saying that with nVIDIA's investment in the physics processor market, and thus its presumable acceptance of it as something that will stay and expand, the effect on Valve, which is very much against physics processors as that quote indicates, will be very interesting. -
Meh, by the time this technology gets incorporated into the GPU (think DX11 or DX12), a basic PC/notebook will have a 4 to 8 core CPU... do we really need dedicated "physics shaders" in our GPUs?
I see no great rush to buy Ageia products today... and there'll be even less demand as time goes on and CPUs get faster. -
I'm glad they bought out Ageia. Hopefully they'll dissolve the company and end the push for another expensive computer part.
moon angel Notebook Virtuoso NBR Reviewer
masterchef341 The guy from The Notebook
For what it's worth, just tossing the quote up there like it is direct from Nvidia IS misleading.
I see your logic, and I actually agree with your point, and I would even go so far as to suppose that you might have nailed the reason why Nvidia is interested in buying out Ageia, but I still think you took that quotation out of context.
Either way, I'm glad the PPU thing was nipped in the bud before it became a big deal. -
I'm just surprised that Intel hasn't moved to snap up Nvidia like AMD did with ATI.
This was inevitable. There's always been talk about moving physics to a core of the CPU or the GPU, and what Ageia offered was never anything that spectacular. I like the idea in theory - every object/item has a set density, material, and weight - and its movement (or destruction) is calculated in real time. But this is nothing that can't be done with a fancy algorithm computed by the GPU or CPU.
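Just to illustrate the idea (a toy sketch I threw together, not Ageia's or Nvidia's actual code, and all the names and numbers are made up), per-object physical values plus a real-time update step could look something like this:
Code:
// Toy sketch only: every object carries physical properties, and a small
// per-frame step moves it under gravity.
struct RigidBody {
    float mass;      // kg; a real engine would derive this from density * volume
    float density;   // kg/m^3, set by the assigned material
    float pos[3];    // position in metres
    float vel[3];    // velocity in m/s
};

// Semi-implicit Euler update; the same loop could run on the CPU or,
// rewritten as a kernel, on the GPU.
void step(RigidBody* bodies, int count, float dt) {
    const float g = -9.81f;                  // gravity along the y axis
    for (int b = 0; b < count; ++b) {
        bodies[b].vel[1] += g * dt;          // integrate acceleration
        for (int i = 0; i < 3; ++i)
            bodies[b].pos[i] += bodies[b].vel[i] * dt;   // integrate velocity
        if (bodies[b].pos[1] < 0.0f) {       // crude "collision" with the ground
            bodies[b].pos[1] = 0.0f;
            bodies[b].vel[1] *= -0.5f;       // bounce and lose some energy
        }
    }
}
// Call step(bodies, count, 1.0f / 60.0f) once per frame.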
I do hope that nVidia uses the technology, however, as the next step in GPU evolution. So not only do objects have textures, shaders, and shadows, but assigned physical values as well. In the same way that a light source will cast a shadow off an object in real time, the object will also move or be destroyed in real time.
nVidia better not screw this up. -
moon angel Notebook Virtuoso NBR Reviewer
Well apparently Nvidia can do no wrong so... -
Nvidia probably didn't buy them for the current tech itself, but rather for their patents, which Nvidia needs in order to develop an integrated solution in future parts without paying Ageia $$$$ in license fees. Kind of like when they bought 3DFX. Sometimes buying a company outright is cheaper (long term) than licensing patented tech from them (or getting sued).
That said, I'd rather physics acceleration became a CPU issue instead of a GPU one. 99% of modern games are limited by the GPU, not the CPU. Giving the GPU even more work to do only makes that limitation worse... -
Otherwise, it's "why you think NVidia bought them". Which is very different. -
So if Nvidia says they bought it because, well... because Intel bought Havok... you'd take that as gospel? End of story? Even though it's no secret that Nvidia had been planning for a while on integrating physics processing with graphics processing, but put it on the back burner not too long ago.
Yeah. Well, it's definitely... "a" reason. And in all probability "a" major reason. Plus, if you factor in the fact that while Intel may be great at making CPUs, their current (integrated) and untested future (discrete) GPUs are no major threat to Nvidia's gaming niche, it renders their (Intel's) purchase of Havok practically impotent.
So really, what other reason is there for the purchase other than Nvidia's desire to not have to reinvent the wheel, grab talent, and stay on top?
But I've saved the best for last.
Here's a quote from Nvidia themselves:
"The computer industry is moving towards a heterogeneous computing model, combining a flexible CPU and a massively parallel processor like the GPU to perform computationally intensive applications like real-time computer graphics," continued Mr. Huang. "NVIDIA's CUDA technology, which is rapidly becoming the most pervasive parallel programming environment in history, broadens the parallel processing world to hundreds of applications desperate for a giant step in computational performance. Applications such as physics, computer vision, and video/image processing are enabled through CUDA and heterogeneous computing."
"The AGEIA team is world class, and is passionate about the same thing we arecreating the most amazing and captivating game experiences," stated Jen-Hsun Huang, president and CEO of NVIDIA. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world."
Looks like Nvidia more or less says so too.
http://www.nvidia.com/object/io_1202161567170.html
Sorry to be so harsh and brash, but nitpicking is one of my pet peeves. It's like spelling or grammar nazi-ing. I don't mind it when the info is critical; you know, like when there's a wrong model number or product name. That kind of thing is warranted.
Anyway, let's not devolve this thread into a contest of wits or what have you. -
Donald@Paladin44 Retired
Maybe Intel learned a lesson from the AMD purchase of ATi.
It does not matter how much faster current x86 CPUs get, they are still a terrible platform for physics calculations. GPU-style parallel architecture is orders of magnitude more efficient at those types of calculations, and that is why they should be thrown onto a GPU. We do not actually need dedicated physics shaders on the GPU either, because modern GPUs use unified programmable shaders that can run either kind of workload.
Heck, Nvidia has already announced they will be releasing a software patch that will allow GeForce 8 GPU hardware to run PhysX software. We are already at the point of integration.
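To give a feel for why that mapping is so natural (again, just a toy sketch, not the actual PhysX port), a CUDA kernel can update thousands of objects at once, one thread per object:
Code:
// Toy CUDA sketch (illustrative only): one GPU thread integrates one particle.
__global__ void integrate(float3* pos, float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;            // guard threads past the end of the arrays
    vel[i].y += -9.81f * dt;       // gravity
    pos[i].x += vel[i].x * dt;     // every particle advances independently,
    pos[i].y += vel[i].y * dt;     // which is exactly the kind of work a
    pos[i].z += vel[i].z * dt;     // massively parallel GPU is built for
}
// Host side: launch enough 256-thread blocks to cover all n particles, e.g.
//   integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt);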
Now for MORE EXPLODING BARRELS!
Nvidia Acquires Ageia
Discussion in 'Gaming (Software and Graphics Cards)' started by 2.0, Feb 4, 2008.