The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia Acquires Ageia

    Discussion in 'Gaming (Software and Graphics Cards)' started by 2.0, Feb 4, 2008.

  1. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,035
    Trophy Points:
    331

    Great news, gamers!

    Nvidia buys Ageia, makers of the PhysX PPU.

    Nvidia's reasoning for buying them?

    Check the links for further details:

    http://www.fpslabs.com/news/latest/confirmed-nvidia-acquires-ageia

    http://www.nvidia.com/object/io_1202161567170.html
     
  2. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    edit: The quote now has context; ignore the next paragraph. I still don't see why, if nVIDIA thought they could integrate physics into the GPU better and thus eliminate physics accelerators, they didn't just hire a few engineers instead of buying a whole company. True, there's value in the company, but enough to be worth buying it? If nVIDIA delivered a better integrated solution, Ageia would have dissolved anyway.

    Whoa! You got that quote way out of context. That quote was by Gabe Newell, co-founder of Valve Software. And though the article states Valve has a close relationship with nVIDIA, it in no way says that is the view of nVIDIA or anybody working for nVIDIA.

    Rather, the article is saying that with nVIDIA's investment in the physics processor market, and thus its presumed acceptance of physics processing as something that will stay and expand, the effect on Valve, which is very much against physics processors as that quote indicates, will be very interesting.
     
  3. Tony_A

    Tony_A Notebook Evangelist

    Reputations:
    67
    Messages:
    487
    Likes Received:
    0
    Trophy Points:
    30
    Meh, by the time this technology gets incorporated into the GPU (think DX11 or DX12), a basic PC/notebook will have a 4- to 8-core CPU... do we really need dedicated "physics shaders" in our GPUs?

    I see no great rush to buy Ageia products today... and there'll be even less demand as time goes on and CPUs get faster.
     
  4. knightingmagic

    knightingmagic Notebook Deity

    Reputations:
    144
    Messages:
    1,194
    Likes Received:
    0
    Trophy Points:
    55
    I'm glad they bought out Ageia. Hopefully they'll dissolve the company and end the push for another expensive computer part.
     
  5. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,035
    Trophy Points:
    331
    Actually, the quote is the reasoning behind Nvidia's purchase. That's why I quoted it. The quote is a statement that a non-integrated solution is a waste. Nvidia will rectify that by integrating the PPU with the GPU (a homogeneous architecture), thus eliminating the middleman (the artist formerly known as CPU) from the equation. Now all the CPU will do is handle things like game logic and input parsing.
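    In frame-loop terms, that division of labor looks something like the sketch below (a rough illustration only; every function name here is a hypothetical stand-in, not a real API):

        #include <cstdio>

        // Hypothetical stubs for the proposed split: the CPU keeps the cheap
        // sequential work, while physics joins graphics on the GPU.
        void pollInput()            { /* CPU: input parsing */ }
        void updateGameLogic(float) { /* CPU: AI, scripting, game state */ }
        void simulatePhysics(float) { /* GPU: the work the PPU used to do */ }
        void renderFrame()          { /* GPU: graphics, as always */ }

        int main() {
            const float dt = 1.0f / 60.0f;            // fixed 60 Hz timestep
            for (int frame = 0; frame < 3; ++frame) { // stand-in for "while (running)"
                pollInput();
                updateGameLogic(dt);
                simulatePhysics(dt);
                renderFrame();
                std::printf("frame %d done\n", frame);
            }
            return 0;
        }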
     
  6. moon angel

    moon angel Notebook Virtuoso NBR Reviewer

    Reputations:
    2,011
    Messages:
    2,777
    Likes Received:
    15
    Trophy Points:
    56
    Quoted for truth, plus rep.
     
  7. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,035
    Trophy Points:
    331
    I agree. I think it's a waste, especially with notebooks soon to overtake desktops in sales. Nvidia needs to hurry up and deliver more processing power for fewer watts and less $coin$. But for those watts-be-damned, lots-o'-fan-noise, bulky high-end lappies, this ought to be great!
     
  8. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    For what it's worth, just tossing the quote up there like it's direct from Nvidia IS misleading.

    I see your logic, and I actually agree with your point; I would even go so far as to suppose that you might have nailed the reason why Nvidia was interested in buying out Ageia. But I still think you took that quotation out of context.

    Either way, I'm glad the PPU thing was nipped in the bud before it became a big deal.
     
  9. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,035
    Trophy Points:
    331
    Okie doke. I'll fix the post so there's no confusion...
     
  10. Wu Jen

    Wu Jen Some old nobody

    Reputations:
    1,409
    Messages:
    1,438
    Likes Received:
    0
    Trophy Points:
    55
    I'm just surprised that Intel hasn't moved to snap up Nvidia like AMD did with ATI.
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    This was inevitable. There's always been talk about moving physics to a core of the CPU or the GPU, and what Ageia offered was never anything that spectacular. I like the idea in theory - every object/item has a set density, material, and weight - and its movement (or destruction) is calculated in real time. But this is nothing that can't be done with a fancy algorithm computed by the GPU or CPU.

    I do hope that nVidia uses the technology, however, as the next step in GPU evolution. So not only do objects have textures, shaders, and shadows, but assigned physical values as well. In the same way that a light source will cast a shadow off an object in real time, the object will also move or be destroyed in real time.

    nVidia better not screw this up.
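    As a rough sketch of that idea (hypothetical names and numbers, nothing from Ageia's actual SDK): each object carries physical values the same way it carries textures, and a per-frame step integrates its motion from them.

        // Each object carries physical properties alongside its textures.
        struct RigidBody {
            float density;  // kg/m^3, set per material
            float mass;     // kg, typically density * volume
            float pos[3];   // world position, meters
            float vel[3];   // velocity, m/s
        };

        // One explicit-Euler step for a single body under gravity: the
        // per-object, per-frame work a physics engine repeats constantly.
        void stepBody(RigidBody &b, float dt) {
            const float g = -9.81f;         // gravity along the y axis
            b.vel[1] += g * dt;             // acceleration -> velocity
            for (int i = 0; i < 3; ++i)
                b.pos[i] += b.vel[i] * dt;  // velocity -> position
        }

    Multiply that by thousands of bodies, plus collision detection, and it is clear both why someone pitched a dedicated chip for it and why a massively parallel GPU is a natural home for the workload.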
     
  12. moon angel

    moon angel Notebook Virtuoso NBR Reviewer

    Reputations:
    2,011
    Messages:
    2,777
    Likes Received:
    15
    Trophy Points:
    56

    Well, apparently Nvidia can do no wrong, so...
     
  13. Tony_A

    Tony_A Notebook Evangelist

    Reputations:
    67
    Messages:
    487
    Likes Received:
    0
    Trophy Points:
    30


    Nvidia probably didn't buy them for the current tech itself, but rather for their patents, which Nvidia needs in order to develop an integrated solution in future parts without paying Ageia $$$$ in license fees. Kind of like when they bought 3dfx. Sometimes buying a company outright is cheaper (long term) than licensing its patented tech (or getting sued).

    That said, I'd rather physics acceleration became a CPU issue instead of a GPU one. 99% of modern games are limited by the GPU, not the CPU. Giving the GPU even more work to do only makes that limitation worse...
     
  14. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    No, it might make sense, but it's not NVidia's reasoning unless NVidia says so.
    Otherwise, it's "why you think NVidia bought them". Which is very different.
     
  15. 2.0

    2.0 Former NBR Macro-Mod®

    Reputations:
    13,368
    Messages:
    7,745
    Likes Received:
    1,035
    Trophy Points:
    331
    Geez, now we're getting anal.

    So if Nvidia says they bought it because, well... because Intel bought Havok... you'd take that as gospel? End of story? Even though it's no secret that Nvidia had been planning for a while to integrate physics processing with graphics processing, but put it on the back burner not too long ago?

    Yeah. Well, it's definitely... "a" reason. And in all probability "a" major reason. Plus, factor in that while Intel may be great at making CPUs, their current (integrated) and untested future (discrete) GPUs are no major threat to Nvidia's gaming niche, which renders their (Intel's) purchase of Havok practically impotent.

    So really, what other reason is there for the purchase other than Nvidia's desire to not have to reinvent the wheel, grab talent, and stay on top?

    But I've saved the best for last.

    Here are Nvidia's own words:

    "The AGEIA team is world class, and is passionate about the same thing we are—creating the most amazing and captivating game experiences," stated Jen-Hsun Huang, president and CEO of NVIDIA. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world."

    "The computer industry is moving towards a heterogeneous computing model, combining a flexible CPU and a massively parallel processor like the GPU to perform computationally intensive applications like real-time computer graphics," continued Mr. Huang. "NVIDIA's CUDA™ technology, which is rapidly becoming the most pervasive parallel programming environment in history, broadens the parallel processing world to hundreds of applications desperate for a giant step in computational performance. Applications such as physics, computer vision, and video/image processing are enabled through CUDA and heterogeneous computing."

    Looks like Nvidia more or less says so too. :cool:

    http://www.nvidia.com/object/io_1202161567170.html

    Sorry to be so harsh and brash, but nitpicking is one of my pet peeves. It's like spelling or grammar nazi-ing. I don't mind it when info is critical. You know, like when you have a case where there's a wrong model number or product name. That kind of thing is warranted.

    Anyway, let's not devolve this thread into a contest of wits or what have you.
     
  16. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,843
    Trophy Points:
    681
    Maybe Intel learned a lesson from the AMD purchase of ATi.
     
  17. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Oh yes, we DO need something optimized to perform physics calculations. This could be part of the GPU's processing units or maybe an entire GPU (SLI, anyone?).

    It does not matter how much faster current x86 CPUs get; they are still a terrible platform for physics calculations. GPU-style parallel architecture is orders of magnitude more efficient at those types of calculations, and that is why physics should be thrown on a GPU. We do not actually need dedicated physics shaders on the GPU, because modern GPUs use programmable shaders that can run in either mode.
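    To make the parallelism point concrete (a hypothetical sketch, not actual PhysX code): a per-object integration step maps directly onto a CUDA kernel with one thread per body, so thousands of bodies are stepped simultaneously instead of one at a time.

        // Hypothetical CUDA kernel: one thread integrates one body per frame.
        // A GPU runs thousands of these threads at once; an x86 core walks
        // the same array one element at a time.
        __global__ void stepBodies(float3 *pos, float3 *vel, int n, float dt) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            vel[i].y += -9.81f * dt;    // gravity
            pos[i].x += vel[i].x * dt;  // integrate position
            pos[i].y += vel[i].y * dt;
            pos[i].z += vel[i].z * dt;
        }

        // Host-side launch: enough 256-thread blocks to cover all n bodies,
        // e.g. stepBodies<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt);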

    Heck, Nvidia already announced they will be releasing a software patch that will allow Geforce 8 GPU hardware to run PhysX software. We are already at the point of integration.

    Now for MORE EXPLODING BARRELS!