The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Ageia = waste of time, money and a good PCI port? Or the future?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Apuleyo, Jul 29, 2008.

  1. Apuleyo

    Apuleyo Notebook Consultant

    Reputations:
    11
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    edit: I deleted this on account of me being a complete klutz and confusing Intel with nVidia.

    edit2: OK, I'll repost it, but be aware that I made a major screw-up with the text. I'll drop the bold by popular request (since I always post in bold), but I keep telling you that normal fonts cramp my (lack of) style.

    TEH PSOT ROFLMAO!111!!11!

    A waste. At least it looks that way when you consider that multi-core processing aims directly at taking over the physics calculations. Intel's pages about their processors boast about how mighty they are at exactly that.

    So... WTF WAS INTEL THINKING WHEN THEY BOUGHT AGEIA? I mean, seriously. Are they plotting to completely separate the physics calculations from the CPU? Combine that with dedicated audio and video, and the processor is almost worthless. So they probably won't be going with that one.

    My guess is, they thought "Hey! What if we come up with a new product, label it as essential for the future, and arrange for some of our partner companies to back us up?"

    Then you have vendors like Dell and Alienware including those things in desktops and laptops, saying how much they will improve games and all. BULL****!

    Very few games take full advantage of AGEIA, from what I've seen.

    But most people are gullible. They'll buy them anyway.

    So, inevitably, the apps and games of the future will rely on the PPU as much as they now do on the GPU. Unless something really bad (for them) and really cool (for us) happens first.

    What 'miraculous' technological 'breakthrough' will be here next? I don't want to know.

    Also:

    http://www.devhardware.com/forums/showpost.php?p=560069&postcount=4

    final edit: this post was born of honest rage at the thought of Intel as a power-hungry monopolist monster, but as StormEffect points out, it was actually nVidia that bought Ageia. Intel bought Havok, which prompted AMD to try to buy Ageia to compete. Soooooo, looks like I was totally wrong with the conspiracy part. But nonetheless, Ageia sucks and blows big time. How that doesn't result in explosive decompression is beyond my grasp.
     
  2. Shroomy

    Shroomy Notebook Consultant

    Reputations:
    55
    Messages:
    271
    Likes Received:
    0
    Trophy Points:
    30
    You are right, Ageia is a waste. Watching that tech demo of that FPS game where you could send tons of barrels flying and see flags tear realistically was cool back then, but now it could be done easily enough with a fast CPU/GPU.
     
  3. shoelace_510

    shoelace_510 8700M GT inside... ^-^;

    Reputations:
    276
    Messages:
    1,525
    Likes Received:
    0
    Trophy Points:
    55
    Hm... very interesting. I sure hope my CPU in the future won't have MORE to do but I guess we'll see. ;)

    Also, that quote from Apuleyo at the bottom made me laugh so hard. >.< LOL
     
  4. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    First, would you mind writing your posts without bolding all of the text?

    Second, Intel bought Havok, not Ageia. Nvidia bought Ageia.

    Intel wants EVERYTHING on the CPU. AMD, and especially NVIDIA, want more general processing (for example, physics) on the GPU.

    Intel bought Havok because it is probably the most popular physics engine on the market, followed by Ageia's PhysX. Intel may want to integrate Havok better with Intel hardware, which is a great idea. I doubt they want to make an add-in card for physics when they'd rather just sell you a few more cores in your CPU.

    Nvidia bought Ageia for the PhysX interoperability. Now they can try to do some physics processing on your GPU, which is actually kind of a neat idea, considering the nature of physics processing. More value in your GPU is good for AMD and Nvidia. More value in your CPU is good for AMD and Intel. I think the GPU will continue to gain ground, as Intel is investing in a GPU-like parallel architecture, the "Larrabee" card, that will compete with other GPUs.
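
    To see why physics fits the GPU so nicely: every particle can be stepped independently, so the work is massively parallel. Here's a toy CUDA sketch (nothing from actual PhysX internals; all the names are made up) that steps a million particles in one go, one thread per particle:

        #include <cuda_runtime.h>
        #include <cstdio>

        // One thread per particle. Each update touches only its own particle,
        // which is exactly the kind of work a GPU's thousands of threads eat up.
        __global__ void integrate(float3 *pos, float3 *vel, int n, float dt)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;

            vel[i].y -= 9.81f * dt;        // gravity
            pos[i].x += vel[i].x * dt;     // advance position
            pos[i].y += vel[i].y * dt;
            pos[i].z += vel[i].z * dt;

            if (pos[i].y < 0.0f) {         // crude floor collision
                pos[i].y = 0.0f;
                vel[i].y *= -0.5f;         // bounce, losing half the energy
            }
        }

        int main()
        {
            const int n = 1 << 20;         // about a million particles
            float3 *pos, *vel;
            cudaMalloc(&pos, n * sizeof(float3));
            cudaMalloc(&vel, n * sizeof(float3));
            cudaMemset(pos, 0, n * sizeof(float3));
            cudaMemset(vel, 0, n * sizeof(float3));

            const int threads = 256;
            const int blocks = (n + threads - 1) / threads;
            integrate<<<blocks, threads>>>(pos, vel, n, 1.0f / 60.0f); // one 60 Hz step
            cudaDeviceSynchronize();

            printf("stepped %d particles on the GPU\n", n);
            cudaFree(pos);
            cudaFree(vel);
            return 0;
        }

    A quad-core CPU can run four of those updates at a time; a GPU can run thousands. That's the whole pitch for moving physics off the CPU.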

    These purchases were good for a few reasons. One, nobody went out of business, which would have left consumers out of luck. Two, tighter integration and better support. Three, more possibilities for parallel computing. And four, competition.
     
  5. Apuleyo

    Apuleyo Notebook Consultant

    Reputations:
    11
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    My bad! Sorry man, I'd better delete this.
     
  6. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Nah, it's cool. These are interesting reads nonetheless.

    I'd much rather you stop bolding all of your text than delete the post! :D
     
  7. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    Using PhysX on the GPU has already resolved the issue with PPUs. It's simply run on the GPU now, and there are games that use it, such as UT3. According to Nvidia, there will be more to come, since it's built into all new GPUs.
     
  8. Apuleyo

    Apuleyo Notebook Consultant

    Reputations:
    11
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    Be aware of the fact that I'm completely insane.

    Might I add that I tend to confuse nVidia and Intel because, since AMD owns ATI, it makes sense that Intel would side with nVidia. In my mind, nVidia and Intel are like Siamese evil twins. Yeah. Both are the evil twin.
     
  9. Rorschach

    Rorschach Notebook Virtuoso NBR Reviewer

    Reputations:
    1,131
    Messages:
    3,552
    Likes Received:
    17
    Trophy Points:
    106
    We figured that out already; I'm sure most saw this and laughed.
     
  10. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, lucky for AMD/ATI, Nvidia and Intel have been pretty mean to each other lately, souring their relationship a bit. Most recently there was an issue where people were not going to be able to get SLI with future Intel Nehalem processors. I think they rectified that, but I don't really mind.

    This is the perfect time for AMD/ATI to come out swinging while their competitors trip a little bit. The 4850/4870 are genius; now if we can get our hands on the 45nm Phenom and then Bulldozer cores with Fusion... things will be getting sexy competition-wise.

    Even better, AMD/ATI released the first impressive integrated graphics with the 780G (mobile Puma) chipset containing the HD3200. I am really sad I can't get an Intel processor on an AMD chipset with an HD3200. :-(
     
  11. Apuleyo

    Apuleyo Notebook Consultant

    Reputations:
    11
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    I wonder if I can get a mobile version of the HD4870 in my future NP9262 in CrossFire. That would kick the ass of 9800M GTX SLI!
     
  12. KGann

    KGann NBR Themesong Writer

    Reputations:
    317
    Messages:
    2,742
    Likes Received:
    0
    Trophy Points:
    55
    I think Sager sticks with Intel, which means sticking with Nvidia.
     
  13. Apuleyo

    Apuleyo Notebook Consultant

    Reputations:
    11
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    But I thought those cards use the exact same connectors... :(
    Too bad, I really like ATI products better than nVidia's.
     
  14. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    The lines really aren't that clear, since nVidia owns Ageia and Intel owns Havok; but interestingly, AMD is siding with Intel, announcing a partnership on Havok just last month.

    http://www.amd.com/us-en/Corporate/VirtualPressRoom/0,,51_104_543~126548,00.html
     
  15. KGann

    KGann NBR Themesong Writer

    Reputations:
    317
    Messages:
    2,742
    Likes Received:
    0
    Trophy Points:
    55
    Doesn't matter what connectors they use. I doubt AMD would let their pride and joy be lent to their rival (Intel).
     
  16. Harper2.0

    Harper2.0 Back from the dead?

    Reputations:
    2,078
    Messages:
    3,108
    Likes Received:
    0
    Trophy Points:
    105
    They did on the Clevo M8660. :confused:
     
  17. KGann

    KGann NBR Themesong Writer

    Reputations:
    317
    Messages:
    2,742
    Likes Received:
    0
    Trophy Points:
    55
    Was that before AMD purchased ATi?
     
  18. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    *sobs* I feel like I'll never get my Intel processor and AMD chipset + IGP. :-(

    Then again, by the time that becomes possible, around when Intel moves to an integrated memory controller, maybe Bulldozer and Fusion will make it a non-issue.
     
  19. KGann

    KGann NBR Themesong Writer

    Reputations:
    317
    Messages:
    2,742
    Likes Received:
    0
    Trophy Points:
    55
    I hope they will do some crossovers. It would be nice to have Intel+ATi again...

    But you never know. Now that I think about it, we are seeing some ATi cards packaged with Intel CPUs...
     
  20. Apuleyo

    Apuleyo Notebook Consultant

    Reputations:
    11
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    *Apuleyo shivers*
     
  21. AznFlamer

    AznFlamer Notebook Consultant

    Reputations:
    1
    Messages:
    290
    Likes Received:
    0
    Trophy Points:
    30
    lol so there IS a chance
     
  22. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    You can get dedicated ATI cards with Intel CPUs. The ones who decide whether a dedicated card works with this or that platform are the OEMs. Take Asus, for instance: they use the same connector for GPUs on all F8s, and they make the ATI and nVidia cards in the same form factor, as that's cheaper for them. Thus it fits both AMD and Intel platforms.

    No. That happened back in 2006.