edit: I deleted this on account of being a complete klutz and confusing Intel with nVidia
edit2: OK, I'll repost it, but be aware that I made a major screw-up with the text. I'll drop the bold by popular request (since I always post in bold), but I keep telling you that normal fonts cramp my (lack of) style.
TEH PSOT ROFLMAO!111!!11!
A waste. At least it looks that way once you consider that multi-core processing aims directly at taking over physics calculations. Intel's own pages boast about how mighty their processors are at exactly that.
So... WTF WAS INTEL THINKING WHEN THEY BOUGHT AGEIA? I mean, seriously. Are they plotting to completely separate physics calculations from the CPU? Combine that with dedicated audio and video, and the processor is almost worthless. So they probably won't go that route.
My guess is, they thought "Hey! What if we come up with a new product, label it as essential for the future, and get some of our partner companies to back us up?"
Then you have the likes of Dell and Alienware putting those things in desktops and laptops, saying how much they'll improve games and all. BULL****!
From what I've seen, very few games take full advantage of AGEIA.
But most people are gullible. They'll buy them anyway.
So, inevitably, the apps and games of the future will rely on the PPU as much as they now rely on the GPU. Unless something really bad (for them) and really cool (for us) happens first.
What 'miraculous' technological 'breakthrough' will be here next? I don't want to know.
Also:
http://www.devhardware.com/forums/showpost.php?p=560069&postcount=4
final edit: this post came from honest rage at the thought of Intel as a power-hungry monopolist monster, but as stormeffect points out, it was actually nVidia that bought Ageia. Intel bought Havok, which prompted AMD to try to buy Ageia to compete. Soooooooooooooooo, looks like I was totally wrong about the conspiracy part. But nonetheless, Ageia sucks and blows big time. How that doesn't result in explosive decompression is beyond my grasp.
-
You are right, Ageia is a waste. Watching that tech demo of that FPS game where you could send tons of barrels flying and watch flags tear realistically was cool back then, but it could now be done easily enough with a fast CPU/GPU.
-
shoelace_510 8700M GT inside... ^-^;
Hm... very interesting. I sure hope my CPU in the future won't have MORE to do, but I guess we'll see.
Also, that quote from Apuleyo at the bottom made me laugh so hard. >.< LOL -
First, would you mind writing your posts without bolding all of the text?
Second, Intel bought Havok, not Ageia. Nvidia bought Ageia.
Intel wants EVERYTHING on the CPU. AMD, and especially NVIDIA, want more general processing (for example, physics) on the GPU.
Intel bought Havok because Havok is probably the most popular physics engine on the market, followed by Ageia's PhysX. Intel may want to integrate Havok more tightly with Intel hardware, which is a great idea. I doubt they want to make an add-in card for physics when they'd rather just sell you a few more cores in your CPU.
Nvidia bought Ageia for the PhysX interoperability. Now they can try to do some physics processing on your GPU, which is actually kind of a neat idea, considering the nature of physics processing. More value in your GPU is good for AMD and Nvidia. More value in your CPU is good for AMD and Intel. I think the GPU will continue to gain ground, as even Intel is investing in "Larrabee", a GPU-like parallel-architecture card that will compete with other GPUs.
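To illustrate the point about the nature of physics processing (just a toy sketch, not actual PhysX code; every name in it is made up): a particle step applies the same tiny update to thousands of independent objects, which is exactly the workload a GPU's thread model is built for. Roughly, in CUDA:

#include <cuda_runtime.h>

// Toy particle state; a real engine also tracks orientation, contacts, etc.
struct Particle { float x, y, z, vx, vy, vz; };

// One GPU thread integrates one particle. No thread depends on another,
// which is why this kind of work scales across hundreds of shader cores.
__global__ void step(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vy -= 9.81f * dt;    // gravity
    p[i].x  += p[i].vx * dt;  // integrate position
    p[i].y  += p[i].vy * dt;
    p[i].z  += p[i].vz * dt;
}

int main() {
    const int n = 1 << 20;                // ~1M particles
    Particle* d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));
    step<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);  // one frame at 60 fps
    cudaDeviceSynchronize();
    cudaFree(d_p);
    return 0;
}

A quad-core CPU runs four of those updates at a time; a GPU runs thousands at once, which is pretty much the whole case for PhysX-on-GPU.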
These purchases were good for a few reasons. One, nobody went out of business, leaving consumers out of luck. Two, tighter integration and better support. Three, more possibilities for parallel computing. And four, competition. -
My bad! Sorry man, I'd better delete this
-
I'd much rather you stop bolding all of your text than delete the post! -
Running PhysX on the GPU has already resolved the issue with PPUs. It's simply run on the GPU now, and there are games that use it, such as UT3. According to Nvidia there will be more to come, since it's built into all new GPUs.
-
Be aware of the fact that I'm completely insane.
Might I add that I tend to confuse nVidia and Intel because, since AMD owns ATI, it makes sense for Intel to side with nVidia. In my mind, nVidia and Intel are like Siamese evil twins. Yeah. Both are the evil twin. -
We figured that out already; I'm sure most saw this and laughed.
-
This is the perfect time for AMD/ATI to come out swinging while their competitors trip a little bit. The 4850/4870 are genius; now if we can get our hands on 45nm Phenoms and then Bulldozer cores with Fusion... things will get sexy competition-wise.
Even better, AMD/ATI released the first impressive integrated graphics with the 780G (mobile PUMA) chipset containing the HD3200. I am really sad I can't get an Intel processor in an AMD chipset with an HD3200. :-( -
I wonder if I can get a mobile version of the HD4870 in my future NP9262 in CrossFire. That would kick the ass of 9800M GTX SLI!
-
I think Sager sticks with Intel, which means sticking with Nvidia.
-
But I thought that those cards use the exact same connectors...
Too bad; I really like ATI products better than nVidia's. -
ltcommander_data Notebook Deity
http://www.amd.com/us-en/Corporate/VirtualPressRoom/0,,51_104_543~126548,00.html -
They did on the Clevo M8660.
-
Was that before AMD purchased ATi?
-
Then again, by the time that becomes possible, around when Intel moves to an integrated memory controller, maybe Bulldozer and Fusion will make it a non-issue. -
I hope they will do some cross-overs. It would be nice to have Intel+ATi again...
But you never know. Now that I think about it, we are seeing some ATi cards bundled with Intel CPUs... -
Ageia = waste of time, money and a good PCI port? Or the future?
Discussion in 'Gaming (Software and Graphics Cards)' started by Apuleyo, Jul 29, 2008.