What's with the Intel Larrabee architecture?
Discussion in 'Gaming (Software and Graphics Cards)' started by Phistachio, Dec 25, 2010.

Hi guys, Merry Christmas!
Now, for the real question: why is that architecture being hailed as the "best thing ever"? I mean, there are rumors that it could beat an i7 980 + 5870...
Can someone please tell me exactly what that architecture is and what is so special about it? Why was it cancelled?
Thanks!
-
It is a GPGPU. It got cancelled because of bad performance and delays.
-
By bad performance, do you mean worse than mine, for example? (look at my sig)
-
Larrabee does not exist. It would not have performed as well as your current laptop had it been adapted for video game graphics. It may have performed better with technologies such as ray tracing, but because it never reached market, it will never get the chance.
-
I heard they cancelled it mostly because it was taking so long that they couldn't compete with what was coming to market, but I read about six months ago that they were starting again; maybe that's just a rumor. I think Intel should just buy Nvidia and have offerings similar to AMD's, maybe even drop CrossFire support on their boards like AMD did with Nvidia.
-
Intel confirmed that Larrabee technology will be integrated into their next generation of integrated graphics, the X5000HD.
What made Larrabee different was that it would have been x86 architecture, just like a CPU. So buying Nvidia is not what Intel would have wanted anyway.
But IMO Intel is the smart one, not AMD or Nvidia. The future of 3D seems to be ray tracing and voxel trees, not more polygons via tessellation. I doubt Intel threw their GPGPU ambitions out the door completely; we'll hear something in the near future. Once others realize that more polygons is not the future, Intel will be at the forefront, I'll bet. There is no need to buy Nvidia. Because AMD has a license for x86, they should be able to keep up; it will be Nvidia left in the dust. There is little doubt that's why Nvidia attempted to make an x86 chip but was quickly shut down by Intel. Yay for Intel's legal department. Maybe this will never happen, but I think it will.
This video was Intel's demonstration of real-time ray-traced rendering of an old game, Quake Wars. You can see how smooth and round everything is without having to buy a $500 Nvidia card for tessellation. What is awesome about ray tracing is that you don't need a graphics card: Quake Wars: Ray Traced ran on just two quad-core CPUs. Larrabee was projected to have around 32 x86 cores. Maybe in the future when Intel releases it, it will be something like 128 (drool), which would hopefully put Nvidia to shame.
For programmers, Larrabee would have been a lot easier since it uses x86 instructions. Reflections in water would have taken only around 10 lines of code, compared to what you need on a traditional GPU. You can even program it in C++. Real-time ray-traced rendering is something even my HD 5870M would probably get crushed by. I'm no tech guru; this is just from browsing the web. I'm sure you can find more technical reading and be just as excited. Intel for sure did not quit; they're still at it.
Video: http://www.youtube.com/v/mtHDSG2wNho
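On the "reflections in ~10 lines" claim, here's a toy illustration in plain C++ (my own sketch, nothing to do with Intel's actual demo code): a tiny recursive ray tracer where the mirror bounce really is just a few lines (find the hit point, reflect the ray, recurse).

```cpp
// Toy recursive ray tracer: one mirrored sphere under a sky gradient,
// "rendered" as ASCII art. Hypothetical sketch only -- the point is how
// small the reflection logic is when you trace rays directly on a CPU.
#include <cmath>
#include <cstdio>

struct Vec {
    double x, y, z;
    Vec operator+(Vec o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec operator-(Vec o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec operator*(double s) const { return {x * s, y * s, z * s}; }
    double dot(Vec o) const { return x * o.x + y * o.y + z * o.z; }
    Vec norm() const { double l = std::sqrt(dot(*this)); return {x / l, y / l, z / l}; }
};

const Vec kCenter{0, 0, -3};  // mirrored sphere, 3 units in front of the camera
const double kRadius = 1.0;

// Distance along the ray to the sphere, or -1 on a miss (dir must be unit length).
double hitSphere(Vec origin, Vec dir) {
    Vec oc = origin - kCenter;
    double b = oc.dot(dir);
    double disc = b * b - (oc.dot(oc) - kRadius * kRadius);
    if (disc < 0) return -1;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 ? t : -1;
}

// The reflection is these few lines: bounce the ray off the surface and recurse.
Vec trace(Vec origin, Vec dir, int depth) {
    double t = hitSphere(origin, dir);
    if (t > 0 && depth < 4) {
        Vec p = origin + dir * t;                    // hit point
        Vec n = (p - kCenter).norm();                // surface normal
        Vec r = dir - n * (2 * dir.dot(n));          // mirror direction
        return trace(p, r.norm(), depth + 1) * 0.8;  // 80% reflective
    }
    double g = 0.5 * (dir.y + 1.0);                  // simple sky gradient
    return Vec{1, 1, 1} * (1 - g) + Vec{0.4, 0.6, 1.0} * g;
}

int main() {
    const int W = 64, H = 32;
    for (int j = 0; j < H; ++j) {
        for (int i = 0; i < W; ++i) {
            Vec dir{(i - W / 2) / double(H), -(j - H / 2) / double(H), -1};
            Vec c = trace({0, 0, 0}, dir.norm(), 0);
            std::putchar(" .:-=+*#%@"[(int)(c.x * 9.99)]);  // brightness to ASCII
        }
        std::putchar('\n');
    }
}
```

On a rasterizer you'd instead be setting up render-to-texture passes, extra cameras and shaders to fake the same mirror, which is where all the extra code goes.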
-
Using ray tracing instead of polygon models seems silly; although it's more realistic, it also sucks in terms of performance (find a game with ray tracing in it, enable it, and then watch your FPS drop).
-
masterchef341 The guy from The Notebook
I think over the next 5 years, as the tech matures and becomes mainstream, we will start to see a *SLOW* shift in consumer 3D rendering applications (read: games) towards ray tracing over rasterization.
I forget the exact quote, but:
"Ray tracing requires cleverness to be fast. Rasterization requires cleverness to look like ray tracing."
----
If ray tracing no longer needs cleverness to be fast because we have hardware that can do it in real time, we will stop investing in rasterization, because rasterization is difficult to make look good. But it won't happen overnight, even if Intel releases ray tracing hardware. -
mobius1aic Notebook Deity NBR Reviewer
-
SoundOf1HandClapping Was once a Forge
In addition to ray tracing, wasn't there something else that involved clouds of points? I remember there being a thread on that from way back when.
-
masterchef341 The guy from The Notebook
-
Meaker@Sager Company Representative
-
But I think you're correct. I want to see it sooner rather than later, so that's probably why I want to believe in an aggressive timeline for implementation. -
Ray tracing requires considerable computing horsepower, and when it comes to rendering games in real time with graphics comparable to DirectX 11, I don't think it's even close. Not to mention all the tricks used for animation and realism with polygons and textures. Ray tracing definitely increases realism from a static-image perspective, but I still don't see how it can suddenly enter the market as an IGP and take over current 3D technology. Perhaps over time it will be integrated with existing 3D tech and evolve, but a sudden shift? I don't see it happening.
-
masterchef341 The guy from The Notebook
Ray tracing is a rendering technology. It doesn't get rid of polygons or textures... I think some people were suggesting that earlier...
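To make that concrete: a ray tracer still walks the same triangle meshes, it just replaces rasterization as the visibility step. Here's a sketch of the standard Möller-Trumbore ray/triangle test in plain C++ (my own illustration, not from any particular engine); the barycentric (u, v) pair it returns is exactly what you'd use to sample a texture at the hit point.

```cpp
// Hypothetical sketch: the classic Moller-Trumbore ray/triangle test in
// plain C++. A ray tracer runs this against the same meshes a rasterizer
// draws; the (u, v) it returns is what you'd use to sample the texture.
#include <array>
#include <cmath>
#include <cstdio>
#include <optional>

using V3 = std::array<double, 3>;

static V3 sub(V3 a, V3 b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static V3 cross(V3 a, V3 b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}
static double dot(V3 a, V3 b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

// Returns {t, u, v}: t = distance along the ray to the hit,
// (u, v) = barycentric coordinates inside the triangle. Empty on a miss.
std::optional<std::array<double, 3>>
intersect(V3 orig, V3 dir, V3 v0, V3 v1, V3 v2) {
    V3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    V3 p = cross(dir, e2);
    double det = dot(e1, p);
    if (std::fabs(det) < 1e-9) return std::nullopt;  // ray parallel to triangle
    double inv = 1.0 / det;
    V3 s = sub(orig, v0);
    double u = dot(s, p) * inv;
    if (u < 0 || u > 1) return std::nullopt;         // outside edge v0->v1
    V3 q = cross(s, e1);
    double v = dot(dir, q) * inv;
    if (v < 0 || u + v > 1) return std::nullopt;     // outside the other edges
    double t = dot(e2, q) * inv;
    if (t < 1e-9) return std::nullopt;               // triangle is behind the ray
    return std::array{t, u, v};
}

int main() {
    // One triangle 5 units in front of the camera, ray straight down -z.
    auto hit = intersect({0, 0, 0}, {0, 0, -1},
                         {-1, -1, -5}, {1, -1, -5}, {0, 1, -5});
    if (hit) std::printf("hit: t=%.2f u=%.2f v=%.2f\n", (*hit)[0], (*hit)[1], (*hit)[2]);
}
```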
-
Asset loading needs to improve first; hopefully, with all the new tech in carbon nanowires and stuff like "Light Peak", the time to retrieve info will become much quicker. Voxels, I believe, load into memory on the fly, not preloaded like most current-generation games. That, and when we can have 1TB of RAM: infinite textures.
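For what it's worth, "load into memory on the fly" usually just means something like the sketch below (hypothetical; names like ChunkKey and loadFromDisk are mine, not from any real engine): keep a cache of voxel chunks and stream each one in the first time the renderer touches it, instead of preloading the whole world.

```cpp
// Hypothetical sketch of "load into mem on the fly": voxel chunks are pulled
// into a cache the first time the renderer touches them, instead of the whole
// world being preloaded.
#include <cstdint>
#include <cstdio>
#include <map>
#include <tuple>
#include <vector>

using ChunkKey = std::tuple<int, int, int>;  // chunk coordinates on the world grid
using Chunk = std::vector<std::uint8_t>;     // e.g. 16^3 voxel densities

// Stand-in for a real disk/stream read (a real engine would do this async).
Chunk loadFromDisk(ChunkKey k) {
    std::printf("streaming chunk (%d, %d, %d)\n",
                std::get<0>(k), std::get<1>(k), std::get<2>(k));
    return Chunk(16 * 16 * 16, 0);
}

struct ChunkCache {
    std::map<ChunkKey, Chunk> resident;  // what is currently in RAM
    const Chunk& get(ChunkKey k) {
        auto it = resident.find(k);
        if (it == resident.end())        // first touch: stream it in now
            it = resident.emplace(k, loadFromDisk(k)).first;
        return it->second;
    }
};

int main() {
    ChunkCache cache;
    cache.get({0, 0, 0});  // streams from "disk"
    cache.get({0, 0, 0});  // already resident: no second load
    cache.get({1, 0, 0});  // camera moved: stream the next chunk
}
```

The 1TB-of-RAM point is the same idea with a bigger cache: the more chunks stay resident, the less you have to stream.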