The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    What's with the Intel Larrabee architecture?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Phistachio, Dec 25, 2010.

  1. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Hi guys, Merry Christmas!

    Now, for the real question: why is this architecture being hailed as the "best thing ever"? I mean, there are rumors that it can beat an i7 980 paired with a 5870...
    Can someone please tell me exactly what this architecture is and what is so special about it? And why was it cancelled?


    Thanks!
     
  2. KoldWar

    KoldWar Notebook Consultant

    Reputations:
    2
    Messages:
    157
    Likes Received:
    0
    Trophy Points:
    30
    It is a GPGPU. It got cancelled because of bad performance and delays.
     
  3. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Thanks for the reply! +rep

    By "bad performance", do you mean worse than mine, for example? (see my sig)
     
  4. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Larrabee does not exist. It would not have performed as well as your current laptop had it been adapted for video game graphics. It may have performed better with technologies such as ray tracing, but because it never reached market, it will never get the chance.
     
  5. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Oh, ok! Thanks for the reply! +rep and Merry Christmas!
     
  6. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
    I heard they cancelled it, mostly because it took so long that they couldn't compete with what was coming to market, but I read about six months ago that they were starting again; maybe it's just a rumor. I think Intel should just buy Nvidia and have similar offerings like AMD has, maybe even drop xfire support like AMD did with Nvidia on their boards.
     
  7. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    No need, Intel already has the largest market share in graphics. And more polygons from Nvidia is not what they want; it's the opposite.

    Intel confirmed that Larrabee technology will be integrated in their next gen of integrated graphics, X5000HD.

    And what made Larrabee different was that it would have been an x86 architecture, just like a CPU. So buying Nvidia is not what Intel would have wanted anyway.

    But IMO Intel is the smart one, not AMD or Nvidia. The future of 3D seems to be ray tracing/voxel trees, not more polygons like tessellation. I have little doubt Intel didn't throw their GPGPU ambitions out the door completely; we'll hear something in the near future. Once others realize more polygons is not the future, Intel will be at the forefront, I'll bet. There is no need to buy Nvidia. Because AMD has a license for x86 they should be able to keep up; it will be Nvidia left in the dust. There is little doubt that's why Nvidia attempted to make an x86 chip but was quickly shut down by Intel, yay for Intel's legal department. Maybe this will never happen, but I think it will.

    This video was Intel's demonstration of real-time ray-traced rendering on an old game, Quake Wars. You can see how smooth and round everything is, without having to buy a $500 Nvidia card for tessellation. What is awesome about ray tracing is that you don't need a graphics card: Quake Wars: Ray Traced can be run on just two quad-core CPUs. Larrabee was projected to have around 32 x86 cores. Maybe in the future when Intel releases it, it will be something like 128 (drool), which will hopefully put Nvidia to shame.

    For programmers, Larrabee would have been a lot easier to work with since it uses x86 instructions. Reflections in water would have taken only about 10 lines of code, compared to what you need on a traditional GPU. You can even program it in C++. Real-time ray-traced rendering is something even my HD 5870M would probably get crushed by. I'm no tech guru, this is just from browsing the web; I'm sure you can find more technical readings and be just as excited. Intel for sure did not quit, they're still at it.

    Video: http://www.youtube.com/v/mtHDSG2wNho (Quake Wars: Ray Traced demo)
     
    Last edited by a moderator: May 6, 2015
  8. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    They did not cancel it, they re-purposed it as a high-throughput computing solution called Knights Corner. It's not going to be a GPU for a while though -- it needs a die shrink to make sense in terms of power consumption. On the other hand, Intel doesn't throw things away, so it will be back eventually.

    They can't. Intel has the money to buy both AMD and Nvidia, and it would have done so a long time ago if it could, but it isn't allowed to because of various regulatory boards. There are only three serious players in GPUs, and a merger of any two of them by any means wouldn't get past anti-monopoly legislation unless one of them was going bankrupt.
     
  9. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    Using ray tracing instead of polygon models seems silly; although it's more realistic, it also sucks in terms of performance (find a game with ray tracing in it, enable it, and then watch your FPS drop).
     
  10. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    But this is the entire point: the performance is awful because the hardware is not designed for it. If Intel makes some hardware that can do it well (granted, a rather substantial "if"), ray tracing will become a lot more appealing.
     
  11. Paralel

    Paralel Notebook Evangelist

    Reputations:
    57
    Messages:
    396
    Likes Received:
    0
    Trophy Points:
    30
    Everything I have read points to this being the case as well. Depending on how fast this matures we could easily see a fundamental shift in how 3D is done within the next 2-5 years.
     
  12. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    not a "fundamental" shift... granted, i may be splitting hairs here.

    i think over the next 5 years as tech matures and becomes mainstream we will start to see a *SLOW* shift in consumer 3d rendering applications (read: games) towards using ray tracing over rasterization.

    I forget the exact quote, but:

    "Ray tracing requires cleverness to be fast. Rasterization requires cleverness to look like ray tracing."

    ----

    If ray tracing doesn't need to be fast because we have hardware that can do it in real time, we will stop investing in rasterization because it is difficult to make look good, but it won't happen overnight, even if intel releases ray tracing hardware.
     
  13. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Aren't Pixar movies still done using polygons (plus ray tracing, of course)? I think polygons have a while before they are completely replaced by voxels. I'm not too sure how the ray-tracing model works, but I would assume tessellation still has a use.
     
  14. SoundOf1HandClapping

    SoundOf1HandClapping Was once a Forge

    Reputations:
    2,360
    Messages:
    5,594
    Likes Received:
    16
    Trophy Points:
    206
    In addition to ray tracing, wasn't there something else that involved clouds of points? I remember there being a thread on that from way back when.

    Yes, yes, AMD/ATI good, nVidia evil, we know, we know. Don't have to keep reminding us every <s>other</s> GPU-related post.
     
  15. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Found your bias. Nvidia has competitive offerings at a large, normal range of price levels against AMD/ATI. The end.
     
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Lol it's Nvidia who refuse to license their technology on AMD products.
     
  17. Paralel

    Paralel Notebook Evangelist

    Reputations:
    57
    Messages:
    396
    Likes Received:
    0
    Trophy Points:
    30
    Agreed :p

    But I think you're correct. I want to see it sooner rather than later so that's probably why I want to believe in an aggressive timeline for implementation.
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Ray tracing requires considerable computing horsepower, and to render games in real time with it, at graphics quality comparable to DirectX 11, I don't think they're even close. Not to mention all the tricks used for animation and realism with polygons and textures. Ray tracing definitely increases the realism from a static-image perspective, but I still don't see how it can suddenly enter the market as an IGP and take over current 3D technology. Perhaps over time it will be integrated with existing 3D tech and evolve, but a sudden shift? I don't see it happening.
     
  19. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    ray tracing is a rendering technology. it doesn't get rid of polygons or textures... i think some people were suggesting that earlier...
     
  20. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Asset loading needs to improve first; hopefully with all the new tech in carbon nanowires, and stuff like Light Peak, the time to retrieve data will become much quicker. Voxels, I believe, load into memory on the fly, not preloaded like most current-generation games. That, and when we can have 1TB of RAM :), infinite textures.