The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Intel Larrabee - A match for Nvidia/ATI?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Howitzer225, Jun 21, 2009.

  1. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
    Intel Corporation is presenting a paper at the SIGGRAPH 2008 industry conference in Los Angeles on Aug. 12 that describes features and capabilities of its first-ever forthcoming "many-core" blueprint or architecture codenamed "Larrabee."

    Details unveiled in the SIGGRAPH paper include a new approach to the software rendering 3-D pipeline, a many-core (many processor engines in a product) programming model and performance analysis for several applications.

    The first product based on Larrabee will target the personal computer graphics market and is expected in 2009 or 2010. Larrabee will be the industry's first many-core x86 Intel architecture, meaning it will be based on an array of many processors. The individual processors are similar to the Intel processors that power the Internet and the laptops, PCs and servers that access and network to it.

    Larrabee is expected to kick start an industry-wide effort to create and optimize software for the dozens, hundreds and thousands of cores expected to power future computers. Intel has a number of internal teams, projects and software-related efforts underway to speed the transition, but the tera-scale research program has been the single largest investment in Intel's technology research and has partnered with more than 400 universities, DARPA and companies such as Microsoft and HP to move the industry in this direction.

    Over time, the consistency of Intel architecture and thus developer freedom afforded by the Larrabee architecture will bring about massive innovation in many areas and market segments. For example, while current games keep getting more and more realistic, they do so within a rigid and limited framework. Working directly with some of the world's top 3-D graphics experts, Larrabee will give developers of games and APIs (Application Programming Interface) a blank canvas onto which they can innovate like never before.

    Initial product implementations of the Larrabee architecture will target discrete graphics applications, support DirectX and OpenGL, and run existing games and programs. Additionally, a broad potential range of highly parallel applications including scientific and engineering software will benefit from the Larrabee native C/C++ programming model.

    http://www.intel.com/pressroom/archive/releases/20080804fact.htm

    Just came across this new GPU, supposedly to be launched by mid-2009. Intel's soon-to-be-released discrete graphics solution will utilize GDDR5 memory and feature x86 cores to do graphics processing. While it may not be aimed at the mainstream GPU market, I'm guessing Intel's Larrabee could end up there if it succeeds. The news is as old as 2008, but I've yet to see any developments this year. Any further info on this? I'm curious since I'm used to Intel releasing IGPs, and now they plan to release a high-end graphics card.
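
    To give a rough idea of what the "Larrabee native C/C++ programming model" in the press release might mean in practice, here's a minimal sketch: a data-parallel loop split across however many x86 cores are present, written with ordinary C++ threads. It's only my illustration of the many-core idea, not actual Larrabee code (Intel hasn't released its toolchain), and the array-scaling workload is made up.

    Code:
    #include <cstddef>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Scale one chunk of the array; each core runs one of these.
    static void scale_chunk(float* data, size_t begin, size_t end, float k) {
        for (size_t i = begin; i < end; ++i)
            data[i] *= k;
    }

    int main() {
        const size_t n = 1 << 20;
        std::vector<float> pixels(n, 1.0f);

        // One worker per hardware thread; on a many-core chip this number is
        // simply bigger, and the code does not change.
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 4;

        std::vector<std::thread> workers;
        const size_t chunk = n / cores;
        for (unsigned c = 0; c < cores; ++c) {
            size_t begin = c * chunk;
            size_t end = (c + 1 == cores) ? n : begin + chunk;
            workers.emplace_back(scale_chunk, pixels.data(), begin, end, 0.5f);
        }
        for (auto& w : workers) w.join();

        std::printf("ran on %u threads, pixels[0] = %f\n", cores, pixels[0]);
        return 0;
    }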
     
  2. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    If it goes the way of Intel's previous IGP products in development and execution, Larrabee isn't going to make anyone over at Nvidia or AMD/ATI lose any sleep.

    If Intel pulls a rabbit out of the hat and releases a product that can genuinely compete with the established companies, then things will get interesting.
     
  3. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Problem is, will game devs actually support it when it's released, or will it just be a half-used product?

    It seems like a good idea though :)
     
  4. Howitzer225

    Howitzer225 Death Company Dreadnought

    Reputations:
    552
    Messages:
    1,617
    Likes Received:
    6
    Trophy Points:
    56
    I'm guessing they finally got bored of churning out IGPs and decided to create a more capable graphics card of their own. It's planned to be manufactured on 45nm, but will differ from the usual Nvidia/ATI cards in the sense that the technology stems from the Pentium, Core and Core 2 Duo processors. Bound to be called a GPGPU rather than a GPU, it's really weird to be using the microarchitecture found in processors rather than a graphics core. :confused: I'm guessing that if Intel launches this on a large enough scale, game developers will soon follow suit. :p

    Release has been pushed back to the first half of 2010. Still, it's a big step for a company known mostly for its underpowered integrated graphics solutions. :rolleyes: http://www.tomshardware.com/news/intel-larrabee-gpgpu-gpu-cpu,7815.html#xtor=RSS-181
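
    To make the "graphics on x86 cores" idea concrete, here's a toy software-rasterizer inner loop in plain C++ - the kind of thing that on Larrabee would run as ordinary code instead of on fixed-function hardware. It's only a textbook edge-function test and has nothing to do with how Intel's real renderer is structured.

    Code:
    #include <cstdio>
    #include <vector>

    struct Vec2 { float x, y; };

    // Signed area of the parallelogram (a->b, a->c); positive if c is to the
    // left of the edge a->b.
    static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    int main() {
        const int w = 16, h = 16;
        std::vector<char> fb(w * h, '.');        // tiny "framebuffer"
        Vec2 v0{1, 1}, v1{14, 3}, v2{6, 14};     // one counter-clockwise triangle

        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                Vec2 p{x + 0.5f, y + 0.5f};      // sample at the pixel centre
                // Inside if the point is on the same side of all three edges.
                if (edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 && edge(v2, v0, p) >= 0)
                    fb[y * w + x] = '#';
            }
        }
        for (int y = 0; y < h; ++y)
            std::printf("%.*s\n", w, &fb[y * w]);
        return 0;
    }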
     
  5. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Intel is putting a whole lot of effort into this -- IIRC they've bought a company to make games for it and they're definitely pushing it in the film industry and in scientific communities that do a lot of computing. I'm not sure how good it will be as a graphics card (probably quite good assuming they can get the drivers to work properly), but if they can deliver 32+ cores per unit, it will have a lot of potential in high performance computing.
     
  6. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Which will be better at graphics processing?

    General-purpose processors, or specialized graphics processors?

    There will definitely be some science applications that benefit from this, but I just don't see Intel pulling magic out of their hats and competing with the established GPU companies on blockbuster games. Consider that current GPUs already use a modular "add cores for more performance" model, and GPGPU already exists as a niche on Windows and is going mainstream on the Mac in a few months. I think Intel is actually just trying to keep up, rather than innovate, in this particular situation.

    I have a sneaking suspicion that this product will be approximately as over-hyped and underwhelming as many - nay, all - of their prior products in this market.
     
  7. Kamin_Majere

    Kamin_Majere =][= Ordo Hereticus

    Reputations:
    1,522
    Messages:
    2,680
    Likes Received:
    0
    Trophy Points:
    55
    I don't know, it seems that when Intel puts their mind to something they tend to excel at it. AMD beat them out for a while in CPUs, then they buckled down and have pretty much been winning since. They wanted to build an SSD and they ended up making one of the best on the market (if not the best).

    So if they put their mind to making the best GPU, I'm pretty sure they have the resources (both money and people) to make it happen.
     
  8. Lum-X

    Lum-X Notebook Evangelist

    Reputations:
    27
    Messages:
    665
    Likes Received:
    10
    Trophy Points:
    31
    In my opinion this will be more for pros and not for gaming. No doubt they will be good, though.

    The problem is the libraries that devs use. Most of them use Nvidia's, which is why games usually perform better on Nvidia GPUs, but ATI truly has the better GPU at the moment, and it could perform better if the code were just a bit more optimised for that architecture. It's like having to translate something written against Nvidia libraries over to ATI.
     
  9. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Game developers do not use Nvidia libraries to make their games.

    The graphics libraries come from:

    a) Microsoft (DirectX)
    b) the Khronos Group (OpenGL)

    Nvidia, AMD/ATI, and Intel are all members of the Khronos Group, among others.

    Don't spread misinformation.
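
    That's also why the same rendering code runs on anyone's card: the application talks to DirectX or OpenGL, and only the driver underneath is vendor-specific. As a trivial illustration (assuming freeglut is installed just to provide a GL context; the header path differs by platform), the API only mentions the vendor when you explicitly ask for it:

    Code:
    #include <GL/glut.h>   // <GLUT/glut.h> on macOS
    #include <cstdio>

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutCreateWindow("caps");   // a context must exist before glGetString

        // Same calls whether the driver is from Nvidia, ATI or Intel.
        std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
        std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
        std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
        return 0;
    }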
     
  10. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Actually, Apple could really grab this by the horns here.

    OpenCL in the new Snow Leopard OS would be able to use those cores fully, with much more functionality, adding much more processing power to the laptop in general.
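
    For what it's worth, this is roughly what that looks like from the programmer's side: OpenCL just enumerates whatever compute devices the machine has (CPU cores show up as a device too) and you queue work to them. Sketch only, assuming an OpenCL runtime is installed; the cap of 8 devices is an arbitrary choice here.

    Code:
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif
    #include <cstdio>

    int main() {
        cl_platform_id platform;
        cl_uint ndev = 0;
        if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS) return 1;

        // Ask for every device type: discrete GPUs and the host CPU alike.
        cl_device_id devices[8];
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &ndev);
        if (ndev > 8) ndev = 8;

        for (cl_uint i = 0; i < ndev; ++i) {
            char name[256];
            cl_uint units = 0;
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(units), &units, nullptr);
            std::printf("device %u: %s (%u compute units)\n", i, name, units);
        }
        return 0;
    }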
     
  11. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    As far as I know, this would defeat the purpose of OpenCL if it catches on and takes over the market (unlikely, IMO).

    OpenCL takes the graphics processors in Nvidia and ATI GPUs and opens them up for general-purpose computing.

    This is more like a general-purpose computing chip doing graphics, and therefore wouldn't rely on OpenCL.

    I feel like OpenCL is going to be the next big thing, not this. I guess I have already made this comment though ;)
     
  12. Lum-X

    Lum-X Notebook Evangelist

    Reputations:
    27
    Messages:
    665
    Likes Received:
    10
    Trophy Points:
    31
    As far as I know, a game is developed on DirectX or OpenGL. That's the base, but to get better performance they have some libraries.

    Crysis, for example, was optimized for Nvidia.

    A friend of mine who works at Nvidia told me so. They have an SDK that developers use to optimize a game for a specific GPU architecture. Every time you open a game it detects your GPU and loads some code that works better on that architecture. This is what I meant.
     
  13. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    I think Larrabee will fail, unless future games are engineered towards ray tracing.
     
  14. BlitZX

    BlitZX Notebook Consultant

    Reputations:
    5
    Messages:
    166
    Likes Received:
    0
    Trophy Points:
    30
    Real-time ray tracing is still many years away, unfortunately (impressive stuff, though). Rasterization remains the quickest/easiest way to render images on today's hardware. Maybe in 10 years or so, who knows.
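
    To put some numbers behind "many years away": ray tracing fires at least one ray per pixel and intersects it with the scene, so even the toy arithmetic below has to run millions of times per frame before you add shading, shadows or bounces. A bare-bones C++ sketch with a single sphere, purely to show the per-pixel cost; the scene and resolution are made up.

    Code:
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    static float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Does a ray from the origin along dir hit a sphere at centre c, radius r?
    static bool hits_sphere(const Vec3& dir, const Vec3& c, float r) {
        // Solve |t*dir - c|^2 = r^2 for t; a hit exists if the discriminant is
        // non-negative and the nearer root lies in front of the camera.
        float a = dot(dir, dir);
        float b = -2.0f * dot(dir, c);
        float k = dot(c, c) - r * r;
        float disc = b * b - 4.0f * a * k;
        if (disc < 0.0f) return false;
        float t = (-b - std::sqrt(disc)) / (2.0f * a);
        return t > 0.0f;
    }

    int main() {
        const int w = 24, h = 12;
        Vec3 centre{0.0f, 0.0f, -3.0f};
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                // Map the pixel onto an image plane at z = -1.
                Vec3 dir{(x + 0.5f) / w * 2.0f - 1.0f,
                         1.0f - (y + 0.5f) / h * 2.0f,
                         -1.0f};
                std::putchar(hits_sphere(dir, centre, 1.0f) ? '#' : '.');
            }
            std::putchar('\n');
        }
        return 0;
    }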

    Intel really needs to make an impression upon release, and even if they do, NVIDIA and ATI aren't going anywhere anytime soon. Although I'm still pissed that they integrated their higher-clocked GMA4500 with Arrandale. I was looking forward to coupling it with a 40nm 9400M variant. :mad:

    Btw, the Intel GMA team isn't working on Larrabee; the Oregon team (the people who designed Nehalem) is. They're the same people who are developing the 22nm Haswell architecture, which could mean integrated graphics based on Larrabee could very well arrive with Haswell in 2012.
     
  15. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
  16. Lum-X

    Lum-X Notebook Evangelist

    Reputations:
    27
    Messages:
    665
    Likes Received:
    10
    Trophy Points:
    31
    Real-time ray tracing is also hard to program, power hungry and complex, and developers would have to learn new APIs.

    Things would get too complex for the gaming industry, but for CAD and 3D applications it's a big advantage in render time, and we will see better quality in less time.

    ATI and Nvidia would also have to develop new APIs and new GPU architectures.