The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Intel ends its plans for Larrabee as a discrete GPU

    Discussion in 'Gaming (Software and Graphics Cards)' started by Phinagle, May 26, 2010.

  1. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Technology@Intel An Update On Our Graphics-related Programs

     
  2. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    Too bad. I was really hoping Intel would lead the way for alternative graphics methods, like ray tracing and voxel-based 3D rendering for games. I read some interesting blogs from Carmack and was thinking that Intel's new graphics would be able to make his wishes for id Tech 6 a reality. I was also hoping Intel's acquisition of Havok was an indication of some exciting Intel graphics solutions to come.
     
  3. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Unfortunately, it means Intel will go back to "concentrating" on their integrated cards, which are no longer the only option considering what Nvidia and ATI have to offer.
     
  4. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    They made the right call -- it's too late to get into the discrete GPU market. Intel's on-package IGPs are about halfway to being good enough, and AMD's Fusion will almost certainly be a lot better. With any luck at all, the era of discrete graphics cards will finally end in the next few years.
     
  5. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    Intel IGPs aren't half bad at what they do; they do what they were designed for very well, imo.

    However, I like having discrete graphics cards; it means you can replace them and swap them around. Think about it: when an IGP goes out of date (things in the gaming world go out of date VERY quickly), you have to buy a whole new mobo. If you're a gamer with a decent mobo like an Asus Rampage or something like that, it would mean spending an extra £200 each time you need to upgrade.
     
  6. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
    ?? Yeah, but most of those mobos are in the realm of $2,000+, unless they are talking about a desktop. I think the wide range of laptop users will never see their GPU switched for another in their laptops, unless (obviously) they buy a new one.
     
  7. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Intel's latest HD Graphics integrated onto i3 and i5 packages are pretty decent, and with proper drivers they could match a Radeon 3200.
     
  8. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Being as good as the Radeon HD 3200 isn't going to be that impressive when Llano APUs will have more Evergreen-based DX11 stream processors than a Mobility HD 5650.

    That said though, even with an IGP potentially that strong I will always want a discrete GPU to go with it...running in hybrid crossfire. :biggrin:
     
  9. thinkpad knows best

    thinkpad knows best Notebook Deity

    Reputations:
    108
    Messages:
    1,140
    Likes Received:
    0
    Trophy Points:
    55
    The "era"? What era? There was/is no "era" of discrete cards... to say that is just very simplistic, and there won't be an "era". There has always been a market for performance enthusiasts, and what about workstations as well? There always will be. Integrated GPUs and discrete GPUs have run in parallel, usually with discrete GPUs being ahead by a lot and integrated finally catching up to, let's say, 3-5 generation old discrete card technology, but still trailing behind newer discrete GPUs, and so on and so forth.
     
  10. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Uh, yeah. Discrete will never go away, period. It's nice to have decent performing integrated GPUs, but discrete will always have over ten times the performance.
     
  11. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    The era when a discrete GPU is more or less required for something a significant fraction of people want to do -- gaming. They will still be used for professional purposes (note that Intel is not giving up on this one) and enthusiasts will still insist on them, but it won't be something for the mass market anymore. Think about what happened to discrete sound cards: you can still buy one if you are interested, but very few people bother and you won't find them in the common HPs/Dells/etc.

    If the integrated variety becomes good enough, the discrete will fall by the wayside. They'll still be purchased by the "60 FPS in Crysis III or bust" crowd, but that market is not large enough for a company like Intel to get into it. The way the gaming industry currently stands, they just have to match the consoles, and that is not a very high standard.
     
  12. thinkpad knows best

    thinkpad knows best Notebook Deity

    Reputations:
    108
    Messages:
    1,140
    Likes Received:
    0
    Trophy Points:
    55
    I think that "era" must have been an illusion; even then the market share was about 70-90% Intel, with the rest split among the major dedicated card companies. The consoles are not a high standard to beat at all; they're getting terribly outdated, with those crappy 7900 GTX-based chips in them and raw processing power apparently making up for it.
     
  13. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    Of which they make little use *cough cough PS3*.

    I honestly couldn't care less what the graphics card market does, provided they still perform well and I can yank them out of one computer and put them in another; that's all that really matters to me.
     
  14. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    As nice as the whole GPU-on-CPU-die package sounds, there's no telling how they could be sold. Combinations of low to high end CPUs with GPUs could be endless, though it would be nice for DAAMIT to rest on a single GPU until the next CPU generation. A quad core + 400 SP GPU would be a very good starting point. Relatively good graphics paired with a great CPU could provide very good video/audio decode, graphics, or perhaps GPGPU physics while a discrete board handles graphics output.

    There is a lot of potential, but it comes with various needs, like chip yields, power requirements, and, I think most importantly, memory bandwidth. Current DDR3-1333 dual channel provides 21 GB/s of bandwidth, and that would not be enough to get the most out of a quad core + 400 SP ATi GPU part. At bare minimum, I think 40 GB/s would be my preference. A 400 SP part like the desktop 5570 or laptop 5650 gets around 25 GB/s of memory bandwidth on GDDR3 or DDR3 128-bit buses, which is performance-limiting for them in many cases. The more bandwidth the better.
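    The bandwidth figures quoted above follow from the standard peak-bandwidth formula (transfer rate × bus width × channels). A minimal sketch; the function name and the ~1600 MT/s figure for the 128-bit graphics bus are illustrative assumptions, not from the post:

    ```python
    def peak_bandwidth_gbs(mega_transfers_per_sec, bus_width_bits, channels):
        """Theoretical peak memory bandwidth in GB/s.

        MT/s * (bits / 8 bytes per transfer) * channels, scaled to GB/s.
        """
        bytes_per_transfer = bus_width_bits / 8
        return mega_transfers_per_sec * bytes_per_transfer * channels / 1000

    # DDR3-1333, 64-bit channels, dual channel -> the "21 GB/s" cited above
    print(round(peak_bandwidth_gbs(1333, 64, 2), 1))   # 21.3

    # A 128-bit GDDR3/DDR3 bus at an assumed 1600 MT/s -> the ~25 GB/s class
    print(round(peak_bandwidth_gbs(1600, 128, 1), 1))  # 25.6
    ```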
     
  15. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    That isn't surprising... it's been looking like a failure ever since I heard of it... good thing Intel killed it instead of wasting money. Anyways, Intel GPUs have always been crap.
     
  16. f4ding

    f4ding Laptop Owner

    Reputations:
    261
    Messages:
    2,085
    Likes Received:
    0
    Trophy Points:
    55
    Uh, Larrabee was supposed to compete with CUDA and Stream/OpenCL or whatever they have, pretty much GPGPU stuff. Not sure whether this is a good move or not for Intel; maybe they've perfected parallelism on the CPU front. But they'd better get in the game somehow with massively parallel computing, or they'll lose server market share to DAAMIT or Nvidia in the not too distant future.