The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Larabee will sting?

    Discussion in 'Gaming (Software and Graphics Cards)' started by MonkeyMhz, Apr 6, 2009.

  1. MonkeyMhz

    MonkeyMhz Notebook Evangelist

    Reputations:
    68
    Messages:
    303
    Likes Received:
    0
    Trophy Points:
    30
    What are your opinions? Would you be willing to switch GPU vendors?
     
  2. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    I wouldn't switch.
     
  3. yuio

    yuio NBR Assistive Tec. Tec.

    Reputations:
    634
    Messages:
    3,637
    Likes Received:
    0
    Trophy Points:
    105
    Let's see...
    Intel owns the mobile PC CPU market
    Intel owns a HUGE majority of the desktop PC market
    Intel owns ALL of the Apple market
    Intel owns the netbook market
    Intel owns the IGP market (more or less)
    Intel will own the SSD market, considering how good they are
    Intel has a solid presence in the Wi-Fi market
    Intel has a very strong presence in the chipset market

    No, I'd much rather support ATi or Nvidia.
     
  4. mrzzz

    mrzzz Notebook Consultant

    Reputations:
    49
    Messages:
    179
    Likes Received:
    10
    Trophy Points:
    31
    Hehe, fanboys :)
    Gamers will support whoever has the best hardware with the most third-party support, and at the moment that would be Nvidia. The tide is changing to ATI again though, yay.
     
  5. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    I honestly don't see Intel gaining a foothold in the performance graphics market. The cost will be high, or Intel will have to purposely lose lots of money just to take a bite out of ATi or Nvidia. I do definitely see the Larrabee architecture becoming Intel's standard IGP set, but it comes down not only to capability but to power efficiency as well. Sure, Intel has the smaller process node down to a science, but by the time Larrabee makes its real debut, ATi and Nvidia will have smaller-process GPUs that are more powerful, cheaper, and more power-efficient for the performance than what is available now.

    And I'll admit I'm a bit of a fanboy for AMD CPU products, and I don't want to see Intel become a monopoly either. It took enough hitting myself in the head just to break down and get this Asus because it was such a good deal. I really wanted an AMD system, but couldn't beat this laptop for the price and the graphics performance.
     
  6. Hep!

    Hep! sees beauty in everything

    Reputations:
    1,806
    Messages:
    5,921
    Likes Received:
    1
    Trophy Points:
    206
    I couldn't care less who makes it: if it performs, it performs. I would buy it.
    Benchmarks alone would not win me over; I'd want to see equal or lower power consumption, heat output, etc. as well. But yeah, I'd buy it.

    Intel already tried this, though: they made a dedicated GPU several years ago, and I am sure some of you remember it. It was a huge flop. I don't have high expectations.
     
  7. elijahRW

    elijahRW Notebook Deity

    Reputations:
    940
    Messages:
    1,797
    Likes Received:
    0
    Trophy Points:
    0
    Well... Nvidia GPUs got that heat issue, and I haven't used ATI gaming cards, and Intel does make nice CPUs and chipsets IMO, so I would pick the first option and go with Intel ;)
    I like them :p
     
  8. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    Honestly, I don't see Intel's "Larrabee" offering the performance necessary to convert power users from either Nvidia or ATI GFX.
     
  9. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106
    If it makes a good platform for switchable graphics, why not? But as a high performance GPU? I think not.
     
  10. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Intel has money to invest. Keep in mind that its market capitalization is $88.2B, whereas AMD's is $2.35B and Nvidia's is $6.16B: Intel is around 10 times bigger than Nvidia and AMD combined. What they need to do is get Larrabee into a console and/or buy some studios to build decent games for it.

    Although there is very little information out yet, I suspect the initial iteration of Larrabee will not be that great: either the TDP will be too high or the performance will be too low. However, Intel is great at refining stuff, and they can price it competitively and absorb the early losses while the dies are being shrunk. And of course there is always a chance it might surprise everyone and be the best GPU on the market. We'll see.
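    The "around 10 times bigger" figure above follows directly from the quoted market caps. A quick check of the arithmetic (the dollar figures are the ones stated in the post, circa April 2009, not current data):

```python
# Market caps as quoted in the post above, in billions of dollars (April 2009).
intel_cap = 88.2
amd_cap = 2.35
nvidia_cap = 6.16

# How many times larger Intel is than AMD and Nvidia combined.
ratio = intel_cap / (amd_cap + nvidia_cap)
print(f"Intel is about {ratio:.1f}x the size of AMD and Nvidia combined")
```

    The combined AMD + Nvidia figure is $8.51B, so the ratio comes out just above 10, matching the post's claim.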
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    If Intel can swim with the sharks, then so be it. It will only force nVidia and ATI to get off their butts and beat Intel. Remember, competition drives technology.
     
  12. cathy

    cathy Notebook Evangelist

    Reputations:
    47
    Messages:
    551
    Likes Received:
    0
    Trophy Points:
    30
    But think about how poor Nvidia will crumble and die! :(
     
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Poor nVidia? The company that has monopolized the GPU market for years? Please. nVidia will not crumble and die. Trust me: the weak product lines they've offered lately are only because ATI has had trouble keeping up. Watch them come out with some pretty interesting stuff in short order that they've kept in their back pocket. It's called business...
     
  14. cathy

    cathy Notebook Evangelist

    Reputations:
    47
    Messages:
    551
    Likes Received:
    0
    Trophy Points:
    30
    I'm just kidding really. The 4th option amused me so much I decided to follow up on it. :p
     
  15. theneighborrkid

    theneighborrkid Notebook Evangelist

    Reputations:
    161
    Messages:
    655
    Likes Received:
    0
    Trophy Points:
    30
    So what about the opposite of this and letting nVidia into the CPU world...
     
  16. heliopath

    heliopath Notebook Guru

    Reputations:
    0
    Messages:
    66
    Likes Received:
    0
    Trophy Points:
    15
    Well, let's face it: Nvidia is still a big company, and everyone likes an underdog. But since AMD and Nvidia have been in the GPU market for a lengthier period of time, they should be able to adapt and continue to expand. The only thing Intel has is a lot more funds to throw around. But remember, Intel will still want to keep up the expansion of their CPUs and possibly expand into other areas to dominate the market. (I would not be surprised if they turned around and attempted to make LCD screens... overclockable LCD screens... actually, is that possible? Imagine upping the refresh rates to such an extent that you notice no difference anyway... pretty much as it is now.)

    Wasn't Intel going to combine the CPU and GPU into one? Haven't they started that on netbooks? But then again, I think Nvidia has already done that with a customised HP...

    As for AMD, the 4870 X2s in CrossFire are just insane. What these *underdogs* need to do is keep pushing the envelope instead of just rehashing (I already asked you to stop that, Nvidia, you naughty little profiteering company... next time that happens it's going to be me putting you in time out and going AMD or Intel, you have been warned), and while pushing the envelope, invest in ways of making the cards cheaper to produce while shrinking the gap between lower-end and higher-end models.

    But then again, I may just invest in an old Atari and enjoy the use of dots for entertainment... that, or attack random innocent really old games that no longer push the envelope and mod 'em to run better with newer ideas added in (e.g. the strategic view in Supreme Commander), thereby expanding that old universe in new ways... but that takes effort...

    And now, having forgotten what I started talking about: I would try Larrabee as long as they don't pull a Telstra (if you live in AUS you would know they are the reason the internet is so expensive and slow, that monopolising company)...

    As was stated by previous posters, competition drives innovation, and that is what the masses want... or is it??


    I guess I should end the post at some point??
     
  17. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    For mobile users it will sting; for gaming powerhouse usage it will just stink.

    No, Nvidia won't just die off; they are trying their best to acquire/work with VIA. ;)
     
  18. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    So what we're seeing is an integrated GPU that can probably compete with the low-end dedicated GPUs currently on the market. Sounds great if you're a casual notebook user who plays games on the side.

    Other users who prioritise GPU power over every other component in a notebook configuration will just continue to pick up the high-end GPU notebooks as they've always done.
     
  19. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    That's the thing: we have no idea which market segment it will be targeted at.
     
  20. Dox@LV2Go

    Dox@LV2Go Notebook Consultant

    Reputations:
    247
    Messages:
    188
    Likes Received:
    0
    Trophy Points:
    30
    I'd like to see someone do something different.
    If Larrabee works out, at least it's a next step rather than the usual "add more shaders, increase the clocks" type of thing.

    Two GPU makers in the market is not always fun; with three it's going to be interesting. Look at the Xbox 360, Wii and PS3 :D
     
  21. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    I'd like to program on it: raytracing and other fancy stuff, or sound-crunching (DSP style).

    I don't like the programming models GPUs force upon me, and I especially dislike Nvidia in their marketing, their design choices, and the way they design their coding packages.
     
  22. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    I think at this point we do have some idea. For instance, it's almost certainly not aimed at the people who only need integrated graphics. The most interesting question left is whether it is intended to compete with the most powerful of Nvidia's and ATI's cards or only at the mid-range.
    Then you'll quite likely be pleased with Larrabee -- it's a bunch of x86 cores with a few extensions for the instruction set.
     
  23. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    If it's capable at mid-range, it'll eat a lot of Nvidia's and ATI's pie. Well, that's good too: by then, both of them will probably have raised the benchmark for mid-range GPUs.
     
  24. Beatsiz

    Beatsiz Life Enthusiast

    Reputations:
    95
    Messages:
    1,411
    Likes Received:
    0
    Trophy Points:
    55
    How about Larabee + SLI GTX 300's ?

    :D
     
  25. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    Larrabee is pretty much a project to prove that the concept WORKS at this point. They've so far managed to get tons of P4 cores to run games, and they know that each added core gives a near-linear performance boost. But the performance of even 32 cores is terrible, and the approach they're taking will be extremely costly with HUGE power consumption. For Larrabee to become reality and compete with ATI/nVidia at this point, it would need several HUNDRED x86 CPU cores. They'd be far better off cost-wise just using the standard method of creating video cards.

    I'll be surprised if they actually make anything out of this.
     
  26. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Do you have a source for this? I ask because I went looking for information on Larrabee a couple of weeks ago, and there wasn't a single site that gave even a ballpark scale of its performance. Pretty much every one had this old graph, but of course that isn't actually Larrabee; it's just an emulator that demonstrates the linear scaling of the architecture, and even here the y-axis is completely arbitrary: "1" could correspond to 1 FPS or 100 FPS.
     
  27. Narroo

    Narroo Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    MMMmmm. So who's Sony, and who's Nintendo? This is Deja Vu.
     
  28. MonkeyMhz

    MonkeyMhz Notebook Evangelist

    Reputations:
    68
    Messages:
    303
    Likes Received:
    0
    Trophy Points:
    30
    i don't like the programming models gpu force upon me

    What do you mean by that exactly? Like what the GPU accelerates, or what it makes possible?

    You're not disliking APIs, are you? Because if companies started making their own APIs, that would be the last thing the industry needs: a bullet in its head.
     
  29. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    There was an entire section about Larrabee on Intel's own website a few months ago; they had a few videos talking about it, and one showing it rendering a scene as a proof of concept, including all of the SIGGRAPH whitepapers and graphs from their simulations.

    Reading back on a few things, I think Larrabee MIGHT actually work well. If they can design a 2 GHz Atom CPU with a TDP of 3 W, like they just did, I bet they can get a 100+ core GPU... thing, within the watt usage of other mainstream high-end GPUs.
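    A back-of-envelope check on that estimate (the 3 W per-core figure is the Atom TDP quoted above; whether 100 such cores fit a GPU power budget depends on how much of that 3 W is chipset/uncore overhead the cores could share):

```python
# Back-of-envelope power budget for the 100-core estimate above.
# 3 W per core is the 2 GHz Atom TDP quoted in the post; the total
# is a naive upper bound that ignores shared uncore power.
cores = 100
watts_per_core = 3.0  # W

total = cores * watts_per_core
print(f"{cores} cores x {watts_per_core:.0f} W = {total:.0f} W total")
```

    At a full 3 W per core, 100 cores would total about 300 W, which is above the roughly 290 W TDP of the biggest dual-GPU cards of that era, so the per-core power of a shipping part would need to come in well under the standalone Atom figure.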
     
  30. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    Actually, with their new transistors, they could squeeze Nehalem into a 12W TDP.

    I'd bet any firm would pay out the ... bum, for that kind of power savings.
     
  31. Ayle

    Ayle Trailblazer

    Reputations:
    877
    Messages:
    3,707
    Likes Received:
    7
    Trophy Points:
    106