The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Quad core Intel GMA x3100...

    Discussion in 'Gaming (Software and Graphics Cards)' started by talin, Oct 22, 2007.

  1. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    Just a thought, and it might be silly, but since Intel has had dual and quad core processors out for a while, and they're supposedly working on a high-performance integrated graphics solution to come in 2008, why not start introducing dual/quad core graphics? I imagine it could be very competitive, price- and performance-wise, and I'm sure it wouldn't be difficult to implement something like disabling all but one core in power-saving mode.
    I think it would be a great idea, or am I just talking nonsense?
    And on a separate note, I can't understand why Nvidia and ATI can't do that as well: instead of coming out with multiple graphics cards, why not just multiple cores?
     
  2. Airman

    Airman Band of Gypsys NBR Reviewer

    Reputations:
    703
    Messages:
    1,675
    Likes Received:
    1
    Trophy Points:
    55
    It's not that easy, first of all. Second, the main source of heat and power consumption is basically the GPU; now imagine that x4, and there's your answer.

    Some notebooks do come with SLI, which is dual cards.

    The only quad-SLI is being developed for desktops, hopefully with extremely good cooling systems. One never knows; they may be out for notebooks sometime in the future, but don't expect them in any ultraportable.

    http://www.pcper.com/article.php?aid=218&type=expert&pid=1
     
  3. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    Well, not easy, but possible. 3dfx did it with their Voodoo5, and that was how many years ago?
    A friend of mine who used to work for Nintendo once told me they had a 128-bit system long before they even came out with the N64.
    It just seems economics stifle development. Capitalism at its best. :rolleyes:
    And no, I'm not a communist. :p
    And yes, I know SLI has been around for a little while already. I just thought, instead of using multiple PCBs, why not use multiple chips or cores on a single graphics card?
     
  4. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Graphics cards are already extremely multithreaded, and have been for a very long time. The X3100 has 8 stream processors. It would be relatively inefficient to put multiple dies on a single card, though it's been done before (see the 7950GX2).
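    To make the "already multithreaded" point concrete, here's a minimal CUDA sketch (CUDA is just an assumption for illustration; the X3100 itself doesn't run it). The GPU scales by launching one lightweight thread per pixel across its shader units, rather than by bolting on whole extra dies:

        // Minimal sketch: one thread per pixel, thousands in flight at once.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void shade(float *pixels, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's pixel
            if (i < n)
                pixels[i] = pixels[i] * 0.5f + 0.25f;       // stand-in "shader" math
        }

        int main() {
            const int n = 1 << 20;                  // ~1M pixels
            float *d = nullptr;
            cudaMalloc(&d, n * sizeof(float));
            cudaMemset(d, 0, n * sizeof(float));
            shade<<<(n + 255) / 256, 256>>>(d, n);  // 4096 blocks x 256 threads
            cudaDeviceSynchronize();
            cudaFree(d);
            return 0;
        }

    Adding CPU-style "cores" wouldn't change this model; a bigger GPU just runs the same tiny program over more pixels at once.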
     
  5. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    Well, OK, the X3100 was a bad example. :p But you get the point. I meant in the future, with future integrated graphics offerings. :)
     
  6. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    59
    Trophy Points:
    466
    Dual integrated cards...well, you'd have 1.5-2x the performance of a single card...which is crappy enough as it is.

    I seriously doubt we'll see something like that, mostly because IGPs are meant for 2D applications (not gaming!) and provide DARN good battery life for notebooks. For their intended usage, they're perfect. Doubling up just drains the battery, steals more RAM, and provides only a little extra gaming performance... and these cards are not meant for it.

    Anything that can be 'dual-ed' will come in the form of SLI sooner than a dual die. Thermal limits on 15.4" machines are already being hit with the 8600GT, so no dual card there. 17" notebooks already have dual 8700GTs, with the possibility of dual 8800GTs when they come out. Making a dual die would just cost more, as you're prone to even more manufacturing problems when making larger devices.
     
  7. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    No, the example was fine; I was just pointing out that the concept of "multi-core" configurations, as seen in CPUs, does not apply to GPUs. GPUs are already set up to handle a ridiculous number of "threads" simultaneously. I think you're suggesting putting multiple dies on the same board, which is much less efficient than simply increasing the processing power of a single die.
     
  8. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    Hmm, you both made good points. I know integrated graphics were never meant for gaming, but I believe there will be a time when they offer that (even if just light/moderate gaming). The recent article here about Intel working on a solution that will offer up to 3x more power points to that.
    Yes, they are great for battery life, but it wouldn't be difficult to simply have a feature that disables all but one core/chip when on battery power. :)
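    Something like the sketch below, hypothetically speaking. No such driver API exists; set_active_gpu_cores and on_power_source_change are invented names, just to show how simple the policy itself would be:

        // Hypothetical power-saving policy; every name here is invented.
        #include <cstdio>

        static int active_cores = 4;                // pretend quad-core IGP

        static void set_active_gpu_cores(int n) {   // stand-in for a driver hook
            active_cores = n;
            std::printf("GPU cores active: %d\n", n);
        }

        static void on_power_source_change(bool on_battery) {
            // Keep one core alive on battery, wake all of them on AC.
            set_active_gpu_cores(on_battery ? 1 : 4);
        }

        int main() {
            on_power_source_change(true);   // unplugged -> single core
            on_power_source_change(false);  // plugged back in -> full power
            return 0;
        }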
     
  9. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    As mentioned by odin243 and Greg, multi-die GPU implementations are inherently inefficient due to problems with control/synchronization and interface bandwidth/latency. GPUs are already inherently parallel via shaders, and it would be much more efficient to have more integrated shaders than to add separate external chips. The limitation is of course process technology, since large dies are expensive, have low yields due to a higher probability of defects, and have power and heat issues. SLI and Crossfire are just stop-gaps for enthusiasts who want more than what current process technology can offer. SLI and Crossfire are also heavily driver dependent, and are useless or even detrimental in games without proper driver support.
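    The driver dependence comes from how the work gets split. A common scheme is alternate-frame rendering; the toy loop below shows the shape of it (render_frame_on_gpu is an invented stand-in, not a real driver call). Every finished frame still has to be handed back and presented in order, which is where the synchronization cost and the per-game driver profiles come in:

        // Toy alternate-frame rendering: even frames to GPU 0, odd to GPU 1.
        #include <cstdio>

        static void render_frame_on_gpu(int gpu, int frame) {  // invented stand-in
            std::printf("frame %d -> GPU %d\n", frame, gpu);
        }

        int main() {
            for (int frame = 0; frame < 6; ++frame)
                render_frame_on_gpu(frame % 2, frame);  // alternate per frame
            return 0;
        }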

    The concept of multi-core CPUs is being applied to GPUs by Intel in their Larrabee project, which is not the same as SLI or Crossfire. The prototypes are built with 80 cores with very strong FP performance, making them applicable to graphics processing. While these cores are x86-based, they aren't the same as current CPUs: each core in Larrabee is more specialized for FP processing, yet simpler and smaller than a CPU core. The concept has potential, although more for rendering-type tasks than the type of graphics that games currently use.
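    In spirit it's the pattern below, just scaled way up: many simple workers each chewing through a slice of floating-point work (a host-side sketch with 8 threads standing in for those 80 cores; not Larrabee code):

        // Many simple FP workers, each taking a contiguous slice of the data.
        #include <cstdio>
        #include <thread>
        #include <vector>

        static void fp_worker(float *data, int begin, int end) {
            for (int i = begin; i < end; ++i)
                data[i] = data[i] * 1.5f + 2.0f;   // stand-in FP workload
        }

        int main() {
            const int n = 1 << 20, cores = 8;      // 8 stand-ins for "80 cores"
            std::vector<float> data(n, 1.0f);
            std::vector<std::thread> pool;
            for (int c = 0; c < cores; ++c)
                pool.emplace_back(fp_worker, data.data(),
                                  c * n / cores, (c + 1) * n / cores);
            for (auto &t : pool) t.join();
            std::printf("data[0] = %f\n", data[0]); // 1.0 * 1.5 + 2.0 = 3.5
            return 0;
        }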
     
  10. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    I read about that project. :) IIRC it wasn't a graphics solution, but rather a test run to demonstrate MULTIPLE cores on a single chip for future development. The processors themselves were very rudimentary, basic. Nothing like what we have now.
     
  11. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    That's the whole point. It's incredibly inefficient to run multiple high-powered processor cores in parallel. It's much more efficient to make the processors very simple, specialized units, like what we have in modern graphics cards.
     
  12. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    True, I can't argue with that. :)
    I still think, IMHO, the day will come. But probably not for a LONG while, not with SLI and Crossfire already on the fast track.
     
  13. Scavar

    Scavar Notebook Evangelist

    Reputations:
    50
    Messages:
    498
    Likes Received:
    0
    Trophy Points:
    30
    Eventually we'll just be powering everything with our brains. That, or robots/aliens/zombies will have wiped us out. Or we will have, ourselves, but I prefer the option that has us powering our computers with our brains.
     
  14. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    I always wondered, looking in device manager, what a Human Interface Device was for. Now I know! :D