The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Hybrid Memory Cube

    Discussion in 'Gaming (Software and Graphics Cards)' started by Althernai, Sep 16, 2011.

  1. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Intel and Micron have presented a prototype of stacked DRAM technology that has 10x the bandwidth of DDR3 and 7x the energy efficiency. This puts it at roughly the same level of performance as high-end GDDR5, but for about the same energy cost as DDR2.
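
    A quick back-of-the-envelope check on what "10x the bandwidth of DDR3" works out to in absolute terms. This is just a sketch; the module and card figures below are my own assumptions for illustration, not numbers from the Intel/Micron announcement:

        # Rough bandwidth comparison; all figures are illustrative assumptions.
        ddr3_1333_per_channel = 1333e6 * 8 / 1e9   # 64-bit channel at 1333 MT/s ~= 10.7 GB/s
        hmc_claim = 10 * ddr3_1333_per_channel     # "10x DDR3" ~= 107 GB/s
        high_end_gddr5 = 256 * 4.0 / 8             # 256-bit bus at 4 Gbps per pin ~= 128 GB/s

        print(f"DDR3-1333, one channel: {ddr3_1333_per_channel:.1f} GB/s")
        print(f"10x DDR3 (HMC claim):   {hmc_claim:.1f} GB/s")
        print(f"High-end GDDR5 card:    {high_end_gddr5:.1f} GB/s")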

    It's an exciting development for a couple of reasons. First, as things stand right now, high-bandwidth GPU memory consumes a huge amount of energy. The bandwidth is obtained by a combination of a wide bus (256 or 384 bits) and high clock speeds. Each of these is power-hungry on its own and the combination is terrible. If that setup can be replaced, you will see laptop GPUs that are much closer to their desktop counterparts. A rough sketch of how those two factors multiply out is below.
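
    To put rough numbers on the bus-width-times-clock trade-off, here is a tiny sketch; the configurations are typical figures from around this time that I'm assuming for illustration:

        # Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte.
        def peak_bw_gbs(bus_width_bits, gbps_per_pin):
            return bus_width_bits * gbps_per_pin / 8

        # Assumed GDDR5 configurations:
        print(peak_bw_gbs(128, 4.0))   # ~64 GB/s  - narrow bus typical of mobile GPUs
        print(peak_bw_gbs(256, 4.0))   # ~128 GB/s - high-end mobile / mid-range desktop
        print(peak_bw_gbs(384, 4.0))   # ~192 GB/s - high-end desktop (GTX 580 class)

    Hitting that last number with a narrower bus would mean driving each pin even faster, which is exactly where the power goes.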

    Second, this deals with the last remaining problem of integrated graphics. Currently, no matter how many cores you put on a CPU-GPU hybrid, it will always be bound by the system RAM (see Llano). But with stacked DRAM, there is nothing stopping iGPUs (or APUs or whatever) from matching discrete cards. A rough comparison of the gap is below.
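
    For a sense of the gap, here's shared system memory versus a dedicated card; the memory configurations are my own assumptions, not measured Llano figures:

        # Shared system memory vs. dedicated graphics memory (assumed configurations).
        shared_ddr3_1600_dual = 2 * 1600e6 * 8 / 1e9   # dual-channel DDR3-1600 ~= 25.6 GB/s, shared with the CPU
        dedicated_gddr5_256bit = 256 * 4.0 / 8         # 256-bit GDDR5 at 4 Gbps per pin ~= 128 GB/s, GPU-only

        print(f"APU with shared DDR3:        {shared_ddr3_1600_dual:.1f} GB/s")
        print(f"Discrete 256-bit GDDR5 card: {dedicated_gddr5_256bit:.1f} GB/s")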

    Of course, this is just a prototype and chances are that the first version of this technology will go to servers and the like, but I'm sure it will eventually migrate down to mobile devices. Maybe in another 5 years people will be able to play high-end PC games without explicitly looking for computers with discrete GPUs.
     
  2. redrazor11

    redrazor11 Formerly waterwizard11

    Reputations:
    771
    Messages:
    1,309
    Likes Received:
    0
    Trophy Points:
    55
    Cool stuff. The power reduction is key, I feel. More and more devices are mobile, but I don't feel like battery technology has made any recent leaps and bounds... batteries are still big and heavy if you want more than 4 hours in a decent machine.

    Especially after reading that article on Windows 8 running "snappier" on the netbook, but cutting battery life in halfsies.

    http://www.notebookreview.com/default.asp?newsID=6267&review=windows+8+netbook+upgrade
     
  3. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Batteries have been stagnating for decades now.
    There have been some refinements, but that's it.
    RAM itself is a very low power drain to begin with... but using this new method to replace the current techniques in GPUs... well, it's bound to have a good effect.

    But, as it stands, it will be a long time before this is adopted by the industry.
    Why not milk the customers for all they're worth before making the switch... and there's a strong possibility that this technique is nothing new... plus, by the time it's put into consumer computers, it will be sorely outdated.
     
  4. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It's not just about power, although that will be a benefit; everything else will be held back by the RAM, so I'm not sure why they wouldn't try to implement it. If they had come up with this years ago, it probably would have already been implemented.