Intel and Micron have presented a prototype of a stacked DRAM technology (the Hybrid Memory Cube) that has 10x the bandwidth of DDR3 and 7x the energy efficiency. This puts it at roughly the same level of performance as high-end GDDR5, but at about the same energy cost as DDR2.
It's an exciting development for a couple of reasons. First, as things stand right now, GPUs with high memory bandwidth consume a huge amount of energy. The bandwidth is obtained through a combination of a wide bus (256 or 384 bits) and high clock speeds. Each of these is power-hungry on its own, and the combination is terrible. If that combination can be replaced, you will see laptop GPUs that are much closer to their desktop counterparts.
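To put some rough numbers on that: peak bandwidth is just bus width times effective data rate, and interface power is roughly bits moved per second times energy per bit. This is a back-of-the-envelope sketch; the configurations and pJ/bit figures below are illustrative assumptions, not published Intel/Micron specs.

Code:
# Back-of-the-envelope memory bandwidth and power sketch.
# All configurations and energy-per-bit figures are illustrative
# assumptions, not published Intel/Micron HMC specs.

def peak_bandwidth_gbs(bus_width_bits, data_rate_mts):
    # GB/s = (bus width in bits / 8) * transfers per second
    return bus_width_bits / 8 * data_rate_mts * 1e6 / 1e9

def interface_power_watts(bandwidth_gbs, pj_per_bit):
    # W = bits moved per second * joules per bit
    return bandwidth_gbs * 1e9 * 8 * pj_per_bit * 1e-12

# Dual-channel DDR3-1333: the kind of shared pool an iGPU lives on.
ddr3 = peak_bandwidth_gbs(bus_width_bits=128, data_rate_mts=1333)
# High-end GDDR5: 384-bit bus at 4000 MT/s (wide bus AND high clock).
gddr5 = peak_bandwidth_gbs(bus_width_bits=384, data_rate_mts=4000)

print("DDR3 dual channel: %6.1f GB/s" % ddr3)   # ~21.3 GB/s
print("GDDR5 384-bit:     %6.1f GB/s" % gddr5)  # ~192.0 GB/s

# Assumed energy costs (pJ/bit) showing what a 7x efficiency gain
# would mean at GDDR5-class bandwidth:
for name, pj in [("GDDR5-class ", 14.0), ("stacked DRAM", 2.0)]:
    watts = interface_power_watts(gddr5, pj)
    print("%s: ~%4.1f W for %.0f GB/s" % (name, watts, gddr5))

The DDR3 line is also why current iGPUs are so starved: a shared 128-bit DDR3 bus delivers roughly a tenth of what a high-end discrete card gets, and at an assumed 14 pJ/bit, GDDR5-class bandwidth costs over 20 W for the memory interface alone.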
Second, this deals with the last big problem of integrated graphics. Currently, no matter how many cores you put on the CPU-GPU hybrid, it will always be bandwidth-bound by the system RAM (see Llano). But with the stacked DRAM, there is nothing stopping iGPUs (or APUs or whatever) from matching discrete cards.
Of course, this is just a prototype, and chances are the first versions of this technology will go to servers and the like, but I'm sure it will eventually migrate down to mobile devices. Maybe in another 5 years, people will be able to play high-end PC games without explicitly looking for computers with discrete GPUs.
-
redrazor11 Formerly waterwizard11
Cool stuff. The power reduction is key, I feel. More and more devices are mobile, but I don't feel like battery technology has made any recent leaps and bounds... they're still big and heavy if you want more than 4 hours in a decent machine.
Especially after reading that article on Windows 8 running "snappier" on the netbook, but cutting battery life in halfsies.
http://www.notebookreview.com/default.asp?newsID=6267&review=windows+8+netbook+upgrade -
Batteries have been stagnating for decades now. There have been some refinements, but that's it.
RAM by itself is a very low power drain to begin with... but using this new method to replace the current memory technology in GPUs is bound to have a good effect.
Still, as it stands, it will be a long time before this is adopted by the industry.
Why not milk the customers for all they are worth before you make the switch... and there's a strong possibility that this technique is nothing new... plus by the time it's put into consumer computers, it will be sorely outdated. -
It's not about power, although that will be a benefit; it's that everything else will be held back by the RAM, so I'm not sure why they wouldn't try to implement it. If they had come up with this years ago, it probably would have already been implemented.
Hybrid Memory Cube
Discussion in 'Gaming (Software and Graphics Cards)' started by Althernai, Sep 16, 2011.