I've been wondering about this for a while. Instead of graphics switching, which is patchy at best, why not just disable cores, the way the new Intel CPUs can?
Think about it: graphics cards have tens to hundreds of cores, and who says they all have to be active at any one time? The problem with simply reducing the clock speed, and therefore the voltage, is that you can't go below about 0.6 V; silicon simply doesn't work below that.
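To put rough numbers on that argument: dynamic power scales roughly with active capacitance times voltage squared times frequency, so once you hit the voltage floor the only big lever left is how much silicon is switching. Here's a toy model in Python; the core counts, clocks, and voltages are invented for illustration, not real GPU figures.

```python
# Toy dynamic-power model: P ~ C * V^2 * f (relative units, made-up numbers)
def dynamic_power(cores, voltage, freq_ghz, c_per_core=1.0):
    """Relative dynamic power with `cores` active cores at a given V and f."""
    return cores * c_per_core * voltage**2 * freq_ghz

full = dynamic_power(cores=384, voltage=1.0, freq_ghz=1.5)

# DVFS alone: clocks and voltage drop to the ~0.6 V floor,
# but every core is still powered and leaking.
dvfs_only = dynamic_power(cores=384, voltage=0.6, freq_ghz=0.3)

# Core gating on top of DVFS: same floor, but only a handful of cores stay on.
gated = dynamic_power(cores=16, voltage=0.6, freq_ghz=0.3)

print(f"DVFS only:   {dvfs_only / full:.1%} of full power")
print(f"Core gating: {gated / full:.1%} of full power")
```

The exact percentages depend entirely on the made-up constants, but the shape of the result is the point: with the voltage floor fixed, cutting the active core count buys savings that clock reduction alone can't reach.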
What I would propose is an extension of the current 3D/2D clock scheme. The 3D profile would still use all the cores and clock up to maximum speeds, but the 2D profile would disable almost all of them, to the point that the card is effectively running at the same power level as an integrated GPU. Even on their low power profiles, current high end GPUs run circles round an integrated one and therefore use more energy. This could even be extended to desktop graphics, potentially allowing even the highest power cards to run passively when not being taxed.
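The scheme described above is basically the existing profile table with one extra column. A minimal sketch of what that might look like, assuming a hypothetical 384-core card and an invented load threshold for switching (none of these names or numbers come from a real driver):

```python
# Hypothetical extension of the 2D/3D profile table with an active-core count.
# All values are illustrative assumptions, not real driver parameters.
PROFILES = {
    "3d": {"active_cores": 384, "clock_mhz": 1500, "voltage": 1.00},
    "2d": {"active_cores": 16,  "clock_mhz": 300,  "voltage": 0.60},
}

def select_profile(gpu_load_percent):
    """Pick a profile from GPU load, the way 2D/3D switching picks clocks today."""
    return PROFILES["3d"] if gpu_load_percent > 20 else PROFILES["2d"]

print(select_profile(5))   # light desktop work -> gated 2D profile
print(select_profile(90))  # gaming -> full 3D profile
```

The appeal is that the switching decision stays inside one card's driver, driven by load, rather than requiring the OS to hand work between two different GPUs.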
Having the ability to disable cores would also get rid of the compatibility problems with current graphics switching, such as games not being recognised and the two GPUs having different frame buffers.
The question is, given that it could be so simple, why hasn't it been done already?
-
I was under the impression that (some of) the cores switch off when not in use to save energy.
Suggestion for ATI and Nvidia
Discussion in 'Hardware Components and Aftermarket Upgrades' started by funky monk, Aug 18, 2011.