We know 2013 will bring Haswell CPUs, but what about GPUs? Will Nvidia and AMD bring new architectures, or shrink/rebrand existing ones?
Thank you. Sorry if this has already been covered somewhere else.
-
AMD, afaik, has kept mum.
Both companies have rebranded existing GPUs into the next generation for quite a while now (Nvidia more than AMD, I think), so I would expect a rebrand similar to the 500-to-600 series transition, considering how well the 680M has performed. Their die shrinks also seem to be on a roughly two-year cadence, judging by the gap between the last two.
Most of this is speculation, because that is all a two-year prediction can be in the semiconductor industry.
-
Karamazovmm:
People at AnandTech think we are transitioning to a two-year refresh cycle for GPUs.
I'm not expecting anything groundbreaking. I do hope they give us something better in the mid range, at least some new cores.
-
I would have preferred to see synthetic diamond and graphene used in CPUs and GPUs.
Of course, that would require getting rid of silicon altogether in order to see a 40-80x (maybe 100x) increase in efficiency/speed/capability at roughly 10x lower power consumption than current computers. Neither Intel, Nvidia, nor AMD is crazy enough to go for that, because they are commercial firms with profits in mind, and switching over to those two materials would be 'costly' (in terms of money, of course, not actual resources).
Then again, they might first use both in minute quantities, in hybrid form, to slightly augment existing silicon chips rather than adopting them outright (synthetic diamond has been in electronics since 1997 and graphene since 2005, at least in somewhat hybrid form).
Perhaps by 2020 we will start seeing those two materials in small quantities in electronics, boosting capabilities, but they won't be crazy enough to switch over completely, because they can still milk silicon until the market dries up (even though, as a material, it is well on its way to obsolescence).
-
Karamazovmm:
^That. The reason graphene isn't used is the same reason Intel didn't use 22 nm back in 2005: nobody knew how to implement it yet. R&D takes time. Those companies do have motivation to use graphene, diamond, or whatever new material comes along, because it would give them an edge over the competition. If they didn't care about speed or efficiency, we wouldn't see smaller process nodes or updated architectures either.