I fail to understand Nvidia's and Apple's decision not to scrap their entire laptop GPU lineups and replace them with powerful mobile chips such as the Tegra 3 and the A5 (the one in the iPad 3).
If those chips can "really" deliver console-quality gaming (meaning that, in theory, they should be able to run games such as Crysis 2 and Skyrim), then what is the point of having regular GPUs (e.g. the GTX 5xxM series) that constantly overheat and require a good copper heatsink and fan?
Wouldn't it be great to have a Tegra or A5 GPU in your laptop that produces near-zero heat and performs on par with a console? If the technology already exists, what's behind the hold-up?
TheBluePill Notebook Nobel Laureate
Completely different animals.
The Tegra 3 and the A5 are potent only for their target devices: tablets. They are not capable of running truly graphically intensive 3D applications.
Even a low-end GPU from AMD or NVIDIA has significantly more horsepower than the Tegra or the A5. The Tegra 3 has 12 shaders, compared to 48 in even the modest GT 520M.
Console-quality gaming on those chips is definitely NOT the case, at least not yet.
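To put that shader gap in rough numbers, here's a quick back-of-the-envelope sketch in Python. The shader counts are the ones quoted above; the clock speeds and the "2 ops per shader per clock" factor are my own illustrative assumptions, not official specs, so treat the output as an order-of-magnitude picture rather than a benchmark.

# Rough peak-throughput comparison. Shader counts are from the post above;
# the clock speeds and ops-per-clock figure are assumptions for illustration only.

def peak_gflops(shaders, clock_mhz, ops_per_clock=2):
    """Theoretical peak GFLOPS = shaders * clock (MHz) * ops per clock / 1000."""
    return shaders * clock_mhz * ops_per_clock / 1000.0

tegra3_ulp_geforce = peak_gflops(shaders=12, clock_mhz=520)   # roughly 12 GFLOPS
geforce_gt_520m    = peak_gflops(shaders=48, clock_mhz=1480)  # roughly 142 GFLOPS

print(f"Tegra 3 ULP GeForce : ~{tegra3_ulp_geforce:.0f} GFLOPS")
print(f"GeForce GT 520M     : ~{geforce_gt_520m:.0f} GFLOPS")

Even with generous assumptions for the Tegra, the low-end laptop part comes out roughly an order of magnitude ahead in raw throughput, which is why the two aren't interchangeable.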
moviemarketing Milk Drinker
I don't think the performance of ancient console hardware is anything laptop manufacturers should be shooting for.
I game on a laptop with a metal case that gets extremely hot, but I'm quite happy to put up with it in order to have a system that can run games at 1080p. I use a mouse and keypad, so I never touch the palmrest when gaming.
The future of laptop gaming?
Discussion in 'Gaming (Software and Graphics Cards)' started by strider3871, Mar 16, 2012.