I've been wondering about this for a while: why is it that CPUs pretty much work perfectly in a way that is completely transparent to the user while GPUs require constant driver updates to stay relevant and fail at least an order of magnitude more often? I have never had to upgrade the firmware for my CPU in any incarnation of Windows or Linux I have used over the years. Likewise, I cannot recall a single story I've read of CPUs dying under normal conditions -- there have to be some that break down, but they must be pretty rare.
Is it just that CPUs are necessary for serious applications where failure is not an option (and if it happens, companies with pocketbooks large enough to sue anyone will demand massive amounts of compensation) while GPUs are more or less restricted to gaming and individual users who will at most demand a replacement?
-
RainMotorsports Formerly ClutchX2
I have never known a GPU that had a working driver to need an update. This is my most current card and my first dedicated GPU in a laptop. All the games I play work fine under the 179.41 driver that came on it stock, as well as DOX 182.05 and DOX 185.85.
GPUs run at a higher constant load in gaming, something a CPU is not usually subjected to for long periods unless it's compressing or encrypting something, or running things it's just not fast enough for (you know, older CPUs running new stuff). GPUs run hotter, and combined with this constant thrashing and cycling from warm to extreme and back (expanding and contracting more and more rapidly), the chip undergoes more stress.
-
A CPU constantly runs Windows, and there isn't a new Windows every week like there are new games, so drivers keep GPUs updated to work better with new games that run different engines. Drivers can also affect how a CPU performs in a game, for example when the GPU takes over physics and changes the load on the CPU.
-
The really scary thing is the consoles having their share of the troubles with heating issues and patches. To be a PC gamer this day and age you really have to know what you're doing, especially if you're unlucky enough to have Vista. You would think the manufacturers would be afraid of losing business to the steepening learning curve.
-
-
RainMotorsports Formerly ClutchX2
Of course, the 179.xx drivers are only 6 months old now, so I have less experience on the Nvidia end of things. If your GPU is 2 years old, it's probably on the minimum-spec list by now anyways, eh? The X1300-X1950 are on the minimum lists these days, and two years ago they were common, affordable, but a generation behind.
I would think the latest driver matching the latest DirectX version a game uses would work fine. Games that stick to DX9c compatibility probably need driver updates less than the latest DX10/10.1 games do. But we'll see, since DX11 should arrive within my PC's lifetime.
@Heat, if you work in scientific computing you would work with vector processors at some point in your life. How does their heat compare to GPUs? I would think it's similar; I know that in the late '80s, 300 MHz vector units were using refrigerant cooling.
-
-
mobius1aic Notebook Deity NBR Reviewer
Well, the whole CPU thing began with IBM contracting x86 CPU production out to AMD so there wouldn't be a monopolistic supply of x86 CPUs coming from Intel alone. From there on, there have been two main hitters in the x86 CPU market, with a third participant in the form of VIA. They make x86 CPUs that are readily usable by the insane number of OSs written for them, so drivers aren't an issue: the basic recognition and usability of the architecture is already similar and common enough not to require ultra-specialized software drivers just to get it working. The basic x86 support in Windows or any other OS is enough for full functionality. GPUs, however, are not bound by any such practice; they are inherently different from one another, and the main usability link between them is either the OpenGL or DirectX API. So while CPUs are built to a standard that is part hardware (x86) and part software (the OS), GPUs are built almost completely to a software standard (the graphics API). The GPU's own driver must work with DirectX to have the functionality it needs in Windows, and with OpenGL for Linux and Mac OS.
Back in the early days of 3D acceleration it was even worse: there were many more 3D GPU manufacturers, as well as more graphics APIs. Some games required a specific API, only certain cards could run that API, and many games had custom renderers created by the game developers themselves. It was a mess.
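The hardware-standard vs. software-standard distinction above can be sketched in a few lines of Python (this is just an illustration, not anything from the thread): an OS can identify and use a CPU out of the box because the ISA is a common hardware contract, whereas the only way to query a GPU is through vendor driver tooling that may or may not be installed (`nvidia-smi` is used here as one real-world example of such a tool).

```python
import platform
import shutil

# CPUs expose a stable hardware interface (the ISA), so any OS can
# identify and run on one with no vendor driver at all:
cpu_arch = platform.machine()  # e.g. 'x86_64' -- works everywhere
print("CPU architecture:", cpu_arch)

# GPUs have no comparable common hardware standard; the portable way to
# talk to one is through a driver-backed API (DirectX/OpenGL), and even
# querying the card requires tooling shipped *with* the driver.
# shutil.which() returns the tool's path, or None if no driver stack is
# installed -- which is exactly the asymmetry the post describes:
gpu_tool = shutil.which("nvidia-smi")
print("GPU query tool available:", gpu_tool is not None)
```

Running this on a machine without a vendor GPU driver still prints the CPU architecture fine, while the GPU query comes back empty, which is the asymmetry the post describes.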
Why are GPUs so troublesome?
Discussion in 'Gaming (Software and Graphics Cards)' started by Althernai, Jul 3, 2009.