The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Why are GPUs so troublesome?

    Discussion in 'Gaming (Software and Graphics Cards)' started by Althernai, Jul 3, 2009.

  1. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    I've been wondering about this for a while: why is it that CPUs pretty much work perfectly in a way that is completely transparent to the user while GPUs require constant driver updates to stay relevant and fail at least an order of magnitude more often? I have never had to upgrade the firmware for my CPU in any incarnation of Windows or Linux I have used over the years. Likewise, I cannot recall a single story I've read of CPUs dying under normal conditions -- there have to be some that break down, but they must be pretty rare.

    Is it just that CPUs are necessary for serious applications where failure is not an option (and if it happens, companies with pocketbooks large enough to sue anyone will demand massive amounts of compensation) while GPUs are more or less restricted to gaming and individual users who will at most demand a replacement?
     
  2. RainMotorsports

    RainMotorsports Formerly ClutchX2

    Reputations:
    565
    Messages:
    2,382
    Likes Received:
    2
    Trophy Points:
    56
    I have never known a GPU that had a working driver to need an update. This is my most current card and my first dedicated for a laptop. All the games I play work fine under 179.41 that came on it stock, as well as DOX 182.05 and DOX 185.85.

    GPUs run at a higher constant load in gaming, something a CPU is not usually subjected to for long periods of time unless it's compressing or encrypting something, or running things it's just not fast enough for (you know, older CPUs running new stuff). GPUs run hotter, and combined with this constant thrashing and going from warm to extreme and back (expanding and contracting more and more rapidly), they undergo more stress.
     
  3. HaloGod2007

    HaloGod2007 Notebook Virtuoso

    Reputations:
    461
    Messages:
    2,465
    Likes Received:
    0
    Trophy Points:
    0
    A CPU constantly runs Windows, and there isn't a new Windows every week like there are games... so drivers keep GPUs updated to work better on new games that run different engines. Drivers can also affect how a CPU works in a game, like if a GPU takes over physics and affects the load on the CPU.
     
  4. bigepilot

    bigepilot Notebook Evangelist

    Reputations:
    1
    Messages:
    301
    Likes Received:
    0
    Trophy Points:
    30
    The really scary thing is the consoles having their share of troubles with heating issues and patches. To be a PC gamer this day and age you really have to know what you're doing, especially if you're unlucky enough to have Vista. You would think the manufacturers would be afraid of losing business to the steepening learning curve.
     
  5. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    Try not updating a driver for more than 2 years. This will almost guarantee that many games will not work properly, and if you don't want to wait that long, go to any game's technical support pages: the very first thing anyone asking for help will be told is "Are your video card drivers up to date?" In fact, many games don't even officially support laptops because they can't guarantee that OEM drivers will work properly.

    It depends on what CPUs are used for. The ones I use for work (scientific computing) tend to run at 100% for days or even weeks at a time. They're usually Xeons or Opterons, but I've done it with my laptop a couple of times and it has never shown any glitches. If you want a much larger sample size of consumer-grade CPUs under heavy workload, look at all of the people who run Folding@Home and similar programs -- the CPUs are rock solid.
    You are right, but these are symptoms rather than the cause. Why are they designed to run hotter? This is another interesting question -- why are GPUs engineered as if energy consumption doesn't matter? The high-end ones consume more power than entire PCs.
    Other way around, I think. CPUs run various flavors of Windows, Mac OS and Linux without any issues. Furthermore, there are new programs every day and CPUs run all of them (including games) with very few issues. The GPUs don't matter for the overwhelming majority of programs out there -- all they really have to do is run games, but they can't seem to get that done.
     
  6. RainMotorsports

    RainMotorsports Formerly ClutchX2

    Reputations:
    565
    Messages:
    2,382
    Likes Received:
    2
    Trophy Points:
    56
    The ATi X700 Pro, which was a mid-range card in its day and great for anything before Crysis came out (it ran FEAR at 1280x1024 great), runs best on a later version of the driver series it shipped with. I'm not sure if it's the same with the 2xxx, 3xxx and replacement 4xxx series, but ATi cards usually run better on the latest driver of the series they shipped with. I saw the latest driver murder the performance of the X850 Pro, which was the best card of its day and only recently became worthless as of DX10. ATi is known for bad drivers, though.

    Of course, 179.xx drivers are only 6 months old now, so I have less experience on the Nvidia end of things. If your GPU is 2 years old, it's probably on the minimum requirements list by now anyway, eh? The X1300-X1950 are on the minimum list these days, and they were common, affordable, but a generation behind cards 2 years ago.

    I would think the latest driver that matches the latest DirectX being used by the game would work fine. A DX9c game probably needs driver updates less than the latest DX10/10.1 game does, as long as the drivers don't affect DX9c compatibility. But we will see, as DX11 should arrive within my PC's lifetime.

    Regarding heat: if you work in scientific computing, you have probably worked with vector processors at some point. How does their heat compare to GPUs? I would think it's similar; I know that in the late 80's, 300 MHz vector units were using refrigerant cooling.
     
  7. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    Please troll another forum. Most of us here know better.
     
  8. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Well, the whole CPU thing began with IBM contracting x86 CPU production out to AMD so there wouldn't just be a monopolistic supply of x86 CPUs coming from Intel. Since then there have been 2 main hitters in the x86 CPU market, with a third participant in the form of VIA. They make x86 CPUs that are readily usable by the insane number of OSs made to use them, so it's not an issue: the basic recognition and usability of the architecture is already similar and common enough not to require ultra-specialized software drivers just to get it working. The basic x86 support in Windows or any other OS is enough for basic functionality.

    GPUs, however, are not bound by such a practice. They are inherently different, and the main usability link between them is either the OpenGL or DirectX API. So while CPUs are built to a basic partial-hardware (x86) / partial-software (the OS) standard, GPUs are built almost completely to a software standard (the graphics API). The GPU's own driver software must be able to work with DirectX in order to have the functionality it needs in Windows, and with OpenGL for Linux and Mac OS.

    Back in the early days of 3D acceleration, it was even worse: there were many more 3D GPU manufacturers, as well as more graphics APIs. Some games required a specific API, only certain cards could run that API, and many games had specific renderers created by the game developers themselves. It was a mess.
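    The ISA-versus-API distinction above can be sketched as a toy model (all class names, API calls, and command strings here are made up for illustration, not any real driver interface): CPU code compiles against one stable instruction set and runs on any vendor's chip, while a GPU only exposes its functionality through an API whose every call the vendor's driver must translate into device-specific commands, which is exactly where "update your drivers" comes from.

    ```python
    # Toy model of the point above. CPU instructions target one standard ISA,
    # so the same program runs on any vendor's chip with no driver in between.
    def cpu_program(a, b):
        return a + b  # maps directly to a standard ADD instruction


    # A graphics API only defines *what* can be asked for; the driver
    # supplies the *how*. These names are hypothetical, not a real API.
    API_CALLS = {"clear_screen", "draw_triangle"}


    class VendorDriver:
        """Hypothetical driver: translates API calls into device commands."""

        def __init__(self, translation_table):
            self.table = translation_table  # API call -> device command

        def execute(self, api_call):
            if api_call not in API_CALLS:
                raise ValueError(f"unknown API call: {api_call}")
            try:
                return self.table[api_call]
            except KeyError:
                # The game's API usage is valid, but this driver has no
                # translation for it yet -- i.e. "update your drivers".
                raise NotImplementedError(f"driver update needed for {api_call}")


    old_driver = VendorDriver({"clear_screen": "CMD_0x01"})
    new_driver = VendorDriver({"clear_screen": "CMD_0x01",
                               "draw_triangle": "CMD_0x07"})

    print(cpu_program(2, 3))                    # runs anywhere: 5
    print(new_driver.execute("draw_triangle"))  # translated fine
    try:
        old_driver.execute("draw_triangle")     # valid call, stale driver
    except NotImplementedError as err:
        print(err)
    ```

    The design choice the thread is circling: the CPU's contract (the ISA) is fixed in hardware, so it never needs this translation layer, while the GPU's contract lives in software and has to be re-matched against every new game and API revision.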