The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    16/32 bit colors question

    Discussion in 'Gaming (Software and Graphics Cards)' started by Yutong, Aug 4, 2007.

  1. Yutong

    Yutong Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I was playing CSS and I noticed that at 16-bit color, max settings, I get 120-140 fps, but at 32-bit color, max settings, I only get 40-60 fps.

    Why the huge drop in performance? Is this normal?

    I have a 7900GS.
     
  2. techguy2k7

    techguy2k7 Notebook Evangelist

    Reputations:
    93
    Messages:
    442
    Likes Received:
    0
    Trophy Points:
    30
    The drop in performance has nothing to do with the change in color depth. All modern GPUs natively render at greater than 32bpp internal precision. You must have enabled a frame lock (such as vsync or a simple 60 Hz frame cap) when you switched to 32-bit rendering.
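
    To see what a frame cap does to measured FPS, here is a minimal sketch in C using SDL2 and OpenGL (purely illustrative, not what CSS does internally; the window size and the 2-second timing window are arbitrary). With a swap interval of 1 the driver waits for the display refresh before presenting each frame, so the loop is capped near the panel's refresh rate; with 0 it runs uncapped:

    #include <SDL.h>
    #include <stdio.h>

    /* Present empty frames for ~2 seconds and report the achieved FPS.
       swap_interval: 1 = wait for vertical refresh (vsync), 0 = uncapped. */
    static double measure_fps(SDL_Window *win, int swap_interval)
    {
        SDL_GL_SetSwapInterval(swap_interval);
        Uint32 start = SDL_GetTicks();
        int frames = 0;
        while (SDL_GetTicks() - start < 2000) {
            /* a real game would draw its scene here */
            SDL_GL_SwapWindow(win);
            frames++;
        }
        return frames / ((SDL_GetTicks() - start) / 1000.0);
    }

    int main(void)
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("frame cap test",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            640, 480, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        printf("uncapped: %.0f fps\n", measure_fps(win, 0));
        printf("vsync on: %.0f fps\n", measure_fps(win, 1));

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }

    On a 60 Hz laptop panel the second number will sit right around 60 no matter how light the rendering load is, which is exactly the kind of cap that would explain a 40-60 fps reading.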
     
  3. Yutong

    Yutong Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    thanks for the answer
     
  4. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    They might do so natively, but if a user manually sets a game's option to 16-bit, then it will be rendered at 16-bit.

    As to why you might get better performance, it may be because there are fewer colors to render.
     
  5. techguy2k7

    techguy2k7 Notebook Evangelist

    Reputations:
    93
    Messages:
    442
    Likes Received:
    0
    Trophy Points:
    30
    Regardless of what bit-depth the application calls for, the GPU is going to render it at 32bpp (or greater) internally, and then dither down to 16bpp when requested. There are no fewer calculations being performed just because the app wants lower precision. This has been true of all NV & ATI GPUs released since the very first DX9-compliant cards came out in 2003.

    It hasn't been a performance issue since the days of the original Geforce in 1999.
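
    As a rough illustration of the "render at full precision, shrink only the final store" point, here is a small C sketch with made-up pixel values (real hardware does this per pixel in its output stage, usually with dithering as mentioned above). The shading multiply is identical for both output depths; only the last packing step differs:

    #include <stdint.h>
    #include <stdio.h>

    /* Pack an 8-bit-per-channel color into 16-bit RGB565
       (5 bits red, 6 bits green, 5 bits blue). */
    static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    int main(void)
    {
        /* "shade" one pixel: the same math runs regardless of output depth */
        uint8_t r = 200, g = 120, b = 40;
        float light = 0.75f;
        uint8_t lr = (uint8_t)(r * light);
        uint8_t lg = (uint8_t)(g * light);
        uint8_t lb = (uint8_t)(b * light);

        /* only the final store differs: full 32-bit XRGB vs packed 16-bit */
        uint32_t out32 = ((uint32_t)lr << 16) | ((uint32_t)lg << 8) | lb;
        uint16_t out16 = pack_rgb565(lr, lg, lb);

        printf("32-bit framebuffer value: 0x%06X\n", (unsigned)out32);
        printf("16-bit framebuffer value: 0x%04X\n", (unsigned)out16);
        return 0;
    }

    The arithmetic cost is in the shading step, which is the same in both cases; the 16-bit path just throws away low bits at the very end.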
     
  6. knightingmagic

    knightingmagic Notebook Deity

    Reputations:
    144
    Messages:
    1,194
    Likes Received:
    0
    Trophy Points:
    55
    Actually, some older games run better in 16-bit, but it just looks awful. (You should really be able to run old games in 32-bit).
     
  7. imhungry29

    imhungry29 Notebook Evangelist

    Reputations:
    19
    Messages:
    473
    Likes Received:
    0
    Trophy Points:
    30
    techguy2k7 wasn't saying that it won't display at 16-bit color. He's saying that reducing the color depth will have no performance impact, period. Your graphics card is still going to render at 32-bit color, but it will display at 16-bit color. No performance gains at all.