The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Bioshock and gpu & OS question

    Discussion in 'Gaming (Software and Graphics Cards)' started by sasanac, Oct 8, 2007.

  1. sasanac

    sasanac Notebook Evangelist

    Reputations:
    144
    Messages:
    456
    Likes Received:
    0
    Trophy Points:
    30
    OK, so this is about graphics cards and OSes on desktop machines, but it is a gaming question!! So my apologies in advance if this is in the wrong place.

    I've been playing Bioshock on my Athlon 64 X2 4200+ with an ATI X1800 CrossFire Edition graphics card with 256MB GDDR3 (at least I think it's GDDR3). It works a treat at 1680x1050 (on a TFT) with all effects on bar the DX10 effects (obviously, due to the age of the card), and I'm running Vista HP x64.

    I've also played it on my fiancée's father's new beast of a gaming machine, which has a high-end Core 2 Duo and the 768MB GDDR3 GeForce 8800 GTX, which is a DirectX 10 card, but he runs XP Pro. His resolution is 1600x1200 on a Trinitron CRT.

    My question is: on my fiancée's father's machine I'm seeing more effects than on mine, even though he's running XP, so he would presumably be on the same DirectX level as me. Why would this be? Am I missing something obvious here?! I can't find any more settings to change in Bioshock's in-game graphics setup menu, and to get things smoother on mine I forced antialiasing to 4x plus some other tweaks via the ATI control panel.

    The effects I seem to be missing are better transparency in steam/mist and slightly better water effects. To be honest there's not much in it (which makes me hugely happy lol), but I'm just curious as to what might make that extra bit of difference... like I said, I have a feeling I'm missing the obvious here, but it is 2318 and I've been up since 0600 and had a loooong day at work and my brain is just about giving up on me!
     
  2. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Well, the higher resolution could be causing it to look better, and it's possible that the effects simply look better on a high-end NVIDIA GPU than on a last-generation ATI GPU.
     
  3. wogstaa

    wogstaa Notebook Evangelist

    Reputations:
    2
    Messages:
    326
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, that sounds about right.
     
  4. sasanac

    sasanac Notebook Evangelist

    Reputations:
    144
    Messages:
    456
    Likes Received:
    0
    Trophy Points:
    30
    Hmm, makes sense I guess... I've got an identical monitor that I don't use, so I might give it a try on that to see what happens (just to see how much is down to resolution).

    Still, no matter what, I'm pleased there's life in my X1800 graphics card yet!

    Thanks for the replies!
     
  5. d4mi3n

    d4mi3n Notebook Geek

    Reputations:
    0
    Messages:
    86
    Likes Received:
    0
    Trophy Points:
    15
    Maybe it's using SM 4.0? It is a DX10 card... will DX9.0c use SM 4.0?
     
  6. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    No, in XP (unless he's done some significant mods) only DX9.0c is available, which supports only up to SM3.0.
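    For anyone curious, you can check what the DX9 runtime actually exposes by querying the device caps. A minimal sketch in C++ (an illustration, not from this thread; it assumes the DirectX 9 SDK headers are installed and you link d3d9.lib) that prints the pixel shader model the runtime reports:

        // Query the pixel shader model exposed by the Direct3D 9 runtime.
        // Build against the DirectX 9 SDK and link d3d9.lib.
        #include <d3d9.h>
        #include <cstdio>

        int main() {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) {
                std::printf("Direct3D 9 not available\n");
                return 1;
            }
            D3DCAPS9 caps;
            if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                             D3DDEVTYPE_HAL, &caps))) {
                // PixelShaderVersion packs major/minor into its low two bytes.
                std::printf("Pixel shader model: %lu.%lu\n",
                            (caps.PixelShaderVersion >> 8) & 0xFF,
                            caps.PixelShaderVersion & 0xFF);
            }
            d3d->Release();
            return 0;
        }

    Even on an 8800 GTX under XP this should top out at 3.0, because the D3D9 API has no way to express SM4.0 features.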
     
  7. HavoK

    HavoK Registered User

    Reputations:
    706
    Messages:
    1,719
    Likes Received:
    0
    Trophy Points:
    55
    The 8800 GTX is much more powerful than the X1800XT. I have an X1800XT and I definitely can't run at 1680x1050 without major stuttering in heavy areas. 1280x1024 with max settings is acceptable, but still not perfect at times. Bioshock is probably optimized more for NVIDIA cards.
     
  8. sasanac

    sasanac Notebook Evangelist

    Reputations:
    144
    Messages:
    456
    Likes Received:
    0
    Trophy Points:
    30
    Aye, that's why I was very happy that my Sapphire Radeon X1800 XL 256MB PCI-E CrossFire-ready card showed very little difference... it's odd that yours stutters, I thought the XTs were better than the XLs?

    I started playing Bioshock at 1280x800 as I presumed it wouldn't be up to the job... but it breezed through it, so I gradually increased the settings right up to 1680x1050 with full effects. I'm guessing here, but maybe the Athlon 64 X2 4200+ processor helps it run smoothly? I've not experienced any stuttering or slowdown anywhere, even when the action gets a bit hectic.

    Bioshock, I'm sure, is optimised for NVIDIA cards (it's got that "NVIDIA: The Way It's Meant to Be Played" logo thing at the beginning).

    I just thought it weird that a DX10 card running as DX9 shows more effects than a DX9 card... I thought DX9 would be the limiting factor? Unless newer DX10 cards get more out of DX9?
     
  9. icer412

    icer412 Notebook Consultant

    Reputations:
    19
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    30
    Maybe he has two of them if he's using Crossfire. I am just assuming, though. But it seems that he might, if he's running at that high a resolution and level of detail without all the stuttering. I have a feeling the NVIDIA card might be optimized for Bioshock. Also, I'm sure the fact that it's a newer card would give it some extras that older cards might not be able to use, regardless of DX10 (neither of the scenarios should be running in DX10 anyway).
     
  10. sasanac

    sasanac Notebook Evangelist

    Reputations:
    144
    Messages:
    456
    Likes Received:
    0
    Trophy Points:
    30
    Ah, sorry, I wasn't clear on that one... I'm just using one graphics card (I never got round to getting a second one), and it's the older version of the card too.
     
  11. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    An X1800XL should perform about the same as an 8600GTS, if I'm not mistaken, so you're probably getting ~20fps at 1680x1050?
     
  12. sasanac

    sasanac Notebook Evangelist

    Reputations:
    144
    Messages:
    456
    Likes Received:
    0
    Trophy Points:
    30
    Just tried testing it with Fraps (only the free version, though, so it only does 60 seconds) and got an average of 27fps in a fairly busy area, so not too bad really (I shot up a Bouncer and legged it past a few grenade-throwing guys, so it kicked things off well!).

    *edit... just tried it at 1360x768 (the next widescreen setting down) out of interest, with high detail etc., and got the following:

    Frames, Time (ms), Min, Max, Avg
    3315, 56940, 22, 78, 58.219

    Might leave it at that setting... or just drop it back down if the action later in the game gets too much for the GPU at the higher res.
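    (For reference, the Avg column in that Fraps output is just total frames divided by elapsed time: 3315 frames / 56.940 s ≈ 58.2fps, which is where the 58.219 figure comes from.)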
     
  13. d4mi3n

    d4mi3n Notebook Geek

    Reputations:
    0
    Messages:
    86
    Likes Received:
    0
    Trophy Points:
    15
    There are mods to get DX10 on XP though, right?
     
  14. MozzUK

    MozzUK Notebook Guru

    Reputations:
    8
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    Nope, sorry :( , I don't think so. DX10 is a new API that makes use of unified shaders on DX10 hardware to achieve its effects, things that DX9 and XP don't support. I think some DX10 effects can be achieved in DX9, but they're just too costly in terms of performance to implement.
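    You can see the OS dependency directly: creating a DX10 device goes through d3d10.dll, which only ships with Vista. A minimal sketch in C++ (an illustration, not from this thread; it assumes the DirectX SDK headers and d3d10.lib):

        // Try to create a Direct3D 10 device. On XP this fails outright
        // (d3d10.dll doesn't exist there), no matter how capable the GPU is.
        #include <d3d10.h>
        #include <cstdio>

        int main() {
            ID3D10Device* device = nullptr;
            HRESULT hr = D3D10CreateDevice(nullptr,   // default adapter
                                           D3D10_DRIVER_TYPE_HARDWARE,
                                           nullptr,   // no software rasterizer
                                           0,         // no creation flags
                                           D3D10_SDK_VERSION,
                                           &device);
            std::printf("%s\n", SUCCEEDED(hr) ? "DX10 device created"
                                              : "No DX10 runtime or hardware");
            if (device) device->Release();
            return 0;
        }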
     
  15. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Yes, there are; however, they're not completely working as of now.