
    Dell XPS M1730 Owner's Lounge, Take 2

    Discussion in 'Dell XPS and Studio XPS' started by J-Bytes, Sep 27, 2007.

  1. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
    Eambo's right.

    In real games, there is very little difference between the 8800M GTX (9800M GT) and the 9800M GTX. Most of the difference only shows up in benchmarks.

    The clocks of the 9800M GT and 9800M GTX are identical at 500/1250/800; the main difference is the 2x16 extra shaders.

    Otherwise, it seems each 9800M GTX runs about 10 W hotter than an ordinary GT. In an SLI config, you're looking at 20 W of extra heat to deal with under full load.



    http://www.lancelhoff.com/9800m-gt-vs-9800m-gtx-benchmark/

    9800M GTX SLI cards for the M1730 are in pretty limited supply, so they carry a very hefty price.

    Put simply, it's not really worth upgrading unless you have a serious amount of cash to burn for a rather marginal improvement.
     
  2. 72hundred

    72hundred Revolutions-Per-Millennia

    Reputations:
    392
    Messages:
    1,228
    Likes Received:
    0
    Trophy Points:
    55
    You'd be better off having a cash-burning BBQ! I really won't have any interest in upgrading until >256-bit is available. TBH I'm disappointed the M17x doesn't have it; it seems like the next step.
     
  3. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    AFAIK, there are no mobile cards with more than a 256-bit bus. Sadly.
     
  4. BatBoy

    BatBoy Notebook Nobel Laureate

    Reputations:
    7,395
    Messages:
    7,964
    Likes Received:
    25
    Trophy Points:
    206
  5. Slammin

    Slammin Notebook Consultant

    Reputations:
    125
    Messages:
    281
    Likes Received:
    0
    Trophy Points:
    30
    Just heard on an Aussie forum that Dell Asia Pacific has removed the XPS M1730 from its website...
     
  6. Eambo

    Eambo Notebook Evangelist

    Reputations:
    60
    Messages:
    361
    Likes Received:
    0
    Trophy Points:
    30
    That article brought up a strange point: these new cards are coming with DirectX 10.1 support.

    Windows 7 will be out Q1 next year *at the latest*. Personally, I can see an OEM copy in September and a full release before the end of the year. But anyhow, back to the topic at hand - DirectX 11.

    Windows 7 brings DirectX 11 with it, and I haven't seen any sign of cards supporting it yet. Has Nvidia said anything about mobile cards that will support it?
     
  7. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
    Apparently DirectX 11 is backward compatible with DirectX 10 cards. Not sure though, and that may be subject to change.
     
  8. BatBoy

    BatBoy Notebook Nobel Laureate

    Reputations:
    7,395
    Messages:
    7,964
    Likes Received:
    25
    Trophy Points:
    206
    I just installed 190.15 and found an interesting option in the Manage 3D settings section of the NVIDIA control panel.

    So far temps are typical of the current series - nice and low idle temps. Will test out L4D and TF2 later tonight. So far so good.
     
  9. SomeFormOFhuman

    SomeFormOFhuman has the dumbest username.

    Reputations:
    1,037
    Messages:
    1,012
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, I saw the latest drivers there too. Thanks for the heads-up. Will try this driver tonight. :)

    As for 9800M GTX upgrades - well, if you have spare cash, spend 'em. If not, stick with what you've got. Our 8800s/9800M GTs don't give significantly different performance vs. the 9800M GTXes anyway. Unless they offer a GTX 280M, which I doubt. But even if they did, it wouldn't be worth the upgrade either.
     
  10. KracsNZ

    KracsNZ Notebook Evangelist

    Reputations:
    91
    Messages:
    332
    Likes Received:
    0
    Trophy Points:
    30
    Sadly, for the time being it looks like we're s%$t outta luck for fan control. Despite the best efforts of Franck (from HWMonitor), who put a lot of work in, we have been unable to get GPU Fan 2 functioning properly. We had a big remote desktop debug session today.

    Both the CPU fan and GPU Fan 1 work perfectly, but Fan 2 just does not want to stay on. Franck even tried re-setting the fan speed every second to see if that would work, but it kept turning itself off again in less than a quarter of a second.

    We're both pretty sure there is a hidden Dell PWM command to switch off the automatic control, but since Dell would never disclose this info, it's going to be just too much work to get this functioning.

    Just a note on how well it would work: keeping the CPU fan and GPU Fan 1 at 50% (almost inaudible) yielded idle temps of sub-20C for the CPU and sub-40C (35 to 40C) for GPU 1. All this while GPU 2 was idling at 65-75C.

    Couldn't test while gaming, because as soon as GPU 2 reached a certain temperature (I think around 75C) the auto control kicked in and reset all the fans.

    Franck hasn't given up yet, and I think it may be worth it. I sent him an image of the Dell diagnostics program and he's going to try to debug/decompile some of the code and work out how Dell does the fan control in it.

    I wish him luck; I'm just sorry he's spent so much time on this :(
     
  11. Eambo

    Eambo Notebook Evangelist

    Reputations:
    60
    Messages:
    361
    Likes Received:
    0
    Trophy Points:
    30
    GL Franck =-( That would be amazing. Hope you can get it working and finally get us the cooling we deserve =-D
     
  12. zergslayer69

    zergslayer69 Liquid Hz

    Reputations:
    62
    Messages:
    1,551
    Likes Received:
    91
    Trophy Points:
    66
    It'd definitely be nice to be able to run the X9000 at 3.4GHz WITHOUT having the fans at full blast. With the fans on (like usual when the GPU hits 75C) and some undervolting, it should be plenty for cooling. But this max-RPM noise is just too much, as much as I want to fully utilize my CPU's capability.
     
  13. BatBoy

    BatBoy Notebook Nobel Laureate

    Reputations:
    7,395
    Messages:
    7,964
    Likes Received:
    25
    Trophy Points:
    206
    Just use headphones... :p
     
  14. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    I score 11,000 in 3DMark06 with my T7700 and SLI. That is after I have tweaked the system. It also depends on what drivers you use.
     
  15. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Well said. It's all nice that we get more shaders and memory, but a little research will also show that 1GB of memory is nigh useless on a 256-bit bus. I'm a little ticked off at nVidia for how they've named their mobile GPUs this generation. I mean, the 280M actually falls short of the top-of-the-line desktop 9800 GTX, whereas people reading "9800M GTX" seem to think those cards actually stand a chance against the desktop version - no comparison. It's a hotter, slightly higher shader-count 9800M GT with an extra 512 MB of VRAM, which is gimped by the fact that the card is still 256-bit.

    The 280M should at least have come with a 512-bit memory bus to distinguish itself from the current generation of mobile cards.


    BatBoy, thanks for linking the article to this forum. People need to read more of this stuff to get an idea of just how borderline the upgrades are getting this generation, and I'd consider these new and 'improved' cards to be downgrades more than upgrades. 128-bit? Oh, man. That brings back nightmares of struggling to run even old, obsolete games at resolutions above 1440x900.

    As for DX 10.1: yes, it has its uses. For example, under the DX 10.1 API, one could run deferred shading with multisample anti-aliasing at very little performance cost. Unfortunately, with our current DX10 configs, games require a special code patch for multi-sampling if deferred shading is involved; only one or two games employ this via patch, and those would be Far Cry 2 (for sure) and STALKER: Clear Sky (I think). Typically, the only kind of AA available with deferred shading on a normal DX 10.0 card is super-sampling, which we all know is very demanding in terms of performance. Now, how relevant deferred shading is (in my opinion very relevant, as it can make lighting look amazing without eating up system resources - Killzone 2), and how important AA might be under deferred shading, which Far Cry 2 already provides via a patch on standard DX 10.0 cards, is up for a lot of debate. I personally think DX 10.1 had some potential, but most of it is getting done with DX 10.0 now. As for DX11, it is backwards compatible, but that doesn't rule out the possibility of an improved API for cards that sport actual DX11 hardware - another thing nVidia seem to be lacking.
     
  16. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
    Yeah, NV's mobile GPU offerings haven't drastically evolved since the 8800M GTX was introduced. For about two years they've been doing shameless renaming and the like to marginally improve their offerings, partly because there was no threat of competition...

    Though I'd be very happy with 256-bit GDDR5 RAM...
     
  17. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Hey dude, long time no see.
    What tweaks are those? You beat me by 500 points...
     
  18. pchm

    pchm Notebook Consultant

    Reputations:
    53
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    Welcome back, Magnus. Long time no see! Eh, well, long time no seeing a post from you! :)
     
  19. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Agreed.

    And you're right about the RAM. I almost overlooked the GDDR5; at least they're showing ATi something now. Anyway, I'm not much of an expert or scholar on the subject, but wouldn't GDDR5 VRAM still make less of an impact than a proper 512-bit memory bus? Come to think of it, wouldn't something as fast as GDDR5 RAM be best utilised by a wider bus?

    More importantly, welcome back, Magnus. 11k is a pretty good score for a T7700, but it only shows me how useless my CPU is in real-world gaming. I just scored 11,920 in 3DMark06, using stock clocks with just a mild system clean-up, yet I know that Magnus' machine performs stutter-free and produces better frame rates in actual games like Crysis, GTA IV, Empire: Total War, etc.

    On the topic of static benchmarks, I would like to know how you got such a high Vantage score, Quicklite. 11k? I'm barely at 9k, yet our 3DMark06 scores are so similar. Not to mention, so are our machines.
     

    Attached Files:

  20. BigJayTee

    BigJayTee Notebook Consultant

    Reputations:
    15
    Messages:
    134
    Likes Received:
    0
    Trophy Points:
    30
    Does anyone here know if I can image my RAID 1 drives and restore the image onto bigger drives as an upgrade for my M1730? I really do not want to reinstall my system again.
     
  21. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    I would like to know that as well; slowly running out of space :eek:
     
  22. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
    Is there any way to mod the color of the keyboard backlight?
    Kinda getting bored of a single color.

    Maybe swap in the M17x's keyboard?
     
  23. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    I have one broken keyboard which I'm keeping just for spare keys. I can try popping it open to see if the LEDs would be easy to change. But from my brief observation it seems to be made from many small LEDs, which could take quite a lot of time to swap individually.
     
  24. Karma_Police

    Karma_Police Notebook Consultant

    Reputations:
    35
    Messages:
    133
    Likes Received:
    0
    Trophy Points:
    30
    Hey guys, I'm trying Magnus' Crysis Warhead config ( http://forum.notebookreview.com/showthread.php?p=3954336#post3954336) with my 9800M GT SLI, but I think the shadows are messed up. The ground looks so dark that I can't see anything on the floor at all. I can play the original Crysis fine with Magnus' config file... any suggestions?

    Oh, I'm using the official 185.65 drivers.
     
  25. hankaaron57

    hankaaron57 Go BIG or go HOME

    Reputations:
    534
    Messages:
    1,642
    Likes Received:
    2
    Trophy Points:
    56
    Well, Magnus' config was made before the 185.xx drivers came out, if I remember correctly.
     
  26. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Dude, his config is tweaked well, in my opinion. However, I don't believe your problem is the shadows. Try the following command:

    r_ssao_darkening

    With Magnus' setting, it's at 1.3. Personally, I like the high-contrast darkness, but that's a matter of taste. If the SSAO is too dark for you, then take it down to 1.0, or even 0.9, and that'll more or less solve your problem. So open the console using the '¬' key, and then enter the following command:

    r_ssao_darkening 1.0 (or 0.85, if you want it lighter).

    If this solution works, then simply open Magnus' config, find the r_ssao_darkening command, and alter its value to your preferred level.
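
    To spell that out (just a sketch; I'm assuming Magnus' file uses the usual Crysis/CryEngine cfg syntax, so treat the exact file name as my guess rather than gospel), the two steps look roughly like this:

    In the console (opened with the '¬' key), test a value interactively:
    r_ssao_darkening 1.0

    Then make it stick by editing the same cvar in the config file (e.g. an autoexec.cfg in the game folder):
    r_ssao_darkening = 1.0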
     
  27. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    Yep, checked today. There are tiny LEDs under each key, and they are soldered/glued to a thin plate of metal. Swapping them for multi-colour LEDs would take ages, but if you are patient and have time it might be possible. Just the process of removing the original LEDs seems painful. They hold on really tightly and the plastic sockets seem quite fragile. It might be possible to surgically remove them with a knife, but then again, it seems like a job that could take several days and in the end probably isn't worth the trouble.
     
  28. TimeWriter

    TimeWriter Notebook Evangelist

    Reputations:
    244
    Messages:
    419
    Likes Received:
    0
    Trophy Points:
    30
    Why don't you try putting a coloured piece of paper between the LEDs and the keyboard?
     
  29. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    The surface is not flat, and it's covered by another non-transparent foil, so only the raised area around the keys is exposed. It would take a lot of cutting and shaping, not to mention that at certain angles you would be able to see the paper when you look at the keyboard, which would look very cheap. However, based on your idea, it might be possible to use a thin layer of transparent paint. Just apply a bit to the LEDs; that could possibly work.
     
  30. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    With these 185.65+ drivers, I noticed a general improvement in SLI performance. So I decided to do a 3DMark06 test at 1920x1200, and actually managed to stay above 11k. I think that ain't bad considering my score with the older drivers at the same resolution.

    Driver: 185.85

    Resolution: 1920x1200
     

    Attached Files:

  31. KC0r8y

    KC0r8y Notebook Consultant

    Reputations:
    1
    Messages:
    279
    Likes Received:
    0
    Trophy Points:
    30
    Has anyone taken the PhysX unit out of their system? I'm curious to know if there were any temp drops or overall performance losses.

    Any info appreciated.
     
  32. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Eleron 'retired' his Ageia PhysX card recently. He should know about the temp differences.
     
  33. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    Wouldn't simply disabling it work? Or is it still running when it's disabled?
     
  34. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Well, technically, yes. Most likely, even when running, this card hardly produces much heat. It's only called into action when PhysX-specific software demands the hardware, unlike the GPUs, which still run at lower clocks during standard 2D desktop use. I doubt that removing or disabling the card will have any significant impact on overall system temperatures, but it would be interesting to get feedback from Eleron.
     
  35. ernstig01

    ernstig01 Notebook Evangelist

    Reputations:
    265
    Messages:
    425
    Likes Received:
    0
    Trophy Points:
    30
    I took out the Ageia PhysX card when I was cleaning my system.
    I can't tell whether it led to a measurable temp drop. I did see one, of course, but that was because I cleaned my fans and used AS5 on my CPU/GPUs.
    For sure it'll give you no performance loss. I had it disabled for a long time and decided to remove it when I opened my system.

    Look at it this way: if you don't need or use it, you can remove it. At least that's one less component in your system that could cause issues.
    A well-performing sports car you want to keep as light as possible. :cool:
     
  36. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Wait. Ernstig, was that you who 'retired' the card, or did Eleron also remove his? Sorry about the confusion.

    Anyway, could you do me a favour? I don't know if it's my GPUs or software, but my PhysX performance, as you know, is terrible with a certain game. Would it be possible for you to download the Cryostasis demo from nzone.com and give it a try with all settings at max and the resolution at 1280x800? It would really answer a lot of questions for me, because at the moment I have to rely on my Ageia card to make that game play smoothly with advanced PhysX effects, and that leads me to believe that the PPU card can still help with very GPU-intensive games that also use intensive PhysX.
     
  37. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    It should help. When you use your graphics card for PhysX acceleration, it takes a small part of its available power to do the calculations. If you use a dedicated PhysX card like the Ageia, then all the calculations are done there and the GPU is free from that load; it only has to render the picture. You will still feel a slight performance hit, as the GPU needs to draw more particles, but that should be much less stressful than doing the physics calculations on top of it.
     
  38. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Well, one would think that in theory, wouldn't they? But the difference is almost shocking: in Cryostasis, I am talking under 20 FPS with GeForce PhysX vs. a solid 35+ with the Ageia. Then again, this is a game that -really- uses and pushes PhysX; it was, after all, billed as nVidia's primary DX10 PhysX tech demo. On the GPU side, the graphics aren't amazing, but the particles and lighting are another story, seeing as the game uses geometry shaders and other unique DX10 features.

    Still, I'd like to see another person run this game smoothly using just GeForce PhysX. I think it dragged on my system because my system is badly managed at the moment. This is why I'm asking someone else to run the demo, and recommending a lower resolution for the tests, so the GPUs are not under stress from the graphics and have the headroom to manage the PhysX tasks. If the same result comes out, then I guess the Ageia was giving me an edge.
     
  39. Eambo

    Eambo Notebook Evangelist

    Reputations:
    60
    Messages:
    361
    Likes Received:
    0
    Trophy Points:
    30
    The question is, however, has Nvidia taken this into consideration? Have they got lazy and simply dropped the PhysX card in favour of their own cards, or do they still utilise the PhysX card if it's in the system?
     
  40. ACHlLLES

    ACHlLLES Notebook Virtuoso

    Reputations:
    303
    Messages:
    2,199
    Likes Received:
    0
    Trophy Points:
    55
    From what I've read, it all depends on the game. In some games, PhysX does work well and increases FPS.
     
  41. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    Hmm, well, you can disable PhysX from the nVidia Control Panel and that automatically redirects the work to the Ageia card, so I don't think they cut it out completely. If, however, you enable PhysX on the GPU, then I have no clue whether the Ageia is ignored or not, but it would be logical if it was, because by enabling PhysX in the nVidia CP you agree to use the GPU for this task. You can always turn it off.

    On the other hand, even though PhysX puts additional stress on the GPU, it might also compensate for it by being faster. After all, in the case of the Ageia, information has to travel to the PPU, be processed, and then travel back to the GPU (not sure if the CPU is also involved in this process, making it even more complex), so there will be some losses, while GPU-integrated PhysX calculates everything on the fly right in the core.
     
  42. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Most of the so-called PhysX drivers are supposed to accommodate Ageia cards as well. Essentially, according to nVidia, any PhysX driver will work just as well with both a GeForce and the Ageia PPU.

    Now, if you're comparing the power of the PPU card vs. a standard 8-series GPU, the GPU will win out - it has more going for it - but that would only hold if the GPU were working as a sole, dedicated PPU. I've tried this approach as well by disabling SLI, but the performance is still on the weaker side. I don't think it works out the same way on our laptops, as both GPUs are on the same board/slot or something.

    On the subject of the interface and latency between the PPU and GPU, this would be very small; most of the GPUs since the 6-series have been really fast. The theory is interesting, but we know that desktop enthusiasts now use older 8-series cards as dedicated PPUs, and that works out much better than having a single 280 handle both.
     
  43. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    That sounds hard to believe. When you enable PhysX, you don't only get the physics effects; there are also extra particles to be drawn, so you need more power. When you disable PhysX, the physics behaviour and the extra particles are removed and you get simpler, poorer effects. So logically, running without PhysX should be less demanding and thus yield higher FPS. But I'm no expert, so that's just my view; I don't want to argue. I kinda stopped being interested in PhysX after discovering how few titles support it.
     
  44. ernstig01

    ernstig01 Notebook Evangelist

    Reputations:
    265
    Messages:
    425
    Likes Received:
    0
    Trophy Points:
    30
    I suppose I can give it a try. Maybe this game really needs the Ageia PhysX card. I'm also curious to know.

    BTW, I think it was me. :p I retired my card about 2 weeks ago.
     
  45. artvarck

    artvarck Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Just picked up an M1730 from Dell Outlet:

    T8300, 9800M GT SLI, 256GB Ultra Performance SSD, 4GB RAM.
    Just under $1,700 shipped to HI.

    Does anyone have any experience with the Ultra Performance SSD? I'm curious about its performance compared to 7200 RPM drives.

    Thx
     
  46. Tuwa

    Tuwa Notebook Consultant

    Reputations:
    26
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
    Congratulations, that's a mighty fine beast you've got there; I'm sure you will be happy. As for SSDs, they are bloody fast. You can't really compare them to 7200 RPM drives; it's just that much of an upgrade. Everything starts to fly, programs launch instantly, and loading times in games are a matter of a few seconds. I had an opportunity to fiddle a bit with an Alienware equipped with SSDs and told myself I never want a computer without an SSD in the future. I would be buying an SSD myself if I wasn't selling my beast. Also, Dell usually uses Samsung SSDs in its laptops, which are among the best drives in terms of both reliability and performance.
     
  47. ernstig01

    ernstig01 Notebook Evangelist

    Reputations:
    265
    Messages:
    425
    Likes Received:
    0
    Trophy Points:
    30
    Congrats on your system.
    I have no experience with SSDs yet; they are still too expensive for me.
    For sure it will outperform a 7200 RPM drive, even two drives in RAID 0.
    Maybe someone who swapped a hard drive for an SSD has experience to share?
     
  48. Eambo

    Eambo Notebook Evangelist

    Reputations:
    60
    Messages:
    361
    Likes Received:
    0
    Trophy Points:
    30

    Very nice price for that o_O Apparently the SSDs boot much faster than the 7200 RPM drives, from what I've seen on this thread anyhow.
     
  49. 72hundred

    72hundred Revolutions-Per-Millennia

    Reputations:
    392
    Messages:
    1,228
    Likes Received:
    0
    Trophy Points:
    55
  50. Lord_Zath

    Lord_Zath Notebook Deity

    Reputations:
    179
    Messages:
    940
    Likes Received:
    7
    Trophy Points:
    31
    OK, so either my video cards are dead or Vista Service Pack 2 has killed them (or both).

    I updated to SP2 (64-bit) today via Windows Update. Upon restarting, everything seemed fine. Then I started up Sins of a Solar Empire, noticed some tearing, and BOOM - BSOD. I restarted, and as Vista was signing in, I got another BSOD.

    I managed to boot into Safe Mode and uninstall the nVidia drivers (I was on 185.85). I then ran Driver Sweeper and CCleaner. I booted into Vista and installed the latest nVidia drivers (downloaded from their site). Upon reboot, I had another BSOD.

    I had to leave to go to work, so when I get home I'm going to try to uninstall SP2. Can you guys think of anything else I could try?

    At this point, I'm assuming that if I can't uninstall SP2, or uninstalling SP2 doesn't solve the problem (after reinstalling the nVidia drivers), it's time to call Dell tech support and get replacement graphics cards.
     