The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

Latitude D830 Copper Mod?

Discussion in 'Dell Latitude, Vostro, and Precision' started by dejacky, Oct 7, 2010.

Thread Status:
Not open for further replies.
  1. dejacky

    dejacky Notebook Consultant

    Reputations:
    72
    Messages:
    247
    Likes Received:
    0
    Trophy Points:
    30
    System: D830 with NVS 140M GPU

    Has anyone tried performing a GPU copper mod on their D830? Even with my heatsink and fan vent clear, the GPU with its thermal pad is still causing dropped frames during high-res video playback and in L4D2 (after prolonged use). The CPU has Arctic Silver 5, which was a noticeable improvement, but after a while the GPU drops frames, so I know the problem is heat-related.
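
    (If you want to confirm it's heat before modding, a quick logger like the sketch below can track GPU temps while you game. This is only a rough illustration: it assumes a Linux install with the NVIDIA driver's nvidia-smi on the PATH; a GPU as old as the NVS 140M may only report temps through nvidia-settings -q gpucoretemp, or a Windows tool like GPU-Z.)

    Code:
    #!/usr/bin/env python3
    # Rough GPU temperature logger (illustrative; assumes nvidia-smi is available).
    import subprocess
    import time

    def read_gpu_temp() -> int:
        """Return the GPU core temperature in Celsius via nvidia-smi."""
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"]
        )
        return int(out.decode().strip())

    if __name__ == "__main__":
        # Log once per second; run the game alongside and watch whether
        # the frame drops line up with the temperature climbing.
        while True:
            print(f"{time.strftime('%H:%M:%S')}  GPU {read_gpu_temp()} C")
            time.sleep(1)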
     
  2. Commander Wolf

    Commander Wolf can i haz broadwell?

    Reputations:
    2,962
    Messages:
    8,231
    Likes Received:
    59
    Trophy Points:
    216
    I copper-modded my D830 maybe a month or two before I sold it... I think it took load temps from roughly 100°C down to about 85°C. Just make sure you get the right thickness of copper... though sadly this was so long ago that I don't remember how thick my copper was.

    EDIT: I think I still have some of the sheet... if I can find batteries for my calipers, I can probably measure it.
     
  3. dejacky

    dejacky Notebook Consultant

    Reputations:
    72
    Messages:
    247
    Likes Received:
    0
    Trophy Points:
    30
    That size info would be very helpful, or if you have a shim of the same size available, I'd gladly buy it. When you put the shim on, did you also leave the thermal pad in place or remove it?

    Overall, the D830 is the perfect laptop for me in terms of performance and the balance between power and battery life (9-cell + modular). Lately, I've noticed that when I hook up an external VGA monitor it constantly flickers, and my D830 qualifies for the Nvidia recall, but until that goes through I'd like to get the most out of it. Only God knows how long that RMA process will be...
     
  4. Commander Wolf

    Commander Wolf can i haz broadwell?

    Reputations:
    2,962
    Messages:
    8,231
    Likes Received:
    59
    Trophy Points:
    216
    I removed the thermal pad.

    With a thinner shim, you can leave the pad - it won't drop temps quite as much, but it'll supposedly apply more pressure to the faulty chip and prolong its life... I'm not really sure which approach is better or if either approach makes a significant difference.
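
    (As a back-of-the-envelope illustration of why copper drops temps so dramatically, you can compare the conduction resistance of a pad versus a shim across the die-to-heatsink gap. All the numbers below are assumptions for the sake of the example, not measurements from a D830.)

    Code:
    # Conduction resistance across the gap: R = thickness / (k * area).
    # Assumed values (illustrative only): copper k ~ 400 W/m*K, a typical
    # thermal pad k ~ 3 W/m*K, ~1 cm^2 contact patch, ~1 mm gap.
    def thermal_resistance(thickness_m: float, k: float, area_m2: float) -> float:
        return thickness_m / (k * area_m2)

    area = 0.010 * 0.010   # 1 cm x 1 cm contact patch, in m^2
    gap = 0.001            # 1 mm die-to-heatsink gap, in m

    r_pad = thermal_resistance(gap, 3.0, area)       # ~3.3 K/W
    r_copper = thermal_resistance(gap, 400.0, area)  # ~0.025 K/W

    # At ~10 W of GPU heat, the pad alone adds ~33 C across the gap,
    # while a copper shim adds well under 1 C.
    print(f"pad:    {10 * r_pad:.1f} C rise at 10 W")
    print(f"copper: {10 * r_copper:.2f} C rise at 10 W")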

    That being said, if you're under warranty and you're already getting flickering, I think you should try to get it replaced.
     
  5. Robin24k

    Robin24k Notebook Deity

    Reputations:
    274
    Messages:
    1,700
    Likes Received:
    5
    Trophy Points:
    56
    Interestingly, my D630 came from the factory with a piece of copper sandwiched between the thermal pad and the GPU. Mine was a fairly late build, so it's possible this was a running change. Have you checked whether yours has one?
     
  6. dejacky

    dejacky Notebook Consultant

    Reputations:
    72
    Messages:
    247
    Likes Received:
    0
    Trophy Points:
    30
    Yep, my D830 has the clunky thermal pad on the GPU. I saw it when I put AS5 on the CPU. :(
     
  7. WARDOZER9

    WARDOZER9 Notebook Consultant

    Reputations:
    35
    Messages:
    282
    Likes Received:
    8
    Trophy Points:
    31
    I tried 22 gauge and that wasn't thick enough, so aim for something thicker. I ended up using a piece of 22 gauge plus a thinner shim for a tight fit, and I was pleased with the result. Maybe I'll reshim the thing with a single thicker piece before I sell it, but for now my temps are already 10°C lower with the shim alone, and the pressure should be enough to help prevent the Nvidia chipset failure issue.
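
    (If your copper sheet is sold by AWG number, the standard AWG formula gives a ballpark thickness. Note that sheet-metal gauge scales are not the same as AWG, so treat this as a rough estimate and measure with calipers when you can. The snippet below is purely illustrative.)

    Code:
    # Approximate AWG-to-thickness conversion (illustrative only).
    # Sheet-metal gauge systems differ from AWG; measure the real sheet
    # with calipers if possible.
    def awg_thickness_mm(gauge: int) -> float:
        """Standard AWG formula: d(mm) = 0.127 * 92^((36 - gauge) / 39)."""
        return 0.127 * 92 ** ((36 - gauge) / 39)

    for g in (20, 22, 24):
        mm = awg_thickness_mm(g)
        print(f"{g} AWG ~ {mm:.2f} mm ({mm / 25.4 * 1000:.1f} mil)")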
     