The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

Dell Precision M4800 - Can GPU be upgraded?

Discussion in 'Dell Latitude, Vostro, and Precision' started by SengXun, May 30, 2016.

  1. Vaardu

    Vaardu Notebook Evangelist

    Reputations:
    140
    Messages:
    308
    Likes Received:
    283
    Trophy Points:
    76
    Would this apply to those with the M5100 heatsinks too? I had a look and it doesn't look much different from the one here. Why would the M5100 be fine with the VRMs not getting heatsink contact?
     
  2. tyrell_corp

    tyrell_corp Notebook Evangelist

    Reputations:
    84
    Messages:
    536
    Likes Received:
    36
    Trophy Points:
    41
    In my view they designed it according to temps in open-bench tests, but when it's all put together and closed up, plus running at 100%, it gets hot enough to melt flimsy soft plastic.

    Besides, the K2100M has 4 VRMs spreading the load, while the WX has 2 more effective ones (which take the toll).

    All I've found is that Nvidia or AMD, they'll all work fine for about 3 years and throttle when it gets too hot, say above 30°C ambient room temperature.

    But if you want to make sure it's not throttling, and to max out its potential without causing degradation, you should do what I did, or even something better!

    (The K2100M was also throttling a lot on the stock paste, as I remember.)

    There should be someone with a temperature probe to prove me right or prove me wrong; for now, what I did works, and works really well. YOU ARE WELCOME :)
     
  3. Vaardu

    Vaardu Notebook Evangelist

    Reputations:
    140
    Messages:
    308
    Likes Received:
    283
    Trophy Points:
    76
    I don't have anything that uses an IHS, but I was thinking of making one from a copper sheet, cutting and shaping it to fit. Is yours sitting under the heatsink, on top of a thermal pad, and then using paste on the VRMs? Could a thin thermal pad go in that spot instead? I might do it similarly to yours, but with a thermal pad and a VRM heatspreader made from scratch.
     
    tyrell_corp likes this.
  4. jpsulisz

    jpsulisz Notebook Enthusiast

    Reputations:
    0
    Messages:
    24
    Likes Received:
    7
    Trophy Points:
    6
    So I took apart my laptop again to replace the heatsink with the AMD one and saw the mod tyrell_corp made. My speaker/keyboard cover was also slightly melted. I made a similar 'heatsink' from an old wifi card heatsink that measured about 0.5 mm and found that the heat directly above the VRMs has been reduced considerably.
    [IMG]
    Though, just for added assurance, I did add some aluminium foil tape right above it to help out. (Sorry for blurry pic)
    [IMG]


    In terms of temperature, it did improve to a slight degree, taking about 5 minutes to hit 90°C rather than 1 minute in FurMark, but I found that the battery slice hinders cooling performance slightly, since it partially restricts the fan; it still hit maximum temperature in both cases. I should note that in real-world use I never see anything above 72°C with games like Rocket League or GTA 5, but it should be able to hold a FurMark stress test without issue. The card runs at 1V, so I may need to adjust the BIOS with something more suitable; otherwise, I don't know how you are keeping the card so cool with this BIOS.

    tyrell_corp, would you be able to back up your Dell WX4150 vBIOS via DOS with AMDFlash and upload it here? I can write a guide on how to flash/back up with DOS here if it would help; the gist of it is below. The backup I made in Windows is a completely different size and doesn't seem to play well with AMDFlash, so I'd like to see if flashing to a Dell vBIOS is possible (though I can't imagine why it wouldn't be).
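    Assuming the standalone DOS build of ATIFlash/AMDVBFlash (newer packages name the executable amdvbflash), something like the following should do it; I'm going from memory on the switches, so check the tool's built-in help before writing anything, and the .rom filenames here are just placeholders:

    atiflash -i                   (list adapters and note the card's index, usually 0)
    atiflash -s 0 backup.rom      (save the vBIOS of adapter 0 to backup.rom)
    atiflash -p 0 newbios.rom     (program newbios.rom to adapter 0; -f forces past an ID mismatch, use with care)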
     
    tyrell_corp likes this.
  5. Milko Krastev

    Milko Krastev Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    3
    Trophy Points:
    6
    Some feedback from my side regarding Quadro M2200.

    I purchased one from AliExpress; it took 3 months to be delivered, and it came without the X bracket on the back, although the seller's ad clearly showed the card with that X bracket. Swapping the X bracket from my K2100 to the M2200 was not trivial, but I managed to do it. This was in fact the hardest part of the installation process.

    Software-wise, everything worked out of the box. I have a dual-boot system with Windows 10 and Linux. Linux recognized the card and it worked right away. Windows needed a driver reinstall, but I haven't tinkered with any INF files.

    Performance-wise, this upgrade is totally worth it. I use DaVinci Resolve in my spare time and the performance boost is phenomenal. I can now play back 4K timelines in real time without any caching or proxy modes. With the K2100 I had to edit in 720p and then switch to 4K before exporting, and the footage was still quite choppy. DaVinci exports are about 2-2.5 times faster.

    My M2200 came from another Dell Precision so it has the right firmware. Not sure whether the process would be that straightforward if it were an HP card.

    At the moment, I see no point in replacing my good old M4800 with any newer Precision or Apple's M1. I simply love its rugged industrial design and high-quality 3200x1800 display. If you are into 4K editing and use DaVinci Resolve, bear in mind that a faster CPU would not help; you need a really fast GPU, and the M2200 does the trick.
     
    tyrell_corp likes this.
  6. tyrell_corp

    tyrell_corp Notebook Evangelist

    Reputations:
    84
    Messages:
    536
    Likes Received:
    36
    Trophy Points:
    41
    @jpsulisz

    give me a guide and I will try my best to get the vbios out.

    Good work, and good thinking! I had the same idea re foiling after making the first mod, but I could not find the foil at 2 AM in my covid jail. Excuses, I know.
    :D

    My small piece of advice, however, would be not to use heat pads on top of the VRMs (I've tried that).

    The reason for this is that you are ducting heat through an imperfect extension sink, so you want to minimise the losses in "transit". For that reason it's best to have a paste-based contact, because paste has a HUGE advantage in terms of W/mK conductivity and sits between the best pads and good TIMs (a rough back-of-envelope comparison is sketched a bit further down).

    The best "flexible" solution is to use Ceramique 2, as a non-conductive paste, with a thin-ish spread of it. (You only want a good contact with the DIY sink at both ends.)

    (It's also easy to clean up using isopropyl alcohol.)
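    To put rough numbers on the pad-vs-paste point, here's a quick back-of-envelope sketch in Python; the thicknesses and W/mK figures are ballpark assumptions for illustration, not measured specs:

    # Conductive thermal resistance of a flat TIM layer: R = t / (k * A).
    # Lower R means less temperature rise across the joint for the same watts.
    def r_theta(thickness_m, k_w_per_mk, area_m2):
        return thickness_m / (k_w_per_mk * area_m2)

    area = 5e-3 * 5e-3                   # assumed ~5 mm x 5 mm VRM contact patch
    pad   = r_theta(1.0e-3, 3.0, area)   # ~1 mm silicone pad at ~3 W/mK (ballpark)
    paste = r_theta(0.1e-3, 8.0, area)   # ~0.1 mm paste layer at ~8 W/mK (ballpark)
    print(f"pad:   {pad:.1f} K/W")       # ~13.3 K/W
    print(f"paste: {paste:.1f} K/W")     # ~0.5 K/W

    The thinner layer and the higher conductivity multiply, which is why a good paste joint moves heat into the DIY sink so much better than a 1 mm pad over the same contact area.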

    I found this quote online about a Ceramique 2 usage scenario, and it matches our needs here:

    "It was specifically designed not to fail under extreme vibration conditions, but you won't be taking off and landing your computer." (I will...) :)

    @Vaardu

    My current design* is a sink over the VRMs and under the main sink, over heat pads to protect the board from direct contact. Don't skimp: put a small heat pad* on every black square there is under the main sink; it's spongy and will not harm anything. 1-1.5 mm pads preferred.
    Top to bottom:
    TOP
    SINK (sitting over the VRMs)
    Pads
    BOTTOM

    Another thing I've found out by experimenting is that all silicone-based pads can be "washed": slowly and gently scrub them with gloved fingers under warm water to remove dust and debris (if any) and renew the contact. (You'll need sticker backing paper for drying, the kind you get stickers on.)




    hope this helps, like and subscribe to my channel :D (no channel)
     
    Last edited: Mar 16, 2021
    Vaardu likes this.
  7. tyrell_corp

    tyrell_corp Notebook Evangelist

    Reputations:
    84
    Messages:
    536
    Likes Received:
    36
    Trophy Points:
    41
    How much did you splash on the M2200?

    And I agree with your last statement.

    The reason I still like the M4800 is the socketed CPU. I don't like the BGA garbage they make now; it's practically guaranteed to develop solder joint cracks after circa 2-3 years in use, so the whole main board with CPU and GPU goes in the bin (it's like a tax: $400 the first year of failure and $280 the next, so there's no escape, go buy a new one) and it will require "reballing" (re-bolloxing) :D
     
    Last edited: Mar 27, 2021
  8. tyrell_corp

    tyrell_corp Notebook Evangelist

    Reputations:
    84
    Messages:
    536
    Likes Received:
    36
    Trophy Points:
    41
    @jpsulisz


    "I don't know how you are keeping the card so cool "

    My room temp is 21°C.
    The card was hitting 72°C under load before the Ceramique 2 cured (about 4-5 days is typical).
    Now the card can come down (at ~90% load) to 61-68°C under 100% stress and 100% fan (2500 rpm according to HWiNFO).

    Best TIMs for this (non-conductive, non-corrosive):

    MX4
    Ceramique 2 (4-5°C difference vs. the above at first, no difference after 2 weeks)

    Can you use HWiNFO and tell me the model of your LCD, please? Is it 4K or FHD? It should be listed under Display. This will help me get a replacement.
     
  9. Milko Krastev

    Milko Krastev Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    3
    Trophy Points:
    6
    A lot. $330 for the card and $70 more in customs duties, as I live in the EU.
     
    tyrell_corp likes this.
  10. PhOeNiX_H

    PhOeNiX_H Notebook Consultant

    Reputations:
    7
    Messages:
    129
    Likes Received:
    21
    Trophy Points:
    31
    I was going to narrate my adventure installing the WX 4150, but... there was none. It was actually plug and play. After that I flashed the RX 560 OC BIOS from aceoyame and it worked like a charm. The maximum temperature with manual fan control never surpasses 76°C, even in intensive games.

    I'm just having one issue: I'm not able to turn off Switchable Graphics, and I can't connect to any external monitor. It says it is connected, but it doesn't show anything. I verified in Windows that the RX 560 is being used for output. I tried DP and HDMI, same behavior on both.

    Is there something I need to do? I have an eDP screen, so I'm thinking that something is missing. Is there, at least, any way to use the iGPU for the external screen?

    Thanks.
     