The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    Alienware M17X R2 GPU Best Upgrade Options (GTX 680M or 7970M) and Questions

    Discussion in 'Alienware 17 and M17x' started by senzazn12, Apr 16, 2014.

  1. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Hello folks. I have an excellent M17X R2 equipped with an i7 940XM and the fantastic 1920x1200 RGB screen. It has been working great, but I have a selfish desire to upgrade my 1GB AMD Mobility 5870M to a GTX 680M or 7970M. I plan to get the 2GB version of the GTX 680M with the Dell-branded vBIOS. I just have a couple of questions before doing so.

    Would I need to flash a vBIOS mod after I install the card, or will it work plug-and-play since it is a Dell-branded card? Also, the GTX 680M I plan on getting has the x-bracket installed, but can I still attach my 5870M heatsink to the 680M? Will the screws from the 5870M heatsink fit the x-bracket of the GTX 680M? I'm only getting the bare card and not the upgrade kit, to save money, since I already have thermal paste, an x-bracket, and pads. I'm asking because I have heard users of the Clevo 4GB GTX 680M say that the screws from the 5870M heatsink are too large for that card's x-bracket, and I wonder if the 2GB version is different. All I want to know is whether I can attach my 5870M heatsink to the GTX 680M and whether the screws will fit its x-bracket.

    Anything else I should know before upgrading? Or should I just save my money and stick with the still pretty decent 5870M? I usually play at mid to high settings, but more is always better and more future-proof, I guess.

    This is the card I plan on getting. Will it work fine and be a simple installation? Thanks in advance for the help, everyone.


    Dell Alienware M17x R4 NVIDIA GeForce GTX680M 2GB GDDR5 Video Card 20HTK | eBay
     
  2. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Hi there :)

    From others I have worked with: it will not be plug-and-play, as only GPUs supplied with a model work that way. You will have to set the BIOS to boot from the dGPU (PEG) to disable Optimus, which otherwise messes up the card switching. I believe the R2 has this option, so you should not need an unlocked BIOS.

    The Dell vBIOS does have a drawback in that it limits overclocking to a max of +135, but this is probably good enough for most games ;).

    The screws are different between AMD and NV, as is the clearance set by the x-bracket. I did help one guy who swapped out the captive screws, and AFAIK it all went OK, but take extra care monitoring temps on the first run-through, just in case you don't get it right.

    One other thing: the fan profile can get messed up, in which case you will need HWInfo to create a custom fan profile.

    Finally, you will need a modded .inf file for the driver. Get that from here: LaptopVideo2Go: Drivers.


    You are going to get a considerable performance improvement; it's only a question of whether doing all the above is worth it to you.

    Good luck :)
     
    DDDenniZZZ likes this.
  3. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Hello, thanks for the tips. I just have a couple of questions.

    Why is booting to "PEG" important? I have heard of people getting black screens and failing to POST when booted this way.

    Could you recommend a good modded inf driver, as I don't want to pick a bad one?

    Also what kind of screws would I need to attach the heatsink to the x-bracket?

    Thanks again for chiming in.
     
  4. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    No problem, you're welcome :)

    The PEG boot option forces the machine to start on the dGPU (the new card) and not the built-in GPU. The problem, as far as I understand it, is that Optimus no longer works (probably because the card ID is not in the BIOS?), so you need to bypass it or get stuck on the on-board. You would think disabling the iGPU in Device Manager would work, but it does not :(.

    If there is some problem with the new card, POST will fail, but the fix is to remove the coin-cell battery and hold down the power button (no battery or PSU connected). This should reset the BIOS to defaults.

    Afraid I don't know which screws are needed; they looked to me the same diameter (length unknown) as all the other screws I removed. The AMD ones were considerably thicker.

    Use the link I provided earlier (laptopvideo2go) and you can drill down to a matched inf for any specific driver version.

    ;)
     
  5. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Very helpful and fast you are! :D I really appreciate it. So when I download the inf for that driver, let's say I later want to upgrade to the newest driver, would I have to redo the inf process?
     
  6. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Haha - retired (due to ill health) IT professional; it keeps my brain active!

    You will need a new inf for each driver, and I have noticed that the name sometimes changes. There are over 30 inf files in the original install, and I believe there is more than one that can carry the machine ID/card ID. Just make sure you get the 'overwrite?' question when you copy it over.

    If this becomes a problem I do have a contact that modifies his own inf files, so provided he is around, he is very helpful!
     
  7. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Oh I see. Well you seem to be a great part of the NBR community. You will be sure to get a lot of REP points from me. :D

    I did end up ordering the very card from the link I posted above, and it should be here Saturday. Have you actually upgraded anyone's M17X R2 with the GTX 680M personally?

    I just recently upgraded the i7 4700MQ in my AW 14 to an i7 4930MX CPU, and it works great. It took a while to get to the CPU of the AW 14, but I hear it is quite easy to get to the M17X R2's GPU cards. Is this true?
     
  8. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    :D :D

    I only have actual hands-on experience with the R3 and R4. You might want to start a new thread in the R1/R2 Owners Lounge for a more targeted audience once it arrives, for any 'real' experiences.

    From pics I have seen, the R1/R2 are as easy as the later models (2 screws) to get to the GPU(s). All that stressful prising and levering is reserved for a full stripdown ;)
     
  9. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    I just want to let you know that the i7 940XM does not have integrated graphics like the later i7 generations, so I don't have Optimus, meaning I won't need the "PEG" boot option. Can anyone else chime in?

    Looks like the hard part will be getting out the captive screws, but I have pliers and tweezers, so I should be okay.

    I thought you had worked on R2's before. I might start a new thread.
     
  10. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    You are correct that the CPU does not have integrated graphics, but you do have an on-board GPU (an NVIDIA, I believe), so you have graphics switching along with a special feature called Hybrid SLI. This was introduced by NVIDIA to push sales of their lower-end GPUs for use by OEMs; it allowed a subset of SLI to offload to the on-board chip (as long as it's an NVIDIA). As soon as Intel decided to incorporate a GPU on the CPU die, they effectively killed the majority of the mobile GPU market, and from driver release 3xx onwards NV dropped Hybrid support. You should see an option in the BIOS for setting Hybrid SLI, and this is where an R2 owner needs to chime in :).

    The captive screws are held in with e-clips that you can prise off with a very small screwdriver - but they 'ping', so use a cloth!
     
  11. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    I just checked the BIOS of my M17X R2 and there is no option to change booting to "PEG". I don't even see anything on there remotely like "PEG", and I'm running the latest A10 unlocked BIOS from "the Wiz" as well.
     
  12. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    I'm just guessing that it has a different name for setting the boot GPU - maybe PCIe graphics?

    Really need an owner here (or search the owners lounge?)...
     
  13. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    I have confirmed with another R2 owner who upgraded to the GTX 680M that he did not have to do a "PEG" boot. Also, on YouTube, a guy who swapped his 5870M for a GTX 580M used the same screws from the AMD heatsink and it worked fine. He did not do the PEG boot option either. Here is the video.

    Alienware M17X R2 laptop GTX580 video card upgrade - YouTube
     
  14. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    That's good news re the PEG boot.

    I was only going on another user who went from AMD to NV; the screws were visually larger in his pics and he did swap them out. Watching that vid I see where I was mistaken - if you're intending to reuse the x-bracket then it will fit; they didn't change the mounting positions.

    My concern with this would be that the x-bracket is what sets the card-to-heatsink clearance, and that is dependent on the GPU core height. Newer cards are typically thinner, so you won't overstress it, but I'd be inclined to put a small blob of paste on the core and assemble without any pads. Then open it up and make sure it's all squeezed out, with barely a thin layer left.

    I also noticed that the guy had a lot fewer pads than my R3 580M had, which included a putty-like line across the voltage regulators. I guess you'll already have got a pad layout pic?

    He used the same site for the modified drivers I use when coming up against the 'cannot find hardware' problem :D.

    This is my R3 580m as from dell: heatsink 003-small.jpg

    Good to hear that his fans appear to be working - although at around 70°C I would expect full speed, and it was not that loud. Just monitor under load (Heaven 4 is a good benchmark: Heaven Benchmark | Unigine: real-time 3D engine (game, simulation, visualization and VR)). If it hits 75°C without the fan at full tilt, you will need HWInfo to get the best out of it when gaming.
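HWInfo can log its sensor readings to a CSV file, so one way to sanity-check the fan response under load is to scan the logged samples for moments where the core is hot but the fan has not ramped up. The sketch below is purely illustrative: the 75°C threshold comes from the advice above, but the "full speed" RPM figure and the (temp, rpm) pair format are assumptions, not anything HWInfo prescribes.

```python
def fan_lagging(samples, temp_limit=75, full_fan_rpm=4000):
    """Return the samples where the GPU temperature meets or exceeds
    temp_limit while the fan is still below what we treat as full
    speed. `samples` is a list of (temp_c, fan_rpm) pairs, e.g.
    parsed out of an HWInfo CSV log. The 4000 RPM 'full speed'
    figure is an illustrative placeholder - check your own machine's
    maximum."""
    return [(t, rpm) for t, rpm in samples
            if t >= temp_limit and rpm < full_fan_rpm]


# Example log: only the 78°C / 2500 RPM sample should be flagged,
# since the 76°C sample already has the fan above the threshold.
log = [(65, 2200), (72, 3100), (78, 2500), (76, 4100)]
suspect = fan_lagging(log)
```

If `suspect` comes back non-empty after a Heaven run, that is the cue to build a custom fan profile in HWInfo rather than trust the stock curve.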

    I think that because this machine was put together by enthusiasts (the original Alienware company) rather than designed by Dell, it is less finicky than the later models about hardware?

    Oh, almost forgot, if there is any tape on the heatsink around the core area remove it :).
     
  15. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41

    So it is best to just put a small blob and not try to smear the paste to cover the GPU core die? Is 91% isopropyl alcohol good for cleaning the copper part of the heatsink where the new GPU will make contact? Any tips on cleaning and preparation?

    I will keep you posted after I install it. I'm hoping for a success story. :D
     
  16. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    I was talking about a test to make sure the contact is good. A small pea-sized blob in the middle should spread all over very thinly. If you post a pic of this I can see if it looks good :D.

    Then clean it up and put on the pads and new paste. Personally I have always used the 'business card' method. Apply a line along one edge and use some card to smear a thin layer across all the core area.

    When I have been unsure of pad thicknesses, I would use blobs of putty in a test assembly (with a touch of talc to stop sticking), open it up and measure the thickness, then add 1mm for compression - probably overkill, but I get very OCD with these things :eek:

    90% or higher IPA is recommended for cleaning, but I found that medical injection-site swabs, although not as high a percentage, are very convenient since a lint-free cloth comes into the bargain :). You can even get them for free if you have a local needle exchange - just messing, you'll find them under diabetes supplies.

    As for paste quality, it is generally accepted that the difference between the best and an average paste is only a few degrees at most. Have a look at this old article for some fun - Mayo? Really??

    Thermal Compound Roundup - October 2011 | Hardware Secrets

    Fingers crossed for you and please do post results :D
     
  17. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    I was wondering, for the modded inf part: couldn't I just download the latest working driver and modded inf from this thread?

    http://forum.notebookreview.com/ali...-m17x-r2-680m-driver-assistance-needed-7.html

    Then once I do that, I'll run the installer, get to the "incompatible hardware error" part, close out, go to the folder containing the inf file, and replace it with the modded inf. Couldn't I just do that to make it easier, as I'm not sure if laptopvideo2go has the inf file for the latest beta driver?

    BTW, added some rep points for you! :D
     
  18. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    :D.

    The inf is specific to a driver version, so you have to use the matched pair. You have the process sorted to extract all the files :), and LV2Go are usually very quick with new releases. Just follow the link for the driver you want and you will be offered the inf on its own. If it opens as a page of code, manually 'Save As' using the name as offered (and NOT as a text file). That's all there is to it :).
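Since the whole point of the modded inf is that it adds a line referencing the card's PCI hardware ID, a quick way to confirm you grabbed the right file is to search it for that ID before installing. This is only a hedged sketch: inf files reference devices with strings like PCI\VEN_10DE&DEV_xxxx, but the exact device ID for your card is something you should confirm in Device Manager (under the display adapter's Hardware Ids property) rather than take from this example.

```python
def inf_mentions_device(inf_text, dev_id):
    """Return True if any line of the .inf references the given PCI
    device ID (the DEV_xxxx part of a hardware ID such as
    PCI\\VEN_10DE&DEV_xxxx). Case-insensitive, since inf files are
    not consistent about casing."""
    needle = f"DEV_{dev_id}".upper()
    return any(needle in line.upper() for line in inf_text.splitlines())


# Hypothetical usage - "11A0" stands in for whatever Device Manager
# reports as your card's device ID:
# with open("nv_dispi.inf") as f:
#     ok = inf_mentions_device(f.read(), "11A0")
```

If the check comes back False, you most likely saved the wrong inf (or saved it as a text file with a mangled name) and the installer will still throw the "incompatible hardware" error.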

    Edit: Here is j95's latest mod post : http://forum.notebookreview.com/ali...vidia-long-awaited-337-5-a-3.html#post9627719

    He posts them same day usually so I recommend 'following' him :D
     
  19. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Thanks so much for the link! :D So now all I would have to do is download that modded driver from j95 and I'm good to go? I wouldn't need to do anything else since the modded inf file is already in the driver version j95 posted?
     
  20. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    No, all he posted is the NV driver link and his modded inf. It's just one of those things: you don't modify a package set from an OEM - they don't like it - you supply additional files ;). That's why the driver files on LV2Go are not modified, and they say so in their disclaimer (sarcastically aimed at NV's lawyers).
     
  21. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    I'm sorry, but I just got a little confused. So after I download the NV driver and his modded inf, what do I need to do to get the installation running successfully? Do I run the installer first and, once I get to the "incompatible hardware error" part, close out and replace the original inf file (in the folders created during the installation process) with the j95 modded inf file? Sorry again.
     
  22. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yes, that's what you do. You'll need to find out which folder to paste the modified inf file into, though. I think it's the Display Driver folder - you'll know it's the right one because it will ask you if you want to replace the existing file when you copy it to the correct location (proof that the file already exists there!). Then you run the setup.exe in the extracted NVidia folder and the driver will recognise your hardware & install.
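The copy-and-overwrite step described above can be expressed as a few lines of script. This is only an illustrative sketch: the `Display.Driver` subfolder name and `nv_dispi.inf` filename vary between driver releases, so treat both as placeholders, and note how the "right folder" check mirrors the 'overwrite?' prompt - if there is no original inf to replace, you are in the wrong place.

```python
import shutil
from pathlib import Path


def install_modded_inf(extracted_dir, modded_inf):
    """Copy a modded .inf over the matching original inside the
    extracted NVIDIA driver folder. Refuses to copy if no original
    with the same name exists there - the 'overwrite?' prompt is
    the proof you are in the right place, so we replicate that
    check instead of silently creating a new file."""
    extracted_dir = Path(extracted_dir)
    modded_inf = Path(modded_inf)
    # "Display.Driver" is an assumption; adjust per driver release.
    target = extracted_dir / "Display.Driver" / modded_inf.name
    if not target.exists():
        raise FileNotFoundError(
            f"{target} not found - wrong folder or wrong inf name")
    shutil.copy2(modded_inf, target)  # overwrite the stock inf
    return target
```

After the copy succeeds, running setup.exe from the extracted folder proceeds exactly as described above.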
     
    MickyD1234 likes this.
  23. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Thanks brother Robbo. Exactly as he says :D.

    If you download the driver from LV2Go as well, it IS already extracted and then zipped up, so it will expand all the files and folders from wherever you run it rather than the default location, meaning that running the installer to create the files (and quitting at the error) is not needed. The inf file is still a separate DL. Personally I use the NV-published version, run and quit, and just get the inf - less confusion about the location of the driver install files ;)
     
  24. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Hey bros! I got the GTX 680M this afternoon, installed it, and it's currently working great in my M17X R2! The process went relatively smoothly, except for taking the captive screw clips off the 5870M heatsink. Since I couldn't get those clips out, I removed the GTX 680M's x-bracket and put my 5870M x-bracket on the GTX 680M instead; that way I didn't have to take the captive screws out of the 5870M heatsink, and I could reuse the old screws from the 5870M heatsink, since I didn't have any original GTX 680M x-bracket screws. It was a perfect fit, except the GTX 680M's bracket had a piece attached that covered four of the RAM chips on the back side. Since my 5870M x-bracket did not have that cover (there were no RAM chips on the back of the 5870M), I just put some thermal pads on the four RAM chips. Would that be okay? I posted some pics; hope I can get some feedback. My temps at idle are between 35-40 degrees Celsius and under gaming load about 70-80 degrees Celsius.

    With the GTX 680M I can now play Tomb Raider on ultra with excellent fps. When I tried ultra on the 5870M it would stutter pretty badly. Glad I did this upgrade! Well worth it and not very hard to do!

    I'll keep you all posted. Add some rep power to my account if you liked my upgrade. :D

    Old 1GB AMD Radeon 5870M card. Surprised this didn't have any thermal pads on it - just paste.



    New 2GB GTX 680M all prepped with thermal pads and Arctic Silver 5 thermal paste



    Pics of the RAM chips on the back of the GTX 680M. The original 680M x-bracket had a cover for these chips, but since I had to reuse the 5870M x-bracket on the GTX 680M (I couldn't get those clips out and didn't have the right screws for attaching my 5870M heatsink to the original GTX 680M x-bracket), I used thermal pads to cover the RAM chips instead. Hoping this will be okay.



    This was the original GTX 680M x-bracket that I took out. Notice the difference between the GTX 680M x-bracket shown below and the 5870M x-bracket used on the GTX 680M above: the original GTX 680M x-bracket had an extra cover attached over the four RAM chips, with a white thermal pad beneath it. In the video here you can see that when this guy upgraded his 5870M to a GTX 580M, he reused his 5870M x-bracket and did not put any sort of thermal pad on the four RAM chips behind the GTX 580M. I put thermal pads on the four RAM chips just to add a bit more protection.

    http://www.youtube.com/watch?v=LNGnOU_Ky-8



    I just have some questions about updating to a future NVIDIA driver. Would the process be the same? Would I need to remove the older NVIDIA driver and install the new driver with the modded inf? It'd be great if someone could let me know.
     
    reborn2003, MickyD1234 and Robbo99999 like this.
  25. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Great to hear it worked out, man :D. I've seen others use just pads on the underside without a problem, and I noticed on mine that there was a large pad, almost full card size, on the MB under the card.

    The temps seem a little high to me, but within working tolerances; if I see 75°C or more I'm looking to 'fix' it. A couple of things may account for the extra heat: the x-bracket being from a thicker card, and the pads. You have them in the right places, but it looks like some are at different heights due to different chip thicknesses? My card has a mix of thicknesses; the VRMs have at least 3mm height before compression. If I were you, I would open it up again and look for pressure dents - if you don't see them then it's not making good contact - and yes, you can double the pads up if needed :). You just want them all to be close to the same level before assembly. The 680M appears to be a very reliable card, though, so it may not be like mine and die from getting too hot all the time :(. Spending a little time on my pads gave me around -5 to -10°C, and it games on a mild OC at around 68°C :D. Different card of course, so YMMV ;).

    As for the driver, yes you need a new inf for each release. No need to remove the old one (unless a problem occurs :(). I always just go for the upgrade option to keep all my settings and so far it's all worked just fine.

    Did you check out the fan operation? Use HWInfo to monitor max/min/average speeds - it should be very loud at around 75°C. I did hear from some R2 owners that full speed does not happen?

    You can also check out a product called DDU (Display Driver Uninstaller) if needed. This strips out any traces of graphics driver installs if you need to track down a problem. You might have been better off running this before the NV driver install to remove all the AMD junk, but I forgot about that, and you seem to have escaped any issues 'old' installations can cause :D. Since it ain't broke, then ....

    Enjoy the new lease of life you have injected into your rig :D
     
    Robbo99999 likes this.
  26. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Well done, looks like it's working well. I think you'll be fine with the thermal pads underneath the VRAM on the bottom side of the card, as long as the pad is thick enough and in contact with the VRAM and also the chassis of the notebook - allowing it to transfer the heat away to the chassis/base of the notebook. My notebook is also configured in the same way, with 1 bank of VRAM covered by an X-bracket VRAM spreader, and the other bank of VRAM covered by a thermal pad just in contact with the chassis (like you've done) - so this should be fine for you.

    (How come the paste on the VRAM of your 5870M is still in blobs and not spread out over the chips? It should have spread out when it was squashed by the heatsink.)

    Your paste job looks a bit rough & ready on the GPU core of your 680M. Did you use the spread method for the Arctic Silver 5? If so, the idea is to get a paper-flat, perfect spread before you apply the heatsink - this minimises air bubble formation as the heatsink comes down to meet the GPU core. Your pic shows lots of swirls & uneven levels in the paste, which may have created quite a few air bubbles within the interface, slowing heat transfer & raising core temperatures at load. If I were you, I'd redo that paste job.

    EDIT: And as MickyD says, it looks like you have just placed a pad on each chip without much thought to chip height and its corresponding contact with the heatsink, which means you may have reduced contact on some chips. Uneven pad placement can result in your heatsink not sitting flat on your GPU core (raised GPU core temperatures), some chips not being covered properly, and possibly uneven physical pressure on parts of the card, causing it to bend & fail. This was the hardest part of the install for me - making sure the pad heights were correct took ages! You did it differently to me: I stuck the pads on the heatsink first, not the other way around. Also, the pads you've placed on the VRAM chips don't fully cover each chip, which isn't ideal - raised VRAM temperatures, possibly shorter life & lower overclockability. (Sorry to point out all these issues, especially when I initially said 'well done, it's working well'; you might well be fine with it as it is, but it doesn't look ideal.)
     
    MickyD1234 likes this.
  27. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Just to add to Robbo's comments: I also noticed the paste. If this is how it came apart after a trial assembly, then I'd say the contact is not close enough (an x-bracket issue) - too much left behind.
     
  28. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    @Robbo99999 and MickyD1234: I did notice these small things I didn't do, which I'm sure accounted for the extra heat at load. I will definitely redo it. I'm getting thicker thermal pads in the mail and will make sure everything is fully covered. The thermal pads I got were too small and only 1mm thick; the new ones are 2mm thick and custom cut. Is 2mm thick enough, and should I double up the thermal pads? What should I use to get the thermal paste off? Are a microfibre cloth and 90% isopropyl alcohol good enough?

    Thanks for being frank, guys. I appreciate the honesty. I'll do a better job on the thermal pad and paste application this time and keep you both posted.
    :D

    Also I tried to manually spread the paste. Am I better off just doing the dot method?

    Edit: Just to add, the heatsink does seem to be making contact with the die, because when I took it out to check, I could see that the paste had spread fully across the die. When I applied paste the first time, I didn't cover the whole area, so I just manually spread it in a small square within the area. I also admit I put a little too much paste on - got overzealous. So it seems to me the biggest contributors to the extra heat at load are the paste job and the uneven, too-thin thermal pads, which also did not cover the whole area of each part.

    I'll also get some longer screws just to add more pressure, so there is more contact between the heatsink and GPU die.
     
  29. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Funny you mentioned DDU. Someone PM'd me recommending that I use DDU to remove all the AMD drivers and junk, which is what I did.

    Guys, what is the best thermal pad thickness I should get? The first batch I got were 1mm thick. Is that too thin? Would 2mm or 3mm be good, such as this one? Let me know what you recommend.

    100x100x2mm GPU PS3 PS2 Xbox 360 Heatsink Compound Thermal Conductive Pad Blue √ | eBay
     
  30. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    With 1mm and 2mm pads you can double up or combine to get the correct thickness. If you look at the pic I posted earlier you can see three thicknesses; I think only the memory had 1mm, but I can't recall. Since it's not an exact science, try putting pads on the memory first, then the tallest components. If there is a visible (more than 0.5mm) height difference when looking at the horizontal cross-section (a straight edge helps here), then adjust. Once they are visibly at the same height, move on to the thinner components, which will probably need 2x2mm. You would do this on the card as you did before, although the normal method (when you know what goes where) is to apply the pads to the heatsink - again, look at my pic.

    Like I said about figuring out the thicknesses, the only sure way is to use a small blob of modelling putty on each of the surfaces, assemble, then open up and look at the thickness after compression. You want pads 0.5mm to 1mm thicker than that. You might also consider getting 1.5mm pads as well, for a fuller range of combinations without going too big.
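The putty-and-measure approach boils down to simple arithmetic: the target pad thickness is the measured gap plus a compression allowance, and then you stack from the sizes you have on hand. A small sketch of that selection follows; the 0.5mm allowance, the 1/1.5/2mm stock sizes, and the two-pad stack limit are taken from the advice above, but the exact numbers are judgment calls, not hard rules.

```python
from itertools import combinations_with_replacement


def pick_pad_stack(gap_mm, stock=(1.0, 1.5, 2.0), allowance_mm=0.5,
                   max_layers=2):
    """Choose the thinnest stack of stock pads whose total thickness
    still meets gap + allowance, so the pad compresses against the
    chip rather than floating above it. Returns (total_mm, combo)."""
    target = gap_mm + allowance_mm
    candidates = []
    for n in range(1, max_layers + 1):
        for combo in combinations_with_replacement(stock, n):
            total = sum(combo)
            if total >= target:
                candidates.append((total, combo))
    if not candidates:
        raise ValueError("no stack is thick enough - buy thicker pads")
    return min(candidates)  # smallest adequate total thickness


# e.g. a measured 2.5mm gap wants >= 3.0mm of pad: a 1mm + 2mm stack
total, combo = pick_pad_stack(2.5)
```

The "smallest adequate" rule errs toward a thin, firmly compressed pad rather than an overstuffed one that could stress the card, which matches the visual levelling advice above.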

    If you can get the correct screws for the 680M x-bracket, I think it will make quite a difference. Even if you used too much paste, it should all have been squeezed out. When you open it up again, you will see by the dents in the pads which ones have good contact. No dent - no contact :(.

    And you're cool with IPA as the cleaning agent; that's what is normally recommended :)

    Good luck ;)
     
    Robbo99999 likes this.
  31. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Do you guys think I should put thermal pads on the four RAM chips on the right side as well, shown below? Do you think that might have contributed to the excess heat too? I will get my 2mm and 1.5mm thick thermal pads soon so I can redo it slowly and correctly. I'll make sure everything is stacked so it is even in height and puts the correct amount of pressure on the heatsink.

    Thanks guys.

     
  32. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    I've seen other people pad the underside RAM that matches up with the shield (?) the same as you, but with simply one strip across them all. The other row doesn't have anything on it in the original install, but I don't know if that was a 2GB card or a 4GB card. Dell only uses 2GB, so maybe theirs doesn't have the second row? Either way it works fine as-is for others, so I would not be concerned.

    The two hottest parts are the VRMs, which align with the thick strip you can see in my pic, and of course the core, where I think most of the extra heat you are seeing is being generated. IMO you only want to see those temps when you overclock it :). You should see a few degrees C drop from getting the pads right, but if you want to try an OC, get those screws :D

    Let us know how it goes ;)
     
  33. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Yeah, I'm going to see what it's like if and when I get the screws. They're just hard to find now; the ones on eBay are sold out. However, I think most of the heat is from my weak paste job and the thermal pads not hitting the heatsink. But yeah, I'll keep you posted.
     
  34. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    senzazn12, here is how to do the spread method with Arctic Silver 5; it's the exact method I used for my GPU too (pasted up now for nearly a full year with a 66 degC max temp, room temperature 21 degC):
    http://www.arcticsilver.com/pdf/appmeth/int/ss/intel_app_method_surface_spread_v1.1.pdf

    Yep, make sure all those pads will be level when they meet the heatsink and/or card depending which way round you're doing it. Cover those VRAM chips a little more completely too (in terms of surface area). You'll get it perfect after a bit of work!

    (P.S. The only temperature readings are from the GPU core itself, but incorrect padding on the other chips can interfere with proper heatsink to GPU core contact, thereby increasing temps. You never know how hot your other chips are, which is why it's important to be confident that they're padded up all nice & level to ensure good temperatures on them - takes some sweet time!) :)
     
  35. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Just for info, I discovered that the latest version of HWInfo also reports the VRM temps :D. I noticed mine were running in the 90s C, and sure enough, when Dell replaced it a couple of weeks ago, the strip of putty-like compound was missing from the factory install! (While I have warranty I'm not fixing these things myself; as soon as it runs out I'll pop in a 680 myself :D)
     
  36. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, I remember reading one of your posts saying about HWinfo and the VRM temperatures. Unfortunately, the latest version of HWInfo doesn't read VRM temperatures on Kepler cards (at least not on mine - I tried it after I read your post the other day). Thanks for pointing it out though.
     
  37. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    That sucks, I'm seeing a LOT more info that is useful:

    vrm.JPG
     
  38. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Cool, good that you're getting that info on the VRM's. Just double checked right now with the latest version of HWInfo - no mention of VRM info anywhere, only temperatures are core temperatures.
     
  39. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    I just noticed you guys are running GTX 670M and GTX 675M cards. Is it possible that the GTX 680M just runs hotter than the GTX 670M and GTX 675M? I'm using HWINFO and my max fan speeds are about 4,000 RPM for the GPU 1 fan, GPU 2 fan and CPU fan. My max CPU temperature is 75 degrees Celsius when gaming.
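    As a side note for anyone reading this later: on setups where NVIDIA's nvidia-smi command-line tool is available (it ships with the driver on newer systems), the same core temperature and fan readings can be pulled from a script instead of HWInfo. A rough Python sketch; the parsing is split out so it can be checked without a GPU, and note that many mobile GPUs report fan speed as "[Not Supported]":

```python
import subprocess

def parse_gpu_stats(raw):
    """Parse one line of `nvidia-smi --query-gpu=temperature.gpu,fan.speed
    --format=csv,noheader` output, e.g. "66, 45 %", into (temp_C, fan_pct).
    Fan speed comes back as "[Not Supported]" on many laptops; report -1."""
    temp_s, fan_s = raw.strip().split(", ", 1)
    fan_pct = -1 if "Not Supported" in fan_s else int(fan_s.split()[0])
    return int(temp_s), fan_pct

def read_gpu_stats():
    """Query the first GPU via nvidia-smi (assumes it is on PATH)."""
    raw = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu,fan.speed",
         "--format=csv,noheader"], text=True)
    return parse_gpu_stats(raw)
```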
     
  40. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    @Robbo99999 thanks for the link to the Arctic Silver method. I'll read it thoroughly before I get my thermal pads and work on the machine again. :D
     
  41. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Thought this might be clearer, as I ran Heaven for 10 mins to see the variations :)
    vrm2.JPG

    current/min/max/average :)

    Oh, and I just discovered that HWInfo stats get added to my Afterburner on-screen display when it is running. Now I just have to figure out what it is showing me, since there are no titles...

    Edit: @Robbo, did you get a post 'lost'? HWInfo64 version 4.36 beta
     
  42. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Yes, it's quite possible; I'm going on temps I have seen posted (and why I said YMMV earlier ;)). The 780s run hotter, but I think 70C or lower at full tilt with no overclock should be achievable on a 680M with a little time spent. Cooler is always better :D
     
  43. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Post lost?

    Ah, I've not tried the beta, I might try that.
     
  44. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    I got this notification of a post from you but didn't see any post?? :

    "Cool, good that you're getting that info on the VRM's. Just double checked right now with the latest version of HWInfo - no mention of VRM info anywhere, only temperatures are core temperatures."

    maybe I'm just not looking hard enough...

    Let us know how it goes for reference :D
     
  45. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Yep, the latest beta is not showing any additional temperatures for the GPU. Looks like VRM temperature monitoring might just be a Fermi thing.
     
  46. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Hey bros, I'm wondering what the TDPs of your GTX 670M and GTX 675M are?

    Also, when I checked the card again, I could see that most of the thermal pads weren't hitting the heatsink; only one or two of them were.

    Do you think this is where most of my extra heat problems are from?
     
  47. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I have the 670MX, not the 670M, there's a big difference: the 670M is a Fermi card, and the 670MX was the next generation up - a Kepler card produced in 28nm vs the 40nm of Fermi. My 670MX is supposedly a 75W card, but it uses way less than that at stock settings, at overvolted & overclocked settings it's using about 95W though now. 675M is a 100W card, 680M is also supposedly a 100W card, but uses just a little less at stock. So, for comparison, your 680M should be getting temperatures the same or better than a 675M in comparable systems.

    In terms of the effect of incorrect pad placement & thickness I explained the implications in my posts in this thread: post #34 and post #26.
     
    MickyD1234 likes this.
  48. Killiandros

    Killiandros Notebook Consultant

    Reputations:
    142
    Messages:
    246
    Likes Received:
    11
    Trophy Points:
    31
    Senzazn12 - replying in your thread instead of response to your PM -

    Check this out for recommended thermal pad placement: Index of /support/drivers/zip/Alienware/6xxm

    I only padded the side of the card that faces the heatsink, not the underside (although I don't think that's relevant for you if you're only using the 2GB version, i.e. it potentially only has RAM on one side).

    My temps under heavy load very rarely top 70C so the temps you're seeing are quite high.
    I'd re-paste, re-pad and make sure that all vents are clear of dust etc.

    It did take me a couple of attempts to find the ideal thermal paste spread / pad placement though.

    The drivers can also make a difference - I've noticed some can run the card hotter than others so whenever I install a new driver I'll take note of the temps / fan noise and FPS in-game. The current 337.50's are pretty good for me so I'll stick with them for a bit.
     
    MickyD1234 likes this.
  49. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    I'd just like to add that I think it's crucial that all VRAM chips are covered with a pad that is in contact with either the heatsink or the chassis, depending on which side of the card you're talking about. From touching my heatsink above the VRAM chips I can tell it gets quite hot in that area, and that the VRAM chips do produce a fair amount of heat that needs to be removed - don't just leave them all naked!
     
  50. senzazn12

    senzazn12 Notebook Consultant

    Reputations:
    122
    Messages:
    257
    Likes Received:
    48
    Trophy Points:
    41
    Did you use a NVIDIA x-bracket on your GTX 680M as well?
     
 Next page →