Hello folks. I have an excellent M17X R2 equipped with an i7 940XM and the fantastic 1920 x 1200 RGB LED screen. It has been working great, but I have a selfish desire to upgrade my 1GB AMD Mobility Radeon 5870M to a GTX 680M or 7970M. I plan to get the 2GB version of the GTX 680M with the Dell-branded vBIOS. I just have a couple of questions before doing so.
Would I need to flash a vBIOS mod after I install the card, or will it work plug-and-play since it is a Dell-branded card? Also, the GTX 680M I plan on getting comes with the x-bracket installed, but can I still attach my 5870M heatsink to the 680M? Will the screws from the 5870M heatsink thread into the GTX 680M's x-bracket? I'm only buying the card and not the upgrade kit, to save money, since I already have thermal paste, an x-bracket and pads. I'm asking because I have heard from users running the 4GB Clevo version of the GTX 680M that the 5870M heatsink screws are too large for that card's x-bracket, and I wonder if the 2GB version is different. All I want to know is whether I can attach my 5870M heatsink to the GTX 680M and whether its screws will fit the GTX 680M x-bracket.
Any other things I should know before upgrading? Or should I just save my money and stick with the still-pretty-decent 5870M? I usually play at mid to high settings, but more is always better and a bit more future-proof, I guess.
This is the card I plan on getting. Will it work fine and be a simple installation? Thanks for the help in advance you all.
Dell Alienware M17x R4 NVIDIA GeForce GTX680M 2GB GDDR5 Video Card 20HTK | eBay
-
MickyD1234 Notebook Prophet
Hi there
From others that I have worked with: it will not be plug-and-play, as only GPUs supplied with a given model work that way. You will have to set the BIOS to boot from the dGPU (PEG) to disable Optimus, which otherwise messes up the card switching. I believe the R2 has this option, so you should not need an unlocked BIOS.
The dell vbios does have a drawback in that it limits overclocking to a max of +135 but this is probably good enough for most games.
The screws are different between AMD and NV cards, and the x-bracket sets a different clearance too. I did help one guy who swapped out the captive screws and AFAIK it all went OK, but take extra care monitoring temps on the first run through, just in case you don't get it right.
One other thing is that the fan profile can get messed up and you will need to use HWInfo to create a custom fan profile.
Finally, you will need a modded .inf file for the driver. Get that from here: LaptopVideo2Go: Drivers.
You are going to get a considerable performance improvement; it's only a question of whether doing all of the above is worth it to you.
Good luck! -
Why is booting to "PEG" important? I have heard of people getting black screens and failing to POST when booting this way.
Could you recommend a good modded .inf driver? I don't want to pick a bad one.
Also what kind of screws would I need to attach the heatsink to the x-bracket?
Thanks again for chiming in. -
MickyD1234 Notebook Prophet
The PEG boot option forces the machine to start on the dGPU (the new card) and not the built-in GPU. The problem, as far as I understand it, is that Optimus no longer works (probably because the card ID is not in the BIOS?), so you need to bypass it or get stuck on the on-board graphics. You would think disabling the iGPU in Device Manager should work, but it does not.
If there is some problem with the new card POST will fail but the fix is to remove the coin-cell battery and hold down the power button (no battery or PSU connected). This should reset the bios to defaults.
Afraid I don't know which screws are needed; they looked to me to be the same diameter (length unknown) as all the other screws I removed. The AMD ones were considerably thicker.
Use the link I provided earlier (laptopvideo2go) and you can drill down to a matched inf for any specific driver version.
-
I really appreciate it. So when I download the inf for that driver - let's say I later want to upgrade to the newest driver - would I have to redo the inf process?
-
MickyD1234 Notebook Prophet
You will need a new inf for each driver, and I have noticed that the name sometimes changes. There are over 30 inf files in the original install, and I believe there is more than one that can carry the machine ID/card ID. Just make sure you get the 'overwrite?' prompt when you copy it over.
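To make the hardware-ID point concrete, here is a rough sketch (not an official tool) of how you could check whether a downloaded .inf already lists your card's PCI hardware ID - if it doesn't, that's when you need the modded version. The file name, path and the Dell subsystem value below are placeholders; read your exact ID from Device Manager -> display adapter -> Details -> Hardware Ids.

```python
# Sketch only: does this NVIDIA .inf list my card's PCI hardware ID?
# VEN_10DE = NVIDIA, DEV_11A0 = GTX 680M; SUBSYS_XXXX1028 is a placeholder
# (1028 is Dell's vendor ID, the first four digits are machine-specific).
INF_PATH = r"C:\NVIDIA\Extracted\Display.Driver\nvdmi.inf"  # name/path vary by driver version
MY_DEVICE = "VEN_10DE&DEV_11A0"
MY_SUBSYS = "SUBSYS_XXXX1028"

def check_inf(path: str) -> None:
    text = open(path, errors="ignore").read().upper()
    print("GTX 680M listed at all:      ", MY_DEVICE in text)
    print("this Dell sub-system listed: ", f"{MY_DEVICE}&{MY_SUBSYS}" in text)

check_inf(INF_PATH)
```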
If this becomes a problem I do have a contact that modifies his own inf files, so provided he is around, he is very helpful! -
I did end up ordering the very card from the first link I posted above, and it should be here Saturday. Did you actually upgrade anyone's M17X R2 personally with the GTX 680M?
I just recently upgraded the i7 4700MQ in my AW 14 to an i7 4930MX CPU. It works great. It took a while to get to the CPU of the AW 14, but I hear it is quite easy to get to the M17X R2's GPU cards. Is this true? -
MickyD1234 Notebook Prophet
I only have actual hands-on experience with the R3 and R4. You might want to have a stab at a new thread in the R1/R2 Owners Lounge for a more targeted audience once it arrives, for any 'real' experiences.
From pics I have seen the R1/2 are as easy as the later models (2 screws) to get to the GPU(s). All that stressful prising and levering is reserved for a full stripdown -
Looks like the hard part will be getting out the captive screws, but I've got pliers and tweezers so I should be okay.
I thought you had worked on R2's before. I might start a new thread. -
MickyD1234 Notebook Prophet
You are correct that the CPU does not have integrated graphics, but you do have an on-board GPU (an nvidia, I believe), so you have graphics switching along with a special feature called Hybrid SLI. This was introduced by nvidia to push sales of their lower-end GPUs for use by OEMs; it allowed a sub-set of SLI to offload to the on-board chip (as long as it's an nvidia). As soon as intel decided to incorporate a GPU on the CPU die they effectively killed the majority of the mobile GPU market. From driver release 3xx onwards NV dropped hybrid support. You should see an option in the bios for setting Hybrid SLI, and this is where an R2 owner needs to chime in.
The captive screws are held in with e-clips that you can prise off with a very small screwdriver - but they 'ping' so use a cloth! -
-
MickyD1234 Notebook Prophet
Really need an owner here (or search the owners lounge?)... -
Alienware M17X R2 laptop GTX580 video card upgrade - YouTube -
MickyD1234 Notebook Prophet
That's good news re the PEG boot.
I'm only going on another user that went from AMD to NV, the screws were visually larger in the pics and he did swap them out to use it. Watching that vid I see where I was mistaken - if you're intending to reuse the x-bracket then it will fit, they didn't change the mounting position.
My concern with this would be that the x-bracket is what sets the card-to-heatsink clearance, and that is dependent on the GPU core height. Newer cards are typically thinner so you won't overstress it, but I'd be inclined to just put a small blob of paste on the core and assemble without any pads, then open it up and make sure it's all squeezed out with barely a thin layer left.
I also noticed that the guy had a lot fewer pads than my R3 580M had; mine included a putty-like line across the voltage regulators. I guess you'll already have got a pad layout pic?
He used the same site for the modified drivers I use when coming up against the 'cannot find hardware' problem.
This is my R3 580m as from dell:
Good to hear that his fans appear to be working - although at around 70C I would expect full speed, and it was not that loud. Just monitor under load (Heaven 4 is a good benchmark: Heaven Benchmark | Unigine: real-time 3D engine (game, simulation, visualization and VR)). If it hits 75C without the fan at full tilt, you will need HWInfo to get the best out of it when gaming.
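If you prefer a log over watching the HWInfo window while Heaven runs, something like this rough sketch can record the core temperature and the maximum it hits. It assumes nvidia-smi (which ships with the NVIDIA driver) reports temperature on your mobile GPU; if it doesn't, HWInfo's own logging does the same job.

```python
# Minimal sketch: sample the GPU core temperature once a second for ~10 minutes
# while a benchmark runs, and report the maximum seen.
import subprocess
import time

def gpu_temp_c() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

max_seen = 0
for _ in range(600):
    t = gpu_temp_c()
    max_seen = max(max_seen, t)
    print(f"core: {t} C   (max so far: {max_seen} C)")
    time.sleep(1)
```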
I think that because this is a machine Dell did not design - it was put together by enthusiasts at the original Alienware company - it is less finicky than the later models about hardware.
Oh, almost forgot, if there is any tape on the heatsink around the core area remove it.
-
So it is best to just put a small blob and not try to smear the paste to cover the GPU core die? Is 91% isopropyl alcohol good for cleaning the part of the heatsink's copper that the new GPU will make contact with? Any tips on cleaning and preparation?
I will keep you posted after I install it. I'm hoping for a success story. -
MickyD1234 Notebook Prophet
Then clean it up and put on the pads and new paste. Personally I have always used the 'business card' method. Apply a line along one edge and use some card to smear a thin layer across all the core area.
When I have been unsure of pad thicknesses I would use blobs of putty in a test assembly (with a touch of talc to stop sticking), opening it up and measuring the thickness, then adding 1mm for compression - probably overkill, but I get very OCD with these things.
90% or higher IPA is recommended for cleaning, but I found that medical injection-site swabs, although not as high a percentage as that, are very convenient, and a lint-free cloth comes into the bargain. You can even get them for free if you have a local needle exchange - just messing, you'll find them under diabetes supplies.
As for paste quality it is generally accepted that the difference between the best and average is only a few degrees at most. Have a look at this old article for some fun - Mayo? Really??
Thermal Compound Roundup - October 2011 | Hardware Secrets
Fingers crossed for you and please do post results -
http://forum.notebookreview.com/ali...-m17x-r2-680m-driver-assistance-needed-7.html
Then once I do that, I'll install the driver, get to the part where it says "incompatible hardware error", then close out, go to the folder with the inf file, and replace it with the modded inf. Couldn't I just do that to make it easier, as I'm not sure if laptopvideo2go has the inf file for the latest beta driver?
BTW, added some rep points for you! -
MickyD1234 Notebook Prophet
The inf is specific to a driver version, so you have to use the matched pair. You have the process sorted to extract all the files, and LV2Go are usually very quick with new releases. Just follow the link for the driver you want and you will be offered just the inf. If it opens as a page of code, manually 'save as' using the name as offered (and NOT as a text file). That's all there is to it.
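One quick way to be sure the inf you grabbed really matches the driver package you extracted: every .inf carries a DriverVer line in its [Version] section, and the last digits of the 4-part Windows build number correspond to the public release (e.g. ...13.3750 maps to 337.50). A small sketch, with an example path only:

```python
# Sketch: print the DriverVer line from an .inf so you can confirm it matches
# the driver you extracted before overwriting the original file.
import re

def driver_ver(inf_path: str) -> str:
    with open(inf_path, errors="ignore") as f:
        for line in f:
            m = re.match(r"\s*DriverVer\s*=\s*(.+)", line, re.IGNORECASE)
            if m:
                return m.group(1).strip()
    return "DriverVer not found"

print(driver_ver(r"C:\Downloads\nvdmi.inf"))  # example path - use wherever you saved the modded inf
```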
Edit: Here is j95's latest mod post : http://forum.notebookreview.com/ali...vidia-long-awaited-337-5-a-3.html#post9627719
He posts them same day usually so I recommend 'following' him -
So now all I would have to do is download that modded driver from j95 and I'm good to go? I wouldn't need to do anything else since the modded inf file is already in the driver version j95 posted?
-
MickyD1234 Notebook Prophet
That's why the driver files on LV2Go are not modified, and they say so in their disclaimer (sarcastically aimed at NV lawyers).
-
-
Robbo99999 Notebook Prophet
-
MickyD1234 Notebook Prophet
Thanks brother Robbo. Exactly as he says.
If you download the driver as well from LV2Go, it IS already extracted and then zipped up, so it will expand all the files and folders from wherever you run it rather than the default location - meaning that running the installer to create the files (and quitting at the error) is not needed. The inf file is still a separate download. Personally I use the NV-published version, run and quit, and just get the inf - less confusion about the location of the driver install files -
With the GTX 680M I can now play Tomb Raider on ultra with excellent fps. When I tried to do ultra on the 5870M card it would stutter pretty bad. Glad I did this upgrade! Well worth it and not very hard to do!
I'll keep you all posted. Add some rep power to my account if you liked my upgrade.
Old 1GB AMD Radeon 5870M card. Surprised this didn't have any thermal pads on it - just paste.
New 2GB GTX 680M all prepped with thermal pads and Arctic Silver 5 thermal paste.
Pics of the RAM chips on the back of the GTX 680M. The original 680M x-bracket had a cover thingy for these RAM chips, but since I had to reuse the 5870M x-bracket on the GTX 680M (I couldn't get those clips out and didn't have the right screws for attaching my 5870M heatsink to the original GTX 680M x-bracket), I used thermal pads instead to cover these chips. Hoping this will be okay.
This was the original GTX 680M x-bracket that I took off. Notice the difference between the GTX 680M x-bracket shown below and the 5870M x-bracket that I used on the GTX 680M above: the original GTX 680M x-bracket has another cover thingy attached that covers the four RAM chips, with a white thermal pad beneath it. If you look at the video here, you can see that when this guy upgraded his 5870M to a GTX 580M, he reused his 5870M x-bracket and did not put any sort of thermal pad on the four RAM chips behind the GTX 580M. I put thermal pads on the four RAM chips just to add a bit more protection.
http://www.youtube.com/watch?v=LNGnOU_Ky-8
Just have some questions about updating to a future NVIDIA driver. Would the process be the same? Would I need to remove the older NVIDIA driver and install the new driver with the modded inf? It'd be great if someone could let me know about this. -
MickyD1234 Notebook Prophet
Great to hear it worked out man
. I've seen others use just pads on the underside without a problem and I noticed on mine that there was a large pad, almost full card size, on the MB under the card.
The temps seem a little high to me but within working tolerances. If I see 75C or more then I'm looking to 'fix' it. A couple of things may account for the extra heat: the x-bracket being from a thicker card, and the pads. You have them in the right places, but it looks like some are different heights due to different chip thicknesses? My card has a mix of thicknesses; the VRMs have at least 3mm height before compression. If I were you I would open it up again and look for pressure dents - if you don't see them then it's not making good contact - and yes, you can double them up if needed. You just want them all to be close to the same level before assembly. But the 680M appears to be a very reliable card, so it may not be like mine and die from getting too hot all the time. Spending a little time on my pads gave me around -5 to -10C, and it games on a mild OC at around 68C. Different card of course, so YMMV.
As for the driver, yes you need a new inf for each release. No need to remove the old one (unless a problem occurs). I always just go for the upgrade option to keep all my settings and so far it's all worked just fine.
Did you check out the fan operation? Use HWInfo to monitor max/min/average speeds - it should be very loud at around 75c. I did hear from some R2 owners that full speed does not happen?
You can also check out a product called DDU (Display Driver Uninstaller) if needed. This strips out any traces of graphics driver installs if you need to track down a problem. You might have been better off running this before the NV driver install to remove all the AMD junk, but I forgot about that, and you seem to have escaped the issues 'old' installations can cause. Since it ain't broke, then ....
Enjoy the new lease of life you have injected into your rig! -
Robbo99999 Notebook Prophet
Well done, looks like it's working well. I think you'll be fine with the thermal pads underneath the VRAM on the bottom side of the card, as long as the pad is thick enough and in contact with the VRAM and also the chassis of the notebook - allowing it to transfer the heat away to the chassis/base of the notebook. My notebook is also configured in the same way, with 1 bank of VRAM covered by an X-bracket VRAM spreader, and the other bank of VRAM covered by a thermal pad just in contact with the chassis (like you've done) - so this should be fine for you.
(How come the paste on the VRAM of your 5870M is still in blobs and not spread out over the chips? It should have spread out when it was squashed by the heatsink.)
Your paste job looks a bit rough & ready on the GPU core of your 680M. Did you use the spread method for the Arctic Silver 5? If so, the idea is to get a paper-flat, perfectly even spread before you apply the heatsink - this minimises air bubble formation as the heatsink comes down to meet the GPU core. Your pic shows lots of swirls & uneven levels in the paste, which may have created quite a few air bubbles within the interface, slowing heat transfer & raising core temperatures under load. If I were you I'd redo that paste job.
EDIT: And as MickyD says, it looks like you have just placed a pad on each chip without much thought to chip height and its corresponding contact with the heatsink, which means you may have reduced contact on some chips. Uneven pad placement can result in your heatsink not sitting flat on your GPU core (raised GPU core temperatures), some chips not being covered properly, and possibly uneven physical pressure on parts of the card causing it to bend & fail - this was the hardest part of the install for me, making sure the pad heights were correct - took ages! You did it differently to me; I stuck the pads on the heatsink first, not the other way around. Also, the pads you've placed on the VRAM chips don't fully cover each chip, which isn't ideal - raised VRAM temperatures, possibly shorter life & lower overclockability. (Sorry to point out all these issues, especially when I initially said 'well done, it's working well' - you might well be fine with it as it is, but it doesn't look ideal.) -
MickyD1234 Notebook Prophet
Just to add to Robbo's comments: I also noticed the paste; if this is how it came apart after a trial assembly then I'd say that the contact area is not close enough (x-bracket issue) - too much left behind.
-
Thanks for being frank, guys. I appreciate the honesty. I'll do a better job on the thermal pad and thermal paste application this time and keep you both posted.
Also I tried to manually spread the paste. Am I better off just doing the dot method?
Edit: Just to add, the heatsink does seem to be making contact with the die, because when I took it off to check I could see that the paste had spread fully across the die. When I applied paste the first time I didn't cover the whole area, so I just manually spread it in a small square within the area. I also put on a little too much paste, I admit - got overzealous. So it seems to me the biggest contributors to the extra heat under load are the paste job and the uneven, too-thin thermal pads. In addition, the thermal pads I used did not cover the whole area of each chip.
I'll also get some longer screws just to add more pressure so there is better contact between the heatsink and the GPU die. -
Guys, what is the best thermal pad thickness I should get? The first batch I got were 1mm thick. Is that too thin? Would 2mm or 3mm be good, such as this one? Let me know what you guys recommend.
100x100x2mm GPU PS3 PS2 Xbox 360 Heatsink Compound Thermal Conductive Pad Blue √ | eBay -
MickyD1234 Notebook Prophet
With 1mm and 2mm pads you can double up or combine to get the correct thickness. If you look at the pic I posted earlier you can see three thicknesses. I think only the memory had 1mm, but I can't recall. Since it's not an exact science, try putting pads on the memory first, then the tallest components. If there is a visible (more than 0.5mm) height difference when looking across the horizontal cross-section (a straight edge helps here), then adjust. Once they are visibly at the same height, move on to the thinner components, which will probably need 2x2mm. You would do this on the card as you did before, although the normal method (when you know what goes where) is to apply the pads to the heatsink - again, look at my pic.
Like I said about figuring out the thicknesses, the only sure way is to use a small blob of modelling putty on each of the surfaces, assemble, then open it up and look at the thickness after compression. You want the pad 0.5mm to 1mm thicker than that. You might also consider getting 1.5mm pads as well, for a fuller range of combinations without going too big.
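Just to illustrate the arithmetic (a toy sketch - the gap numbers below are invented, measure your own card with the putty trick): pick the thinnest stack of the pads you have that comes out 0.5-1mm over the measured gap.

```python
# Toy helper for the "measure, add compression allowance, combine pads" idea.
from itertools import combinations_with_replacement

AVAILABLE_MM = [1.0, 1.5, 2.0]   # pad thicknesses on hand
ALLOWANCE_MM = 0.5               # aim 0.5-1mm over the measured gap

def pick_stack(measured_gap_mm, max_layers=2):
    target = measured_gap_mm + ALLOWANCE_MM
    best = None
    for layers in range(1, max_layers + 1):
        for stack in combinations_with_replacement(AVAILABLE_MM, layers):
            # never thinner than the target; among the rest, prefer the smallest total
            if sum(stack) >= target and (best is None or sum(stack) < sum(best)):
                best = stack
    return best

# Example (invented) measurements from a putty test assembly:
for part, gap in [("memory", 0.7), ("VRMs", 2.3)]:
    print(part, "->", pick_stack(gap), "mm")
```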
If you can get the correct screws for the 680m x-bracket I think this will make quite a difference. Even if you used too much paste it should have been all squeezed out. When you open it up again you will see by the dents in the pads which ones have good contact. No dent - no contact.
And you're cool with IPA as the cleaning agent - that's what is normally recommended.
Good luck! -
Thanks guys.
-
MickyD1234 Notebook Prophet
I've seen other people do the underside RAM that matches up with the shield (?) the same as you, but simply with one strip across them all. The other row doesn't have anything on it in the original install, but I don't know if that was a 2GB card or a 4GB card. Dell only uses the 2GB, so maybe theirs does not have the second row. Either way it works fine as-is for others, so I would not be concerned.
The two hottest parts are the VRMs, which align with the thick strip you can see in my pic, and of course the core, where I think most of the extra heat you are seeing is being generated. IMO you only want to see those temps when you overclock it. You should see a few C drop from getting the pads right, but if you want to try out an OC, get those screws.
Let us know how it goes -
-
Robbo99999 Notebook Prophet
senzazn12, here is how to do the spread method with Arctic Silver 5, and it is the exact method I used for my GPU too (pasted up now for nearly a full year with 66 degC max temp, room temperature 21 degC):
http://www.arcticsilver.com/pdf/appmeth/int/ss/intel_app_method_surface_spread_v1.1.pdf
Yep, make sure all those pads will be level when they meet the heatsink and/or card depending which way round you're doing it. Cover those VRAM chips a little more completely too (in terms of surface area). You'll get it perfect after a bit of work!
(P.S. The only temperature readings are from the GPU core itself, but incorrect padding on the other chips can interfere with proper heatsink-to-GPU-core contact, thereby increasing temps. You never know how hot your other chips are, which is why it's important to be confident that they're padded up all nice & level to ensure good temperatures on them - takes some sweet time!) -
MickyD1234 Notebook Prophet
Just for info, I discovered that the latest version of HWInfo also reports the VRM temps. I noticed mine were running in the 90s C, and sure enough, when Dell replaced the card a couple of weeks ago, the strip of putty-like compound was missing from the factory install! (While I have warranty I'm not fixing these things myself; as soon as it runs out I'll pop in a 680 myself.)
-
Robbo99999 Notebook Prophet
-
MickyD1234 Notebook Prophet
-
Robbo99999 Notebook Prophet
Cool, good that you're getting that info on the VRM's. Just double checked right now with the latest version of HWInfo - no mention of VRM info anywhere, only temperatures are core temperatures.
-
-
-
MickyD1234 Notebook Prophet
Thought this might be clearer as I run heaven for 10 mins to see the variations
current/min/max/average
Oh, and I just discovered that HWInfo stats get added into my Afterburner on-screen when it is running. Now I just have to figure out just what it is showing me since there are no titles...
Edit: @Robbo, did you get a post 'lost'? HWInfo64 version 4.36 beta -
MickyD1234 Notebook Prophet
The 780s are running hotter, but I think 70C or lower at full tilt with no overclock should be achievable on a 680M with a little time spent. Cooler is always better.
-
Robbo99999 Notebook Prophet
Ah, I've not tried the beta, I might try that. -
MickyD1234 Notebook Prophet
"Cool, good that you're getting that info on the VRM's. Just double checked right now with the latest version of HWInfo - no mention of VRM info anywhere, only temperatures are core temperatures."
maybe I'm just not looking hard enough...
Let us know how it goes for reference -
Robbo99999 Notebook Prophet
-
Hey bros, I'm wondering what the TDPs of your GTX 670M and GTX 675M are?
Also, when I checked the card again, I could see that most of the thermal pads weren't actually hitting the heatsink - only one or two of them were.
Do you think this is where most of my extra heat problems are from? -
Robbo99999 Notebook Prophet
In terms of the effect of incorrect pad placement & thickness, I explained the implications in my earlier posts in this thread: post #34 and post #26. -
Senzazn12 - replying in your thread instead of response to your PM -
Check this out for recommended thermal pad placement: Index of /support/drivers/zip/Alienware/6xxm
I only padded the side of the card that faces the heatsink, not the underside (although I don't think that's relevant for you if you're only using the 2GB version - i.e. potentially only has RAM on one side).
My temps under heavy load very rarely top 70C so the temps you're seeing are quite high.
I'd re-paste, re-pad and make sure that all vents are clear of dust etc.
It did take me a couple of attempts to find the ideal thermal paste spread / pad placement though.
The drivers can also make a difference - I've noticed some can run the card hotter than others, so whenever I install a new driver I'll take note of the temps / fan noise and FPS in-game. The current 337.50's are pretty good for me so I'll stick with them for a bit. -
Robbo99999 Notebook Prophet
-
Alienware M17X R2 GPU Best Upgrade Options (GTX 680M or 7970M) and Questions
Discussion in 'Alienware 17 and M17x' started by senzazn12, Apr 16, 2014.