The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AW m17x r3 GPU swap from GTX 560M to GTX 675M

    Discussion in 'Alienware 17 and M17x' started by souleh, Aug 20, 2013.

  1. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    Hello,

    I'm sure this has been asked lots of times already, but I wasn't really able to find anything similar to what I was looking for, so here's the situation:
    I currently own an Alienware m17x r3 with the following specs:

    i7-2760QM @ 2.40GHz
    1.5GB GeForce GTX 560M
    16GB Corsair Vengeance @ 1600 MHz
    Boot: Crucial M4 256GB
    Storage: 750GB 7200RPM HDD
    Killer Wireless-N 1103
    1600 x 900 Display
    Win7 x64
    Alienware stock BIOS A08.

    and I was thinking about upgrading my GPU to a 675m.

    I'm pretty sure the 675M will fit the laptop, but I have a couple of questions, since this is the first time I've tried to swap a laptop GPU:
    1. Will the heatsink from the GTX 560M work with the 675M, or do I need to buy a new one alongside the card? What about the thermal pads/paste?
    2. Once the card is seated, do I need to make any tricky BIOS/VBIOS/driver/whatever modifications to get it to work, or is that not necessary?

    I'm sorry if this thread is redundant, but I thought it would be better to ask before I screw something up.
    Thanks for understanding.
     
  2. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    No problem, there is a LOT of information on this to sort through!

    The card is physically identical so the 580m heatsink (taking care with the pads and paste) will be just fine.

    No BIOS stuff or anything like that (except maybe something to do with Optimus where you have to turn off the iGPU, anyone??), but you will always have to use modified drivers in the future. That machine/card combination will not be in any Dell or NVIDIA drivers. Get them from the link below. The INF is the most important bit, and you have to manually save it to the correct location.

    LaptopVideo2Go: Drivers

    Good Luck ;)
     
  3. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    Thanks!
    So just to clarify, the 560M heatsink will NOT do?
     
  4. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Doh, sorry, I thought I was going from a 580! The heatsink will be fine, but you will need to get the x-bracket from a 580 or 675 and remove any black tape around the core area.

    Having a bad day. sorry again....
     
  5. bigtonyman

    bigtonyman Desktop Powa!!!

    Reputations:
    2,377
    Messages:
    5,040
    Likes Received:
    277
    Trophy Points:
    251
    If you want Optimus to work, you're going to have to flash the card back to a 580M. Is it a Clevo or Dell card that you are looking at? If you get a Dell one, a quick vbios flash and everything should work perfectly. ;)
     
  6. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    I'm not sure. I'm going to buy it from a guy I know; I suppose it's a Dell card. I will need to verify that. What's the difference between the two? And what about Optimus?
     
  7. bigtonyman

    bigtonyman Desktop Powa!!!

    Reputations:
    2,377
    Messages:
    5,040
    Likes Received:
    277
    Trophy Points:
    251
    Optimus enables the iGPU to run so you can have much, much better battery life. As to the difference between the cards, I believe the Clevo versions of graphics cards use a different set of memory chips than the Dell ones, as well as using a different vbios. Not sure exactly how to check and see which card it is easily though. :(
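
    If the card is already seated in a machine, one quick way to see which vBIOS it carries is to query the driver. This is only a hedged sketch, not something from this thread: it assumes the nvidia-smi utility that ships with the NVIDIA driver is available on the PATH and that this driver generation reports the vbios_version field.

        # Hedged sketch: print the GPU name and vBIOS version via nvidia-smi.
        # Assumes nvidia-smi.exe is installed with the driver and on the PATH
        # (or run it from C:\Program Files\NVIDIA Corporation\NVSMI).
        import subprocess

        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,vbios_version", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        print(out.stdout.strip())  # e.g. "GeForce GTX 675M, <vbios version>"

    Comparing that version string against known Dell and Clevo vBIOS dumps is one way to narrow down which vendor's card it is.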
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Dell cards are typically a darker shade of blue, while Clevo cards are quite a light shade.
     
  9. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    Yeah, I know what Optimus is; I meant to ask what the issue with it is when I swap cards. Why should I go back to the 580M?
     
  10. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    I believe it's to do with the card not being in the R3 BIOS, so Optimus will not work; it needs to run on the dGPU at all times. If you were to flash a 675 back to a 580 it should work just fine, since I've seen 580 users flash (for no reason) to 675. A bit of theory though... ;)
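
    For anyone wondering what that flash would actually look like in practice, here is a rough, hedged sketch and nothing more: it is not from this thread, the nvflash switches shown (-b to back up, -6 to override the board-ID mismatch) are the ones commonly quoted on forums for this era of nvflash, and the ROM file name is made up. Verify everything against your own nvflash version before touching a real card.

        # Hedged sketch of "flash the 675M back to a Dell 580M vBIOS",
        # wrapped in Python purely for illustration.
        import subprocess

        def backup_and_flash(new_rom, backup="675m_backup.rom"):
            # 1. Save the card's current vBIOS so the change can be undone.
            subprocess.run(["nvflash", "-b", backup], check=True)
            # 2. Write the replacement image (e.g. a Dell 580M ROM),
            #    overriding the subsystem-ID mismatch check.
            subprocess.run(["nvflash", "-6", new_rom], check=True)

        # backup_and_flash("dell_580m.rom")  # hypothetical file name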
     
  11. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    Ok, now we are talking about quantum physics as far as I'm concerned :D
    Anyhow, if I just install the 675M (which, I asked, is coming out of an M18x, so it should be a Dell card) the dGPU will work just fine, but I won't be able to use the Intel GPU. No biggie.

    So, and sorry if I'm a bit of a hassle, to sum it up: I just need to plug it in (with the X bracket and stuff) and use some custom drivers from the website in the first reply. Correct?

    Any suggestion on what thermal paste to use?

    Ninjaedit: anyone got a link to a guide and/or video showing the correct process to install the card? Also, rereading the first reply, what about the black tape around the core area? Is there supposed to be any?
     
  12. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Yup, you got it! You will need to find out how to turn off the iGPU in the BIOS, but I'm sure someone will give you that snippet of info ;) >removes propeller cap!< For paste you will find a lot of recommendations around; Arctic Silver is one I see often. I just use whatever I can find; in my own tests I saw around 3~5C lower with the more expensive ones.
     
  13. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    I'm assuming that will require some kind of unlocked BIOS, or at least that's what I found out by looking around.
     
  14. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    So yeah, I know I'm thinking too much, but since this is the first time I've done something like this, I'd really like not to screw up.

    Anyone got any advice on what thermal paste would be better to use for the GPU?
    What about the pads? Can I use the old ones, or would it be better to get new ones? What thickness?

    Thanks D:
     
  15. bigtonyman

    bigtonyman Desktop Powa!!!

    Reputations:
    2,377
    Messages:
    5,040
    Likes Received:
    277
    Trophy Points:
    251
    Yup, you will need an unlocked BIOS. I use Arctic Silver because that is what I have readily available and it works just fine. Yes, you can use the same thermal pads as well, but you may need to adjust them so they fit the new card properly. :)
     
  16. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    Sorry if I bring this up again, I got a silly question.

    I found myself with a spare x-bracket for the GPU, but somehow the glue on the side that goes against the GPU came off. Can I still use it, or is it no good? Would it cause any problems to use it without glue?
     
    reborn2003 likes this.
  17. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Hi, the glue is only there to hold it in place and keep it with the card, so I don't see any issues there, BUT the x-bracket is specific to each card. The post heights can be different, causing a loss of pressure (or too much) if the core is a different thickness. You should try to get one for the card, but if you can't, then assembling with paste and opening it up again should show whether almost all the paste is being squeezed out (there should be barely any left) or a thicker layer is being left behind. If it gets hot (70C+) on stock clocks then it may not be mated well.
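
    If you want to put a number on "gets hot on stock clocks", a simple temperature log while a game or benchmark is running does the job. The sketch below is just one way to do it, not something from this thread; it assumes the driver's nvidia-smi utility is on the PATH, and GPU-Z or HWiNFO would work just as well.

        # Hedged sketch: sample the GPU core temperature every few seconds
        # while the card is under load, to judge how well the heatsink is
        # mated after the repaste.
        import subprocess, time

        def log_gpu_temp(samples=60, interval_s=5.0):
            for _ in range(samples):
                out = subprocess.run(
                    ["nvidia-smi", "--query-gpu=temperature.gpu",
                     "--format=csv,noheader,nounits"],
                    capture_output=True, text=True, check=True,
                )
                print(time.strftime("%H:%M:%S"), out.stdout.strip(), "C")
                time.sleep(interval_s)

        # log_gpu_temp()  # run while the GPU is busy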
     
  18. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    Yeah, I'm pretty sure it's the bracket for the 675M, but then again, that's the one with no glue on it. I'm going to clean it anyway and also check whether the one from my 560M could fit. Gonna measure it and stuff. Well, thanks for the extra info anyway, I just wanted to be sure that the lack of glue would not be a huge hassle.
     
  19. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Cool, let us know how it goes ;)
     
  20. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    I got my thermal pads today, so I will be installing my GTX 675M soon, hopefully this weekend.
    I've got just one last question: how do I find the correct drivers on laptopvideo2go?
     
  21. MickyD1234

    MickyD1234 Notebook Prophet

    Reputations:
    3,159
    Messages:
    6,473
    Likes Received:
    1,165
    Trophy Points:
    331
    Great. I'd play it safe with a fresh card and use the 314.22 WHQL driver. I'm still using that one after having a bunch of problems (including the card failing!)

    It's not as easy as it could be at LV2go but once you have done it once it gets easier ;)

    Browse to the driver - there will be a lot of later ones - another is mooted for today or tomorrow. LaptopVideo2Go: Drivers

    You will see a link for the driver and the INF (note: the Win 8 driver is for Win 7 also). To expand the driver, download it and run the installer. It will fail with a 'hardware not detected' error, but the files will have been expanded to the C:\NVIDIA\... folder. Or you can manually expand them using a zip utility.

    You then go to the INF link on LV2Go. A page will open with all the code in it, and you have to manually save the file using the browser's 'save' feature. This file is going to replace the original, which will be in a subfolder of the expanded driver titled 'Display.Driver'. You should be prompted to overwrite the original. Run the setup.exe from one folder up, and this time it should go through.
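
    To make the sequence concrete, here is a hedged sketch of those steps in script form. It is only an illustration of the procedure described above: the installer file name, the download paths, and the INF file name (nv_dispi.inf is the usual NVIDIA display INF, but use whatever name the LV2Go page gives you) are all assumptions, not details from this thread.

        # Hedged sketch: extract the NVIDIA installer, drop the modded INF
        # from LaptopVideo2Go into Display.Driver, then run setup.exe.
        import shutil, subprocess
        from pathlib import Path

        driver_exe = Path(r"C:\Downloads\314.22-notebook-whql.exe")  # assumed name
        modded_inf = Path(r"C:\Downloads\nv_dispi.inf")              # saved from LV2Go
        extract_dir = Path(r"C:\NVIDIA\314.22")

        # 1. Unpack the self-extracting installer (7-Zip can do it; letting
        #    the exe run and fail also leaves the files under C:\NVIDIA\...).
        subprocess.run(["7z", "x", str(driver_exe), f"-o{extract_dir}"], check=True)

        # 2. Overwrite the stock INF with the modded one that lists the
        #    M17x R3 + GTX 675M combination.
        shutil.copy(modded_inf, extract_dir / "Display.Driver" / modded_inf.name)

        # 3. Run the installer from the top of the extracted folder; with the
        #    modded INF in place it should now accept the card.
        subprocess.run([str(extract_dir / "setup.exe")], check=True)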

    Check out posts from j95 as well, he's given instructions for this a few times :)
     
  22. souleh

    souleh Notebook Enthusiast

    Reputations:
    0
    Messages:
    21
    Likes Received:
    2
    Trophy Points:
    6
    So I installed the card yesterday. Everything went smoothly. The only thing I'm not sure about is whether I did a decent job with the thermal paste; I think I might have used too much, but so far the temps seem good, 50-54° at idle (which is the same as before).
    The whole process was pretty straightforward, and there were a lot of videos on how to reach the GPU, unmount it, and whatnot. The only extra thing I had to do was a little modification to the heatsink, since mine had some kind of black tape on the copper surface where the GPU die goes, and since the GTX 675M die is wider than the GTX 560M's I had to remove the tape so that the whole die could come in contact with the copper surface.
    Sadly, I was so focused on not screwing anything up that I forgot to take pics :/
    If it might be of any interest, I could copy/paste a post I made on another forum explaining the whole process.

    Anyway, thanks for all the info, it really helped a lot.