The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Aw m18x R2 Dual 980m SLI upgrade!!

    Discussion in 'Alienware 18 and M18x' started by Peter, Nov 12, 2014.

  1. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    Thanks for the response, I'll try that out today. I bought my original card from HIDevolution, not so sure about them now. If the GPU does turn out to be faulty, any recommended sellers for mobile GPUs?
     
  2. supervendor

    supervendor Notebook Geek

    Reputations:
    5
    Messages:
    95
    Likes Received:
    7
    Trophy Points:
    16
    Uhm, really I want to go for the 980m because of DirectX 12..
    For any other card I wouldn't do the upgrade.
    Does your reply mean an upgrade to the 980m is not possible on the m18x r2???
     
  3. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    It's possible, but quite a chore. If you're going to go dual cards, they are pretty expensive and you'll need to run in full UEFI mode to make it work. That means you'll have to completely reformat your hard drive to GPT and reinstall Windows 8 (note: Windows 7 won't work anymore). Honestly, I would only recommend 980ms if you are going balls to the wall and getting two of them. Otherwise two 780ms will just beat the crap out of a single 980m at a similar price. 780ms are also plug 'n' play on the 18x r2, no fiddling around required.
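    For reference, the GPT reformat boils down to a few diskpart commands run from the Windows 8 installer (Shift+F10 opens a command prompt there). A minimal sketch follows; the Python wrapper is purely for illustration (in practice you'd type the diskpart commands by hand at that prompt), and "disk 0" is an assumption, so check the "list disk" output first. The clean step erases the entire drive:

    # Illustration of the MBR -> GPT conversion step described above.
    # The real work is the diskpart script; normally you'd type these
    # commands interactively at the Windows setup prompt (Shift+F10).
    # WARNING: "clean" ERASES THE WHOLE SELECTED DISK. Disk 0 is an
    # assumption -- verify with "list disk" first.
    import subprocess
    import tempfile

    DISKPART_SCRIPT = """\
    select disk 0
    clean
    convert gpt
    exit
    """

    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(DISKPART_SCRIPT)
        script_path = f.name

    # diskpart /s <file> runs a script non-interactively.
    subprocess.run(["diskpart", "/s", script_path], check=True)

    Once the disk is GPT and the installer is booted in UEFI mode, Windows 8 setup creates the required EFI/MSR partitions on its own.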
     
    reborn2003 likes this.
  4. Scanner

    Scanner Notebook Deity

    Reputations:
    180
    Messages:
    877
    Likes Received:
    270
    Trophy Points:
    76

    I also got my 980s from HIDevolution. One of them was faulty as you describe: artifacts and black screens. If you do have to send it back to HIDevolution, MAKE SURE, I REPEAT MAKE SURE, you speak to the "manager" TED. Everyone else will give you the run-around (if not outright lie to you). Call on the phone and SPEAK to a live person; messages and emails go unanswered (that happened to me). When I finally spoke to TED, my replacement was sent out the same day. Other places that sell cards are Eurocom and RJTech; I went with HIDevolution because they had a 1-year warranty. If I had to do it again: NOOO.
     
    reborn2003 likes this.
  5. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    Thanks... I'll keep that in mind. I've pretty much given up hope of a still-working card at this point, but I'll still try UltraGSM's suggestion just to be sure. Also, any thoughts on how it just broke on me? It worked perfectly until last night, crushing benchmarks and games. Temps never went above 65. And then all hell just broke loose after a freeze and force restart.
     
  6. Scanner

    Scanner Notebook Deity

    Reputations:
    180
    Messages:
    877
    Likes Received:
    270
    Trophy Points:
    76
    I honestly could not say, BUT I was also overclocking and then BAM! Card died. Was it the overclocking? Who can say, but that is what I was doing. The card did not get hot; it just passed away. Good luck.
     
  7. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    DX12 will work with Kepler too, btw. A lot of the software improvements are being made available to Fermi and Kepler as well; it's just the hardware shader features that won't be available.
     
  8. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    I just opened up my system and checked the card. And I found this:
    http://i.imgur.com/uGYmQI9.jpg
    http://i.imgur.com/c0Zrdpo.jpg
    Is this the cause? This is the back side of the GPU (I didn't put any thermal pads there; are they needed?). People have been telling me this is probably the sign of a dead GPU due to overheating, and that the excess heat caused the plastic to melt a bit.
     
  9. supervendor

    supervendor Notebook Geek

    Reputations:
    5
    Messages:
    95
    Likes Received:
    7
    Trophy Points:
    16
    I'm very interested in DirectX 12; this is my priority.
    This is why I want the 980m, because it's fully DX12 compatible, right? Kepler and Fermi are not fully compatible, right?
    So, I can format, use GPT and Windows 8 without problems; I don't need Windows 7. In that case, is dual 980m just plug and play in the m18x r2, or do I need to mod the BIOS, drivers, firmware, etc.? I can't find any fully successful upgrade reports for the m18x r2 with dual 980m. And I care about whether the BIOS switch between discrete/integrated graphics works. In my laptop Enduro is disabled; I just choose integrated or discrete (7970m) and restart, and I like that a lot. I don't know if it's the same with an NVIDIA card..
    And does SLI mode work without issues?
    If I have to go for the 780 I'll just stay with the 7970, because the upgrade is not that big.
    Thank you
     
  10. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    I'm pretty sure Kepler is just as compatible with DX 12 as Maxwell. If I were you I would hold off on the 980m. They are really expensive and have shoddy compatibility with the r2 (UEFI required, throttling). Furthermore, reports of success on the r2 with 980ms are pretty limited. And many people (including me) are reporting faulty 980ms, definitely too much for comfort. Right now I most likely have a dead 980m in my hands, with what looks like overheated and fried VRAM modules, even though GPU temps were fine (<65) when it was still working. 780ms, on the other hand, are really stable and have perfect compatibility with the r2. It's also a tried and tested setup by many people. And the performance difference between 7970ms and 780ms is pretty significant, as AMD is really behind when it comes to mobile GPUs.
     
    Last edited: Feb 14, 2015
  11. supervendor

    supervendor Notebook Geek

    Reputations:
    5
    Messages:
    95
    Likes Received:
    7
    Trophy Points:
    16
    I hate throttling; it makes no sense to have such an expensive card if I have this problem. So in the end the m18x r2 is not fully compatible with the 980m. At this point I'll just go for dual 680s or 780s, if those really are fully compatible, or I'll buy a new PC with a 980. I bought Alienware because I thought it would be possible to do an easy GPU upgrade after 3 or 4 generations, but now I understand Alienware feels just the same as the other brands about GPU upgrades. Maybe my next PC is a full-spec Asus G and I keep it untouched for 5 years; I need to think. A lot of years have passed, but even today not much has changed about laptop upgrades, with Clevo too. I hope one day you can change a laptop GPU like on a desktop, without stress.
     
  12. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    "I'm pretty sure Kepler is just as compatible with DX 12 as Maxwell." This is true, but there is talk of an upgrade of DX12 to e.g. 12.1 or further. Then earlier DX11-class hardware will not be as compatible..
     
  13. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    That might be true, but before DX 12 comes out people really shouldn't bring DX 12+ compatibility into their buying decisions. Nothing as of right now has been confirmed to be compatible with later versions of DX 12, and there probably won't be any confirmation on that until DX 12 actually comes out. Second of all, DX 12 won't really make much of a difference in games. It's designed to shine in incredibly draw-call-intensive situations (Star Swarm and the new 3DMark), but in actual games it will make very little difference (as with Mantle). There will certainly be exceptions, like if a dev gives absolutely no ****s about optimizing their game. *cough* Ubisoft and ACU *cough*
     
  14. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    Well, that's the ultimate goal of all laptop designers. Unfortunately, the fact that all laptops are custom designed and have basically no standards or interchangeability in parts (other than RAM and storage) makes no-hassle upgrades very difficult to achieve. MXM GPUs are kinda weird: they sorta follow a standard and technically can fit in all MXM laptops, but it's really a matter of luck once laptop makers stop providing BIOS updates for a model.
     
  15. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    I really need an answer so I am reposting this:
    http://i.imgur.com/uGYmQI9.jpg
    http://i.imgur.com/c0Zrdpo.jpg
    Here is the back side of what looks like a faulty GTX 980m. The text on the VRAM module is somewhat smeared out and the plastic has unnatural marks on it. Note that the surface of the module is still perfectly flat. Do you guys think this is the cause of the defect? My friends say the modules were probably not properly cooled, causing the plastic on top to melt a bit, all while the GPU temps were fine.
     
  16. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    DX12 will be coming to all NVIDIA GPUs from the Fermi ones onward...
     
    Kade Storm likes this.
  17. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    Thermal pads cause the markings on the RAM chips to fade; this is normal and not a sign of a malfunction.
     
    Mr. Fox likes this.
  18. supervendor

    supervendor Notebook Geek

    Reputations:
    5
    Messages:
    95
    Likes Received:
    7
    Trophy Points:
    16
    But I understand a GPU is really fully DX12 only if the hardware supports it, and I understand only the 9 series fully supports DX12 in hardware.
    So maybe the other NVIDIA cards are compatible via software but are not really 100% compatible, so they can't use DX12 at 100%.

    What I don't know is where ATI stands with regard to DX12; maybe it's software backward-compatible too?
     
  19. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Okay, a lot going on here.

    DX12 efficiency improvements are going to be available to all GPUs from Fermi onwards. Some exclusive DX12 features will require DX12-compatible hardware. Keep in mind that the current generation of consoles will not have this hardware advantage, so most games' frameworks will be limited in this regard. So, for me, it's really not a deciding factor. I made a DX10 GPU last all the way through the previous generation, beating console quality and performance on every count up until 2013. It's not a matter of real or 100% compatibility; it's two different facets of DX12, and the decision rests on what one wants and how much it'll matter over the coming years.
     
  20. wuruochong

    wuruochong Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    Alright, I took my card to a friend who's quite knowledgeable about PCBs in general. He told me that some modules on the card are visibly not original and were soldered on afterward. Basically, that card is most likely refurbished. I bought my card from HIDevolution and their listing says it's a brand new card.
    Judging from the reported number of failures of cards bought from HIDevolution, and the fact that my friend is quite experienced in the field (he fixes PCBs for a living; I think he can tell whether something is reworked or original), it may well be that HIDevolution refurbishes some of their cards and sells them as new. Keep this in mind the next time you buy.
     
  21. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,312
    Trophy Points:
    431
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    And they get access to a lot of the overhead reduction too, so it's not just that you can install DX12.
     
  23. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Are you sure?

    That sounds a bit sad :(
     
  24. Nycro

    Nycro Notebook Consultant

    Reputations:
    78
    Messages:
    100
    Likes Received:
    18
    Trophy Points:
    31
    Thanks for the write-up! Now I just need to sell some of these m17 parts to help pay for my new cards, lol.
     
    reborn2003 likes this.
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    For gaming and running the API, all DX11 cards support it. For using DX12-exclusive features, which are usually not used in games (just like the DX11.1, DX11.2 and DX11.3 features, which aren't used in games), Maxwell and beyond (and possibly AMD's next series and beyond) are required.

    DX12 (like mantle) aimed to remove CPU bottlenecks. This means that if your game didn't use a whole lotta CPU to begin with OR you had an overly powerful CPU, you would notice diminishing returns.
    Let's give some examples:
    Let's say you played Tomb Raider 2013. Anyone who tried that game knows it uses about as much CPU as a 10-year-old girl eats at an all-you-can-eat buffet: nothing. Adding Mantle or DX12 to this would give fairly little gains.
    Let's take BF4 and give you a single 780M and a 4.3GHz 4930MX. Adding DX12 or Mantle to this would give fairly little gains once more.
    Let's take BF4 and give you three GTX 980s with a 20% OC on each and a single i5 4450S. DX12 would just about double your FPS, if not more.
    Let's take BF4 and give you a 4930K at 4.5GHz and a GTX 760. I'd be bloody amazed if you got a single fps increase with DX12.
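
    To put numbers on that pattern, here's a toy frame-time model (a sketch with made-up milliseconds, not a benchmark): each frame takes as long as the slower of the CPU and the GPU, and a low-overhead API like DX12/Mantle only shrinks the CPU side.

    # Toy frame-time model: a frame is gated by whichever of CPU/GPU is
    # slower, and DX12/Mantle is modeled as cutting only the CPU cost.
    # All numbers below are illustrative, not measurements.

    def fps(cpu_ms: float, gpu_ms: float) -> float:
        """Frames per second when the slower side gates each frame."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    def dx12_cpu(cpu_ms: float, overhead_cut: float = 0.5) -> float:
        """Assume the new API removes a fraction of the CPU cost per frame."""
        return cpu_ms * (1.0 - overhead_cut)

    # GPU-bound (strong CPU, modest GPU): no visible gain.
    print(fps(5.0, 20.0), fps(dx12_cpu(5.0), 20.0))    # 50.0 50.0

    # CPU-bound (weak CPU, triple high-end GPUs): FPS roughly doubles.
    print(fps(20.0, 6.0), fps(dx12_cpu(20.0), 6.0))    # 50.0 100.0

    The same model says a game that barely touches the CPU (the Tomb Raider 2013 case) stays pinned to the GPU-bound line no matter which API you use.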

    If your game likes CPU (anything built on CryEngine 3, anything built on Frostbite 3, Dying Light, etc.), then DX12/Mantle could very well give +50% FPS bonuses. Most of these games don't use CPUs/GPUs very well, and sometimes have an actual "limit" on how much of a CPU core they can use when running on a quad-core or better CPU (on CryEngine 3 it's 95% on one core; Frostbite 3 appears to be 60-70%, it used to be 80% in BF4's beta but I haven't seen it that high since; Dying Light seems not to cross 75% on my machine, and it seems like they fixed that 100%-on-one-core load issue; etc.), so they end up performing exceedingly inefficiently even though the hardware you're using is powerful. There's honestly no excuse for anyone on this planet with two 980s and a 4.3GHz 4930K to get under 120fps at 1080p unless using SSAA of some kind... but go watch TotalBiscuit's reviews and he'll definitely do it. Yes he will. He'll do it in games time and time again.

    Eventually we get to a point where we sort of brute-force through unoptimizations, and games "adjust" to the stronger hardware with more unoptimizations.

    Want to know what unoptimization is? Let me tell you:
    TOXIKK is a DX9 game on UE3. A friend just gave it to me the other day. In TOXIKK, I get fairly nice visuals (mainly lighting) and ~60-80fps or so (it only uses one card).
    Unreal Tournament 4 is an ALPHA GAME in DX11 on UE4. I tried it last year; I haven't updated it since. A couple of the maps were more fleshed out graphically. My lowest FPS was 100. In explosions. At max graphics. It also only uses one card.

    That's unoptimization, folks. UE4 might be a powerful engine, but whoever the hell is coding UT4 apparently knows what the word "optimization" means and what to do with it.
     
    Ashtrix and Kade Storm like this.
  26. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,747
    Messages:
    29,856
    Likes Received:
    59,723
    Trophy Points:
    931
    Stardock CEO Brad Wardell, currently working on the strategy game Galactic Civilizations III, is ratcheting up expectations in a post on Twitter. He says that DirectX 12 provides 820 percent better performance than today's DirectX 11 in an otherwise unspecified test of lighting and lens effects.
    Brad Wardell @draginol

    Did a test of DirectX 11 vs. DirectX 12 on an unreleased GPU with an 8core CPU. DX11: 13fps, DX12: 120fps. Lighting and lens effects.

    22:01 - 16 Feb 2015
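    (The arithmetic is consistent: (120 − 13) / 13 ≈ 8.2, i.e. roughly an 820 percent improvement, in that one specific test.)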
     
  27. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    Yes, there will be specific instances where you can construct a test that abuses a weakness in DX11 so that a DX12 benchmark performs much better.
     
    Kade Storm likes this.
  28. E3E

    E3E Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    7
    Trophy Points:
    6
    Hey guys. Is it recommended to use HWiNFO to control the fans while running 980Ms in SLI in an M18x R2, or is that mainly necessary for the AW 18?
     
    reborn2003 likes this.
  29. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    The R2 works fine without needing HWiNFO. The AW18 needs it, but I'm not sure if that's the case for all of them.
     
    E3E and Mr. Fox like this.
  30. reborn2003

    reborn2003 THE CHIEF!

    Reputations:
    7,764
    Messages:
    2,988
    Likes Received:
    349
    Trophy Points:
    101
    Yup, only the Alienware 18 requires HWiNFO to control the fans, because without it the right (secondary) GPU fan will not run at all, causing that GPU to overheat.
     
    TBoneSan and E3E like this.
  31. E3E

    E3E Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    7
    Trophy Points:
    6
    TL;DR-- I'm gushing about finally finalizing this upgrade! Thank you for the responses.

    Thanks a lot, you two. :D I've been using HWiNFO extensively to monitor my temps and so far so good. The first run showed the primary graphics card getting way too hot, so I ensured there was enough thermal padding (Fujipoly Extreme) and redid the paste job (Gelid GC Extreme in each instance) using the X pattern (I read that it is one of the best for square-shaped dies; I used a smooth spread on the CPU since that was second best and since the X isn't really applicable to a rectangular die. Compared to the temps I was getting with the line method I used last time, I'd say as long as the application is nice and smooth, it will be effective!). Put the GPU back in, and bam, cut my temps by 10-12 degrees. Everything is running smoothly right now. I'm running Mr. Fox's modded drivers and they work like a charm. It's amazing that one can turn SLI on or off on the fly with the desktop version; I was so used to having to restart with my 680Ms every time I switched.

    For now, I'm using HWiNFO to control the fans, since I really like the ability to fine-tune the control there.

    Other mentionables (I saved up a ton and put sooo much effort into this upgrade, it's been exhausting!): I just couldn't help but purchase the L-shaped spreader bracket (revised) for the backside of the GPU for both of these cards. The Fujipoly SARCON extreme pads are not very sticky, so getting the plates to adhere to the back of the cards was virtually impossible. I took Mr. Fox's advice and kept the adhesive backing on the X bracket. It will be nice to keep those around for when the next upgrade comes, whether I'll be hanging onto AW or not. I wanted to use the stock pads on the brackets, but 1) they were not optimally placed for the 980m's different VRAM chip layout, which is about 1-2mm further out, and there's that one offset chip which the 580ms (which I believe the L-style plate is designed for), or the 680ms for that matter, did not have, so I had to use the Fuji pads, and 2) the pads were kind of chewed up, probably from brushing against some things, so there was no way I could keep them.

    I also looked into upgrading the fans on my Cooler Master NotePal U3. I bought an extra set for my NotePal when I first purchased it way back in late 2012/early 2013 for my M18x R2. Six fans was a bit ridiculous, I'll admit, but they definitely lowered temps. So I looked around and found various threads. This time, I went for a set of three Cooler Master JetFlo 120mm fans, and they are just lovely. Got the blue LEDs to match the usual color I like my AW's LEDs to be. Perfect hue too. Bought that kinda weird SilenX external fan controller that was listed in the modding thread. It doesn't seem to have an off switch, so I have to unplug it whenever I want to turn off my laptop. It's not a big deal, but it's just kind of odd. Voltage control is amazing, and I love these fans a ton. Not sure if I'll mangle my U3 to open up the grate and allow full unrestricted airflow. I'm happy with the temps so far, so most likely not--for now.

    So yeah! I'm really digging it! Most games run flawlessly and beautifully. Sleeping Dogs is a weird exception; it starts stuttering really badly after a while when I play it. No idea why. With the FPS limiter on it seems to work okay, but with it off, that's when I see that odd effect. I am playing on a 120Hz monitor, but I have no idea what could be causing it.

    -- some afterthoughts --

    On the X-pattern application of TIM: this style uses SO much paste. One 3.5g tube is only good for around 5 applications or so, maybe 6. The temps might be worth it in the end, but I find it borderline silly. Doesn't stop me from doing it, though!

    Also, I don't have the guts to use any liquid-metal TIMs. I'm in no way hardcore enough of an OCer (I'm casual, if that!) to justify it.
     
    Last edited: Feb 25, 2015
    TBoneSan, Kade Storm, ped and 2 others like this.
  32. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    Hi there, I have the following question: is the PSU mod required to run two 980ms in the Alienware M18x R2?
    Also, would the 7970m heatsink work with the 980m?

    Cheers
     
    Last edited: Mar 13, 2015
  33. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,344
    Likes Received:
    2,352
    Trophy Points:
    181
    The 7970 heatsink would, but you'll have to trim the black tape off the GPU area because the GPUs are oriented in different directions.
     
  34. kenny27

    kenny27 Notebook Deity

    Reputations:
    294
    Messages:
    919
    Likes Received:
    167
    Trophy Points:
    56
    At this stage my understanding is that you do not need the dual-PSU mod, as the system throttles before you reach the limit of one PSU.
     
    Rotary Heart and TBoneSan like this.
  35. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    Thanks for the quick answer. I did, however, just find another thread by Mr. Fox that kind of worries me: http://forum.techinferno.com/alienw...-980m-alienware-m17x-r4-m18x-r2-aw17-116.html

    It seems like the 980m is not working in a stable fashion in his Alienware.
    People there are talking about the GPU throttling and not achieving full power.

    At this point, is the upgrade actually recommended? Or should I just plan to sell my beloved Alienware and get a Sager or MSI? :(

    Cheers
     
  36. Ashtrix

    Ashtrix ψυχή υπεροχή

    Reputations:
    2,386
    Messages:
    2,082
    Likes Received:
    3,292
    Trophy Points:
    281
    It's really unfortunate that Alienwares can't run 900M GPUs properly...
    Throttling (power related), plus the UEFI requirement.
    IIRC the throttling occurs with a 4+ GHz OC on XM CPUs on the M18x R2 (you also can't use the dual-PSU mod, and can't run a hard OC on the GPUs either); PEG-mode throttling on single-GPU machines is 100% (Optimus-only is a fix for 60Hz machines); the AW18 is a dead horse with the 900M upgrade.

    Tbh the GTX 780M SLI in an M18x R2 is a real beast. If u want the latest and greatest, switch to a Clevo machine with SLI, 120Hz (rare), and no BIOS BS (Prema mod ftw), but you will definitely miss the quality part, rest assured :vbthumbsup:.

    Don't opt for MSI or ASUS machines; they are all BGA filth.
     
    Last edited: Mar 14, 2015
    Kade Storm likes this.
  37. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    But how much of a performance boost would two 780ms grant me? I figured the 980ms should be at least 60% faster in 3DMark 11 than my current 7970m CF; I wouldn't go through all this hassle for merely a 30% improvement :/
    Then again, it's not very financially sensible to sell my Alienware either.

    And yes - like you said, the build quality of the r2 is a very nice thing to have. And to be quite frank, I find the Clevos ugly and just plain looking.

    Meh :(
    I just compared them on notebookcheck.com and the results are very disappointing. Around 15% faster than my CrossFire setup? Great! So instead of 40 fps I will get 46 fps in Crysis 3. :( Seems like the only reasonable thing to do is to wait for Windows 10 and DirectX 12 and then jump ship...
     
    Last edited: Mar 14, 2015
    Mr. Fox likes this.
  38. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    Considering the investment it's hard to justify getting the 780M cards over what you have.
     
    Ashtrix likes this.
  39. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    I disagree, @Meaker. Having owned both, I consider it no exaggeration to say that GTX 780M SLI absolutely DESTROYS 7970M CrossFire. In fact, even GTX 680M SLI performance is superior to 7970M CrossFire. I've had the luxury of comparing all three GPUs back to back and NVIDIA wins decisively... no contest. AMD is better only for GPGPU compute performance.

    Just don't waste money on a 980M upgrade for an Alienware. Dell/Alienware and NVIDIA remain silent and apparently don't care. Be careful selecting a Clevo as an alternative, because there is a model or two that behaves badly with 980M throttling just like Alienware.

    Severe GTX 980M Throttling | GeForce Community
    garbage.JPG
     
    @tomX likes this.
  40. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    Notebookcheck is full of crap. Their results reflect 780M performance using a stock vBIOS, which is almost always a disappointment with NVIDIA GPUs. The 7970M overclocks poorly and is famous for having an abbreviated lifespan and being prone to failure.

    You would be shocked at how badly 780M SLI performance will embarrass your 7970M CrossFire performance. If you're into overclocking, using a modded vBIOS for the best performance and benching, you can expect a gigantic performance increase. If you run them stock, and think 60 FPS @ 1080p is the holy grail of gaming, don't bother upgrading to anything.

    Numbers don't lie. Here is the best each GPU has to offer running in a laptop with the same CPU.

    3DMark 11 - 7970M CrossFire (P13440) versus GTX 780M SLI (P17197)
    Red_vs_Green.JPG
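    (For scale: P17197 over P13440 is about a 28% higher overall score, and since the overall 3DMark 11 result folds in a CPU-weighted physics test, the graphics-only gap between the two setups is wider than that.)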
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    A single 980M will do better, though, and the cost of buying two 780M cards versus a single-980M machine makes the 780Ms hard to justify, considering the better performance and that it's one card instead of two.
     
    Mr. Fox likes this.
  42. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    Yes, 980M is awesome and reasonably priced considering the huge performance increase. Unfortunately, it doesn't work well in all machines, and it doesn't appear to work well as an upgrade in most Alienware systems. If it doesn't work right, it's a waste of money. That's my opinion, for what it's worth. If you chip in a quarter, that might be enough to get you a cup of coffee at the local Starbucks drive-thru.
     
  43. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    I'm talking about a new system... Not being able to take the latest card would be a killer for me.
     
  44. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    Oh, my bad. I thought you were responding to the question about upgrading from 7970M CF. Sorry for the confusion. I agree with you 100% regarding a new system in which 980M works correctly... very nice performance.
     
  45. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,202
    Likes Received:
    17,918
    Trophy Points:
    931
    If it were a solid upgrade I would say a 980M would be an excellent choice, but given the issues I find it hard to recommend. I only see mobile jumping ahead with the release of a full (or at least fuller) GM204 after the 980M.
     
  46. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    Yup. Should be interesting to see what happens. I'm more concerned at this point with the pathetic BGA garbage they are calling CPUs. The mere fact that NVIDIA arbitrarily decided it would be OK to disable GPU overclocking is a scourge and a crime against us all, and I think the future of high performance laptops is extremely grim. The retards building the components and the designers of the receiving machines have no concept of what quality and high performance mean. To top it all off, there are too many uninformed sheeple willing to waste money on cookie-cutter disposable trash for us to reasonably expect that things will improve. I hope I turn out to be full of baloney, but I am not encouraged by all of the signs of troubled times looming on the horizon for us.
     
    @tomX likes this.
  47. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    Those are some very impressive numbers you posted there. I can't even seem to get anywhere near the 13000 of the 7970m CF.
    Mind telling me which volts/overclock you were using?

    Also, the 17000 of the 780m is insanely high! Isn't that close to what the 980m gets?
    Well, I am kinda torn at the moment. Should I try the 780s or just "upgrade" later on to a new brand?

    It's really too bad the 980s don't work.
     
  48. kenny27

    kenny27 Notebook Deity

    Reputations:
    294
    Messages:
    919
    Likes Received:
    167
    Trophy Points:
    56
    That 7970m benchmark is actually mine. I can't prove it, as I didn't bother to take a screenshot; it was about 4:30 in the morning and I was disappointed with my 3920XM overclocking efforts.
    The cards were overvolted to 1075mV and clocked at about 990/1475 (from memory, no pun intended).

    I think it is good to note there are people who are happy with the 980M upgrade. Just find out how far you can push the system before it throttles. As far as overclocking goes, I will probably get more satisfaction from my 7970Ms than I will ever get from the 980M (in the M18x R2; my 980Ms should arrive sometime this week), simply because I know I squeezed every last drop out of them! Whereas the 980M will be awesome but at the same time "gimped". There is something extremely unsatisfying about throttling...
     
    @tomX, chrusti, Mr. Fox and 1 other person like this.
  49. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,358
    Likes Received:
    70,790
    Trophy Points:
    931
    GTX 780M is the best upgrade available for the M18x R1/R2/18. Awesome overclocking, no NVIDIA clock-blocking shenanigans, throttle-free, and no drama with pure UEFI and Windows 8 crapola. Unless Alienware does the right thing and fixes the Ivy and Haswell machine BIOSes so they are Maxwell compatible--and I doubt they will--980M will never function as well in them as it should. From my own perspective, installing a 970M/980M in an Ivy or Haswell Alienware machine is largely an exercise in futility. I recommend buying an SLI Clevo machine or going back to desktops for something worth owning. Almost everything available in a new laptop now, among all brands, is pure trash. There are a few Clevo models that are an exception, but who knows how much longer they will continue to offer them, in light of the fact that mainstream consumers gobble up the garbage like candy and many novice gamers are OK with eating from a dumpster and settling for BGA garbage.
     
    @tomX and chrusti like this.
  50. CorePax

    CorePax Notebook Guru

    Reputations:
    106
    Messages:
    70
    Likes Received:
    17
    Trophy Points:
    16
    Makes me wish I were an engineer. I'd buy a Clevo SLI motherboard with a desktop CPU socket, make it fit in the M18x R2 chassis, get it working with the screen, and throw in the watercooling concept they made for the M18x R1.

    A man can dream.....
     
    reborn2003 and kenny27 like this.