The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    new GTX 980M coming

    Discussion in 'Alienware 18 and M18x' started by dandan112988, Oct 8, 2014.

  1. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I'm not going to lie, part of the allure of buying 2 GM200 behemoths is the opportunity to do a custom loop, but of course that means more $$$. Though if I did go down that route I'd probably just install a GPU loop and be done with it. My H110 with Noctua fans copes just fine with the 4930K running 4.5GHz @ 1.39V (75C max when running Prime95 small FFT), so no point spending money that doesn't have to be spent.
     
  2. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    reborn2003 and TBoneSan like this.
  3. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    That's somewhat more reasonable than those eBay prices. It's close to what I'm comfortable paying, but I think I'll try and hold out a couple more months.
    These cards really should be no more than $600.
     
  4. dandan112988

    dandan112988 Notebook Deity

    Reputations:
    550
    Messages:
    1,437
    Likes Received:
    251
    Trophy Points:
    101
    How reliable is that source? Do you think they have SLI connectors to be used in SLI configs? I know that some cards are sold without the SLI ports on them. That is a much better price to pay than $1,000. I wonder if they will work in our beloved M18x R2s.
     
  5. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    RJ Tech is reliable. They are a Clevo reseller based in California and have a good reputation for MXM upgrade parts. If I am not mistaken, Brian from T|I (5150Joker here) has purchased GPU upgrades from them before and I suspect other members of our forum have as well. Problem is, we don't know if this GPU is compatible with the R2. It probably is, but there is a chance it may not be... and they have a no-return policy. You could possibly sell it on eBay for as much as you pay for it and break even if it is not compatible.
     
    bumbo2 likes this.
  6. dandan112988

    dandan112988 Notebook Deity

    Reputations:
    550
    Messages:
    1,437
    Likes Received:
    251
    Trophy Points:
    101
    No returns... even for DOA?
     
  7. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    Yes, I meant to say that there is zero input lag with G-Sync enabled. When I was running G-Sync, before discovering 3D Vision, I kept my refresh rate at 120Hz with "OD" on "normal". Many Swift users were reporting issues at 144Hz; if you have any, this is a potential source.

    And yeah, your desktop is pretty Alpha Beast, a 4960 pushing 3x 780 Ti K|NGP|N SLI should have absolutely no problem with 3D Vision if I am getting by with 2x 780 Ti SC ACX.

    At 4.5GHz, considering I do not have an exceptional Ivy-E sample (its performance is indicative of the average), you might get AT LEAST 4.6GHz out of yours. Consider the following changes (from my post on OCN):

    I am by no means an expert, but if both Intel and Raja@Asus state that 1.4V is the maximum safe limit for the 49XX, then I am running right up to that limit (1.398V measured on the board). I do understand what you're saying though; if 1.4V is the maximum limit, then it is a bit discomforting to know you're sitting right on the threshold 24/7. But again, I am running offset voltage, so it really only sees this voltage under adequate load. Most of the time while gaming at 4.6GHz, HWiNFO64 is only reporting 1.376V (1.392V as reported by HWiNFO64 equates to 1.398V as measured on the motherboard).


    Anyhow, I have everything completely maxed out:

    CPU Current Capability: 180%
    DRAM Current: 140% (optimized)
    CPU VTT and VCCSA: 1.2V
    DRAM Voltage: 1.675V
    CPU PLL: 1.85V
    CPU LLC: High
    CPU Power Phase: Extreme
    CPU Voltage Frequency: 500
    Rampage Tweak: 3, "optimized for Ivy E"

    With 2 years and 8 months remaining on Intel's Tuning Plan, I'm not exactly worried about prematurely frying my sample; after all, it isn't exactly an exceptional performer.


    Oh and if you don't already have Intel's Tuning Plan, get it now:

    Home Page



    WOW. It must be the recession as those prices are lower than what 780M was going for just a year ago (upgradeyourlaptop). Yeah if they come down to $600 I might actually pick up a pair.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    My notebook handles a 4.4GHz 4930K, but going much further really does make the power start skyrocketing. Keeping it as cool as possible will mean you need less voltage though.
     
  9. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    I think it's so cool that Clevo actually built a laptop around the desktop 4930K; I wish Alienware were doing things like that.
     
    reborn2003, Mr. Fox, bumbo2 and 2 others like this.
  10. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Keep dreaming :p....
     
  11. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    If you do the power/mobile chip properly that makes sense for gaming. The P570WM is quite ridiculous in the weight/size department.
     
  12. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I must have one of those bottom 10% 4930K samples. The IMC was kinda wonky at 2133MHz from day 1, and now just throws errors whenever there is a load, which is why I'm forced to run RAM at 1866 and tweak timings to get the most out of it.

    4.5GHz seems to be as far as my chip is willing to go; even at 1.45V, Prime95 was an instant BSOD at 4.6GHz, and I have no interest in running 1.5V for 24/7 use, not to eke out another measly 100MHz anyway. I'm also using offset voltage, and under load it fluctuates between 1.376V and 1.392V, with the occasional spike to 1.408V. I'm not using the R4BE so I can't push the board as hard, although 120% CPU current is more than enough for 4.5GHz. I tend not to mess with LLC since Vdroop is there for a reason, and in any case going from regular to medium made not one iota of difference except for more volts and heat. DRAM is dialed back at a cool 1.5V since 1866 is a cakewalk for 2400-rated sticks.

    Got the overclocking warranty already and it's past the 30 day waiting period so good to go. Decided against trading in prematurely since I could end up with a bottom 1% 4930K given my luck, plus I kind of want to see how much and how fast degradation will occur given the settings I have right now. If after a pre-determined amount of time the chip still soldiers on, I may consider pushing the chip to unreasonable limits and then doing some suicide benches and enjoy the short-lived glory. :D
     
    D2 Ultima likes this.
  13. kiwikewl

    kiwikewl Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I have a question I've been researching, but haven't found much of a solid answer. If I put a pair of 980Ms in my M18x R2, would it change the HDMI from 1.4 to 2.0? The HDMI cables are apparently the same between the two.
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    From what I can tell no, it would not.
     
  15. kiwikewl

    kiwikewl Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    That sucks, no 4K for us.
     
  16. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Wouldn't DisplayPort be able to drive a 4k60 display no problem? At least, I'm assuming the R2 uses DP 1.2.
     
    TBoneSan likes this.
  17. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Yes, both the R1 and R2 M18x have DisplayPort 1.2, so with the right card 4K should be possible.
     
  18. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Wow, those have DisplayPort 1.2 and this Clevo only has a DisplayPort 1.1 Thunderbolt port T-T.

    Way to go, Clevo.
     
  19. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Yes, because when it was brought out, Thunderbolt only supported DisplayPort 1.1.

    The Alienware will go up to 4K with no issues.
     
  20. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    I don't get why HDMI is even still a thing. It just seems inferior in every way to DP: a larger connector, and it can't handle the same resolutions/framerates as DP. The only advantage I can think of for HDMI is that it can carry Ethernet as well, but... that seems like a very niche thing to me. It seriously makes me wonder why TVs and consoles don't feature DP connectors. Are they more expensive? (Not that the 'next gen' consoles can even drive 1080p60, let alone anything above that... pathetic, honestly.)
     
  21. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    It's all about the money. You have to license HDMI; DP is free to use. We're going to see HDMI for a long time.
     
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    The nice thing about HDMI is the availability of inexpensive but acceptable-quality monitors. With DP monitors, the selection seems much smaller and you have to pay more even when the product quality isn't better... or you use an adapter from DP to DVI or HDMI. Adapters are nice to have when you need them, but it's nicer without them.
     
  23. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    ...So wouldn't that be a great reason for companies to use DP instead of HDMI? Or alongside HDMI?

    I suppose HDMI will persist for much the same reason as the 'next gen' consoles: ignorance and/or lack of tech knowledge amongst the general public. I still find it absolutely disgusting when I see £40 HDMI cables in tech-oriented stores, never mind more general ones, when the £1.50 cable I bought off eBay is actually better than any I've seen in stores (it has ridiculously thick braiding). There is absolutely no reason to spend more than pennies on an HDMI cable unless you need a very long cable, a powered cable, or one that carries Ethernet. And yet people do, because they don't know better. So why would they use DP when they're already familiar with HDMI? That's the thinking, I suppose.

    And again, I ask why DP isn't available on more displays? If you're buying a high quality monitor, it will almost always have DP on it, but why not on the cheaper ones?
     
  24. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    HDMI was developed by TV manufacturers such as Panasonic, Sony, Philips, Toshiba, etc. Because it was made by TV manufacturers, we see it on TVs. The charge for HDMI is 4 cents per device, I believe. TV manufacturers do not want to implement the free DisplayPort standard; they wouldn't make money. Because TVs all have HDMI, this forces most consumer devices to have HDMI as well. No one wants to buy a DisplayPort-to-HDMI adapter for every device they own.
     
  25. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    A better question is, why are DVI and (especially) VGA still around? Backward/legacy compatibility, I assume?
     
    D2 Ultima likes this.
  26. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    That's exactly why. Many projectors and monitors still have VGA/DVI ports. As long as high-end equipment gets rid of these ports, low-end/mid-range products can keep them. I was surprised to see VGA on a 750 Ti and DVI on a GTX 980, since these are new products. Maybe in 2015 or 2016 we won't see them anymore.
     
  27. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    It's going to be really valuable to keep that legacy hardware, at least on business machines, for a long time. Eliminating VGA completely would cause financial harm to many consumers as well. A high percentage of users who are not enthusiasts probably see no point in buying new computer hardware or software. From a business perspective, it would totally suck to have to toss out perfectly good projectors and still-working CRT displays that were paid for years ago and are still serving their business purpose flawlessly. Most cost-conscious companies don't waste money on things that don't earn money until they are backed into a corner and left with no alternative except to upgrade. Software is the same. Windows 7 will likely retain a huge market share until Microsoft ends support, as it recently did for Windows XP. My employer waited until the 11th hour to deploy around 30,000 new machines with Windows 7, and that was more or less only because they were forced to spend money on something that essentially provides zero ROI. With BGA CPUs being rammed down our throats, I may have to follow the same model.
     
    D2 Ultima likes this.
  28. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Okay, I can kind of understand why they wouldn't include DP on most TVs. However, as far as I'm concerned, it should be the standard for displays designed for computers. It's the best connector currently available, and as it's free, it presumably costs pennies to implement.

    There are a LOT of companies, organisations, and other tech installations out there that are still reliant on older standards.

    A good example of this would be projectors in educational establishments. If they aren't relatively new projectors, the odds are that they use an older connector. Those projectors will likely see use until they fail completely, as there's little point in undergoing the cost of replacing them sooner. Things like that means that VGA may still be relevant for at least another decade.

    Attitudes like that make me sick. Managers and such can only think in terms of ROI, which is completely the wrong way to approach computer software and hardware, particularly where security is concerned, in any application that depends on it.

    All of the recent store breaches in America that have led to millions of credit card details being stolen? How many would have been averted if American banks and stores had been solely using Chip and PIN by now, which has been standard in Europe for a decade? I had my first bank card 8 years ago and it used Chip and PIN; meanwhile, the credit cards my parents recently got for traveling in America didn't. If those stores had spent a couple thousand on a Chip and PIN reader at every PoS terminal, would the details have been stolen? Probably not. But the managers couldn't see the point in implementing something like that.

    Same thing with the update to Windows 7. It's disgusting that so many companies still haven't swapped over when it's a major security risk. But why should they care about their customers' data? Until a breach happens, of course.


    On a side note, Mr. Fox, I've just noticed your signature... my lord, that ALX-18 would be an awesome machine, but I also think it would be a little excessive for a laptop. It would have to be HUGE to accommodate all of that. In my ideal world, AW would work with Intel to produce a few new mobile CPUs with a higher TDP (perhaps around 65W) and fully unlocked multipliers, which, when combined with proper cooling, should allow for 4.5GHz quads - the same that most gamers and even most enthusiasts aim for on a desktop. Plus, I don't believe a third 980M would be worth the extra space and money, seeing as SLI scales badly past the second GPU. Still, I'd much rather see that ALX-18 than the BGA crap they're now implementing... the sad part is that I would want the new Alienware 13 if it actually had a proper, socketed, 47W TDP quad core that was properly cooled. It would be the ideal companion laptop for the desktop I plan on building at some point in the next year or two.
     
    Mr. Fox likes this.
  29. Rhubarb

    Rhubarb Notebook Geek

    Reputations:
    2
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    How would you fit 3 mobile GPUs and a desktop CPU in a non-custom chassis?
     
    Mr. Fox likes this.
  30. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah, SLI scaling past the 2nd card is usually not the greatest, and 4-way SLI is still pretty wonky these days. All that being said, does the MXM standard even support 3-way SLI?
     
    Mr. Fox likes this.
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    @EviLCorsaiR - I agree that protection of sensitive information, such as customers' private information, is critical. For very similar reasons, it is equally important to safeguard proprietary trade secrets and the business intelligence that helps a company maintain a competitive advantage. Identity theft and corporate espionage are both despicable. Windows 7 is a secure OS and should be what all companies use, for the reasons stated. I don't see a problem with companies not wanting to be forced into wasting tons of money replacing hardware and peripherals. I think keeping legacy hardware support available for things like VGA output is not placing anyone at risk and makes good sense. I still think it would suck for consumers and businesses to be forced into spending money replacing hardware that is still working perfectly for its intended purpose.

    The ALX-18 would be awesome, indeed. I never view anything that makes a computer more powerful or run faster as being excessive, no matter how extreme or power hungry it is, because that's something I value above everything else. I'd love to have Quad SLI in a laptop, but I thought it would be more reasonable to expect 3-way, and I think that is doable. I personally will not purchase (or build) any gaming or benching beast with only one GPU any more. They just bore me to death, and the multi-GPU performance increase is clear and unmistakable, even when it's not scaling as well as one might hope. When it doesn't work as well as it should, the fault lies with incompetent game developers and driver development. Even when the benefits are diminished, in almost all cases the multi-GPU system is more powerful.

    If not, it's a shame they haven't put forth the effort to make it so.
     
  32. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Oh, I absolutely don't see a problem with companies not throwing money at new hardware/software without good reason, and legacy support is definitely a good thing. I just disagree with the culture of saving as much money as possible on hardware and software that's critical to security of any kind. I'm pretty much forced into using a password manager these days because there are very few online services that I actually trust to properly encrypt my password and keep it secure, so it pretty much necessitates using different (and randomly generated) passwords for everything with two-factor authentication to access the vault...

    When it comes to a desktop, I absolutely agree with you. But throwing a desktop CPU into a laptop - never mind one of the Haswell-E monsters - and three graphics cards would necessitate such ludicrously large and heavy cooling that at that point, I really would say just get a desktop.

    I think that with intelligent design, it's possible to get two 100+W GPUs plus a CPU in between current mobile and desktop CPUs in a chassis the size of the 18 and cool it all effectively. If they needed extra room, they could always get rid of the optical bay, it uses up a LOT of space and so few people use it these days that I'd rather have an external USB optical drive and use that space for more stuff in my laptop. It'd make room for a larger battery, more storage, and larger heatsinks and fans.

    The benefits of going multi-GPU are definitely there - a second GPU scales really well in most games - but I've seen very few cases where a third adds more than a few percent extra, and even fewer where a fourth helps. It's partly down to lazy programming, but it's a very hard thing to program for, and there doesn't seem to be a great deal of purpose in targeting it when the size of the market with more than two GPUs is tiny.

    In fact, I'd still much rather have a single, very powerful GPU than two GPUs that can theoretically beat the more powerful single GPU, provided the gap isn't too large. SLI is great in the games that support it, but there are too many out there that see less-than-great scaling, and while that IS down to lazy programming, I'd still like to be able to play those games at higher settings and framerates than I could with a single 980M.

    I'm still surprised no manufacturer out there has tried to put a full desktop graphics card in a laptop. Shrink the PCB as much as possible without compromising performance, fit it with an MXM interface and additional pins for the extra power, and a very large heatsink (larger than two of the sinks traditionally used to cool 100W mobile GPUs). I see no reason why it's not possible to stick a 980 in there, or even the inevitable 980 Ti or Titan 2. I'd prefer that to 980M SLI.
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I have no idea how that ALX-18 would ever fit in any kind of mobile form factor XD. I could imagine it needing a 660W PSU just to turn on, and it'd be like a 21"er XD
     
  34. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    How powerful is the GTX 980 compared to dual 980M SLI?
     
  35. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Two 980Ms will be significantly faster in situations where SLI gives good scaling. The 980M uses the same GPU core as the desktop 970 - i.e. it has 75% of the shader count the desktop 980 has - as well as lower clock speeds. At a rough guess, 980M SLI is probably around 30% faster than a 980 (assuming perfect SLI scaling), depending on how well the 980M boosts (i.e. how well it's cooled).

    But I do expect a faster nVidia desktop GPU to come out at some point. The 980 is only 165W; I don't see nVidia keeping it as their flagship GPU when they could almost double the size of the die for an absolute monster of a GPU within the limits of a single-GPU desktop card. nVidia are probably looking to maximise their profits (they ARE a business, after all) by selling as many 980s to enthusiasts as they can. I would then expect them to release a Titan 2 based on a larger GPU core (higher profit margins than a standard gaming graphics card, obviously), and then probably a 980 Ti using the same GPU core without the Titan's DP performance several months after that.
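    The "around 30% faster" rough guess above can be sanity-checked with a quick back-of-envelope calculation. This is only an editor's sketch: the shader counts and clocks are approximate launch specs (not figures from this thread), and throughput is crudely modelled as shaders times clock.

```python
# Back-of-envelope estimate of 980M SLI vs. a single desktop GTX 980,
# assuming throughput scales with shader count x clock speed.
# Specs below are approximate launch figures, not measured values.

GTX_980_SHADERS = 2048   # desktop GTX 980 CUDA cores
GTX_980_CLOCK = 1216     # typical desktop boost clock, MHz
GTX_980M_SHADERS = 1536  # 980M: 75% of the desktop part's shaders
GTX_980M_CLOCK = 1038    # 980M base clock, MHz (boost varies with cooling)

def relative_throughput(shaders, clock, base_shaders, base_clock):
    """Crude throughput ratio: shaders x clock, relative to a baseline card."""
    return (shaders * clock) / (base_shaders * base_clock)

single_980m = relative_throughput(GTX_980M_SHADERS, GTX_980M_CLOCK,
                                  GTX_980_SHADERS, GTX_980_CLOCK)
# Perfect SLI scaling would simply double the single-card figure.
sli_980m = 2 * single_980m

print(f"one 980M ~ {single_980m:.0%} of a GTX 980")       # ~64%
print(f"980M SLI ~ {sli_980m:.0%} of a GTX 980 (perfect scaling)")  # ~128%
```

    Under those assumed clocks this lands near the ~30% figure; real-world boost behaviour and SLI scaling will move it either way.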
     
  36. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    Maybe they will release something like a "Goliath" or "Leviathan" to replace the "Titan-Z", with a triple-processor 980 with 12,000 CUDA/shader cores and 16GB of vRAM. Say, 1200 base / 1500 boost / 1800 max clock. I think I could put up with just one of those. :D Just don't go and do something stupid, like pair it up with a 2.5GHz Core i3 CPU or a Celeron welded to the stinking mobo.
     
  37. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The funny thing is, Mr. Fox wants all this power to benchmark... I want all that power to actually use in everyday usage XD.
     
  38. GTO_PAO11

    GTO_PAO11 Notebook Deity

    Reputations:
    173
    Messages:
    1,309
    Likes Received:
    205
    Trophy Points:
    81
    What's the mobile SLI equivalent of the desktop 980? 780M SLI?
     
  39. vulcan78

    vulcan78 Notebook Deity

    Reputations:
    590
    Messages:
    1,523
    Likes Received:
    352
    Trophy Points:
    101
    First off, very impressive scores, Pathfinder; a 21k GPU score is creeping up on non-reference GTX 780 Ti SLI territory (here is my 22.5k GPU run at default clocks; optimal OC clocks in signature). But what kind of wattage are we talking about here overclocked? 175-200W per card? This question has relevance for anyone here in the M18x R2 forum, as you will absolutely need two 330W PSUs if you intend to go the 980M SLI route and enjoy maximal OC performance.

    An 18k GPU score at default clocks is definitely the 80% of GTX 980 SLI that Nvidia claimed 980M SLI is good for.

    As far as reference GTX 980 performance, I believe a single 980 is good for a 13k GPU score in Firestrike, about the performance of a pair of 780Ms with an aggressive OC:

    Nvidia GeForce GTX 970 and 980 reference review - DX11: Futuremark 3DMark 2013
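    The Firestrike figures quoted here can be cross-checked with a quick calculation. Editor's sketch only: the 13k and 18k scores come from this post, but the 87% two-way SLI scaling factor is an assumption, not a figure from the thread.

```python
# Sanity-checking the Firestrike GPU scores quoted above, assuming a
# typical two-way SLI efficiency rather than a perfect 2.0x.

single_980_score = 13_000  # single reference GTX 980, Firestrike GPU score (quoted)
sli_efficiency = 0.87      # assumed: realistic 2-way SLI scaling factor

gtx_980_sli = single_980_score * 2 * sli_efficiency  # estimated 980 SLI score
stock_980m_sli = 18_000                              # quoted stock 980M SLI score

ratio = stock_980m_sli / gtx_980_sli
print(f"estimated GTX 980 SLI GPU score: {gtx_980_sli:.0f}")  # ~22600
print(f"980M SLI at stock is ~{ratio:.0%} of that")           # ~80%
```

    With that assumed scaling factor, the 18k stock score does work out to roughly the 80%-of-980-SLI figure Nvidia claimed.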
     
  40. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    They probably won't ever bother, seeing as how no laptop manufacturer has put forward a concept design for a tri-SLI laptop. And given the direction the market is taking these days, that is unlikely to ever come to fruition, sadly.

    Actually, nVidia tried that with the 480M, and the results were so disastrous they never took the "take the big die and cut it down for laptops" approach ever again. I think the main issue is there simply isn't enough room on an MXM PCB to fit all the power circuitry needed to feed the power-hungry big dies without significantly cutting them down, and as the 480M has shown, that is simply not a viable way to go about things.
     
    D2 Ultima likes this.
  41. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    I probably should have specified that it would be two to three times larger than a standard MXM PCB. I absolutely don't think it would be possible to fit all of that power delivery onto something the size of a 980M without serious problems. But if you take something two to three times the size, it's not too far off the size of the full desktop PCB.

    My logic being that if they can fit the power delivery for a 980M onto a standard MXM PCB, they should be able to fit twice the power delivery on a board twice the size.
     
  42. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    780M SLI is nowhere close to the desktop GTX 980.


    GTX 970M SLI is more or less 10% faster than the desktop GTX 980. You can see that in this review if you take away all the CPU benchmarks and useless synthetic GPU benchmarks and just look at the games tested.
    That's why I'm thinking about going 970M SLI, because anything more will be waaaay too much for 1080p. I'd rather have cool-running GPUs and overclock them if it ever becomes necessary.
    GTX 980 is for 1600p gaming and above.
     
  43. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Not for 1080p 144Hz, or for making sure you can max out every release coming in the next few years. Not to mention the consumer version of the Oculus Rift will likely be running 1600p or 3K resolution and demand framerates well above 60fps.

    If you're just going for 1080p60 on a notebook with no intention to ever use a higher refresh rate display, the Oculus Rift, or maxing upcoming games like Star Citizen, then 970M SLI is probably sufficient. But then, if that's what you're going for, I'd just go for a single 980M and never deal with the incompatibilities and other issues that SLI can create. Once again, I still favour a single high-end GPU over two lower-end GPUs, even when the two lower-end GPUs can exceed the performance of the single GPU given good SLI scaling - as long as the difference between the single and dual GPUs isn't absolutely huge, that is. It just means less hassle dealing with the problems of SLI, and it avoids the extra heat and power consumption, although the latter two probably don't matter much on a dual-GPU laptop which has individual intakes and heatsinks for the GPUs. (So, for example, if I were building a desktop, I'd rather have a single 980 than two 970s, particularly as the former allows for much smaller mini-ITX based builds.)
     
  44. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    980M SLI is perfect for higher refresh rates (where a faster CPU is important) or higher resolutions. 970M SLI won't struggle at 1080p 60Hz for some time.
     
  45. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    LOL, there's no such thing as "too much" performance... ever, even if 970M is "adequate" for 1080p. The comments about "overkill" always make me laugh. The only thing that is ever overkill is the prices.
     
    D2 Ultima, TBoneSan and Rotary Heart like this.
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Same thing I said to someone. As far as I'm concerned, if you have "too much graphics power", then it's time to make your game look better XD.
     
    Mr. Fox, TBoneSan and n=1 like this.
  47. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    Exactly. Great minds think alike.
     
    D2 Ultima likes this.
  48. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    And that's why I run all my games with DSR enabled. :D

    I'll say this as well: if you really think you have frames to spare, crank up that MSAA and SSAA, and I promise you'll find you may even start to struggle with 60 FPS at 1080p with two desktop 980s (although one might argue whether using SSAA and the like really constitutes gaming at 1080p, but I digress).
     
    Mr. Fox, D2 Ultima and TBoneSan like this.
  49. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    I wish they'd enable DSR on the damn 880M already. Don't suppose there's any way to mod it in?
     
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    There may be a way to do it. I think j95 has been trying to force it through driver tweaks. He did some tweaks that bring CUDA back into full force and I'm liking that... It made a gigantic improvement in OCL performance and now I can use CUDA again for my video transcoding. I don't understand why NVIDIA disabled that. They may add DSR functionality later for the 780M and 880M onward, after they have tricked enough people into buying new systems just to get 900M series GPUs.
     