I'm not going to lie, part of the allure of buying 2 GM200 behemoths is the opportunity to do a custom loop, but of course that means more $$$. Though if I did go down that route I'd probably just install a GPU loop and be done with it. My H110 with Noctua fans copes just fine with the 4930K running 4.5GHz @ 1.39V (75C max when running Prime95 small FFT), so no point spending money that doesn't have to be spent.
-
Well, the GTX 980M is available for $710 at RJTECH.com.
reborn2003 and TBoneSan like this. -
That's somewhat more reasonable than these eBay prices. It's close to what I'm comfortable paying, but I think I'll try to hold out a couple more months.
These cards really should be no more than $600. -
-
RJ Tech is reliable. They are a Clevo reseller based in California and have a good reputation for MXM upgrade parts. If I am not mistaken, Brian from T|I (5150Joker here) has purchased GPU upgrades from them before and I suspect other members of our forum have as well. Problem is, we don't know if this GPU is compatible with the R2. It probably is, but there is a chance it may not be... and they have a no-return policy. You could possibly sell it on eBay for as much as you pay for it and break even if it is not compatible.
bumbo2 likes this. -
-
And yeah, your desktop is a pretty Alpha Beast; a 4960 pushing 3x 780 Ti K|NGP|N SLI should have absolutely no problem with 3D Vision if I am getting by with 2x 780 Ti SC ACX.
I am by no means an expert, but if both Intel and Raja@Asus state that 1.4V is the maximum safe limit for 49XX then I am running right up to that limit (1.398V measured on the board). I do understand what you're saying though; if 1.4V is the maximum limit then it is a bit discomforting to know you're sitting right on the threshold 24/7. But again, I am running offset voltage so it really only sees this voltage under adequate load; most of the time while gaming at 4.6GHz HWiNFO64 is only reporting 1.376V (1.392V as reported by HWiNFO64 equates to 1.398V as measured on the motherboard).
Anyhow, I have everything completely maxed out:
CPU Current Capability: 180%
DRAM Current: 140% (optimized)
CPU VTT and VCCSA: 1.2V
DRAM Voltage: 1.675V
CPU PLL: 1.85V
CPU LLC: High
CPU Power Phase: Extreme
CPU Voltage Frequency: 500
Rampage Tweak: 3, "optimized for Ivy E"
With 2 years and 8 months remaining on Intel's Tuning Plan I'm not exactly worried about prematurely frying my sample; after all, it isn't exactly an exceptional performer.
Oh and if you don't already have Intel's Tuning Plan, get it now:
Home Page
-
Meaker@Sager Company Representative
My notebook handles a 4.4GHz 4930K, but going much further really does make the power start skyrocketing. Keeping it as cool as possible will mean you need less voltage though.
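To put rough numbers on that, here is a minimal back-of-envelope sketch assuming the common dynamic-power rule of thumb P ≈ f·V²; the voltages below are hypothetical illustrations, not measurements from any particular 4930K:

```python
# A rough sketch of why power climbs so fast past ~4.4GHz, assuming the
# common dynamic-power approximation P ~ f * V^2. The voltage figures are
# made-up illustrative numbers, not measurements from a specific chip.

def relative_power(f_old_ghz, v_old, f_new_ghz, v_new):
    """Factor by which dynamic power grows going from (f_old, v_old) to (f_new, v_new)."""
    return (f_new_ghz / f_old_ghz) * (v_new / v_old) ** 2

# Hypothetical example: 4.4GHz @ 1.30V versus 4.6GHz @ 1.40V
factor = relative_power(4.4, 1.30, 4.6, 1.40)
print(f"~{(factor - 1) * 100:.0f}% more power for ~5% more clock")  # prints ~21%
```

In other words, a small extra voltage bump costs disproportionately more heat than the clock speed it buys, which is why better cooling (and therefore lower required voltage) pays off.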
-
reborn2003, Mr. Fox, bumbo2 and 2 others like this.
-
....
-
Meaker@Sager Company Representative
If you do the power/mobile chip properly that makes sense for gaming. The P570WM is quite ridiculous in the weight/size department.
-
4.5GHz seems to be as far as my chip is willing to go; even at 1.45V, Prime95 was an instant BSOD at 4.6GHz, and I have no interest in running 1.5V for 24/7 use, not to eke out another measly 100MHz anyway. Also using offset voltage, and under load it fluctuates between 1.376V and 1.392V, with the occasional spike to 1.408V. I'm not using the R4BE so can't push the board as hard, although 120% CPU current is more than enough for 4.5GHz. I tend not to mess with LLC since Vdroop is there for a reason, and in any case going from regular to medium made not one iota of difference except for more volts and heat. DRAM is dialed back at a cool 1.5V since 1866 is a cakewalk for 2400-rated sticks.
Got the overclocking warranty already and it's past the 30 day waiting period, so good to go. Decided against trading in prematurely since I could end up with a bottom 1% 4930K given my luck, plus I kind of want to see how much and how fast degradation will occur given the settings I have right now. If after a pre-determined amount of time the chip still soldiers on, I may consider pushing the chip to unreasonable limits and then doing some suicide benches and enjoying the short-lived glory.
D2 Ultima likes this. -
-
Meaker@Sager Company Representative
From what I can tell no, it would not.
-
TBoneSan likes this.
-
Yes, both the M18x R1/R2 have DisplayPort 1.2, so with the right card 4K should be possible.
-
Wow, those have DisplayPort 1.2 and this Clevo only has a DisplayPort 1.1 Thunderbolt port T-T.
Way to go, Clevo. -
Meaker@Sager Company Representative
Yes, because when it was brought out, Thunderbolt only supported DisplayPort 1.1.
The Alienware will go up to 4K with no issues. -
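For anyone wondering why the DisplayPort version matters so much for 4K, a quick bandwidth check makes the gap obvious. This is only a sketch using the published per-lane rates and 8b/10b encoding overhead, and it ignores blanking intervals, so the real requirement is slightly higher still:

```python
# Ballpark 4K60 bandwidth check. Link rates are the published DisplayPort
# figures (4 lanes, 8b/10b encoding); the payload counts active pixels only.

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate active-pixel payload in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = video_gbps(3840, 2160, 60)      # ~11.9 Gbit/s before blanking overhead
dp11 = 4 * 2.7 * 0.8                     # DP 1.1, HBR:  ~8.6 Gbit/s usable
dp12 = 4 * 5.4 * 0.8                     # DP 1.2, HBR2: ~17.3 Gbit/s usable

print(f"4K60 needs ~{needed:.1f} Gbit/s; DP 1.1 carries ~{dp11:.1f}, DP 1.2 ~{dp12:.1f}")
```

So a DP 1.1 link simply doesn't have the headroom for 4K at 60Hz, while DP 1.2 clears it comfortably.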
I don't get why HDMI is even still a thing. It just seems inferior to DP in every way: a larger connector, and it can't handle the same resolutions/framerates as DP. The only advantage I can think of for HDMI is that it can carry Ethernet as well, but...that seems like a very niche thing to me. It seriously makes me wonder why TVs and consoles don't feature DP connectors. Are they more expensive? (Not that the 'next gen' consoles can even drive 1080p60, let alone anything above that...pathetic, honestly.)
-
-
The nice thing about HDMI is the availability of inexpensive but acceptable quality monitors. To get DP monitors it seems the selection is much smaller and you have to pay more even when the product quality isn't better... or use an adapter to DVI or HDMI from DP. Adapters are nice to have when you need to use them, but it's nicer without them.
-
I suppose HDMI will persist for much the same reason as the 'next gen' consoles: ignorance and/or lack of tech knowledge amongst the general public. I still find it absolutely disgusting when I see £40 HDMI cables in tech-oriented stores, never mind more general ones, when the £1.50 cable I bought off eBay is actually better than any I've seen in stores (it has ridiculously thick braiding). There is absolutely no reason to spend more than pennies on an HDMI cable unless you need a very long cable, a powered cable, or one that carries Ethernet. And yet people do, because they don't know better. So why would they switch to DP when they're already familiar with HDMI?
-
-
A better question is, why is DVI (and especially VGA) still around? Backward/legacy compatibility, I assume?
D2 Ultima likes this. -
-
It's going to be really valuable to keep that legacy hardware, at least on business machines, for a long time. Eliminating VGA completely would cause financial harm to many consumers as well. A high percentage of users that are not enthusiasts probably see no point in buying new computer hardware or software. From a business perspective, it would totally suck to have to toss out perfectly good projectors and still-working CRT displays that were paid for years ago and are still serving their business purpose flawlessly. Most cost-conscious companies don't waste money on things that don't earn money until they are backed into a corner and left with no alternative except to upgrade. Software is the same. Windows 7 will likely retain a huge market share until Microsoft ends support, as they recently did for Windows XP. My employer waited until the 11th hour to deploy around 30,000 new machines with Windows 7, and that was more or less only because they were forced to spend money on something that essentially provides zero ROI. With BGA CPUs being rammed down our throats, I may have to follow the same model.
D2 Ultima likes this. -
A good example of this would be projectors in educational establishments. If they aren't relatively new projectors, the odds are that they use an older connector. Those projectors will likely see use until they fail completely, as there's little point in incurring the cost of replacing them sooner. Things like that mean that VGA may still be relevant for at least another decade.
All of the recent store breaches in America that have led to millions of credit card details being stolen? How many would have been averted if the American banks and stores were solely using Chip and PIN by now, which has been standard in Europe for a decade? I had my first bank card 8 years ago and it used Chip and PIN; meanwhile, the credit cards my parents recently got for traveling in America didn't. If those stores had spent a couple thousand for a Chip and PIN reader at every PoS terminal, would the details have been stolen? Probably not. But the managers couldn't see the point in implementing something like that.
Same thing with the update to Windows 7. It's disgusting that so many companies still haven't swapped over when it's a major security risk. But why should they care about their customers' data? Until a breach happens, of course.
On a side note, Mr. Fox, I've just noticed your signature...my lord, that ALX-18 would be an awesome machine, but I also think it would be a little excessive for a laptop. It would have to be HUGE to facilitate all of that. In my ideal world, AW would work with Intel to produce a few new mobile CPUs with a higher TDP (perhaps around 65W) and fully unlocked multipliers, which, when combined with proper cooling, should allow for 4.5GHz quads - the same that most gamers and even most enthusiasts aim for on a desktop. Plus I don't believe a third 980M would be worth the extra space and money, seeing as SLi scales badly past the second GPU. Still, I'd much rather see that ALX-18 than the BGA crap they're now implementing...the sad part is that I would want the new Alienware 13 if it actually had a proper, socketed, 47W TDP quad core that was properly cooled. It would be the ideal companion laptop for the desktop I plan on building at some point in the next year or two.
Mr. Fox likes this. -
How would you fit 3 mobile GPUs and a desktop CPU in a non-custom chassis?
Mr. Fox likes this. -
Yeah SLI scaling past the 2nd card is usually not the greatest, and 4 way SLI is still pretty wonky these days. All that being said, does the MXM standard even support 3-way SLI?
Mr. Fox likes this. -
@EviLCorsaiR - I agree that protection of sensitive information, such as customers' private information, is critical. For very similar reasons, it is equally important to safeguard proprietary trade secrets and business intelligence that helps a company maintain a competitive advantage. Identity theft and corporate espionage are both despicable. Windows 7 is a secure OS and should be what all companies use for the reasons stated. I don't see a problem with companies not wanting to be forced into wasting tons of money replacing hardware and peripherals. I think keeping legacy hardware support available for things like VGA output is not placing anyone at risk and makes good sense. I still think it would suck for consumers and businesses to be forced into spending money replacing hardware that is still working perfectly for the purpose intended.
ALX-18 would be awesome, indeed. I never view anything that makes a computer more powerful or run faster as being excessive, no matter how extreme or power hungry it is, because that's something I value above everything else. I'd love to have Quad SLI in a laptop, but I thought it would be more reasonable to expect 3-way, and I think that is doable. I personally will not purchase (or build) any gaming or benching beast with only one GPU any more. They just bore me to death, and the multi-GPU performance increase is clear and unmistakable, even when it's not scaling as well as one might hope. When it doesn't work as well as it should, the fault lies with incompetent game developers and driver development. Even when the benefits are diminished, in almost all cases the multi-GPU system is more powerful.
-
I think that with intelligent design, it's possible to get two 100+W GPUs, plus a CPU sitting somewhere between current mobile and desktop CPUs, into a chassis the size of the 18 and cool it all effectively. If they needed extra room, they could always get rid of the optical bay; it uses up a LOT of space, and so few people use it these days that I'd rather have an external USB optical drive and use that space for more stuff in my laptop. It'd make room for a larger battery, more storage, and larger heatsinks and fans.
The benefits of going multi-GPU are definitely there, a second GPU scales really well in most games, but I've seen very few cases where a third adds on more than a few percent extra, and even fewer where the fourth helps. It's partly down to lazy programming, but it's a very hard thing to program for, and there doesn't seem to be a great deal of purpose in targeting it when the size of the market with more than two GPUs is tiny.
In fact, I'd still much rather have a single, very powerful GPU than two GPUs that can theoretically beat the more powerful single GPU, provided the gap isn't too large. SLi is great in the games that support it, but there are too many out there that see less-than-great scaling, and while that IS down to lazy programming, I'd still like to be able to play those games at higher settings and framerates than I could with a single 980M.
I'm still surprised no manufacturer out there has tried to put a full desktop graphics card in a laptop. Shrink the PCB as much as possible without compromising performance, fit it with an MXM interface and additional pins for the extra power, and a very large heatsink (larger than two of the sinks traditionally used to cool 100W mobile GPUs). I see no reason why it's not possible to stick a 980 in there, or even the inevitable 980 Ti or Titan 2. I'd prefer that to 980M SLi. -
I have no idea how that ALX-18 will ever fit in any kind of mobile form factor XD. I could imagine it needing a 660W PSU just to turn on, and it'd be like a 21"er XD
-
How powerful is the GTX 980 compared to dual 980M SLI?
-
But I do expect a faster nVidia desktop GPU to come out at some point. The 980 is only 165W, and I don't see nVidia keeping it as their flagship GPU when they could almost double the size of the die for an absolute monster of a GPU within the limits of a single-GPU desktop card. nVidia are probably looking to maximise their profits (they ARE a business, after all) by selling as many 980s to the enthusiasts as they can. I would then expect them to release a Titan 2 based on a larger GPU core (higher profit margins than a standard gaming graphics card, obviously), and then probably a 980 Ti using the same GPU core without the Titan's DP performance several months after that. -
Maybe they will release something like a "Goliath" or "Leviathan" to replace the "Titan-Z": a triple-processor 980 with 12,000 CUDA/shader cores and 16GB of vRAM. Say, 1200 base / 1500 boost / 1800 max clock. I think I could put up with just one of those.
Just don't go and do something stupid, like pair it up with a 2.5GHz Core i3 CPU or Celeron welded to the stinking mobo.
-
An 18k GPU score at default clocks is definitely in line with the 80% of GTX 980 SLI that Nvidia claimed 980M SLI is good for.
As far as reference GTX 980 performance goes, I believe a single 980 is good for a 13k GPU score in Fire Strike, roughly the performance of a pair of 780Ms with an aggressive OC:
Nvidia GeForce GTX 970 and 980 reference review - DX11: Futuremark 3DMark 2013 -
D2 Ultima likes this. -
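For the curious, here is a small sketch that backs out what those quoted Fire Strike numbers imply about SLI scaling; the 80% split is Nvidia's marketing claim rather than a measurement, so treat the result as illustrative arithmetic only:

```python
# Backing out what the "80% of GTX 980 SLI" claim implies, using the Fire
# Strike GPU scores quoted above. Purely illustrative arithmetic.

single_980 = 13_000      # reference GTX 980 GPU score quoted above
gtx_980m_sli = 18_000    # 980M SLI GPU score quoted above

implied_980_sli = gtx_980m_sli / 0.80               # ~22.5k if the 80% figure holds
implied_scaling = implied_980_sli / single_980 - 1  # benefit from the second 980

print(f"Implied 980 SLI score: ~{implied_980_sli:,.0f}")
print(f"Implied benefit from the second 980: ~{implied_scaling:.0%}")  # ~73%
```

That puts the implied second-card benefit in the 70-75% range, which is consistent with the earlier comments about scaling being good but well short of perfect.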
My logic being that if they can fit the power delivery for a 980M onto a standard MXM PCB, they should be able to fit twice the power delivery on a board twice the size. -
GTX 970M SLI is more or less 10% faster than desktop GTX 980. You can see that in this review if you take away all the CPU benchmarks and useless synthetic GPU benchmarks and just look at the games tested.
That's why I'm thinking about going 970M SLI, because anything more will be waaaay too much for 1080p. I'd rather have cool-running GPUs and overclock them if it ever becomes necessary.
GTX 980 is for 1600p gaming and above. -
If you're just going for 1080p60 on a notebook with no intentions to ever use a higher refresh rate display or the Oculus Rift or maxing upcoming games like Star Citizen, then 970M SLi is probably sufficient. But then if that's what you're going for, I'd just go for a single 980M and never deal with the incompatibilities and other issues that SLi can create. Once again, I still favour a single high end GPU over two lower end GPUs, even when the two lower end GPUs can exceed the performance of the single GPU given good SLi scaling. As long as the difference between the single and dual GPUs isn't absolutely huge, that is. It just means less hassle dealing with the problems of SLi as well as more heat and power consumption, although the latter two probably don't matter much on a dual-GPU laptop system which has individual intakes and heatsinks for the GPUs. (So, for example, if I were building a desktop, I'd rather have a single 980 than two 970s, particularly as the former allows for much smaller mini-ITX based builds.) -
Meaker@Sager Company Representative
980M SLI is perfect for higher refresh rates (where a faster CPU is important) or higher resolutions. 970M SLI won't struggle at 1080p 60Hz for some time.
-
LOL, there's no such thing as "too much" performance... ever, even if 970M is "adequate" for 1080p. The comments about "overkill" always make me laugh. The only thing that is ever overkill is the prices.
D2 Ultima, TBoneSan and Rotary Heart like this. -
-
I'll say this as well: if you really think you have too many frames to spare, crank up that MSAA and SSAA, and I promise you'll find that you may even start to struggle with 60 FPS at 1080p with two desktop 980s (although one might argue whether using SSAA and the like really constitutes gaming at 1080p, but I digress). -
-
There may be a way to do it. I think j95 has been trying to force it through driver tweaks. He did some tweaks that force CUDA back into full force and I'm liking that... It made a gigantic improvement in OCL performance and now I can use CUDA again with my video transcoding. I don't understand why NVIDIA disabled that. They may add DSR functionality later for the 780M and 880M onward, after they have tricked enough people into buying new systems just to get 900M series GPUs.