Besides the next Maxwell refresh: with Pascal in 2016, are you all positive it will be MXM compatible?
Or, to be more specific, will the Pascal architecture be compatible with 2014/2015 AW/Clevo/MSI notebooks?
-
I'm figuring it will be. PCIe 4.0 will not be finalized until late 2016, according to the most recent information from PCI-SIG. Pascal will be out before then, which means it will be PCIe 3.0; I think this is a safe assumption to make. Since it will be PCIe 3.0, there is no benefit to changing the socket. MXM 3.0b is already as fast as PCIe 3.0 can go, so it will be enough for Pascal cards. The only thing that changing the socket would do is cut off the upgrade path for existing MXM owners, and I don't see Nvidia doing that when they can still milk us on upgrades.
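For anyone wondering why MXM 3.0b's x16 link is "as fast as PCIe 3.0 can go", here's the back-of-the-envelope bandwidth arithmetic (a sketch using the standard published figures, not a quote from the MXM spec):

```python
# Rough PCIe 3.0 bandwidth arithmetic (illustrative only).
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b line coding.
GT_PER_S = 8                     # gigatransfers per second per lane
ENCODING = 128 / 130             # 128b/130b coding efficiency (~98.5%)
LANES = 16                       # MXM 3.0b carries a full x16 link

per_lane_GBps = GT_PER_S * ENCODING / 8   # 8 bits per byte
total_GBps = per_lane_GBps * LANES
print(f"PCIe 3.0 x16: ~{total_GBps:.2f} GB/s per direction")
# → PCIe 3.0 x16: ~15.75 GB/s per direction
```

PCIe 4.0 would double that to roughly 31.5 GB/s, but no GPU of this era comes close to saturating even the 3.0 figure in games, which is why there's no pressure to change the slot.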
-
-
NVLink is for ganging more than 4 GPUs together in a single computer. The tech is useless in notebooks with one or at most two GPUs. Yes, I am aware of Eurocom's "mobile supercomputer" with four MXM slots but that's a unique case and even that wouldn't benefit from NVLink.
-
Four MXM?
-
Sir. You have been trolled. There is no such thing as a quad-SLI notebook. There is no such thing as an i7 with 8 cores and 16 threads on the X79 chipset. That is an X7200 or a P570WM shell they took a picture of and photoshopped.
BELIEVE ME. IF THERE WAS A QUAD-SLI NOTEBOOK. MR. FOX WOULD HAVE HAD IT.
The notebook with the most GPUs in it was Clevo's 2006 tri-SLI notebook. It was the only one of its kind and I believe it was a 19" beast.
Also, that article is dated April 1st, 2012.
-
Lol,
4 x MXM GG
-
Look at the date
-
-
Well, we aren't in 2016 yet. I'll lock this one and start a 2016 rumors thread January 1st. Lol
-
NVLink is a proprietary mezzanine card that replaces the PCIe slots on a specially designed motherboard. It's strictly for GPU compute stuff.
-
D'oh. I have, indeed, been trolled.
-
I guess then we can still hope for an MXM Pascal chip...
-
lol guys chill! First of all, let's always keep the respect up.
-
Just don't report TechPowerUp over the "trolling article", please.
-
Yes. Nvidia gains nothing by changing the slot for Pascal, and loses potential revenues from existing Kepler and Maxwell owners who are expecting to upgrade but won't if that avenue is cut off.
Volta may not require a new slot. PCIe 4.0 will be 100% forwards and backwards compatible with PCIe 3.0. I fully expect that Nvidia will come up with a PCIe 4.0-based MXM slot, but I'm not aware of any reason why it could not be similarly forwards and backwards compatible with MXM 3.0b.
-
Believe me, the number of people who expect to upgrade is so low it's not worth mentioning.
MSI *might* offer it... unless nVidia is only doing two refreshes for Maxwell. Then their "guarantee" for the GT72/GT80 doesn't apply to Pascal.
ASUS, Alienware, Gigabyte, Razer, Lenovo, etc... everyone's making soldered chips now. MXM is basically a Clevo and MSI endeavour, and nVidia knows it. Laptop gaming is going down, and Clevo's insistence on selling us good machines doesn't make up for all the other problems like BIOS OC blocks and nVidia gimping their GPUs and intel gimping their soldered CPUs and everybody thinking 1" thick laptops are the future of performance/gaming laptops...
Right now it's an uphill battle to get good gear again. We were SO CLOSE with the Haswell/Kepler refresh #1... but then downhill again. GG.
-
True enough. My speculation is based on the premise that Pascal will have MXM at all. If Nvidia decide not to bother then the Maxwell refreshes are the end of it.
-
As someone who only buys a new laptop and has never upgraded the GPU, I don't fall into the category where this is a concern to me. To those who do upgrade the GPUs in their laptops, I can completely understand the validity of the concern here. However, I would bet that it is in nVidia/AMD's interest at this time to limit MXM or even phase it out in favor of soldered-in chips, because the majority of the mobile market buys new and doesn't upgrade.
-
Have to agree with Prema that this should be the 2016 thread... Cloudfire's got his rear so far up his pants that the stuff he's been spouting is nothing short of nonsensical... I'd expect nothing more than a slightly improved GM204 chip to replace the 980M. GM200 won't come to notebooks; NVIDIA has no incentive to do so anyway. AMD's been nothing short of disappointing for the last 2 years.
-
I remember when we thought Nvidia would have to make a new MXM after GTX 280M. Here we are 5+ years later, still on MXM 3.0, with GPUs 5x as fast.
They just haven't had a hardware reason to spend R&D money on something more advanced.
-
Wondering when that beast will go on sale. I'm still planning to buy the MSI GT80 Titan, but now I think I will wait for the 990M in SLI.
-
List one game that's bottlenecked because of any Haswell i7 quad. Seriously, what game can't an i7 of ANY caliber handle? And if you honestly think a 4720HQ can't keep up with 980M SLI, then you'll be surprised to know that Notebookcheck.net tested both chips against a desktop i7 and achieved the same scores. I.e., you're wrong.
-
Crysis 3 and Star Citizen, off the top of my head. Minimum frame rates are often affected by a CPU not pulling its weight, too. Also, there are people with 120/144Hz monitors who need a strong CPU to get them there.
-
Wait for Clevo's Skylake SLI model; that might very well be the best machine ever.
-
That's what I'm doing
And also for new higher res 17.3" screens
-
Problem is: I don't like Clevo that much, and I can't wait until the middle of 2016.
Also, I want an 18-inch laptop, and none of the Clevos I've tried before satisfied me, between the barebone quality and all the plastic.
They also seem uglier than Alienware or MSI.
-
Clevo might be ugly on the outside, but it's beautiful on the inside.
Alienware might be beautiful on the outside, but they are getting uglier on the inside as time goes on.
This is only my opinion.
And by inside I mean hardware-wise.
-
I received compliments about my P750ZM's finish today. Some people like the matte black. I rather do, aside from the way that oils from my hands tend to show up so readily. And it doesn't look like an expensive gaming notebook.
-
If you want 18 inches then you're stuck with MSI. Clevo really are good quality, especially as of late. Looks are personal preference; I'm not a fan of how the AW or MSI look, personally, especially the ugly MSI logo on the lid. But to each their own. I'd rather have choice than not, but we seem to have less and less choice these days.
-
The AW 15 and 17 R2 are a total failure... There is no alternative other than Clevo now. If Dell continues its destruction of AW laptops in 2015/16, the only options are Clevo or a self-built desktop.
-
I totally agree with you about alienware
-
It used to be that Alienware was beautiful on the outside as well as having inner beauty. Now it's just a gold digger with the looks but no substance or personality whatsoever.
Clevo is like Ugly Betty; might not have the best looks, but has a heart of gold that's bound to win you over in the long run. But hey, don't take my word for it, just ask Mr. Fox what he thinks of his P570WM.
-
Star Citizen is poorly optimized then, and there's probably something wrong with your computer if you're having issues running Crysis 3, which is Crysis 2 all over again.
Minimum FPS tends to be WORSE with SLI than with single cards; it's been proven multiple times, past and present. It's gotten better than before, but it's still there in games that aren't using the best engines.
-
Cute response, but I never said I was having issues running Crysis 3. I said it benefits from high CPU clocks, including more cores/threads. I think you forgot the point you were trying to make.
http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-8.html
-
They are all getting worse. Cost-down all the way. Let's get rid of those diodes; nobody can blame us if static kills their computer. And those caps? Come on, power stability isn't even a first world problem (rolls eyes <_<). Errrr, didn't I tell you our customers can't recognize the mushrooms on the board and your IC selections are eating our precious budget? HannStar (both Clevo and Compal/Dell used high-quality HannStar PCBs at one time) is charging us too much, and we can't give our distributors more margin now? Oh, I know just the right guy to solve this problem!
With that said, recent AW designs have been getting closer to the nope realm. So stick to Clevo if you want DTR to survive a bit longer. Or maybe let's see what the Precision department will do next, if you don't need the fastest and budget isn't tight, but I'm not sure I see good signs.
-
FWIW my Clevo P370SM still used HannStar for the mobo.
-
No, the point I was making was that any i7, from Sandy onward, is PLENTY fast and presents no bottleneck to any GPU config as far as mobile machines are concerned. YOU pointed that out as incorrect and believe that a 4720HQ isn't fast enough to handle 980M SLI, which, again, is false.
Now, if you're done trying to derail my original comment, by all means provide proof showing how such an i7 isn't adequate for 980M SLI and how a faster CPU equates to substantial gains in performance.
Also, as far as cute responses go, that particular link shows a Sandy-E chip with a single 680 vs. various other CPUs with the same GPU. The minimum-FPS difference between the i5 and the hexa-core i7 is so hilariously small, it's likely not even worth using as part of an argument. So sure, it benefits from higher clock speeds when you want a higher MAX framerate, but the minimum FPS matters faaaar more than the maximum. Again, this just brings us back to my original statement: there's no game that's going to cause any lapse in performance, even with SLI, with the current i7 line. Crysis 3 is ONE game, and it's kinda silly to use just that ONE game as a reference. Could you possibly show another game that doesn't suck eggs that people actually care about?
-
Take a deep breath. It's OK.
-
Dear me.. I've bolded my responses in the quotes. Try addressing your responses to the right person before you start spitting venom around.
Quality advice. There's no reason to...
-
It's not false. YOU think it's false. That's your (wrong) opinion.
Proof that an i7 isn't enough for 980M SLI? Okay. How about 780M SLI? You know, that decidedly weaker combination? On my 120Hz screen? LOOK CPU BOTTLENECK
Oh, and here's a video that I captured with Shadowplay in BF4 where I can't hold my 125fps constant because CPU is getting maxed
Please watch in 1080/60 so you can CLEARLY see my usage statistics in real-time thank you~
Neither of these games is Crysis 3, mind you. Oh, and another game where my CPU limits my performance is Dying Light. =D
-
Poor mobile CPUs, trying to run SLI graphics cards at 120 frames per second. If that's not enough, they also have to capture gameplay at the same time.
I feel sorry for those that end up in D2 Ultima's machines. They try their best.
Let's have a moment of silence for them.
-
Don't worry, Shadowplay doesn't abuse my CPU to capture.
But I do that with streaming =D. "If your CPU ain't hitting 80%+ while streaming you're not using enough compression" - Ultima school rule of streaming #1
Edit: OOO CAKE
-
I dunno. I see the CPU running primarily at 60-70%, with occasional spikes into the 80s. Not saying it isn't being taxed, but that's far from "limiting"; there's still some breathing room. If 60-70% is limiting, then I'm limiting my 970M too. I'm certain it will affect FPS to an extent, but it's hard to really quantify unless you compare the same config with different CPUs.
-
With Hyper-Threading enabled, you could be CPU-bound at 50%. It's not common that you'll see 100% per thread on a Hyper-Threaded CPU while playing a modern multi-core-aware game, and the OS scheduler plays a role in load distribution too, as @D2 Ultima can attest.
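To illustrate why "CPU-bound at 50%" isn't a contradiction, here's a toy calculation (the core/thread counts are just an assumed 4C/8T chip like the 4720HQ; the idle-sibling scenario is a simplification):

```python
# Why ~50% aggregate usage can still mean CPU-bound with Hyper-Threading.
PHYSICAL_CORES = 4
LOGICAL_THREADS = 8   # 2 hardware threads per core with HT enabled

# Suppose a game saturates one logical thread on each physical core
# while the sibling threads sit idle. Task Manager averages over
# logical threads, so the aggregate reads:
busy_threads = PHYSICAL_CORES
reported_usage = busy_threads / LOGICAL_THREADS * 100
print(f"Reported aggregate usage: {reported_usage:.0f}%")
# → Reported aggregate usage: 50%
```

Every physical core is pegged, so adding GPU power gains you nothing, yet the usage graph looks like the CPU is coasting. That's why per-thread usage (like the overlays in those videos) tells you more than the aggregate number.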
-
So turn off HT and run the tests again.
-
Thankfully we will probably not care as much when DX12 is in wide use this time next year
-
Or add DSR and see if GPU usage goes up. No need to reboot.
-
Yeah, there seems to be a limit with these things. I think GTA V tops out near 66% of my real-time CPU usage (which translates to the 30% boost I get with HT). I already know that GTA V can use 100% of i5s; I've asked people to check for me. Windows 8 CAN (doesn't have to) limit your CPU usage in games too, just by being Windows 8. If the game or program is dependent on Windows for CPU usage (like Sony Vegas 13 rendering), you'll find that your CPU will show 100% in Windows but not actually be at 100%. Other programs, like Handbrake rendering or ThrottleStop's TSBench, can use an actual 100%, as seen here, bypassing Windows' limit.
nVidia 2015 mobile speculation thread
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.