So what is the latest status of multi-GPU support? It's incredible how much DX12 has killed off SLI and CrossFire by handing devs the responsibility for supporting multiple GPUs. What has everyone's experience been with the latest games? Is there any hope? I remember Gears of War 4 was promised to have explicit multi-GPU support (they showed it off!), but it's been many months without a sign of it.
HaloGod2012 Notebook Virtuoso
Hopefully as studios get more comfortable with DX12 they'll add it more often.
I want that Asus PG27UQ, but it's not gonna work at all without SLI. I'm gonna need either two 1080 Tis or two 1180s.
I find it pathetic; sometimes I have to go into the Nvidia settings to trick SLI into working in a game. Even then it's not true SLI, since I see no real difference between single and dual cards in-game. It's frustrating to look at my overlay and see one GPU sitting there, for all intents and purposes, DEAD.
Awhispersecho Notebook Evangelist
My personal feeling is that as long as OEMs are going to continue advertising and selling SLI machines, and boasting of the advantages that provides, developers should be required to support it. No SLI support, no release of the game. Pure and simple.
HaloGod2012 Notebook Virtuoso
They need to follow what the Rise of the Tomb Raider devs implemented. The DX12 multi-GPU in that game works perfectly!
Awhispersecho Notebook Evangelist
The issue is much bigger than this, though. It goes to the increasing laziness of developers in general, and it's the consumers' fault. As of now, I would guess 50-60% of games are released buggy as hell, poorly optimized and basically incomplete. They are released that way because the developers know the games will still be bought and people will deal with it until they get around to releasing a patch. And then another patch, and then another patch. It goes on and on. They release crap and people throw money at them anyway, regardless of the launch issues, the lack of features, poor optimization and performance, and every other issue you can think of. I figure that for a lot of games, it takes about 6 months now to get the game you were supposed to get when you bought it. Obviously not all games are this way, but it is getting increasingly frequent.
It's not just games. Look at Nvidia and their drivers; look at MS and Windows 10. Quality has gone out the window across the board. So if developers aren't being held to a high enough standard to even release a game that runs properly and is complete, there's no way in hell they are going to go the extra mile and put SLI support in it. No reason to. People will buy it anyway. If developers were doing what they were supposed to be doing with games, drivers, Windows, DX12 and everything else, I would have a compelling reason to upgrade to Win 10. As of now, I don't. The performance gains I might get, if any at all, aren't worth the other crap involved with Win 10.
But yes, SLI support should be mandatory for a game release. It actually surprises me that Nvidia and AMD don't push this harder; it would only help them sell more video cards.
I personally don't see much of a future for multi-GPU, honestly. SLI and CrossFire haven't matured a whole lot over the years, and support is still spotty at best. I think the niche market will always be there, just because it's a status thing, but I wouldn't expect a whole lot to change. Until consoles go with multi-GPU configurations, you won't see much improvement on the PC front either.
This is interesting. The last part of the benchmark video actually shows a decrease, but the others show a small bump.
It would be pretty cool to test with the new Kaby Lake systems, as their iGPUs are a little more powerful than Skylake's (which, as iGPUs go, weren't terrible, especially if you had Iris).
I wonder what it does for lower-end machines, like my GT 540M + Ivy Bridge combo?
King of Interns Simply a laptop enthusiast
I can imagine a similar bump, considering the difference between a GTX 1080 and any iGPU should also be pretty huge despite the improvements.
I mean, it works. Sure. I won't deny that. It also uses less CPU, which is great considering that CPU usage in AAA games from 2015 onward has been massive, and even highly clocked quad cores with great RAM often aren't enough to hold 120fps (far less what lower-clocked chips and more basic RAM would do)... but I can't really say it's perfect, sadly. A step in the right direction, certainly, but their DX11 scaling (especially at 1440p and above) was so bad that I think it's more of a technical showcase for DX12 than much else.
And the most popular games (especially among the streamer crowd) are the most broken. H1Z1, Dead by Daylight, CS:GO, etc. It's terrible. It's like people are proving with their wallets that it's OK to release the most unstable, unoptimized things in the world and they'll buy and play them to no end. And these kinds of games never get fixed; Planetside 2, DayZ, etc. are great examples of it.
I also agree that nVidia's drivers and W10 are garbage quality. Constantly broken stuff, as far as I can tell. But people keep making excuses for them.
------------------------------------------------------------------------------------------------------------------------
I honestly believe that SLI is going to get worse (or die) before it gets better. Devs DO NOT USE DX12 OR VULKAN. They don't bake in optimizations (except what I've seen from ROTTR, and that was only for mGPU functionality, which I'm still not convinced is purely DX12's doing) and just leave the drivers to handle optimization. DX12 as it's being used is a detriment to the gaming community; devs lazily slap it in to mitigate their overboard CPU usage in most high-profile new games.
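For anyone wondering what "devs using DX12 themselves" actually involves: with explicit multi-adapter, the application (not the driver) has to find every GPU, create a device on each, and then split the work and copy results between them. Below is a minimal sketch of just the first step, written against the public D3D12/DXGI API; it's an illustration of the technique, not code from GoW4, ROTTR, or any shipping engine.

```cpp
// Minimal explicit multi-adapter starting point (illustrative sketch).
// Link with d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %ls\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    wprintf(L"Usable GPUs: %zu\n", devices.size());

    // Everything beyond this point is on the developer: choosing AFR or
    // split-frame work distribution, cross-adapter heaps/copies to get the
    // second GPU's output back to the card that owns the swap chain, and
    // per-device command queues and fences. That is the work most studios
    // skip, which is why so few DX12 titles ship with mGPU support.
    return 0;
}
```

That's the crux of the complaint above: under DX11 the driver could sometimes force AFR behind the game's back via a profile, while under DX12 nothing happens unless the developer writes code like this and everything that follows it.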
I'm still a lover of mGPU (I don't think a single GPU would ever leave me truly satisfied if I were buying a system and had sufficient budget), but I really can't recommend it except to the most advanced users, and even then I STILL push for a single strong card before mGPU. I can't recommend AMD at all for mGPU, because they lack a counterpart to nVidia Profile Inspector for forcing/editing GPU profiles. Their basic list of supported games is much smaller than nVidia's too, and they STILL don't work in windowed modes.
As crap as it is, the band-aid on the mGPU bandwidth problem (the High Bandwidth SLI bridge) is still something I'd say was necessary. So at least they... might... be working on something to further mGPU. Maybe by Volta, or the architecture after Volta, we might have something proper... but it looks like they're just waiting for PCI/e 4.0 so x8/x8 will act like current-gen x16/x16, which solves much of the bandwidth issue they have. I have a very lengthy explanation of the bandwidth problem in my SLI guide, which you can find in my signature, if you're interested in learning more about it (I'm too tired to type it all out from scratch).
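For reference, the raw per-direction link rates behind that PCI/e point (standard PCI-SIG numbers, ignoring packet overhead beyond the line coding):

$$\text{PCIe 3.0: } 8\ \text{GT/s}\times\tfrac{128}{130}\approx 0.985\ \text{GB/s per lane}\ \Rightarrow\ \text{x16}\approx 15.8\ \text{GB/s}$$
$$\text{PCIe 4.0: } 16\ \text{GT/s}\times\tfrac{128}{130}\approx 1.97\ \text{GB/s per lane}\ \Rightarrow\ \text{x8}\approx 15.8\ \text{GB/s}$$

So a 4.0 x8/x8 split really would give each card the same link bandwidth that a 3.0 x16/x16 setup gets today.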
Trust me when I say, it is all on the developers this time around. Sometimes games have bad profiles, and that's nVidia's fault (like CoD: Black Ops 3, which needs another profile entirely for great scaling with strong cards), but when a game just doesn't work well even though it has a profile? That's on the devs alone.
Dark Souls 3 is a good example. The patch that came with the DLC launch completely fixed SLI scaling. Before it, SLI was introducing negative scaling; a single card was better. After it, the game worked perfectly in SLI and basically held 60fps constantly. I tried to make it drop and it wouldn't. I didn't even change drivers; it simply worked significantly better. This is the state of most demanding games since about mid-2014.
Unfortunately there is no incentive for developers to make games with SLI in mind. It would be good if Nvidia / AMD could do something about that since they are the beneficiaries at the end of the day.
SLI/CF users are ridiculously rare, though, so I don't see that happening.
It's pretty sucky, since 4K 120Hz is just about a thing now and the hardcore crowd could really use it. If I were building a desktop in the near future, I'd kinda want to be shooting for that.
They removed using SLI to get higher AA.
They added features to cards that didn't work in SLI, and some of them still don't (like MFAA).
They broke SLI equivalency with Maxwell; custom vBIOSes are needed to keep the cards' clocks and voltages equal, and there was crashing caused by voltage variance when a card didn't clock back up fast enough. I don't even know if that ever got fixed.
They keep adding more features and then saying they don't work with mGPU. They used the High Bandwidth bridge to get more bandwidth between cards (killing 3-way mGPU and above), and then in later drivers they took all of the bandwidth benefits (which would have benefited bandwidth-limited games like Rainbow Six Siege and Doom 2016) and shoved them into smoothing out frame pacing instead. So there's no XDMA-style design, except for NVLink at the Tesla end of things because they can't sell it for more, and then they waste the bandwidth they had.
Why bother trying to aim for a technology that the makers don't seem to care about?
It doesn't excuse developers of game engines and games who use tech that relies heavily on previous-frame frame buffer data for no real reason, though... that tech doesn't generally provide better visuals or optimization that makes using a single card preferable, except for SMAA T2x and TAA; the latter can be used with mGPU after some tweaking similar to TXAA, albeit at a larger performance hit than on a single GPU. This is also the reason mGPU is so terrible. Couple it with the ridiculous CPU load in later games (where the framerate won't increase much anyway unless you're at super high resolutions, where you AGAIN become bandwidth-starved) and you've got a recipe for disaster.
It's just that devs aren't making their games AFR-friendly. Why? No idea.
Support.2@XOTIC PC Company Representative
Unless it is something like 2 X 1080s, then....
It was at this point I learned that RAM is a huge factor that's commonly overlooked, but one of my 780Ms died just two months after I got a RAM upgrade, so I could no longer check even if I wanted to.
I would take one 980M over two 780Ms though, considering the last couple of years.
The 1080N models are the only ones I've seen in just about any notebook that actually do what you expect.
I also just really want SLI to come back in general. DX12 has been killing it... I can't realistically get a 4K 144Hz monitor this year without SLI with something like 1080 Tis or 1180s, but there's no point if nothing works. When I bought my cards, SLI was working pretty much better than it ever had, and now it's going down again.
When you bought your 980 Ti cards, SLI was already in a pretty big decline. 980 Tis were simply capable of pretty much anything, and two of them, even with terrible scaling (let's say 20%), would still have hit the upper limit of many users' CPUs and whatnot.
Remember that scaling and utilization are not the same. 99% utilization on each card might only grant you, say, 120fps where a single card gets 100fps; that's 20% scaling. SLI is capable of 95% scaling on average. People are now "happy" if they get 70% scaling and expect 50% or less. I'm not one of those people, and it's why I very much dislike suggesting SLI to people these days.
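To put the distinction into a formula (using the 100fps/120fps example from the post above):

$$\text{scaling}=\frac{\text{FPS}_{\text{SLI}}}{\text{FPS}_{\text{single}}}-1\qquad\Rightarrow\qquad\frac{120}{100}-1=20\%$$

Utilization only tells you the GPUs are busy; scaling tells you whether that busyness is actually turning into extra frames.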
Yeah, I would say it was in a decline, but coming off of a peak. Most of the games out at the time worked; new ones were starting not to, though. So I just change it to the Battleforge profile? Nothing else needed? I know the custom or altered profiles sometimes need other tweaks. I'll check that out.
My CPU is a 5820K at 4.5GHz. I usually had plenty of utilization free on the 8 threads getting used, with the occasional DICE game using all 12.
It doesn't help someone who buys multiple GPUs expecting the power to work to be told they gotta sit on one GPU for a couple weeks or a couple months (or in the case of Dark Souls 3, 7 months) until it decides it's going to work properly with mGPU. Might as well buy one card. SLI is only for the true die hards now, and that will kill it faster.
But with all the rampant unoptimization going on and everybody marketing 4K like it's actually worth a damn to game on (until I see 4K 144Hz with GPUs and CPUs to push it, and response times low enough too, I won't say it's worth it!), we might see a fallback to mGPU just because there's no other way to get more power. But mark my words, PCI/e 4.0 will bring some life back. There *IS* a bandwidth problem; you can see I left a whole section about it in my SLI guide. But PCI/e 4.0 x8/x8 will be like PCI/e 3.0 x16/x16, so things might churn forward some more. I have no expectations, but for this alone, I'm looking forward to Skylake-E and Volta.
The Asus PG27UQ monitor should be good enough for all that. CPUs are fine; GPUs are the problem.
Raising resolution *SLIGHTLY* raises CPU usage (negligibly; maybe 2-5% extra utilization going from 1080p to 4K); we do need more CPU power, though not much more. Skylake-E might fix this, though. HWE and BWE can't overclock very well, you need good chips to get 4.5GHz+, and the heat is ridiculous. Skylake-E should fix that easily, and support higher RAM speeds too, as well as more PCI/e CPU lanes (48 instead of 40, I think), so tri-SLI without a bandwidth problem would be possible. Add DMI 3.0 for 4GB/s transfer from NVMe drives instead of the 2GB/s limit (excepting burst transfers), and high-end computing might actually become something to salivate over. I didn't like Broadwell, and HWE/BWE were just disappointments entirely.
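For context on those DMI figures (standard link-rate math, not specific to any one board): DMI 2.0 is electrically a PCIe 2.0 x4 link and DMI 3.0 a PCIe 3.0 x4 link, per direction:

$$\text{DMI 2.0: } 4\times 500\ \text{MB/s}\approx 2\ \text{GB/s}\qquad\text{DMI 3.0: } 4\times 985\ \text{MB/s}\approx 3.9\ \text{GB/s}$$

which is where the roughly 2GB/s vs 4GB/s ceiling for NVMe drives hanging off the chipset comes from.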
It's funny. On the one hand, you are correct in everything you've been saying. But I feel like it's a bit pessimistic, especially about the older stuff. 980 Ti SLI pushed 4K past 60fps in tons of games that a single card alone couldn't manage. Sure, it has a ton of issues, but my experience gaming at 4K for the last couple of years has been nothing short of awesome. I couldn't have done it without SLI or G-Sync.
That being said, I am nervous for many of the reasons you've given about 4K 144Hz. Not liking the thought of needing to redo my entire build... When is Skylake-E expected? The monitor is launching somewhere between September and December this year. Also, why do you expect it to OC far ahead of current 6+ core CPUs? Above 4.8GHz seems unlikely to me.
I know, what I say can often sound a bit pessimistic. But really, I just look at the facts. I think we're coming out of a depressing trend in tech, but PCI/e 4.0 is up and coming, Skylake-E is probably around May this year if I remember correctly. I'd like to see the improvements Vega has (and if it affects current-gen games) and see what they force nVidia to do. Ryzen is poised to offer cheap, dual-channel-RAM hexacore/hyperthreading chips for what we currently pay for i7s when it launches. I think we're actually about to enter a good era. But it all rides on AMD, and unfortunately they can't take the enthusiast market due to only supporting Dual Channel memory on Ryzen. We'll see how things work out for their GPUs as well.
SLI is better at higher resolutions because it forces higher scaling in a lot of games. Though some (like Fallout 4) get pitiful scaling like I think 20% at one point. Running PCI/e 3.0 x16/x16 will significantly improve SLI in those scenarios. The HB bridge and LED bridges should no longer help unless old drivers are used, though. -
Awhispersecho Notebook Evangelist
If they can get SLI working right for VR, SLI will suddenly be very important. That would immediately increase the number of people capable of driving the current-gen headsets, while making it much easier to push new 4K headsets, which is where VR needs to be. That, my friends, is what they call a win-win.
I said that AMD wouldn't take the "enthusiast" market, not the "gamer" market. Enthusiasts (like myself) or prosumers who do productivity work etc. may understand the difference and realize that even though AMD is cheaper, they cannot offer the best performance. But the mainstream high-end market? Ryzen can obliterate Intel. If 6c/12t with similar IPC to Skylake is offered for $300-$350, then Intel simply loses, especially if it OCs well.
The trick with RAM is that timings and speed need to be looked at together. If you're using 1600MHz 8-8-8-24 DDR3 and you change to 2133MHz 10-12-12-31 DDR3, you're actually getting a net reduction in aggregate bandwidth. But basic DDR4 RAM (2133MHz 15-15-15-35) against decent 3000-3200MHz RAM (15-16-16-36 and 14-14-14-34 respectively) is a straight upgrade. People don't test the timings or even list them. Quad channel grants a lot more read/copy bandwidth and helps in games that like it (the Frostbite 3 engine, Unreal Engine 4 and CryEngine 3 all LOVE good RAM). For the faster DDR3 to make sense, you'd need something like 2400MHz 9-11-11-27 RAM (which exists for quad channel in DDR3); then you'd see straight benefits. Alternately, if you had bad 1600MHz RAM (like my old 1600MHz 11-11-11-27) and upgraded (like to my current 2133MHz 11-11-11-31), you can see massive benefits.
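A quick way to sanity-check the DDR4 comparison above is the usual first-word latency rule of thumb (it only captures access latency, not the aggregate read/copy bandwidth differences being discussed):

$$t_{\text{first word}}\approx\frac{2000\times \text{CL}}{\text{data rate (MT/s)}}\ \text{ns}\qquad\text{DDR4-2133 CL15}\approx 14.1\ \text{ns},\quad \text{DDR4-3200 CL14}\approx 8.8\ \text{ns}$$

So the 3200MHz CL14 kit wins on both bandwidth (roughly 50% more transfers per second) and latency, which is exactly what makes it a straight upgrade rather than a trade-off.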
My RAM is admittedly trash right now, but I was planning on upgrading to much higher-clocked RAM for the 4K 144Hz display.
I simply know the importance of good RAM right now. I'll never settle for crap RAM again =D.
But I'll admit, quad-channel memory over dual-channel memory might not be worth the cash outright for a lot of gamers; however, if you're getting a hexacore, quad-channel-capable system, I'd say you should get it. It can only improve things, and there are 3200MHz 14-14-14-34 quad-channel kits from G.Skill now. Compared to the CL18+ we had when DDR4 was released, it's good times for RAM.
This is why I say Ryzen can take the mainstream/gamer-focused market. They can't take the enthusiast market, because someone who is aiming for the best will simply HAVE to choose Intel. It isn't a choice, just like it currently isn't a choice between the FX series and Intel's chips. If AMD had allowed quad-channel support with Ryzen and launched the hexacore at ~$325, the quadcore at ~$250 and the octocore at ~$500, Intel would be in serious trouble. If Ryzen OCs well, there'd be no reason to use Intel (except, I guess, for non-W10 support? Neither Intel nor AMD is bothering to provide non-W10 drivers for chipsets beyond Z170).