Recently I've noticed that a lot of major AAA releases are lacking multi-GPU support. Three recent games that come to mind are Titanfall, The Elder Scrolls Online (ESO) and Dark Souls II. All of them released in 2014, over the last couple of months, in that order. I find it ridiculous that large game companies, and especially AMD/Nvidia, are allowing these major releases to go without multi-GPU support for long periods of time, if not indefinitely. Some people spend a few thousand dollars on their GPUs alone, splurging on dual Titans or four 290Xs. Many others simply want to augment their aging systems with 650 Tis and 5870s. Still others purchase dual-GPU laptops, sacrificing many of the benefits laptops are known for to get more performance out of their systems.
Still we aren't taken seriously. Why is it okay for Titanfall to not have XFire support 2 months after its release? Why did Nvidia feel the need to advertise ESO as SLI ready for months, only to not have a profile available at launch, nor a month after release? How can From Software release Dark Souls II without something as basic as multi-GPU support, after the horror of the original Dark Souls PC port?
Will this thread make any real changes? No, it won't. I suppose I'm just annoyed that the PC doesn't get the support or the respect it deserves. How do you feel about the lack of multi-GPU support in 2014? How about the overall lack of respect toward the PC? Discuss.
-
-
dumitrumitu24 Notebook Evangelist
Those games work fine on old systems and aren't demanding at all... I play Dark Souls 2 with my 760M (overclocked a bit) at a constant 60fps.
-
I agree; not having SLI support is pretty bad. None of the games released without it are particularly demanding, though. I think the Steam statistics show something like 3 percent of users have an SLI setup, so it is a minority as well. But after seeing the incredible benefits of SLI, there is no way I am ever going back to a single-GPU system without feeling like I am using a lesser product.
-
It sucks, but where there's a will, there's a way.
3DCenter Forum - SLI - Kompatibilitätsbits - Sammelthread
SLI works just fine in Dark Souls II with the 0x02400405 bits. -
Eh, From Software isn't known for making good PC stuff. From what I've heard about how Dark Souls 2 released on PC, I'd say most people are quite happy with it. SLI is also still a minority use case, despite the way the game was hyped by streamers and such. I expected as much from it.
Titanfall... needs to be patched by Respawn, and it is NO fault of nVidia's. Respawn has even admitted they were working on it (albeit without an ETA) when I asked them over and over again on Twitter. I agree that it's inexcusable, but I can't blame nVidia for this one.
ESO... Elder Scrolls games aren't known for good multi-GPU support either. I didn't expect ESO to have great support, but I thought it'd at least work, so it is a bit of an annoyance that it isn't there. I also think the blame lies with the developers and not nVidia here, though it was really bad marketing on nVidia's part to advertise it as SLI-ready when it has no profile.
On a related note, a friend of mine who worked on MW3 with Sledgehammer Games and on Transformers: Fall of Cybertron, and was recently working with Microsoft on a game he couldn't really tell me about, said that developers never code with multiple GPUs in mind; it's usually an afterthought to add support, unless a game is being made properly for the PC and the consoles are getting the port (read: Star Citizen, possibly Mighty No. 9). It's just how the game industry is run these days. It REALLY sucks, but hopefully we'll soon slink back to a time where developers realize that PC is where they can make their dreams come true and consoles should deal with the poopy ports (not that those users would complain much). But yeah, don't expect multi-GPU profiles very often. I'm actually happy nVidia updates SLI profiles for most of the very high-profile games, IF the games don't glitch out whenever a profile is applied (like Titanfall does). Sometimes it's just on the devs; other times nVidia and AMD try to figure out a way to make their cards work around the current issues in the game. It could be a LOT worse. -
I thought the Titanfall SLI profile was recently pushed through GeForce Experience? If not, I'm sure custom bits can be found with Nvidia Inspector, perhaps by experimenting with the bit value editor if necessary. Unfortunately, I don't have Titanfall, so I can't test it myself.
Is there any reason you can't use the Skyrim compatibility bits to enable SLI in ESO? They're on the same engine and everything so I imagine it would work just fine. -
-
Cursory search says ESO uses HeroEngine, same engine as used in SWTOR. So try the SWTOR bits. -
If I may clarify on some points you all have mentioned:
1 - If the Titanfall SLI profile WAS pushed, I don't know about it. I haven't had my laptop for over a month, but no driver update has carried a Titanfall SLI profile, and as I said, Respawn confirmed through their official Twitter that they need to work on it more, albeit without any ETA as to when the fixes will arrive. People HAVE tried forcing different bits on Titanfall, but from what I remember they never work quite right.
2 - Games that NEED multiple GPUs to run efficiently are FLAT OUT UNOPTIMIZED. If you cannot toss a 780 Ti at a game and get 60+ fps at even 1440p right now, that game has problems. And the problem is, games DO have optimization problems. In fact, this is why all the new Xbox One and PS4 titles are making excuses (after past promises at E3, etc.) about 1080p 60fps gameplay. They say "Oh, we're not accustomed to the hardware yet" (which is a lie, since it IS PC-architecture hardware, except that the PS4 uses GDDR5 system RAM instead of DDR3, which isn't that big of an issue), or "Well, 30fps is the open-world gold standard" (Ubisoft, Assassin's Creed 4) (which is no real excuse for failing to optimize a game moving from low to medium/high graphics and from 720p to 1080p on a GPU nearly TEN times stronger than the previous generation's, which ran 720p 60fps albeit with some frame drops), or even a flat-out "Well, you can't really notice the difference between 900p and 1080p, so resolution isn't so important" (Nope, I can tell. Try again.). And then they resort to tricks like in Killzone Shadow Fall, where the game renders multiplayer at 960x1080 (yes, you read that right) and simply alternates which pixel columns are filled every other frame, causing multiplayer to look very blurry compared to single player. If you don't understand, it does this: |_|_|_|_ on frame 1, then _|_|_|_| on frame 2, where | is a filled column and _ is a blank one. Essentially, developers CANNOT code for the PC, and have mostly been working with ports and getting things to a barely working state, assuming that if players wanted better performance they'd just toss a stronger GPU at it. So blame the developers (except maybe for a game like Star Citizen, which DOES look breathtakingly real and immersive) when a game needs more than one GPU to give satisfactory performance.
3 - Bursting a bubble here: 4K gaming is NOT popular, and neither is multi-monitor gaming. It is, and will likely long remain, a gimmick. I'm not saying it will always be one (similarly, I hope stereoscopic 3D gaming, which I have and like, gets much more popular soon), but right now it isn't nearly as popular as it's hyped to be. There are cheaper 4K screens, but they often feel stuttery or are limited to 30Hz at 4K (according to quite a few videos I've seen about them). Also, with the way game optimization currently is (which is quite likely to change in the next year or so, when devs start to optimize for PC architecture to make the most out of the current-gen consoles, which I refuse to call "next gen"), the only way to truly max games out at 4K is to throw a couple of R9 290X cards, or a couple of Titans or GTX 780 Tis, at the game, and even then good luck getting 60fps in the more demanding titles like Crysis 3, Far Cry 3, or BF4. We SHOULD be well beyond 1080p as far as GPU power is concerned, but until we see some swole Maxwell flagship cards and whatever AMD has up their sleeve (which I hope is something really good), good 4K gaming isn't going to happen. It's the same for multi-monitor gaming: it COULD be very good, but sometimes you need to mod or hack the game to make it work, and it can actually hurt your ability to perform in some cases, like FPS games where you can't watch everywhere at once and people can sneak up on your sides while being fully in view.
4 - Visual artifacts and light bleeding aren't the only problems with SLI. For example, before Deus Ex: Human Revolution Director's Cut got its SLI profile (which admittedly took a couple of months), forcing SLI simply gave me worse performance: I would go from a constant 99% usage and 80-100 fps on a single 780M to 60-70% usage on BOTH cards but only 55-70 fps. Sometimes games need a bit of work to make multiple cards useful, even if there are no detrimental visual effects. Sometimes games support SLI (and it helps), but not in all instances. Take Arma 2 and DayZ, for example. DayZ, the Arma 2 mod, supports SLI. Does it help in single player? HELL YES. Do my cards get constant 99% scaling and let me max the game and bask in its mediocre graphical quality? DEFINITELY! Do I get over 25-40 fps on a server some dude is paying for that's likely running on an AMD dual core with a 5770? NOPE! In that case I disable SLI for stability and stream-ability, because there's no point keeping it on. It's not my fault, though, and not the fault of nVidia, that the game's FPS is limited by the server. Sometimes it just doesn't benefit, and it's not always about visual artifacts. I'll always say it: if possible, get the STRONGEST single card you can find, and if that turns out not to be enough, grab a second one. If you grab two weaker cards to SLI to save money, you also have to deal with the fact that sometimes SLI may not make a difference, and it may not be anyone's real fault.
5 - As for SLI/CrossFireX users being a minority of PC gamers, you have to remember time and cost versus returns. If a publisher is rushing a game *cough Titanfall cough*, either because the publisher screwed up *cough EA cough* or because the devs screwed up and can't get more time *cough Dark Souls 1 cough*, do you think it's in their best interest to appease such a small minority? Granted, with the recent free-to-play full release of Dota 2 (which runs on EVERYTHING, and FPS isn't very important for it), Steam's user count has skyrocketed, so you may now be seeing a far larger number of users that DON'T have SLI, have an older machine, and just play Dota 2 and/or TF2 or whatever other F2P game they can find. That would explain why only 3% of Steam users have multiple GPUs (I expect the ratio among people who play games like CoD/Battlefield/name-of-some-hyped-game-everybody-plays-for-3-months-then-forgets is far more generous). But honestly, think about it: how long would it take to truly optimize a game for multiple cards, especially if the devs don't code with multiple cards in mind? They'd likely have to buy two sets of cards to run in multi-GPU mode (one nVidia and one AMD), plus other hardware for testing, then test on top of everything else. That takes time, money, and manpower: things they don't necessarily have. This is why I said above that it's just the way games are today, where rushing games out the door half-finished and overhyped is the norm, that these things just aren't addressed. Making a TINY portion of your customers happy with time, money, and manpower you don't have just isn't cost-effective, especially if a game's release date is on the line. And remember: console games need to be finished six weeks prior to release. If a game is multi-platform, the console version has to be finished WAY earlier than the PC version, which basically needs to be done a couple of days before release.
They'll finish the console version, then try to finish the PC version. And sometimes they can't do it fast enough, so some games' PC versions get heavily delayed, like all of Ubisoft's AC series, because they push it out year after year on a supremely time-starved dev cycle and essentially finish the console game before even starting the PC port.
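The alternating-column trick from point 2 can be sketched in a few lines. This is purely a toy illustration of the idea, not Guerrilla's actual implementation; all function names and values here are made up:

```python
# Toy sketch of alternating-column reprojection: each frame only renders
# every other pixel column (so a 1920-wide image costs 960 columns), and
# the missing columns are reused from the previous frame. Names and
# numbers are illustrative only.

def render_half_columns(full_width, frame_index, render_column):
    """Render only every other column: even columns on even frames,
    odd columns on odd frames. Returns {column: pixel_value}."""
    offset = frame_index % 2
    return {x: render_column(x) for x in range(offset, full_width, 2)}

def composite(current, previous, full_width, missing_default=0):
    """Merge freshly rendered columns with stale ones from the previous
    frame. Columns never rendered yet fall back to a default."""
    out = []
    for x in range(full_width):
        if x in current:
            out.append(current[x])        # fresh this frame
        elif previous is not None and x in previous:
            out.append(previous[x])       # reused from last frame (stale)
        else:
            out.append(missing_default)   # never rendered yet
    return out

# Toy "scene": a column's pixel value is just its index.
scene = lambda x: x

frame0 = render_half_columns(8, 0, scene)   # renders columns 0,2,4,6
frame1 = render_half_columns(8, 1, scene)   # renders columns 1,3,5,7
image = composite(frame1, frame0, 8)
# After two frames, every column has been filled at least once.
```

The blurriness follows directly from the model: in any frame, half the columns are a frame old, so anything in motion shows a comb of stale and fresh columns that the game has to smear together.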
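And on point 4, here's a toy model (with completely made-up numbers) of why forcing SLI can lose to a single card: under alternate frame rendering, two GPUs only overlap the work that's independent between frames, and any serialized per-frame work (driver sync, inter-frame data dependencies in the engine) is paid in full on every frame:

```python
# Toy AFR scaling model, made-up numbers: two GPUs alternate frames, but
# work that must be serialized between frames (sync_ms) cannot overlap,
# so effective throughput is less than 2x one GPU, and can even be
# worse than one GPU if serialization dominates.

def afr_fps(frame_ms, sync_ms, gpus=2):
    """Frames per second under alternate frame rendering.
    frame_ms: per-frame GPU work that CAN overlap across GPUs.
    sync_ms:  per-frame work that must happen serially."""
    # In steady state a new frame completes every
    # (frame_ms / gpus + sync_ms) milliseconds.
    return 1000.0 / (frame_ms / gpus + sync_ms)

single    = 1000.0 / (16.0 + 1.0)   # one GPU, ~59 fps
dual_good = afr_fps(16.0, 1.0)      # little serialization, ~111 fps
dual_bad  = afr_fps(16.0, 12.0)     # heavy serialization, ~50 fps:
                                    # both GPUs busy, yet SLOWER than one
```

That matches the Deus Ex behavior in point 4: both cards showing 60-70% usage while delivering fewer frames than one card at 99%, until the profile work reduced the serialized portion.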
--------------------------------------------------------------------------------------------------------------------------------------------------------
After all this, you may think I'm defending the right for such things to happen. I'm not. Not even close. I *AM*, however, trying to shed light on things you probably don't know or realize. I hate when PC gets the back burner for everything. I hate when games get rushed and their potential is wasted on broken graphics or mechanics/imbalances. I hate when games come out late on PC and offer almost no benefits over their console counterparts aside from better graphics (like GTA 4, which had a WHOLE YEAR and still released as one of the most broken titles, optimization-wise, TO DATE on PC). All of this needs to change, but honestly, there's nothing I can do about it. If I were famous and huge in the gaming community, I would probably tell people these things and get them to vote with their wallets, so to speak. Someone like PewDiePie could probably do a lot of good if he'd open his mouth about it and give publishers a HUGE slap in the face. But all I can do is explain what things are really like, and hell, I don't even know everything. Not even close. I do know enough, though, to explain some things to others who may not know enough and make misguided statements. You gotta know both sides of the story, yeah?
That being said, ESO can go jump off a bridge for not being ready by now. Seriously, the game was in beta for over a year, for crying out loud, and from what I've heard it still released with tons of issues. Anyway, I just wanted to try to explain away some of the things you all are thinking. I probably sounded really harsh, but AT NO TIME DID I MEAN TO BELITTLE ANYBODY. -
Wow man, I didn't even come close to finishing your dissertation, but yeah, the game and driver developers really need to step up their game in terms of multi-GPU support. I mean, it's mind-boggling how utterly broken the default SLI profile for PlanetSide 2, a marquee Nvidia TWIMTBP ("The Way It's Meant To Be Played") title, is 1.5 years after its release. I'm forced to use compatibility bits from another game to get better performance/scaling and prevent severe artifacting.
Official bits:
Unofficial bits: -
Yeah. I actually wrote it all out then decided nobody'd actually read it all. But hey, sometimes I like to write books.
Anyway, as for the graphical issues you get in PS2, I've NEVER had that. Granted, I have 780Ms and you have 650Ms, but the last time I played the game (I think sometime near January? It was a while ago, but not too far in the past) I had perfect performance, with pretty much 150+ fps at maximum graphics settings. But yes, as I did point out, developers need to (and hopefully soon will) stop treating PC games as an afterthought, but until then... sometimes we just gotta grin and bear it. And trust me, I know the feeling. -
I too can get over 100 FPS standing around at the Warp Gate but anybody who says they can pull that off in the heat of some 48+ vs. 48+ vs. 48+ three-way inside a Bio Lab is lying through his teeth. There are no laptops, let alone desktops, on this planet which can pull that off, not with the extreme demands of this game and not with its current state of optimization. Doesn't matter how beast of a rig you have, anybody would be lucky to never drop below 60 FPS in LagSide 2. -
Can't say I ran into any 48 v 48 v 48 three-ways; I never played the game that much. My friends were all "HEY LET'S PLAY IT," then POOF GOES PUFF THE MAGIC DISAPPEARING DRAGON and they vanished without ever touching it again. I did get into some 20 v 20 before, trying to take a base while my team jumped off a mountainside. No idea where the flying moo it was. But I never dropped below 100 fps when I DID play. Good to know not to reinstall it when I get my laptop back, though.
New Games That Lack SLI and XFire Support
Discussion in 'Gaming (Software and Graphics Cards)' started by LanceAvion, May 4, 2014.