Approximately a year ago I jumped on the bandwagon and built myself a PC with two GTX 780 Tis in SLI and a 4K monitor. Now, I realize this is a notebook forum, but I imagine you guys are having the same issues as me.
My problem with SLI:
1) Still not enough power to run most games at 4K with full graphics at 40+ FPS.
2) Almost every game has some issue with SLI (such as texture flickering) until it has received many patches.
3) Some games actually perform worse on SLI setups.
I have now decided to purchase an MSI GS30 laptop, pull out one of my 780 Tis, and give my other PC to my girlfriend.
I will then wait a year or so until they release a beefy new single card that can handle 4K+ gaming.
Anyone else in the same boat?
-
My next laptop and desktop will both sport a single AMD GPU because multi-GPU has too many issues on both sides and Nvidia has disabled overclocking for its mobile cards.
-
After owning a Sager 9150 with a 7970m, I will never touch an AMD product again. Anyone who had experience with that card likely knows what I am talking about....
Even if they are better now, I am very brand loyal when it comes to **** like that... -
However, I don't use a desktop. So I need all the power I can get. For users like me, SLI is the only way to get desktop-like performance. -
However, if you're a guy who likes to play games on release day, or *cough* *cough* PIRATE games... it's seriously a pain in the ass. At best a 50% success rate.
Personally, I've come to the conclusion that it's best to go back to 1080p/1440p gaming and use a single graphics card... I hope for a giant leap forward in single desktop card performance soon.
Remember, mobile video cards are not that far behind desktop cards anymore, especially if you don't overclock. -
Meaker@Sager Company Representative
I have no other option if I want more than 1440p at 60FPS. I really want my games at 100fps or more.
-
-
-
-
Quagmire LXIX Have Laptop, Will Travel!
I do understand how that can sour a person, though. -
-
We can agree to disagree.
Lol @ thinking the 780 series would be good for 4K. C'mon man. Really? -
Getawayfrommelucas Notebook Evangelist
Current tech + 2 years is my rule of thumb. 4K gaming has become more common since... last year? idk - I'm behind the times. But for me, 4K has been around for a year. So 2016 will be the year for me to upgrade!
-
killkenny1 Too weird to live, too rare to die.
-
-
2) Yup. SLI is a pain in the butt, even when it works well. That's why every system builder will always recommend a single powerful graphics card whenever possible.
BTW, the GeForce GTX 980 is a single card that does decently well at 4K resolutions. But the real solution won't be GPU power... the real solution will be variable refresh rate (VRR) sync technologies, like Adaptive Sync or G-Sync. The "sweet spot" for those VRR technologies is between 40fps - 60fps, which is right in the range of framerates that current single GPUs can handle. -
Well I haven't had issues yet, but I can understand the frustration. I always went with single GPU but the GT80 was too tempting for me.
I knew 4K was a challenge when I decided to try running Dark Souls 2 at 4K, and found even that challenging enough for my single 780M, hovering around 24fps. And it is not exactly a graphically heavy game.
As others have said, I expect the next gen to address the 4K issue, since that resolution is only just becoming the new thing. Current GPUs can handle 4K just fine with reduced settings; maxing is out of the question. -
-
Handling 40FPS with Vsync is a problem because it forces you to choose between Vsync on (smooth visual image, but heavy input lag) and Vsync off (minimal input lag, but visible tearing). The way around this was for people to try to get games to run at a minimum of 60fps whenever possible.
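A toy model of that trade-off (my own sketch, not from this thread): on a fixed 60Hz panel with Vsync on, a frame that misses a vblank has to sit and wait for the next one, adding latency on top of the render time, while a variable-refresh panel would scan the frame out the moment it's ready.

```python
# Toy model: extra latency Vsync adds when a 40fps game runs on a
# fixed 60Hz display, versus a VRR (G-Sync/Adaptive Sync) display
# that scans out each frame as soon as it finishes rendering.

REFRESH_60HZ = 1000 / 60       # ~16.7 ms between vblanks on a fixed panel
FRAME_TIME_40FPS = 1000 / 40   # 25 ms to render each frame at 40fps

def vsync_wait(finish_ms, refresh_ms=REFRESH_60HZ):
    """Ms a finished frame waits for the next vblank on a fixed display.

    A VRR display would return 0 here: the frame is shown immediately.
    """
    return (refresh_ms - finish_ms % refresh_ms) % refresh_ms

# Frames finish at 25ms, 50ms, 75ms, ...; Vsync holds each one until
# the next 16.7ms refresh boundary, VRR shows them all with zero wait.
waits = [vsync_wait(n * FRAME_TIME_40FPS) for n in range(1, 7)]
print([round(w, 1) for w in waits])
```

The point the post makes falls out of the numbers: the wait is never constant at 40fps on a 60Hz panel, which is exactly the judder/lag VRR removes.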
Variable refresh rates give you the best of both worlds: minimal input lag and no screen tearing. So it really doesn't matter whether your framerate is 40fps, 50fps, 60fps, 70fps, etc. -
For the absolutely last time (not), CURRENT. TECHNOLOGY. CANNOT. SUSTAIN. NEW. AND. UNOPTIMIZED. GAMES. AT. 4K. RESOLUTION. AT. ACCEPTABLE. FRAMERATES. WITHOUT. MAKING. USE. OF. AT. LEAST. THREE-WAY. SLI.
It will not happen now. It will not happen in two years. It will probably happen in three or four years. When GPUs get so powerful that a single top of the line GPU can play every single game on the market at 1080p 120fps or higher, we will be ready for 4K.
It doesn't matter how strong GPUs get. It doesn't matter if games barely improve graphically over the course of three or four years. The fact of the matter is, games are optimized for the lowest common denominator. With EXTREMELY weak last-gen hardware, memory/CPU/GPU limitations meant devs couldn't build the kinds of worlds/games/etc. people wanted. Now those limits are effectively gone, and devs are using every tiny inch of the new headroom without bothering with compression. If a game's primary design target is a current-gen console, its PC version is likely going to use much more memory and video RAM than necessary... because devs went from 512MB limits to ~6GB limits, and raising things like texture resolution (not shadowmap resolution OR texture quality; those are different) is relatively free in terms of performance hits, once you have the memory for it.
Then, when the "minimum" PC cards being designed for become console-level or higher (in general, assume a GTX 660) thanks to Maxwell and Pascal being one and two generations newer than Kepler, you're gonna see games that look similar to stuff that's been out for a couple of years but just run worse, because there's no need to design for 10-year-old hardware anymore. Easy example: games out right now that look worse than Crysis 2 (itself a 2011 title) yet run a lot worse than it, simply because they were designed with bigger hardware as their minimum denominator. Good comparison: I'd say Crysis 2 and Dying Light look fairly similar in a lot of cases, and I'd still give Crysis 2 the edge. But I could play Crysis 2 in DX9 at 1200p on ~medium on my 280M and never dip below 30fps... if I tried Dying Light on my 280M it'd roll over, burn up, curse me for making it try, then die. And the same goes for a whole lot of games released recently; there are reasons games like Watch Dogs, CoD: AW (whose only improvement over Black Ops 2 in many aspects is its lighting system), Titanfall, etc. LOOK on par with many older titles... ones that run 5x better than them (not exaggerating). Hell, I used to play Sleeping Dogs with "extreme" AA on (huge SSAA included) and dropped below 60 maybe 5% of the time on my two stock 780Ms. You think I'd be able to run Watch Dogs at max at 1080p and get such good FPS? LOL. No. Ain't happening. Far less if I tried using SSAA to force it to 3K or 4K res. You must be crazy. But that's just the way the times roll.
In short: if you expected to play the latest AAA games at 4K, ESPECIALLY with how little attention is given to PC gamers (which explains lack of good SLI/CrossfireX support) by those devs... you were pretty sorely mistaken, and you honestly should have known better.
Does this mean you can't play stuff like Sleeping Dogs/CS:GO/LoL/SC2/AC2/AC: Brotherhood/Black Ops 2/BF:BC2/Dark Souls 2/Killing Floor/Mass Effect 1, 2 and 3/Skyrim/etc at 4K? No, it doesn't mean that. In fact those games'd probably run amazingly well at 4K. Amazingly. But is that what 99% of vocal 4K supporters/enthusiasts wish to play? Nope. Nobody's fault but their own for being disappointed. -
AMD better deliver on the 390X and someone really needs to make a 1440p Freesync panel. Or at least a 1080p one.
-
[ ] SLI not worth it
[x] SLI worth it -
I rarely have issues with SLI... I won't have a laptop without it.
Sent from my Nexus 6 using Tapatalk -
-
Maybe OP needs to start thinking about tri- and quad-SLI and pray to the Nvidia gods that drivers and app support improve.
Also, AMD Hawaii cards (290/290X and 295X2) are better at 4K than Nvidia GM204 and GK110. -
I'm sold on SLI and won't do any machine with a single GPU. They don't offer enough performance. If, in the rare event, a game has serious issues with SLI and performs better with one GPU, I simply disable SLI to play that game... not a big deal if it's a game I cannot live without playing. The rest of the time I enjoy significantly greater performance. I hardly ever have any kind of serious issue with SLI. Issues for me are few and far between, and are almost always extremely minor issues.
-
I have just personally decided to go back to a single 780 Ti until the next "big break" in graphics processing.
Also, I want to be fair here: Dark Souls 2 is one of the games that does scale well with SLI. I play it maxed out at 4K at 60+ fps and it looks awesome.
Also, the idea that AA is not needed at 4K is a myth. Of course you don't "need" it for graphics quality, but it does make things look noticeably better even at 4K. I usually run at least 2x. -
So I don't disagree with what you're saying, although you're coming off a bit arrogant in this post.
To be clear, my issue isn't so much that I still can't play games at 4K with SLI as that so few games come out optimized for multi-GPU setups on release. Most modern games I try to play at 1440p, and I am happy with that. As you said, I think I am just going to wait this out for a few years and see what "next gen" provides.
I also agree that something needs to change before PC gamers are ever a priority for game developers. It will happen, though; the "next-gen" consoles have had a slow start to say the least, and they are just too antiquated to keep up with where technology is going. I sold my PS4 the other day, and when the guy came to pick it up he asked why I was selling it so cheap.
I decided to show him my gaming setup with three 4K Samsung monitors and an Oculus Rift. "Yes, I know I can't game across three 4K screens with only 780 Ti SLI." Going from a solid PC gaming setup to a console is like going from a PS4 to a legacy Game Boy lol. -
As long as 3D computer graphics in games are pixels and raster graphics, AA is needed, even if it's just FXAA if you want to be cheap about it. Doesn't matter if you have a 2K or 4K screen or whatever. Spatial aliasing may be reduced on a high-PPI screen at native res, but temporal aliasing is still noticeable.
-
1. 40FPS means about 25ms of rendering time per frame on average. That's 25ms of latency between the game's physics code and the rendered output that you can't avoid with better output control.
2. After one frame is finished, you have ~25ms of rendering time during which there's no new data for the screen to display. It's doing nothing during this time and you see nothing new.
#2 can be fixed by SLI/CF, assuming good multi-GPU scaling. #1 is always there unless the GPU is fast enough or split-frame rendering techniques (basically extinct these days) work well. -
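The arithmetic behind points #1 and #2 can be sketched out like this (my own illustration, assuming ideal 2-way alternate-frame-rendering scaling, which real SLI rarely reaches):

```python
# Why AFR SLI fixes throughput (#2) but not per-frame latency (#1):
# two GPUs overlap work on alternate frames, so frames reach the
# screen twice as often, yet each frame still renders for the full
# single-GPU frame time.

def frame_time_ms(fps):
    """Average milliseconds spent rendering one frame."""
    return 1000 / fps

single_gpu_fps = 40
render_latency = frame_time_ms(single_gpu_fps)   # 25.0 ms per frame (#1)

# Ideal 2-way AFR: output rate doubles...
afr_output_fps = 2 * single_gpu_fps              # 80 fps to the screen (#2 fixed)
# ...but each individual frame still took 25 ms to render (#1 remains).
afr_per_frame_latency = render_latency

print(render_latency, afr_output_fps, afr_per_frame_latency)
```

Split-frame rendering would attack #1 directly by having both GPUs work on the same frame, which is why the post singles it out, but as noted it's basically extinct now.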
I guarantee you that if you planned on playing older titles at 4K and had your SLI 780Ti, not only would you be perfectly happy for ages, but you wouldn't even run out of vRAM (especially if only gaming at 4K with one active screen and fullscreened) so... yeah. In general, I'm not sympathetic to you, or to any user with the same woes, as all the information was there and knowing how current games keep releasing is also there. 99.9% of PC games run fine at 4K on even a single 4GB 770. The 0.1% of PC games that don't are the ones that people keep wanting to play at 4K. Hint: It's not the games' fault.
Also, devs are the ones who need to speak to nVidia to get SLI working. When a game takes 4 months to get SLI support and it still comes out broken... that's the devs' fault. When a game launches and it already uses SLI without so much as a driver update? That's also the devs' doing. It has nothing to do with SLI in itself nor nVidia (or AMD for that matter). -
-
-
thanks for the reply
-
4K popped up out of nowhere only in the last year for laptops. Now people expect to play at 4K when the hardware has only improved incrementally. We're only now at the point where 1080p/60 as a minimum is playable with top-end hardware. 4K can go away as far as I'm concerned when it comes to gaming laptops. Give us high-quality 1920x1200 or 2560x1600 16:10 displays first; we'll worry about 4K later.
-
As I always say: I'd most likely be using a 1440p 120Hz screen if I got a desktop, and I'd need a minimum of THREE 980s or Titan Blacks to satisfy that for gaming. Hell, at the rate things are going I wouldn't even settle for two 980s at 1080p.
I'm turning into one of those guys who REALLY likes graphics turned all the way up. -
It doesn't actually work. -
I wish 1440p 60Hz was plausible, though. Perhaps a Titan Black will handle it. I always wondered: the Titan Black is a beast, but the 980 is the latest generation. Which one is better in terms of gaming?
-
-
-
Can a single GTX 980 comfortably handle 1440p gaming? Absolutely yes.
Can a single GTX 980 run 1440p with all settings maxed out, and run at 120fps? No.
But when you're talking 1440p with all settings maxed out at 120fps, you're not talking "practical." People who want this type of performance want it simply because they want "the best", and not because they want to know if a GTX 980 @ 1440p is a practical solution.
So if you have realistic expectations on gaming performance, then yes, a GTX 980 @ 1440p is enough. If you have unrealistic expectations and simply want to max out your game for the personal satisfaction of knowing you can run things at max, then no, a GTX 980 @ 1440p is not enough. -
Wrong thread.
-
This is what I use to play games... will a 720m handle it?
Imagine if you could SLI multiple PC's together that each had 4x SLI...
-
@HTWingNut I think you're looking for something like this:
-
I think it's worth it. I think it's worth it for any enthusiast. Enthusiasts are early adopters so on the desktop front, it's absolutely worth it to have whatever it takes to "almost" get any cutting edge game to run. Remember Tri-SLI 8800GTX for Crysis? That ran at maybe 30fps average. Is it 'worth it' for one game? Not to 99% of people.
Is it 'worth it' to have a cutting edge 18.4 inch super heavy lappy with two GPUs to get "close to desktop performance...on a notebook that sits on a...desk"....not for 99% of people. I am one of those crazy people. -
And the CPU.
No idea how I'm getting to norway.
Don't worry Cloudfire I love you <3
Watch your back when you sell
(I'm just sleepy and going far with jokes, don't slay me mods) -
-
SLI is NOT worth it
Discussion in 'Gaming (Software and Graphics Cards)' started by Nick11, Feb 12, 2015.