Hi mateys.
Please pardon my ignorance, but is it true that in an SLI configuration the second card never offers 100% of the first card's performance?
I thought these "bandwidth" adapters were supposed to remedy this weakness?
-
This is true, because support for SLI is pretty bad. Only benchmark applications seem to have optimized SLI support. You will often see an SLI setup beat a single-card setup in benchmarks such as Fire Strike, but in actual games the single card can perform almost twice as well.
-
some games run worse on SLI than single card. -
saturnotaku Notebook Nobel Laureate
Micro stuttering and other anomalies have been a problem ever since NVIDIA re-branded SLI after their acquisition of 3dfx's assets, which was more than a decade ago. It's long past time for the technology to be put out to pasture.
-
SLI used to work well during 400/500/600/700 series cards. Then slowly support started to go down and single GPU was pushed more.
Each generation after 700 series seemed to have less SLI support.
I've had GTX 295, 480 2/3-way SLI, 470 SLI, 690, 580M SLI, 680M SLI, 780M SLI, 980 SLI, 7970M crossfire (I know it's AMD), and I must say almost every game at the time that supported SLI or added support later, seemed to work fine for me.
Doesn't mean there weren't issues, but compared to today, SLI used to be very good. I would get really good scaling, 90+% in some games. -
yrekabakery Notebook Virtuoso
I agree, SLI was in a much better state before the current-gen consoles came out. The weaksauce hardware in those machines necessitated the development of mGPU-unfriendly rendering techniques to squeeze the most performance out of them, which had a direct negative effect on mGPU in the PC space since consoles drive multiplatform AAA development.
-
yrekabakery Notebook Virtuoso
Another problem is that the SLI tech itself has not kept up with the advances in display tech like G-Sync, 4K/5K, 1440p144Hz+, ultrawide/surround, HDR, etc. We needed NvLink 5 years ago.
-
Thanks for the information guys.
One more question - what are these SLI adapters for? What do they do or help with? -
*Sigh* The only computer I've had with SLI is my trusty old y410p laptop. Just using this over the years I have to concur with what others have been saying. Older games from ~2014 and prior scale very well with SLI. Newer ones either don't scale at all, scale poorly, or have some other issue/glitch hindering performance.
It's a shame really, but it's time I updated my laptop anyways so this is where I'll say goodbye and good riddance to SLI for now. -
Does NVLink make any difference or will it in the future?
-
saturnotaku Notebook Nobel Laureate
-
I've seen YouTube videos where NVLink with the 2080 Ti has made quite a bit of a difference. -
yrekabakery Notebook Virtuoso
NvLink fixes the bandwidth problem, but is still contingent on games having SLI support in the first place.
-
saturnotaku Notebook Nobel Laureate
It's your money. Feel free to waste it as you wish. -
Awhispersecho Notebook Evangelist
It amazes me how many people have a negative opinion of SLI. I have had 3 systems with SLI and I was thankful to have SLI in every one of them. My current 980M SLI works fantastic and makes a huge difference. Going from 55-60 FPS to 90-100 FPS in Battlefield 1 is huge. Same with COD titles and many, many others. Hell, even using forced SLI through Inspector in games like Anno 2205 to go from 40-55 FPS to 60-80 FPS makes a world of difference.
I love SLI. Is it a 100% increase in performance? No. But it makes a huge difference and can be the difference in whether a game is playable or not. Most triple-A games still support it, and the narrative that it is dead is simply not true. Going forward, support will probably die down. But unless you haven't bought any games in the last 5 years, chances are many of the games you own support SLI and would benefit greatly from it. -
They should have made it modular from the start and could still do it now with PCIe 5.0; there is nothing preventing a solution whereby two cards (or more) are controlled as one unit. In this way you could build a system with one card and later on pop in a second when you've saved up enough ching. Think RAID but with graphics cards.
-
I am going to wait another generation and see if NVLink is further supported.
Who knows, perhaps we'll see a bigger performance gain from the second card in some of these games? -
The RTX Titan seems to have the better version of NVLink inherited from the Quadro line; it can pool memory as well thanks to its 50GB/s transfers.
It would be very interesting to see, once reviewers get their cards, if the VRAM does add up. I mean, it already has 24GB, but it would be good to know. -
Any release date yet for it? -
Forgive my ignorance, but it seems like most SLI configurations use an alternate frame rendering technique. How come on high-resolution displays each card doesn't take half the frame? That would seem to help 4K gaming quite a bit.
-
yrekabakery Notebook Virtuoso
-
Basing something off of potential in the far off future doesn't seem wise when you are looking for results now. Can you elaborate on why it's better?
yrekabakery Notebook Virtuoso
Despite its much higher scaling, AFR can suffer from microstutter due to frame time variance between alternate frames. In turn, SFR can have a visible tear line or seam right across the middle of the screen where the two halves are "stitched" together.
AFR vs. SFR is a moot point these days though. SFR has been extinct, and AFR has been the de facto SLI rendering method, for well over a decade now. My understanding is that this is not only due to AFR having much better scaling, but also better compatibility with more modern graphics engines and rendering pipelines. All the games I've tested which utilize SFR are very old, most of them OpenGL-based and without support for programmable shaders, to give you an idea of how ancient they are.
Last edited: Dec 26, 2018 -
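To make the AFR/SFR distinction above concrete, here is a minimal illustrative sketch (not real driver code; the function names and the 2-GPU setup are just for demonstration) of how the two modes divide work between GPUs:

```python
# Illustrative only: AFR hands each GPU whole frames in turn,
# while SFR splits every frame's scanlines between the GPUs.

def afr_assignment(num_frames, num_gpus=2):
    """Alternate Frame Rendering: map each frame index to the GPU that renders it."""
    return {f: f % num_gpus for f in range(num_frames)}

def sfr_assignment(frame_height, num_gpus=2):
    """Split Frame Rendering: map each GPU to the (first_row, last_row) slice it renders."""
    rows_per_gpu = frame_height // num_gpus
    return {g: (g * rows_per_gpu, (g + 1) * rows_per_gpu - 1)
            for g in range(num_gpus)}

# AFR: frames 0, 2, 4 go to GPU 0; frames 1, 3, 5 go to GPU 1.
print(afr_assignment(6))     # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
# SFR on a 2160-row (4K) frame: GPU 0 renders rows 0-1079, GPU 1 rows 1080-2159.
print(sfr_assignment(2160))  # {0: (0, 1079), 1: (1080, 2159)}
```

The row boundary in `sfr_assignment` is where the seam described above can appear; real SFR implementations also rebalanced the split dynamically, which this sketch omits. -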
Wouldn't GSync (or equivalent technology) eliminate the tear line, as it would wait for both graphics cards to be done before displaying the frame?
It seems to me that AFR's main limitation (besides microstuttering) would be having to wait for input. Otherwise, if you knew each frame in advance (say for a movie), I'm not sure why it wouldn't scale all the way up to 1 graphics card per frame, rather than just 4 total. -
yrekabakery Notebook Virtuoso
Even AFR has been officially limited to 2-way since Pascal (2016). 3- and 4-way were always more problematic and made microstutter worse. SLI support in games has been pretty spotty in the last 5 years as well, with DX12/Vulkan putting the onus of SLI support on devs rather than on Nvidia, the increasing number of games/engines using non-AFR-friendly rendering techniques, and the stagnation of the PCIe spec resulting in inter-GPU bandwidth not keeping up with demand, both from said AFR-unfriendly engines and from advances in display tech like higher resolutions and refresh rates, G-Sync, and HDR. NvLink finally fixes the bandwidth issue, though.
Another thing is that AFR increases input lag by 1000(n-1)/FPS milliseconds, where n is the number of GPUs in SLI, compared to a single GPU at the same FPS. This is inherent to AFR needing a longer pre-render queue in order to scale efficiently, and the queue grows with additional GPUs. The input lag penalty can be offset by the higher frame rate in SLI, but this means that at best, SLI does not make input lag worse, instead of reducing it.
Last edited: Dec 26, 2018 -
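The input lag penalty formula from the post above can be plugged into a quick sketch (the function name is made up for illustration):

```python
# Sketch of the AFR input lag penalty from the post:
# extra_lag_ms = 1000 * (n - 1) / FPS, for n GPUs at a given frame rate.

def afr_extra_input_lag_ms(num_gpus, fps):
    """Added input lag (ms) of n-way AFR vs. a single GPU at the same FPS."""
    return 1000.0 * (num_gpus - 1) / fps

# 2-way SLI at 60 FPS adds one frame-time of lag:
print(afr_extra_input_lag_ms(2, 60))   # ~16.67 ms
# If SLI doubles the frame rate (60 -> 120 FPS), the penalty halves:
print(afr_extra_input_lag_ms(2, 120))  # ~8.33 ms
# 4-way at 60 FPS is three frame-times, one reason >2-way felt worse:
print(afr_extra_input_lag_ms(4, 60))   # ~50 ms
```

This shows the "at best it breaks even" point: the extra queue lag shrinks exactly as fast as the frame rate grows. -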
-
SLI - second card
Discussion in 'Gaming (Software and Graphics Cards)' started by Penchaud, Dec 2, 2018.