The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    SLI - second card

    Discussion in 'Gaming (Software and Graphics Cards)' started by Penchaud, Dec 2, 2018.

  1. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    Hi maties.

    Please pardon my ignorance, but is it true that in an SLI configuration the second card will never offer 100% of the performance of the first card?

    I thought these "bandwidth" adapters were supposed to remedy this weakness?
     
  2. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    This is true, because support for SLI is pretty bad. Only benchmark applications seem to have optimized SLI support. You will often see an SLI setup beat a single-card setup in benchmarks such as Fire Strike, but in actual games the single card can perform almost twice as well.
     
    Dennismungai likes this.
  3. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    I see, so the limitation is on the software side, not so much the hardware?
     
  4. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Pretty much, and before you try to tinker, it's basically not fixable. As you can see from Fire Strike scores, the performance is pretty much double, but when playing games you never get 100% more performance.

    Some games run worse in SLI than on a single card.
     
  5. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    I wonder if developers will ever fully support it.
     
  6. Danishblunt

    Danishblunt Guest

    Reputations:
    0
    Nope. Seeing how AMD and NVIDIA are giving less and less multi-GPU support, it's pretty clear that it won't ever be a thing.
     
  7. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Micro stuttering and other anomalies have been a problem ever since NVIDIA re-branded SLI after their acquisition of 3dfx's assets, which was more than a decade ago. It's long past time for the technology to be put out to pasture.
     
  8. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    SLI used to work well during the 400/500/600/700 series of cards. Then support slowly started to go down and single GPU was pushed more.

    Each generation after 700 series seemed to have less SLI support.

    I've had GTX 295, 480 2/3-way SLI, 470 SLI, 690, 580M SLI, 680M SLI, 780M SLI, 980 SLI, 7970M crossfire (I know it's AMD), and I must say almost every game at the time that supported SLI or added support later, seemed to work fine for me.

    Doesn't mean there weren't issues, but compared to today, SLI used to be very good. I would get really good scaling in some games, 90%+.
     
  9. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I agree, SLI was in a much better state before the current-gen consoles came out. The weaksauce hardware in those machines necessitated the development of mGPU-unfriendly rendering techniques to squeeze the most performance out of them, which had a direct negative effect on mGPU in the PC space since consoles drive multiplatform AAA development.
     
    ssj92 likes this.
  10. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Another problem is that the SLI tech itself has not kept up with the advances in display tech like G-Sync, 4K/5K, 1440p144Hz+, ultrawide/surround, HDR, etc. We needed NvLink 5 years ago.
     
    saturnotaku and ssj92 like this.
  11. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    Thanks for the information, guys.

    One more question - what are these SLI adapters for? What do they do or help with?
     
  12. LanceAvion

    LanceAvion Notebook Deity

    Reputations:
    131
    Messages:
    736
    Likes Received:
    169
    Trophy Points:
    56
    *Sigh* The only computer I've had with SLI is my trusty old y410p laptop. Just using this over the years I have to concur with what others have been saying. Older games from ~2014 and prior scale very well with SLI. Newer ones either don't scale at all, scale poorly, or have some other issue/glitch hindering performance.

    It's a shame really, but it's time I updated my laptop anyways so this is where I'll say goodbye and good riddance to SLI for now.
     
  13. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    Does NVLink make any difference or will it in the future?
     
  14. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    Highly unlikely. This video is about Crossfire, but much of the analysis can apply to NVIDIA as well. Basically, multi-GPU is only kind of, sort of worth it if you're doing two 2080 Ti cards, and you have more money than sense.

     
    Starlight5 likes this.
  15. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    That's me.

    I've seen YouTube videos where NVLink with the 2080 Ti made quite a bit of a difference.
     
  16. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    Seems to make quite a bit of difference watching this video.

     
  17. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    NvLink fixes the bandwidth problem, but is still contingent on games having SLI support in the first place.
     
  18. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,707
    Trophy Points:
    431
    In all but one game you're seeing, at most, a 50% performance improvement while you're spending another 100% for the second graphics card.

    It's your money. Feel free to waste it as you wish.
     
  19. Awhispersecho

    Awhispersecho Notebook Evangelist

    Reputations:
    66
    Messages:
    355
    Likes Received:
    306
    Trophy Points:
    76
    It amazes me how many people have a negative opinion of SLI. I have had 3 systems with SLI and I was thankful to have SLI in every one of them. My current 980M SLI works fantastic and makes a huge difference. Going from 55-60 FPS to 90-100 FPS in Battlefield 1 is huge. Same with COD titles and many, many others. Hell, even using forced SLI through Inspector in games like Anno 2205 to go from 40-55 FPS to 60-80 FPS makes a world of difference.

    I love SLI. Is it a 100% increase in performance? No. But it makes a huge difference and can be the difference in whether a game is playable or not. Most triple-A games still support it, and the narrative that it is dead is simply not true. Going forward, support will probably die down. But unless you haven't bought any games in the last 5 years, chances are many of the games you own support SLI and would benefit greatly from it.
     
  20. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,216
    Likes Received:
    741
    Trophy Points:
    131
    They should have made it modular from the start and could still do it now with PCIe 5.0; there is nothing preventing a solution whereby two cards (or more) are controlled as one unit. In this way you could build a system with one card and later on pop in a second when you've saved up enough ching. Think RAID but with graphics cards.
     
    Starlight5 and Awhispersecho like this.
  21. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    I am going to wait another generation and see if NVLink is further supported.

    Who knows, perhaps we'll see a bigger performance gain from the second card in some of these games?
     
  22. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    The RTX Titan seems to have the better version of NVLink inherited from the Quadro line; it adds memory as well thanks to its 50 GB/s transfers.

    It will be very interesting to see, when reviewers get their cards, whether the VRAM does add up. I mean, it already has 24GB, but it's just to see. :p
     
  23. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    I'm interested to know how it performs against the 2080Ti. :)

    Any release date yet for it?
     
  24. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Not yet. Expect around a 10-15% performance increase on average.
     
  25. Penchaud

    Penchaud Notebook Consultant

    Reputations:
    154
    Messages:
    279
    Likes Received:
    32
    Trophy Points:
    41
    Looks like you were right.
     
    ssj92 likes this.
  26. Ryan_S

    Ryan_S Notebook Enthusiast

    Reputations:
    7
    Messages:
    17
    Likes Received:
    14
    Trophy Points:
    6
    Forgive my ignorance, but it seems like most SLI configurations use an alternate frame rendering technique. How come on high resolution displays they don't each take half the frame? That would seem to help 4k gaming quite a bit.
     
  27. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Because AFR has the potential for much higher scaling than SFR.
     
  28. Ryan_S

    Ryan_S Notebook Enthusiast

    Reputations:
    7
    Messages:
    17
    Likes Received:
    14
    Trophy Points:
    6
    Basing something off of potential in the far off future doesn't seem wise when you are looking for results now. Can you elaborate on why it's better?
     
  29. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It's simple really. Having each GPU render alternate complete frames, AFR, results in more efficient workload distribution, and in turn better parallel utilization of the GPUs. With SFR, each frame is split into a top half and a bottom half, and each half is assigned to one GPU, but you can't guarantee that each half has the same complexity within a scene. For example, the top half might be mostly empty sky, or the bottom half might be mostly bare ground. The GPU with the less strenuous portion will be underutilized and limited by the GPU with the more strenuous portion. This means that scaling in SFR is highly variable moment-to-moment since the workload is never perfectly balanced between the two halves at all times, and is generally much lower than AFR as well as being limited to 2-way SLI, while AFR can have nearly perfect scaling up to 4-way SLI.

    Despite its much higher scaling, AFR can however suffer from microstutter due to frame time variance between alternate frames. In turn, SFR can have a visible tear line or seam right across the middle of the screen where the two halves are "stitched" together.

    AFR vs. SFR is a moot point these days though. SFR has been extinct, and AFR has been the de facto SLI rendering method, for well over a decade now. My understanding is that this is not only due to AFR having much better scaling, but also better compatibility with more modern graphics engines and rendering pipelines. All the games I've tested which utilize SFR are very old games, most of them OpenGL-based and don't even support programmable shaders, to give you an idea of how ancient they are.
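
    To make the half-splitting argument above concrete, here's a minimal Python sketch with invented per-half render costs (nothing here reflects actual driver behavior): a static 50/50 SFR split is gated by the slower half, while AFR's interleaving of whole frames averages the cost across both GPUs.

```python
# Toy comparison (made-up millisecond costs) of SFR vs. AFR scaling when the
# two halves of the screen have very different complexity.

def sfr_frame_time(top_ms, bottom_ms):
    # One GPU per half; the frame finishes only when the slower half does.
    return max(top_ms, bottom_ms)

def afr_frame_time(top_ms, bottom_ms, gpus=2):
    # Whole frames alternate between GPUs, so the effective per-frame time
    # approaches (single-GPU frame time) / (number of GPUs).
    return (top_ms + bottom_ms) / gpus

top, bottom = 4.0, 12.0              # cheap empty sky on top, dense geometry below
single = top + bottom                # one GPU renders the whole frame: 16 ms

print(f"single GPU: {single:.1f} ms")
print(f"SFR 50/50 : {sfr_frame_time(top, bottom):.1f} ms "
      f"({single / sfr_frame_time(top, bottom):.2f}x scaling)")
print(f"AFR 2-way : {afr_frame_time(top, bottom):.1f} ms "
      f"({single / afr_frame_time(top, bottom):.2f}x scaling)")
```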
     
    Last edited: Dec 26, 2018
    Awhispersecho likes this.
  30. Ryan_S

    Ryan_S Notebook Enthusiast

    Reputations:
    7
    Messages:
    17
    Likes Received:
    14
    Trophy Points:
    6
    Thanks, that's a much better explanation. I think I've read somewhere that SFR can be made better by figuring out approximately where half the complexity of the frame is, and assigning half the complexity rather than half the frame to each card (though I get this would never be perfect).
    Wouldn't GSync (or equivalent technology) eliminate the tear line, as it would wait for both graphics cards to be done before displaying the frame?

    It seems to me that AFR's main limitation (besides microstuttering) would be having to wait for input. Otherwise, if you knew each frame in advance (say for a movie), I'm not sure why it wouldn't scale all the way up to 1 graphics card per frame, rather than just 4 total.
     
  31. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    I think I’ve tested just about every single game that uses SFR according to Nvidia Profile Inspector (tbh it’s not a massive amount, they’re all old games like I said, so I either already had them for a long time or bought them cheaply on sale), and not a single one of them had anything but a static 50/50 split across the middle. I also tested manually forcing SFR in some old games that didn’t support it by default, and again it was always a perfect 50/50. It was easy to tell that it was 50/50 because taking screenshots would almost always result in half the image being blacked out, plus I could observe the shifting in GPU utilization when looking up and down. The seam wasn’t in every game or always visible, but VSync did not fix it (although it did eliminate tearing in other parts of the screen). My test system didn’t have G-Sync, but I’d imagine if VSync didn’t work then G-Sync probably wouldn’t either.

    Even AFR has been officially limited to 2-way since Pascal (2016). 3- and 4-way were always more problematic and made microstutter worse. SLI support in games has been pretty spotty in the last 5 years as well, with DX12/Vulkan putting the onus of SLI support on devs rather than on Nvidia, the increasing number of games/engines using non-AFR-friendly rendering techniques, and the stagnation of the PCIe spec resulting in inter-GPU bandwidth not keeping up with the increased bandwidth demand from said AFR-unfriendly engines and from advances in display tech like higher resolutions and refresh rates, G-Sync, and HDR, although NvLink finally fixes the bandwidth issue.

    Another thing is that AFR increases input lag by 1000(n-1)/FPS milliseconds, where n is the number of GPUs in SLI, compared to single GPU at the same FPS. This is inherently due to AFR needing a longer pre-render queue in order to scale efficiently, and the queue increases with additional GPUs. The input lag penalty can be offset by the higher frame rate in SLI, but this means that at best, SLI does not make input lag worse, instead of reducing it.
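
    As a quick worked example of that latency formula (the FPS figures below are hypothetical), a small Python check:

```python
# Quick sanity check of the formula above: extra input lag from AFR is
# 1000 * (n - 1) / fps milliseconds versus a single GPU at the same FPS.
# The FPS figures here are hypothetical.

def afr_lag_penalty_ms(n_gpus, fps):
    return 1000.0 * (n_gpus - 1) / fps

# 2-way SLI at 60 FPS: one extra queued frame, ~16.7 ms more lag than a
# single GPU that is also running at 60 FPS.
print(f"{afr_lag_penalty_ms(2, 60):.1f} ms penalty at 60 FPS")

# With perfect 2x scaling (60 -> 120 FPS) the penalty drops to ~8.3 ms, which is
# exactly what the shorter frame time gives back -- so at best, net input lag is
# unchanged rather than reduced.
print(f"{afr_lag_penalty_ms(2, 120):.1f} ms penalty at 120 FPS")
```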
     
    Last edited: Dec 26, 2018
  32. Ryan_S

    Ryan_S Notebook Enthusiast

    Reputations:
    7
    Messages:
    17
    Likes Received:
    14
    Trophy Points:
    6
    I read it as part of a Wikipedia article.

    Of course, just because it's theoretically possible doesn't mean it's ever been implemented.

    Other than the bandwidth problem you mentioned, why would they choose to use AFR-unfriendly techniques? Is it faster performance, or just easier to program?

    This reminds me of pipelining in CPUs. I wonder how similar the techniques are.
     
    JRE84 likes this.
  33. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Because re-using data from previous frames is a smart way to increase visual quality while reducing computational cost. It’s an easy win for consoles and single GPU, but collapses scaling in AFR and greatly increases bandwidth requirement between GPUs.
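
    A rough back-of-the-envelope sketch of why that reuse is so expensive across GPUs; the buffer size, frame rate, and bridge bandwidth below are assumed ballpark numbers, not measurements:

```python
# Back-of-the-envelope estimate (all figures are assumptions, not measurements)
# of the inter-GPU traffic created when every AFR frame reads a buffer rendered
# by the other GPU on the previous frame.

buffer_mb = 33.0      # e.g. one 4K RGBA8 render target: 3840 * 2160 * 4 bytes
fps = 120             # hypothetical frame rate in 2-way AFR
traffic_gb_s = buffer_mb * fps / 1024

print(f"~{traffic_gb_s:.1f} GB/s of bridge traffic for a single reused buffer")
# A few GB/s for just one dependency is a big chunk of a legacy SLI bridge
# (roughly 1-2 GB/s), though comfortably within the ~50 GB/s NvLink figure
# mentioned earlier in the thread.
```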
     
    JRE84 and Ryan_S like this.
  34. Ryan_S

    Ryan_S Notebook Enthusiast

    Reputations:
    7
    Messages:
    17
    Likes Received:
    14
    Trophy Points:
    6
    That makes sense. It would seem to not be as big an issue in SFR, as each graphics card would be working on each frame. You could even use the previous frame to "guess" where the dividing line should be, and move it each frame based on the previous frame's actual complexity.
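
    As a purely hypothetical sketch of that idea (no driver exposes anything like this; the scene costs and step size are invented), the split line could be nudged each frame based on the previous frame's per-GPU render times:

```python
# Hypothetical adaptive SFR balancer (nothing like this ships in any driver):
# nudge the horizontal split line each frame based on which GPU took longer on
# the previous frame, until the two render times converge.

def adjust_split(split, top_ms, bottom_ms, step=0.02):
    """'split' is the fraction of screen height assigned to the top GPU."""
    if top_ms > bottom_ms:
        split -= step                    # top GPU overloaded: shrink its share
    elif bottom_ms > top_ms:
        split += step                    # bottom GPU overloaded: grow its share
    return min(max(split, 0.1), 0.9)     # keep the split within sane bounds

def render_times(split, frame_ms=16.0):
    # Fake scene: the bottom of the screen is 3x as expensive per line as the top.
    return frame_ms * split * 0.5, frame_ms * (1.0 - split) * 1.5

split = 0.5                              # start from the naive 50/50 split
for _ in range(30):
    top, bottom = render_times(split)
    split = adjust_split(split, top, bottom)

top, bottom = render_times(split)
print(f"converged split: {split:.2f} -> {top:.1f} ms / {bottom:.1f} ms per GPU")
```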
     
    JRE84 likes this.
  35. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    It should be possible with low-level APIs like DX12 and Vulkan, but so far nobody has bothered.
     
    Ryan_S and JRE84 like this.