The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    No more SLI laptops?

    Discussion in 'Gaming (Software and Graphics Cards)' started by paulofeg, Jan 7, 2019.

  1. paulofeg

    paulofeg Notebook Geek

    Reputations:
    10
    Messages:
    91
    Likes Received:
    14
    Trophy Points:
    16
    I didn't see any SLI laptops :(
    It makes me sad because I only ever buy SLI laptops and only build SLI desktops.
    I realize mGPU seems to be the big craze these days, but I don't even see an option for a second card.
     
  2. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,880
    Trophy Points:
    931
    Other than in synthetic benchmarks, SLI has very minimal performance benefits these days, as games don't scale well with it. SLI is not as good as it used to be, or even beneficial at all, because game developers lack interest in optimizing their games for SLI setups. Give me a single powerful GPU and I'm a happy bunny.

    Next...
     
  3. paulofeg

    paulofeg Notebook Geek

    Reputations:
    10
    Messages:
    91
    Likes Received:
    14
    Trophy Points:
    16
    I of course respect your opinion, Ultra Male.
    However, for my needs, going from AMD 7970 CrossFireX to 980 Ti SLI, then to 1080 Ti SLI, and finally to 2080 Ti SLI has always worked for me.
    All of the major AAA games I play support SLI, and my favorite MMO does as well. I need SLI to get maximum use out of my PG27UQ.
    My laptop uses 1080 SLI, and I will be more than a bit disappointed if a next-gen laptop with 2080 SLI doesn't come out.
    I like building over-the-top PCs because it's fun, and once you turn that PC on and it purrs like a kitten, oh man, it's like a dream...
     
    Awhispersecho likes this.
  4. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    I see this statement all across the internet whenever someone mentions SLI. IMO that's not a strong argument. Many games do support SLI, and many scale very well with it. There are also non-gaming applications that benefit from having multiple GPUs.

    I prefer to game in 4K with max settings at 60+ fps. No single card can do that for the majority of games. SLI is a must.

    It almost seems to me that NVIDIA is trying to kill off SLI to prevent people from simply adding another card or two when their performance starts lacking, which would instead force them to buy newer cards and upgrade more frequently: first by eliminating SLI beyond two cards, and then with this seemingly astroturfed campaign touting the irrelevance of SLI.

    Or it could be people who can't afford an SLI setup trying to convince themselves SLI is pointless so they can coddle their ego . . .

    If the RTX 2080 were 225% more powerful than the GTX 1080, then I would concede that SLI would be less relevant. But someone who's currently using GTX 1080 SLI will have zero incentive to upgrade to a single RTX 2080 - they'd be losing ~35% of their performance.
     
    TheDantee, bennyg, Ashtrix and 3 others like this.
  5. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,629
    Messages:
    23,562
    Likes Received:
    36,880
    Trophy Points:
    931
    You're talking to a person who has owned many laptops with SLI, starting from an Alienware 18 with GTX 770 SLI, then GTX 780 SLI, 980M SLI, and GTX 1080 SLI (in my previous Clevo P870DM3). Whenever there is SLI, I'm the first one to buy it. I just don't find it beneficial anymore.

    Buy what you please; no one is holding you back.
     
  6. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    NVLink actually brute-forces the bandwidth problem encountered with SLI; whether that would physically fit in a laptop is another story.
     
  7. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    With how poor the scaling has been with SLI lately, it really wasn't worth it. You would get one game with nearly perfect scaling, and then in the next game you'd get maybe a 20% improvement over a single card, though there are plenty of games that land somewhere in the middle. I used to swear by SLI myself, but honestly, going with one powerful GPU and calling it a day has been a very pleasant experience for me.

    Also, high-end desktop replacement laptops are using dual PSUs as it is, so maybe it's also a limit on how much power they can deliver?
     
    saturnotaku likes this.
  8. paulofeg

    paulofeg Notebook Geek

    Reputations:
    10
    Messages:
    91
    Likes Received:
    14
    Trophy Points:
    16
    They told me on the Clevo forums that there is no NVLink for the 2080 on laptops, so I guess SLI is finished for laptops. What a shame. At least they still have it on desktops, but I can't begin to describe how sad I am learning it's dead on laptops. Life has victories and disappointments, I guess.
     
    hmscott likes this.
  9. NB_Neenja

    NB_Neenja Notebook Consultant

    Reputations:
    41
    Messages:
    106
    Likes Received:
    70
    Trophy Points:
    41
    SLI was never really that great in the first place.
     
  10. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Every dog has its day.
     
  11. CGornet

    CGornet Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    4
    Trophy Points:
    6
    Most don't want to spring for SLI, and it's kind of a dying tech. For me, it was never worth the expense.
     
  12. cj_miranda23

    cj_miranda23 Notebook Evangelist

    Reputations:
    334
    Messages:
    567
    Likes Received:
    537
    Trophy Points:
    106
    Let's pray and hope for NVLink technology to improve over time and be implemented in laptops.
    https://www.nvidia.com/en-us/data-center/nvlink/

    https://www.pcgamesn.com/nvidia-rtx-2080-ti-nvlink-performance

    "Essentially, the GPUs are much closer together now, and that will allow game developers to actually see the NVLink connection and use it to render their games, hopefully in a more elegant, repeatable way than with SLI.

    “That bridge is visible to games, that means that maybe there’s going to be an app that looks at one GPU and looks at another GPU and does something else.” explains Petersen. “The problem with multi-GPU computing is that the latency from one GPU to another is far away. It’s got to go across PCIe, it’s got to go through memory, it’s a real long distance from a transaction perspective.

    “NVLink fixes all that. So NVLink brings the latency of a GPU-to-GPU transfer way down. So it’s not just a bandwidth thing, it’s how close is that GPU’s memory to me. That GPU that’s across the link… I can kind of think about that as my brother rather than a distant relative.”
     
  13. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    The problem is not with SLI or NVLink bridges. The problem is with game devs seeing no point in spending extra money on properly implementing SLI/NVLink. There's a really small percentage of people who use it, so they just don't care.
     
  14. paulofeg

    paulofeg Notebook Geek

    Reputations:
    10
    Messages:
    91
    Likes Received:
    14
    Trophy Points:
    16
    I understand people's opinion on this, but whenever and wherever I discussed my support for SLI and NVLink, I was almost always attacked (not on these forums, but on other ones).
    Some people seem to have a viewpoint that if SLI didn't exist, games would function better with fewer bugs. I was almost always made to feel like a fifth wheel who shouldn't exist. You guys on this forum are respectful, but the vast majority of the community is hateful toward SLI. I guess all the SLI builds that made PC gaming famous and showed the true horsepower of PC gaming don't matter and should just roll over and die. You pretty much can't have enthusiast builds without SLI or NVLink. Please don't take my opinion as a personal attack against anyone; it's just massive frustration.
     
    TheDantee likes this.
  15. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    Poor ports from consoles are what make games run like ****, definitely not SLI or NVLink.
     
  16. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    I used a laptop with GTX 1070 SLI for a while before swapping it for one with a GTX 1080. Would I go the SLI way again? No! Not if I can help it. Was the SLI setup faster than the single 1080? Yes, it was. Objectively. But a 20% performance increase does not justify the consistent microstuttering that I had to deal with on the SLI setup. I don't play games... maybe once in a blue moon, but not regularly... so I won't speak about that much. But for all my lab work, the SLI setup did not justify the headache w.r.t. the performance gains!

    Maybe NVLink will improve... maybe SLI is dead... but I highly doubt it really matters (at least from the manufacturer's viewpoint)! My lab orders the highest-spec laptops available at any given time, and I have not heard any complaints about dual GPUs since the Turing cards came out. Part of that relates to the stuttering issues, I am sure. But part of that also has to do with the fact that most of us are already lugging around two power bricks with a 2080 and an i9 K-series CPU!
     
  17. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    In games that support SLI, I consistently get a 60-80%+ performance increase. I also found that microstuttering is effectively eliminated when using the modified, stripped-down drivers.

    Also, a single RTX 2080 won't cut it for 60 fps 4K gaming. If I could do that on a single card, then I would, but for that, SLI GTX 1080s are the only option.
     
    TheDantee and bennyg like this.
  18. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    See, it's not that I disagree with you per se. But I was talking about a more delicate cost ~ gain ratio w.r.t. what is being gained. I think that reflects in your post as well. The first major nuance is "in games that support SLI", meaning that in games that don't actively support it (which would be quite a lot of them) the gains would be much smaller (if any at all). Also, I am sure your description of your experiences is accurate, but I have hardly ever heard of people getting an 80% boost. Second, as best I can recall, there is very little evidence from vision neuroscience that gains above 60 fps are meaningfully perceptible. On the other hand, there is a boatload of evidence for induced placebo effects of illusory advantage from being exposed to fps-count addiction.

    The next major nuance is using "modified stripped down drivers". It's extra work, for a 'maybe' advantage in an area where the realism of the advantage attained is suspect. In many ways this is analogous to the deep-learning hoopla that gets passed around as "A.I." in popular literature. A deep look into any DL algorithm reveals them to be variations on a single learning algorithm that requires terabytes of data to even function properly, while organic intelligence can attain better performance with mere bytes of data. The point is not that DL does not work... it is that it does not function intelligently, and given Occam's Razor it is actually a step backward in terms of attaining true "intelligent" operations. So, to return to your statement, what do I gain by hunting down modified, stripped-down drivers and countering all the hassles that will inevitably accompany the modifications? And in what ways is that "gain" truly beneficial? It is a lot of work to satisfy a hunger for fps, and not in any (objectively) demonstrably advantageous way. Without that, there is absolutely no motivation for an engineering department to devote additional resources to this problem.

    Finally, "60fps at 4K" itself is an extension of the previous point. I fail to see what the point of 4K gaming is. It will not add to performance (either human or machine) in any meaningful sense, and any gains would be purely cosmetic. Furthermore, even from a cosmetic standpoint, with the usual distance between your eyes and the monitor for gaming it is rather dubious how much visual advantage is attained by bumping from 1080p to 4K. 4K can be useful for some purposes, for instance when pixel-clarity is a concern for editors trying to attain a certain end-goal for a visual depiction of details. Data embedding in images through pixel manipulation comes to mind as an example. But for gaming, when you won't be focussing on any particular scene-fraction for more than a few seconds (at most), I fail to see why anyone would want 4K. In fact, if anything I would avoid it actively. I will give you an example -- I am currently preparing to order an Eurocom SkyX7C with a RTX 2080 and an i9 K series through my lab. I have spec'd out pretty much everything to the max, including 2X 2 TB NVME drives. But one thing I actively ensured was to order the 1080P display @ 120 HZ. The refresh rate is the most ambient, and bumping that up from 60HZ makes a lot more sense than making it 4K, IMO, because (a) I fail to see any objective visual degradation, and (b) the GPU can dedicate itself to maximum data crunching without being bogged down by pixel counts. For a personal experience, I remember playing the first Max Payne on a Core Duo desktop with integrated graphics. I didn't mess with the settings, the game ran fine enough for me to follow the story and immerse myself in the world... and frankly that's the most fun I have had playing games. Like I mentioned, I have not touched any games in the last ten years or so (unless you count the odd few minutes on someone else's machine)... and it could be that I enjoy it less these days because I am not as young as I used to be... but it seems to me that people these days focus a lot more spec-sheets than on actually experiencing the story. The best and most powerful GPU is in your head, and that's where most of the details of the story come to life.

    Now, all of this is not to say that you are wrong and should do what I do. Rather, these are the concerns that would keep me from devoting time to optimizing SLI/NVLink for modern GPUs! I would rather have a (still somewhat) portable powerhouse over an SLI machine with dubious aesthetic benefits.
     
    Prototime likes this.
  19. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    My only response is that for my use case combined with my personal preferences SLI is my only option.

    I use a 42" 4K TV as my primary monitor and sit no more than 30" from the screen. At that distance, the advantage 4K has in producing a more detailed image (and no need for AA) is very obvious.

    I also largely play single player games with immersive, realistic graphics. I don't have much interest in competition shooters or similar games which benefit from greater than 60fps.

    It also happens to be the case that the majority of the games I play do support SLI and support it well. Tomb Raider, GTA V, and The Witcher all have 80%+ scaling. In all of these games, the only way to get greater than 60 fps at 4K with max settings in a laptop is with SLI GTX 1080s. A single RTX 2080 won't cut it.

    Cost doesn't matter as much to me as it used to. I've also recycled my sunk costs into my machine consistently for the past decade. After selling the parts to my old laptop, the laptop in my sig cost about $2k.

    My use case is more specific than many people's, but there's obviously a market for SLI if NVIDIA is supporting it on the RTX 2080, 2070 Super, and 2080 Ti. I'm concerned that in the future SLI will be completely killed off, alienating those of us who do use it and do benefit from it.
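    As a rough sanity check on the 42" panel at a 30" viewing distance mentioned above, here is a small pixels-per-degree calculation (a back-of-the-envelope sketch; the ~60 PPD figure for 20/20 acuity is a common rule of thumb, not something stated in this thread):

        import math

        # Pixels per degree (PPD) for a 16:9 panel at a given viewing distance.
        # Assumed figures: 42" diagonal, 3840-wide resolution, 30" distance.
        diag_in, horiz_res, distance_in = 42.0, 3840, 30.0

        aspect = 16 / 9
        width_in = diag_in * aspect / math.hypot(aspect, 1.0)   # ~36.6" wide
        pixel_pitch_in = width_in / horiz_res

        deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch_in / (2 * distance_in)))
        print(f"{1 / deg_per_pixel:.0f} pixels per degree")     # ~55 PPD for 4K

    At those numbers a 4K panel sits right around the ~60 PPD acuity limit, while a 1080p panel of the same size lands near ~27 PPD, which is consistent with the "night and day up close" observation later in the thread.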
     
    TheDantee and Prototime like this.
  20. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    SLI is not a well-supported feature among game devs, and it seems to be going the way of the dinosaur.
    What's likely going to replace it is AMD's approach with Infinity Fabric and chiplets from Zen CPUs applied to their GPUs: moving away from the monolithic die approach and interconnecting GPU chips to create more powerful ones that register as a single GPU.
     
    x-pac likes this.
  21. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Thanks to SLI, my Alienware M18xR2 is keeping up even today in many games that support it.

    I do agree that it is a dying tech though. mGPU with DX12 is supposed to be the new thing but almost nothing uses it.

    The 400 series to 600 series were the good days for SLI. After that it just kept going downhill.
     
  22. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    I agree. Personally, I can't see any reason to use 4K for gaming, nor to lug around an environmentally unfriendly and arthritis-inducing hunk of a machine that requires multiple power bricks. With the performance available from the RTX 20 series, it makes very little sense to work for SLI! The primary concern for engineering is optimization, and that is much better attained by limiting the pixel count to sensible numbers. I usually don't argue with people on these issues because they take it personally... but from my own experience I have never honestly been able to tell what the big whoop with 4K is all about. I don't see any noticeable difference in everyday use... and aesthetics are much better attained by tuning the color calibration, contrast ratios, etc. I can see 1440p... but I will never bog down my GPU with anything with a higher pixel count. It just makes no sense... and I do not give in to marketing gimmicks about "retina" displays or "infinity" edges. Trying to reinvent the wheel is a pointless effort... just make sure the display is pleasing and crisp and there is no pixelation. And as I keep saying, I can't tell the difference with 4K even by pressing my nose against the monitor.

    AMD's approach has huge advantages simply by drastically reducing power consumption. On that concern alone, SLI should be extinct (and the faster the better).
     
  23. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    30" from a 42" 4K TV? I doubt that's very good for your eyes! I am using a 60" Sony Bravia smart LED TV with 1440p currently (but not all the time, as I do use a regular 25" 1080p monitor most of the time), and I can barely stand to stare at it unless I am sitting right up against the opposite wall. But if it works for you...
     
  24. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    Damaging your eyes by sitting too close to a TV is an old wives' tale. Sitting too close may result in eyestrain (which goes away after you focus on something farther away), but that's not something all people are susceptible to.

    At farther viewing distances there is no real advantage to 4K over 1080p on the same-sized screen, but close up it's night and day.

    That's not a great argument. The laptop won't draw any more power than a similarly specced desktop, and if someone is willing to carry around the weight, then more power to them. If saving the environment is a high priority, then maybe the answer is not to game at all. Realistically, the few extra kWh per month that gaming adds to your energy consumption is background noise in the scheme of things. Keeping your house 10 degrees cooler in the winter will have a far greater impact.
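    For a rough sense of that "few extra kWh" figure, here is a quick back-of-the-envelope calculation (all numbers are illustrative assumptions, not measurements from this thread):

        # Hypothetical example: extra monthly energy from running a second GPU
        # while gaming. The wattage and hours here are assumptions for scale only.
        extra_watts = 150        # assumed additional draw of a second GPU under load
        hours_per_day = 3        # assumed gaming time
        days_per_month = 30

        extra_kwh = extra_watts * hours_per_day * days_per_month / 1000
        print(f"~{extra_kwh:.0f} kWh extra per month")   # ~14 kWh per month

    Roughly 14 kWh is on the order of what a single electric space heater can use in one cold day, which is the sense in which the extra gaming draw is background noise.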

    But regardless, when it comes to personal preferences, neither side will win the argument, because there really is no argument to win.
     
    Last edited: Jul 12, 2019
    Prototime likes this.
  25. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    You misunderstood me! I was trying to put myself in the OEM's shoes. They would only be concerned with ticking checkboxes, and it seems like they find it easier to justify going away from SLI. True, personally I wouldn't care for anything more than 1440p. But I wasn't making a personal statement here...
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331

    I never saw the point in AMD CrossFire or NV SLI laptops. The thermal constraints, for one, with existing cooling solutions are too large to ignore, and dual-GPU configurations jack up the prices by A LOT. Not to mention that both CrossFire and SLI were never well supported and had really bad scaling (CrossFire was actually slightly better in this regard than SLI).
    Still, if one had a mid/high-end GPU inside a laptop to begin with, it was likely that by the time they reached their chips' limits, they would just move on to a new system anyway, so it never made sense to spend so much money on a single laptop.
     
  27. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    I agree! It all comes down to the cost~effort~benefit balance. High prices, questionable performance gains, thermal issues, power issues, and the pointlessness of pixel-chasing have sent SLI/CrossFire the way of the dodo. I am not shedding any tears for it personally.
     
    Deks likes this.
  28. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    SLI wouldn't be needed if Nvidia gave permission and OEMs had the desire to put 102-class GPUs in the big laptops. They don't, and the throttled mobile 2080 isn't enough for Ultra 1080p 120Hz in some modern titles... let alone RTX on.

    It is a shame. Imagine the hate that would flow if car makers all decided that 300hp was the most any future engine in a passenger car would ever have, that any more than that was unnecessary (well until next year's model had 320hp anyway).

    That's fine if you don't see the point; you're free not to buy into it. But to advocate that it shouldn't exist for anybody is approaching selfishness.

    I used to rag on idiots who bought huge giant laptops and peasants who bought old BMWs because they couldn't afford a new one. Now I own and love both, for my own reasons, and DILLIGAF what the world thinks.
     
    Last edited: Jul 13, 2019
    aarpcard likes this.
  29. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    I agree that Nvidia limiting stuff is bad. That's unjustifiable! Personally, I am yet to come across anything that wouldn't do 60+ FPS on a mobile RTX 2080... of course, ray tracing is a whole other issue. How many games (as of now) actually take advantage of it? My reason for using the RTX is not ray tracing, but to take advantage of the tensor cores!

    That's not quite a proper analogy, though. I will use motorcycles for my example since I don't use cars. It used to be that you could buy a Hayabusa (my previous bike was an example) that was unrestricted and could be pushed up to 480 hp. But there are strict limitations on how fast motorcycles can be these days. My 2000 Busa could touch 350 km/h with a little coaxing of the gearbox, and without any super/turbo-charging. No modern Busa comes unrestricted. You could disable the rev limiter, but it would be (mostly) illegal to do so and would immediately void all warranty. Suzuki resolutely refuses to service any super/hyper-bike that has the rev limiter disabled. Even the factory-supercharged Kawasaki H2 is limited to under 330 hp from the factory. The H2R, still under 400 hp, is not street legal at all. So the idea of imposing limits is very common, and very necessary. The issue is whether it is being done for the right reasons or to introduce planned obsolescence.

    I am not at all recommending that SLI should be made "taboo"! I am just saying that from my perspective the decision to move away makes perfect sense. It doesn't have to be your view at all!
     
    Last edited: Jul 13, 2019
  30. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    Just FYI: Eurocom now offers the ability to configure the SkyX9C with either GTX 1080 SLI or two RTX 2080s for PhysX processing.
     
  31. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    joluke, ExMM, Papusan and 2 others like this.
  32. JRE84

    JRE84 Notebook Virtuoso

    Reputations:
    856
    Messages:
    2,505
    Likes Received:
    1,513
    Trophy Points:
    181
    Is PhysX even used anymore?
     
  33. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    Agreed. But if someone still wants a dual-GPU setup in a laptop, whether for SLI or PhysX, it is available after all. Since some people are worried about dual GPUs going extinct, I was just pointing that out.

    Personally, I wouldn't shell out even for SLI. I am going with a single RTX 2080 for my purchase.
     
    JRE84 likes this.
  34. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    It's a solution looking for a problem... "Lack of a dedicated PhysX GPU" was never a real problem, not even in 2005 when somebody last bothered doing it.

    Yes, two RTX 2080s can be retrofitted into the (no longer in production = extinct) P870 chassis, which was at its heart a gaming notebook.

    However, they're not linked with the high-bandwidth/low-latency SLI/NVLink interface, and while the drivers can be hacked to use the PCIe bus for SLI-like usage (I don't know whether Eurocom enables this), that leads to horrible performance and atrocious frame pacing, and it is effectively pointless except for some non-game/mining/CUDA workload that uses the GPUs independently.
     
  35. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    17
    Messages:
    40
    Likes Received:
    18
    Trophy Points:
    16
    I spoke to Eurocom, and apparently they have absolutely no intention of supporting SLI/NVLink with the RTX series anytime soon. Regarding hacking the drivers, their official position is that they will NOT cover it under their terms and conditions. Any problem you run into, you are on your own.

    Personally, I agree with what you say about this being a rather useless hack anyway.
     
  36. yosv211

    yosv211 Notebook Consultant

    Reputations:
    19
    Messages:
    174
    Likes Received:
    65
    Trophy Points:
    41
    SLI RIP.
    My last laptop with SLI was the P370SM3, and that one had major problems with many games.
     
  37. Viares Strake

    Viares Strake Notebook Guru

    Reputations:
    15
    Messages:
    74
    Likes Received:
    43
    Trophy Points:
    26
    SLI is a software issue. It has ceased to get attention in the development of even the biggest titles, likely because it is such a corner of the market.
     
  38. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
    Yes, it is, for physics effects, but buying a GPU just for that is stupid. A CPU nowadays can handle it pretty well.
     
  39. joluke

    joluke Notebook Deity

    Reputations:
    1,040
    Messages:
    1,798
    Likes Received:
    1,217
    Trophy Points:
    181
  40. Prototime

    Prototime Notebook Evangelist

    Reputations:
    206
    Messages:
    639
    Likes Received:
    888
    Trophy Points:
    106
    This is a good breakdown of why SLI is disappearing, and not just in laptops.
     
    Viares Strake and joluke like this.
  41. Viares Strake

    Viares Strake Notebook Guru

    Reputations:
    15
    Messages:
    74
    Likes Received:
    43
    Trophy Points:
    26
    Spot on

    Sent from my XT1575 using Tapatalk
     
  42. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    What he failed to mention is that right now, AMD's 5700s abandon multi-GPU altogether, and the only new Nvidia GPUs capable of it are the 2070 Super and above.

    So if market share is behind the decreasing support, restricting it to dual stupid-expensive-for-one-card-already RTX setups is just quickening the death spiral.
     
  43. Dominick_7

    Dominick_7 Notebook Deity

    Reputations:
    113
    Messages:
    1,969
    Likes Received:
    29
    Trophy Points:
    66
    As someone who has zero experience with SLI: if you were looking to use a Eurocom Sky X9C with a 1440p 120Hz screen, an i9, and 2080s in SLI, could you just shut off the other GPU until you want to use it, to save on power/electricity?
     
  44. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    You should just take out the second 2080 and sell it, because that “SLI” does not work in games.
     
  45. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,567
    Messages:
    2,370
    Likes Received:
    2,375
    Trophy Points:
    181
    What he said about dual RTX ^^^

    And no, it's powered constantly, but the draw is pretty minor considering the general power consumption of the unit, which is high no matter what; my second 1080 reports a 5-6W draw at idle.
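    If you want to check that idle draw yourself, NVML exposes per-GPU board power. Here is a minimal sketch using the nvidia-ml-py Python bindings (assuming they and an NVIDIA driver are installed; the 5-6W figure above is the poster's own reading, not something this script guarantees):

        from pynvml import (
            nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
            nvmlDeviceGetHandleByIndex, nvmlDeviceGetName, nvmlDeviceGetPowerUsage,
        )

        nvmlInit()
        try:
            for i in range(nvmlDeviceGetCount()):
                handle = nvmlDeviceGetHandleByIndex(i)
                name = nvmlDeviceGetName(handle)
                if isinstance(name, bytes):           # older bindings return bytes
                    name = name.decode()
                milliwatts = nvmlDeviceGetPowerUsage(handle)  # current board power, mW
                print(f"GPU {i} ({name}): {milliwatts / 1000:.1f} W")
        finally:
            nvmlShutdown()

    The same numbers are also visible with nvidia-smi --query-gpu=index,name,power.draw --format=csv if you prefer a one-liner.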