I didn't see any SLI laptops!
It makes me sad because I only ever buy SLI laptops and only build SLI desktops.
I realize mGPU seems to be the big craze these days, but I don't even see an option for a second card.
-
Spartan@HIDevolution Company Representative
I of course respect your opinion, Ultra Male.
However, for my needs, going from AMD 7970 CrossFireX to 980 Ti SLI, then to 1080 Ti SLI, and finally to 2080 Ti SLI has always worked.
All of the major AAA games I play support SLI, and my favorite MMO does as well. I need SLI to get maximum use out of my PG27UQ.
My laptop uses 1080 SLI, and I will be more than a bit disappointed if a next-gen laptop with 2080 SLI doesn't come out.
I like building over-the-top PCs because it's fun, and once you turn that PC on and it purrs like a kitten, oh man, it's like a dream... -
I prefer to game in 4K with max settings at 60+ FPS. No single card can do that for the majority of games. SLI is a must.
It almost seems to me that NVIDIA is trying to kill off SLI to prevent people from simply adding another card or two when their performance starts lacking, which would instead force them to buy newer cards and upgrade more frequently: first by eliminating SLI beyond two cards, and then with this seemingly astroturfed campaign touting the irrelevance of SLI.
Or it could be people who can't afford an SLI setup trying to convince themselves SLI is pointless so they can coddle their egos...
If the RTX 2080 were 225% more powerful than the GTX 1080, then I would concede that SLI would be less relevant. But someone who's currently using GTX 1080 SLI has zero incentive to upgrade to a single RTX 2080 - they'd be losing ~35% of their performance. -
Spartan@HIDevolution Company Representative
Buy what you please; no one is holding you back. -
NVLink is actually brute-force solving the bandwidth problem encountered with SLI; whether that would physically fit in a laptop is another story.
-
Also, high-end desktop replacement laptops are using dual PSUs as it is. So maybe it's also a limit on how much power they can deliver? -
They told me on the Clevo forums that there is no NVLink for the 2080 on laptops, so I guess SLI is finished for laptops. What a shame. At least they still have it on desktops, but I can't begin to describe how sad I am learning it's dead on laptops. Life has victories and disappointments, I guess.
-
SLI was never really that great in the first place.
-
Every dog has its day.
-
Most don't want to spring for SLI, and it's kind of a dying tech. For me, it was never worth the expense.
-
cj_miranda23 Notebook Evangelist
Let's pray and hope for NVLink technology to improve over time and be implemented in laptops.
https://www.nvidia.com/en-us/data-center/nvlink/
https://www.pcgamesn.com/nvidia-rtx-2080-ti-nvlink-performance
"Essentially, the GPUs are much closer together now, and that will allow game developers to actually see the NVLink connection and use it to render their games, hopefully in a more elegant, repeatable way than with SLI.
“That bridge is visible to games, that means that maybe there’s going to be an app that looks at one GPU and looks at another GPU and does something else.” explains Petersen. “The problem with multi-GPU computing is that the latency from one GPU to another is far away. It’s got to go across PCIe, it’s got to go through memory, it’s a real long distance from a transaction perspective.
“NVLink fixes all that. So NVLink brings the latency of a GPU-to-GPU transfer way down. So it’s not just a bandwidth thing, it’s how close is that GPU’s memory to me. That GPU that’s across the link… I can kind of think about that as my brother rather than a distant relative.” -
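To make the quote a bit more concrete, here is a minimal sketch (plain C++ against the CUDA runtime API, not anything taken from the article) of what a direct GPU-to-GPU copy looks like from the programmer's side. The device indices and buffer size are arbitrary assumptions; whether the transfer actually travels over NVLink or falls back to PCIe is decided by the hardware and driver, not by this code.

```cpp
// Minimal sketch: direct GPU-to-GPU copy with the CUDA runtime API.
// Assumes two visible devices (indices 0 and 1).
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int can01 = 0, can10 = 0;
    cudaDeviceCanAccessPeer(&can01, 0, 1);  // can GPU 0 address GPU 1's memory?
    cudaDeviceCanAccessPeer(&can10, 1, 0);
    if (!can01 || !can10) {
        std::printf("Peer access not available between GPU 0 and GPU 1.\n");
        return 1;
    }

    const size_t bytes = 64 * 1024 * 1024;  // 64 MiB test buffer (arbitrary size)
    void* buf0 = nullptr;
    void* buf1 = nullptr;

    cudaSetDevice(0);
    cudaDeviceEnablePeerAccess(1, 0);       // let GPU 0 reach GPU 1 directly
    cudaMalloc(&buf0, bytes);

    cudaSetDevice(1);
    cudaDeviceEnablePeerAccess(0, 0);
    cudaMalloc(&buf1, bytes);

    // Copy straight from GPU 0's memory to GPU 1's memory, no host round trip.
    cudaMemcpyPeer(buf1, 1, buf0, 0, bytes);
    cudaDeviceSynchronize();

    cudaFree(buf1);
    cudaSetDevice(0);
    cudaFree(buf0);
    return 0;
}
```

The point Petersen is making is that with NVLink that peer copy (and peer reads/writes inside kernels) no longer has to cross PCIe and system memory, which is where the latency and bandwidth win comes from.
-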
The problem is not with SLI or NVLink bridges. The problem is game devs seeing no point in spending extra money to properly implement SLI/NVLink. There's a really small percentage of people who use it, so they just don't care.
-
I understand people's opinions on this, but whenever and wherever I discussed my support for SLI and NVLink, I was almost always attacked (not on these forums, but on other ones).
Some people seem to have the viewpoint that if SLI didn't exist, games would function better with fewer bugs. I was almost always made to feel like a fifth wheel who shouldn't exist. You guys on this forum are respectful, but the vast majority of the community is hateful toward SLI. I guess all the SLI builds that made PC gaming famous and showed the true horsepower of PC gaming don't matter and should just roll over and die. You pretty much can't have enthusiast builds without SLI or NVLink. Please don't take my opinion as a personal attack against anyone; it's just massive frustration. -
Poor ports from consoles make games run like ****. It's definitely not SLI or NVLink.
-
I used a laptop with GTX 1070 SLI for a while, before swapping it for one with a GTX 1080. Would I go the SLI way again? No! Not if I can help it. Was the SLI setup faster than the single 1080? Yes, it was. Objectively. But a 20% performance increase is not justifiable given the consistent microstuttering that I had to deal with on the SLI setup. I don't play games... maybe once in a blue moon, but not regularly... so I won't speak about that much. But for all my lab work, the SLI setup did not justify the headache w.r.t. the performance gains!
Maybe NVLink will improve... maybe SLI is dead... but I highly doubt it really matters (at least from the manufacturer's viewpoint)! My lab orders the highest-spec laptops available at any given time, and I have not heard any complaints about dual GPUs since the Turing cards came out. Part of that relates to the stuttering issues, I am sure. But part of that also has to do with the fact that most of us are already lugging around two power bricks with a 2080 and an i9 K-series CPU! -
Also, a single RTX 2080 won't cut it for 60 FPS 4K gaming. If I could do that on a single card, then I would, but for that, SLI GTX 1080s are the only option. -
The next major nuance is using "modified stripped-down drivers". It's extra work for a "maybe" advantage in an area where the realism of the advantage attained is suspect. In many ways this is analogous to the deep-learning hoopla that gets passed around as "A.I." in popular literature. A deep look into any DL algorithm reveals them to be variations on a single learning algorithm that requires terabytes of data to even function properly, while organic intelligence can attain better performance with mere bytes of data. The point is not that DL does not work... it is that it does not function intelligently, and given Occam's Razor it is actually a step backward in terms of attaining true "intelligent" operations. So, to return to your statement, what do I gain by hunting down modified stripped-down drivers and countering all the hassles that will inevitably accompany the modifications? And in what ways is that "gain" truly beneficial? It is a lot of work to satisfy a hunger for fps, and not in any (objectively) demonstrably advantageous way. Without this, there is absolutely no motivation for an engineering department to devote additional resources to this problem.
Finally, "60fps at 4K" itself is an extension of the previous point. I fail to see what the point of 4K gaming is. It will not add to performance (either human or machine) in any meaningful sense, and any gains would be purely cosmetic. Furthermore, even from a cosmetic standpoint, with the usual distance between your eyes and the monitor for gaming it is rather dubious how much visual advantage is attained by bumping from 1080p to 4K. 4K can be useful for some purposes, for instance when pixel-clarity is a concern for editors trying to attain a certain end-goal for a visual depiction of details. Data embedding in images through pixel manipulation comes to mind as an example. But for gaming, when you won't be focussing on any particular scene-fraction for more than a few seconds (at most), I fail to see why anyone would want 4K. In fact, if anything I would avoid it actively. I will give you an example -- I am currently preparing to order an Eurocom SkyX7C with a RTX 2080 and an i9 K series through my lab. I have spec'd out pretty much everything to the max, including 2X 2 TB NVME drives. But one thing I actively ensured was to order the 1080P display @ 120 HZ. The refresh rate is the most ambient, and bumping that up from 60HZ makes a lot more sense than making it 4K, IMO, because (a) I fail to see any objective visual degradation, and (b) the GPU can dedicate itself to maximum data crunching without being bogged down by pixel counts. For a personal experience, I remember playing the first Max Payne on a Core Duo desktop with integrated graphics. I didn't mess with the settings, the game ran fine enough for me to follow the story and immerse myself in the world... and frankly that's the most fun I have had playing games. Like I mentioned, I have not touched any games in the last ten years or so (unless you count the odd few minutes on someone else's machine)... and it could be that I enjoy it less these days because I am not as young as I used to be... but it seems to me that people these days focus a lot more spec-sheets than on actually experiencing the story. The best and most powerful GPU is in your head, and that's where most of the details of the story come to life.
Now, all of this is not to say that you are wrong and should do what I do. Rather, these are the concerns that would keep me from devoting time to optimizing SLI/NVLink for modern GPUs! I would rather have a (still somewhat) portable powerhouse over an SLI machine with dubious aesthetic benefits. -
I use a 42" 4k TV as my primary monitor and sit no more than 30" from the screen. At that distance, the advantage 4k has in producing a more detailed image (and no need for AA) is very obvious.
I also largely play single player games with immersive, realistic graphics. I don't have much interest in competition shooters or similar games which benefit from greater than 60fps.
It also happens to be the case that the majority of the games I play do support SLI and support it well. Tomb Raider, GTAV, the Witcher all have 80%+ scaling. In all of these games the only way to get greater than 60fps at 4k with max settings in a laptop is with SLI GTX1080s. A single RTX 2080 won't cut it.
Cost doesn't matter as much to me as it used to. I've also recycled my sunk costs into my machine consistently for the past decade. After selling the parts to my old laptop, the laptop in my sig cost about $2k.
My use case is more specific than many people, but theres obviously a market for SLI if nvidia is supporting it in the RTX 2080, 2070 Super, 2080TI. I'm concerned that in the future SLI will be completely killed alienating those of us who do use it and do benefit from it. -
SLI is not a well-supported feature among game devs, and it seems to be going the way of the dinosaur.
What's likely going to replace it is AMD's approach: taking Infinity Fabric and the chiplet design from Zen CPUs and applying them to their GPUs, moving away from the monolithic-die approach and interconnecting GPU chips to create more powerful ones that register as a single GPU. -
Thanks to SLI, my Alienware M18xR2 is keeping up even today in many games that support it.
I do agree that it is a dying tech though. mGPU with DX12 is supposed to be the new thing but almost nothing uses it.
The 400 series to the 600 series were the good days for SLI. After that, it just kept going downhill. -
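For what it's worth, "mGPU with DX12" (explicit multi-adapter) means the game itself enumerates every GPU and decides how to split work between them, instead of the driver pretending two cards are one. A rough, hypothetical C++ sketch of just the first step, listing the adapters a DX12 title would see (the feature level and the printed details are illustrative; you'd link against dxgi.lib and d3d12.lib):

```cpp
// Minimal sketch: enumerate the adapters a DX12 explicit multi-adapter title would see.
#include <windows.h>
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software adapter

        // In explicit multi-adapter, the engine creates a device (with its own
        // queues, heaps, and command lists) per physical GPU it wants to use.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::printf("Adapter %u: %ls (%.1f GB VRAM)\n", i, desc.Description,
                        desc.DedicatedVideoMemory / (1024.0 * 1024.0 * 1024.0));
        }
    }
    return 0;
}
```

Everything after that step, splitting the frame or the simulation across those devices and keeping them in sync, is the engine's responsibility rather than the driver's, which is exactly the work almost no studio has been willing to take on.
-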
AMD's approach has huge advantages simply by drastically reducing the power consumption. On that concern alone, SLI should be extinct (and the faster the better). -
-
At farther viewing distances there is no real advantage to 4K over 1080p on the same-sized screen, but close up it's night and day.
But regardless, when it comes to personal preferences, neither side will win the argument, because there really is no argument to win. -
-
I never saw a point in AMD CrossFire or NV SLI laptops. The thermal constraints, for one, are too large to ignore with existing cooling solutions, and dual-GPU configurations jack up the prices by A LOT. Not to mention that both CrossFire and SLI were never well supported and had really bad scaling (CrossFire was actually slightly better in this regard than SLI).
Still, if one had a mid/high-end GPU inside a laptop to begin with, it was likely that by the time people reached the chip's limits, they would just move on to getting a new system anyway, so it never made sense to spend so much money on a single laptop. -
-
SLI wouldn't be needed if Nvidia gave permission and OEMs had the desire to put 102-class GPUs in the big laptops. They don't, and the throttled mobile 2080 isn't enough for Ultra 1080p 120Hz in some modern titles... let alone RTX on.
It is a shame. Imagine the hate that would flow if car makers all decided that 300hp was the most any future engine in a passenger car would ever have, that any more than that was unnecessary (well until next year's model had 320hp anyway).
That's fine if you don't see the point; you're free to not buy into it. To advocate that it shouldn't exist for anybody is approaching selfishness.
I used to rag on idiots who bought huge giant laptops and peasants who bought old BMWs because they couldn't afford a new one. Now I own and love both for my own reasons, and dilligaf to what the world thinks. -
I am not at all recommending that SLI should be made "taboo"! I am just saying that from my perspective the decision to move away makes perfect sense. It doesn't have to be your view at all! -
-
yrekabakery Notebook Virtuoso
-
Is PhysX even used anymore?
-
Personally, I wouldn't even shell out for SLI. I am going with a single RTX 2080 for my purchase. -
Yes, two RTX 2080s can be retrofitted into the (no longer in production = extinct) P870 chassis, which was at its heart a gaming notebook.
However, they're not linked with the high-bandwidth/low-latency SLI/NVLink interface, and while the drivers can be hacked to use the PCIe bus for SLI-like usage (I don't know whether Eurocom enables this), that leads to horrible performance and atrocious frame pacing, and it is effectively pointless except for some non-game/mining/CUDA workload that uses the GPUs independently. -
Personally, I agree with what you say about this being a rather useless hack anyway. -
SLI RIP.
My last laptop with SLI was the P370SM3, and that one had major problems with many games. -
SLI is a software issue. It has ceased to get attention in the development of even the biggest titles, likely because it is such a niche market.
-
-
Using it with the @dsanke BIOS and an 8700K in my P775DM3-G, and G-Sync is working flawlessly too. -
This is a good breakdown of why SLI is disappearing, and not just in laptops.
-
So if market share is behind the decreasing support, restricting it to dual stupid-expensive-for-one-card-already RTX setups is just quickening the death spiral. -
As someone who has zero experience with SLI: if you are looking to use a Eurocom Sky X9C with a 1440p 120 Hz screen, an i9, and 2080s in SLI, can you just shut off the other GPU until you want to use it, to save on power/electricity?
-
yrekabakery Notebook Virtuoso
-
What he said about dual RTX ^^^
And no, it's powered constantly, but that's pretty minor considering the general power consumption of the unit, which is high no matter what; my 2nd 1080 reports a 5-6 W draw at idle.
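If you want to check that for yourself, the driver exposes per-GPU power readings through NVML, the same library nvidia-smi reads from. A small C++ sketch, under the assumption that the second card is device index 1 and that you link against the NVIDIA management library:

```cpp
// Minimal sketch: read the second GPU's current power draw via NVML.
// Assumes the NVIDIA driver (which ships the NVML library) is installed
// and that the card of interest is device index 1.
#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::printf("Failed to initialize NVML.\n");
        return 1;
    }

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(1, &dev) == NVML_SUCCESS) {
        unsigned int milliwatts = 0;
        if (nvmlDeviceGetPowerUsage(dev, &milliwatts) == NVML_SUCCESS) {
            std::printf("GPU 1 power draw: %.1f W\n", milliwatts / 1000.0);
        } else {
            std::printf("Power reading not supported on this GPU.\n");
        }
    } else {
        std::printf("No GPU found at index 1.\n");
    }

    nvmlShutdown();
    return 0;
}
```

Or just glance at the power column in nvidia-smi; either way, the second GPU never fully powers off, it simply idles at a few watts.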