The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    SLI is NOT worth it

    Discussion in 'Gaming (Software and Graphics Cards)' started by Nick11, Feb 12, 2015.

  1. Nick11

    Nick11 Notebook Consultant

    Reputations:
    72
    Messages:
    224
    Likes Received:
    8
    Trophy Points:
    31
    Approximately a year ago I jumped on the bandwagon and built myself a PC with SLI 780 Tis and a 4K monitor. Now I realize this is a notebook forum, but I imagine you guys are having the same issues as me.

    My problem with SLI:

    1) Still not enough to run most games at 4K with full graphics (40+ FPS)

    2) ALMOST every game has some issue with SLI, such as texture flickering, until it has received many patches

    3) Some games actually perform worse on SLI setups.
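
    A back-of-envelope sketch of point 1: 4K pushes four times the pixels of 1080p, so if you assume (crudely) that performance scales inversely with pixel count, a card doing 80 FPS at 1080p lands around 20 FPS at 4K. The 80 FPS figure below is a hypothetical, not a benchmark:

    ```python
    # Rough pixel-count scaling: why 4K at 40+ FPS is hard.
    # Assumes performance scales inversely with pixel count, which
    # ignores CPU limits, geometry cost, and imperfect SLI scaling.

    def scaled_fps(base_fps, base_res, target_res):
        """Estimate FPS at target_res from FPS at base_res by pixel ratio."""
        base_px = base_res[0] * base_res[1]
        target_px = target_res[0] * target_res[1]
        return base_fps * base_px / target_px

    fps_1080p = 80  # hypothetical figure for illustration only
    print(scaled_fps(fps_1080p, (1920, 1080), (3840, 2160)))  # 4x the pixels -> 20.0
    ```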

    I have now decided to purchase an MSI GS30 laptop, pull out one of my 780 Tis, and utilize my other PC for my girlfriend.

    I will then wait a year or so until they release a beefy new single card which can handle 4K+ gaming.

    Anyone else in the same boat?
     
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    My next laptop and desktop will both sport a single AMD GPU because multi-GPU has too many issues on both sides and Nvidia has disabled overclocking for its mobile cards.
     
    Ramzay and TomJGX like this.
  3. Nick11

    Nick11 Notebook Consultant

    Reputations:
    72
    Messages:
    224
    Likes Received:
    8
    Trophy Points:
    31
    After owning a Sager 9150 with a 7970m, I will never touch an AMD product again. Anyone who had experience with that card likely knows what I am talking about....

    Even if they are better now, I am very brand loyal when it comes to **** like that...
     
    Zymphad likes this.
  4. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I'm definitely not in the same boat, especially on a mobile platform. I love SLI and always seem to be able to make things work. Granted, things don't always play nice straight away.
    However, I don't use a desktop, so I need all the power I can get. For users like me, SLI is the only way to get desktop-like performance :)
     
  5. Nick11

    Nick11 Notebook Consultant

    Reputations:
    72
    Messages:
    224
    Likes Received:
    8
    Trophy Points:
    31
    I have both a gaming laptop and a gaming desktop. Using a desktop to game at 4k is very similar to gaming on a laptop at 1080p.... I need all the performance I can get but it's very frustrating when SLI adds no benefit and in some cases causes issues. When I play older games with multiple patches it's great. Like Max Payne 3, playing that game at 4k with ultra settings is ****ing awesome.

    However, if you're a guy who likes to play games on release date, or *cough* *cough* PIRATE games... it's seriously a pain in the ass. At best a 50% success rate.

    I have personally come to the conclusion that it's best to go back to 1080p/1440p gaming and use a single graphics card... I hope for a giant leap forward in performance soon for single desktop cards.

    Remember mobile video cards are not that far behind Desktop cards anymore, especially if you don't overclock.
     
  6. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    I have no other option if I want more than 1440p 60FPS. I really want my games at 100fps or more.
     
    D2 Ultima likes this.
  7. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    I'm not sure how pulling out one of her graphics cards is going to make your computer a better girlfriend, although I do prefer my girlfriends to not have any more processing power than is absolutely necessary.
     
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    ^FTFY
     
    TomJGX likes this.
  9. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    I wouldn't have written that if it wasn't already ambiguous.
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It kinda reeked of nerd/geek stereotypes and wasn't very funny to begin with.
     
    Qing Dao likes this.
  11. Quagmire LXIX

    Quagmire LXIX Have Laptop, Will Travel!

    Reputations:
    1,368
    Messages:
    1,085
    Likes Received:
    45
    Trophy Points:
    66
    While I'm aware of the insanity you went through, it wasn't the 7970m itself, it was the implementation of the iGPU switching capability. Alienware saw the issue for us and set up a hard option, Fn+F7, where we could dedicate the 7970m as the only GPU.

    I do understand how that can sour on a person though.
     
  12. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Indeed, this is what I was going to say... I'm regretting getting rid of my 7970M with this Maxwell mobile lock crap..
     
  13. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    We can agree to disagree.

    Lol @ thinking 780 series would be good for 4k. C'mon man. Really?
     
  14. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    Current tech + 2 years is my rule of thumb. 4K gaming has become more common since... last year? idk - I'm behind the times. But for me, 4K has been around for a year. So 2016 will be the year for me to upgrade!
     
  15. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    But but, Linus said they were OK for 4K (desktop 780ti's).
     
  16. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    Oh Linus... does he even play games?
     
  17. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    1) Well, it depends on what game(s) you're talking about. It also depends on what graphics settings you're talking about. For example, at 4K, you don't need Anti-aliasing. In other examples, some games just don't scale well to 4K. Realistically, if you want to game at 60+ fps, you should be turning down graphical settings.

    2) Yup. SLI is a pain in the butt, even when it works well. That's why every system builder will always recommend a single powerful graphics card whenever possible.

    BTW, the GeForce GTX 980 is a single card that does decently well at 4K resolutions. But the real solution won't be GPU power... the real solution will be variable refresh rate (VRR) sync technologies, like Adaptive-Sync or G-Sync. The "sweet spot" for those VRR technologies is between 40fps and 60fps, which is right in the range of framerates that current single GPUs can handle.
     
  18. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Well I haven't had issues yet, but I can understand the frustration. I always went with single GPU but the GT80 was too tempting for me.

    I knew 4k was a challenge when I decided to try and run Dark Souls 2 at 4k, and found that challenging enough for my single 780m, hovering around 24fps. And it is not exactly a heavy game graphically.

    As others have said, I expect the next gen to address the 4K issue, since that resolution is only just becoming the new thing. Current GPUs can handle 4K just fine with reduced settings; maxing is out of the question.
     
  19. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    And then there's the fact that most SLI/CF implementations use alternate-frame rendering, which doesn't help with latency, only throughput.

    Variable timing or not, slow frame rate is slow frame rate.
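
    That split between throughput and latency can be sketched with a toy model, assuming ideal 2-way AFR scaling and no CPU or sync overhead:

    ```python
    # Toy model of alternate-frame rendering (AFR).
    # Each frame takes render_ms on one GPU; with N GPUs the frame
    # starts are staggered, so completed frames arrive N times as
    # often, but the time any single frame spends rendering (its
    # latency) does not shrink.

    def afr_stats(render_ms, num_gpus=2):
        frame_interval = render_ms / num_gpus   # time between completed frames
        throughput_fps = 1000 / frame_interval  # frames delivered per second
        latency_ms = render_ms                  # each frame still renders this long
        return throughput_fps, latency_ms

    print(afr_stats(25, num_gpus=1))  # (40.0, 25) -> 40 FPS, 25 ms latency
    print(afr_stats(25, num_gpus=2))  # (80.0, 25) -> 80 FPS, still 25 ms latency
    ```

    Double the frame rate, identical per-frame latency: that's the sense in which AFR helps throughput but not responsiveness.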
     
  20. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    Well, no. That's the point of variable refresh rates.

    Handling 40FPS using Vsync is a problem because it forces you to choose between Vsync on (smooth visual image, but heavy input lag) and Vsync off (minimal input lag, but visual tearing). The way around this was for people to try to get games to run at a minimum of 60fps whenever possible.

    Variable refresh rates give you the best of both worlds. It gives you minimal input lag, and also eliminates screen tearing. So it really doesn't matter if your framerate is 40fps, 50fps, 60fps, 70fps, etc.
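
    A minimal sketch of the difference, assuming simple double-buffered Vsync where a finished frame waits for the next refresh tick:

    ```python
    # Display interval under fixed-refresh Vsync vs variable refresh (VRR).
    # With double-buffered Vsync a finished frame waits for the next
    # refresh tick; with VRR the panel refreshes as soon as the frame
    # is ready.
    import math

    def vsync_display_interval(render_ms, refresh_hz=60):
        tick = 1000 / refresh_hz
        # frame is shown at the first refresh tick after rendering finishes
        return math.ceil(render_ms / tick) * tick

    def vrr_display_interval(render_ms):
        return render_ms  # shown immediately when ready

    r = 25  # 40 FPS render time
    print(vsync_display_interval(r))  # 33.33... ms -> effectively 30 FPS on 60 Hz
    print(vrr_display_interval(r))    # 25 ms -> the full 40 FPS, no tearing
    ```

    In other words, on a 60 Hz panel a steady 40 FPS renderer gets quantized down to every other refresh, while VRR displays every frame as soon as it lands.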
     
  21. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    For the absolutely last time (not), CURRENT. TECHNOLOGY. CANNOT. SUSTAIN. NEW. AND. UNOPTIMIZED. GAMES. AT. 4K. RESOLUTION. AT. ACCEPTABLE. FRAMERATES. WITHOUT. MAKING. USE. OF. AT. LEAST. THREE-WAY. SLI.
    It will not happen now. It will not happen in two years. It will probably happen in three or four years. When GPUs get so powerful that a single top of the line GPU can play every single game on the market at 1080p 120fps or higher, we will be ready for 4K.

    It doesn't matter how strong GPUs get. It doesn't matter if games barely improve graphically over the course of three or four years. The fact of the matter is, your game is optimized for the lowest common denominator. With EXTREMELY weak last-gen hardware we ran into bottlenecks where we couldn't produce the kinds of worlds/games/etc. people wanted to, because of memory/CPU/GPU/etc. limitations. Now those limitations are lifted, and people are using every tiny inch of the new headroom, and not bothering with compression. If a game's primary design target is a current-gen console, its PC version is likely going to use much more memory and video RAM than is necessary... because devs went from 512MB limits to ~6GB limits, and increasing things like texture resolution (not shadowmap resolution OR texture quality; those are different) is relatively free in terms of performance hits, once you've got the memory for it.

    Then when the "minimum" PC cards being designed for become console-level or higher (in general, assume GTX 660) due to Maxwell and Pascal being one and two generations newer than Kepler? You're gonna see games that look similar to stuff coming out now/been out for a couple years/etc. that are just going to run worse, because there's no need to design for 10-year-old hardware anymore. Easy examples are the games out right now that look worse than stuff like Crysis 2 (itself a 2011 title) yet run a lot worse than it. It's just because they were designed with bigger hardware as the minimum denominator. Good example: I'd say Crysis 2 and Dying Light look fairly similar in a lot of cases. I'd still give Crysis 2 the edge. But I could play that in DX9 at 1200p on ~medium on my 280M and never dip below 30fps... if I tried Dying Light on my 280M it'd roll over, burn up, curse me for making it try, then die. And the same goes for a whole lotta games that've been released recently; there's lots of reasons why games like Watch Dogs, CoD: AW (whose only improvement over Black Ops 2 in many aspects is its lighting system), Titanfall, etc. LOOK on par with many older titles... ones that run 5x better than them (not exaggerating). Hell, I used to play Sleeping Dogs with "extreme" AA on (huge SSAA included)... I dropped below 60 maybe 5% of the time on my two stock 780Ms. You think I'd be able to run Watch Dogs at max at 1080p and get such good FPS? LOL. No. Ain't happening. Far less if I tried using SSAA to force it to 3K or 4K res. You must be crazy. But that's just the way the times roll.

    In short: if you expected to play the latest AAA games at 4K, ESPECIALLY with how little attention is given to PC gamers (which explains lack of good SLI/CrossfireX support) by those devs... you were pretty sorely mistaken, and you honestly should have known better.

    Does this mean you can't play stuff like Sleeping Dogs/CS:GO/LoL/SC2/AC2/AC: Brotherhood/Black Ops 2/BF:BC2/Dark Souls 2/Killing Floor/Mass Effect 1, 2 and 3/Skyrim/etc at 4K? No, it doesn't mean that. In fact those games'd probably run amazingly well at 4K. Amazingly. But is that what 99% of vocal 4K supporters/enthusiasts wish to play? Nope. Nobody's fault but their own for being disappointed.
     
  22. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    AMD better deliver on the 390X and someone really needs to make a 1440p Freesync panel. Or at least a 1080p one.
     
  23. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    [ ] SLI not worth it
    [x] SLI worth it
     
    nightingale and D2 Ultima like this.
  24. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    I rarely have issues with SLI... I won't have a laptop without it.

    Sent from my Nexus 6 using Tapatalk
     
    D2 Ultima likes this.
  25. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Umm, you need SLI 970s, 980s, or Titan GPUs to run games at 4K at 40+ FPS, so not sure what you were expecting?
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well besides less VRAM, 780 Ti is at least as fast as those cards, if not faster in the case of 970.

    Maybe OP needs to start thinking about tri- and quad-SLI and pray to the Nvidia gods that drivers and app support improve.

    Also, AMD Hawaii cards (290/290X and 295X2) are better at 4K than Nvidia GM204 and GK110.
     
    Last edited: Feb 13, 2015
    D2 Ultima likes this.
  27. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,341
    Likes Received:
    70,660
    Trophy Points:
    931
    I'm sold on SLI and won't do any machine with a single GPU. They don't offer enough performance. If, in the rare event, a game has serious issues with SLI and performs better with one GPU, I simply disable SLI to play that game... not a big deal if it's a game I cannot live without playing. The rest of the time I enjoy significantly greater performance. I hardly ever have any kind of serious issue with SLI. Issues for me are few and far between, and are almost always extremely minor issues.
     
    D2 Ultima likes this.
  28. Nick11

    Nick11 Notebook Consultant

    Reputations:
    72
    Messages:
    224
    Likes Received:
    8
    Trophy Points:
    31
    Agreed, I hate people who come on forums like this to spread such stereotypes... Being a "geek" has made me a wealthy man... and for the record I do have a very attractive wife, who is not my computer... lol


    I have just personally decided to go back to a single 780TI until the next "big break" in graphics processing.


    Also I want to be fair here. Dark Souls 2 is one of the games that does scale well with SLI. I play that game maxed out at 4k 60+ fps and it looks awesome.

    Also, the claim that AA is not needed at 4K is a myth. When it comes to graphics quality, of course you don't "need" it. However, it does make things look noticeably better even at 4K. I usually do at least 2x.
     
  29. Nick11

    Nick11 Notebook Consultant

    Reputations:
    72
    Messages:
    224
    Likes Received:
    8
    Trophy Points:
    31

    So I don't disagree with what you're saying, although you're coming off a bit arrogant in this post.

    To be clear, my issue isn't that with SLI I still can't play games in 4K, so much as that so few games come out optimized for multi-GPU setups on release. Most modern games I try to play at 1440p, and I am happy with that. As you said, I think I am just going to wait this out for a few years and see what "next gen" provides.

    I also agree that something needs to change before PC gamers are ever a priority for game developers. It will happen though; the "next gen" consoles have had a slow start to say the least, and they are just too antiquated to keep up with where technology is going. I sold my PS4 the other day, and when the guy came to pick it up he asked me why I was selling it so cheap.

    I decided to show him my gaming setup with 3 4K Samsung monitors and an Oculus Rift. "Yes, I know I can't game at 4K x3 with only 780 Ti SLI." Going from a solid PC gaming setup to a console is like going from a PS4 to a legacy Game Boy lol.
     
  30. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    As long as 3D computer graphics in games are pixels and raster graphics, AA is needed. Even just FXAA if you want to be cheap about it. Doesn't matter if you have a 2K or 4K screen or whatever. Spatial aliasing may be reduced on a high PPI screen at native res but temporal aliasing is still noticeable.
     
    Mr.Koala likes this.
  31. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    I agree that adaptive sync is (much) better than a fixed refresh rate at low FPS. But that only tells half of the story. With adaptive sync you get good latency between the GPU frame buffer and the screen. However, even if you have perfect latency performance there, if your average FPS drops down to 40 you're in trouble because:
    1. 40FPS means about 25ms of rendering time per frame on average. That's 25ms of latency between the game's physics code and the rendering output that you can't avoid with better output control.
    2. After one frame is finished you have about 25ms of rendering time during which there's no new data for the screen to feed on. It's doing nothing during this time and you see nothing new.

    #2 can be fixed by SLI/CF assuming good multi-GPU scaling. #1 is always there unless the GPU is fast enough or split-frame techniques (basically extinct these days) work well.
     
    Last edited: Feb 14, 2015
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The reason I came across as such is that I keep seeing people make the same argument I listed FOR 4K, then the same complaints when they get it. Either they run a lower res and deal with interpolation, or they just lower settings in the game and bite it. Anybody who knows what the current market is like for SLI users AND who banked on SLI being the way to power through new, unoptimized AND SLI-unfriendly games... that's their fault. End of story. Honestly.

    I guarantee you that if you planned on playing older titles at 4K and had your SLI 780Ti, not only would you be perfectly happy for ages, but you wouldn't even run out of vRAM (especially if only gaming at 4K with one active screen and fullscreened) so... yeah. In general, I'm not sympathetic to you, or to any user with the same woes, as all the information was there and knowing how current games keep releasing is also there. 99.9% of PC games run fine at 4K on even a single 4GB 770. The 0.1% of PC games that don't are the ones that people keep wanting to play at 4K. Hint: It's not the games' fault.

    Also, devs are the ones who need to speak to nVidia and get SLI working. When a game takes 4 months and then comes out broken... that's the devs' fault. When a game launches and it already uses SLI without so much as a driver update? That's also the devs' doing. It has nothing to do with SLI in itself nor nVidia (or AMD for that matter).
     
  33. bluefox94

    bluefox94 Notebook Consultant

    Reputations:
    1
    Messages:
    105
    Likes Received:
    37
    Trophy Points:
    41
    Meaker do you think a desktop 980 will be sufficient for 1440p 120hz monitor?
     
  34. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    I'm not Meaker, but I own this card with the highest end desktop CPU. Nope is the answer.
     
    Mr. Fox and D2 Ultima like this.
  35. bluefox94

    bluefox94 Notebook Consultant

    Reputations:
    1
    Messages:
    105
    Likes Received:
    37
    Trophy Points:
    41
    That's what I thought.. It would be too good to be true :D thanks for the reply
     
  36. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    4K popped up out of nowhere only in the last year for laptops. Now people expect to play at 4K when the hardware has only improved incrementally. We're only now at the point where 1080p/60 as a minimum is playable with top-end hardware. 4K can go away as far as I'm concerned, when it comes to gaming laptops. Give us high-quality 1920x1200 or 2560x1600 16:10 displays first. We'll worry about 4K later.
     
    Zymphad, octiceps, Mr. Fox and 3 others like this.
  37. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Believe me, 1080/60 minimum for all games with a single top-end GPU is going to slip further away the more unoptimized crap comes out. You can't do it in AC:U, you can't do it in the Arma games, you can't do it in Dragon Age: Inquisition, you'll probably not be able to do it in Star Citizen, etc. etc.

    As I always say: I'd be using most likely a 1440p 120Hz screen if I got a desktop, and I'd need a minimum of THREE 980s or Titan Blacks to satisfy that for gaming. Hell, I wouldn't even settle for two 980s at 1080p the rate things are going.

    I'm turning into one of those guys who REALLY likes graphics turned all the way up.
     
    TomJGX and bluefox94 like this.
  38. bluefox94

    bluefox94 Notebook Consultant

    Reputations:
    1
    Messages:
    105
    Likes Received:
    37
    Trophy Points:
    41
    #pcmasterrace #7waysli #praiselordgaben
     
    TomJGX and D2 Ultima like this.
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    >_> <_< I have thought about 8 way SLI.

    It doesn't actually work.
     
  40. bluefox94

    bluefox94 Notebook Consultant

    Reputations:
    1
    Messages:
    105
    Likes Received:
    37
    Trophy Points:
    41
    I think I am happy with 1080p 120hz :D I wish 1440p 60hz was plausible though. Perhaps a Titan Black will handle it. I always wondered: the Titan Black is a beast, but the 980 is the latest generation. Which one is better in terms of gaming?
     
  41. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    In gaming and overclockability, the 980 wins.
     
  42. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    What, even 3 of them isn't enough? I'm quite sure that if the games support SLI etc., 3 cards will easily do 4K @ 60fps on high/ultra details.. It's too much to expect 1440p @ 120Hz with 1 card, but 2 should do it...

    #consolegamersarepeasants #pcmasterraceforever :)
     
  43. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    It depends on what you mean by "enough".

    Can a single GTX 980 comfortably handle 1440p gaming? Absolutely yes.

    Can a single GTX 980 run 1440p with all settings maxed out, and run at 120fps? No.

    But when you're talking 1440p with all settings maxed out at 120fps, you're not talking "practical." People who want this type of performance want it simply because they want "the best", and not because they want to know if a GTX 980 @ 1440p is a practical solution.

    So if you have realistic expectations on gaming performance, then yes, a GTX 980 @ 1440p is enough. If you have unrealistic expectations and simply want to max out your game for the personal satisfaction of knowing you can run things at max, then no, a GTX 980 @ 1440p is not enough.
     
  44. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Wrong thread.
     
  45. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    This is what I use to play games... will a 720m handle it?

    [​IMG]


    Imagine if you could SLI multiple PCs together that each had 4x SLI...
    [​IMG]
     
    reborn2003 and TomJGX like this.
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    @HTWingNut I think you're looking for something like this:

    [​IMG]
     
  47. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    I think it's worth it. I think it's worth it for any enthusiast. Enthusiasts are early adopters so on the desktop front, it's absolutely worth it to have whatever it takes to "almost" get any cutting edge game to run. Remember Tri-SLI 8800GTX for Crysis? That ran at maybe 30fps average. Is it 'worth it' for one game? Not to 99% of people.

    Is it 'worth it' to have a cutting edge 18.4 inch super heavy lappy with two GPUs to get "close to desktop performance...on a notebook that sits on a...desk"....not for 99% of people. I am one of those crazy people.
     
    bluefox94 and D2 Ultima like this.
  48. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I dunno if I'm crazy enough to sell three titans for three 980s or something... but I am crazy enough to attempt to acquire two 980Ms or two 970Ms over my 780Ms. I'm still contemplating mugging @Cloudfire when he sells his AW18 for the cards.

    And the CPU.

    No idea how I'm getting to Norway.

    Don't worry Cloudfire I love you <3

    Watch your back when you sell

    (I'm just sleepy and going far with jokes, don't slay me mods)
     
    TomJGX and TBoneSan like this.
  49. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    I did almost exactly the same thing with my 17R2, dismantled my SLI 780ti desktop and I'm using one of the cards in the graphics amplifier.
     
  50. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Exactly what I'm looking for... to hook up to my 800x480 LCD. :D
     
    TomJGX and D2 Ultima like this.