The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    The Future of Multi GPU

    Discussion in 'Gaming (Software and Graphics Cards)' started by HaloGod2012, Jan 14, 2017.

  1. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    So what is the latest status of multi-GPU support? It's incredible how much DX12 has killed off SLI and CrossFire by handing devs the responsibility for including support for multiple GPUs. What has been everyone's experience with the latest games? Is there any hope? I remember Gears of War 4 was promised to have explicit multi-GPU (they showed it off!), but it's been many, many months without a sign.
     
    hmscott likes this.
  2. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Some games have it, but most don't.

    Hopefully, as studios get more comfortable with DX12, they'll add it more often.

    I want that Asus PG27UQ, but it's not gonna work at all without SLI. Gonna need either two 1080 Tis or two 1180s.
     
  3. Scanner

    Scanner Notebook Deity

    Reputations:
    180
    Messages:
    877
    Likes Received:
    270
    Trophy Points:
    76
    I find it pathetic; sometimes I have to go into the Nvidia settings to trick SLI into working in a game. Even then it's not true SLI, since I see no real difference between single and dual cards in-game. It's frustrating to look at my overlay and see one GPU sitting there, for all intents and purposes, DEAD.
     
    TBoneSan, Awhispersecho and TomJGX like this.
  4. Awhispersecho

    Awhispersecho Notebook Evangelist

    Reputations:
    66
    Messages:
    355
    Likes Received:
    306
    Trophy Points:
    76
    My personal feeling is that as long as OEMs are going to continue advertising and selling SLI machines, and boasting of the advantages that provides, developers should be required to support it. No SLI support, no release of the game. Pure and simple.
     
    hmscott and Support.2@XOTIC PC like this.
  5. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    I agree. What screwed it up is suddenly throwing it on the devs with DX12. DX12 has been cancer for gaming. DX11 plus driver optimization for SLI and CrossFire was much better. If MS and devs use the tools for multi-GPU in games, it will scale amazingly. Nvidia and MS need to take charge here and lend more helping hands and tools. They basically need to make it so that both GPUs appear as one to the developer.
     
    D2 Ultima and hmscott like this.
  6. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I agree. My biggest issue with SLI is that most applications and software do not utilize it fully.
     
  7. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    They need to follow what the Rise of the Tomb Raider devs implemented. The DX12 multi-GPU in that game works perfectly!
     
    hmscott likes this.
  8. Awhispersecho

    Awhispersecho Notebook Evangelist

    Reputations:
    66
    Messages:
    355
    Likes Received:
    306
    Trophy Points:
    76
    The issue is much bigger than this, though. It goes to the increasing laziness of developers in general, and it's the consumers' fault. As of now, I would guess 50-60% of games are released buggy as hell, poorly optimized and basically incomplete. They are released that way because the developers know the games will still be bought and people will deal with it until they get around to releasing a patch. And then another patch, and then another patch. It goes on and on. They release crap and people throw money at them anyway regardless of the launch issues, the lack of features, poor optimization and performance, and every other issue you can think of. I figure for a lot of games, it takes about 6 months now to get the game you were supposed to get when you bought it. Obviously not all games are this way, but it is getting increasingly frequent.

    It's not just games. Look at Nvidia and their drivers, look at MS and Windows 10. Quality has gone out the window across the board. So if developers aren't being held to a high enough standard to even release a game that runs properly and is complete, there's no way in hell they are going to go the extra mile and put SLI support in it. No reason to. People will buy it anyway. If developers were doing what they were supposed to be doing with games, drivers, Windows, DX12 and everything else, I would have a compelling reason to upgrade to Win 10. As of now, I don't. The performance gains I might get, if any at all, aren't worth the other crap involved with Win 10.

    But yes, SLI support should be mandatory for a game release. It actually surprises me that Nvidia and AMD don't push this harder; it would only help them sell more video cards.
     
  9. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    They added multi GPU support? Back when I played it with my old GT80 it had only single GPU performance in DX12 and SLI performance in DX11. I didn't know they patched it!
     
    hmscott and HaloGod2012 like this.
  10. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    Yep, they added full DX12 multi-GPU. It works with integrated GPUs also, so an iGPU and a GTX 1080 can work together to boost performance. It's awesome.
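    For anyone curious what an iGPU + dGPU pairing even looks like at the API level: below is a minimal sketch (my own illustration, not code from the game) of how DX12's "unlinked" explicit multi-GPU path starts, by enumerating every DXGI adapter and creating a device per adapter. All calls are standard D3D12/DXGI; splitting the actual rendering work between the devices is entirely up to the developer.

```cpp
// Minimal sketch: unlinked explicit multi-GPU begins with enumerating adapters.
// Build against d3d12.lib and dxgi.lib on Windows 10.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cwchar>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP/software

        // Each hardware adapter (iGPU or dGPU) can drive its own ID3D12Device;
        // the app then splits work (e.g. post-processing on the iGPU) and copies
        // intermediate results between the devices itself.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            wprintf(L"Adapter %u: %ls (usable for explicit multi-GPU)\n",
                    i, desc.Description);
    }
}
```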
     
    hmscott likes this.
  11. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    When did this happen? Last I checked, RotTR used DX12 Explicit mGPU (Linked Mode), which requires matching dGPUs, just like Nixxes' other title DX:MD.
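    For reference, this is what "Linked Mode" means at the API level; a minimal sketch using standard D3D12 (not game code). Matching cards in SLI/CrossFire are exposed as a single device with multiple nodes, which is why an iGPU can't join the link.

```cpp
// Minimal sketch: in linked explicit mGPU, one device reports multiple nodes.
// Build against d3d12.lib on Windows 10.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter (the linked group, if SLI is enabled in the driver)
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // GetNodeCount() > 1 only when the driver has linked identical GPUs.
    // The game then targets each physical GPU via node masks (0x1, 0x2, ...).
    std::printf("GPU nodes on this device: %u\n", device->GetNodeCount());
}
```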
     
  12. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    Can't confirm; I just see YouTube videos of people using their iGPUs with their GTX 1080s and seeing an improvement in the game.
     
    hmscott likes this.
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I personally don't see much future for multi-GPU, honestly. SLI and CrossFire have not matured a whole lot over the years, and support is still spotty at best. I think the niche market will always be there, just because it's a status thing, but I wouldn't expect a whole lot to change. Until consoles decide to go with multi-GPU configurations, you won't see much improvement on the PC front either.
     
    D2 Ultima likes this.
  14. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Source? AotS is the only game I know of that allows this sort of mGPU mode.
     
  15. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
     
  16. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    This is interesting. The last part of the benchmark video actually shows a decrease, but the others show a small bump.
    It would be pretty cool to test with the new Kaby Lake systems, as their GPUs are a little more powerful than Skylake's (which for iGPUs weren't terrible, especially if you had Iris).
    I wonder what it does for lower-end machines, like my GT 540M + Ivy Bridge combo?
     
  17. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I can imagine a similar bump, considering the difference between a GTX 1080 and any iGPU should also be pretty huge despite the improvements.
     
  18. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I agree! They utilized it very well; most do not. It's a shame to waste great hardware when it's available.
     
  19. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    That is a fake/staged video (notice the ratio of upvotes to downvotes). I have RotTR and can confirm that only Linked Explicit mGPU (essentially DX12 SLI/CrossFire) is supported.
     
    D2 Ultima and HaloGod2012 like this.
  20. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It doesn't work perfectly. In fact, scaling is pretty bad compared to a few years ago. And scaling at 4K in DX12 was actually worse than it was at 1080p/1440p, which may even indicate that a CPU bottleneck is the reason DX12 looked better. Here's their blog post for reference; look at the pitiful minimum scaling of DX11 at 1080/1440: https://blogs.msdn.microsoft.com/di...rectx-12-multigpu-and-a-peek-into-the-future/

    I mean, it works. Sure. I won't deny that. It also uses less CPU, which is great considering that CPU usage in AAA games from 2015 onward has been massive, and even highly clocked quad cores with great RAM often aren't enough to hold 120fps (far less what lower-speed chips and more basic RAM would do)... but I can't really say it's perfect, sadly. A step in the right direction, certainly, but their DX11 scaling (especially 1440p onward) was so bad that I think it's more of a technical showcase for DX12 than much else.
    Actually, it's both better and worse than you're considering. Most AAA games are extremely buggy and broken out the gate; it takes at least a couple weeks before they're more stable. But indie games don't really have that problem... they just don't get bought or lauded like the AAA ones do.

    And the most popular games (especially among the streamer crowd) are the most broken. H1Z1, Dead by Daylight, CS:GO, etc. It's terrible. It's like people are proving with their wallets that it's ok to release the most unstable, unoptimized things in the world and they'll buy and play them to no end. And these kinds of games never get fixed. Planetside 2, DayZ, etc are great examples of it.

    I also agree with nVidia's drivers and W10 being garbage quality. Constantly broken stuff, as far as I can tell. But people keep making excuses for them.

    ------------------------------------------------------------------------------------------------------------------------
    I honestly believe that SLI is going to get worse (or die) before it gets better. Devs DO NOT USE DX12 OR VULKAN. They don't bake in optimizations (except what I've seen from ROTTR, and that was only for mGPU functionality, which I'm still not convinced is purely DX12's power) and just use the drivers to handle optimizations. DX12 as-is is a detriment to the gaming community. Devs lazily slap it in to mitigate their overboard CPU usage in most high-profile new games.

    I'm still a lover of mGPU (I don't think single GPU will ever leave me truly satisfied, if I were buying a system and had sufficient budget) but I really can't recommend it except to the most advanced users, and even then, I STILL push for a single strong card before mGPU. I can't even recommend AMD at all for mGPU because they lack a counterpart to nVidia Profile Inspector to force/edit GPU profiles. Their basic list of supported games too is much smaller than nVidia's, and they STILL don't work in windowed modes.

    As crap as it is, the band-aid on the mGPU bandwidth problem (High Bandwidth SLI cable) is still something I'd say was necessary. So at least they... might... be working on something to further mGPU. Maybe by Volta or the architecture after Volta we might have something proper... but it looks like they're just waiting for PCI/e 4.0 so x8/x8 will act like current-gen x16/x16 which solves much of the bandwidth issue they have. I have a very lengthy explanation of the bandwidth problem in my SLI guide you can find in my signature, if you're interested in learning more about it (I'm too tired to type it all out from scratch).
     
  21. bradleyjb

    bradleyjb Notebook Consultant

    Reputations:
    37
    Messages:
    215
    Likes Received:
    129
    Trophy Points:
    56
    Not to disagree about buggy game releases, but the complexity of software has increased a ridiculous amount from the good old days. There's no way they could test for every use case and hardware config (and again, I don't disagree that some devs and games release prematurely). I also think that there's been a huge cultural shift with the ability to just release patches to be downloaded. 15-20 years ago you were forced to have it mostly right, and, to my earlier point, things were much simpler to implement for. All the graphical features, multithreading, etc. unfortunately come with a cost, especially for us PC gamers.

    I hope I never see the day when Nvidia or AMD can mandate what developers should do. Instead, they should strive to make it easier to implement, and/or bake it into their drivers.
     
  22. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It needs dev work and engine support more than anything. Devs tend to use engines that don't support mGPU by default (like Unreal Engine 4) and then not optimize things. Street Fighter V runs worse than Unreal Tournament 4, and looks SIGNIFICANTLY worse/simpler, for one example. Ark: Survival Unoptimized, Dead by Daylight and previously The Culling were the prime examples of terrible optimization to visual prowess ratio. But then you can't mGPU to get better performance.

    Trust me when I say, it is all on the developers this time around. Sometimes, games have bad profiles and that's nVidia's fault (like CoD: Black Ops 3 which needs another profile entirely for great scaling with strong cards) but when a game just doesn't work well even if it has a profile? It's devs only.

    Dark Souls 3 is a good example. The patch that came with the DLC launch for Dark Souls 3 completely fixed SLI scaling. Before, SLI was introducing negative scaling. Single card was better. After, it worked perfectly in SLI; basically held 60fps constantly. I tried to make it drop and it wouldn't. I didn't even change the drivers, it simply just worked significantly better. This is the state of most demanding games since about mid 2014.
     
    bradleyjb likes this.
  23. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Unfortunately there is no incentive for developers to make games with SLI in mind. It would be good if Nvidia / AMD could do something about that, since they are the beneficiaries at the end of the day.
    SLI / CF users are ridiculously rare, though, so I don't see that happening.
    It's pretty sucky since 4K 120Hz is just about a thing now and hardcores could really use it. If I were building a desktop in the near future I'd kinda want to be shooting for that.
     
    Last edited: Jan 23, 2017
    TomJGX and D2 Ultima like this.
  24. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I agree. It is a small market (relatively speaking). But if devs did truly take advantage, we could see some awesome performance and graphics in the future.
     
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It doesn't help that nVidia started neutering SLI since Maxwell.

    They removed using SLI to get higher AA.
    They added features to cards that didn't work in SLI, and some of them still don't (like MFAA).
    They broke SLI equivalency with Maxwell; custom vBIOSes are needed to keep cards' clocks and voltages equal, and there was crashing because cards with voltage variance didn't clock back up fast enough. I don't even know if that ever got fixed.
    They keep adding more features and then saying it doesn't work with mGPU. They use the High Bandwidth bridge to get more bandwidth between cards (killing 3-way mGPU and above) and then in later drivers they legitimately took all the bandwidth benefits (which would have benefitted games that have bandwidth limitations like Rainbow Six Siege and Doom 2016) and shoved it into smoothing out the frame pacing. So no XDMA-style design except for NVLink at the tesla end of things because they can't sell it for more, and then they waste the bandwidth they had.

    Why bother trying to aim for a technology that the makers don't seem to care about?

    It doesn't excuse developers of game engines and games who use tech that relies a lot on previous-frame frame buffer data for no real reason, though... the tech doesn't generally provide better visuals or optimization that makes using single cards preferable, except for SMAA T2x and TAA; the latter being capable of usage in mGPU after some tweaking similar to TXAA, albeit at a larger performance hit than on single GPU. This is also the reason mGPU is so terrible. Couple it with ridiculous CPU load in later games (where the framerate won't increase much anyway unless at super high resolutions, where you AGAIN become bandwidth-starved) and you've got a recipe for disaster.
     
    Last edited: Jan 23, 2017
  26. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Yes, seems to be the blind leading the blind. To play devil's advocate, though, I cannot imagine trying to develop for an unlimited number of hardware combinations and optimize for each, but SLI does always seem to get thrown to the wayside.
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    SLI isn't something you need to make such broad optimizations for, I think. Most of it is handled in the driver; the most you do is essentially make your game "AFR-friendly". And cut down on CPU usage, of course. The extra power of mGPU is pointless if you're using 100% CPU for like 70fps or something. SLI, even for the same FPS, does use slightly more CPU power.

    It's just that devs aren't making their games AFR-friendly. Why? No idea.
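    To make "AFR-friendly" concrete, here is a conceptual sketch (plain C++, no real graphics API; names like FrameResources and gpu_for_frame are made up for illustration) of the cross-frame dependency that breaks alternate frame rendering:

```cpp
// Conceptual sketch: why cross-frame reads hurt AFR scaling.
#include <cstdio>

struct FrameResources {        // everything a single frame writes
    int history_color[4];      // stand-in for a render target
};

int main() {
    FrameResources frames[2] = {};            // double-buffered, one per in-flight frame
    for (int frame = 0; frame < 8; ++frame) {
        int gpu_for_frame = frame % 2;        // AFR: GPUs alternate whole frames
        FrameResources& cur  = frames[frame % 2];
        FrameResources& prev = frames[(frame + 1) % 2];

        // AFR-friendly: build this frame only from data produced this frame
        // (or from CPU-side constants). The driver can keep GPU0 and GPU1
        // completely independent.
        cur.history_color[0] = frame;

        // AFR-unfriendly: a temporal effect (TAA, SMAA T2x, some motion blur)
        // reads what the *previous* frame rendered. Under AFR that buffer sits
        // on the other GPU, so the driver has to copy it across the bridge/PCIe
        // and stall -- this is the dependency that wrecks scaling.
        int taa_input = prev.history_color[0];
        std::printf("frame %d on GPU%d reads history %d produced on the other GPU\n",
                    frame, gpu_for_frame, taa_input);
    }
}
```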
     
  28. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    My personal con, again, is that SLI is just another thing to potentially go wrong or worry about. I would much rather have one great GPU than two OK ones.

    Unless it is something like 2 X 1080s, then.... :D
     
    TomJGX likes this.
  29. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Hahaha. I tell people single strongest GPU before SLI now. Two midrange cards are broken now; nVidia knows it. They don't even let people run 1060s in SLI.
     
  30. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    How do you like running the 780Ms?
     
  31. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Most new games never used them well. Post mid-2014 was a crapshoot, though I did not get many of the more broken titles. When things did work, scaling was often really bad too. Cards that could not touch 780M SLI in Unigine or 3DMark would wreck me in stuff like Black Ops 3 or match me in Evolve etc.

    It was at this point I learned RAM was a huge factor commonly mistakenly overlooked, but one of my 780Ms died just 2 months after I got a RAM upgrade so I could no longer check even if I wanted to.

    I would take one 980M over two 780Ms though, considering the last couple of years.

    Sent from my OnePlus 1 using the force
     
    TomJGX and Spartan@HIDevolution like this.
  32. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Since the 10 series debuted, I feel the 9 series cards are very overlooked. For price-to-performance, there are some great options.
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Indeed. Prices went down with Maxwell, but truthfully, for the quality of the products sold, they weren't very high. Pascal is a crapshoot all over, I find. Most cards don't even hold their default listed "boost" clock, except the 1080 models.
     
    TomJGX likes this.
  34. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Do you mean the 10 series cards specifically are of bad quality, in your opinion?
     
  35. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't think the 10-series cards are really of bad quality, but rather that they're too power limited. I see people with 1070N cards holding a total 1506MHz boost or less with unlocked-FPS gaming, etc. I wish they would have given them a proper power allowance and simply told the notebook OEMs: "hey, deal with it; if you can't cool it, don't offer it. We want our 'within 10% of desktop power' statement to be true at all times in all notebooks."

    The 1080N models are the only ones I've seen in almost any notebook actually do what you expect.
     
  36. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Can you elaborate on the Black Ops 3 thing? I've been using the stock profile and getting quite good scaling on my 980 Tis for the most part. What profile are you talking about? The big downside is that there are fairly large frame drops every once in a while. Funny, it's one of the few games I hadn't looked into because I thought it worked well enough.

    I also just in general really want SLI to come back. DX12 has been killing it... I can't realistically get a 4K 144Hz monitor this year without SLI, like 1080 Tis or 1180s. But there is no point if nothing works. When I bought my cards SLI was working pretty much better than it ever had, and now it's going down again.
     
  37. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    When you bought your 980Ti cards SLI was in a pretty big decline. 980Tis just were capable of pretty much anything, and two of them, even with terrible scaling (let's say 20%) would still have hit the upper limit of many users' CPUs and whatnot.

    Remember that scaling and utilization are not the same. 99% util on each card can only grant you say... 120fps over 100fps. That'd be 20% scaling. SLI is capable of 95% scaling on average. People are now "happy" if they get 70% scaling, and expect 50% or less. I'm not one of those people, and it's why I very much dislike suggesting SLI to people these days.
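    To put the scaling-vs-utilization distinction in numbers, a quick sketch (plain arithmetic using the hypothetical figures above, not measurements):

```cpp
// Scaling is measured against the single-card framerate, not against GPU utilization.
#include <cstdio>

int main() {
    double fps_single = 100.0;
    double fps_sli    = 120.0;   // both GPUs can show 99% utilization for this
    double scaling = (fps_sli / fps_single - 1.0) * 100.0;
    std::printf("Scaling: %.0f%%\n", scaling);                        // -> 20%
    std::printf("~95%% scaling would be ~%.0f fps\n", fps_single * 1.95);
}
```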
     
  38. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Yeah I was getting good scaling, utilization was near perfect, scaling was like 80% though. It did fluctuate depending on the update.

    Yeah, I would say it was in a decline, but coming out of like a peak. Most of the games currently out worked; new ones were starting to not work, though. So I just change it to Battleforge? Nothing else needed? I know sometimes the custom or altered profiles need other tweaks. I'll check that out.

    My CPU is a 5820K at 4.5GHz. It usually had plenty of utilization free on the 8 threads getting used, with the occasional DICE game using all 12.
     
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Games that've been out for a few months usually work well with SLI. Much better than at launch. But it's nothing like when I got my system in 2013, when I didn't even need to update drivers for stuff like BF4's pre-release demo to use mGPU. Even early 2015 was an awful time.

    It doesn't help someone who buys multiple GPUs expecting the power to work to be told they gotta sit on one GPU for a couple weeks or a couple months (or in the case of Dark Souls 3, 7 months) until it decides it's going to work properly with mGPU. Might as well buy one card. SLI is only for the true die hards now, and that will kill it faster.

    But with all the rampant unoptimization going on and everybody marketing 4K like it's actually worth a damn to game on (until I see 4K 144Hz with GPUs and CPUs to push it, and response times low enough too, I won't say it's worth it!), we might see a fallback to mGPU just because there's no other way to get more power. But mark my words, PCI/e 4.0 will bring some life back. There *IS* a bandwidth problem. You can see I left a whole section about it in my SLI guide. But PCI/e 4.0 x8/x8 will be like PCI/e 3.0 x16/x16, so things might churn forward some more. I have no expectations, but for this alone, I'm looking forward to Skylake-E and Volta.
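    The x8/x8 point checks out on paper; a quick sketch of the arithmetic using the published PCIe link rates (spec numbers, not benchmarks):

```cpp
// PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding; PCIe 4.0 doubles that to 16 GT/s.
#include <cstdio>

int main() {
    const double encode = 128.0 / 130.0;          // 128b/130b encoding overhead
    double gen3_lane = 8.0  * encode / 8.0;       // ~0.985 GB/s per lane
    double gen4_lane = 16.0 * encode / 8.0;       // ~1.969 GB/s per lane
    std::printf("PCIe 3.0 x16: %.2f GB/s\n", gen3_lane * 16);   // ~15.75 GB/s
    std::printf("PCIe 4.0 x8 : %.2f GB/s\n", gen4_lane * 8);    // ~15.75 GB/s, the same
}
```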
     
    DreDre likes this.
  40. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    OK, I took a screenshot with the old SLI profile; I'm gonna do the new one later, I gtg. I'll report back.

    The Asus PG27UQ monitor should be good enough for all that. CPUs are fine; GPUs are the problem.
     
    D2 Ultima likes this.
  41. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Could not agree more!
     
    D2 Ultima likes this.
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Nah. CPUs aren't even fine anymore. Lots of games need way way too much CPU usage. Rise of the Tomb Raider can't even seem to hold 120fps+ at 1080p with a Titan X Pascal using a 4.8GHz 7700K with 3000MHz RAM. And it's CPU-limited, too. Because reducing the RAM or using an i5 will provide lower performance.

    Raising resolution *SLIGHTLY* raises CPU usage (negligibly; maybe 2-5% extra utilization going from 1080p to 4K); we need more (though not much more) CPU power by a long shot. Skylake-E might fix this, though. HWE and BWE can't overclock very well, you need good chips to get 4.5GHz+, and the heat is ridiculous. Skylake-E should fix that easily, support higher RAM speeds, and offer more PCI/e CPU lanes (48 instead of 40, I think), so tri-SLI without a bandwidth problem would be great. Add DMI 3.0 for 4GB/s transfers from NVMe drives instead of the 2GB/s limit (excepting burst transfers), and high-end computing might actually be something to salivate over. I didn't like Broadwell, and HWE/BWE were just disappointments entirely.
     
  43. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    I was thinking more of higher core count CPUs. I've never tried with Rise of the Tomb Raider though, so I'm sure that's accurate.

    It's funny. On the one hand, you are correct in everything you've been saying. But I feel like it's a bit pessimistic, especially on the older stuff. 980 Ti SLI made 4K breach 60fps on tons of games that a single card alone couldn't manage. Sure, it has a ton of issues, but my experience gaming at 4K for the last couple of years has been nothing short of awesome. I couldn't have done it without SLI or G-Sync.

    That being said, I am nervous for many of the reasons you've been giving about 4K 144Hz. Not liking the thought of needing to redo my entire build... When is Skylake-E expected? The monitor is launching somewhere between September and December this year. Also, why do you expect it to OC far ahead of current 6+ core CPUs? Above 4.8GHz seems unlikely to me.
     
    Last edited: Jan 24, 2017
  44. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, the higher core count CPUs do help greatly. They're just not supported by every game. I can't say whether ROTTR supports them or not. But I mean, if a 4.8GHz, highest-IPC quad core with hyperthreading and high-speed RAM (on a game that makes use of said hyperthreading, which is about a +30% boost over non-hyperthreading CPUs) can't hold 120fps at 1080p, SOMEBODY has coded it badly. There is no real way it should need so much CPU, and a lot of newer AAA games are requiring that much. It's why DX12 and Vulkan are being tossed at these titles; simply being on the DX12/Vulkan APIs significantly reduces CPU load and balances it better across multiple threads. It's why I'd like Vulkan to be more widely adopted (it does not need W10), if they can get some SLI drivers working for it. The nVidia driver API has DX12/Vulkan support for SLI profiles... games just don't have any yet, really.
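    A minimal sketch of the CPU-side win described above, using standard D3D12 (heavily simplified; a real renderer would record actual draw calls): each thread records its own command list, work that a DX11 driver would mostly serialize on one thread.

```cpp
// Parallel command-list recording in D3D12. Build against d3d12.lib on Windows 10.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC qd = {};                 // direct queue, defaults are fine
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queue));

    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, allocs[i].Get(),
                                  nullptr, IID_PPV_ARGS(&lists[i]));
    }

    // Each thread records its own command list in parallel; this is the CPU-side
    // work that DX11 drivers largely keep on a single thread.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&, i] {
            // (real draw/dispatch calls would be recorded here)
            lists[i]->Close();
        });
    for (auto& w : workers) w.join();

    // Submission is cheap and happens once, on one thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists((UINT)raw.size(), raw.data());
    std::printf("Recorded and submitted %d command lists in parallel.\n", kThreads);
}
```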

    I know, what I say can often sound a bit pessimistic. But really, I just look at the facts. I think we're coming out of a depressing trend in tech, but PCI/e 4.0 is up and coming, Skylake-E is probably around May this year if I remember correctly. I'd like to see the improvements Vega has (and if it affects current-gen games) and see what they force nVidia to do. Ryzen is poised to offer cheap, dual-channel-RAM hexacore/hyperthreading chips for what we currently pay for i7s when it launches. I think we're actually about to enter a good era. But it all rides on AMD, and unfortunately they can't take the enthusiast market due to only supporting Dual Channel memory on Ryzen. We'll see how things work out for their GPUs as well.

    SLI is better at higher resolutions because it forces higher scaling in a lot of games. Though some (like Fallout 4) get pitiful scaling like I think 20% at one point. Running PCI/e 3.0 x16/x16 will significantly improve SLI in those scenarios. The HB bridge and LED bridges should no longer help unless old drivers are used, though.
     
  45. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Why is dual channel memory a deal breaker for gamers? My own testing did not show much difference compared to RAM clock speed with SLI. I'm expecting Ryzen to be decent (at least close to Intel) and Vega to land around where the 1080 is, with Nvidia immediately trumping it with either the 1080 Ti or Volta.
     
  46. Awhispersecho

    Awhispersecho Notebook Evangelist

    Reputations:
    66
    Messages:
    355
    Likes Received:
    306
    Trophy Points:
    76
    If they can get SLI working right for VR, SLI will suddenly be very important. That would immediately increase the number of people capable of using the current-gen headsets while making it much faster to push new 4K headsets, which is where VR needs to be. That, my friends, is what they call a win-win.
     
  47. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's all about temps and speeds. Better RAM reduces CPU usage and can improve GPU usage, depending on the game. If you're hitting gsync limits or whatnot you may not notice too much difference. But for example, here's higher speed RAM doing better than overclocking.

    I said that AMD wouldn't take the "Enthusiast" market, not the "gamer" market. Enthusiasts (like myself) or Prosumers who do productivity work etc may understand the difference, and realize that even though AMD is cheaper, they cannot offer the best performance. But the mainstream high end market? Ryzen can obliterate intel. If 6c/12t, similar IPC to Skylake, is offered for $300-$350, then Intel simply loses. Especially if it OCs well.

    The trick to RAM is that timings and speed need to match. If you're using 1600MHz 8-8-8-24 DDR3 and you change to 2133MHz 10-12-12-31 DDR3, you're actually getting a net reduction in aggregate bandwidth. But basic DDR4 RAM (2133MHz 15-15-15-35) against decent 3000MHz-3200MHz RAM (15-16-16-36 and 14-14-14-34 respectively) is a straight upgrade. People don't test the timings or even list them. Quad channel grants a lot more read/copy bandwidth, and helps for games that like it (Frostbite 3 engine, Unreal Engine 4, Cryengine 3, all LOVE good RAM). For the DDR3 RAM to make sense, you'd need some say... 2400MHz 9-11-11-27 RAM (which exists for quad channel in DDR3). Then you'd see straight benefits. Alternately, if you had bad 1600MHz RAM (like my old 1600MHz 11-11-11-27) and upgrade (like to my current 2133MHz 11-11-11-31) you can see massive benefits.
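    To put rough numbers on the "timings + speed" point, here is a quick sketch of the usual first-word-latency and peak-bandwidth arithmetic, using the DDR3 upgrade mentioned above (spec math, not a benchmark):

```cpp
// First-word latency = CAS cycles at the memory clock (half the transfer rate);
// peak per-module bandwidth = MT/s * 8 bytes for a 64-bit DIMM.
#include <cstdio>

static void compare(const char* name, double mts, double cas) {
    double first_word_ns = cas * 2000.0 / mts;   // CAS / (MT/s / 2), in ns
    double peak_gbs      = mts * 8.0 / 1000.0;   // GB/s per 64-bit module
    std::printf("%-22s  latency ~%.2f ns, peak ~%.1f GB/s per module\n",
                name, first_word_ns, peak_gbs);
}

int main() {
    compare("DDR3-1600 11-11-11-27", 1600, 11);  // ~13.8 ns, ~12.8 GB/s
    compare("DDR3-2133 11-11-11-31", 2133, 11);  // ~10.3 ns, ~17.1 GB/s: better on both counts
}
```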
     
  48. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Interesting. I was aware of the importance of RAM timings for RAM-intensive tasks, but I guess, what do you do as an enthusiast that yields huge gains from it?

    My RAM is admittedly trash right now, but I was planning on upgrading to much higher clocked RAM for the 4K 144Hz display.
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's not really that RAM timings benefit RAM-intensive tasks... it's the combination. RAM moves X amount of data per cycle, but if your cycles take forever to show up, then what's the point? You should aim for a net increase in delivered data. Some programs prefer speed over everything, but for the most part, you need timings + speed for the benefits. In plain old gaming it helps. The most obvious benefits are higher minimum framerates and reduced CPU load in many titles/operations. Swapping my RAM netted me a 20% utilization reduction per CPU core driving a GPU in Killing Floor 2 (this is 20% out of 100%, mind; I went from about 82%/95%/10%/10% to 62%/75%/10%/10% and simply sat at 120fps almost the entire time). Evolve Stage 2 gave me a 30fps boost to my average frames, going from 80fps to 110fps. I didn't change anything; I swapped RAM, launched the game the next day and poof, performance bump. The people I was playing with didn't get the performance bump, so it wasn't a game update.

    I simply know the importance of good RAM right now. I'll never settle for crap RAM again =D.

    But I'll admit, quad channel memory over dual channel memory for a lot of gamers might not be worth the cash outright, however if you're getting a hexacore and quad channel memory capable system, I'd say you should get it. It can only improve, and there's 3200MHz 14-14-14-34 quad channel kits from G.Skill now. Compared to the cl18+ we had when DDR4 was released, it's good times for RAM.

    This is why I say Ryzen can take the mainstream/gamer-focused market. They can't take the enthusiast market because someone who is aiming for the best will simply HAVE to choose intel. It isn't a choice, like it currently isn't a choice between the FX series and intel's chips. If AMD had allowed for quad channel support with Ryzen and launched the hexacore at ~$325 and the quadcore at ~$250 and the octocore at ~$500, intel would be in serious trouble. If Ryzen OCs well there'd be no reason to use Intel (except I guess for non-W10 support? Both Intel and AMD aren't bothering to provide non-W10 drivers for chipsets beyond Z170).
     
  50. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Very interesting that VR has not been addressed yet. I think that could potentially be what "saves" SLI: creating a problem SLI is already in place to solve.
     