I think this post overestimates the number of existing SLI users on non-VR-capable systems. SLI users are usually on the cutting edge; I doubt many 780 users or lower are still running SLI on those cards. I know I want to upgrade this year, monitor willing.
-
-
Support.2@XOTIC PC Company Representative
I think it could go either way. Either you SLI'd some time ago and are looking to upgrade since your SLI cards are being outdone by a single GPU, or you're building a modern system and going all out with SLI. -
Yes, but on modern systems the only cards even capable of SLI are all VR-ready. The 1070 and up can run VR right now, and they're the only cards that can SLI.
-
Support.2@XOTIC PC Company Representative
True. I was referring more to VR taking full advantage of the hardware in the future, or even better GPUs to SLI. -
Oh yeah, I definitely want it to happen. I just meant I don't think there's a huge untapped market of older cards that are SLI'd.
-
Well... It's slightly better. Only a few percent though. -
I can't actually see much xD. The images are really small.
-
Top right: fps is 114 vs 120. The usage numbers aren't relevant because they were flying up and down...
Ironically that was 4K, but I can only upload 2MB images so that's like 360p. -
What was your CPU load at?
I have no idea how people get 200fps in that thing. -
Well, I was at 4K. I was at a paltry 26% CPU load at only 4.1GHz, lol. I get way over 200 at lower resolutions.
-
Interesting. *shrugs*
-
OK, so something's weird. I dropped to like 640p and even 50% render scale, and CPU load stayed the same, in the low 30s, with fps at around 230. Something else is bottlenecking. No settings really increased the CPU load much, though occasional brief spikes happened for a couple of frames.
-
That's really weird. You have a 4.5GHz 5820K, right? On my system, if I dropped graphics to the lowest (except render scale), I'd get under 120fps with 100% CPU load being a huge bottleneck. Scaling that to your chip, you should be at around 52% for 120fps (give or take)...
This probably means the game loves quad-channel memory and a large CPU cache, because quad-core users can't really hold 120fps easily, but people on the -E chipsets seem to get 200fps like it's going out of style.
That game is just buckets of unoptimization. -
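(That 52% figure is just this kind of back-of-envelope scaling. A minimal sketch, assuming CPU load scales linearly with frame rate and inversely with cores x clock; the first chip's specs here are hypothetical, and real games rarely scale this cleanly, so the number comes out in the same ballpark rather than exactly 52%:)

```python
# Back-of-envelope CPU-load scaling between two chips.
# Big assumption: load scales linearly with frame rate and inversely
# with (cores * clock). Real games rarely scale this cleanly.

def scaled_load(load_pct, fps, target_fps, cores, clock_ghz,
                target_cores, target_clock_ghz):
    """Estimate CPU load on a different chip at a target frame rate."""
    throughput_ratio = (target_cores * target_clock_ghz) / (cores * clock_ghz)
    return load_pct * (target_fps / fps) / throughput_ratio

# Hypothetical: 100% load at 120fps on a 4-core 4.2GHz chip, estimated
# on a 6-core 4.5GHz 5820K holding the same 120fps:
est = scaled_load(100, 120, 120, cores=4, clock_ghz=4.2,
                  target_cores=6, target_clock_ghz=4.5)
print(round(est))  # prints 62
```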
Er, I'm currently using dual-channel DDR4, lol. Otherwise probably right. My RAM is 100% a bottleneck in many games right now, but it's still well above 60, so I've been waiting to upgrade until closer to when I upgrade the rest of my stuff. Though the rising RAM prices are making me think it may be a better idea to buy ASAP. I haven't thought about that too much, actually.
-
Are you only using two sticks? If you're using four, it might be doing quad channel anyway. Four sticks aren't guaranteed to run quad channel, but they can.
Also, here you go: https://www.newegg.com/Product/Product.aspx?Item=N82E16820232349
Though you could probably get the 32GB version or something to that effect. I don't believe there's a 16GB version, unfortunately, but you can look on G.Skill's website. Corsair is closing in on timings, but they don't have the best, so they don't get recommended~
It's so weird... I cannot understand why that game runs at such high FPS for some people. I cleanly hit a CPU limit, and you're using two cards as well, so it's not like SLI is the problem. I distinctly notice a CPU utilization spike above 78fps in particular. My RAM upgrade helped quite a bit, but it still ran terribly and had that fps-dropping bug you said you saw. That game really wasn't worth money to any PC gamer, honestly. -
Two sticks right now... Also, Christ... not planning on dropping half a grand on RAM...
Also odd: I've had pretty much no issues with it at all. My P650RS can perform well on a 144Hz monitor... I also love Black Ops 3; I'm ridiculously good at it, lol. My K/D rises dramatically on a high-refresh-rate monitor, though.
Case in point: this was Sunday on 165Hz:
-
32GB is $300: https://www.newegg.com/Product/Product.aspx?item=N82E16820232348
Interesting. Really interesting. Weird game, I tell you.
I actually couldn't get the game working properly. Shooting people was always a pain. I know when I physically miss, but most of the time I'd be dead on and just get no hit markers. Question: do you play with a controller, by chance? I'm wondering if the game just never liked my internet. Black Ops 1, Black Ops 2, and MW2 never had problems. Ghosts was great. In AW, BO3, and MW3 I just could never really get people. I know MW3 was internet-based, and if I got a decent lobby it worked and I just would've needed a different playstyle than my run-and-gun, but AW and BO3 were two *different* kinds of games. -
No, controller is pretty good on BO3 because of the ridiculous auto-aim on PC, but I'm way better with a mouse and keyboard (like playing on Xbox vs PC, I'm a way better shot on PC).
AW was awful, and MW3 I liked but hated the maps. I loved BO3; I think Treyarch toned down the stupidity of AW but increased the skill gap. If you know what you're doing, there are a ton of tricks you can do to make people miss. I love all the tactics it allows. Getting good at jump shots and wallrun shots helped a ton. I have a lot of advantages, though: I'm often on gigabit internet with a really low ping to most servers, and even when I'm not, I didn't notice much more lag. Run and gun is where it's at. That 43 K/D game wasn't headglitching, lol. -
I see. I play more like this:
Not the best of my gameplay, and I missed a good few shots, but you get the idea. I was just testing recording with a program called Action! at the time. The class I chose at the beginning was built specifically for beating someone who hates it when I jump. So I told him I'd never jump in combat, I'd just use every OP weapon in the game. I *ALWAYS* make him rage quit. But it's too cheap to use against normal people.
But BO3? I just, like, can't hit people. At all. It does not work. I've since fixed that random lag spike you saw near the start of the video with WLAN Optimizer, so that problem wouldn't exist for BO3. It's like the game just hates my internet or something. -
Yeah, interesting... That gameplay was pretty solid. Some of those shotgun misses that turned into knives, you so should have died from those, lol, but otherwise clean.
In Black Ops 3, depending on where I am, I always try to do the least expected move: jump, slide, drop shot, or wallrun. Because you have so many options, if you learn what people expect, you can often win a lot of gunfights. I also have good aim, though; I normally snipe, and after sniping for a long time, normal guns feel pretty simple to use.
Here is my 43-0, split into two parts because I'm incompetent.
https://vid.me/HzeT
https://vid.me/zXQ3
The first one shows positioning. Around kills 3-8, I'm slowly moving around that area, always trying to get away from where they think I am.
In the second clip, I get a ridiculous kill on a headglitcher right after the flamethrower runs out. I totally should have died, but I basically didn't miss a shot.
I also get lucky though. Literally should have died in the very first encounter I had. -
Yeah I can't hit like that at all. It really doesn't work.
Here's another of me in BO1, grabbed with Shadowplay so no Theater-mode oddness:
I mean, if I know I'm shooting properly at someone, they should be hit. But BO3 and AW is just... weird. -
I had such a love-hate relationship with the 74u. It was so OP... but so much fun to use, haha. Yeah, idk, that's weird.
-
Support.2@XOTIC PC Company Representative
I came across this video yesterday and thought it would be relevant here. It's amazing this was even achievable, although it's worth noting he's using top-of-the-line everything. I was also astonished it actually worked, both in-game and with Windows resolution settings.
-
I mean, two 980 Tis can do 4K pretty handily. Two Titan Xs are almost double that, which is about what you'd need for 8K to mostly work.
Still, that is pretty awesome, and I expected more stuttering. -
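(For reference, the raw pixel math, assuming the standard 4K UHD and 8K UHD resolutions:)

```python
# Pixel counts for the standard UHD resolutions.
uhd_4k = 3840 * 2160   # 8,294,400 pixels
uhd_8k = 7680 * 4320   # 33,177,600 pixels

# 8K pushes 4x the pixels of 4K, so roughly doubling the GPU power
# only gets you partway there -- hence "mostly work".
print(uhd_8k / uhd_4k)  # prints 4.0
```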
Support.2@XOTIC PC Company Representative
Yes, they may have edited troubleshooting out of the video, but it seems like it was just a few setting changes and they were up and running, which is very surprising. -
The HB bridge gains are aimed entirely at avoiding stutter and improving frame pacing, so technically it makes sense that it didn't stutter. Honestly, though, that wasn't really using SLI. Maybe running the separate screens did a split-frame-rendering kind of deal, but it really isn't what you'd get with SLI in most games.
He honestly would have had a better time running a single 4K display with 4x DSR; then he could've used SLI properly. The framerate would have been similar, but I think it'd have improved, due to the memory-access bandwidth improvements SLI offers, which as far as I know you don't get when you turn off SLI and run multiple screens on multiple GPUs. I remember Titanfall 1 had serious mGPU problems at launch (30fps lag), while people running multiple GPUs across 3 monitors had no issues whatsoever; that's where I get the information from. -
The Future of Multi GPU
Discussion in 'Gaming (Software and Graphics Cards)' started by HaloGod2012, Jan 14, 2017.