AMD cards are doing a pretty splendid job of "competing" in Far Cry 4. More like utterly destroying Nvidia in perf/price. Case in point, a $300 290X versus a $550 980. 45% cheaper but only 5% slower. Or how about a $250 290 versus a $350 970. 29% cheaper but actually 3% faster.
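Quick sanity check on those percentages, using just the prices and performance deltas quoted above (no fresh benchmark data here, purely back-of-the-envelope):

```python
# Rough price/performance comparison from the street prices and deltas quoted above.
cards = [
    # (AMD card, price, NVIDIA card, price, AMD perf relative to the NVIDIA card)
    ("R9 290X", 300, "GTX 980", 550, 0.95),  # ~5% slower
    ("R9 290",  250, "GTX 970", 350, 1.03),  # ~3% faster
]

for amd, amd_price, nv, nv_price, rel_perf in cards:
    discount = 1 - amd_price / nv_price                 # how much cheaper the AMD card is
    value_ratio = rel_perf / (amd_price / nv_price)     # perf per dollar vs. the NVIDIA card
    print(f"{amd} vs {nv}: {discount:.0%} cheaper, {rel_perf:.0%} of the performance, "
          f"~{value_ratio:.2f}x the perf/dollar")
```

Works out to roughly 1.7x the perf/dollar for the 290X vs the 980 and about 1.4x for the 290 vs the 970.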
-
NVIDIA destroys AMD in the world of notebooks though. So it's not all bad news for the greens.
-
moviemarketing Milk Drinker
-
Sad as it is to say, it's not always the drivers (though I know a lot of the time it is); sometimes the game's code really is that bad. Devs also rarely test with multi-GPU setups. Factor in how little some of them care about the PC platform and you get a scenario where a game like Titanfall took something like 4-5 months to get a single SLI profile, and it's STILL broken. Please note, Titanfall runs on the Source engine; an engine which has supported pretty much everything you could throw at it since time immemorial.
Now you might have a much better idea as to why I hate the whole AAA game market so much right now. Hell, even when I dropped to 38fps at one point testing SLI in Evolve's alpha, it never felt stuttery in the slightest; it was basically like running on a single GPU. That game hasn't even officially hit beta yet, let alone released. CoD: AW? HA. Dark Souls 2? It randomly drops to 45fps in some specific parts of the world while GPU utilization stays low (it's not all that common, though), and it feels like playing at 20fps the minute it goes below 50 anywhere with SLI on.

BF4 will happily drop me to ~90fps from my 125fps cap with GPU utilization sitting at 70% and CPU utilization barely cracking 60%, yet TotalBiscuit's Dragon Age review had his 980s sitting at 90-95% almost constantly. Same Frostbite 3 engine, different publisher mindset at release (framerates aside). Know how I have to get 99% constant scaling on my 780Ms in BF4? Turn the game to ultra and THEN bump the resolution scale.

Let's not even talk about Watch Dogs, which took months to get into an acceptable (not good, not great) state for most PC users, and then BROKE MULTI-GPU SUPPORT WITH A NEW PATCH ALONGSIDE DLC. AC: Unity is riddled with bugs, and no system on this planet with fewer than 3 GPUs will stay above 60fps at 1080p in it. Far Cry 4? LOL. The game stutters like mad while driving around, as if it's running into a vRAM bottleneck or something... except it doesn't max out a 4GB vRAM buffer. AND MORE TO THE POINT: THE PS4 VERSION OF FAR CRY 4 LOOKS ALMOST AS GOOD AS PC ON MAX GRAPHICS, BUT RUNS AT A SOLID FRAMERATE WITH NO HITCHING AND FAR FEWER BUGS, ON HARDWARE THAT'S BASICALLY AN ENTRY-LEVEL CORE 2 QUAD WITH A GTX 570. I mean really. REALLY?
No, I'm not blaming the GPU manufacturers right now. I don't get a lack of smoothness in most older games; I get great performance with low vRAM use in older games with similar or better graphical fidelity than current new titles, and almost no stuttering or problems at low framerates even in unoptimized titles like Arma 2 and Arma 3. When I drop to ~30 there, the game still feels like a solid 30fps experience, and I have to go down to 25 or so before I start having problems bad enough to make me run the game on a single GPU.
So yeah... microstutter and bad frametimes are a huge problem, but drivers aren't really to blame right now. If drivers were the main culprit, they'd be broken in most titles, which they're not. They're only broken in NEW titles, and from the same kinds of devs too. Hell, my SSF4 AE and then USF4 have had random stuttery spells since I got this bloody PC. I thought it was an SLI problem, so I forced the game onto one card. Nope, still does it. Most other people don't have the problem either; the game just doesn't seem to like my system (or other people aren't as sensitive to it as I am). SF4 doesn't do it, though. Never did. So go figure. -
CODAW for me at least starts getting stuttery in the 40fps range.
-
Yeah same here. I swear somewhere between 40 and 50 FPS my setup hits a stuttering threshold, below which I really have to either turn down settings or simply not give a damn to keep playing. Happens both on my laptop and desktop.
What really grinds my gears is that on the desktop Maxwell cards, I get the same stuttering effect whether in SLI or on a single GPU. I don't disagree that bad coding plays a big role, but in this case I very strongly suspect it's due to the dynamic throttling Maxwell does to stay efficient and cool. Well guess what, I DON'T GIVE A RAT'S BUTT that my card's load temp is 55C if I'm losing performance and stuttering.
Yeah, so nVidia wasn't kidding when they said Maxwell was designed "mobile first". What they didn't mention in the fine print is that features that benefit mobile users but hurt desktop users are included too, such as this dynamic throttling nonsense that might make sense on laptops for the improved efficiency and reduced heat, but does jack all for desktops. -
-
Trust me that's the first thing I tried and it didn't make any difference, except causing my master card to not downclock and run full boost even while idling on my desktop LOL. These drivers are a joke.
But hey at least I don't have the dreaded DisplayPort issue people seem to be having with anything newer than 344.11. That's actually the major reason why I didn't bother updating beyond 344.11, since I am running off of DP. -
-
I guess his cards and/or the drivers are borked.
-
Something is borked that's for sure. Maybe it's my brain or my eyes...
-
Anyway, little laugh aside... so what you're saying, n=1, is that setting "prefer maximum performance" causes downclocks in games, keeps the stuttering, AND leaves the GPU at max clocks on the desktop?
So... you know I know you hate having to fix basic functionality on your own and all that, but seriously? You might want to use that modded vBIOS right about now. I don't think I could live with something broken that I know I could fix. -
I've been doing some reading on modding my own vBIOS and it's tempting. The problem is that in order to fix the weird voltage discrepancy bug in SLI, I pretty much have to pre-overclock the cards, if that makes sense, so a bit more work still needs to be done.
-
The issue goes much deeper than that...
So the root problem is that, for whatever reason, the 344.xx drivers fail (or refuse, god knows) to recognize that different ASIC qualities in these Maxwell cards can lead to different VDDCs when running boost. For example, my master 970 has an ASIC of 74.3% and can do 1367MHz boost on 1.156V. The slave card has an ASIC of 66.1% and needs 1.206V for 1367MHz boost. So even out of the box, AT STOCK, there's a 50mV difference between the two cards.
So any attempt at overclocking without manually compensating for the voltage results in constant crashes and hard locks. What apparently happens is that the master card is forced to run the full boost OC but never overvolts for whatever reason, and as you can imagine, trying to run 1500+ boost on 1.156V just isn't gonna fly. The workaround is to let the master GPU (higher ASIC, lower VDDC at stock) lag behind the slave GPU by some offset. That offset has to be determined by testing, because I've seen some really weird stuff happen:
Oh and just for kicks, can you spot what's wrong with these Firestrike results? I'll give you a hint: one set of cards had nearly matching ASIC (64.1% vs 66.3%), while the other set didn't. Care to guess which is which?
After a very quick test, it seems the master GPU (the lower-voltage card) is bogging down the entire SLI setup, including itself ROFL
With SLI disabled and +120 core/+300 mem, boost increased to 1524MHz and voltage jumped to 1.225V while running Firestrike. Keep in mind the master GPU runs 20MHz behind the slave (the higher-voltage card). At +140 core/+300 mem, boost further increased to 1545MHz and voltage stayed put at 1.225V.
So what this means is that if everything worked properly, I should see a boost of 1524MHz in Firestrike. But I don't; instead I see 1506MHz, and the reason is that the master GPU is downvolting to 1.200V instead of 1.225V, so it simply doesn't have enough juice to eke out another boost bin or two. Even with the exact same OC, the cards with comparable ASICs and voltages worked much better than the other set. Clearly the "mismatched set" is losing about 40MHz of core clock even though on paper everything appears to be the same, which would explain the 3% loss in Firestrike score.
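Just to put rough numbers on that (this is only back-of-the-envelope math from the figures above; the ~13MHz boost-bin size is my assumption about Maxwell's GPU Boost granularity, not something I've confirmed):

```python
# Back-of-the-envelope check of the clock/score loss described above.
BIN_MHZ = 13            # assumed size of one Maxwell boost bin (not confirmed)

expected_boost = 1524   # MHz, what the master card does solo at +120 core
observed_boost = 1506   # MHz, what actually shows up in Firestrike
matched_set    = 1545   # MHz, what the matched-ASIC pair sustains at the same OC

lost_mhz = expected_boost - observed_boost
clock_deficit = 1 - observed_boost / matched_set
vddc_gap_mv = (1.206 - 1.156) * 1000

print(f"Boost shortfall: {lost_mhz} MHz (~{lost_mhz / BIN_MHZ:.1f} boost bins)")
print(f"Clock deficit vs the matched set: {clock_deficit:.1%}")  # ~2.5%, in line with the ~3% Firestrike gap
print(f"Stock VDDC gap between the cards: {vddc_gap_mv:.0f} mV")
```

So the ~2.5% clock deficit against the matched pair lines up pretty well with the ~3% Firestrike difference.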
Yeah it's a cluster____ -
And maybe now you understand why I'm reluctant to do any more fixing. Seriously the first couple days after I got my cards back from RMA I had to deal with this BS, and if it wasn't for someone pointing out the workaround on the GeForce forums I'd never have figured it out. And just to top it off I can't push my cards too hard either because once I start overvolting in Afterburner all bets are off, so I'm basically limited to what the cards can do on 1.2V.
-
And boost is the core problem again too, right? But you can't just disable boost, because then when the card drops to 2D clocks it'll still be sitting at something like 1.2V, wasting power like crazy. #GGnVidia
And of course... the big thing... this doesn't happen at reference/stock for most of these cards, so they'll simply say "well not a problem". -
I honestly don't know what nVidia screwed up. If it's boost why didn't the Kepler cards have any issues with older (pre 344) drivers? Unless the driver has the ability to modify the boost table, but I thought that was hard coded into the vBIOS.
I just hope the 390X comes with more than 4GB of vRAM, and that AMD continues to improve on their frame pacing issues. They've definitely made a step in the right direction with Catalyst 13.8, but they still have a lot of catching up to do. 4GB of vRAM and frame pacing would still be two major reasons I might not go for the 390X in the end. And before you say it, no, I can't go back to single-GPU setups until something with the power of 2x GM200/390X is released. But by then we'll probably need two of those cards to run 60 FPS at 1080p given the optimization these days... (yes, it's the epitome of hyperbole, deal with it)
-
There have been several frame pacing drivers since Cat 13.8 plus AMD has XDMA. Honestly CrossFire is looking better than SLI right now.
-
I want AMD to have a healthy lineup so that both companies bring out the best in each other. Honestly, for all the problems you're having with Maxwell, there are just as many people who enjoy the easy driver updates and all the stuff you can do in NCP, are running a 970 on a 500W PSU with an i5-4690K at stock, and are beyond happy with their machine.
So I want AMD to bring out something good without it being a blast furnace, I want nVidia to stop being as complacent as they currently are, and I want everybody to have good competition. This is the first time in as long as I can remember that AMD has NOT brought out a new lineup before nVidia's new lineup. Usually AMD's new line is what kicks nVidia into some sort of gear. I can't imagine the state Fermi would have been in if the 5000 and 6000 series from ATI/AMD had been a flop.
For my sake as an nVidia user, I want AMD to smoke them. And for the AMD users out there, I want AMD to improve. I can't say I'd move to AMD (especially not dual-GPU, as long as I need fullscreen for CrossFire to work), but seriously, I might just have no choice at some point. -
-
Up to 1600p it's a wash but the 980 is already starting to lag, and at 4K the 780 Ti is clearly better except in Metro LL.
-
PCPer has FCAT numbers of 290X CrossFire vs. 980 SLI:
NVIDIA GeForce GTX 980 and GTX 970 GM204 Review: Power and Efficiency | PC Perspective
As I see it, CrossFire is basically even with SLI in frame consistency now except in the lone remaining problem child which is DX9.
EDIT:
I guess if you really wanted to compare newer vs. older Cats in terms of frame pacing, you can take a look at the 290X CrossFire numbers from launch. These were obtained using 13.11, while the ones above were using 14.7.
Frame Rating: AMD Radeon R9 290X CrossFire and 4K Preview Testing | PC Perspective
And here's a possible ace up AMD's sleeve: frame time variance in CrossFire Mantle is obscenely low. It's better than single-GPU DX11 on their high-end (3960X) test platform. A multi-GPU setup having less microstutter than a single-GPU setup?!
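For anyone not sure what "frame time variance" is actually measuring in these FCAT write-ups, here's the general idea with made-up numbers (this is just the concept, not PCPer's methodology or data; their actual results are in the link below):

```python
# Toy illustration of frame-time consistency vs. average FPS (made-up numbers).
from statistics import mean

def frame_stats(frame_times_ms):
    fps = 1000 / mean(frame_times_ms)
    # Frame-to-frame swings are what you feel as microstutter.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return fps, max(deltas)

smooth  = [16.7, 16.9, 16.6, 16.8, 16.7, 16.9]  # consistent pacing
stutter = [10.0, 23.0, 11.0, 22.5, 10.5, 23.5]  # same average FPS, alternating frames

for name, times in [("consistent", smooth), ("microstutter", stutter)]:
    fps, worst_jump = frame_stats(times)
    print(f"{name}: {fps:.0f} avg FPS, worst frame-to-frame jump {worst_jump:.1f} ms")
```

Both runs average ~60 FPS on paper, but only one of them feels like 60 FPS, which is exactly what frame-time graphs expose and a plain FPS counter hides.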
Frame Rating: Battlefield 4 Mantle CrossFire Early Performance with FCAT | PC Perspective -
The game is broken for both AMD and nVidia users. Being a GameWorks title means there's at least an SLI profile while the CrossFire profile is disabled, but the SLI profile has a game-breaking shadow bug, so you're not going to be playing with SLI on anyway. That, and at least the 970/980 don't seem to have the constant stuttering bug with a single card.
-
Really? TotalBiscuit had huge game-breaking stutter in his video using 980 SLI.
-
Single card appears to work OK; SLI is broken, so you may as well not use it. This is what they had to say:
Also, inb4 some random doofus goes "use single GPU hurr durr": yeah, well, that's kind of not the point -
That is the game I will be playing after I'm done stuffing my face.
-
I can't enjoy this game on my 780M. I just can't. Too stuttery and low performance. Only cost me £4.49 to preorder so I don't feel like I've lost out.
I'm gonna try out Dragon Age Inquisition using the refund money I've just received from Amazon for an 8GB Crucial RAM module I never needed to use in my Gigabyte and returned. Hopefully that will tide me over until I get a 980M again to enjoy FC4 at its full potential. -
-
Getawayfrommelucas Notebook Evangelist
Yeah - I haven't really given this game a shot. I've been too busy playing MyNBA2k15 mobile + Dragon Age 3. But I did notice they pushed out a patch that disabled some settings in the game to increase playability. Can't confirm it though.
Anyone with a 680 have anything positive to say? -
http://youtu.be/q8u2wzKx91M
My settings are in the video, though I have overclocked my 780m. -
Anyways, enough off-topic DA:I talk from me; if I have any other questions I'll take them over to the relevant thread. -
Just set your GPU to adaptive vsync (half refresh rate) for the game and you should be able to crank your settings.
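The reason half-refresh adaptive vsync buys you so much settings headroom is simply the per-frame time budget; a quick sketch, assuming a 60Hz panel:

```python
# Frame-time budget at full vs. half refresh on a 60Hz panel (illustrative).
refresh_hz = 60
for divider in (1, 2):          # full refresh vs. half refresh
    target_fps = refresh_hz / divider
    budget_ms = 1000 / target_fps
    print(f"{target_fps:.0f} FPS cap -> {budget_ms:.1f} ms per frame to spend on eye candy")
```

Going from a 16.7ms budget to a 33.3ms one is why you can crank settings so far at half refresh.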
-
King of Interns Simply a laptop enthusiast
Any updates on this? I read that patch 2 only fixes HairWorks, whatever that is. No mention of stuttering, framerate hitching, etc.... The developers must know about it!
-
King of Interns Simply a laptop enthusiast
-
Not even 980 SLI + a 5930K can play Inquisition at 1080p/60. Good luck with a 980M.
-
King of Interns Simply a laptop enthusiast
I tried setting GPU max buffered frames to 3 and now I can play practically stutter-free at the very high preset with some other features set to ultra. Only the trees/vegetation seem to flash a lot when moving, especially while driving.
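That fits the usual rule of thumb that queuing more buffered (pre-rendered) frames smooths out frame delivery at the cost of input latency; a rough worst-case sketch of that trade-off (generic numbers, nothing measured in FC4):

```python
# Rough worst-case input lag added by queued (pre-rendered) frames.
def queue_latency_ms(fps, queued_frames):
    return queued_frames * 1000 / fps

for fps in (30, 60):
    for frames in (1, 3):
        print(f"{fps} FPS with {frames} queued frame(s): "
              f"up to ~{queue_latency_ms(fps, frames):.0f} ms of extra latency")
```

So at 30 FPS, 3 queued frames can mean up to ~100ms of extra lag; smoother delivery, but worth knowing what you're trading away.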
I guess a few patches later we can call this playable... -
dumitrumitu24 Notebook Evangelist
How do you guys like the god rays in this game? In Black Flag they weren't too impressive, but in this game they really make a difference. I tried mixing settings today and they can reduce performance by up to 20% in some scenes, but I guess it's worth dropping HBAO to SSBC to keep god rays on. I hope nvidia puts out some great drivers before the end of the year that improve performance, because the last Game Ready drivers didn't do anything. The game is still unplayable because of the stuttering while driving and the framerate drops, but I hope they fix it.
-
King of Interns Simply a laptop enthusiast
Has anyone else noticed the game runs MUCH better since the patch? I'm running a mixture of very high and ultra settings at 1080p with my mildly OCed 680M and things are smooth; I'm still slowly increasing the settings. The stuttering has stopped and the foliage no longer flashes when I move about! The cutscenes also play cleanly now; before, they were completely messed up!
Time to play.... If you have a 780M or above you should be able to max this game at 1080p no problem.
edit: make that max settings minus anti-aliasing, still smoothly playable at 930/1100MHz! Not a great framerate at these high settings, though! -
Also, define 'smooth FPS'. Are we talking about 30? Because I was struggling to maintain a steady 60 in certain areas (the villages/outposts - seems to be NPC related but I can't imagine I'm CPU limited). -
King of Interns Simply a laptop enthusiast
I did also update my drivers, but I can't imagine that would have improved things so much! Anyways, I'll set the 680M to 1GHz, play away at ultra at a solid ~30fps, and more or less enjoy it now.
Despite using the same engine as FC3, FC4 definitely has much more in-game detail. I averaged 1800-1900MB of vRAM over a 2-hour gaming session, with vRAM hitting up to 2030MB max according to GPU-Z. It seems that if you keep settings at max minus anti-aliasing and keep the FOV slider at normal at 1080p, you can just about keep vRAM usage under the limit on 2GB cards.
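Putting those readings next to the buffer size (treating "2GB" as 2048MB of usable vRAM, which is an assumption; the driver and desktop also eat into it on some setups):

```python
# vRAM headroom check for the usage numbers reported above.
budget_mb = 2 * 1024              # 2GB card
avg_low, avg_high = 1800, 1900    # observed average range
peak_mb = 2030                    # observed peak per GPU-Z

print(f"Average usage: {avg_low}-{avg_high} MB "
      f"({avg_low / budget_mb:.0%}-{avg_high / budget_mb:.0%} of the buffer)")
print(f"Peak usage: {peak_mb} MB -> only {budget_mb - peak_mb} MB of headroom left")
```

That ~18MB of peak headroom is basically nothing, which is why bumping anti-aliasing or the FOV slider tips 2GB cards over the edge.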