The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Far Cry 4

    Discussion in 'Gaming (Software and Graphics Cards)' started by moviemarketing, Oct 30, 2014.

  1. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    AMD cards are doing a pretty splendid job of "competing" in Far Cry 4 ;). More like utterly destroying Nvidia in perf/price. Case in point, a $300 290X versus a $550 980. 45% cheaper but only 5% slower. Or how about a $250 290 versus a $350 970. 29% cheaper but actually 3% faster.
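The perf/price figures above check out as a quick back-of-envelope calculation (prices in USD and relative performance are the numbers quoted in the post, not independently verified):

```python
# Sanity-check the perf/price figures quoted above.
# Prices (USD) are as stated in the post, not independently verified.
def cheaper_pct(price_a: float, price_b: float) -> float:
    """How much cheaper card A is than card B, as a percentage."""
    return (price_b - price_a) / price_b * 100

# $300 290X vs. $550 980: ~45% cheaper for ~5% less performance
print(f"290X vs. 980: {cheaper_pct(300, 550):.0f}% cheaper")
# $250 290 vs. $350 970: ~29% cheaper, and slightly faster
print(f"290 vs. 970: {cheaper_pct(250, 350):.0f}% cheaper")
```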
     
  2. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    NVIDIA destroys AMD in the world of notebooks though. So it's not all bad news for the greens.
     
  3. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Sounds like a job for my beastly 5830M! (Powered by Ubi-Awesome GDDR3™) ;) :cool:
     
  4. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Frame times were extremely bad a good while ago. AMD had it especially bad, but they have since shaped up. Microstutter doesn't really come in around 45fps; it's much closer to, say, 35 or below. At 30 or below it becomes extremely apparent and almost entirely unplayable for most SLI setups. The problem is that with very unoptimized games like CoD: Ghosts and Watch Dogs, the frame times get so bad that they end up giving "microstutter" effects to even single-GPU setups, and people can't really figure out why.

    Sad as it is to say, it's not always the drivers (and I know a lot of times it is); sometimes the game's coding is just that awful. Also, devs almost never test things with multi-GPU setups. Then compare and contrast with how much they care about the PC platform, and you get a scenario where a game like Titanfall took something like 4-5 months to get a single SLI profile, and it's STILL broken too. Please note, TF is on the Source engine, an engine which has supported pretty much everything you could toss at it since time immemorial.

    Now you might have a much better idea as to why exactly I hate the whole AAA game market so much right now. Hell, even when I dropped to 38fps at one point testing SLI in Evolve's alpha, it never felt stuttery in the slightest; it was basically like running on a single GPU. That game hasn't even officially hit beta yet, let alone release. CoD: AW? HA. Dark Souls 2? Randomly drops to 45fps in some specific parts of the world while GPU utilization remains low (it's not all that common though), and feels like playing at 20fps the minute it goes below 50 anywhere with SLI on. BF4 will happily drop me to ~90fps from my target 125fps cap with my GPU util sitting at 70% and my CPU util barely cracking 60%, but TotalBiscuit's Dragon Age review had his 980s sitting at 90-95% almost constantly. Same Frostbite 3 engine, different mindset of publisher when releasing (framerates aside). Know how I have to get 99% constant scaling on my 780Ms in BF4? Turn the game to ultra and THEN bump resolution scale.

    Let's not even talk about Watch Dogs, which took months to get into an acceptable (not good, not great) state for most PC users, and then BROKE MULTI-GPU SUPPORT WITH A NEW PATCH ALONGSIDE DLC. AC: Unity is riddled with bugs, and there's not a system on this planet with fewer than 3 GPUs that'll never dip below 60fps at 1080p in it. Far Cry 4? LOL. The game stutters like mad driving around, as if it's running into a vRAM bottleneck or something... except it doesn't max out a 4GB vRAM buffer. AND MORE TO THE POINT: PS4 RUNNING FAR CRY 4 LOOKS ALMOST AS GOOD AS PC ON MAX GRAPHICS, BUT DOES SO AT A SOLID FRAMERATE WITH NO HITCHING AND FAR FEWER BUGS. ON BASICALLY AN ENTRY-LEVEL CORE 2 QUAD WITH A GTX 570. I mean really. REALLY?

    No, I'm not blaming manufacturers right now. I don't get lack of smoothness in most older games; I get great performance with low vRAM usage in most older games with similar or better graphical fidelity to current titles. I get almost no stuttering and no problems at low framerates even in unoptimized titles like Arma 2 and Arma 3, where when I drop to ~30 the game still feels like a solid 30fps experience, and I have to go down to 25 or so before I start to have problems that make me run the game on a single GPU.

    So yeah... microstutter and bad frame times are a huge problem, but drivers aren't really to blame right now. If drivers were the main culprit, they should be broken in most titles, which they're not. They're only broken in NEW titles, and from the same kind of devs too. Hell, my SSF4 AE and then USF4 have gotten random stuttery spells since I got this bloody PC. I thought it was an SLI problem, so I forced the game to one card. Nope, still does it. Most other people don't have the problem either; the game just doesn't appear to like my system (or people aren't as sensitive to it as I am). SF4 doesn't do it though. Never did. So go figure.
     
  5. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    CODAW for me at least starts getting stuttery in the 40fps range.
     
  6. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah same here. I swear somewhere between 40 and 50 FPS my setup hits a stuttering threshold, below which I really have to either turn down settings or simply not give a damn to keep playing. Happens both on my laptop and desktop.

    What really grinds my gears is that with the desktop Maxwell cards, I get the same stuttering effect whether in SLI or single GPU. I don't disagree that bad coding plays a big role, but in this case I very strongly suspect it's due to the dynamic throttling that occurs with Maxwell in order to make it efficient and keep it cool. Well, guess what, I DON'T GIVE A RAT'S BUTT if my card's load temp is 55C when I'm losing performance and stuttering.

    Yeah, so nVidia wasn't kidding when they said Maxwell was designed "mobile first". What they didn't mention in the fine print is that features that benefit mobile users but are detrimental to desktop users are also included, such as this dynamic throttling nonsense, which might make sense for laptops because of improved efficiency and reduced heat but does jack all for desktops.
     
    D2 Ultima likes this.
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah, adjusting clockspeeds can cause a kind of downstep. Here's what you could do: go into NVIDIA Control Panel --> Manage 3D Settings, find the games the problem is occurring in, then select and apply the option in the pic below. Or try it globally, but I dunno if your cards will ever downclock then (feel free to test for me!)

    [IMG]
     
  8. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    That's why I target 50-60 as the best playable range. Though constantly staring at the FPS counter isn't much fun; you should really just watch for stutter, note when it occurs, and see what the FPS was at the time.
     
  9. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231


    Trust me, that's the first thing I tried and it didn't make any difference, except causing my master card to not downclock and run at full boost even while idling on my desktop LOL. These drivers are a joke.

    But hey at least I don't have the dreaded DisplayPort issue people seem to be having with anything newer than 344.11. That's actually the major reason why I didn't bother updating beyond 344.11, since I am running off of DP.
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Even if you set 'prefer maximum performance' globally, it still downclocks like normal on the desktop. It's only when you launch a 3D app that it applies.
     
  11. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Apparently not anymore XD.
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I guess his cards and/or the drivers are borked.
     
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Something is borked that's for sure. Maybe it's my brain or my eyes...
     
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Anyway, little laugh aside... so what you're saying, n=1, is that setting prefer maximum performance causes downclocks in games, keeps stuttering, AND leaves the GPU at max performance on the desktop?

    So... you know I know you hate having to fix basic functionality on your own and stuff, but seriously? You might want to use that modded vBIOS right about now. I don't think I could live with something I know I could fix.
     
  15. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I've been doing some reading on modding my own vBIOS and it's tempting. The problem is that in order to fix the weird voltage discrepancy bug in SLI, I pretty much have to pre-overclock the cards, if that makes sense, so a bit more work still needs to be done.
     
  16. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I thought that forcing the cards to disable boost functionality (like we have for our 780Ms) would let you use nVidia Inspector to OC them properly enough?
     
  17. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The issue goes much deeper than that...

    So the root problem is that for whatever reason, the 344.xx drivers fail (or refuse, god knows) to recognize that different ASIC qualities in these Maxwell cards may lead to different VDDCs when running boost. For example, my master 970 has an ASIC of 74.3% and can do 1367 boost on 1.156V. The slave card has an ASIC of 66.1% and needs 1.206V for 1367 boost. So even out of the box AT STOCK there's a 50mV difference between the 2 cards.

    So any attempt at overclocking without manually compensating for the voltage results in constant crashes and hard locks, because apparently the master card is forced to run the full boost OC but never overvolts for whatever reason, and as you can imagine, trying to run 1500+ boost on 1.156V just isn't gonna fly. The workaround is to actually let the master GPU (higher ASIC, lower VDDC at stock) lag behind the slave GPU by an arbitrary amount. This amount needs to be determined by testing, because I've seen some really weird stuff happen:

    So to fundamentally fix my issue I would actually need to comb through the boost and voltage tables and adjust the entries manually, which as you can imagine is quite involved.

    Oh and just for kicks, can you spot what's wrong with these Firestrike results? I'll give you a hint: one set of cards had nearly matching ASIC (64.1% vs 66.3%), while the other set didn't. Care to guess which is which?

    After very quick test it seems the master GPU (lower voltage card) is bogging down the entire SLI setup, including itself ROFL

    With SLI disabled and +120 core/+300 mem, boost increased to 1524MHz, while voltage also jumped to 1.225V while running Firestrike. Now keep in mind the master GPU is 20MHz behind the slave (higher volt card). At +140 core/+300 mem, boost further increased to 1545MHz and voltage stayed put at 1.225V.

    So what this means is that if everything worked properly, I should see a boost of 1524MHz in Firestrike. But I don't; instead I see 1506MHz, and the reason is that the master GPU is downvolting to 1.200V instead of 1.225V, so it simply does not have enough juice to eke out another boost bin or two. Even with the exact same OC, the cards that had comparable ASICs and voltages worked much better than the other set. Clearly the "mismatched set" is losing about 40MHz of core clock even though on paper everything appears to be the same, which would explain the 3% loss in Firestrike score.

    Yeah it's a cluster____
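The boost-bin arithmetic in the post can be sketched as a rough sanity check. The ~13 MHz bin size is an assumption on my part (a commonly cited figure for GPU Boost on these cards, not something stated in the post); the clocks are the ones reported above:

```python
# Rough sanity check of the clocks reported above.
# Assumption (not from the post): boost clocks step in ~13 MHz bins,
# so one lost voltage step costs a bin or two.
BIN_MHZ = 13

expected_boost = 1524  # MHz, single-GPU behaviour at +120 core
observed_boost = 1506  # MHz, what the SLI pair actually holds

deficit = expected_boost - observed_boost      # 18 MHz
bins_lost = round(deficit / BIN_MHZ)           # about 1 bin
print(f"{deficit} MHz deficit, roughly {bins_lost} boost bin(s)")

# The ~40 MHz loss attributed to the mismatched pair is on the
# order of the ~3% Firestrike score difference mentioned.
print(f"{40 / expected_boost:.1%} of the expected clock")  # ~2.6%
```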
     
    D2 Ultima likes this.
  18. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    And maybe now you understand why I'm reluctant to do any more fixing. Seriously, the first couple of days after I got my cards back from RMA I had to deal with this BS, and if it weren't for someone pointing out the workaround on the GeForce forums I'd never have figured it out. And just to top it off, I can't push my cards too hard either, because once I start overvolting in Afterburner all bets are off, so I'm basically limited to what the cards can do on 1.2V.
     
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    So basically... drivers could fix this by treating each card on its own, like we have with Kepler. But nVidia has said this is intended, meaning everybody will want to go AMD for the next few years unless they fix this, which they show no signs of wanting to do (because price be damned, their cards are cool, quiet and strong and everybody is buying them like hotcakes, right?).

    And boost is the core problem again too, right? But you can't disable boost, because then when it hits 2D clocks the card'll still be at something like 1.2V, wasting power like crazy. #GGnVidia

    And of course... the big thing... this doesn't happen at reference/stock for most of these cards, so they'll simply say "well not a problem".
     
  20. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I honestly don't know what nVidia screwed up. If it's boost, why didn't the Kepler cards have any issues with older (pre-344) drivers? Unless the driver has the ability to modify the boost table, but I thought that was hard-coded into the vBIOS.

    I just hope the 390X comes with more than 4GB of vRAM, and that AMD continues to improve on its frame pacing issues. They've definitely made a step in the right direction with Catalyst 13.8, but they still have a lot of catching up to do. 4GB of vRAM and frame pacing would still be 2 major reasons I might not go for the 390X in the end. And before you say it, no, I can't go back to a single-GPU setup until something with the power of 2x GM200/390X is released. But by then we'll probably need 2 of those cards to run 60 FPS at 1080p, given the optimization these days... (yes, it's the epitome of hyperbole, deal with it :))
     
    D2 Ultima likes this.
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    There have been several frame pacing drivers since Cat 13.8 plus AMD has XDMA. Honestly CrossFire is looking better than SLI right now.
     
  22. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I said it years ago and I'll say it again, though with a different twist. AMD needs to bring out new things, not just fuller architectures or bigger GPUs or whatever. And they need a high-end CPU lineup; the APUs are not something anyone buys beyond midrange. I'll say it straight: I don't like AMD. I have not liked AMD for quite a while, after bad AMD boards and various other issues with their cards on friends' systems, and their drivers, and a whole set of crap. But you know what? There are people out there who don't give a crap. And they will buy, and keep buying. And the competition is healthy. I have a friend who'd rather buy an FX-8350 and OC it within an inch of its life to 4.8-5GHz on a watercooling loop, and do the same for his Tri-X R9 290, than touch a system with Intel in it, because "Intel's too expensive" (never mind that his watercooling loop, more expensive mobo, larger PSU, etc. could have evened the price out with a base Intel system and a $30 air cooler setup). Same with his hatred of nVidia: he doesn't wanna touch any of their GPUs. And there are tons of people like him.

    I want AMD to have a healthy selection so that both companies bring out the best in each other. Honestly, for all the problems you're having with Maxwell, there are just as many people who enjoy the easy driver updates and all the stuff you can do in NCP, who are running a 970 on a 500W PSU with an i5-4690K at stock, and who are beyond happy with their machines.

    So I want AMD to bring out something good without it being a blast furnace, and I want nVidia to stop being complacent like they currently are, and I want everybody to have good competition. This is the first time in as long as I can remember that AMD has NOT brought out a new lineup before nVidia's. Usually AMD's new line is what kicks nVidia into some sort of gear. I can't imagine the state Fermi would have been in if the 5000 and 6000 series from ATI/AMD had been a flop.

    For my sake as an nVidia user, I want AMD to smoke them. And for the AMD users out there, I want AMD to improve. I can't say I'd move to AMD (especially not dual-GPU, as long as I need fullscreen for CrossFire to work), but seriously, I might just have no choice at some point.
     
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Hyperbole? It's already reality.

    [IMG]
     
  24. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I'm too <del>lazy</del> tired to dig for reviews; do you know of any that compare frame pacing in newer drivers vs 13.8, or even better vs nVidia? Yeah, that 512-bit bus really shines at 1440p and above. I remember reading on multiple sites that several recent releases ran much better with 290X XFire than 980 SLI at 4K, or even 1440p sometimes.

    As Mr. Fox would say, that's because they don't know any better. ;)
     
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I still think it's the architecture being built for higher resolutions, rather than the memory bus. The 780 Ti and Titan Black have better memory bandwidth, but the 970/980, with their worse memory bandwidth, still do better at higher resolutions too. The R9 290 & 290X cards do benefit from being designed much later than the 7000 series.
     
  26. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
  27. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Frame pacing phase 2 fixes were deployed with Cat 14.1. I don't know of any benchmarks directly comparing 14.1 vs. 13.8.

    PCPer has FCAT numbers of 290X CrossFire vs. 980 SLI:

    NVIDIA GeForce GTX 980 and GTX 970 GM204 Review: Power and Efficiency | PC Perspective

    As I see it, CrossFire is basically even with SLI in frame consistency now, except in the lone remaining problem child, which is DX9.

    EDIT:

    I guess if you really wanted to compare newer vs. older Cats in terms of frame pacing, you can take a look at the 290X CrossFire numbers from launch. These were obtained using 13.11, while the ones above were using 14.7.

    Frame Rating: AMD Radeon R9 290X CrossFire and 4K Preview Testing | PC Perspective

    And here's a possible ace up AMD's sleeve. Frame time variance (or lack thereof) in CrossFire Mantle is simply obscene. It's better than single GPU DX11 on their high-end (3960X) test platform. A multi-GPU setup having less microstutter than a single GPU setup?! :eek:

    Frame Rating: Battlefield 4 Mantle CrossFire Early Performance with FCAT | PC Perspective
     
  28. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The game is broken for both AMD and nVidia users. Being a GameWorks title means at least there's an SLI profile while the XFire profile is disabled, but the SLI profile has a game-breaking shadow bug, so you're not going to be playing with SLI on anyway. That, and at least the 970/980 don't seem to have the constant stuttering bug with a single card.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Really? TotalBiscuit had huge game-breaking stutter in his video using 980 SLI.

     
    Last edited by a moderator: May 12, 2015
  30. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    A single card appears to work OK; SLI is broken, so you may as well not use it. This is what they had to say:

    And there go my hopes for a Thanksgiving gaming marathon. Screw this, maybe I should play this game instead.

    Also inb4 some random dufus goes "use single GPU hurr durr" yeah well that's kind of not the point
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That is the game I will be playing after I'm done stuffing my face.
     
  32. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I can't enjoy this game on my 780M. I just can't. Too stuttery, and performance is too low. It only cost me £4.49 to preorder, so I don't feel like I've lost out.

    I'm gonna try out Dragon Age Inquisition using the refund money I've just received from Amazon for an 8GB Crucial RAM module I never needed to use in my Gigabyte and returned. Hopefully that will tide me over until I get a 980M again to enjoy FC4 at its full potential.
     
  33. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    I have a 780m and dragon age inquisition runs nicely so far.
     
    Cakefish and Getawayfrommelucas like this.
  34. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    At ultra? :D
     
  35. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    Yeah - I haven't really given this game a shot. I've been too busy playing MyNBA2K15 mobile + Dragon Age 3. But I did see that they pushed out a patch that disabled some settings in the game to increase playability. Can't confirm it, though.

    Anyone with a 680 have anything positive to say?
     
  36. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    Ha! Keep dreaming. Here's my video:

    http://youtu.be/q8u2wzKx91M

    My settings are in the video, though I have overclocked my 780m.
     
    Cakefish likes this.
  37. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Oh that's alright. I don't mind playing 3rd person games at 30FPS. It's only 1st person games where I really can't stand it - like Far Cry 4. Judging by your settings and framerate there, I should be able to squeeze out mostly high settings for a 35-40FPS experience. Which will be good enough for me until the glorious triumphant return of the 980M into my life :)

    Anyways, enough off-topic DA:I talk from me, if I have any other questions I'll take them over to the relevant thread.
     
  38. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    Just set your GPU to adaptive vsync (half refresh rate) for the game and you should be able to crank your settings.
     
    Cakefish likes this.
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Funnily enough, a 780M is better than the PS4, which runs it at 1080p ultra (minus, I think, a fog setting) without problems. #Ubitimization. Anyway, you could probably try playing it on high or medium and OCing your 780M to 950/6000 to get it closer to a 680, and see if it works a bit better. But I completely understand your plight XD.
     
    Cakefish likes this.
  40. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    I'm running my 780M at 1019/6000, which is effectively the base clock of a GTX 680, and the game still stutters like crazy. I have my settings on a mix of medium and high.
     
  41. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Awww, my 780M is not as good an overclocker as that on the core (on the memory it can go +620MHz without crashing in games, but I never do, because the trade-off in temps is never justified by the minimal performance increase). I use 950/6000 in games. In FC4 it needs to go down to medium/high to feel smooth, which, after having used a 980M at ultra in the game, feels too low for me. The 980M definitely feels much more fluid. It'll give me something to play on my new laptop over the Christmas holidays :)
     
  42. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Any updates on this? I read that patch 2 only fixes HairWorks, whatever that is. No mention of stuttering, framerate hitching, etc.... The developers must know about it!
     
  43. krizzjaa

    krizzjaa Notebook Guru

    Reputations:
    7
    Messages:
    64
    Likes Received:
    15
    Trophy Points:
    16
  44. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
  45. 1nstance

    1nstance Notebook Evangelist

    Reputations:
    517
    Messages:
    633
    Likes Received:
    221
    Trophy Points:
    56
    Not even 980 SLI + a 5930K can play Inquisition at 1080p 60. Good luck with a 980M.
     
  46. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I tried GPU max buffered frames set to 3 and now I can play practically stutter-free at the very high preset, with some other features set to ultra. Only the trees/vegetation seem to flash a lot when moving, especially when driving.

    I guess a few patches later we can call this playable...
     
  47. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    How do you guys like the god rays in this game? In Black Flag they weren't too impressive, but in this game they really make a difference. I tried mixing settings today and they can reduce performance by up to 20% in some scenes, but I guess it's worth dropping HBAO to SSBC to have god rays on. I hope nVidia releases some great drivers before the end of the year that improve performance, because the last Game Ready drivers didn't do anything. The game is still unplayable because of stuttering while driving and framerate drops, but I hope they fix it.
     
  48. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Have people noticed the game runs MUCH better now since the patch! I am running a mixture of very high and ultra settings at 1080p with my mildly OC'd 680M and things are smooth; I'm still slowly increasing the settings. The stuttering has stopped and the foliage no longer flashes when I move about! Also, the cutscenes now play clearly; before, they were completely messed up!

    Time to play.... If you have a 780M or above you should be able to max this game at 1080p no problem.

    Edit: make that max settings minus antialiasing, still smooth and playable at 930/1100MHz! Not a great framerate at these high settings, though!
     
  49. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    When was the patch? I haven't touched this since I started Dragon Age last week. The stuttering was horrendous.

    Also, define 'smooth FPS'. Are we talking about 30? Because I was struggling to maintain a steady 60 in certain areas (the villages/outposts; it seems to be NPC-related, but I can't imagine I'm CPU-limited).
     
  50. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Patch released today. The difference for me is night and day. If I jam all settings to max, the game won't run at a very high framerate, due to only having a 680M, but it does run fluidly. It seems many of the problems (stuttering, textures, etc.) have been either fixed or greatly improved, despite there being no mention of these things in the patch notes.

    I did also update my drivers, but I can't imagine that would have improved things so much! Anyway, I will set the 680M to 1GHz and play away at ultra at about a solid 30fps, and more or less enjoy it now.

    Despite being the same engine as FC3, FC4 definitely has much more detail in-game. I averaged 1800-1900MB of vRAM over a 2-hour gaming session, with vRAM hitting up to 2030MB max according to GPU-Z. It seems if you keep settings at max minus anti-aliasing, and also keep the FOV slider at normal at 1080p, you can just about keep vRAM usage under the max on 2GB cards.
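The headroom described there is razor-thin; a quick calculation (using the GPU-Z readings quoted in the post, and a nominal 2048MB for a 2GB card) shows how close it runs to the limit:

```python
# How close the reported vRAM usage gets to a 2GB card's limit.
# Readings (in MB) are the GPU-Z figures quoted in the post.
capacity_mb = 2048               # nominal 2GB frame buffer
avg_low, avg_high = 1800, 1900   # typical range over a 2-hour session
peak_mb = 2030                   # maximum observed

print(f"typical usage: {(avg_low + avg_high) / 2:.0f} MB")
print(f"peak headroom: {capacity_mb - peak_mb} MB")      # only 18 MB left
print(f"peak utilization: {peak_mb / capacity_mb:.1%}")  # ~99.1%
```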
     
    D2 Ultima likes this.