One small nitpick: you spelled "Unbiased" wrong in the thumbnail.
Great game review, Ultima; also, I love your accent lol.
Also, 4GB vRAM usage! Jeez!
-
And about the VRAM usage, it'll use as much as you throw at it due to caching. People with 2GB cards are running it without issue. -
As for the 4GB vRAM usage, I think I said at one point that it just uses all the vRAM you toss at it; it even uses 8GB of vRAM on a 980M. No idea how much vRAM it actually needs, and the caching options are still broken even with 8GB of vRAM. -
It just "reserves" it, or actually stores textures in it?
-
I will point out, though, that running ANYTHING ELSE that took up vRAM alongside Ghosts (this includes watching a stream in Chrome or livestreaming the game) would, for the first few weeks, cause MASSIVE performance tanking and stuttering issues... mainly what most streamers likely complained about.
There's no way I can downgrade patches to check anymore, and I'm not installing that game again for any reason. All I can say is that AW's MP, most of the time, feels like it has just entered beta after all the content was put in (the criterion for exiting alpha). -
Did you try disabling SLI? It can cause stutter sometimes. Not that it's a fix, more of a troubleshooting step.
Not surprised to hear it's a bit buggy; I suppose it's nice to hear it's using all it can get, even if not to good use lol -
Using Mr. Fox's settings from his video, with the exception that I run at 1080p, I see my frame rates go from 45-85 FPS. Using ErikO's settings my rates stay at 100+ FPS. The only OC I have done is to my 7970M (910/1250). Got to put my 3940XM back in and run the beast at 4.4GHz.
-
Is there any visible difference between the different settings for the number of dynamic lights in a scene?
-
Attached Files:
-
13GB of system RAM used? And practically all of your vRAM? What a resource-hungry game...
-
Forcing AFR2 is still better than using the GPU driver's defaults. -
-
HWiNFO64 is pushing selected sensor data through RTSS.
-
Gents - just a quick add:
Both the driver and the game (twice) have been updated, and it runs so well now with increased settings that I'm even using 4x supersampling too.
I'll run some benchies soon, but I'm playing around with the GTX 980 before I have to hand it back.
Cheers. -
Glad things are working for you now, Brother ErikO.
I just finished the single-player campaign on this game this evening, and all I can say is that it was absolutely amazing. It ranks right up there with BF4 and Crysis 3 on the awesomeness scale. It's nice to have such an excellent title that can totally exploit every bit of performance a machine can muster to make the experience even better. It scales really well, took advantage of both GPUs, used all of the available vRAM and about 13GB of my 32GB of system memory, and ran as smooth as silk 99% of the time. Loved every minute of it. I wish every game was as great as this one. -
Guy, you just turn up the supersampling to 8x at 1920x1080 and that's equal to 7680x4320 resolution, hence the low FPS you got. Poor 7970M =D -
Check your math, 7680x4320 (8K resolution) is 16x (4x4) of 1080p. AW goes up to 16x SSAA max, no?
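Just to spell the arithmetic out (nothing game-specific here, plain pixel counting):

```python
# 8K vs 1080p pixel counts
pixels_8k = 7680 * 4320        # 33,177,600
pixels_1080p = 1920 * 1080     # 2,073,600
print(pixels_8k / pixels_1080p)   # 16.0 -> 4x the width times 4x the height
```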
-
Notwithstanding the mathematical errors and looking at the bottom line... way too high to be useful, LOL. That would play like an Ultrabook slide show even with the best desktop hardware.
-
I think if AW runs @ 1080p60 on a 640-core 750 Ti/860M, it can do 8K60 on an 8192-core 980 quad-SLI setup with some OC, no?
-
Quad 980 SLI yessir... but, go with a popular gamer-boy mini-ITX and one GTX 980... break out the knitting needles, pull up a rocking chair, and enjoy the majestic splendor of an 8K slide show.
Kind of like using a serrated steak knife to cut a 2x4 stick of lumber. You can, given enough time, but why would you even try? LOL. -
Right up your alley I'm sure, Mr. Fox. /s
BTW, I take back what I said earlier. You don't even need to OC. A single stock 980 already does 60 FPS @ 4K. So with perfect SLI scaling @ 8K (4x 4K)--which is pretty likely given how GPU-bound 8K is--60 FPS is not only possible but likely on stock 980 quad-SLI :thumbsup:. You might need the 8GB cards though.
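Rough napkin math behind that guess, assuming the 60 FPS @ 4K figure above and literally perfect 4-way scaling (which is optimistic):

```python
# 8K has exactly 4x the pixels of 4K, so four GPUs scaling perfectly
# would in theory land back at roughly the single-card 4K frame rate.
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
single_980_fps_at_4k = 60          # claimed figure for one stock 980
gpu_count = 4
estimated_8k_fps = single_980_fps_at_4k * gpu_count * (pixels_4k / pixels_8k)
print(estimated_8k_fps)            # 60.0 -- only with perfect AFR scaling
```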
(Above numbers were taken before the SLI profile was added to the driver, obviously.) -
Don't even bother; it won't even scale well at 16x supersampling haha. Util goes straight down; technically I should be able to get 25fps with 99% scaling here, but I get about 15fps on average. The util spikes to 70% or so on the main or slave card every now and then, but never at the same time, and averages 55% for both most of the time. Yes, I was bored. Yes, I had to force-crash my game twice to apply supersampling settings. Also, it's interesting to note the abysmal CPU usage too. When I put BF4 and Tomb Raider etc. above 1080p with supersampling, my CPU usage went down but my GPU utils remained at 99% constantly, so this was an unexpected experience.
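Spelling out the numbers from that run (same figures as above, just the arithmetic made explicit):

```python
# "Technically I should be able to get 25 FPS with 99% scaling, but I get about 15 FPS"
expected_fps = 25.0
claimed_scaling = 0.99                                  # 2nd GPU adds 99% of a card's worth
single_gpu_fps = expected_fps / (1 + claimed_scaling)   # ~12.6 FPS implied for one GPU
observed_fps = 15.0
effective_gain = observed_fps / single_gpu_fps - 1      # ~0.19 -> the 2nd GPU only adds ~19%
print(round(single_gpu_fps, 1), f"{effective_gain:.0%}")  # 12.6 19%
```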
-
Yeah, AW needs to be patched; it's not AFR-friendly ATM. Some effects only run on a single GPU, so perfect scaling isn't yet possible without breaking visuals. The game seems to run great on a single GPU though, and it looks better than AC Unity (and runs waayyy faster ofc).
You may try SLI compatibility bits 0x080040F1 with in-game motion blur disabled and see if it helps.
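For reference, here's how that value breaks down into individual bit positions (pure arithmetic on the hex value; as far as I know, what each bit actually controls isn't publicly documented):

```python
# Which bit positions are set in the custom SLI compatibility bits 0x080040F1?
sli_bits = 0x080040F1
set_positions = [i for i in range(32) if sli_bits & (1 << i)]
print(set_positions)   # [0, 4, 5, 6, 7, 14, 27]
```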
EDIT: Make sure the SLI rendering mode is on default (force AFR1, I think?) when trying the above custom SLI bits. -
I don't know why, but I really like seeing pictures of large amounts of system resources being used; it makes me feel like the hardware is really being put to work, even if not efficiently lol.
-
Yeah, it hits 90fps easy and stuff, but I still get frame drops without hitting 99% constant. Or even 80% constant. But I hit it easily in Black Ops 2. Hell, BO2 is the most optimized CoD game I've ever played, hands down. I'm fed up with people comparing AW to Ghosts and going "well, it's a huge improvement!", but Ghosts was made by Neversoft until IW stepped in to take over, rushed on a 2-year (at best) cycle, pushed out on SIX platforms (even though Wii U was done by Treyarch as far as I know), and in general expected to come out as such.
And you know what? It STILL looked pretty good in some of the maps (which I assume separate studios worked on; some did the good maps and the rest were probably done by less-qualified studios) and it had the best netcode I've ever seen in an online game, ever, though they screwed up the time to kill.
So yeah... not hard to surpass Ghosts, but AW is far below the other games in the series. It's below MW2 and BO1 (despite its launch running issues, which were fixed quickly), FAR below BO2, and hell, even below MW3, which, while an awfully coded game, still ran fairly well, didn't have problems with connections, and had the ability to mute people in games and all sorts of other stuff you could want. It's Sledgehammer's first try and I cut them slack for it, but people are overselling it badly. -
-
@D2 Ultima: How were the custom SLI bits I suggested?
-
MW3 was the laggiest CoD game I ever played; a truly awful experience for me. I was playing alongside a friend, and he had the same problem.
Network-wise anyway; FPS-wise it was okay.
We'll see how AW plays when I have a chance to play it. -
-
That's because it's a custom profile, which doesn't already exist in the driver. You have to enter it manually.
-
Is AW a DX8/9 or OGL game?
-
So, it seems to be mostly okay... not great, not bad. No stutter today so far. It still runs like butts (even worse) at 16x SSAA though haha -
AFR1 and motion blur disabled?
The only bit it changes over the official SLI profile, #14, is supposed to fix scaling at the cost of breaking motion blur. You might also try it along with the undefined Crysis 3 cache bits I showed you earlier for Evolve Alpha, which improve the pauses you get in SLI. -
-
Advanced Warfare doesn't have a motion blur setting?
-
Nope. The options menu still looks exactly like it does here in my video.
Also, what AMD 7900-series card has 2.6GB of vRAM? -
Oh, I see. Motion blur is only in single player, so I guess you don't have to worry about it if you're testing in MP.
It's probably a reporting error; the 7950 and 7970 both have 3GB. -
-
Really, I wish it were permanently disabled in every game. I don't know why motion blur exists, because it sucks. Every game I have played with motion blur enabled looks and plays better with it disabled.
-
I disagree; I think the choice should always be available. You hate lack of choice in the PC hardware space like the devil (yeah, I see your rants against BGA), so why shouldn't graphics options be the same way?
IMO granular motion blur (both camera and object) should be mandatory in every first-person game. If done right, it can increase the perception of visual smoothness at lower frame rates and actually fight nausea/simulator sickness for some people. But the effect is so heavy-handed in most games that it usually does the exact opposite, making people more dizzy. And I find good motion blur helps the immersion and cinematic qualities of single player games. Source Engine's default motion blur is probably the best I've ever seen, I refuse to play Half-Life 2 without it. But ofc I always turn off motion blur in multiplayer games for competitive reasons.
As an aside, I have noticed that more and more games nowadays are using motion vector-based frame blending in an effort to increase the cinematic qualities of their motion blur (this is the same kind of motion blur used in CGI films). Unfortunately, this kind of motion blur is not AFR friendly, thus breaking SLI, since each GPU in an SLI configuration cannot access frames rendered previously on another GPU to do frame blending. This is the same reason why temporal antialiasing is also not supported in SLI. Take SMAA T2x as an example. It is a combination of regular SMAA 1x and 2x temporal SSAA--basically blending each current frame with the previous one to get 2x supersampling with almost zero cost. Exciting for sure, as no other current AA method beats its image quality at the same level of performance, it's just not supported on SLI.
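A minimal sketch of why that breaks AFR, assuming plain 2-way AFR and a simple previous-frame blend (a toy model, not how any particular engine actually does it):

```python
# Toy model: in 2-way AFR, frames alternate between GPUs, but a temporal
# effect (motion blur frame blending, SMAA T2x, etc.) needs the previous
# frame's output, which was rendered on the *other* GPU.
NUM_GPUS = 2

def gpu_for_frame(n: int) -> int:
    return n % NUM_GPUS  # AFR: GPU 0 renders frame 0, GPU 1 renders frame 1, ...

for frame in range(2, 6):
    current_gpu = gpu_for_frame(frame)
    previous_gpu = gpu_for_frame(frame - 1)
    needs_transfer = current_gpu != previous_gpu
    print(f"frame {frame}: GPU {current_gpu} blends with frame {frame - 1} "
          f"from GPU {previous_gpu} -> cross-GPU copy needed: {needs_transfer}")
```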
If I had to venture a guess at why Advanced Warfare's motion blur goes nuts when trying to use bit #14 to fix SLI scaling, the above is probably why. -
OK, you win.
Losing options is bad.
I'll just keep disabling it. It generally does do the opposite for me... I hate it. -
The higher your FPS, the worse motion blur looks. If you're getting 35-45 FPS and your game is "playable" because of older hardware (like my 280M), then motion blur is a nice thing indeed. It does boost smoothness. But the minute you get a system like mine or Mr. Fox's, or hell, a 980M SLI system? Especially with a high refresh rate? It's just bad. At 50+ FPS motion blur starts to lose its charm and look more unnatural. I can say that at 120Hz in FPS games, motion blur entirely kills the benefit of the high FPS/refresh when chasing enemies at close range, because the blur causes them to vanish. At 60 FPS it won't matter as much, because they'll simply appear to vanish into thin air with or without motion blur, but if you're spinning around trying to lock onto enemies at medium range or whatever, you can miss them with motion blur.
So basically it's a personal choice. But yes, choice is always the best answer =D. -
Yep, motion blur can somewhat counteract a high refresh rate. You definitely want to turn off any screen-space blurring for competitive multiplayer in a twitch shooter. But I find Half-Life 2's motion blur subtle enough to be usable at 100Hz, which is what my panel is overclocked to.
-
Gents, just to add to this performance discussion, I updated to the newest beta driver, as it supported Far Cry 4, and all is back to hell again with AW.
Again, performance is better with Crossfire disabled.
I give up. I'll come back to this game in a couple of months when they've ironed the issues out. Sorry I bought it on day one now.
Oh, and Far Cry 4 looks amazing.