Just benchmarked Middle-earth: Shadow of Mordor
CrossFire AMD R9 290s (Tri-X and Gaming editions) at 1100 MHz core / 1375 MHz memory, +87 mV
Can't seem to get a frame grab for some reason. Anyway:
Max FPS: 120
Min FPS: 35
Average FPS: 90
Basically, the game holds a solid 90-100 FPS all around; the dips, I assume, were just frame-time variance from CrossFire. Extremely smooth gameplay, and CrossFire seems to work properly.
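If anyone wants to check frame pacing themselves, here's a rough sketch of how I'd crunch a FRAPS-style frametimes log. The file name and the two-column cumulative-milliseconds format are assumptions, so adjust the parsing to whatever your log actually contains:

```python
import csv
import statistics

def load_frame_times(path):
    """Return per-frame durations in ms from cumulative timestamps."""
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            stamps.append(float(row[1]))
    return [b - a for a, b in zip(stamps, stamps[1:])]

durations = load_frame_times("frametimes.csv")  # hypothetical file name
fps = [1000.0 / d for d in durations if d > 0]

print(f"Avg FPS: {len(durations) / (sum(durations) / 1000.0):.1f}")
print(f"Min FPS: {min(fps):.1f}  Max FPS: {max(fps):.1f}")
# A high stdev here means microstutter, even when the average looks great.
print(f"Frame-time stdev: {statistics.stdev(durations):.2f} ms")
```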
Anyway, the GPU-Z shot; notice the VRAM usage.
After about 2 hours of gameplay, VRM temps on the Tri-X sit at 80°C, the core sits at 75°C, fan speed is at 65%, and power draw spikes to 300 W during fights with 30+ dudes.
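If you use GPU-Z's "Log to file" option, a quick script like this can pull the peaks out afterwards. The column names below are guesses (they vary by card and GPU-Z version), so check your own log's header line first:

```python
import csv

# Assumed column names; replace with whatever your log's header shows.
WANTED = ("GPU Temperature [°C]", "VRM Temperature [°C]", "Fan Speed (%)")

with open("GPU-Z Sensor Log.txt", encoding="utf-8") as f:
    rows = list(csv.reader(f))

header = [name.strip() for name in rows[0]]
peaks = {}
for row in rows[1:]:
    for name, cell in zip(header, row):
        cell = cell.strip()
        if name in WANTED and cell:
            peaks[name] = max(peaks.get(name, float("-inf")), float(cell))

for name, peak in peaks.items():
    print(f"{name}: peak {peak}")
```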
Screenshots, more pictures, and GTX 860M benchmarks to come.
-
What the heck is that??? 8GB of VRAM usage??? That's crazy.
-
Getawayfrommelucas Notebook Evangelist
So it's well optimized (minus textures)? A breath of fresh air coming off of Dead Rising 3.
-
Gameplay? I'd say trolls are totally OP; you can kill just about any tier 1-3 Captain with no effort by aggroing him towards a troll. It's also extremely handy if you're getting chased by 50+ dudes: just run towards one of the many roaming trolls, they'll all die, and you'll still get the XP. -
It doesn't have AA applied?
-
Uhh... as far as I know, multi-GPU setups mirror the vRAM contents across cards. Does CrossFireX not do this? If it does, and GPU-Z is just summing the mirrored copies, it's only using 4GB per card.
Also, have you found and downloaded the "Ultra" texture pack? Otherwise the Ultra setting doesn't do much differently than High, I believe. -
At the moment, I don't think anybody has the high-res texture pack. If they do, I haven't been able to locate it. -
Getawayfrommelucas Notebook Evangelist
OOC: how moddable is this game? 6GB+ for textures is pretty crazy, especially once you actually see the game. Makes me think we'll have some texture gurus optimizing textures if the requirements are that far out there.
-
Hmm. So we need to wait then, I guess. Welp, looks like maxing it at 1080p will use all 4GB on our cards then. But as I thought when I saw it on a stream last night, the textures aren't nearly good enough to warrant the vRAM usage. Oh well, GG.
-
I don't mind a game using more vRAM than Crysis 3 without looking equal or better... *IF* it has superior technological benefits, like draw distance and good alpha-to-coverage (ATOC) for grass and stuff. But if there isn't anything truly advanced in those fields, excess vRAM usage is just unoptimization, bred from no longer having to optimize down for the last-gen consoles. -
You can't compare different games just by looking at the graphics and decide how much VRAM each one should use. -
Does the game have MSAA/SSAA? That can seriously bump up video memory usage too, arguably more so than any other setting.
-
Correct me if I'm wrong, but don't textures make up the bulk of vRAM usage? I'm simply thinking that if a game's textures look worse than Crysis 3's while using more vRAM, it had better make up for it with "advanced technologies" or some other <del>fluff</del> eye candy.
But even then, I can tell you Crysis 3 at 1080p with 2x MSAA never exceeded 3GB of vRAM throughout the entire game, so I fail to see how 6GB can ever be justified in a game with worse textures. (Yes, I know Shadow of Mordor "only" uses 4GB on Ultra, but still...)
EDIT: Yeah, OK, cranking up MSAA/SSAA would do the trick, I suppose.
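EDIT 2: For anyone curious, here's the back-of-the-envelope buffer math. It assumes just one RGBA8 color target and one D24S8 depth target (4 bytes per pixel each) plus a single-sample resolve target; real engines keep many more render targets, so treat these as lower bounds:

```python
# Rough framebuffer cost of MSAA at a given resolution.
def buffer_mb(width, height, samples, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * samples / 2**20

w, h = 1920, 1080
for msaa in (1, 2, 4, 8):
    color = buffer_mb(w, h, msaa)    # multisampled color buffer
    depth = buffer_mb(w, h, msaa)    # multisampled depth/stencil buffer
    resolve = buffer_mb(w, h, 1)     # resolved single-sample image
    print(f"{msaa}x MSAA at 1080p: ~{color + depth + resolve:.0f} MB")
```

So 8x MSAA adds 100+ MB at 1080p before any of the engine's other render targets are counted, and SSAA is worse still, since it renders everything at a higher internal resolution. -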
-
Getawayfrommelucas Notebook Evangelist
I've tried, but I can hardly notice the difference between FXAA and all the other AA modes. I like FXAA because of that... and it really doesn't hurt the FPS. Is there a significant difference that I may be overlooking? Worth reconsidering my AA?
-
Getawayfrommelucas Notebook Evangelist
I'm just curious if I'm missing anything. -
-
Getawayfrommelucas Notebook Evangelist
Price you pay to be a mobile gamer. $$$ -
Any idea on when the 860M benchmarks will be up?
I'm going to be running it on an i7-4710MQ, an 860M 2GB, and 12GB of RAM, with the game installed on an SSD (not sure how much of a difference that makes, just saying). What settings do you think will be viable for that rig? -
Why do people continuously cry foul while touting Crysis 3's VRAM usage, as if it were anywhere near an open-world game? A modern corridor-to-corridor shooter shouldn't use much VRAM.
I'm not defending this game, or saying there is no "foul", just pointing out that there needs to be a better argument made. The raw totality of a game's graphics does not mean every game after it should be compared to it on that basis. FPS and open-world games simply shouldn't be compared on their VRAM usage. Need an example to look forward to? Batman: Arkham Knight will be a much better barometer for how much VRAM this type of game should use.
The "lazy developers" argument is well... lazy. -
-
Then we agree that the VRAM usage is likely not unjustified?
-
Specs below in sig.
I get a variable 45-50 FPS, more or less, with everything at High at 1080p and transparency effects turned off (it's a nice ~10% boost and the visual hit is low).
The built-in benchmark is broken and its FPS measurements are wrong; FRAPS has it off by around 10 or so. The bench pegged me at an average of 75, but that's not what I get in-game at all.
Also, the opening sequence in the rain is a massive performance hog, much like Crysis 3's first rain stage. It's not representative of performance, so just ignore it and wait until you get into the open world. The number of Orcs on screen doesn't impact things much, if at all, but rain does for some reason.
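I have no idea what their benchmark actually computes, but as a toy example, a naively computed average (the mean of instantaneous per-frame FPS, rather than total frames divided by total time) comes out optimistic by exactly this kind of margin. The frame mix here is made up:

```python
# 90 smooth frames at 10 ms, 10 stuttery frames at 40 ms (invented data).
frame_ms = [10] * 90 + [40] * 10

naive = sum(1000 / ms for ms in frame_ms) / len(frame_ms)
true = len(frame_ms) / (sum(frame_ms) / 1000)

print(f"mean of per-frame FPS: {naive:.1f}")  # 92.5 -- looks great
print(f"frames / elapsed time: {true:.1f}")   # 76.9 -- what you feel
```
-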
I'm saying that if a game has amazing shadow quality, draw-distance detail, and all sorts of stuff like that, it can use as much vRAM as it pleases. But if the game spends all of it purely on textures, like most games do, then there is indeed a problem, because then we have to judge it on visuals alone.
I've no problem with a game like SoM using 3GB+ of vRAM on max. Even though the texture quality is not amazing, it's HUGELY open, the draw distance is pretty far, and it has pretty nice shadows and all that; above 2GB is OK for a game of that quality mark... but not 6GB. And from what we've seen of The Evil Within too, it's a fairly linear third-person game without ground-breaking graphics, set in closed corridors. There's no way on this planet it should need 4GB. Same with Wolfenstein: The New Order. Same with Titanfall, which shipped SUPER high-resolution textures (and BELIEVE ME, THOSE TEXTURES WERE THE CRISPEST THINGS I'VE SEEN IN A LONG TIME) but forgot to actually make *GOOD* textures, so we got super high-resolution muddy textures. It's a very interesting experience to see textures that are crisp as hell but fundamentally "messy" looking... the only way I can describe it is looking at them and thinking, "Wow, you guys couldn't colour inside the lines for this? It looks like you tried and failed." -
Why does the game look like vomit?
-
This game looks barely better than the old LOTR: War in the North. It's totally unjustifiable to need 4GB+ for such visual mediocrity.
-
ROFL, and here we all thought the next-gen consoles would usher in a new era of well-optimized PC ports, when the reality is the exact opposite. It's led to nothing but ballooning VRAM requirements due to the consoles' large unified memory pool. Developers aren't optimizing for the split memory architecture of the PC as they have in the past, so say goodbye to Ultra unless you happen to own a $1000 GPU. Are they even following industry best practices regarding streaming and texture compression to reduce reliance on VRAM and swapping? Doesn't seem like it.
Make no mistake about it: the monstrous VRAM requirement is solely down to poor optimization/laziness/incompetence, not art assets that are significantly improved over last-gen PC titles, which anybody can see isn't the case.
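For anyone wondering what texture compression actually buys: block-compressed (BCn/DXT) formats store each 4x4-pixel block in 8 or 16 bytes instead of 64 bytes raw. Rough numbers for a single 4096x4096 texture with a full mip chain (these sizes are straightforward math, not measurements from this game):

```python
# Memory for one square texture; a full mip chain adds ~1/3 on top.
def texture_mb(side, bits_per_pixel, with_mips=True):
    mb = side * side * bits_per_pixel / 8 / 2**20
    return mb * 4 / 3 if with_mips else mb

for fmt, bpp in (("RGBA8 uncompressed", 32), ("BC1/DXT1", 4), ("BC7", 8)):
    print(f"4096x4096 {fmt}: ~{texture_mb(4096, bpp):.0f} MB")
```

That's ~85 MB per 4K texture uncompressed versus ~11-21 MB block-compressed; skip the compression and the same art balloons 4-8x in VRAM. -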
At this rate, 4GB will become the new 2GB...
Honestly, though, shouldn't nVidia also take some heat for playing the vRAM shell game? If the 6GB Titan cards didn't exist, the devs would look at the PC market and go, "Welp, they only have 4GB of vRAM; we've gotta at least do the bare minimum to fit within that limit," as opposed to, "You don't have a 6GB card? NO ULTRA FOR YOU. Either plop down $1000 or go cry in that corner."
It's just really sad when $550 is now "mid-range". **** these devs. -
"But...but...Titans are not gaming cards."
They are now, apparently. One wonders if the devs ever saw this: Steam Hardware & Software Survey -
-
Or if they're anything like me, they'll just NOT BUY THE DAMN GAME, period. Yeah, joke's on these lazy, incompetent nitwits.
I understand what you're saying about advancing technology, but the flip side is that it gives devs an excuse to point the finger and go, "Well, you should've gotten that card." I mean, Quadros and Teslas have always existed, but devs were at least smart enough to realize they weren't gaming cards, so they didn't use them as a reference. But since nVidia marketed the Titan (and the laughable Titan-Z) as a gaming card, it's now OK to go, "Well, those 6GB cards can handle it, so tough luck." -
InB4ConspiracyTheories
-
I wish people would have the sense to vote with their wallets, but nobody really does. Or you'll get people going, "Oh well, I have a Titan, I can play it!" or something. I was even joking with Ethrem that he could finally put those 880M buffers to use as well XD. But still... devs just need to stop. It doesn't matter whether the hardware was capable; I don't remember anyone back in 2007 being able to run Crysis on max graphics at 1080p at 60 FPS. Ever. The difference is that Crysis boasted never-before-seen levels of visual fidelity and tech (like being able to shoot down almost every tree) that people never thought possible. These games are being made for almost-not-even-here-yet hardware, but they don't have ANYTHING visually or under the hood that justifies it. So we cry unoptimization.
I'm not against tech and requirements pushing ahead. I know I WILL be able to max just about any game for the next couple of years with my 780Ms, and so shall you. But if requirements advance, the tech should too, and I don't see the latter happening, which is what my problem is.
Don't worry though, they'll have to start "optimizing" for the consoles pretty soon, and PC won't be getting "hey, you need 8GB of vRAM and 32GB of system RAM for medium settings on this game that looks like a high-resolution APB: Reloaded!" as much. But right now? It's gonna get really bad really fast. -
-
I'm voting with my wallet. It's hard because I'm a massive LOTR nut, but I'm not buying the game until the textures get optimized, either by the devs or by modders. Medium looks like donkey for requiring 2GB VRAM. The difference between High and Ultra is minimal, but High currently requires 3GB VRAM, so if somebody manages to get High running smoothly on 2GB, they'll get my money.
-
Basically, there's no difference between High and Ultra, and barely anything noticeable from a still-frame perspective between Medium and High...
Just absurd. Again, not optimized for PC at all. They only do it because memory is shared between the system and the video card on the consoles, and Windows PCs don't handle that too well. nVidia and AMD could do themselves a favor and implement drivers that manage texture swapping smartly, so they don't need to throw an expensive 8GB of fast GDDR5 vRAM on every video card out there. The way this looks, even mid-range video cards will now need at least 6GB of vRAM, because you'll still need it to play at lower resolutions with ultra-quality textures.
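To illustrate the kind of "smart swapping" I mean, here's a toy LRU residency cache that keeps hot textures inside a fixed VRAM budget and evicts the coldest ones back to system RAM. Purely illustrative; real drivers and engines do this with far more sophistication:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU residency cache: texture id -> size in MB."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.resident = OrderedDict()

    def touch(self, tex_id, size_mb):
        """Request a texture for this frame, evicting cold ones if needed."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark as recently used
            return
        while self.used + size_mb > self.budget and self.resident:
            victim, freed = self.resident.popitem(last=False)
            self.used -= freed  # pretend the victim moved back to system RAM
        self.resident[tex_id] = size_mb
        self.used += size_mb

cache = TextureCache(budget_mb=2048)  # pretend we only have a 2GB card
for frame in (["rock", "orc"], ["orc", "sky"], ["rock"]):
    for tex in frame:
        cache.touch(tex, size_mb=85)
print(cache.used, "MB resident:", list(cache.resident))
```
-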
Even with Ultra textures, the game is still at Dark Souls levels of hideous.
-
I know for sure that at 1440p, on a 2GB 680, without downloading the Ultra texture pack, the game works fine on High with everything else maxed (except, most likely, SSAA). However, the user claimed he couldn't use Ultra without frame drops at that point, despite not having the ultra textures downloaded. I'm interested to see how his experience differs if he force-downloads the texture pack and THEN tries to run the game the same way. I'm going to ask him to do it now, and I'll report back when he's finished.
-
Utterly pathetic. -
-
I need to find out from him to be sure. I do not own the game and will not be buying it of my own volition (even if I had the money for it right now), at least not until the "GOTY" edition comes out with all the DLC, so I can't apply my rigorous testing methods to check.
Edit: My friend is an idiot; apparently he doesn't even remember what he says, and doesn't listen to people or convey information correctly. He meant he could not "put the game on ultra" as in "turn all the rest of the settings to Ultra", and he was downsampling from 1440p to 1080p. He also hacked SLI into working and gets 60-80 FPS with two 2GB 680s at a 1440p downsample with everything on High except ultra textures... I think. I can't even tell anymore whether the rest of the settings are on Ultra or just High. He also has a 120Hz screen, but "60 FPS is enough for me" was his statement when I said 60 FPS on High for two 680s was low.
Middle-earth: Shadow of Mordor Benchmarked
Discussion in 'Gaming (Software and Graphics Cards)' started by Marksman30k, Sep 30, 2014.