The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Middle Earth: The Shadow of Mordor Benchmarked

    Discussion in 'Gaming (Software and Graphics Cards)' started by Marksman30k, Sep 30, 2014.

  1. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Just benchmarked Middle Earth: Shadow of Mordor

    Crossfire AMD 290 (Tri-X and Gaming edition) at 1100MHz/1375MHz, +87mV

    ShadowOfMordor 2014-09-30 22-16-54-96.jpg

    can't seem to get a framegrab for some reason, anyway
    Max FPS: 120
    Min FPS: 35
    Average FPS: 90

    ShadowOfMordor 2014-09-30 22-16-48-21.jpg

    Basically, the game has a solid 90-100 FPS all round, the dips I assume were simply frame variances due to Crossfire. Extremely smooth gameplay, Crossfire seems to work properly.

    Temperatures.jpg

    ShadowOfMordor 2014-09-30 22-16-45-13.jpg

    ShadowOfMordor 2014-09-30 22-16-27-97.jpg

    ShadowOfMordor 2014-09-30 22-16-23-28.jpg

    Anyway, here's the GPU-Z shot; notice the VRAM usage

    After about 2 hrs of gameplay, VRM temps on the Tri-X sit at 80 degrees, the core sits at 75 degrees, fan speed is at 65%, and power draw spikes to 300W during fights with 30+ dudes.

    Screenshot, more pictures and GTX 860M benchmarks to come
     
  2. Arioch

    Arioch Notebook Consultant

    Reputations:
    109
    Messages:
    138
    Likes Received:
    10
    Trophy Points:
    31
    What the heck is that??? 8 GB of VRAM usage??? That's crazy.
     
  3. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    So it's well optimized (minus textures)? Breath of fresh air coming off of Dead Rising 3.
     
  4. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Yeah, textures are awful, I'd say about the same standard as Dragon Age: Origins, certainly not Crysis 3. However, area transitions are seamless, which is probably why it uses 8GB of VRAM. The game also performs extremely well, very smooth.

    Gameplay? I'd say trolls are totally OP, you can kill just about any tier 1-3 Captain with no effort by aggroing them towards a troll. Also extremely handy if you are getting chased by 50+ dudes, just run towards one of the many roaming trolls, they will all die and you still get the XP.
     
    CptXabaras likes this.
  5. Arioch

    Arioch Notebook Consultant

    Reputations:
    109
    Messages:
    138
    Likes Received:
    10
    Trophy Points:
    31
    It doesn't have AA applied?
     
  6. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    It might be my screenshot uploads, I did downgrade the original BMP files to JPEG
     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Uhh... as far as I know, multi-GPU support should copy the vRAM across. Does CrossfireX not do this? If it does, and GPU-Z somehow is reporting it a little oddly, it's only using 4GB per card.

    Also, have you found and downloaded the "Ultra" texture pack? Otherwise, ultra settings don't do much different from high, I believe.
     
  8. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    It's probably an error with the GPU-Z reporting; basically, the RAM banks of both GPUs were completely maxed out.
    At the moment, I don't think anybody has the high-res texture pack. If they have, I haven't been able to locate it.
     
  9. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    OOC - how moddable is this game? 6GB+ for textures is pretty crazy, especially when you actually see the game. Makes me think we'll have some texture gurus optimizing textures if the requirements are so out there.
     
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Hmm. So we need to wait then I guess. Welp, looks like maxing it at 1080p will use all 4GB of our cards then. But as I thought when I saw it on a stream last night, the textures aren't nearly good enough to warrant the vRAM usage. Oh well, GG.
     
  11. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Lol, rookie mistake, realized Crossfire wasn't enabled properly

    ShadowOfMordor 2014-09-30 23-45-47-87.jpg

    much much better
     
  12. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    This. Until a game beats Crysis 3 graphically THERE IS ABSOLUTELY NO JUSTIFICATION for using >3GB vRAM at 1080p. None. Zilch. Nada.
     
    octiceps likes this.
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    See, I know that shadows and lighting tech and stuff can take up vRAM, as well as texture caching for fast movement without popping and long sight distances without much quality degradation and all... but it STILL does not justify the vRAM usage. It just does not. Texture quality is almost the entire problem though. All of the above, at BEST, probably should be using up 600MB of vRAM. This whole texture quality thing is just ugh. There is absolutely 0 reason for the vRAM usage we're seeing here... ESPECIALLY since the ultra HD texture pack is not even OUT YET.

    I don't mind a game using more vRAM than Crysis 3 without looking equal or better... *IF* it has technological benefits that are superior. Like draw distance and good ATOC for grass and stuff... but if there isn't anything truly advanced in these fields, excess vRAM usage is just unoptimization bred from a lack of having to optimize down for the last-gen consoles.
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    About 4GB because he is using CrossfireX

    The amount of objects being drawn at any given time, shadow usage, animations, what type of game engine is being used, etc. are more than enough justification.

    You can't compare different games, just look at the graphics, and decide how much VRAM each game should use
     
  15. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    I don't know about that. Skyrim with ENB and loads of textures/models that look better uses less vRAM than this game. If modding Skyrim has taught me one thing, it's that greatness can occur if it's well optimized. Another example would be Watch Dogs with the E3 mod. It made the game look better and run better. In both examples, one of the changes to increase performance was to optimize high quality textures and, in some cases, models.
     
    octiceps likes this.
  16. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Like I posted above, I can see a lot of vRAM being used for certain technological feats... but 6GB is just a bit too much man. Especially since texture quality is mainly the only thing that bumps it up by a lot, which is a serious tell.
     
  17. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Does the game have MSAA/SSAA? That can seriously bump up video memory usage too, arguably more so than any other setting.
     
  18. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Correct me if I'm wrong, but don't textures make up the bulk of vRAM usage? I'm simply thinking if a game's textures look worse than Crysis 3 while using more vRAM, it had better make up for it with "advanced technologies" or some other <del>fluff</del> eye candy.

    But even then I can tell you Crysis 3 at 1080p with 2x MSAA never exceeded 3GB of vRAM throughout the entire game, so I fail to see how 6GB can ever be justified on a game with worse textures. (yes I know Shadow of Mordor "only" uses 4GB on Ultra, but still...)

    EDIT: yeah ok cranking up MSAA/SSAA would do the trick I suppose
     
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Not necessarily, unless we're talking modded Skyrim with uncompressed 4K or 8K textures. Most games allocate a texture pool of a few hundred megs of VRAM at the highest quality setting. Intelligent streaming and texture compression algorithms are designed to reduce VRAM usage while maintaining good IQ. Since SSAA renders everything at a much higher resolution, it drastically increases GPU memory and bandwidth requirements. Same thing with MSAA in modern deferred shading engines due to the massive G-buffer.
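    A back-of-the-envelope sketch of why supersampling blows up framebuffer memory in a deferred renderer. The 4-render-target G-buffer layout and 4 bytes per pixel are assumed round numbers for illustration, not figures from this thread; real engines vary widely:

```python
# Rough framebuffer memory estimate for a deferred-shading G-buffer.
# Assumption: 4 color render targets plus a depth buffer, 4 bytes/pixel each.

def gbuffer_mb(width, height, render_targets=4, bytes_per_pixel=4, samples=1):
    """Return approximate G-buffer size in MB; `samples` scales pixel count
    the way SSAA (or per-pixel MSAA storage) does."""
    pixels = width * height * samples
    total_bytes = pixels * bytes_per_pixel * (render_targets + 1)  # +1 for depth
    return total_bytes / (1024 ** 2)

native = gbuffer_mb(1920, 1080)             # plain 1080p
ssaa4x = gbuffer_mb(1920, 1080, samples=4)  # 4x SSAA: 4x the pixels, 4x the memory
print(f"1080p G-buffer:   {native:.0f} MB")
print(f"4x SSAA G-buffer: {ssaa4x:.0f} MB")
```

    Even with these modest assumptions, the G-buffer alone quadruples, which is why SSAA (and MSAA on a fat G-buffer) hits VRAM so much harder than any texture setting short of uncompressed mods.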
     
  20. EvoHavok

    EvoHavok Notebook Consultant

    Reputations:
    5
    Messages:
    139
    Likes Received:
    25
    Trophy Points:
    41
    From what I've read you can set the resolution over your native one (150%, 200% like in BF4), so you could say it has SSAA. Otherwise there seems to be no MSAA or post-process AA setting in the menu.
     
  21. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    I've tried, but I can hardly notice the difference between FXAA and all the other AAs. I like FXAA because of that... and it really doesn't hurt the FPS. Is there a significant difference that I may be overlooking? Worth reconsidering my AA?
     
  22. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    If you can't see a difference, don't worry about it. ;)

    Yeah that's OGSSAA just like BF4. Surprised there's no in-game PPAA option, but it's easy to force FXAA/MLAA through GPU control panel so no big deal.
     
  23. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    Good rule of thumb ;P

    I'm just curious if I'm missing anything
     
  24. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You are, but it'll set you back a whole lotta green to experience it. :p
     
  25. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    Ya think? I run a single 680m 4gig.
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah 4x SSAA @ 1080p has the same performance hit as running at 4K. Not happening on a single 680M.
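    The arithmetic behind that claim is straightforward: 4x SSAA shades four samples per 1080p pixel, which works out to exactly the native 4K pixel count (a quick sanity check, not a measurement from this thread):

```python
# 4x SSAA at 1080p shades exactly as many pixels as rendering natively at 4K,
# which is why the performance hit is comparable.
pixels_1080p_4xssaa = 1920 * 1080 * 4
pixels_4k = 3840 * 2160
print(pixels_1080p_4xssaa, pixels_4k)  # 8294400 8294400
```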
     
    Getawayfrommelucas likes this.
  27. Getawayfrommelucas

    Getawayfrommelucas Notebook Evangelist

    Reputations:
    84
    Messages:
    636
    Likes Received:
    51
    Trophy Points:
    41
    Meh - oh well!

    Price you pay to be a mobile gamer. $$$
     
  28. Emperor_Nero

    Emperor_Nero Notebook Enthusiast

    Reputations:
    0
    Messages:
    25
    Likes Received:
    0
    Trophy Points:
    5
    Any idea on when the 860m benchmarks will be up?

    I'm going to be running it on a i7-4710MQ, 860m 2gb, 12 gb of RAM, have it installed on an SSD (not sure how much of a difference it'd make, just saying). What settings do you think will be viable for that rig?
     
  29. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Why do people continuously cry foul, while touting Crysis 3's VRAM usage, as if it was anything near an open-world game? Your modern corridor-to-corridor shooter shouldn't use much VRAM.

    I'm not defending this game, or saying there is no "foul", just pointing out that there needs to be a better argument made. The raw totality of a game's graphics does not mean every game after it should be compared to it on the basis of graphics. FPS and open-world games simply shouldn't be compared in their VRAM usage. Need an example to which to look forward? Batman: Arkham Knight will be a much better barometer, to determine how much VRAM this type of game should use.

    The "lazy developers" argument is well... lazy.
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Already did it properly.
     
  31. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Then we agree that the VRAM usage is likely not unjustified?
     
  32. Vitor711

    Vitor711 Notebook Evangelist

    Reputations:
    156
    Messages:
    654
    Likes Received:
    51
    Trophy Points:
    41
    Specs below in sig.

    I get a variable 45-50FPS more or less with everything at high at 1080 and transparency effects turned off (it's a nice 10% boost and the visual hit is low).

    The benchmark is broken and their FPS measurements are wrong. FRAPS has it off by around 10 or so. The bench pegged me as an average of 75 but that's not what I get in game at all.

    Also, the opening sequence in the rain is a massive performance hog, much like Crysis 3's first rain stage: it is not representative of performance, so just ignore it and wait until you get into the open world. The number of Orcs on screen doesn't impact things much if at all, but rain does for some reason.
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Oh no, I believe it *IS* unjustified, purely because texture quality is the main factor in the size of the vRAM pool being needed. They actually have options that tell how much vRAM would be used as you change them in the options menu, if you watch TotalBiscuit's review. What I AM saying is that I do NOT believe a game needs to look better or even close to Crysis 3 to justify something like 3GB of vRAM. What I am ALSO saying is that Shadows of Mordor proved itself unoptimized with the vRAM buffer purely due to the fact that its main definition between 2GB, 3GB and 6GB vRAM buffers being needed is its texture quality, rather than all its other visual benefits like the draw distance and how far away detail is kept, etc.

    I'm saying if a game has amazing shadow quality and draw distance details and all sorts of stuff like that, it can use a lotta vRAM as much as it pleases. But if the game relegates itself purely to that being on textures, like most games do, then there is indeed a problem, as then we must relegate it to visuals.

    I've no problem with a game like SoM using 3GB+ vRAM on max. Even though the texture quality is not amazing, it's HUGELY open and the distance one can see is pretty far, and it has pretty nice shadows and all that, and I feel that above 2GB is ok for a game of that quality mark... but not 6GB. And from what we've seen of The Evil Within too, it's a fairly linear 3rd person game without ground-breaking graphics in closed corridors. There's no way on this planet it should need 4GB. Same with Wolfenstein: The New Order. Same with Titanfall, which simply made SUPER high resolution textures (and BELIEVE ME THOSE TEXTURES WERE THE CRISPEST THINGS I'VE SEEN IN A LONGGGG TIME) but didn't remember to actually make *GOOD* textures, so we had super high resolution muddy textures. It's a very interesting experience, to see textures be crisp as hell, but have a fundamental "messy" look to them... the only way I could describe it is me looking at them and thinking "Wow, you guys couldn't like colour in the lines for this? It looks like you tried to and failed" about them.
     
    HTWingNut likes this.
  34. klauz619

    klauz619 Notebook Geek

    Reputations:
    0
    Messages:
    85
    Likes Received:
    8
    Trophy Points:
    16
    why does the game look like vomit
     
  35. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    This game looks barely better than the old LOTR: War in the North. It's totally unjustifiable needing 4GB+ for such visual mediocrity.
     
    octiceps likes this.
  36. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    ROFL, and here we all thought the next-gen consoles would usher in a new era of well-optimized PC ports when the reality is the exact opposite. It's led to nothing but ballooning VRAM requirements due to the consoles' large unified memory pool. Developers aren't optimizing for the split memory architecture of the PC as they have in the past, so say goodbye to Ultra unless you happen to own a $1000 GPU. Are they even following industry best practices regarding streaming and texture compression to reduce reliance on VRAM and swapping? Doesn't seem like it.

    Make no mistake about it, the monstrous VRAM requirement is solely because of poor optimization/laziness/incompetence, not because the art assets are significantly improved in any way over last-gen PC titles, which anybody can see isn't the case.
     
    D2 Ultima and TBoneSan like this.
  37. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    At this rate 4GB will become the new 2GB...

    Honestly though shouldn't nVidia also take some heat for playing the vRAM shell game? If the 6GB Titan cards didn't exist, then the devs would look at the PC market and go "welp they only have 4GB of vRAM, we gotta at least do the bare minimum to make it fit in that limit" as opposed to "You don't have a 6GB card? NO ULTRA FOR YOU. Either plop down $1000 or go cry in that corner."

    It's just really sad when $550 is now "mid-range". **** these devs.
     
    octiceps and TBoneSan like this.
  38. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't exactly blame them for MAKING 6GB cards, because the quadro cards always had really large vRAM pools and the Titan was meant to be like a best of both worlds kind of thing from them. It's the same with the GTX 580M having 2GB vRAM back when no other cards had that much XD, they just advanced things naturally a bit I guess. It's like leaving a large plate of food on a table; just because it's there doesn't mean one person has to try and eat it all. That's a bad metaphor

    I guarantee you devs do NOT look at that. Most of them think people'll toss a stronger GPU at it to work if they want it to, and if not, then they'll just deal with lowered settings... or they'll buy a console.
     
  40. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Or if they're anything like me, they'll just NOT BUY THE DAMN GAME period. Yeah jokes on these lazy incompetent nitwits.

    I understand what you're saying about advancing technology, but the flip side is it provides an excuse for devs to point their finger and go "well, you should've gotten that card". I mean, Quadros and Teslas always existed, but the devs were at least smart enough to realize they weren't gaming cards, so they didn't use those as a reference. But since nVidia marketed the Titan (and the laughable Titan-Z) as a gaming card, it's now OK to go "well, those 6GB cards can handle it, so tough luck".
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    InB4ConspiracyTheories
     
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, I understand, but cards like this were out for a GOOD while before devs started to heavily use more than 2GB vRAM... and games that used more than 2GB existed for a really long time too, back when only the 7970 had 3GB vRAM. Arma 2 does, Skyrim with mods does, apparently Crysis 2 does at high enough resolutions and with some extra graphical mods (it uses 2.2GB for me normally at 1080p maxed anyway), etc. They just became way too common after the new consoles released, because devs figure "oh well, PC has no shortage of resources". Even if they instantly tried to use 4GB vRAM it'd still be an optimization joke; 6GB is just so unheard of that even Joe Public realizes it's incredibly wrong.

    I wish people would have the sense to vote with their wallets, but nobody really does. Or you'll get people going "oh well I have a titan, I can play it!" or something. I was even joking with Ethrem that he could finally put those 880M buffers to use as well XD. But still... devs just need to stop. It doesn't matter if the hardware was capable; I don't remember anyone back in 2007 able to run Crysis on max graphics at 1080p at 60fps. Ever. The difference is that Crysis boasted never-before-seen levels of visual fidelity and tech (like how almost every tree you could shoot down etc) that people never thought could happen. These games are being made for almost-not-even-here-yet technology, but they don't have ANYTHING visually or under the hood that justifies it. So we cry unoptimization.

    I'm not against tech and requirements and everything pushing ahead. I know I WILL be able to max just about any game for the next couple years with my 780Ms and so shall you. But if requirements advance, the tech should too, and I don't see the latter happening, which is what my problem is.

    Don't worry though, eventually they'll have to start "optimizing" for the consoles pretty soon, and PC won't be getting "hey you need 8GB vRAM and 32GB system RAM for medium settings on this game that looks like a high resolution APB reloaded!" as much. But right now? It's gonna get really bad really fast.
     
  43. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'm voting with my wallet. It's hard because I'm a massive LOTR nut, but I'm not buying the game until the textures get optimized, either by the devs or by modders. Medium looks like donkey for requiring 2GB VRAM. The difference between High and Ultra is minimal, but High currently requires 3GB VRAM, so if somebody manages to get High running smoothly on 2GB, they'll get my money.
     
  45. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Basically there's no difference between high and ultra, and barely anything noticeable from a still-frame perspective between medium and high...

    Just absurd. Again, not optimized for PC at all. They only do it because the memory is shared between system and video card on the consoles. Windows PCs don't handle that too well. NVIDIA and AMD could do themselves a favor and implement a driver that manages texture swapping smartly so they don't need to throw an expensive 8GB of fast GDDR5 vRAM on every video card out there. The way this looks, even mid-range video cards will now need at least 6GB vRAM, because you will still need it to play at lower resolutions with ultra quality textures.
     
    octiceps, TBoneSan and n=1 like this.
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Even with Ultra textures, the game is still at Dark Souls levels of hideous. :(
     
    TBoneSan likes this.
  47. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I know for sure that at 1440p on a 2GB 680 without downloading the Ultra texture pack it works fine on "high" with everything else maxed (except likely SSAA). However the user claimed he couldn't use "ultra" without framedrops at that point, regardless of not having the ultra textures downloaded. I am interested to note how his experience will differ if he force-downloads the texture pack and THEN tries to run the game the same way. I'm going to ask him to do it now and I'll report back when he's finished.
     
  48. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Like I said, $500 cards are now "mid-range", and $1000 cards will be the new "high end". Anything below $200 could probably be categorized as "disposable" at this point.

    Utterly pathetic.
     
    HTWingNut likes this.
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well that's weird, because Ultra textures w/o the texture pack is the exact same as High textures.
     
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    <S>That's EXACTLY my point. I thought it ought to be the same, but I've been wondering if not installing "ultra" textures somehow reduces the overall performance hit of the whole thing. Like uh... ultra needs 3GB+, high needs 2GB+, medium needs 1GB+ only, etc. and installing the ultra textures brings it up to the 1GB low, 2GB med, 3GB high, 6GB ultra levels.</S>

    I need to find out from him to be sure. I do not own the game and will not be buying it of my own volition (even if I had money for it right now) at least not until the "GOTY" edition comes out with all the DLC, so I can't do my rigorous testing methods to check.

    Edit: My friend is an idiot, doesn't even remember what he says and doesn't listen to people talking or convey information correctly, apparently. He meant he could not "put the game on ultra" as in "turn all the rest of the settings to ultra" and was downsampling from 1440p to 1080p. He also hacked in SLI to work and gets 60-80fps with two 2GB 680s at 1440p downsample with everything on "high" except ultra textures... I think. I can't even tell if the rest of the settings are on ultra or just high anymore. He also has a 120Hz screen but "60fps is enough for me" was his statement when I said 60fps on "high" for two 680s was low fps.
     