That's because MW3's internal render resolution is 1024x600, while something like Virtua Tennis 3 renders natively at 1080p. If what you said is true (which it isn't), how do you explain these direct frame buffer captures on 360 and PS3, both of which are clearly full 1080p?
http://images.eurogamer.net/articles//a/7/4/5/2/8/Virtua_Tennis_1_1080p.jpg
http://images.eurogamer.net/articles//a/7/4/5/2/8/Virtua_Tennis_2_1080p.jpg
http://images.eurogamer.net/articles//a/7/4/5/2/8/Virtua_Tennis_3_1080p.jpg
Anyway, did you take a look at that Beyond3D post I referenced? Read it and the links embedded within; it'll tell you all you need to know about resolution and scaling on last-gen consoles. I studied this stuff pretty in-depth many years ago, and you're not gonna convince me otherwise about something I've seen with my own eyes and know to be correct.
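Just to put rough numbers on what sub-native rendering means (this is purely illustrative arithmetic on my part, not something from the Beyond3D thread): when a 1024x600 buffer gets scaled up to a 1080p output, it's stretched roughly 1.9x horizontally and 1.8x vertically, so each rendered pixel ends up covering over three output pixels, which is exactly the kind of softness pixel counting picks up on.

```cpp
#include <cstdio>

// Illustrative only: how much each source pixel gets stretched when a
// sub-native framebuffer is scaled to the console's output resolution.
// The resolutions below are the ones being argued about in this thread.
int main() {
    const double srcW = 1024, srcH = 600;   // claimed MW3 internal buffer
    const double outW = 1920, outH = 1080;  // 1080p output mode

    std::printf("Horizontal scale: %.3fx\n", outW / srcW);   // ~1.875x
    std::printf("Vertical scale:   %.3fx\n", outH / srcH);   // ~1.800x
    std::printf("Each rendered pixel covers ~%.2f output pixels\n",
                (outW / srcW) * (outH / srcH));               // ~3.38
    return 0;
}
```

A buffer that is already 1920x1080 obviously scales 1:1, which is why the Virtua Tennis captures above look so clean by comparison.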
-
Well, I'm telling you what I know. Also, a direct framebuffer capture of the dashboard is the same 620p thing, even if you set the console to output at 1080p. I had a friend test that in real time for me when I pointed it out to him, and he said it wasn't true. So for a game to render above the dashboard is indeed weird. No, I haven't looked at that post in depth yet. I did a lot of looking before, at different games that were meant to have an internal render above the one I know to be the X360's max, but I have neither an X360 nor any way to test the framebuffer captures myself, much less the games.
And he didn't only work on MW3. He didn't even try to find out all the things I mentioned until I started asking him what it's like working on and coding games (as I always wanted to get into game design), once I found out he was working at Activision and then later at Microsoft. If your statements are true (and you have a lot of documentation to that effect), then the simple fact is that to this very day, MICROSOFT'S GAME DESIGN EMPLOYEES and many MANY MANY other developers from various companies, like Activision's collection of design studios (Treyarch, Infinity Ward, Sledgehammer, Raven Software, Neversoft - now defunct, High Moon Studios, and various others) as well as EA's repertoire of game design studios, don't even know how the consoles they are working on properly work. -
Maybe because they're game designers and not programmers/engineers or artists.
-
I was including the entire teams, which means the programmers, engineers and artists.
-
Oh, I was simply making the distinction between gameplay designers and more technical roles, e.g. Romero vs. Carmack, Bleszinski vs. Sweeney, etc.
-
Yes, I understand, but what I meant was I was including said people. Everyone in a team means everyone in a team XD.
-
Karamazovmm
I just hope that Total War actually starts leaning toward heavier system requirements.
Simply put, it's one of the most demanding games out there and it isn't even in a 64-bit exe -
64-bit .exe games don't really have many benefits yet. They have the ability to access and use a lot more memory in total, but they rarely do so (except apparently vRAM usage these days... >_>). I don't mind games using more out of our hardware, you see. I just want it to actually MEAN something, instead of just being "oh well, we don't need to optimize down for 512MB on X360 anymore, so we'll just use 6, 8, 12GB on PC and make 100GB games that are 12-hour SP linear story runthroughs".
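For what it's worth, if anyone wants to check whether a given game actually shipped a 64-bit or Large Address Aware executable, the PE header of the .exe tells you directly. Here's a little standalone sketch I put together (toy code reading standard PE/COFF fields, not any game's tooling; assumes a little-endian machine, i.e. any PC):

```cpp
#include <cstdint>
#include <cstdio>
#include <fstream>

// Reads the PE header of a Windows .exe and reports whether it is x64
// and/or Large Address Aware. Offsets and flags are from the PE/COFF spec.
int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s game.exe\n", argv[0]); return 1; }
    std::ifstream f(argv[1], std::ios::binary);
    if (!f) { std::fprintf(stderr, "cannot open file\n"); return 1; }

    char mz[2];
    f.read(mz, 2);                                 // DOS header starts with "MZ"
    if (mz[0] != 'M' || mz[1] != 'Z') { std::fprintf(stderr, "not an .exe\n"); return 1; }

    uint32_t peOffset = 0;
    f.seekg(0x3C);                                 // e_lfanew: file offset of the PE header
    f.read(reinterpret_cast<char*>(&peOffset), 4);

    uint16_t machine = 0, characteristics = 0;
    f.seekg(peOffset + 4);                         // skip the "PE\0\0" signature
    f.read(reinterpret_cast<char*>(&machine), 2);  // IMAGE_FILE_HEADER.Machine
    f.seekg(peOffset + 22);                        // IMAGE_FILE_HEADER.Characteristics
    f.read(reinterpret_cast<char*>(&characteristics), 2);

    const bool x64 = (machine == 0x8664);          // IMAGE_FILE_MACHINE_AMD64
    const bool laa = (characteristics & 0x0020);   // IMAGE_FILE_LARGE_ADDRESS_AWARE
    std::printf("%s: %s, %sLarge Address Aware\n", argv[1],
                x64 ? "64-bit" : "32-bit", laa ? "" : "not ");
    return 0;
}
```

If you have Visual Studio installed, dumpbin /headers on the exe reports the same information.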
-
I think 64-bit should be standard, not extra. 64-bit CPUs have been out for over a decade, and 64-bit OSes have been widespread since Windows 7. LAA 32-bit apps can only access 4GB of RAM, and a lot of games nowadays need more than that. PlanetSide 2 used to crash incessantly (at least once per hour) due to hitting the limit. After the 64-bit update, out-of-memory crashes are almost completely gone, and performance got a nice boost too. It's the only game I have that consistently uses 5+ GB RAM, which is understandable for an MMOFPS of massive scale.
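That 4GB wall is easy to see with a toy program; this has nothing to do with PS2's actual code, it's just a rough sketch of the general behavior. Built as 32-bit (even with /LARGEADDRESSAWARE) it typically dies somewhere under 4GB of committed memory no matter how much RAM the machine has; built as 64-bit it keeps going until physical RAM plus pagefile give out.

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <vector>

// Allocate and touch 256 MiB chunks until malloc fails, then report the total.
// A 32-bit build runs out of address space; a 64-bit build runs out of memory.
int main() {
    const size_t chunk = 256ull * 1024 * 1024;
    std::vector<char*> blocks;
    size_t totalMiB = 0;

    for (int i = 0; i < 64; ++i) {               // hard cap at 16 GiB for sanity
        char* p = static_cast<char*>(std::malloc(chunk));
        if (!p) break;                           // out of address space (or RAM)
        std::memset(p, 0xAB, chunk);             // touch it so it's really committed
        blocks.push_back(p);
        totalMiB += 256;
    }
    std::printf("allocation stopped after ~%zu MiB\n", totalMiB);
    for (char* p : blocks) std::free(p);
    return 0;
}
```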
-
If PS2 uses 5GB of RAM, then that's the ONLY x64 game I know of that uses over 3GB, when most x64 games don't even go far past 2GB. BF4 uses 2.2GB at most (3GB with Mantle) and all the x64 CoD games use 2GB alone. BF4 also used 2GB alone. So did Titanfall. Watch Dogs I believe also used 2GB too. x64 is currently more of a thing people just use for no reason.
-
Every other FPS is a drop in the ocean compared to PS2.
LOL I think most people have >4GB combined RAM + VRAM. Things get really nasty when you're dealing with SLI/CrossFire since it eats into the address space even further.
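If anyone's curious what their own combined number is, on Windows it's only a couple of API calls. Rough sketch (it only looks at adapter 0, so on SLI/CrossFire or Optimus systems you'd want to enumerate every adapter):

```cpp
#include <cstdio>
#include <windows.h>
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

// Report physical RAM plus the primary adapter's dedicated VRAM,
// i.e. the "combined RAM + VRAM" figure being discussed above.
int main() {
    MEMORYSTATUSEX mem = {};
    mem.dwLength = sizeof(mem);
    GlobalMemoryStatusEx(&mem);                        // total physical RAM

    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;
    IDXGIAdapter* adapter = nullptr;
    DXGI_ADAPTER_DESC desc = {};
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))  // adapter 0 = primary GPU
        adapter->GetDesc(&desc);

    const double GiB = 1024.0 * 1024.0 * 1024.0;
    std::printf("System RAM:     %.1f GiB\n", mem.ullTotalPhys / GiB);
    std::printf("Dedicated VRAM: %.1f GiB\n", desc.DedicatedVideoMemory / GiB);
    std::printf("Combined:       %.1f GiB\n",
                (mem.ullTotalPhys + desc.DedicatedVideoMemory) / GiB);

    if (adapter) adapter->Release();
    factory->Release();
    return 0;
}
```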
-
It's true though. I SERIOUSLY wish people would properly use hardware for real advancements in gaming; don't ever get that wrong. I just see way too many 64-bit games coming out that don't use even close to 4GB of RAM at all. Makes me feel like they could have just stayed x86 and been done with it. But they'll push for x64 to no end and then take almost no advantage of it. It's dumb.
-
If I had my way, all intensive apps like games would be native 64-bit, no 32-bit fallback. 4GB can be pretty restrictive for modern AAAs (especially in DX9, since combined RAM+VRAM usage cannot exceed 4GB), and native x64 apps gain additional performance benefits, which I'd argue is even more important than the much larger address space.
-
I wonder if this game will be playable with the latest Nvidia mobile cards, like the GTX 870M (6GB) and 880M (8GB).
-
By the time The Evil Within releases, those cards will be relegated to also-ran status.
-
If Nvidia releases the new 970M and 980M, yes. But I wonder if I can play this game with those cards.
-
I'm fairly sure the 870M and 880M can handle anything you toss at them. Incidentally, even Shadow of Mordor with its 6GB vRAM requirement LOL
-
You should be fine as long as you have at least 4GB VRAM.
-
That's what I wanted to hear
-
A 32-bit program (Oct_Test.exe) exceeding 4GB of DRAM + VRAM. Unfortunately I only have a 2GB card to test with, so while the requested VRAM usage of 2.8GB exceeds 2GB, some of it obviously spills over into GPU shared bytes.
Optimizing takes time and costs money, so as many have said, if it runs well enough on the hardware then optimizing usually slows right down from there. Not saying that's what's happened here, just that it happens.
Do you remember kkrieger-beta from quite some time ago, a sub-100KB FPS? Whatever happened to that?
Look on the positive side, should give those 880M owners with 8GB of VRAM something to smile about.
-
Did it crash?
Yeah, I remember playing kkrieger many years ago. An amazing feat for sure, but I got tired of it pretty quickly. -
Did it crash? No. Having to spill over into shared bytes, though, doesn't do any favors for the smoothness of the graphics. The DRAM usage was not necessary; it was just allocated as non-pageable purely for demonstration.
Yep, kkrieger was only ever a beta demo AFAIK, which showed what might be possible but never seemed to move on from there. Wonder why, too much development time maybe. -
There was an entire website of decent games that were 100KB or under. My friend showed it to me one day. kkrieger may have been a part of that, which would explain why it never actually moved past the beta-demo stage you think it was in XD.
I could be totally wrong about kkrieger, mind you. I just know that the 100KB-limit projects existed, and for 100KB they looked pretty great.
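From what I understand (and I could be off on the details), the trick those tiny demos rely on is generating their textures, geometry and music procedurally at load time instead of shipping asset files, so the executable only has to carry code. Something vaguely along these lines, just to show the idea; this obviously isn't kkrieger's actual code:

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Synthesize a simple wood-grain-ish RGBA texture at runtime from sine waves,
// instead of loading it from disk. Real demos layer many such generators.
std::vector<uint8_t> makeTexture(int w, int h) {
    std::vector<uint8_t> rgba(static_cast<size_t>(w) * h * 4);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            double u = x / double(w), v = y / double(h);
            double rings = std::sin((u * u + v * v) * 60.0) * 0.5 + 0.5; // concentric bands
            double grain = std::sin(v * 400.0) * 0.08;                   // fine streaks
            double t = rings + grain;
            size_t i = (static_cast<size_t>(y) * w + x) * 4;
            rgba[i + 0] = static_cast<uint8_t>(140 + 80 * t);  // R
            rgba[i + 1] = static_cast<uint8_t>(90 + 50 * t);   // G
            rgba[i + 2] = static_cast<uint8_t>(50 + 30 * t);   // B
            rgba[i + 3] = 255;                                 // A
        }
    }
    return rgba;
}

int main() {
    auto tex = makeTexture(512, 512);
    std::printf("generated %zu bytes of texture data\n", tex.size()); // 1 MiB
    return 0;
}
```

That's a full megabyte of pixel data out of roughly twenty lines of code, which is a big part of how those demos stay so small.
-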
I have a GTX 860M and play Wolfenstein with no problems on ultra; it runs at about 40ish. Hmm, this will be interesting. I plan on building a new PC come tax time anyway. Hope my laptop can at least run it somewhat for now.
-
Unless that is a 4GB 860M, that game cannot use ultra textures. You've been on "high".
-
ROFL might be locked at 30 FPS too. :laugh:
Rumor: The Evil Within PC May Be Locked At 30FPS | DSOGaming | The Dark Side Of Gaming -
I'm trying my damned hardest not to burst out laughing because I'm in a TeamSpeak full of like 10 other people. But holy meowcow, that is something... lolwhat
http://www.pcgamer.com/the-evil-within-locked-at-30fps-but-you-can-change-it-manually/ Whoops, apparently it's no rumor but it is confirmed, and "intentional". -
It's hard not to laugh; what made me laugh the most is this gem:
"There it is people, throw your new high end GPU into the gutter, because you now need 4gb vram to run 1080p. Such a shame as I was really looking forward to buying this on day one. No chance now. This is going to be a disaster. Only mods will save this game." -
King of Interns
Erm, why not just run textures on high? At 1080p it is unlikely you will see much difference anyway. I actually enjoyed Wolfenstein. Gameplay was fun. Are graphics all people care about these days.....? Come on people! Quit complaining and start gaming.
If you don't like it, don't buy it, right! -
If someone pays for first class, they expect first class, yes? Sure it's all about the gameplay, and people can lower the settings to 480p if they want, but when you pay for better than that, then surely you deserve it.
In any case, 4GB vRAM to run 1080p is ridiculous. If the beast that was Crysis 3 did not need that, then I have no idea why this game does - well, actually we do know: it's a crap port. Turning a blind eye to it only encourages it. I don't doubt that the game itself is good. I love RE4, but that first PC release is goddamn awful, there ain't no denying that. Sadly this seems to be going the same way. -
They explained it in that article and admitted it was because that's how the consoles are set up so that's how they set up the PC. I still don't get it though, because it's not like there is any justification for all that vRAM use.
-
Wow. Low FPS and letterboxing? Is this a movie or a game?
-
They're setting themselves up to release a remastered version of The Evil Within a few years down the line.
It's called laziness and not optimizing for the strengths of a specific platform, in this case the split memory architecture of the PC. -
To be honest, if reducing textures works like it did in MW3, then it will simply upscale the game from a lower res, regardless of what resolution you're running the game at. In MW3 you had to use "native" texture resolution to match 1080p; anything less, even "extra", was simply 720p upscaled (which drove me crazy because I couldn't get over how blurry things were).
If they said you CANNOT experience the game at 1080p, I REFUSE to upscale a game because somebody decided to use a generally accepted-as-bad engine AND still code badly.
Please note, I can indeed run the game, at beyond max. I simply dislike these practices. You're right, the solution would indeed be not to buy the game, but if only most people would do that. The problem is that most everyone with a voice is GOING to buy the game because 1) hype and 2) they don't care much about settings, and they're gonna tell most everyone ELSE to do so too, and say that it runs fine.
Basically, it's pointless. Anyway, we'll see when the game comes out. I simply find it awful that this is now a THING. They even admitted outright that it needs 4GB vRAM because of the current-gen consoles. -
Right, but what is it using the vRAM for? I understand if it's shared RAM on the consoles, but that pool also covers stuff that needs to be stored in regular RAM, and it doesn't quite work the same way on the PC as it does on the consoles. That should mean no system RAM is used for the game itself, then, which is hard to believe.
-
Back in the X360 era, the devs tried really hard to compromise. They took out little things that ate up extra memory, compressed things like crazy, etc. Now that memory is plentiful, they appear to just be leaving those compression methods alone, because it's not necessary. The problem is it doesn't work out well for PC users, because when they transfer things over and/or bump up the resolution of the textures, shadows, etc. (since most of the console games don't run at 1080p at all), add extra light sources and reflections, maybe improve the view distance a bit, and so on, the vRAM counter goes up even further from the already-flippant 3-4GB usage on the consoles. This is why SoM wants 6GB. This is why The Evil Within wants 4GB for 1080p 30fps. This is why Titanfall wanted 3GB+ vRAM.
Thing is, the texture quality doesn't improve with resolution bumping. Run Half-Life 2 rendered at 8K, downsample to 1280x720, and then add 64xQ CSAA via the Nvidia control panel... what are you gonna get? The sharpest, crispest looking game you've ever seen, where jaggies are nowhere to be found. What's going to look better? L4D2 natively rendered at 1080p. Why? Because better drawn textures. The consoles can't handle impeccable texture detail because they're weaksauce, but they can handle high resolution textures due to an overabundance of memory, so the devs don't compress things much, let the memory get used up (which doesn't impact the game's performance), and let high res help out the mediocre drawn quality. Then, because the PC version is essentially (according to my dev friend) a "port of the beta of the primary console version" which is then updated for the varying hardware and such, they won't bother doing a thing about that. Because it's not enough profit for the work. Because the general public will simply, to quote a Ubisoft CEO's now-redacted statement, "toss a stronger graphics card at it". And so said, so done... Watch Dogs, Wolfenstein, BF4, BF Hardline (with its beta), Titanfall, etc. all came out and demanded POWAH from the PC, and only two of those games actually really deserved it. But everyone just went "oh, well I guess it's time to upgrade" and did so anyway.
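Just to put some loose numbers on the compression part (generic texture math on my end, not figures from any of these games): the same texture stored uncompressed versus block-compressed is roughly a 4-8x difference, and that adds up fast across an art set.

```cpp
#include <cstdio>

// Back-of-the-envelope VRAM math: the same texture stored uncompressed (RGBA8,
// 4 bytes/pixel) vs. block-compressed (BC1 = 0.5 B/px, BC3 = 1 B/px), with a
// full mip chain adding roughly one third on top of the base level.
double textureMiB(int w, int h, double bytesPerPixel, bool mips) {
    double base = w * double(h) * bytesPerPixel;
    return (mips ? base * 4.0 / 3.0 : base) / (1024.0 * 1024.0);
}

int main() {
    std::printf("2048x2048 RGBA8 + mips: %6.1f MiB\n", textureMiB(2048, 2048, 4.0, true)); // ~21.3
    std::printf("2048x2048 BC1   + mips: %6.1f MiB\n", textureMiB(2048, 2048, 0.5, true)); // ~2.7
    std::printf("4096x4096 RGBA8 + mips: %6.1f MiB\n", textureMiB(4096, 4096, 4.0, true)); // ~85.3
    std::printf("4096x4096 BC3   + mips: %6.1f MiB\n", textureMiB(4096, 4096, 1.0, true)); // ~21.3
    return 0;
}
```

Multiply that by a few hundred textures and the gap between a compressed and an uncompressed art set is measured in gigabytes, which is roughly the kind of "requirement" we're seeing.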
And so the cycle continues. And as nipsen said, a lot of people shelved the hobby. I do a lot of the same... I wait till I get something in giveaways or on a sale for a super cheap amount, usually. But I'm not fooling myself into thinking that's voting with my wallet or anything... I'm just waiting until I only have to give the amount they deserve for the effort they put in for this particular platform. I pre-ordered Payday 2 because of the effort it got and how they really tried. And it was worth it, despite the time it took to become an actually good game mechanically. But it did, and it was supported well. I can't give the same tip of the hat to something like Skyrim or CoD: Ghosts. It doesn't mean I won't play a game I like; it just changes how much I decide the game is worth giving. If the best game in the world ran like Watch Dogs, and did so because the devs purposefully coded it that way, then why would I give the game money? They clearly didn't want the money from this platform... but it's a personal thing and I have no delusions of grandeur about it.
I would like to have a loud enough voice to influence others about these things though, but alas I am just fairly recognized on this forum and nowhere else on the internet. And most of you guys are smart enough to not need me explaining all this to realize it doesn't deserve much of your money XD. -
Well said, fella. Going a bit off topic, but Payday 2 was like crack to me for quite a while. I wouldn't mind getting into it again. I got the Hotline Miami DLC but haven't tried it yet.
-
Jesus H. Christ! I still can't get over how these people could possibly think that 30 FPS and letterboxing is acceptable in this day and age. And why does this game have such ludicrously high system requirements if 25% of the screen is obscured by black bars?!
No offense, but if Japanese devs can't understand the desires and expectations of Western audiences, and specifically PC gamers, they shouldn't be making these kinds of games. They ought to read this well-reasoned letter written by Peter "Durante" Thoman, AKA the modder who saved Dark Souls on PC.
/rant -
But but but.. it's for the "filmic" effect
My personal favorite cop out yet. -
Here's another example of the kind of verbal diarrhoea falling from big developers' dirty mouths.
Assassin's Creed Dev Says Industry Is Dropping 60fps - Twinfinite
Ubisoft, you flock of shills, crawl back from whence you came! -
Of all the lame excuses I've heard made for this current generation of weak sauce consoles, this has to be the best one yet!
ROFL @ all the clueless 30 FPS apologists/console fanboys in the comments. Someone please show them the light. -
Isn't it! If he's not shilling, I don't know who is. Just grow a pair and say it's not worth the sacrifice for their games due to hardware (console) limitations.. fine. Nothing wrong with a little honesty and goodwill.
But spilling this kind of deliberate BS is pure sleaze. Pretty telling how switched on they think their audience is though, isn't it. -
So why not make it 24 FPS then? Even more cinematic!
Just realized how much this game reminds me of Dead Rising 3... -
So.....
Us mobile users are going to laugh at all the 3GB desktop cards now? That's a change.
There are going to be lots of pissed off 780 Ti owners when they see their buddy's 6GB 970M stutter less... I know I threw a fit when I realized that my 880Ms stutter less in Watch Dogs than my 780 Ti which was basically unplayable on Ultra.
I smell a conspiracy... must be lots of cash going towards destroying PC gaming right now..... -
Basically. You should read the Shadow of Mordor megathread over at OCN. Filled with crying 780/780 Ti owners.
-
And I imagine 970 and 980 owners are next since 4GB isn't that far from 3GB.
We may very quickly be in a world where desktop gamers actually cry over our laptops :/ -
Ha, Japanese devs at it again!
Final Fantasy XIII – PC Version Is Locked At 720p | DSOGaming | The Dark Side Of Gaming
What is this world coming to? -
Lazy developers everywhere.
Too profit-driven, with bosses giving the all-clear to invest as little time as possible and get the PC version out as quickly as possible.
This is what happens when consoles share the exact same architecture as the PC. We end up drawing the short straw. -
Seriously? And people were just praising them the other day for NOT being lazy. UGH!!
-
moviemarketing
The 980M is already shipping with 8GB VRAM on some models - better than the desktop version in this respect.