Here we go again. Unoptimized id Tech 5 game, round 3.
The Evil Within system requirements calls for some serious hardware | PC Gamer
Rage (2011) - Broken release drivers, texture streaming issues, no multi-GPU support, 60 FPS cap, zero graphics options, dynamic frame buffer (graphics and resolution adjust on the fly to maintain 60 FPS; a toy sketch of the idea is below), looks like donkey
Wolfenstein: The New Order (2014) - Still texture pop-in, no multi-GPU support, 60 FPS cap, limited graphics options (no AA & AF), Ultra locked out on <3GB VRAM cards, looks like donkey
The Evil Within (2014) - ?
Amazing showing for an ancient OpenGL 3.2 (roughly DX10-class) engine. GG Bugthesda.
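For anyone curious what that "dynamic frame buffer" actually does: the engine watches frame time and shrinks or grows the render resolution to stay inside the 60 FPS budget. Here's a minimal sketch of the general technique in plain C. This is not id's actual code; the function name, thresholds, and step sizes are made up for the example.
Code:
/* Toy model of a dynamic-resolution controller: if the last frame blew
 * the 60 FPS budget, shrink the render scale; if there's headroom, grow
 * it back. Illustration of the general technique only. */
#include <stdio.h>

#define TARGET_MS 16.6   /* 60 FPS frame budget */
#define MIN_SCALE 0.5    /* never drop below 50% of native resolution */
#define MAX_SCALE 1.0
#define STEP      0.05

static double update_render_scale(double scale, double frame_ms)
{
    if (frame_ms > TARGET_MS && scale > MIN_SCALE)
        scale -= STEP;                       /* over budget: render fewer pixels */
    else if (frame_ms < TARGET_MS * 0.9 && scale < MAX_SCALE)
        scale += STEP;                       /* comfortable headroom: scale back up */
    return scale;
}

int main(void)
{
    /* fake frame times (ms) standing in for a real profiler readout */
    double frames[] = { 14.0, 18.5, 21.0, 17.0, 15.0, 13.0 };
    double scale = 1.0;
    int native_w = 1920, native_h = 1080;

    for (int i = 0; i < 6; i++) {
        scale = update_render_scale(scale, frames[i]);
        printf("frame %d: %.1f ms -> render at %dx%d\n", i, frames[i],
               (int)(native_w * scale), (int)(native_h * scale));
    }
    return 0;
}
Real engines snap the scale to a handful of fixed resolutions and smooth the frame-time signal, but the feedback loop is the same basic idea.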
-
Haha, I don't care. That is one of my most anticipated games this year. I shall play it even if it means fiddling around with settings and drivers for hours and hours. I've seen some previews from YouTubers, and the game does need a decent rig.
It's been so long since I played a horror game, and it looks awesome -
dumitrumitu24 Notebook Evangelist
But 4 GB VRAM is the minimum for optimal play even on low? I thought it would be something similar to Wolfenstein, which wasn't a badly optimized game apart from the 3 GB VRAM for Ultra. It worked fine for me. Guess we have a clear winner for most demanding game, and I thought it would be Ryse
-
50GB? Oh, those poor souls with crap internet.
-
Of course, I have no right to preach, but PC gamers should care about stuff like this. This kind of crap goes on year after year, one AAA release after another, and it's unacceptable.
Don't the studios/publishers know that obscene system requirements and poor optimization turn off potential customers? They're essentially shooting themselves in the foot with these practices, among others. Then they turn around and blame poor sales on "piracy" and "all PC gamers are thieves" and devote even fewer resources to PC. And so the vicious cycle continues.
-
I always say they're going crazy now that they don't HAVE to optimize anymore. I rarely buy these new games anymore. There is zero reason 200GB = 4 games on my SSD (Titanfall, Wolfenstein, CoD: Ghosts, The Evil Within, all 50GB+ each, NOT COUNTING DLC). And it's not like the visuals make up for it, or game length either. It's all pure unoptimization. Look at Payday 2: they got a 30GB game down to about 12GB EASY just by optimizing how things loaded... AND it still provided benefits. Like come on!
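I don't know what Overkill actually changed to shrink Payday 2, but one common way an install drops that much is deduplicating identical assets that got packed into several archives, keeping one copy plus references. A toy sketch of content-hash dedup in C; the asset list and numbers are invented for the example.
Code:
/* Toy illustration of shrinking a game install by deduplicating identical
 * asset blobs before packing. NOT what Overkill actually did, just the
 * general idea. Assets are faked as short strings; a real packer would
 * hash file contents on disk. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* FNV-1a: a simple, well-known content hash (not cryptographic). */
static uint64_t fnv1a(const unsigned char *data, size_t len)
{
    uint64_t h = 1469598103934665603ULL;
    for (size_t i = 0; i < len; i++) {
        h ^= data[i];
        h *= 1099511628211ULL;
    }
    return h;
}

int main(void)
{
    /* the same texture packed into several "level archives" */
    const char *assets[] = { "brick_wall.dds", "crate.dds",
                             "brick_wall.dds", "brick_wall.dds" };
    uint64_t seen[16];
    size_t n_seen = 0, packed = 0, skipped = 0;

    for (size_t i = 0; i < 4; i++) {
        uint64_t h = fnv1a((const unsigned char *)assets[i], strlen(assets[i]));
        int dup = 0;
        for (size_t j = 0; j < n_seen; j++)
            if (seen[j] == h) { dup = 1; break; }
        if (dup)
            skipped++;               /* store only a reference, not a copy */
        else {
            seen[n_seen++] = h;
            packed++;                /* first occurrence goes into the pack */
        }
    }
    printf("packed %zu unique assets, skipped %zu duplicate copies\n",
           packed, skipped);
    return 0;
}
Recompressing the remaining unique data is the other big lever, but dedup alone shows how a repack can shrink an install without cutting any content.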
-
Well, the videos of this game on YouTube running on PS4 were stuttering.
Maybe it's meant to be played at framerates lower than 30! lol -
dumitrumitu24 Notebook Evangelist
I saw Shadow of Mordor getting nice reviews so far, but I think the minimum is a GTX 560? That's also a demanding game
-
Oh dear, oh dear, this game is going to end up with negative reviews for the PC version. I can see loads of people buying this on Steam and then having a number of issues running it. To think this would have been the first game I'd buy on day one after years of not doing so; looks like my run will carry on. Will wait for the GOTY edition, aka the "we fixed everything" version.
-
What worries me is that the new Doom is going to be the same sad story.
-
dumitrumitu24 Notebook Evangelist
Has anybody seen the requirements for Shadow of Mordor? 1 GB VRAM for low textures, 2 GB for medium, 3 GB for high, and 6 GB for ultra. The battle has just begun, haha, and probably next week Ubisoft will take the crown with the official AC: Unity requirements
-
saturnotaku Notebook Nobel Laureate
Maybe it's because I play at 720p (well, 800p on account of my MacBook's 16:10 screen), but Wolfenstein runs decently well at medium/high on my comparatively ancient Radeon 6770M with only 1 GB vRAM.
-
The point isn't whether it runs well; it's that requiring something like this is unnecessary, it's commonplace these days, and it comes without a comparable increase in visual fidelity. -
It seems these days devs simply look at what's the most powerful thing available on the market, and then use that as a baseline for their "Ultra" requirements.
-
I guess there is SOME use to the ridiculous amounts of VRAM in mobile GPUs, huh?
hahaha
-
R.I.P. GTX 780M, it was good knowing ya. We had good times. Unfortunately, bad optimisation on the part of devs has forced us apart. Love you always, but the GTX 980M is calling...
-
How is asking for a 2+ year old card and 4GB VRAM in any way "ridiculous"?
-
If they know the audience they're speaking to, something more and more devs are doing on PC, then they gave out the specs needed to max out the game's settings.
-
Which tells me they don't know their audience.
-
As far as recommended specs have always gone, they've meant "high" settings, 1080p, and a "playable" framerate (30 or higher). I have NEVER seen a game maxed on recommended specs at 1080p/60fps before. If they have seriously changed this, then fine. That's perfectly acceptable for an "ultra" spec requirement, especially for a game that's going to be running at a similar (though a bit toned down) visual fidelity on current-gen consoles at 900p/1080p 30fps. -
Hey guys! Just got myself this bad boy:
-
One thing I do give consoles credit for is the fact that later in their lifespan, devs learn some restraint and actually start trying to optimize, which spills over to PC. 50 gigs?
-
I only expect High/30fps out of the 670, which seems to contradict my previous statement. -
So what exactly are you trying to say?
-
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195129
-
By the way this is totally relevant right now apparently
-
I'm having zero issues playing Wolfenstein. I don't think id Tech 5 is going to be an issue.
-
And I think that's really the point everyone is making. It's not right. Like I was saying to Kevin earlier: if the game required those settings for 1080p 60fps Ultra settings, then I'd say OK, sounds fair. But recommended settings are NEVER 1080/60 @ ultra; they're 1080/~40 @ "high". Which is indeed a problem as far as optimization goes. And people will yell at me and say I don't know how they optimized and I'm not a dev and whatever, but the end result speaks for itself: if the game looks enough better than what we've seen before to justify its ridiculous requirements, then it is justified. If it doesn't, then we have a problem of unoptimization/sloppy coding/all-around not caring for the PC crowd. And that's just no bueno. -
What do you think of the Ryse system reqs?
Minimum:
CPU: Dual core with HyperThreading technology or quad core CPU
Examples: Intel Core i3 2.8 GHz (3220T) or AMD Phenom II X4 3.2 GHz (945)
Memory: 4 GB RAM
GPU: DirectX 11 graphics card with 1 GB video RAM
Examples: NVIDIA GeForce GTX 560 or AMD Radeon HD 7770
OS: 64 bit Windows (Vista, 7, 8)
HDD: 26GB
Recommended:
CPU: Quad Core or Six Core CPU
Examples: Intel Core i5 3.3 GHz (2500k) or AMD FX-6350 3.9 GHz
Memory: 8 GB RAM
GPU: DirectX 11 graphics card with 2 GB video RAM
Examples: NVIDIA GeForce GTX 660Ti or AMD Radeon 260x or 7850
OS: 64 bit Windows (Vista, 7, 8)
HDD: 26GB
Recommended for 4k gameplay:
CPU: Quad Core or Eight Core CPU
Examples: Intel Core i7 3.5 GHz (2700k) or AMD FX-8350 4.0 GHz
Memory: 8 GB RAM
GPU: DirectX 11 graphics card with 4 GB video RAM
Examples: NVIDIA GeForce GTX 780/ Titan or AMD Radeon 290x
OS: 64 bit Windows (Vista, 7, 8)
HDD: 26GB -
I already discussed it in the first post of this thread: http://forum.notebookreview.com/gam...diculous-system-requirements.html#post9785053
Long story short, it's awesome, better than 99% of games out there. Crytek is doing PC gamers a great service by being so in-depth as to list 3 tiers (Low, High, and Ultra+) along with specific hardware, so informed people can make direct comparisons and there's no room for doubt. And the requirements are reasonable given the level of visual fidelity. -
The consoles this round are shockers. -
My only issue is the "8GB RAM recommended" thing. Until I see games using 4GB+ of system RAM, I shouldn't ever see more than 4GB (I'll let 6GB slide) of recommended RAM. Not to say that having 8GB of RAM is a bad thing, and I recommend 8GB as the minimum to almost everyone these days who even considers light gaming, but there is no way games are going to use 6GB of RAM to run (which would require 8GB as the recommended spec on x64 OSes).
The 780/Titan for 4K gameplay is pretty good, to be honest. And Ryse does look quite impressive. Since min spec is 720/30, which is what the Xbox One runs at, the HD 7770 or GTX 560 is a perfect fit for it. The specs they ask for DO actually make sense. A 660Ti for 1080/30+ is what all games that run at 720p or higher on an Xbox One should need, considering the extremely weak nature of the X1's GPU. -
To require 8GB RAM would indicate 64-bit, would it not?
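Pretty much, yes: a 32-bit process can only address about 4 GB (usually 2-3 GB usable on Windows) no matter how much RAM is installed, so a game that genuinely needs more than that has to ship a 64-bit executable. A trivial C check of a build's pointer width, just for illustration:
Code:
/* Minimal check of how wide the current build's pointers are. A 32-bit
 * build (4-byte pointers) tops out at 4 GB of address space regardless
 * of installed RAM, which is why an 8 GB requirement implies 64-bit. */
#include <stdio.h>

int main(void)
{
    printf("pointer size: %zu bytes (%zu-bit build)\n",
           sizeof(void *), sizeof(void *) * 8);
    if (sizeof(void *) == 4)
        printf("32-bit build: at most 4 GB of address space, typically 2-3 GB usable\n");
    else
        printf("64-bit build: virtual address space far exceeds any installed RAM\n");
    return 0;
}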
-
The Xbox One is seriously pathetic though. I won't lie. I really can't believe they did not intend for that console's max internal render resolution to be above 900p in 2013. -
-
I stand corrected about the res on X1. If that's the case, then those specs are too high for a PC minimum... but I don't know what settings the X1 version runs at, so I can't comment. But for how it looks, it's still okay, I guess. I'd need to see what the min spec could do in terms of resolution and settings, and how well it runs.
The PS3 was definitely stronger than the X360; devs simply could not (and still cannot) figure out how to make proper use of its hardware, is all. I have it from a dev who's worked on X360, PS3, Wii, Wii U and PC versions of the same game, who confirmed their power relative to each other for me. I think the PS3 was stronger than an 8800 GTX for sure; I don't think the X360 was, though. -
How so? The 360 had a stronger GPU and much higher bandwidth thanks to its eDRAM, which allowed free 2x MSAA; that's why it looked/ran better than the PS3 in multiplats.
It also had a stronger CPU: 3 x 3.2 GHz general-purpose IBM PPC cores w/SMT (very simple, lacking OoOE, tiny caches, much smaller die than the x86 P4/Athlon of the day), whereas the PS3 had only one PPC core plus 6 SPUs, which were underutilized by most devs who were not Sony first-party and hence lay around as useless DSPs (the SPU programming model is sketched after this post). Cell was an overhyped piece of trash whose real strength lay in HPC, not gaming.
To put the PS3 in comparison with an 8800 GTX is a joke. First of all, the 8800 GTX cost as much as an entire PS3 did. Second, they're not even in the same stratosphere performance-wise. The PS3's GPU (RSX) was a severely crippled off-the-shelf 7800 GTX 256MB (halved memory bus/controllers and ROPs), and the 8800 GTX was 3x as fast as a full-fledged 7800 GTX 512MB.
As far as Ryse is concerned, I'd venture the listed minspec is good for 720p45 or 900p30 (like XB1) at Medium. -
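To make the "SPUs lay around as useless DSPs" point above more concrete: an SPU couldn't just read main memory through a cache the way the 360's PPC cores could; each one had a 256 KB local store, and the programmer had to stage data in and out explicitly with DMA. The sketch below models that workflow in plain C, with memcpy standing in for the Cell SDK's real asynchronous DMA calls, so treat it as an illustration of the programming model rather than actual SPU code.
Code:
/* Plain-C model of the SPU workflow that made multiplat devs miserable:
 * no cache, just a small local store the programmer must fill and drain
 * explicitly. memcpy() stands in for the real DMA transfers, which on
 * hardware were asynchronous and had strict size/alignment rules. */
#include <stdio.h>
#include <string.h>

#define LOCAL_STORE_BYTES (256 * 1024)   /* each SPU had 256 KB total */
#define CHUNK_FLOATS      1024

static float local_store[LOCAL_STORE_BYTES / sizeof(float)];
static float xdr_ram[1 << 20];           /* "main memory" the SPU can't touch directly */

int main(void)
{
    size_t total = sizeof(xdr_ram) / sizeof(float);
    double sum = 0.0;

    /* Pull a chunk in, work on it locally, push results out. A
     * conventional core would just index xdr_ram[] and let the cache
     * handle data movement. */
    for (size_t off = 0; off < total; off += CHUNK_FLOATS) {
        size_t n = (total - off < CHUNK_FLOATS) ? total - off : CHUNK_FLOATS;

        memcpy(local_store, xdr_ram + off, n * sizeof(float));  /* "DMA in" */

        for (size_t i = 0; i < n; i++)
            sum += local_store[i];                              /* compute locally */

        /* results would be "DMA'd" back out to main memory here */
    }
    printf("sum = %f (processed %zu floats through the local store)\n", sum, total);
    return 0;
}
Real SPU code also has to double-buffer those transfers and juggle alignment restrictions, which is where a lot of the porting pain came from.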
I can't claim to know what the PS3's specs are or were. All I know is that it is most definitely a much more powerful system than the Xbox 360... actually capable of internally rendering 1080p games (though there were only a literal handful of them, according to my friend), while the Xbox 360's MAXIMUM internal render resolution was 1060 x 620 or something similar (all higher "resolutions", including 720p, were upscales done on an internal upscaler chip; a toy upscaler is sketched after this post). He said the PS3 is stronger, but only Sony knew how to actually code for it, because the architecture was confusing as hell and they provided no tools (the PS3 devkit was little more than a PS3 hooked up to a PC, while the X360 devkit had more tools available; the PS4 fixed this issue outright), so most people simply coded for the X360 and then ported to PS3, which is why PS3 games usually had worse quality, except for 1st-party Sony titles (visually, at least).
The Wii U devkit was worse, though. The Wii U's devkit has zero software, and people have to try Wii hacks to get their games running on a Wii U. It's part of the reason why the Wii U isn't getting any games even when they're still coming out for X360/PS3. The devs don't feel like putting that much time and effort into throwing a game there when it's barely going to sell; the time/effort it would take to even figure out the Wii U isn't worth the returns to publishers. -
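Whatever the real internal resolutions were (the next post disputes that there's any hard cap), the upscaler-chip part is easy to picture: the GPU renders fewer pixels, and a resampling pass stretches the image to the display resolution. A toy nearest-neighbor upscaler in C, with example numbers only:
Code:
/* Toy nearest-neighbor upscale: render at a low internal resolution, then
 * stretch to the display resolution. Basic idea behind a console's
 * hardware scaler; real scalers use much better filtering. */
#include <stdio.h>
#include <stdlib.h>

static void upscale_nn(const unsigned char *src, int sw, int sh,
                       unsigned char *dst, int dw, int dh)
{
    for (int y = 0; y < dh; y++) {
        int sy = y * sh / dh;              /* map output row to source row */
        for (int x = 0; x < dw; x++) {
            int sx = x * sw / dw;          /* map output column to source column */
            dst[y * dw + x] = src[sy * sw + sx];
        }
    }
}

int main(void)
{
    int sw = 1024, sh = 600;               /* example internal render resolution */
    int dw = 1280, dh = 720;               /* display resolution */
    unsigned char *src = calloc((size_t)sw * sh, 1);
    unsigned char *dst = calloc((size_t)dw * dh, 1);

    if (!src || !dst)
        return 1;
    upscale_nn(src, sw, sh, dst, dw, dh);
    printf("stretched %dx%d (%d px) to %dx%d (%d px): only ~%.0f%% of the "
           "output pixels were actually rendered\n", sw, sh, sw * sh,
           dw, dh, dw * dh, 100.0 * sw * sh / (dw * dh));
    free(src);
    free(dst);
    return 0;
}
The scaler can't invent detail, which is why a 600p image stretched to 1080p looks soft next to a native render.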
No, what I'm saying is that the 360 had the superior hardware. It had the better console GPU with an early unified shader architecture, a more conventional CPU setup that was much easier for developers to work with, and a great toolset. The PS3 had a weaker GPU, and while Cell's SPUs could be made to do wonderful things, it was a PITA to code for, so most multiplat devs did little to take advantage of it. Since the PS3 had only one general-purpose CPU core as opposed to the 360's three, it effectively had a fraction of the CPU power if the SPUs were not programmed properly, in addition to the weaker GPU and the lack of high-bandwidth eDRAM that were in the 360's favor.
Superior hardware was a big part of the reason the 360 either tied or beat the PS3 head-to-head in multiplats; it was never the other way around. For example, you can look through Digital Foundry's analyses of the entire CoD franchise from CoD 3 onward. The 360 almost always had better image quality (usually better AA/AF) and/or a more stable frame rate.
What you said is bogus; there is no hard limit on the render resolution of either console. You can see the resolutions of many last-gen console games here; there are a bunch that render at full 1920x1080 on both consoles. -
Nope, got ya wrong there. Get a capture card that actually grabs the internal render of the X360 and it will never cross the resolution I listed (it may be slightly different, like 1050 x 640 or something, but it definitely isn't past 1080 width or 650 height). Anything above that res *IS* an upscale. And it's funny, the person I'm talking about worked on CoD: MW3 as his first game XD. He's the one who usually tells me what the consoles are like for devs to code for. You ARE right that the PS3's hardware doesn't do as well unless you know what you're doing with it, which, as I did say, most devs DIDN'T, but again, according to him, if you COULD take advantage of the PS3, it beat the heck out of the Xbox 360. I will say colours are better on the Xbox 360 though; I know at least that much is always true regardless of game.
Also, fun tidbit: the Xbox One is SUPPOSED to have a max internal render of 900p and a similar upscale function for 1080p games like Forza, but I've heard from end users that the game does indeed render at 1080p. I haven't seen a firmware update that addresses the 900p max render, so my only guess is they updated the SKU, or gave different SKUs to different devs, or something? I'm really not sure. Eventually people will find out for sure though. My buddy was working at Microsoft at the time the X1 was released and that's how I found that bit out, but I can't find enough concrete proof for or against it anywhere.
I never coded for either. I'm not claiming first-hand knowledge, nor that I know everything. I did hear it second-hand though, which is why I'm not backing down on some points.
Also, again, you're right about coding for the X360 being a lot easier. But that's partially why PS3 ports were so bad. See, they never really got any time or attention. It was "make the game for Xbox 360 first", then "port the game from X360 to PS3", and once that was done, "port the game to PC" if they made a PC version. Xbox multiplats usually sold better (especially the online multiplayer ones), so they got even MORE attention. Since devs of multiplat games got the porting down so well, they never really put the time into figuring out how to code for the PS3 any better. They just did enough to make the games work. Mainly due to time constraints, especially with the aforementioned CoD franchise; those poor devs only had 1.5 years to make each game. You could say 2 years, but at least half of the first year would be split into making DLC for the last game, so it's not like it was full steam ahead. Why split the resources further just to learn how to code for it? They could port like usual and get just as much money. PC? It sells about the same amount each year no matter how good or bad the port is, so why not make a PC version?
Ever notice how EVERY single Wii/Wii U version of a CoD game was made or ported by Treyarch? CoD 4's Wii version, Reflex? Treyarch did it. MW3 for Wii? Treyarch. Black Ops 1 for Wii and BO2 for Wii U? Treyarch did it. If it didn't involve the Xbox 360, PS3 or PC, Infinity Ward didn't bother giving it the time of day. That should explain how little devs usually cared about learning a platform they didn't already understand.
Aaaaand wow, I went way off topic there. But it's kind of on-topic; at least you get an idea of why devs probably don't bother optimizing for PC. Most people won't care and will quite literally just throw a stronger graphics card at it until it works. It's a sad reality. I saw some streamers play Watch Dogs and Wolfenstein and they just didn't care about framerates or anything. They just ran the game, and I suppose it ran well enough that they didn't notice the random stutter (under 30fps), streamed it, and went around recommending it to everyone who asked if they should get it for PC. And that's probably how the general public sees things.