The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    The Evil Within - ridiculous system requirements

    Discussion in 'Gaming (Software and Graphics Cards)' started by octiceps, Sep 25, 2014.

  1. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Here we go again. Unoptimized id Tech 5 game, round 3.

    The Evil Within system requirements calls for some serious hardware | PC Gamer

    Rage (2011) - Broken release drivers, texture streaming issues, no multi-GPU support, 60 FPS cap, zero graphics options, dynamic frame buffer (graphics and resolution adjust on the fly to maintain 60 FPS), looks like donkey

    Wolfenstein: The New Order (2014) - Still texture pop-in, no multi-GPU support, 60 FPS cap, limited graphics options (no AA & AF), Ultra locked out on <3GB VRAM cards, looks like donkey

    The Evil Within (2014) - ?

    Amazing showing for an ancient OpenGL 3.2 (DX9 equivalent) engine. GG Bugthesda.
     
    Mr. Fox and D2 Ultima like this.
  2. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Haha, I don't care. That is one of my most anticipated games this year. I shall play it even if it means fiddling around with settings and drivers for hours and hours. I've seen some previews from YouTubers, and the game does need a decent rig.

    It's been so long since I played a horror game, and it looks awesome.
     
  3. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    But 4 GB VRAM is the minimum for optimal play even on low? I thought it would be something similar to Wolfenstein, which wasn't a badly optimized game apart from the 3 GB VRAM for Ultra. It worked fine for me. Guess we have a clear winner for most demanding game, and here I thought it would be Ryse.
     
  4. baii

    baii Sone

    Reputations:
    1,420
    Messages:
    3,925
    Likes Received:
    201
    Trophy Points:
    131
    50 GB? Oh, those poor souls with crap internet.
     
  5. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's all fine and dandy, but there's no excuse for it. id Tech 5 is less advanced than 10-year-old id Tech 4 (Doom 3 engine) from a rendering perspective, John Carmack himself said so. It sacrifices dynamic lighting and shadowing, among other things, for MegaTexture. The newer id Tech 5 games are better in terms of raw asset quality (texture resolution, poly count), but those old games run maxed out on my toaster, while The Evil Within has higher system requirements than Crysis 3?! Where's the disconnect? Obviously, devs don't know how to optimize anymore, or they just don't care.

    Of course, I have no right to preach, but PC gamers should care about stuff like this. This kind of crap goes on year after year, one AAA release after another, and it's unacceptable.

    Don't the studios/publishers know that obscene system requirements and poor optimization turn off potential customers? They're essentially shooting themselves in the foot with these practices, among others. Then they turn around and blame poor sales on "piracy" and "all PC gamers are thieves" and devote even fewer resources to PC. And so the vicious cycle continues.
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Except that CryEngine looks amazing and Ryse has much more reasonable system requirements given its level of fidelity. Whatever you may think of Ryse's gameplay, there's no denying it's basically a tech demo. If there's anything Crytek hasn't forgotten about making games, it's bleeding-edge graphics.

     
  7. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I always say they're going crazy now that they don't HAVE to optimize anymore. I rarely buy these new games anymore. There is zero reason 200GB = 4 games on my SSD (Titanfall, Wolfenstein, CoD: Ghosts, The Evil Within, all 50GB+ each, NOT COUNTING DLC). And it's not like the visuals make up for it, or game length either. It's all pure unoptimization. Look at Payday 2: they got a 30GB game down to about 12GB EASY just by optimizing how things loaded... AND it still provided benefits. Like come on!
     
  8. thesilent85

    thesilent85 Notebook Geek

    Reputations:
    0
    Messages:
    79
    Likes Received:
    8
    Trophy Points:
    16
    Well, the videos of this game on YouTube running on PS4 were stuttering.

    Maybe it's meant to be played at framerates lower than 30! lol
     
    D2 Ultima and octiceps like this.
  9. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    I saw Shadow of Mordor getting nice reviews so far, but I think the minimum is a GTX 560? That's also a demanding game.
     
  10. Rizer

    Rizer Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    9
    Trophy Points:
    16
    Oh dear, oh dear, this game is going to end up with negative reviews for the PC version. I can see loads of people buying this on Steam and then having a number of issues running it. To think this would have been the first game I'd buy on day one after years of not doing so; looks like my run will carry on. Will wait for the GOTY edition, aka the "we fixed everything" version.
     
  11. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    What worries me is that the new Doom is going to be the same sad story.
     
  12. dumitrumitu24

    dumitrumitu24 Notebook Evangelist

    Reputations:
    24
    Messages:
    401
    Likes Received:
    40
    Trophy Points:
    41
    Has anybody seen the requirements for Shadow of Mordor? 1 GB VRAM for low textures, 2 GB for medium, 3 GB for high, and 6 GB for ultra. The battle has just begun, haha, and probably next week Ubisoft will take the crown with the official AC: Unity requirements.
     
  13. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,879
    Messages:
    8,926
    Likes Received:
    4,706
    Trophy Points:
    431
    Maybe it's because I play at 720p (well, 800p on account of my MacBook's 16:10 screen), but Wolfenstein runs decently well at medium/high on my comparatively ancient Radeon 6770M with only 1 GB vRAM.
     
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    You are not allowed to turn on "ultra" textures in Wolfenstein unless you own a 3GB+ vRAM GPU. If you forced it, it would run like balls even though the GPU is sufficiently powerful. Proof here

    The point isn't that it runs well, it's that it's unnecessary to require something like this, it's commonplace these days, and it comes without a comparable increase in visual fidelity.
     
  15. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    It seems these days devs simply look at what's the most powerful thing available on the market, and then use that as a baseline for their "Ultra" requirements.
     
  16. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    I guess there is SOME use to the ridiculous amounts of vRAM in mobile GPUs huh? :D hahaha
     
    nipsen likes this.
  17. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    This makes me sick to my stomach. Never thought I'd see the day a hack'n'slash asked for 6 GB VRAM. They could at least humor us and pretend to optimize.
     
  18. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Will it even be good enough to get a GOTY version?

    I have hope because Doom will be on id Tech 6 and Carmack is no longer involved. The stupid decisions made with id Tech 5 were his fault. 10 years ago I never thought I'd say this about Engine John; how times have changed.
     
    D2 Ultima and TBoneSan like this.
  19. Rizer

    Rizer Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    9
    Trophy Points:
    16
    The game itself looks OK to me, like RE4 style. Funny, because RE4's first PC release was awful; years later they gave it some justice with the HD release. However, if there's no GOTY, bugs-fixed, more optimized version of The Evil Within, then I suspect this game will be getting 40-60 percent off regularly on Steam sales very quickly. Unless they come out with another statement, they have probably already killed sales for the game, and will probably blame piracy like Ubisoft and their crap ports.
     
  20. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    R.I.P. GTX 780M, it was good knowing ya. We had good times. Unfortunately, bad optimisation on the part of devs has forced us apart. Love you always, but GTX 980M is calling...
     
    D2 Ultima likes this.
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    How is asking for a 2+ year old card and 4GB VRAM in any way "ridiculous"?
     
  22. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's the ridiculous part. Check the top of this chart. All cards except Titan and Hawaii have <4GB. There goes 99% of your market.
     
  23. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Do you really think the game will want 4GB VRAM for low, medium, high, and ultra settings?
     
  24. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's recommended spec, which is usually "high" at 1080p (and DOES NOT even mean 60fps either). It doesn't even mean "ultra". By this logic, 2GB vRAM or so is probably only good for low-medium, similar to Shadow of Mordor. Which is a problem. It's one thing if ultra textures (AND ULTRA TEXTURES ALONE) are what need 3GB+, but they're specifically asking for a minimum of 4GB for just high specs. It'd be fine if the game looked like Crysis 4, but as far as I've seen, it does not.
     
    octiceps and TBoneSan like this.
  25. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    If they know the audience they're speaking to, something more and more devs are doing on PC, they gave out the specs needed to max the game's settings.
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Which tells me they don't know their audience.
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    If I can turn that game's SLI support off and max that game and get MINIMUM 1080p 60fps with 950/6000 on my card (better than a GTX 670's stock clocks), then I will eat my words.

    Recommended spec has always been "high" settings at 1080p and a "playable" framerate (30 or higher). I have NEVER seen a game maxed out at 1080p/60fps on recommended specs before. If they have seriously changed this, then fine. That's perfectly acceptable for an "ultra" spec requirement, especially for a game that's going to be running at similar (though a bit toned down) visual fidelity on current-gen consoles at 900p/1080p 30fps.
     
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Hey guys! Just got myself this bad boy:

    [image]
    I'm so ready for next-gen gaming!
     
    HTWingNut and TBoneSan like this.
  29. Defengar

    Defengar Notebook Deity

    Reputations:
    250
    Messages:
    810
    Likes Received:
    40
    Trophy Points:
    41
    One thing I do give consoles credit for is that later in the lifespan, devs actually learn some restraint and start trying to optimize, which spills over to PC. But 50 gigs?
     
  30. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Hey, I didn't say anything about 60fps. That's a whole different ball game.

    I only expect High/30fps out of the 670, which seems to contradict my previous statement.
     
    D2 Ultima likes this.
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    So what exactly are you trying to say?
     
  32. Dufus

    Dufus .

    Reputations:
    1,194
    Messages:
    1,336
    Likes Received:
    548
    Trophy Points:
    131
    Noooo... Send it back and future-proof yourself for a whole year with this one.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814195129

    :rolleyes:
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well then it comes down again to the point XD. If I need more than a 670 4GB to get 1080/60 on max and it doesn't look like Crysis 4... =D. Far less if it looks like any game from 2012 (which probably runs 1080/60 ultra on a single 680 2GB)
     
  34. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    ZOMG! 4GB video card! It must be SOOOO, like, super fast.
     
    D2 Ultima likes this.
  35. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    By the way, this is totally relevant right now, apparently:
    [image]
     
    HTWingNut and TBoneSan like this.
  36. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,048
    Trophy Points:
    431
    I'm having zero issues playing Wolfenstein. I don't think id Tech 5 is going to be a problem.
     
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It was totally relevant last gen too, just with Xbox 360 and then PS3 rounding out the bottom.
     
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's not exactly that the games RUN badly... it's that their demands for the visual fidelity they provide are too high. I mean, I *WILL* be able to run The Evil Within at max settings 1080p... by right of a 4800MQ and two overclockable 780M 4GB cards. But it doesn't look like a game that'll need 99% scaling from both cards with an OC for 60fps at max settings, for example. If that's what it wants to run, it should look like a game that demands so much from a system. Which it doesn't. Bethesda has some obsession with DX9-type games. Skyrim was DX9, and I believe Rage was intended to be DX9 (because I remember SOMEONE, who may or may not have been John Carmack, stating how there was still much they could do with the DX9 API) even if it ended up being OpenGL. I also believe Wolfenstein uses the OpenGL feature level that equates it to DX9. So yeah... high demands like that, ESPECIALLY from DX9-feature-level games? It is not right.

    And I think that's really the point everyone is making. It's not right. Like I was saying to Kevin earlier: if the game required those specs for 1080p 60fps at Ultra settings, then I'd say OK, sounds fair. But recommended settings is NEVER 1080/60 @ ultra. It's 1080/~40 @ "high". Which is indeed a problem as far as optimization goes. And people will yell at me and say I don't know how they optimized and I'm not a dev and whatever, but the end result speaks for itself: if the game looks enough better than what we've seen before to justify its ridiculous requirements, then it's justified. If it doesn't, then we have a problem of unoptimization/sloppy coding/all-around-uncaring-for-the-PC-crowd. And that's just no bueno.
     
  39. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    What do you think of the Ryse system reqs?

    Minimum:
    CPU: Dual core with HyperThreading technology or quad core CPU
    Examples: Intel Core i3 2.8 GHz (3220T) or AMD Phenom II X4 3.2 GHz (945)
    Memory: 4 GB RAM
    GPU: DirectX 11 graphics card with 1 GB video RAM
    Examples: NVIDIA GeForce GTX 560 or AMD Radeon HD 7770
    OS: 64 bit Windows (Vista, 7, 8)
    HDD: 26GB

    Recommended:
    CPU: Quad Core or Six Core CPU
    Examples: Intel Core i5 3.3 GHz (2500k) or AMD FX-6350 3.9 GHz
    Memory: 8 GB RAM
    GPU: DirectX 11 graphics card with 2 GB video RAM
    Examples: NVIDIA GeForce GTX 660Ti or AMD Radeon 260x or 7850
    OS: 64 bit Windows (Vista, 7, 8)
    HDD: 26GB

    Recommended for 4k gameplay:
    CPU: Quad Core or Eight Core CPU
    Examples: Intel Core i7 3.5 GHz (2700k) or AMD FX-8350 4.0 GHz
    Memory: 8 GB RAM
    GPU: DirectX 11 graphics card with 4 GB video RAM
    Examples: NVIDIA GeForce GTX 780/ Titan or AMD Radeon 290x
    OS: 64 bit Windows (Vista, 7, 8)
    HDD: 26GB
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I already discussed it in the first post of this thread: http://forum.notebookreview.com/gam...diculous-system-requirements.html#post9785053

    Long story short, it's awesome, better than 99% of games out there. Crytek is doing PC gamers a great service by being so in-depth as to list three tiers (Low, High, and Ultra+) along with specific hardware, so informed people can make direct comparisons and there's no room for doubt. And the requirements are reasonable given the level of visual fidelity.
     
  41. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    True. But at least the 360 and PS3, to a degree, showed some prowess when they launched. If not with games, at least with a GPU that wasn't out of a two-year-old laptop.

    This round of consoles is a shocker.
     
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    My only issue is the "8GB RAM recommended" thing. Until I see games using 4GB+ of system RAM, I shouldn't ever see more than 4GB (I'll let 6GB slide) of recommended RAM. Not to say that having 8GB of RAM is a bad thing, and I recommend 8GB as the minimum to almost everyone these days who even considers light gaming, but there is no way games are going to use 6GB of RAM to run (which'd require 8GB for x64 OSes as the rec specs).

    The 780/Titan for 4K res gameplay is pretty good, to be honest. And Ryse does look quite impressive. Since min spec is 720/30, which is what the Xbox 1 runs at, the HD 7770 or GTX 560 is a perfect fit for it. The specs they ask for DO actually make sense. A 660 Ti for 1080/30+ is what all games that run at 720p or higher on an Xbox 1 should need, considering the extremely weak nature of the X1's GPU.
     
  43. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    To require 8GB RAM would indicate 64-bit, would it not?
     
  44. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    They didn't shock me at all. I fully believed that they would have come out about as strong as they did... I expected the Xbox 1 to be more similar to the PS4, but I didn't expect anything stronger than the PS4. The X360 and PS3 both sold at HUGE losses on release. The X1 and PS4 did not. In fact, I could say PS4 was almost at-cost, and the online fee to play would help them make money for it instantly from most of its crowd. X360 and PS3 were quite advanced for their release, I do admit that. They got passed by desktops a short while after, sure, but not by a huge amount until say... 2008 or so (to my knowledge, anyway).

    The Xbox 1 is seriously pathetic though. I won't lie. I really can't believe they did not intend for that console's internal max render resolution to be above 900p in 2013.
     
  45. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Ryse is 900p30 on Xbone.

    Sure, at release they were much cheaper than comparable PC hardware, especially the 360, which launched a year before the PS3 with superior specs, hence selling at a massive loss. But the flagship PC hardware of the day was still more powerful: the 360 launched after the 7800 GTX/X1800 XT, and the PS3 launched after the 8800 GTX.
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I stand corrected about the res on X1. If that's the case, then those specs are too high for the PC minimum... but I don't know what settings the X1 runs at, so I can't comment. But for how it looks, it's still okay, I guess. I'd need to see what the min spec could do in terms of resolution and settings, and how well it runs.

    The PS3 was definitely stronger than the X360; devs simply could not (and still cannot) figure out how to make use of its hardware properly, is all. I have it from a dev who's worked on X360, PS3, Wii, Wii U, and PC versions of the same game, who confirmed their power relative to each other for me. I think the PS3 was stronger than an 8800 GTX for sure; I don't think the X360 was, though.
     
  47. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    How so? The 360 had a stronger GPU and much higher bandwidth thanks to its eDRAM, which allowed free 2x MSAA; hence it looked/ran better than the PS3 in multiplats.

    It also had a stronger CPU: 3 x 3.2 GHz general-purpose IBM PPC cores w/SMT (very simple, lacked OoOE, tiny cache, much smaller die size than the x86 P4/Athlon of the day), whereas the PS3 only had one PPC core and 6 SPUs, which were underutilized by most devs that were not Sony first-party and hence lay around as useless DSPs. Cell was an overhyped piece of trash whose real strength lay in HPC, not in gaming.

    To put the PS3 in comparison with the 8800 GTX is a joke. First of all, the 8800 GTX cost as much as an entire PS3 did. Second, they're not even in the same stratosphere performance-wise. The PS3 GPU (RSX) was a severely crippled off-the-shelf 7800 GTX 256MB (halved memory bus/controllers and ROPs), while the 8800 GTX was 3x as fast as a full-fledged 7800 GTX 512MB.

    As far as Ryse is concerned, I'd venture the listed minspec is good for 720p45 or 900p30 (like XB1) at Medium.
     
  48. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I can't claim to know what the PS3's specs are or were. All I know is that it is most definitely a much more powerful system than the Xbox 360... actually capable of internally rendering 1080p games (though there were only a literal handful of them, according to my friend), while the Xbox 360's MAXIMUM internal render resolution was 1060 x 620 or something similar (all higher "resolutions", including 720p, were upscales done on an internal upscaler chip). He said the PS3 is stronger, but only Sony knew how to actually code for it, because the architecture was confusing as hell and they provided no tools (the PS3 devkit was little more than a PS3 hooked up to a PC; the X360 devkit had more tools available to use; the PS4 fixed this issue straight away). So most people simply coded for X360 then ported to PS3, which is why PS3 games usually had worse quality, except for 1st-party Sony games (visually at least).

    The Wii U devkit was worse, though. The Wii U's devkit has zero software, and people have to try Wii hacks to get their games running on a Wii U. It's part of the reason why the Wii U isn't getting any games even when they're still coming out for X360/PS3. The devs don't feel like putting that much time and effort into throwing a game there when it's barely going to sell; the time/effort it'll take to even figure out the Wii U isn't worth the returns to publishers.
     
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No, what I'm saying is that the 360 had the superior hardware. It had the better console GPU with an early unified shader architecture, a more conventional CPU setup that was much easier for developers to work with, and a great toolset. The PS3 had a weaker GPU, and while Cell's SPUs could be made to do wonderful things, it was a PITA to code for, so most multiplat devs did little to take advantage of it. Since the PS3 had only one general-purpose CPU core as opposed to the 360's three, it effectively had a fraction of the CPU power if the SPUs were not programmed properly, in addition to the weaker GPU and lack of high-bandwidth eDRAM that were in the 360's favor.

    Superior hardware was a big part of the reason the 360 either tied or beat the PS3 head-to-head in multiplats; it was never the other way around. For example, you can look through Digital Foundry's analyses of the entire CoD franchise from CoD 3 onward. The 360 almost always had better image quality (usually better AA/AF) and/or a more stable frame rate.

    What you said is bogus; there is no hard limit on the render resolution of either console. You can see the resolutions of many last-gen console games here; there are a bunch that render at full 1920x1080 on both consoles.
     
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Nope, you've got that wrong there. Get a capture card that actually grabs the internal render of the X360 and it will never cross the resolution I listed (it may be slightly different, like 1050 x 640 or something, but it definitely isn't past 1080 width or 650 height). Anything above that res *IS* an upscale. And it's funny, the person I'm talking about worked on CoD: MW3 as his first game XD. He's the one who usually tells me what the consoles are like to code for. You ARE right that the PS3's hardware doesn't do as well unless you know what you're doing with it, which, as I did say, most devs DIDN'T, but again, according to him, if you COULD take advantage of the PS3, it beat the heck out of the Xbox 360. I will say colours are better on Xbox 360 though; I know at least that much is always true regardless of game.
    Also, fun tidbit: the Xbox 1 is SUPPOSED to have a max internal render of 900p and a similar upscale function for 1080p games like Forza, but I've heard from end users that the game does indeed render at 1080p. I haven't seen a firmware update that addresses the 900p max render, so my only guess is they updated the SKU or gave different SKUs to different devs or something? I'm really not sure. Eventually people will find out for sure though. My buddy was working at Microsoft at the time the X1 was released and that's how I found that bit out, but I can't find enough concrete proof for or against it anywhere.

    I never coded for either. I'm not going to present what I say as firsthand, nor claim that I know everything. I will say I heard it secondhand, though, which is why I'm not backing down on some points.

    Also, again, you're right about coding for X360 being a lot easier. But that's partially why PS3 ports were so bad. See, they never really got any time or attention. It was "make the game for Xbox 360 first", then "port the game from X360 to PS3", and then once that was done, "port the game to PC" if they made a PC version. Xbox multiplats usually sold better (especially the online multiplayer ones), so they got even MORE attention. Since devs of multiplat games got the porting down so well, they never really put the time into figuring out how to code for the PS3 any better. They just did enough to make the games work on it. Mainly due to time constraints, especially with the aforementioned CoD franchise. Those poor devs only had 1.5 years to make each game. You could say 2 years, but at least half of the first year would be split into making DLC for the last game, so it's not like it was full steam ahead. Why split the resources more just to learn how to code? They can port like usual and get just as much money. PC? It sells about the same amount each year no matter how good or bad the port is, so why not make a PC version?

    Ever notice how EVERY single Wii/Wii U version of a CoD game was made or ported by Treyarch? CoD 4's Wii version, Reflex? Treyarch did it. MW3 for Wii? Treyarch. Black Ops 1 for Wii and BO2 for Wii U? Treyarch did it. If it didn't involve Xbox 360, PS3, or PC, Infinity Ward didn't bother giving it the time of day. That should explain how little devs usually cared about learning a platform they didn't already understand.

    Aaaaand wow, I went way off topic there. But it's kind of on-topic; at least you get an idea of why devs probably don't bother optimizing for PC. Most people won't care and will quite literally just toss a stronger graphics card at it until it works. It's a sad reality. I saw some streamers play Watch Dogs and Wolfenstein and they just didn't care about framerates or anything. They just ran the game, and I suppose it ran well enough that they didn't notice the random stutter (under 30fps), streamed it, and went around recommending it to everyone who asked if they should get it for PC. And that's probably how the general public sees things.
     