Well, I see what you're saying, but in general there were far more technically impressive games than DmC that year (TLOU, anyone?). IMO, Metal Gear Rising is way better technically.
And since those are console games you listed, DmC running at 30fps on console with so many glitches is not impressive at all.
-
As for my opinion of it as a Devil May Cry game, well, I don't want to go off topic -
It's strange how I can easily tolerate sub-60 in some games (like Crysis 3 and Dragon Age Inquisition) but not others. -
Also, Capcom made a Definitive Edition of that game for Xbox One and PS4 with all the combat changes and fixes. Some of that stuff was already in the modded PC version. -
It's a good attempt, though. But what it really shows is that the general opinion is that gamers appreciate cutscenes and eye candy over actual content. Or that a good-looking game is not just a better sell than a mechanically interesting game, but a better game, period.
Why people seem to think that a higher framerate automatically means better response and higher precision is a bit strange to me in that sense. For example, if you run Far Cry 3 at a 120Hz target framerate, you run into a choice. Either you halt the rendering threads until the information you need is actually there (which is what they did; the max framerate then swings from 40-ish to 160, which for example Total Biscuit managed to criticize as bad optimisation. He is not an engineer either). Or you drop in something cheap, and then apply further replacement passes from the game logic over it later as you get closer (extremely expensive; an alternative is to use switches between resources in vram, for example, but that's limited to swapping the resource out, you can't recalculate updates or the benefit is removed). Or you just put the Xbox fog over everything and only render properly in a bubble around the player (cheap and quick, but not very visually pleasing).
Or you do what most developers will do: build the entire game inside a sequence of very small square boxes, to "limit" the overdraw problem and the number of pixels that have to be drawn. That was how you managed it in the original Quake mods. And it's actually still how things are handled, broadly speaking.
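To make that concrete, here's a minimal C++ sketch of the "boxes" idea, with names I've made up (this is not any particular engine's code): a coarse uniform grid where only the cells near the camera are ever submitted for drawing. Quake proper used precomputed per-leaf visibility, which is far cleverer, but the principle of only drawing a small bubble of the world is the same.

```cpp
#include <cmath>
#include <cstdio>

const int GRID = 16;  // a 16x16 grid of cells covering the level

// Submit only the cells within `radius` of the camera; everything else is
// skipped, which bounds overdraw no matter how large the map is.
void drawVisibleCells(float camX, float camY, float radius) {
    for (int cy = 0; cy < GRID; ++cy) {
        for (int cx = 0; cx < GRID; ++cx) {
            float dx = (cx + 0.5f) - camX;
            float dy = (cy + 0.5f) - camY;
            if (std::sqrt(dx * dx + dy * dy) <= radius)
                std::printf("draw cell (%d,%d)\n", cx, cy);  // stand-in for submitting geometry
        }
    }
}

int main() {
    drawVisibleCells(8.f, 8.f, 2.5f);  // camera near the middle of the level
}
```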
See, if they had designed DmC to be a free-framerate game, running against vsync perhaps, with better framerate if the hardware allows, that sort of thing -- what they would have then is gameplay that feels like running around in treacle, with improbable out-of-sync animation corrections (there are many in DmC, even if game reviewers don't know they exist). Everything would feel out of sync, because it actually would be.
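For reference, the textbook way to run a "free framerate" game without everything drifting out of sync is a fixed logic timestep with render interpolation: the simulation ticks at a constant rate, and rendering blends between the last two simulated states. A minimal sketch follows, with an illustrative toy State; I'm not claiming this is what DmC or UE3 actually does.

```cpp
#include <chrono>

struct State { float x = 0.f, v = 1.f; };

// Advance the game logic by one fixed step; this is the "deterministic frame".
State integrate(State s, float dt) {
    s.x += s.v * dt;
    return s;
}

int main() {
    using clock = std::chrono::steady_clock;
    const float dt = 1.f / 60.f;  // logic always ticks at 60Hz
    float accumulator = 0.f;
    State prev, curr;
    auto last = clock::now();

    while (true) {  // the render loop runs as fast as the hardware allows
        auto now = clock::now();
        accumulator += std::chrono::duration<float>(now - last).count();
        last = now;

        while (accumulator >= dt) {  // catch the simulation up to real time
            prev = curr;
            curr = integrate(curr, dt);
            accumulator -= dt;
        }

        // Blend the last two logic states so rendering stays smooth even
        // though the simulation only ticked 60 times per second.
        float alpha = accumulator / dt;
        float renderX = curr.x * alpha + prev.x * (1.f - alpha);
        (void)renderX;  // draw(renderX) would go here
    }
}
```

The catch, per the argument above, is that what the player sees is then always a blend of slightly old states; anything the interpolation can't express (animation corrections, hit detection) still lands on the logic grid.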
And that's why no one makes these kinds of games. Under the current gold standard in system development, games like that are simply too hard to figure out. -
KEK. Care to back that up with proof, please? So in your words, we gamers prefer Xenosaga Episode 2, with its long cutscenes, over Demon's Souls/Dark Souls/Uncharted/TLOU/GTA/Mario, right? I guess Heavy Rain is GOTY, amirite?
And what is so technically impressive about dumbed-down, glitchy slash-action gameplay compared to its predecessors, aside from moving environments that have literally nothing to do with gameplay at all? If that's your fancy, I guess good for you. I find it less impressive doe.
And really? You're going to say TLOU is less impressive because of a lack of animations? You're not playing as Nathan Drake, btw. Plus they're two different games.
I also love that you defend a lower framerate as if it's a good thing. Why lower your standards? Any reason?
Good job on the wall of text doe. I mean no disrespect but c'mon, it's full of holes. -
The "wall of text" talks about consistent behavior across deterministic frames, not about absolute frame rate. -
-
As for this point, it still is bad optimization if the work is offloaded onto one CPU core while the game explicitly asks for quad- or octo-core CPUs, and it runs on the lower-powered consoles which have far, far weaker cores (meaning it cannot afford to be single-thread limited). If a quad-core is required for a game, it should not be using 80% of one core and 20% of the other three... that load is feasible on a dual-core machine, which negates their hardware requirements... which is STILL a screw-up on their part. I understand what you're saying about it generally being a small box you move in, and level of detail at distance is often a determinant of this, but interactive things (like animals) at a distance should not be that difficult to implement if the environment is for the most part static beyond the box, no? I will agree that the "fog of war" is not enjoyable at all. -
So having something like PhysX predefined is a huge help. But it is limited, so if you want to do something complex, it's neither as fast nor as dynamic as running the same thing on a generic processor. But if you want to run updates in the graphics context, you run into the problem that you need to replace the data in vram over the bus, which takes too much time in total. So the solution typically is to simplify the mechanics (such as going with static tables for curves, instead of updating, say, the curve of the car's slide based on current speed and direction in real time), so that the per-frame updates can be run from vram and calculated on the graphics card, while the tables are updated asynchronously from main ram whenever there's available processor time and bus bandwidth.
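A toy version of that static-table idea, with made-up names and a made-up friction model; the point is only the split between a cheap per-frame lookup and an expensive rebuild that can run asynchronously whenever there's spare time and bandwidth:

```cpp
#include <array>
#include <cmath>

// The precomputed curve; in the scenario described above, this would live
// in vram and be sampled by the graphics module every frame.
std::array<float, 256> slideCurve;

// The expensive part: rebuilding the table when grip/speed parameters change.
// Runs asynchronously, off the per-frame hot path.
void rebuildCurve(float grip) {
    for (int i = 0; i < 256; ++i) {
        float slip = i / 255.f;
        slideCurve[i] = grip * std::sin(slip * 3.14159f);  // toy friction model
    }
}

// The cheap part: a per-frame table lookup instead of recalculating physics.
float sampleCurve(float slip) {
    int i = static_cast<int>(slip * 255.f);
    if (i < 0) i = 0;
    if (i > 255) i = 255;
    return slideCurve[i];
}
```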
This also helps planning the project, since graphical artists and programmers can work on separate things throughout the project, and you don't need to plan a processing diagram with everyone fighting for processing cycles. Or, as would happen, having programmers say that "if you want the image to look this sharp, and the animator wants to do this, then you are looking at about... oh, 12 frames per second. And incidentally, there's a minute of idle processor time in between things right about... here".
Same thing with getting "input lag" down to actually making changes in the program in "the next frame", etc. It requires a routine that sends the normalized input from several sample points once the input is read into the program, so it can predict movement from previous data, and so on. And it's not trivial to make that run as fast as "inside the next frame"; that's just not how the program works. You run updates /towards/ every frame, but you need some sort of interpolation to make it smooth. So on top of pre-rendering and supersampling, this is why the "input lag" is there. I.e., there's no such thing as 100% response once the framerate goes over 144Hz on a modern monitor, for example. Input updates actually happen much less often, and we're squarely in the domain of "visual science", as they call it at Apple. Or "your eyes trick you", to put it another way.
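Roughly what such a routine can look like, as a sketch with invented names: input samples are timestamped on their own schedule, and each logic step consumes and normalizes whatever arrived before its deadline, rather than reacting "inside" the frame:

```cpp
#include <cstdint>
#include <deque>

// A timestamped input sample; the device layer pushes these at its own rate.
struct InputEvent { std::uint64_t timeUs; float axis; };

std::deque<InputEvent> inputQueue;

// Called from the input side, independently of the frame rate.
void pollInput(std::uint64_t nowUs, float rawAxis) {
    inputQueue.push_back({nowUs, rawAxis});
}

// Called once per logic step: take everything that arrived before this
// step's deadline and normalize it into one value for the update.
float consumeInput(std::uint64_t frameUs) {
    float sum = 0.f;
    int n = 0;
    while (!inputQueue.empty() && inputQueue.front().timeUs <= frameUs) {
        sum += inputQueue.front().axis;
        ++n;
        inputQueue.pop_front();
    }
    return n > 0 ? sum / n : 0.f;  // the frame sees averaged input, not raw
}
```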
Meanwhile, there's nothing stopping anyone from writing modules and logic within the UE3 framework that take a different approach. Such as constantly running a thread with somewhat complex logic and low response time (which Windows might then assign to one core and keep other threads off, while the thread isn't idle or hasn't released the data areas it uses). But actually programming something like that from scratch, within the limits of a normal PC bus, is a huge challenge.
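A minimal sketch of that kind of constantly running logic thread, using standard C++ threads and stand-in logic of my own invention (a real engine would manage affinity and contention far more carefully):

```cpp
#include <atomic>
#include <thread>

std::atomic<float> latestResult{0.f};
std::atomic<bool> running{true};

// The always-running logic thread: it never sleeps, so the scheduler tends
// to keep it on its own core, and it continuously publishes its latest state.
void logicThread() {
    float state = 0.f;
    while (running.load(std::memory_order_relaxed)) {
        state = state * 0.99f + 0.01f;  // stand-in for complex game logic
        latestResult.store(state, std::memory_order_release);
    }
}

int main() {
    std::thread worker(logicThread);
    for (int frame = 0; frame < 1000; ++frame) {
        float r = latestResult.load(std::memory_order_acquire);
        (void)r;  // the frame/render side just reads whatever is freshest
    }
    running.store(false);
    worker.join();
}
```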
So that's why DmC gets a nod as one of the more interesting titles technically.
Btw, yes, this is just for the sake of arguing, I'm taking the piss, and I know what you're thinking of when talking about the most visually impressive game, etc. But the technical bits here are interesting. And it's maybe interesting to know that they didn't choose 30fps as a target to avoid spending time on culling maps and optimising objects, replacing complex objects with pre-rendered billboards, and so on. Whereas a lot of other games only reach 30fps because the developer doesn't want to spend more time optimising what is sent to the rendering pipeline. I.e., most of what is drawn is never seen on the screen: overdraw, bad occlusion handling, and so on.
Heavy Rain was actually interesting in that sense, as they designed it with the intention of letting the player actively control the animation splines in real time. And it is a success from that point of view. Instead of the screen stopping to wait for a button prompt, you clicking the button, and then only watching a movie of someone rising from a chair, you follow the entire motion: the game plays back how you stand up from the chair as you perform the movement, with hesitation, or straight up. Or you stop halfway and sit down again, etc. It's very obvious when you play it that this is preferable to a plain cutscene. But the nomenclature in the gaming press essentially puts an equals sign between quick-time events and player-guided spline correction, so there's that.
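A sketch of the difference, with hypothetical names: instead of a button press triggering a canned clip, the stick scrubs a parameter along the animation curve each frame, so hesitating or reversing mid-motion falls out naturally:

```cpp
// t is the position along the stand-up animation: 0 = seated, 1 = standing.
static float t = 0.f;

// Each frame, the analog stick pushes t along the curve instead of starting
// a fixed clip; pulling back reverses the motion (sitting down again).
void updateStandUp(float stickY, float dt) {
    t += stickY * dt;
    if (t < 0.f) t = 0.f;
    if (t > 1.f) t = 1.f;
    // pose = sampleAnimationCurve(t);  // blend the skeleton at parameter t
}

int main() {
    updateStandUp(0.8f, 1.f / 60.f);   // push the stick: the motion advances
    updateStandUp(-0.8f, 1.f / 60.f);  // pull back: the motion reverses
}
```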
But obviously I want to see a flat 100Hz on any screen, where objects and input feel, and are, immediate, regardless of how complex the operation you just launched was. I'm just saying that it's not actually possible on current hardware.
And that as PC technology now gradually starts to allow it in a limited way (without sacrificing all the visuals), the shortcuts developers take simply to reach 60fps are costing you, for example, interesting animation, accurate hit detection, more intelligent occlusion detection, dynamic environments, objects that interact convincingly, background calculation that maintains world consistency outside the fov, netcode that doesn't insert 50ms of extra response time, things of that sort.
Basically - "this game is ****, because it won't get 144fps on my quadruple nvidia SLI setup from SnakeInc Overcharged Computers" is not a valid complaint. -
I love books.
I also want nipsen to just like, impart knowledge directly into my head. -
-
Ya mon
-
Nice attempt at trolling, however people like Call of Duty for its 60fps and multiplayer gameplay. Most of them don't play the campaign. Care to try again? -
Also it seems ironic that you are a huge advocate of cutscene games yet you don't list Metal Gear Solid 4 as the most impressive game of 2008. -
Yeah, that's what I figured, and he even admits it. "Let me just be different for the sake of being different and back up my point with giant posts of rubbish."
He already lost me when he called the PS4 and XB1 a PC, but "CoD is a pre-rendered cutscene" is when he went full troll. -
-
.. I'm just explaining why the OP is asking the wrong question.
And I know Call of Duty has.. at least had.. a solid MP component. Before we were so thankfully blessed with the lack of LAN play, client-side hit detection, super-platinum subscriptions worth three times the price of the full game, and closed, uneditable servers (which somehow became the industry standard.. with very little protest).
But there's nothing controversial about pointing out that all the visually impressive graphics in COD games happen during the cutscenes. In the last game, for example, when they cruise down the road in the jeep -- all the reflections, all the explosions, blood, faces, model detail (especially on the plane outside), etc., are there thanks to the fixed camera angles. They insert static pre-rendered effects. You're seeing one side of that plane, and the backside doesn't exist. Then the pre-renders of course all disappear once the game is actually played. Which happens for shorter and shorter stretches with each game, it seems..
Meanwhile, "quick-time events" in all other games are lampooned as relics from back when Japanese games weren't horrible, or something like that. And I know for a fact that big developers have been instructed by publishers to copy that type of gameplay in their games, with the justification that it will sell (I've spoken with people at a few different studios - they say the same thing: this doesn't really make sense to us. But we're just a studio, we're providing a product for our customer, and we don't own the actual project).
And that idea about "what gamers want" is still a general assumption in the industry, even when the prediction doesn't strike home, and the flat games don't actually sell.. with the result that solid studios disappear. I.. don't think it makes all that much sense, and that's where the "gamers demand this" sarcasm comes from.
It's a natural evolution in the entire industry to make engines that can easily be ported across platforms. Which obviously makes economic sense when, as remarked, gamers loudly wish for polished surfaces rather than more complex animation and model work, more efficient network code, more complex AI, better animation treatment, dynamic maps, world persistence, things of that sort. See the bit above about the limitations involved in designing games on an industry-standard bus and PC hardware for a specific explanation. -
-
-
I suppose harassing people because you disagree with them might very well be against the rules as well. But I don't really see the point of banning people with different opinions.
I mean, if it were up to me, I'd ban anyone who claims something without an argument and an explanation, or evidence, to back it up. But thankfully, it's not up to me. -
-
1994 - Quarantine.
-
And it's funny that you said you would ban anyone who claims something without any evidence to back it up, but you're doing it yourself.
I mean, DmC = most visually impressive game of 2013? C'mon, man. The question octiceps asked is clear.
I respect your opinion but I don't agree with it. -
2013 was a big year for visually stunning heavy-hitters: Tomb Raider, BioShock Infinite, Crysis 3, Metro LL, BF4, ARMA III. DmC wasn't even a blip on the radar. In fact I was far more impressed by a couple of console exclusives, Ryse and Killzone Shadow Fall.
Also, I don't know why nipsen keeps bringing up CoD. Guess you just want to bash it because it's the cool thing to do on the Internet? That franchise has been behind the technology curve since 2007, that's why nobody except you has mentioned it in this thread so far. -
-
Far from it. Plus that game can run on midrange PCs at high framerates. Crysis 3, IMO, should be the most technically impressive game of 2013. -
-
Has any game since matched the physics and interactivity of 2007's Crysis? No. Or how about BFBC2's destruction? Nope. Or the ragdolling and bullet time seen in 2005's F.E.A.R.? Nada. Isn't it sad how classics like Half-Life 2 and Portal still put most games nowadays to shame with their physics-based gameplay via Gravity/Portal Gun?
And don't even get me started on sound. It's true that most games nowadays are mixed better and employ higher quality assets due to an abundance of storage, but environmental effects (e.g. reverb and occlusion) and positioning/surround sound are much worse than they were 10 or 15 years ago. First it was Creative Labs suing, bankrupting, and eventually buying Aureal so it could kill their competing A3D tech (amazing BTW) and replace it with the vastly inferior Creative EAX. But at least that was still something worth buying a discrete sound card for, to enjoy higher quality audio in games. Then Vista came along and turned the Windows sound stack on its head, effectively killing hardware-accelerated sound forever. With very few exceptions (e.g. the Battlefield franchise), newer games sound much worse than older games. -
What about the good ol' Red Faction games in terms of destruction? The original Red Faction's Geo-Mod was pretty cool even if it was gimmicky and limited in scope.
Then Red Faction Guerrilla made destruction really, really fun. Even better when you start modding the game physics. I remember modding the weapons table to give the singularity bomb a ludicrous blast radius and velocity. The result was that any building with a bomb attached to it first became a spinning vortex of debris, and then, as the singularity collapsed and the "bomb" part of it went off, you got a nice firework display of pebble and brimstone. -
No idea, never played Red Faction games. Still I doubt anything will ever top Crysis 1 for me.
-
Give Red Faction Guerrilla a whirl if you enjoyed Crysis 1, I promise you won't walk away disappointed.
Or look at it this way: if you spend more than $6.91* on junk food each week, you might as well put that towards this game. (*shipping not included)
Armageddon was more impressive graphics-wise and equally fun, but it's more of a corridor shooter, so not sure if that's your thing.
-
Welp, it's not an FPS. On the backlog it goes.
I don't eat junk food.
Also gj linking to the Amazon retail version with GFWL instead of the Steam Edition. -
-
Linked to the GFWL edition because even with shipping it's almost half price compared to the Steam Edition. Plus I'm not a fan of Steam... -
Anything is better than that abomination GFWL (soon to be Xbox Live in W10). Yes, even Uplay. Steam and Origin are in another league.
And price is never an issue when there are never-ending sales. It seems to drop to $3-4 from Steam and other resellers frequently, plus the Steam Edition apparently has DX11 and better optimization.
-
Interesting, didn't realize Nordic was releasing a DX11 remake of this game. Seems to be Steam only though. :/
GFWL never really bothered me much, so I gave it a pass. Still prefer all my games to be "standalone" though, so if given the choice between a "normal" version and Steam version, I go for the normal version every single time.
P.S. Let's just say me and Steam have a very long history, going all the way back to its initial conception and Valve forcing it down our throats for games like CS 1.6 and TFC. Yes you could play on non-Steam servers, but it was almost guaranteed to be infested with cheaters. Plus the choice of non-Steam servers SUCKED. I used NoSteam to circumvent Steam for as long as I could until Valve finally caught on like 2 years later and banned my account from all CS-based games. That was the final straw for me and I quit MP for good, and ever since then I've avoided tying my games to Steam as much as humanly possible.
In fact I may still have the original NoSteam client somewhere on my now 9.5-year-old Compaq... -
GFWL is hardly "standalone" though since it also requires a client. The ultimate standalone DRM-free is GOG, which is my personal preference if given a choice, unless the Steam version is massively cheaper or something.
-
I know. Didn't mean to imply GFWL was "standalone", but it wasn't exactly DRM either. Also see addendum above.
-
If non-Steam servers were "almost guaranteed to be infested with cheaters" and "the choice of non-Steam servers sucked", why did you still play on them? Anyway, for better or worse, you pretty much don't have a choice nowadays since just about every AAA game that isn't DRM-free or made by Ubisoft or EA requires Steam, even if you buy a physical retail copy. (Unless ofc you sail the high seas). I have to agree with you on one point, though: Steam sucked in the beginning, and it took many years for it to not suck. And IMO it's gone downhill again in recent years. The Discovery Update was mind-blowingly stupid and Steam seems to be getting more and more bloated. I wish I could opt-out of installing components I would never use like the music player, family sharing, and broadcasting.
Just to be clear, I'm not trying to defend Steam at all. It's more of a resigned "that's just the way it is" and "monopolies suck." -
I didn't, the NoSteam client was designed to bypass the regular Steam client. The project actually started as an alternative platform to Steam, and was generally pretty terrible like I mentioned. Eventually the guys found an exploit in Steam that let you play on Steam servers without actually having to own a Steam account, and it morphed into its (then) current form.
I don't remember the details, but I distinctly recall you still had to have the regular Steam client installed, so possibly it was an injector I suppose. Anyway with NoSteam I could play on all the Steam servers without actually using my Steam account. Of course the fatal flaw in my plan was I did have a Steam account at the time for some other Valve games, so needless to say when Valve eventually caught on to these shenanigans, mass bannings ensued. I imagine things would have gone down a lot better had I created my Steam account AFTER Valve finally patched out the NoSteam exploit once and for all. The client has been dead for almost 8 years now I think, and not much can be found on the Internet about it.
And yeah, it is becoming increasingly hard to find DRM-free games these days, unfortunately. I still prefer a physical retail copy because, if nothing else, I at least have something material. Plus I like collecting game boxes, it seems. -
-
And my impression is that a lot of "gamers" either actually prefer a pre-rendered cutscene, or don't think about it at all.
So I'm challenging you to think about "good graphics" in a different way. (Or "trolling" you to think, like GTO says.)
(Btw, if you were a fan of the first DMC games (or at least the first one and DMC3:SE), then we know you're not 12 years old. Just saying.) -
Your impression is off the mark and doing gamers a disservice.
I already know what comprises "good graphics." Don't obfuscate a simple question even more. You're the one who originally admitted to trolling.