Back when Half-Life 2 came out, there was a lot of buzz about physics and AI in computer games, and it looked like good physics, AI, and interactive environments were going to be just as essential to first-person shooters as good graphics. Games like Crysis, Far Cry 2, and Red Faction: Guerrilla suggested that physics and interactive environments were going to be the new norm.
Fast forward to the present day, and video games have taken a huge step backwards. Just look at the sequels to Crysis. The AI, physics, and semi-destructible environments were totally scrapped for crap-for-brains enemies and static environments. The third game wasn't any better.
I play a game like Rage, where I throw a hand grenade into a garage filled with stocked shelves and cluttered tables, and none of the objects move an inch. It totally negates the sense of realism imposed by the wonderful textures and lighting effects. It's more offensive to my sense of realism than low-res textures or crummy static lighting. Takes me back to the days of Doom and Quake. I've always prided the PC gaming community on being more intelligent than the console crowd and on driving innovation, but it seems like an emphasis on custom hardware has turned too many of us into graphics mongers who will buy anything that looks good in a screenshot and justifies the existence of the mega-rigs many of us like to build.
Seriously, AI and physics are two of the most CPU-intensive aspects of video games... Crysis still has some of the best of both, and it could run on the PENTIUM 4 in my sig (it didn't look good, but I got a good framerate, and the AI and physics weren't affected). And yet despite the awful AI and painfully static environments, I highly doubt that same computer will run Crysis 2 or 3 at all. Every now and then we get a game like BF4 that gives us believable environments, but we don't get them often enough. Modern gaming kinda sucks.
-
I think it's more a bell curve than a decline, but I see the point you're trying to make.
Games are designed by developers, and developers design said games so that someone, or some entity, makes a ton of money. Once the first game is out, a surge of money falls from the sky, and their bosses want that "surge" of cash flow to continue: quick, hyped-up cash flow. Sequels do not allow for as much profit because they take just as many resources and just as much time as before, but now there is a lesser chance that people will be willing to buy the second game. Rushing never gets better results.
Games like Battlefield do not necessarily follow this logic because of their size (player base). Although there were millions of dissatisfied customers when BF4 first launched, I'm sure if BF5 were to come out, people would be skeptical about purchasing it, having seen what happened to BF4 and BF: Hardline. Luckily for them, they don't launch a game every year or two like other companies do. They can suck BF4 dry for years before having to make something new. -
-
I think it's more down to the fact that most of these earlier titles (Half-Life 2) were 'PC' games through and through; now we see 'one size fits all' for consoles and PCs, hence why I think we get the dumbing down. Pretty graphics take centre stage (that's because we are all fickle and buy the best-looking games), but yes, physics and AI seem to have stagnated.
-
londez, I agree with your post. Right now it's hard to find a decent game with good gameplay. Lately developers have been focusing on the graphics part of the game. Even in reviews, many game reviewers focus on graphics rather than gameplay.
-
King of Interns Simply a laptop enthusiast
Then again, the newest consoles are much more powerful than the last generation, so hopefully we get to see improvements in more than just graphics. Even graphics don't look that amazing these days, though. Lighting effects and textures, sure, but not much else.
-
Gameplay plus good graphics plus a good story equals many gigabytes. For now I think it will be difficult because of limitations in storage and processing power; that is, if you want a game with good graphics, good gameplay, and good physics, it might cost too much storage and CPU power.
-
londez, I think you are just not realizing what game companies consider to be in demand from the masses. You could make the world's most advanced tech demo with the best AI and a destructible, mouldable world, but if the game isn't fun then it won't sell. Crysis 1 may have been more advanced for its time, with Crysis 2 being a big step back and Crysis 3 being about right (it had everything 1 did except destructible environments), but 2 and 3 made more money than 1.
With the next-gen consoles we will hopefully see the likes of Crysis 1 again, but remember that the budget and time spent on these cutting-edge games is a massive investment that few want to risk. The money makers at the moment are open-world games, but ones like GTA, which lives through lots of scripting, or The Witcher 3, which will probably use a lot of AI; neither one features destructible environments because... well, let's just say destruction of urban buildings still requires massive resources, since they are complex objects. Also, mobile gaming is a big money maker now, and THAT is more like trying to fit what we can do now into hardware from eight years ago. -
LOL WUT?!? Crysis 2 and 3 are completely linear. They can't compare at all to the original. The only thing Crysis 3 has over the second one is graphics, otherwise both have the same crappy gameplay and generic storyline. And it's easy to make Crysis and Warhead look much better than Crysis 3 with mods.
-
King of Interns Simply a laptop enthusiast
Agreed, Crysis 1 was by far the best of the series! Unfortunately the AI was dumb, but otherwise it was excellent!
-
I think AI has really taken a back seat. I think Valve does it best here; in HL2 and L4D the AI is excellent. I'm disappointed that increasing difficulty in games only means taking more damage and getting less to work with. I'd like to see increased difficulty actually smarten up the AI too, something like the sketch below.
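To make that concrete, here's a minimal, purely hypothetical sketch (not any real engine's API; every name and value is invented) of difficulty as a set of behavior knobs rather than just a damage multiplier:

```c
/* Toy sketch: scale AI behavior with difficulty instead of only
 * turning enemies into bullet sponges. All names/values invented. */
#include <stdio.h>

typedef struct {
    float health_mult;    /* the lazy knob: more hit points          */
    float reaction_time;  /* seconds before the AI returns fire      */
    int   can_flank;      /* whether squads attempt flanking         */
    int   uses_grenades;  /* whether the AI flushes you out of cover */
} DifficultyProfile;

static const DifficultyProfile profiles[] = {
    /* easy   */ {1.0f, 1.50f, 0, 0},
    /* normal */ {1.0f, 0.80f, 1, 0},
    /* hard   */ {1.2f, 0.40f, 1, 1},  /* smarter, not just tougher */
};

int main(void) {
    const DifficultyProfile *hard = &profiles[2];
    printf("hard: react in %.2fs, flank=%d, grenades=%d\n",
           hard->reaction_time, hard->can_flank, hard->uses_grenades);
    return 0;
}
```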
-
You have to remember they are targeting the 13-21 age demographic, where most of the money is coming from.
This is also the same group, most of whom will preorder because of some fancy pre-rendered video, only for the game to be forgotten 30 minutes later.
Rinse and Repeat.
What is the difference between a pile of cash and a very large pile of cash?
A pile of cash could be enough to make another game, but a large pile of cash can fund 3 or 4 games.
Why bother fixing the cookie cutter when it works for the next "big" hit? -
I can't believe you guys enjoyed the original Crysis more than the sequels. If I had to rank them it would be Crysis 2 > Crysis 3 > Crysis 1. I couldn't even finish Crysis 1 because it was just so boring and I didn't feel engaged. Yeah, Crysis 2 is a corridor shooter, but it was well done and I enjoyed it immensely. Crysis 3's campaign was far too short, but the plot was engaging enough that at least I cared enough to finish it. Anyway, these are just my own opinions, and we all have our preferences.
-
Tinderbox (UK) BAKED BEAN KING
Same order for me; I have played all the Crysis games.
John
-
All the games you've mentioned so far are action games. While it's possible that AI is taking a back seat in that genre, there's plenty of interesting stuff going on in other genres. I'm mostly a strategy game player, and there have been plenty of interesting recent or upcoming games whose AI is still as good as or better than that of their predecessors (if any). Examples include Europa Universalis, the upcoming GalCiv3 if it lives up to expectations, and, while not a player-vs-AI game, Banished, with its challenging environment. Sure, there are a few rough patches (Civ5, whose AI is quite poor at combat, and SC5 with its pathfinding issues, though those might be fixed by now). But overall, particularly if you're willing to look past the AAA titles, there are a lot of really good games with challenging AI.
I suspect one of the reasons the action genre may be suffering is the increasing focus on multiplayer, which is particularly strong in action games. If 80% of your players' playtime is going to be multiplayer, you probably won't put tons of effort into the AI. Whereas if, in 1999, only 30% of expected playtime was going to be multiplayer, you'd better put a lot of effort into good AI and a good single-player experience. Action tends to be more multiplayer-oriented than many other genres. I read somewhere recently that only 15% of Europa Universalis IV players play multiplayer at all, so you're looking at maybe 5-10% of total playtime spent in multiplayer. It may vary somewhat by game, but when that high a percentage of playtime is single-player, a decent AI is a necessity. -
killkenny1 Too weird to live, too rare to die.
-
I completed and enjoyed Crysis 1, except for the spaceship levels. Crysis 2 was waaaay too linear for me; there was NO path branching at all. Also, the constant graphics shortcuts, like global reflection maps used everywhere, turned me off. I am liking Crysis 3 a lot, but I haven't had time for it over the other games I currently have time invested in. It really seems to be what 2 should have been, though.
-
Crysis 3 was waaaay too short, and it also doesn't have that glitch from Crysis 2 that allows you to carry more than two weapons, so I like it less.
-
Even visually, games nowadays tend to lack a bit at the texture level. While geometry gets more complex, if you get close to, say, a tree in both Crysis 2 and 3, you'll notice how the textures start to get blurry and undefined. It happens with other "HD games" too. That is a side effect of console versions getting ported. With the powerful hardware these days it's a pity.
Story-wise, ideas are becoming scarce too, and scripting takes a lot of effort (money and time). The last PC game I remember having a truly great story is StarCraft.
Also, technically, having a destructible world is great and adds a lot of immersion to a game's experience. Even NPC and object interaction is essential. The problem is that companies need fast revenue, so time becomes a more critical factor. That's why it's seen as a plus, not as a must.
It comes down to how our society is morphing, willingly and unwillingly. -
King of Interns Simply a laptop enthusiast
Wolfenstein has a pretty decent story. Also, the environment can be destroyed to some extent.
As for Crysis, I was hoping for the open-world feeling of 1; unfortunately only Crysis 3 was on the same level as 1, and it didn't impress.
The Far Cry series is where it's at for open world now! Totally excited for the fourth installment, as the third beat Crysis in everything except destructibility. -
If Far Cry 3 had a destructible environment it would have been awesome. One thing I didn't like about Far Cry is that all the bad guys usually wore red or yellow. How hard would it be to at least flood-fill their t-shirts with different colors once in a while? Something like the sketch below would do.
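For what it's worth, even a trivial random tint at spawn time would break up the uniforms. A toy sketch (purely illustrative; no engine API is being quoted, and the value ranges are made up):

```c
/* Toy sketch: assign each spawned enemy a random shirt tint instead
 * of everyone wearing red or yellow. Ranges are arbitrary. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

typedef struct { unsigned char r, g, b; } Color;

static Color random_shirt_tint(void) {
    /* Keep tints in a mid-range so they still read clearly in-game. */
    Color c = { (unsigned char)(64 + rand() % 128),
                (unsigned char)(64 + rand() % 128),
                (unsigned char)(64 + rand() % 128) };
    return c;
}

int main(void) {
    srand((unsigned)time(NULL));
    for (int i = 0; i < 5; i++) {
        Color c = random_shirt_tint();
        printf("enemy %d shirt tint: #%02X%02X%02X\n", i, c.r, c.g, c.b);
    }
    return 0;
}
```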
-
Play Battlefield Bad Company 2 if you want destruction as a meaningful game mechanic. The game is still very much alive and well.
BC2 is the holy grail of infantry combat in the Battlefield franchise as far as I'm concerned. Everything from the movement to the ragdolls to the gunplay to the pacing and flow of the maps, especially in Rush, is the best in the series, not to mention the signature destructibility. Oh yeah, and it actually has decent hitreg and netcode.
Plus it has the best Battlefield campaign as well, which admittedly is not saying much. At least the AI is passable on the harder difficulties. They'll actively try to find you and flank you instead of standing in one spot eating your bullets. And it is somewhat nonlinear if you go out of your way to explore alternate paths.
As far as AI is concerned, I don't think anything has topped the original Halo yet. Completing the Halo PC campaign on Legendary is a rite of passage that I think every FPS aficionado should experience at least once. It is brutally difficult and those Covenant and especially Flood enemies are some of the smartest, if not the smartest, AI you'll ever come across in the genre.
And let's not forget that the physics engine was equally impressive for its time. The inverse-kinematics ragdolls on the models meant that they could contort and conform to the environment after coming to rest without too much clipping, which was definitely a rarity at the time. And it's pretty amazing just how much Bungie got right with the handling and inertia of the vehicles and how seamlessly the camera transitions from first- to third-person. Nowadays, game devs can't even be bothered to code an actual mount-and-dismount animation for vehicles like the nifty little one that was in Halo 13 years ago. -
The thing is, it didn't have to be like that. Around 2006, a certain company hired IBM to dust off the RISC designs from the '70s. The modern requirements were not just concurrency for completely asynchronous tasks, but execution cycles so short that it would be possible to run algorithms on each of the processing elements within one clock cycle. The product was, in essence, an interconnected bus with various processing elements working on a fully coherent cache, while the execution times would still be so short that response to various inputs and outputs would not be an issue.
Sounds like science fiction, obviously. After all, right now we have PCs with a single pipe towards an industry-standard bus, while the memory controller can have at most one concurrent stream at a time to a specific element on the bus. DMA on a PC is actually a high-level abstraction; the actual transport still has to obey the memory controller's limits, of course.
So a lot of people were skeptical about whether this alternative integrated bus system would even be possible. And if it was possible, whether it would cost too much. And if it wouldn't cost too much, whether it would require specific software and compilation routines that made it cost-ineffective, etc.
Turns out it was actually cheaper to make, in per-chip cost, than your average Intel-based part (which is still cheaper overall due to its production lines being established, of course). And suddenly you had a real system, for the first time, with interconnected processors that could each access a memory area within the same clock cycle. In effect, you had the ability to write your own instruction set with complex math and have this instruction set applied to memory areas each clock cycle, rather than having the CPU take a piece of memory, do a math operation on it (the shorter the better, and with a fixed execution time, obviously), and then resubmit it to the graphical context, which would have to be recalculated (which is incredibly expensive if your optimized code relies on assumptions about object states to run fast).
And to top it off, that system was actually cheap enough to roll out in the consumer market. It was demonstrated later that not only could the system run unmodified "PC" code with minimal performance cost, it could also run unmodified shader code for graphics on the extra processing elements. So the system could be set up, as advertised, either to focus on physics and complex node traversal, etc., or to run complex math on graphics contexts (per-object occlusion detection based on complex math, rather than per-pixel detection, was demonstrated), or to strike a balance between the two.
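The programming model being described maps roughly onto a job system: small kernels applied to chunks of memory by separate processing elements. Here's a rough analogy using plain pthreads on a PC; this is NOT Cell SDK code, and every name in it is invented for illustration:

```c
/* Rough analogy to the "jobs over processing elements" model the post
 * describes: split a buffer into chunks and let each worker apply a
 * math kernel to its chunk. Invented names; not Cell SDK code.
 * Build with: cc sketch.c -lpthread -lm */
#include <pthread.h>
#include <stdio.h>
#include <math.h>

#define N_ELEMS   1024
#define N_WORKERS    4

static float data[N_ELEMS];

typedef struct { int start, count; } Job;

/* The "complex math instruction set" applied to a memory area. */
static void *run_job(void *arg) {
    Job *j = (Job *)arg;
    for (int i = j->start; i < j->start + j->count; i++)
        data[i] = sqrtf(data[i]) * 0.5f;  /* stand-in kernel */
    return NULL;
}

int main(void) {
    for (int i = 0; i < N_ELEMS; i++) data[i] = (float)i;

    pthread_t workers[N_WORKERS];
    Job jobs[N_WORKERS];
    for (int w = 0; w < N_WORKERS; w++) {
        jobs[w] = (Job){ w * (N_ELEMS / N_WORKERS), N_ELEMS / N_WORKERS };
        pthread_create(&workers[w], NULL, run_job, &jobs[w]);
    }
    for (int w = 0; w < N_WORKERS; w++)
        pthread_join(workers[w], NULL);

    printf("data[100] = %f\n", data[100]);
    return 0;
}
```

The real hardware got its speed from each processing element owning fast local memory fed by DMA rather than from shared threads, but the division of work is the same idea.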
Suddenly modern gaming doesn't suck after all. And the PS3 actually did see some incredibly good titles made for it, legitimate pieces of nerd art that, unfortunately, we will probably never see again.
Because the market demands that development of games and coding budgets be as small as possible. It also turns out that the titles with the least interesting tech, where the most money goes into pre-generated scenes made by outsourced FX companies as well as marketing and advertising, sell a hundred times better than games with interesting visuals.
In fact, if you talk to people at Sony about the PS3 games, which I have (I am not making this up), they will tell you that their PS3-exclusive games, even the ones that were critical as well as moderate commercial successes, were "confusing" to their focus groups. There was too much fiddling around, and the usual "lock on automatically" while jumping around chest-high walls was much better (note: chest-high walls are used for cover animation because all the animation is static and keyed to one specific height).
Effects in many cases were actually taken out. I'll take one example: the sand in Journey. Thatgamecompany never actually explained why, specifically, but they admitted later on that the sand effect (a wave, as if walking in a soft, slightly thick weave, which fit the aesthetic of the game completely) was taken out because certain people insisted that it looked confusing. This was an extremely expensive effect in terms of algorithm execution, and it was completely unique to see something like this in a game that wasn't generated from a static resource. But they axed it in the end because the powers that be found it confusing.
The narrative being, as explained, that "improved effects" and science-fiction algorithm tech in games just don't pay off. There's no other application for it either, nowhere that a hub of linear systems with a slower bus can't do the same tasks with differently structured algorithms and lower latency requirements before each input.
So that tech is retired again, in favor of linear systems that really do have a ceiling on what sort of effects can be made.
And we're seeing that in the games and programs being developed now. Even if there were a possibility to create more advanced algorithms, with larger worlds and better reduction algorithms based on higher-order maths, there's no interest in doing it, because: 1. the current industry standard still sells, and cheaper development is always better; 2. there is a very present narrative going around that complex math and complex graphics are actually worse than more simplistic models.
It's BS, of course. When you can have a super-complex model reduced, in real time, to the target detail that fits the rendering pipeline hardware, you can essentially replace the CGI budget with a couple of lines of code. And doing that in a dynamic, player-controlled context is obviously a thousand times better than stalling development for weeks and spending half the budget on a CGI sequence.
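The core of that idea, picking how much of a complex model to render based on what the output actually needs, is just level-of-detail selection. A minimal sketch, with invented thresholds and a crude projection formula, assuming nothing about any particular engine:

```c
/* Minimal LOD-selection sketch: choose mesh detail from an object's
 * approximate on-screen size. Thresholds and names are invented. */
#include <stdio.h>

/* Rough projected height of an object on screen, in pixels. */
static float screen_size_px(float object_height_m, float distance_m,
                            float fov_scale, float screen_height_px) {
    return (object_height_m / distance_m) * fov_scale * screen_height_px;
}

static int pick_lod(float px) {
    if (px > 400.0f) return 0;  /* full-detail mesh     */
    if (px > 100.0f) return 1;  /* reduced mesh         */
    if (px >  20.0f) return 2;  /* heavily reduced mesh */
    return 3;                   /* imposter / billboard */
}

int main(void) {
    for (float d = 5.0f; d <= 500.0f; d *= 10.0f) {
        float px = screen_size_px(2.0f, d, 1.0f, 1080.0f);
        printf("distance %6.1f m -> %6.1f px -> LOD %d\n",
               d, px, pick_lod(px));
    }
    return 0;
}
```

Real pipelines drive this with screen-space error metrics and continuous tessellation rather than four hard buckets, but the principle of trading detail against what the render target can actually show is the same.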
But that is what the market wants. So that's how it's going to be. -
The number one problem right now is that developers are developing for the console crowd. It isn't even the hardware that is the main problem.
The main market, in their opinion, is low-brow ADHD forever-children who have an attention span measured in milliseconds.
The problem is... those people do not care if the game is spectacularly awful as they will never make it that far into the game. Most won't even finish the opening intro video. This is the problem... the hype and a barely functional game are all that is necessary to sell games right now.
Look at formerly untouchable gaming giants like Blizzard... Activision bought them out, fired all the decent game designers, and their IPs have been targeted more and more at single-digit mentalities and droolingly low IQs.
PC gaming is developing separately from consoles now, as crowdsourcing and PC modding communities are excelling at raising current games to astronomical heights. PC gamers are so starved for good content that they have taken to making it for themselves. The results prove unerringly that game development companies are leaving good talent unemployed.
In my opinion, the rise of "online DRM" is because Activision and other game design companies are afraid that the average PC gamer and the modding communities will make their whole design teams look like a team of retarded monkeys.
If you are a game developer for PCs... I beg of you... embrace the modding community and make a solid product designed with PC gamers in mind then let it blossom from there. -
But the development houses that survive and earn money, and that produce to the full extent of the studio's capability (as in, they always have everyone engaged in something that pays the bills at all times), are studios that rely on business-savvy leadership and business-effective choices.
And what has proven to be the case over the last ten to fifteen years or so is that the only way to make efficient use of a studio like this is to focus exclusively on the biggest niche markets, or the most uniform markets that are most predictable.
See, publishers and managers at successful studios who read things like what you wrote here just laugh, because they know it's not the case. But they also know it's a demonstrable fact that imaginative and dynamic approaches alienate and fragment the audience they need to target in order to make the business sustainable.
You could argue that they shouldn't care about short-term sales as much, and that they should allow studios to f*** around more to get the more creative stuff out. That they should focus on a sustainable model in the sense that salaries get paid, instead of the studio needing to earn more and more money the more money is spent, etc.
But the truth is that the big AAA publishers and developer teams have been incredibly successful at targeting the biggest niches. And the people who essentially take it up the *** with DRM, who accept a mod of a title (like BF: Hardline) as if it were a new game, who accept paying up to three times the price of a full game in order to see their stats online, or for early access to content that will be released as a full game anyway, people who pay for beta-test access, that sort of thing.
This audience is not the biggest market by any means. But it is the biggest and most consistent niche. So you can't fault a business-savvy person for picking this market and focusing exclusively on it. Same as Apple did with the iPhone and all the other i-products. And we all know how badly business is going for Apple.
And so what if the average playtime for these people, once they've bought the games, is 3 hours (offset by the 9000+ hour grinders who tattoo "Activision" on their backs, etc.)?
The developers and publishing partners still earn money from those people, in a way that they don't from a more finicky and discerning audience.
And anyone who commits to a business model like that is bound to reach the same conclusion; it's inevitable.
Meanwhile, the solution isn't to only buy indie games either, or to only go for crowdfunded products, etc. The truth is that most indie games don't have a sustainable business model. They might produce brilliant games, but the studios can't afford to stay open. I've paid a lot of tip money for games I like, up to the point where it's not really tip money any more. But the studios who make these games can't sustain themselves on that. In the same way, a large high-quality production overseen by a large studio can tempt with creative solutions, but it's not going to consciously choose to alienate people/customers in order to get in something artistic. The crowdfunded studio could insist that it avoids this by choosing its audience beforehand, but it's not going to risk the same problem; it must adhere, perhaps even more slavishly, to majority opinions. Star Citizen and a few other projects in a sense avoid that by essentially putting a sign on the front door that says "pay me money to make something I want". But in reality they need to answer to the demands of their audience very quickly. And that's not a creative environment, when you need to justify your pitch not just at the beginning of the project, but every step of the way afterwards.
...I'll end the hour-long rant now. But the solution here is to get more studios to be self-financed, so they can develop their own products and hire out excess workforce to publisher-sponsored projects. That way the studios can spend time on their little Frankenstein babies and develop what they wish full-time, with a full studio (or a partially engaged one), over longer development cycles of the kind that normally would not be possible, since they can hire out their services to publisher-sponsored efforts when the resources are not needed at the studio.
This goes for simple things like having more than a month for all the writing in the project, for example. Or allowing a more sequential development of scenarios and locations in the games. Or having a huge initial bout of coding and foundation tools, exploring what the limitations of the graphics engine are, and so on.
Rather than relying heavily on serial production of models and locations without really seeing any of it in action until it's near-final, or being forced to work within the limitations set by the toolset/middleware while looking at it from a high, theoretical level.
And, you know, there are studios like that, studios that are very close to ending up in a situation where they can do things like this without having to go the route of a five-year cycle with tent-pitchers (see: the Joe Danger devs who are now making No Man's Sky). These devs exist, but they're not exactly highly advertised in the gaming press, or in any industry press at all.
So again, it's not that there's no market for games made for an audience that doesn't want to feel insulted at every point from the purchase to the credits screen. It's that the currently chosen business models dictate that it's necessary to focus on the largest niches.
And you'd be surprised: lots of people inside the industry would love to see that change. But they don't dare undermine their own business by trying to change focus, or perhaps spearhead an effort to advertise more broadly. Because the ones who try, and they exist, and have existed, end up poor and forgotten. -
They built a company upon those things and then sold it when the time came. The thing is, the buyer was Activision... with Bobby Kotick at the helm. That guy and "savvy" do not belong in the same sentence... ever. Google him and come back with your tail between your legs. Blizzard is dead, and Activision is doing unspeakable things with the remains.
Blizzard's example shows it is more than possible to make great games for PC and make money doing it. The problem is... the business-"savvy" people you speak of are afraid of innovation... they play only when the odds are severely tipped in their favor, and they have found a nice fatted calf in the ADHD-generation console crowd.
The answer for PC gamers is to simply stop paying for awful games. Crowdsource good games and ignore the braindead console games for what they are... junk.
If the industry wants to develop garbage... let them HAVE the console crowd. "Ooh, pretty colors" has business downsides too... never mind the fact that those customers tend to Darwin themselves.
Short term gains... long term failure.
Think not? WoW is fading (it's pretty much only for single-digit mentalities now), the Diablo franchise is pretty much dead, and StarCraft 2 is not a failure, but it is lower quality than StarCraft 1. Kotick has taken a dynasty and turned it into a massive rotting corpse. Their old-school fans are pissed. You think that won't hurt? You think it doesn't matter that millions have abandoned Blizzard? Sure, many companies would love to have the scraps that Blizzard is now... but given what it was, you don't think this is a loss?
You are right that a single pissed-off customer is something executives laugh at (indeed, it's impossible to prevent).
However, we aren't talking about one customer... we are talking about thousands or millions; enough to crowdsource extremely ambitious projects like Star Citizen.
The new order of PC games will ignore the traditional development and publishing routine. It will be crowdsourced, advertised online, and sold online. That's a lot of traditional people who normally get a cut now getting a big fat goose egg. The worst part is... the PC crowd was perfectly OK with buying the old-fashioned way... the industry abandoned them. -
The only reason for Diablo 3 being "bad" is that it's not the same team that did the first ones, end of story. -
(in case anybody feels the need to give me a lecture on ethics and morality, this is meant to be a tongue-in-cheek response) -
They choose not to.
As for your suggestion, n=1...
Morality aside, if the game is good I have no issue paying for it; in fact, I'd prefer it. I'd like to reward the game company that made it and help with the resources to make another. If it blows chunks, my time is better spent doing something that doesn't make me want to slam someone up against the wall until they stop moving. -
It seems like gaming is going backwards as technology gets better
Discussion in 'Gaming (Software and Graphics Cards)' started by londez, Aug 1, 2014.