The author seems to have completely missed the biggest gaming sensation of recent years - Minecraft is exactly the kind of game he's talking about (so is Kerbal Space Program). That list of "classics" also seems very parochial - more like "the best games when I was growing up" than any kind of "all time greats".
Freedom and choices can be used as artistic elements. I'd cite e.g. Saya no Uta or Phantom of Inferno as the purest form of this - the interactivity of these "games" is absolutely minimal from a conventional "gaming" point of view, but it's vital to the narrative. You couldn't make these as movies, because the whole point is to make you complicit in what's happening, because the outcome is a result of your choices.
But not every story has to be about such things. Many of the best-loved gaming classics - Ocarina of Time, or even FF7 - are cinematic games that may have puzzles (almost minigames, really), but whose overarching narrative is purely linear.
If you can take a movie, or a movie-like narrative, and by sprinkling a few puzzles or quicktime events turn it into something more engaging, a better way to tell your story - why the hell not? Why is that not a perfectly valid form? Criticizing a game for being cinematic seems as pointless as criticizing a sculpture because it could have been done as a painting.
Yeah, when it started claiming that the games of yesteryear had mythical qualities that made them better because they left some lasting impact on the player, that's when I gave up on the essay.
We experience games very differently as adults than we did as children. Partly because of the child's mind, but partly because we no longer have the kind of time to completely pour ourselves into a game like we used to. I remember playing Master of Orion 2 for so long that I would hear its music in any kind of white noise.
No modern game, no matter how good, can measure up to that kind of adolescent commitment to a game.
I think he had good points about games like Thief and System Shock 2 (and Deus Ex, which wasn't mentioned). The games were more open-ended than any shooter today, but more limited than "sandbox" games. They gave you a constrained world with a large number of possible paths, so that you could comprehend the options and make meaningful choices, even replaying to try them again.
The opposite of this would be something like Skyrim, where you are plopped down in the middle and proceed to basically hop around randomly, while individual quests are quite limited, repetitive, and linear compared to the scale of the world (99% of a dungeon quest is pressing forward through the tunnel towards the HUD marker).
For so many games of the era, that was a bug, not a feature. How many hours did our generation spend lost among poorly-rendered flat walls, trying to figure out where the heck the next content was? I'm actually one of the few gamers who was excited to see the move towards linear maps. I spent enough time in Doom and Hexen living in the Automap view; I'm done with that.
That seems like an excuse for laziness, or for not wanting to put much thought into a game but rather just passively consume it. Quality level design was one of the key things that made '90s FPSes so great, and it's what's sorely missing in today's games.
To an extent, this might be attributed to the simplicity of engines and level editors at the time (I don't think I've seen an easier map editor to use than Build), but level design has definitely been dying over the past decade or so. Procedural generation is set to kill it off completely, but at least it can still offer non-formulaic environments. Potentially.
I'm perfectly willing to play a mentally challenging game. There are plenty of good puzzlers out there that I've enjoyed. What I wasn't willing to do was spend half my play-time staring at the map. Many '90s "shooters" were as much "cartography simulator" as "shooter". It wasn't wrong for a few games to include this as a major element, but too many games aped Doom's and Descent's labyrinthine map designs.
Doom did it well, and that does make a big difference.
As a different example, compare early Burnout with recent Burnout games. The early games gave you tracks you had to drive on; if you failed, it was really quick to try again. Later Burnout games give you a city to drive around, with tracks overlaid on it. You have to be at a certain point on the map to trigger the start of each event, which sucks when you fail and have to drive back to the start to retry.
I find it really bizarre that someone could get through a big long rant about why games today aren't like System Shock 2 without once mentioning Bioshock and Bioshock Infinite.
Because they're beneath mention from a design perspective?
There's no inventory management system, there's little stealth ability, there's no threat of player death, there's no backtracking, there are no maps, etc.
System Shock 2 got a lot wrong, but I don't think it's worth mentioning the "spiritual successors" from a design perspective, as they're strictly a step backwards.
I think what you're stating as fact is actually more a matter of opinion. Take the differences between Mass Effect 1 & 2. One of the big differences is the lack of an inventory system in the sequels. I happen to feel that the inventory in Mass Effect 1 was a bolted-on mess, a wart in every sense. Removing it from the sequels, in my opinion, was a step forwards, not backwards.
Basically, less stuff doesn't always mean worse design.
There's a subset of gamers who seem to think there's no such thing as a bad feature - that cramming more stuff to do into a game invariably makes it better. When a feature that is completely orthogonal to the best parts of the game is removed, they cry foul.
So, we're talking about the various Shock series here, not Mass Effect--I'm not commenting on them, so let's not waste our time there. Game design is also a matter of opinion, so yeah, that's where I'm coming from, although everything I mentioned is a plain statement of fact about their design elements.
From a design standpoint, removing the inventory and skills system (and replacing it with a clunky plasmids/tonics/hat system) removed the ability to permanently change your character and evolve them, and also to easily temporarily change out your skills and abilities. Cybermodules (skillpoints) in SS2, once spent, never come back. You can't respec, and so if you decide to build a melee or stealth character, you really have to develop it.
In Bioshock, though, you can switch out components and plasmids and upgrades, and in Bioshock 2 you can do much the same--in effect an extended inventory system, like the chests in Resident Evil, with a clunky mechanism for using it. The character never undergoes irreversible build changes, yet you can't just drop into your inventory and switch out tonics if they're not what you want, like you could in SS2.
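To make the structural difference concrete, here's a minimal Python sketch of the two upgrade models as I understand them; the class and attribute names are invented for illustration and aren't taken from either game's actual code:

    class SS2StyleBuild:
        """SS2 model: cybermodules, once spent, are gone -- no respec."""
        def __init__(self, cyber_modules):
            self.cyber_modules = cyber_modules
            self.skills = {}

        def spend(self, skill, cost):
            if cost > self.cyber_modules:
                raise ValueError("not enough cybermodules")
            self.cyber_modules -= cost              # irreversible commitment
            self.skills[skill] = self.skills.get(skill, 0) + 1


    class BioshockStyleLoadout:
        """Bioshock model: tonics swap in and out of slots, nothing is permanent."""
        def __init__(self, slots):
            self.slots = slots
            self.equipped = set()
            self.stash = set()                      # everything you've ever found

        def equip(self, tonic):
            if len(self.equipped) >= self.slots:
                raise ValueError("no free slot; unequip something first")
            self.stash.discard(tonic)
            self.equipped.add(tonic)

        def unequip(self, tonic):
            self.equipped.discard(tonic)
            self.stash.add(tonic)                   # reversible: nothing is lost

In the first model every choice forecloses others, which is what forces you to actually develop a melee or stealth character; in the second, the "build" is just whatever happens to be in the slots right now.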
Speaking of inventory management, there is never a point in Bioshock where hoovering up random shit off the ground is a bad idea, so why even make it an option? It might as well just be an automatic pickup à la Doom. SS2 had things that were junk or weren't useful--it was a richer, more interactive world in some ways.
In both Bioshock games, you end up with a limited number of weapons to use, whereas in SS2 you can carry around as many or as few as you'd like, depending on how you decide to allocate your inventory. Weapons in SS2 have more pronounced damage types. Melee weapons in SS2 require a dedicated character build, whereas you can bumble into an endgame-useful, game-breaking wrench build very quickly in Bioshock.
The removal of text fallbacks for logs in Bioshock made it harder to rapidly review events and piece things together, and overall there were many fewer logs than in SS2.
The inability to use stealth to bypass fights, and the relative surplus of ammunition, meant that cinematic combat was the main workhorse of the Bioshock games, whereas SS2 played more like a survival-horror RPG.
Those are just some of the things that streamlining of the design did to the games, with the end result that the Shock lineage devolved into fun and competent cinematic shooters. I'm not saying that they're bad games, I'm saying that their design has regressed so far that it is basically not worth mentioning in the same breath as their predecessors.
System Shock 1, for what it's worth, was an amazing example of design ideas that never got much love, and likely represent another evolutionary dead end. :(
The original article makes a point of contrasting Mass Effect with System Shock, which is I think why parent mentions it. I don't agree with everything you say here, but it's a lot more interesting and useful for the sort of point the author was trying to make than what the author actually wrote.
I'm not sure what you're driving at. Are you saying that Bioshock and Bioshock Infinite are like System Shock 2? Or are you saying that they're prime targets and reference examples for examining what's going wrong?
The people who made Bioshock (many of whom, it should be noted, also made System Shock 2) claim BioShock is like System Shock 2[1]. I don't think it's crazy to use it to either refute or reinforce the author's argument -- it's clear the games share some of the same DNA, but there are clearly changes as time and technology march on. I do think it's kinda weird just to ignore them, though.
Perhaps the author never heard of them (in which case: I want a rock like that!), or more likely didn't play them and didn't want to speak from hearsay, or even more likely, simply didn't spare them a thought in the middle of their rant-typing spree.
I'd grant them that Bioshock shares many of the elements that are praised about System Shock 2. Bioshock Infinite sounds like a great example of a sequel gone wrong on exactly the aspects the article's author dislikes.
I've been thinking about this recently. When I was young, video games were a complete mystery to me. Every game, no matter how generic, was a joy to play, because I simply didn't know what to expect at the end of every screen. Strange worlds with unknown rules unraveled before me, whether in dungeons of Prince of Persia, the labyrinths of Jill of the Jungle, or the space stations of Duke Nukem 3D.
Over time, this joy faded in the games I played. I can intuitively feel why this is the case: whenever I start playing a new game, I already know most of the rules, or at least I can figure them out from the first few levels. Floating things? Oh, just a Flying Enemy Trope. Glowing box in the corner? Health Kit Trope. Where does that door go? Nope, just a Door Decoration Trope. And don't even talk to me about the Crate Trope. What's more, you can immediately figure out how a game is going to play just by looking at the first level. It's rare nowadays that I play a shooter which doesn't solely involve moving forward, shooting bad guys, and collecting powerups. I can play the game in my mind almost from the get-go, and I get bored.
A few months ago, I started playing a Japanese indie metroidvania game called La-Mulana (HD). As a game, it's rather obtuse: you'd be very hard-pressed to complete it without a walkthrough, as many of the later levels require figuring out the answers to obscure, poorly-translated riddles. But for the first time in probably a decade, I was completely sucked in. There was a true sense of mystery to this game. None of the tropes I was familiar with made sense here. Doorways to new areas kept opening up. Every decoration on the walls could be analyzed with your hand scanner. Items with no apparent purpose were scattered all over the ruins. Every obstacle was bespoke, not a generic "find the keycard" equivalent puzzle.
For 20 hours straight, I couldn't tear myself away from this game. It was exhilarating.
And thinking back to the games I used to play as a kid, I'm starting to think that maybe they really did have a bit more magic than games do today. Take Duke Nukem 3D. You never knew what you'd find in a given level. An inconspicuous wall could hide a joke or easter egg. A vertical vent could contain a secret weapon. A manhole could lead to a secret level. Every area had something new, be it an interactive element, monster, weapon, or setpiece. (Remember how effective the mouse and mirror scenes were in the original Prince of Persia?)
In those old games, there was always something new around the corner. You never knew what to expect, and the sense of mystery compelled you to explore until the closing credits.
While I do think that childhood ignorance and obsessiveness made games feel a lot more interesting, I also agree with the author that something is missing in modern games. I was starting to accept that I would never again be consumed by a game, and then along came this tiny indie title to blow away my expectations. And now I see that games like The Witness have a whiff of that same feeling.
I'm really looking forward to getting sucked into games again.
I have a theory about why some games give players that magical feeling of infinite possibilities, and what causes players to lose that feeling over time. I think it has to do with learning a game's visual language.
When you first start playing a game, or just look at the trailers and concept art, the visuals might promise you tons of possibilities that the gameplay doesn't actually support. But as you play the game, you learn to pay attention to only those entities on the screen that are relevant to the gameplay, and filter out those that are just scenery. When your brain realizes that the beautiful mountains in the backdrop are just a painting and you'll never be able to go there, you no longer react to them emotionally.
That theory suggests several ways to improve immersion in games. You could make a conscious attempt to mix up the game's visual language until the very end, like in the old adventure games, where anything on the screen could eventually become relevant in surprising ways. You could make the graphics simpler, to avoid suggesting possibilities that are not supported by the gameplay. Or you could pay attention to which possibilities are suggested by the graphics of your game. If the mountains in the background are so beautiful that the player wants to go there - let them!
Yes, that's a great point. It's like Cypher in the Matrix: "All I see now is hallway, backdrop, arena..." And the more games you play, the quicker you're able to condense them down to their bare essentials.
Jill of the Jungle had some truly weird stuff in it. Or at least, the way I remember it. Such as one level that had some sort of frog enemies, which produced certain sounds as they moved back and forth and bumped into things. It wasn't part of the game mechanics of the level, but the way they were placed produced this otherworldly, mesmerizing, rhythmical tune. It might have been my very first encounter with experimental techno music. Somewhat reminiscent of Autechre's track Gnit.
(I think these were the PC speaker sounds, btw; on a proper sound card it might not have sounded nearly as mysterious and weird.)
I was with you until you mentioned quick time events. I will quit playing any game that thinks it's more engaging to ask me to break my keyboard/mouse/controller by rapidly mashing a button for no reason other than the game designer's laziness. Just say no to quick time.
Quick-time events are just a "Simon Says" mini-game. It is supremely lazy game-making; not a single gamer I know likes QTEs, and I'd venture that not a single copy of any game was ever sold because "Oh! This game has awesome quick-time events; I'm going to buy it right now!".
I don't think QTEs can define a game, but they can enhance one.
Years ago I played God of War and really enjoyed the QTE kill animations. It made me feel, well, like a God of War.
I wouldn't be surprised if that's the exception to the rule, though. Shenmue had some terrible QTEs (my biggest criticism was that they were unpredictable). Also, too many FPS games have a QTE whenever some animal bites you. It's practically a trope now.
Guitar Hero is a form of twitch-based game (https://en.wikipedia.org/wiki/Twitch_gameplay). Where a QTE would be "hit the F key as many times as you can in 5 seconds", guitar hero is "hit the F key exactly when it needs to be hit".
I'm generalizing, though. Some QTEs are a form of twitch gameplay (in that they require the player to press at the perfect moment), but twitch gameplay is not a form of QTE. What sets QTEs apart is that they are not, in fact, the whole game, but a sudden change of gameplay style that is gone as fast as it arrived.
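To make that distinction concrete, here's a minimal Python sketch (the function names, key choice, and numbers are all invented for illustration, not taken from any real engine). A mash-style QTE only counts how many presses land inside a window; a twitch check only cares how close a single press is to its target moment:

    def mash_qte_passed(press_times, window_start, window_len=5.0, required=20):
        """QTE style: only the *count* of presses inside the window matters."""
        hits = [t for t in press_times
                if window_start <= t <= window_start + window_len]
        return len(hits) >= required

    def twitch_hit(press_time, note_time, tolerance=0.05):
        """Twitch style: only the *timing* of a single press matters."""
        return abs(press_time - note_time) <= tolerance

    # Example: frantic mashing passes the QTE but says nothing about timing.
    presses = [0.1 * i for i in range(60)]              # 60 presses over 6 seconds
    print(mash_qte_passed(presses, window_start=0.0))   # True: plenty of presses
    print(twitch_hit(press_time=2.51, note_time=2.50))  # True: within 50 ms

Guitar Hero is the second function applied over and over as the whole game; a QTE is usually the first function dropped into a game that otherwise plays nothing like it.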
I think the idea of a quick-time event is to make a cinematic experience more visceral/engaging, so it's not necessarily lazy... I mean it is more work than just putting the cutscene in after all. Anyways, maybe it's misguided and definitely overused, but I think it's pretty extreme to call it lazy. At some point pretty much every game devolves into "press button to make thing happen".
Honestly, I've never found them to make things more engaging. If I'm interested in the story, having to fumble with the controller during otherwise static periods pulls me out of the narrative (in contrast to just letting me keep playing during these events - that does seem to increase engagement).
My view is that it's theoretically possible to design an interesting quick-time event system, but that it tends to mash your verb-space together. In general, you get a stronger message with a clearer relationship between the button the player pressed and the verb in the game.
It's worth noting that quick-time events are old. Here is a video of Day9 and friends playing through King's Quest 6, from Sierra's days of yore. The video has about two minutes of cutscene followed by a one- or two-second window that needs interaction. Day9 and crew miss the critical moment the first time round.
http://www.youtube.com/watch?v=fDKpm0r5MCQ&list=UUaxar6TBM-9...
There was a similar article a while ago (https://news.ycombinator.com/item?id=8128216), where I defended the cinematic approach as well. The argument mostly isn't against cinematic narrative, but against badly executed cinematic narrative. In the book "The Art of Game Design", the author explains that a game should be an experience designed for the player. And as you say, the form in which this experience is delivered is irrelevant. What matters is the quality of the experience.
> And as you say, the form in which this experience is delivered is irrelevant. What matters is the quality of the experience.
This eclipses and outweighs what I would have to say.
I think the author of the article, in those terms, was saying that the experiences created by recent games are often, and increasingly, far inferior in quality: I've followed the correct sequence of buttons, and I am rewarded with a scene of my character doing awesome things. If done well, I might even feel a bit of attachment, a sense that this was the result of my success. But it's not the experience of being a super-spy in a high-tech facility. It's the experience of watching a super-spy do stuff after I complete my homework, knowing I'll have more homework to do so the super-spy can go on to do more awesome things.
The problem is that these lower-quality experiences still sell better, often because they abuse certain hacks or dopamine bypasses of the human brain, without necessarily reaching all the way through to what makes an experience fun and pleasurable for the player.
Of course, to make such an argument successfully, you first have to convince people that humans do not always act rationally, and then that humans do have such "hacks" and twists that control what they want and do (which in turn requires convincing your audience that brains control actions, not "the soul" or some other immaterial entity). Think that's a high bar? It's not even the start.
The existence of human irrationality is the easy part. The hard part is arguing that an experience involving choices is inherently higher-quality. The fact that Thief gives you plot-relevant choices doesn't make it any more the real experience of being a thief, except in the trivial sense that real life feels like it involves making choices. Likewise for all those examples. If you want to claim that the experience is lower or higher quality, you need an argument for that beyond your own subjective opinion (which is naturally biased in favour of games from your childhood).
Yes, this. It's hard to make arguments about quality of experience when every step of the way there's many uncertain variables and your best evidence comes from personal experience (that most likely wasn't shared by the person you're trying to convince).