Gabe Newell talked about this kind of stuff years ago. People are tired of big publishers promising the world pre-release, then putting out an unfinished product that is nowhere near the quality of what was advertised. Here's a snippet of an interview he did a while back:
You have to stop thinking that you're in charge and start thinking that you're having a dance. We used to think we're smart [...] but nobody is smarter than the internet. [...] One of the things we learned pretty early on is 'Don't ever, ever try to lie to the internet - because they will catch you. They will de-construct your spin. They will remember everything you ever say for eternity.'
You can see really old school companies really struggle with that. They think they can still be in control of the message. [...] So yeah, the internet (in aggregate) is scary smart. The sooner people accept that and start to trust that that's the case, the better they're gonna be in interacting with them.
I wish it were true but EA is still making bank while disrespecting customers, everybody complains yet still buy. Same with Facebook. Same with politicians.
A small group of smart people on the internet may make a lot of noise, but it just echoes in the void and then it's business as usual.
Look at BioWare. If they hadn't lied about the Mass Effect 3 ending (they promised your decisions matter, but none of them do; you just choose between 3 buttons at the end), Andromeda would still have been criticized, but it would have been criticized in a more positive atmosphere. That studio has been in a downward spiral ever since. There are reasons beyond those lies, but I'm certain that lying to their fans and eroding the loyalty they had (it's after all the studio that made Baldur's Gate) did not help them one bit.
> (it's after all the studio that made Baldur's Gate)
Games studios have always struck me as a sort of Ship of Theseus - if the ownership, the product quality, the game engines and assets, and 99% of the employees have changed but the logo remains the same - does being loyal to the logo make sense?
Unless the changeover happens all at once, there'd still be some cultural and knowledge transfer. It may not be the same company, but I don't think it's an entirely different company.
Well, presumably the branding entity still owns the IP and trade secrets like the source code so even if an entirely different team comes on, they can still reuse that.
I'm not in the game industry at all, but, do you or anyone know how this goes? What goes into a game design? How much more detail and story and worldbuilding is there behind the scenes?
I mean, if it's anything like software development, the answer will for the most part be "well, there's actually nothing here". A lot of sequels feel bolted on in that regard, where they just took elements from the original game, world and story and built on top of that.
BioWare is a good case study since a few articles and books have been written about their history (one of the co-founders also has a few videos on YouTube about how they managed the business).
If you read enough of them you can see BioWare has had three phases in their life:
1. Shattered Steel to Mass Effect 2, where they were largely independent and taking on feast-or-famine work (more of the former given the quality of their work at the time)
2. Mass Effect 3 to The Old Republic, which rang in the EA era, when cracks in the quality of their work started to show. The end of this era is also when the co-founders and some key contributors to their past games left.
3. Dragon Age 3 to Anthem, which rang in stronger ties with EA—specifically having to work with Frostbite—and where we see that a lot of the problems from the previous era were probably chronic rather than acute. Lots of key contributor churn.
> they promised your decisions matter; but none of them do, you choose at the end between 3 buttons
At the end of the day your decisions still matter to you. The debate around the ME3 ending has left me with a lot more respect for the ME3 ending over time than after I played through it the first time.
Which of those 3 buttons a player chose often says a lot about their combined playthrough of all three games. In the moment, many gamers felt they only had "one choice" at the end, and that is an amazing psychological trick, because the statistics BioWare revealed show that among players who did complete the final choice it was an almost even split (and a surprising number of players took the interesting fourth option of just quitting there and not choosing at all).
I think the ME3 ending, even before the Extended Cut (and more importantly the final Citadel "cleanup" DLC), was better than a lot of gamers gave it credit for, but the problem was that it was very hard to see the forest from your particular gameplay tree.
(Now, Anthem from a story perspective I have a lot less respect for. It's a weird KOTOR B-Plot [almost entirely recycled] with the Star Wars filed off.)
> they promised your decisions matter; but none of them do, you choose at the end between 3 buttons
Your decisions do technically affect the ending, but it's subtle and not very satisfying. The outcome of every major decision in the trilogy is assigned a numeric value and added to a "war assets" sum as you progress through the game. Different decisions result in slightly different amounts of war assets, and actions in the third game were heavily weighted over previous games. The size of that number changes each ending very slightly.
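To make that concrete, here's a minimal sketch of how such a weighted sum could work. This is purely illustrative: the decision names, values, weights, and thresholds below are invented for the example, not BioWare's actual data.

    # Purely illustrative sketch of a "war assets" style ending gate.
    # Decision names, values, weights, and thresholds are invented.
    decisions = [
        # (decision, base value, game in which it was made)
        ("cured_genophage", 300, 3),
        ("saved_rachni_queen", 100, 1),
        ("peace_between_geth_and_quarians", 250, 3),
    ]

    # Choices from the third game are weighted more heavily than carried-over ones.
    WEIGHT_BY_GAME = {1: 0.5, 2: 0.75, 3: 1.0}

    war_assets = sum(value * WEIGHT_BY_GAME[game] for _, value, game in decisions)

    # The ending only shifts when the total crosses a threshold, which is why
    # individually "meaningful" choices can wash out at the very end.
    if war_assets >= 500:
        ending = "best variant of the chosen ending"
    elif war_assets >= 300:
        ending = "standard variant"
    else:
        ending = "worst variant"

    print(war_assets, ending)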
You know, just because the ending is the same regardless of your choices doesn't mean they didn't matter. No matter what we do, each of us is destined for the grave, even the universe will end one day, but our decisions still matter even though the outcome is identical.
PS: I had no idea it was such a controversial statement to suggest that things matter even if the universe is ultimately doomed.
The problem is that gamers were promised "decisions that matter" and in the end everybody got the same pre-canned three endings. Stepping up 6 meta levels and asking "Yeah, but what's life all really about anyhow?" is, you know, fine and all, but also in way, profoundly missing the point. We want to talk about Mass Effect right now, not fundamental existential ennui.
Personally... I don't care in the slightest, because as someone who's been playing these games for a while and also as a programmer, I observe that what these companies are promising is basically impossible, so I never bought into the promise enough to be offended when it didn't play out. It is not possible to have a true branching story where you can make dozens and dozens of meaningful choices while the story also has fully rendered cutscenes with voice acting, unique locations, and so on. But it might be nice if the companies would stop promising this. Every year, as the standards go up, it actually gets harder and harder to bring this level of responsiveness to a game.
(Sometimes I think that as impressive as the holodeck may have been on ST:TNG in terms of providing an immersive sensory experience, it's even more impressive in that it seems to have enough intelligence built in that it can actually do true reactive storylines and such. We'll probably have reasonable approximations of the former a long time before the latter!)
It is definitely possible to make choices matter. Imagine that in the final mission, if you had helped the Krogans earlier, then at one point during an alien attack a Krogan squad crashes through a wall and helps you out (that's taken from an interview with a BioWare employee who described what went wrong and how the ending should've worked). Stuff like that is possible, even if branching storylines are hard to do (cut content for many players and all that, I'm aware). The Fallout games - with the exception of Fallout 3 - show how it is indeed possible, at the end of the game, to show the impact of your actions on characters and civilization. BG2 did as well! And ME1 and ME2, during and at the end, and even ME3 somewhat during the game, showed consequences of prior actions - just not at all at the end of a trilogy focused on choice.
Stupid mistake and definitely avoidable. I'm convinced BioWare sealed their fate with that (making mediocre games afterwards did not help them either).
The problem is, no matter how you disguise it, gamers sense in the first case that they're on some sort of rails. Then companies promise the second, whether they mean to or not. But they can't deliver it across one game. I remember laughing, when I heard the announcement of the Mass Effect trilogy, at the idea that they could do it across three. You just can't let the combinatorics explode that much and make decisions that change the story. (Note that things like "whether or not a Krogan squad is there" are consequences, yes, but you'll notice they do not impact the future flow of the story, so they don't contribute to a combinatorial explosion. It's a linear contribution; you have to model the event and voice act it, etc., but it's still just two story nodes not significantly interacting with anything else.)
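As a back-of-the-envelope illustration of that combinatorial point (the numbers are invented for the sake of the example): if every decision truly forked the rest of the story, content cost would grow exponentially, whereas flag-gated consequences only grow linearly.

    # Toy comparison: fully branching story vs. flag-based consequences.
    # The number of decisions is invented for illustration.
    choices = 20  # meaningful decisions across a trilogy

    # True branching: every decision forks the remaining story, so distinct
    # story lines (each needing scenes, voice acting, locations) grow exponentially.
    branching_storylines = 2 ** choices   # 1,048,576 paths to author

    # Flag-based consequences: each decision only toggles one optional scene
    # (e.g. a Krogan squad shows up if you helped them), so cost grows linearly.
    flag_scenes = 2 * choices             # at most 40 scenes to author

    print(branching_storylines, flag_scenes)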
Fallout 1 & Fallout 2 were structured as several parallel instances of the first example, which is probably the best compromise you can make to prevent exponential explosion while still giving some heft to the choices.
But ultimately, if you want your "choices to matter", you'd better stick to "classic" roleplaying. A Dungeon Master can react to your unexpected choices and set you in a standard "fetch the thing" quest but then react to you actually accidentally literally destroying the world when that wasn't written into their "plan", and then carry on with the resulting consequences if you like. Computer games are still a long, long ways away from anything like this: https://www.youtube.com/watch?v=1H-1WJtuzPg
I agree with your basic premise, but I think the complaint about ME3 was that it didn't reflect any previous decisions you made in the ending, not that the story failed to fork exponentially at every decision point. Indeed, IMO, ME3 did a really good job of making it feel like your decisions matter...until the very end.
Star Ocean: The Second Story is yet another example of an RPG that does this; it has 70-80 endings, many forking paths including mutually exclusive party member choices, relationships between characters, etc. The trick for making it feasible seems to be to make the ending modular so that it includes/removes different cutscenes based on what happened before. As I understand it there are quite a few old-school RPGs that do this.
Of course it is harder if you are talking about voice acting and super-detailed 3D assets. But assets aren't that hard considering you already made most of them for the rest of the game, and realistic voice synthesis is perhaps not that far away. So I don't think most people were expecting model (B), but even that could be viable if you controlled the number of forks carefully.
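A rough sketch of that modular-ending idea, with the flags and cutscene names invented purely for illustration, might look something like this:

    # Sketch of a modular ending: the final sequence is assembled from optional
    # cutscene modules gated on earlier choices. Names and flags are invented.
    ending_modules = [
        ("farewell_to_party", lambda flags: True),                     # always shown
        ("rival_returns",     lambda flags: flags.get("spared_rival")),
        ("city_rebuilt",      lambda flags: flags.get("funded_rebuild")),
        ("romance_epilogue",  lambda flags: flags.get("romance") is not None),
    ]

    def assemble_ending(flags):
        """Return the ordered list of cutscenes this particular player sees."""
        return [name for name, condition in ending_modules if condition(flags)]

    playthrough = {"spared_rival": True, "romance": "celine"}
    print(assemble_ending(playthrough))
    # ['farewell_to_party', 'rival_returns', 'romance_epilogue']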
> The problem is that gamers were promised "decisions that matter" and in the end everybody got the same pre-canned three endings.
I don't think it is. The decisions made did matter, as much as anything matters, they just didn't affect the ending cut-scene you got. Is there an intentional philosophical message here? Maybe, maybe not. But there is opportunity to reflect on our relationship with decisions and outcomes. Is our time spent with the game less meaningful because of how it ended? Given a time machine, would we have chosen not to play if we knew the ending? If the ending is the thing that is important, why play the game at all instead of just watching cutscenes on youtube?
"The decisions made did matter, as much as anything matters, they just didn't effect the ending cut-scene you got."
Affecting the ending is part of the promise given.
We want to talk about Mass Effect and whether promises were kept, not existential ennui. Whether or not we can truly know ourselves or whether or not the good life is led by trying to live as an ubermensch is not particularly relevant to whether promises about the game were broken.
It's sometimes helpful to step up a meta level, but you're trying to derail the conversation by stepping up something like five or six. It's not helpful.
Now I suspect you are not commenting in good faith, but trolling. Existential ennui is great to grapple with, but out of scope for a public forum discussion about whether Mass Effect delivered on the promises it made or not.
"But none of our choices matter, ipso facto the premise of this discussion is pointless" is something you could say in any discussion. If people enjoy the shift and engage, you talk about it. If they don't, you don't dig in your heels and accuse them of not understanding your viewpoint.
EDIT: You can enjoy the journey of a game and still be upset that you were sold a lie. You can enjoy the journey in part because you believe there will be a story-based payout. If you do, the natural arc of that enjoyment is to be angry that the payout never came. Maybe you enjoy the feeling of being betrayed by a publisher; it is a story experience of sorts. It's not the kind of twist that satisfies most audiences.
> Existential ennui is great to grapple with, but out of scope for a public forum discussion about whether Mass Effect delivered on the promises it made or not.
Honestly I couldn't care less about ME3, I didn't play it for completely different reasons. I just found the philosophical concept of "decisions matter" much more interesting to talk about. I misjudged how much Hacker News would rather have a bitch-fest.
> "But none of our choices matter, ipso facto the premise of this discussion is pointless" is something you could say in any discussion.
First of all, my point is that choices do matter even when they do not affect the ultimate outcome. And I'm not saying it to disregard how the commenter feels about the game, but to highlight that their description of why they feel that way is a bad rationalization, and that they could learn something about how they approach the world from examining it.
If I appear upset, it's because Hacker News, ever willing to consider themselves above redditors without justification, seem to have decided that I was defending ME3 during their collective bitching sessions and expressed their displeasure with downvotes.
Nope, it's not that kind of ending. Kill all the robots or not, that's more or less it. Not that philosophical kind of game.
And if you want to see it like that anyway: Yes, we die, but our actions influence the people around us. People we help might help us later etc. None of that was properly covered.
I was about to write an impolite response. Let's just keep it at: your thought is not so deep that it is original or something other people do not understand. You miss where your Existenzangst is applicable and where it is not. Jumping up meta-levels and missing that we are talking about the time period where consequences do matter - like how much fun you have while playing the game during the period our universe actually exists - is not profound. It looks like a bored effort to derail the discussion to me.
Simple question: how much did the ending actually matter to your enjoyment of the game? Would you have ignored the game and simply watched your preferred ending on youtube? Would you have told your past self not to play the game at all given how the ending actually is?
These philosophical questions are not as separate from the day to day reality as you want to believe.
It completely destroyed every bit of enjoyment I took from playing and finishing the game. Yes, I would not have played the game, and I actually do that regularly when it comes up: I tell people not to play (and not to buy!) ME3. It was that bad.
> It completely destroyed every bit of enjoyment I took from playing and finishing the game.
So... you didn't enjoy anything at all up to that point? Really? Then why bother playing it at all? Why not let someone else play it and just watch your preferred ending cutscene on youtube? Is it really all just about the destination and the journey means nothing?
You know that this is not proven, right? Or rather, it's not even a "theory" in the physics sense; there are multiple competing hypotheses, and none of them has enough evidence for us to be certain the universe will end... In some, the universe does end, in a big crunch or a big rip; in others it just starts anew; in others humans spawn new universes; and so on...
There are a lot of competing theories of how the universe will end, but pretty much all mainstream cosmology admits that it will end. Very few cosmological models say otherwise.
However, even then I contend that eternity is just another kind of ending. Given an infinite time, everything that can happen will, and therefore it will also repeat. That means that every decision we make will be made again infinitely, every path followed an infinite number of times, always eventually returning back around again, and ultimately the outcome will still be the same.
So you see, under cosmologies of eternity, we just exist in a stasis of infinity and our decisions will not change that. Under cosmologies where the universe ends our decisions will not change that either. Does this mean that nothing we do matters?
"but pretty much all mainstream cosmology admits that it will end."
By using "admits" you mean they accept your point of view, even if they do not want to? That's a bit telling about your worldview.
And have you ever considered, that cosmology can only make relative observations and speculations?
In other words, if our whole universe is just a small part of a bigger universe, how could we tell for sure, if all we can observe is within those borders?
Also, how likely is it that out of eternal nothing a whole universe someday came into being, only to go back to nothing forever?
The outcome is not identical because our choices throughout our lives impact other people and, potentially, get passed through the generations. This is why humanity is so advanced despite not having physically evolved much. The outcome is only “identical” if you only care about yourself.
Humanity will one day no longer exist and it will be as if we never existed at all. The universe is that massive and indifferent to our existence, and it too will come to an end.
Game publishers are drug dealers. Complain all you want on the internet; you're still addicted to escaping this reality daily/weekly. Kids also don't know or care about this stuff, and they just bug their parents to buy the latest.
Gaming does have the particularity of being as expensive as you want it to be, or as cheap as you want it to be. I don't think there are many other markets where you can get hundreds of hours of genuine heartfelt value for $15.
k, cept I did care when I was a child, and I notice kids care today. There is a reason why Minecraft and Fortnite are more popular than some huge IP that would be expected to sell extremely well.
I'm still salty about EA getting the exclusive NFL license. NFL 2K5 was a better game than Madden and they made the ballsy move of selling it for $20 brand new the day it came out. EA got so spooked they spent who knows how much locking down an exclusive license.
Though in hindsight who knows what a 2K NFL game would look like today. With the way they butchered their NBA games I don't have much faith that an NFL game would be any better.
NBA 2k today suffers from the same problem Madden suffers from today: no studios competing with them to push them into making a better product.
Who knows if the latest PES (which looks absurdly good) will push EA to make fifa worth a damn again, but at least we’ve got choices in futbol games now.
FIFA, Madden, NBA2k, NHL, MLB, they have all the sports licenses.
I haven't figured out how they are still releasing a new game every year - they really should be releasing every season, or really, every team, as a DLC. The content is where it is at, not the engine.
>I haven't figured out how they are still releasing a new game every year - they really should be releasing every season, or really, every team, as a DLC. The content is where it is at, not the engine.
This is easy. People pay full price every year for the new version which contains a bare minimum of changes compared to the previous (not including the roster changes). Why would they cut that gravy train off? Presumably a yearly DLC would not be priced as a full game.
>I haven't figured out how they are still releasing a new game every year.
I would expect there are a number of factors that militate against the approach you propose:
* Fundamentally, what EA owns is the engine; the league owns the branding and likenesses, and EA pays license fees for them. If someone else knows how to build a clearly better <insert sport> game, the engine is the only thing EA really brings to the table. If the right to use NFL branding and likenesses is worth $10/yr to the user, and EA only ever sells roster updates, you should expect the cost to EA for an exclusive license to creep closer and closer to $10/yr/user.
* The full game release justifies a $60 price point (vs the ~$10 you might hope to get for a roster update).
* My understanding is that much of the DLC/add-on revenue now comes in the form of "power-ups" for the create-a-player modes. Progress on that track can be reset in a new edition to force pay-to-upgrade users to pay again. With a perpetual edition, this dynamic either creates power creep (which alienates the bulk of your users who refuse to pay extra and can no longer compete) or forces you to dilute the power-ups to the point they're no longer desirable to high-value customers.
* From a staffing perspective, light, continuous updates are easier to staff for than spinning up a full scale rebuild once every platform generation, with less risk of blowing the new edition and losing your incumbency. If Madden 2020 releases a wildly unpopular change, you can fix it in Madden 2021.
"They will remember everything you ever say for eternity.'
Maybe. But outrage that results in meaningful action lasts a few months at most. The first-person shooter industry is fascinating to me. Somehow Call of Duty is still an extremely popular and commercially successful series year after year, despite continued management by the exact same studios that repeat the exact same algorithm:
1. Release a re-skinned game each year with huge promises of new and original content to come
2. Underdeliver on said promises and lock the remaining ones behind micro-transactions, loot boxes, and DLC
3. Abandon the game 6-12 months in to begin preparing the next iteration, after raking in millions from suckers
4. Repeat
You'll see thousands of people swearing they'll never pay another $60 + add-ons for a CoD game ever again and yet somehow the cycle continues year after year.
There are numerous other video game series (not just in the FPS genre) that follow the same pattern. Some eventually get run into the ground, but only after a very slow and profitable death. People have short attention spans. You might remember touching a flame, but the feeling of pain quickly fades.
I don't think that's the point of the quote. If you said 5 years ago that the game was going to include a talking cube, the internet is going to remember that, and when you publish it is going to expect there to be a talking cube (or accuse you of breaking your promise). The outrage only lasts for a short period of time, but it can be based on promises made ages ago that without the internet would have been long forgotten.
Often that’s not what happens. Devs say “Oh, hey, talking cubes are cool, maybe we will integrate one”, and if they don’t include a talking cube, people will accuse them of breaking their promise (which is nonsensical).
But this doesn’t happen in isolation. It’s likely that people who dig up that “promise” had an axe to grind anyway, so it’s about scoring points based on spin and lies.
Can you name any large controversies around something as small as that? As I understood it, the large ones like No Man’s Sky or Anthem even doctored gameplay footage during their presentations, in addition to promising the moon and more. However, when it was just "We might do this", almost nobody blames them; a small minority might be angry or upset, but most don't care, and many step in to defend the devs in those cases.
I think No Man's Sky went the wrong way when they started to promise whatever fans were begging them to promise...
If they had stuck with what seemed to be their original plan, which was to make a modern copy of the game Noctis, with no base building and a stronger focus on exploration of procedural worlds... things would have gone better.
As soon as they tried to mix it with Minecraft, they introduced incompatible elements into the game and ruined it: they broke stuff they already had working, and promised stuff that made no sense in the new game they were making.
I haven't played the current heavily patched version, but from what I've read, the game has seemingly shifted genre completely and is nothing like the original intention.
Meanwhile I wait for someone to make a cool noctis clone... or maybe have the original author finish Noctis 5.
> As I understood it the large ones like No Man’s Sky or Anthem even doctored gameplay footage during their presentations
You're talking about vertical slices created years in advance of any realistic release date, for the sake of demonstrating to players a game that largely doesn't exist yet. Of course they're doctored. If they could show off natural gameplay, they'd have a near-releasable game and there'd be no need for the vertical slice.
> However when it was just "We might do this", then almost nobody blames them, a small minority might be angry or upset, but most don't care and many steps in to defend the devs in those cases.
When you start trying to make assertions about how "most players" responded, it becomes difficult to sort fact from fiction. "Most players" don't follow the games enthusiast press cycle -- whatever the perceived reaction to the press cycle for a game is, it's largely determined by the people who are the most engaged and the most vocal about the game in question, and it may be gauged differently depending on what slice of the reaction you were exposed to. That makes it pretty difficult to reach agreement as to what the temperament of the overall reaction was.
I also take a bit of issue with the expectation that designers should couch unconfirmed ideas with words like "might". It seems like an attempt to determine the validity of a wave of hysteria via a process of nitty-gritty word-lawyering that I don't think is very practical. Designers aren't always the best or most cautious of public speakers, so an idea that's in the "maybe" or "I hope" pile might not receive proper couching in suitable disclaimers when the designer speaks about it publicly. Further, players often don't remember the full context of words that are spoken about a game, and are even likely to want to read the most enticing thing possible into the words written or spoken about a game they're intentionally following. Insisting that words publicly spoken about a game in production are lawyerable like this just encourages studios to clamp down and stop speaking to the public about their craft, and leaves us with sterile messaging that's been carefully crafted by PR experts.
With that said, I'd counter that there's a pretty lengthy history of players misinterpreting the exploratory words of designers about possibilities for their in-production games as promises about what the final product would contain, and then later raking them over the coals for breaking those promises. On the spot I think that's going to be a tricky view to corroborate without going through a messy and time-consuming process of picking apart a bunch of case studies, though.
Call of Duty lives in an odd space, I am guilty of buying it much more often than I should.
The problem is that the people I would want to play with stop playing the older games, so if you want to play with your friends and get a half-decent matchmaking experience then you have to get the new ones.
There's also a minor problem with the older games: people get so good that the enjoyment is sucked right out. It's nice when everyone is unfamiliar with the balancing/movement/etc. in the first months; it evens the playing field a lot.
(also, older and thus cheaper/free games lead to a lot of cheaters who don't care if they get perma-banned).
Call of Duty is like Coca Cola or McDonald’s. When you buy it you know exactly what you’re gonna get. It’s the shooter version of a FIFA game. EA isn’t promising a drastically new game every time.
> EA isn’t promising a drastically new game every time.
Every couple of years they promise "completely new engine from the ground up" and deliver an incremental improvement on the previous years. I don't necessarily consider the result a bad thing, but the marketing is always BS.
I think Call of Duty in many ways makes sense when viewed as a single-player game (which in recent years it hasn't been). When I buy it, I am not looking for "innovative gameplay, etc."; I am instead looking for the gaming equivalent of a movie. I typically like the story, and with around 6 to 8 hours of gameplay this works out to around 4 movies, which is coincidentally about the same price when viewed in terms of tickets.
Generation gaps are shorter for the younger populace: think high schoolers, where there's not much interaction between seniors and juniors. For EA it's as if the market renews itself every couple of years.
I wonder what proportion of the CoD copies sold are first or second purchases into the franchise, before the infatuation tapers off; my hunch is that it's a sizable majority.
I think the quote ostensibly attributed to Lincoln is apt:
‘You can fool some of the people all of the time, and all of the people some of the time, but you cannot fool all of the people all of the time.’
The public in aggregate have a perspective that insiders in a company don’t or cannot because of culture or they’re too close to the problem or they’re just being disingenuous. And the public always finds out, sooner or later.
While it is true that certain circles of people on the internet are very smart in aggregate, I don't believe you can extend that to the internet in general; empirically, you can just have a look at the "general" trends on Facebook and Reddit, or even Twitter.
I think it's true in the sense that if enough people are paying attention, all dishonesty and lies will be unraveled. I remember back when SimCity (2013) came out, EA said that always-online was required because the game does a lot of processing on the servers that would be too much for your computer, but shortly after, someone patched out the code that checks for internet access and discovered that everything in the game except multiplayer still works.
The individuals do not have to be smart here, there just has to be one person who will bother.
> The individuals do not have to be smart here, there just has to be one person who will bother.
I don't think it works like that; most social media is a game of numbers. If one smart person were enough, you wouldn't have nonsense like flat-earth and the like becoming "popular topics".
The statement is that the internet will find the truth, not that everyone will agree on the truth, or that they won't also find untruths. For almost every single issue, the truth exists somewhere on the internet.
Because in the context of consumers, that is what the internet is: social media and forums (which are social media in my book). Bloggers and other professional reporters are a different story.
> The smart cow problem is thought to be derived from the expression: "It only takes one smart cow to open the latch of the gate, and then all the other cows follow."
I believe in this context smart doesn't mean the average intellectual capability of a netizen. It refers to the internet as a whole and its power to dissect information simply due to the sheer number of users dedicated to a certain topic.
I understand, but as you say, it is only true for certain topics. Sadly, however, the internet in general is full of misinformation, disinformation, toxicity, and plain outright nonsense.
I am not sure it is worth repeating, but as I said, the notion that "the internet is scary smart" is only true for certain groups and topics, not for the internet in general. That does not mean there are no "smart" forums or smart people on the internet, nor did I make any remarks about YC News.
The internet probably isn't much smarter than any other community; it just has highly efficient communication on its side, which allows any nobody to easily call out a company and immediately reach a sufficiently large audience.
Of course this also allows misinformation to spread more easily.
I think it would be wrong to believe that there isn’t spin happening on the other side, too, and I don’t think that truth wins out in the end. This isn’t enlightened consumers vs evil, lying developers.
Sure, the internet will remember everything you ever said, but the question then isn’t whether what you said measures up to the truth, but whether what you said can be spun in a negative way for you.
Also, none of this is inevitable or even very predictable.
The most toxic, extreme and disgusting hate always seems to target developers. I don’t know if that’s a majority, but I do think that the mostly milquetoast, boring criticism of large publishers (yeah they suck, sure, what else is new) actually has the largest impact here, even if more people share that criticism.
Here [1] are the sales figures for various Call of Duty titles over time. Sales have been plummeting. In my opinion this is not necessarily because of the promise:reality dissonance, but it is clear evidence that people do respond to changing conditions, even in mass market entertainment. The reason this is obfuscated is that, say, sales drop from 30 million to 20 million. That's a critical decline, yet from a consumer perspective it's still 20 million people buying a video game. That's a really big number, and so from our perspective (lots of people buying a product, lots of advertising for it, so forth and so on) things stay mostly the same.
As an aside, the exact same is true in the movie industry. Ticket sales have been on a sharp decline since about 2002. [2] This is masked by similar effects as above, but there's also another one specific to the movie industry: they focus on "gross receipts" instead of ticket sales. Gross receipts = tickets_sold x price_per_ticket. Far fewer tickets are being sold, but ticket prices have gone up. So the whole narrative about Hollywood releasing increasingly rubbish films because that's what people want is clearly untrue. To be clear, this was also not just a clever rebalancing of price:demand; their inflation-adjusted revenues also peaked in 2002.
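To illustrate the masking effect with round, made-up numbers (not actual box-office data): nominal receipts can rise even while attendance falls.

    # Made-up, nominal figures to show how gross receipts mask falling attendance.
    year_a = {"tickets_sold": 1.6e9, "price_per_ticket": 5.80}
    year_b = {"tickets_sold": 1.2e9, "price_per_ticket": 9.20}

    for label, y in (("year A", year_a), ("year B", year_b)):
        gross = y["tickets_sold"] * y["price_per_ticket"]
        print(label, f"gross receipts = ${gross / 1e9:.1f}B")

    # year A: ~$9.3B from 1.6B tickets; year B: ~$11.0B from 1.2B tickets.
    # Receipts went up even though 400 million fewer tickets were sold.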
Hollywood's in decline, overhyped games are in decline. It's only our perception that's lagging behind.
That picture, while humorous, does cherry pick about 20 users out of an 8000+ group that literally anyone can join without invitation or qualification, some time after the pre-order period has ended.
In addition there's curiosity, the overwhelming desire to play with friends, and some people stream/review for a living. It's not like this was a coordinated effort with organized dogma.
Funny as hell for just an out of context picture though.
I even work(ed) on a game that was actually embroiled in a scandal regarding E3.
I can probably shed a little insight as to why:
1) Once you show gameplay footage, you’ve promised. This isn’t necessarily the worst thing in the world, but a lot of the time the “promise” is a mega-polished version of what you want to show: it’s on rails, it’s going to show all the best features of the engine, you control the pacing, etc. Even if those things make it into the game (my game actually was at 100% graphical parity, but we ran out of budget for animations), people /perceive/ it as worse, regardless.
2) What is shown isn’t actually a game, even if it looks like one. It’s a prototype. And there are many reasons that the final product has to be trimmed down. Either it’s required because the budget goes towards something else (such as smarter NPCs, or better animations), or it was never possible to produce what was shown due to things that were not known at the time of creating the demo or footage. The fact is simply that you’re showing something that may not be able to exist. So that is in direct violation of my first point.
3) We got burned last console generation: we thought the machines were going to be at least twice as powerful as they ended up being. (You may notice that my publisher’s “downgrade” scandal happened post-E3 2013.) What we had prepared was geared towards significantly more powerful hardware (so the number of props on screen can’t be as high, the number of light sources can’t be as high, and there are issues with streaming “pop-in”). My publisher won’t make this mistake again, and with a new console generation around the corner it doesn’t make sense to underpromise (and use the current gen, which our competitors may not have done this E3), or to overpromise and guess what the next console gen will have.
As an aside: I know it’s a trope that “console gaming holds gaming back”, and there’s a grain of truth there, but I would caution against letting developers use as much hardware as they can. Our work machines have obscene hardware (a latest high-end Xeon with 64/128 GB of RAM and an RTX 2070 is common); if we were not optimising for a limited platform then PC would suffer too. I just don’t know how much it is currently limited.
I forgot to add to point 2, and my edit is locked out:
One of the main reasons that things get bleached out of a game (menus, HUD, rich graphics) is accessibility or localisation. I think a lot of people only look at games in an able-bodied, Western way, but if you want to sell your game in Arabic then your menus have to support going right-to-left, and if you want your game to be sold to people with near-sightedness then you can't have dense darkness or heavy fog for most of the game unless it's removable (thus more testing scenarios).
For demos/proof-of-concept pieces: artists have free rein, until it comes time to actually address accessibility, discoverability or localisation concerns.
I think the points you listed are very understandable from an engineering point of view. On the other hand, there were also games whose E3 showings looked like downright lying. Not sure if that's acceptable, especially when industry experts are involved.
I wholeheartedly agree, there are examples of egregious behaviour.
But I'm talking specifically about things like Watch_Dogs and The Division, where it was never the intent to mislead by anyone involved, even the marketing department (which can be overzealous) thought they were playing it relatively safe, only for the situation to get dicey quickly.
Watching the Watchdogs "scandal" unfold was enough to turn me off Ubisoft for years. It's not that it was unique to Ubisoft, it was just the most grotesque case of under delivery I'd ever sunk money into. Especially when I had just come off GTA V, which was 90% of the same gameplay, but the engine and art were significantly better in every measurable way I could think of.
For a small time game dev like Hello Games failing to meet expectations with No Man's Sky, I was generally willing to accept their failure and apologies, and continued to go back and try games.
For a major publisher like Ubi, I ended up boycotting them for years.
Even when what you show really is what you have and what you will deliver, you won't escape being called a liar.
One AAA game I worked on released a trailer for E3 which we were really proud of. Our game looked amazing and that's what we showed.
The trailer was recorded in game. The only "trickery" was custom placement of some cameras and rendering to a 4K backbuffer and downsampling (which the final game does support on PC, anyway).
Despite the trailer being rendered in real time on a relatively powerful gaming rig using the actual game, the YouTube comments were flooded by people being angry at how this was obviously fake because it cannot look like that in the final game. There were even people explaining why and how this was obviously fake.
I really enjoy the infinite entertainment and presentation format that consoles provide, but I really question their utility next generation
and lol no PC gamers aren't a part of that equation whatsoever
I feel like the potential console gaming crowd is heavily diluted towards mobile and casual formats. Any of the good VR titles are also running in wireless mobile formats with good-enough graphics.
I hope there is a differentiating quality to make the 'set top box' format a staple in the 2020s instead of "just" better graphics. I feel like it is necessary
Nintendo has differentiated itself since the Wii and exited the graphics rat race. It really does limit the cross platform titles available, but thankfully Nintendo has their own great 1st, 2nd, and 3rd party offerings.
I think Microsoft is also deviating now. Microsoft tried to differentiate the Xbox One with Kinect and an online-only console, but reverted due to backlash. Now they're trying again with Xbox Game Pass and much more cross-platform/cross-play availability. They're almost positioning themselves to go up against Google Stadia. But they might revert again.
So far I only see Sony sticking with the traditional model. Their marketing for the PS5 and the PS1 is essentially the same: "We have more games that look better."
Sony has changed their console strategy; for PS1-PS3 they made everything weird and complicated to show off. This gave them lock-in and made exclusive games prettier, once people had figured out how to program them, but the consoles were super expensive and cross-platform developers weren't interested.
PS4 is just a PC with some useful extras; PS5 probably will be too.
You're right I have noticed that in all the developer interviews. I was speaking more to the marketing to consumers. It still has the same value proposition and brand as it has all this time.
It might be what the market really wants, or they might lag behind.
> They're almost positioning themselves to go up against Google Stadia.
Microsoft has been talking about their Project xCloud and has had Xbox Game Pass around for a lot longer than Google Stadia has publicly existed, so it's maybe more that Google Stadia is positioning itself to go up against Xbox.
I used to be a PC gamer, but switched to consoles in recent years because I can spend a few hundred $ and have a machine that's guaranteed to run a large lineup of games at acceptable performance (sure, it won't be as good as a high-end PC, but the console also costs a fraction of what a high-end PC costs), and has excellent exclusives (Bloodborne alone made buying a PS4 worth it for me). I will probably buy a PS5 for the same reasons: laziness/convenience, exclusives, and still costing less than a gaming PC (people often say it's not true, but every time I price a passable gaming PC, it works out higher than the few hundred that a console costs, especially if you buy near console release and not when the hardware is a few years older).
With PC the hardware is more expensive but your investment can last longer and adapt with newer GPUs.
Also you are not locked into generations and such. I own a PS4 and I can't play my PS3 games for example unless I buy the "remastered" versions like Wipeout Omega Collection. Don't know if it's the same in the Xbox world. I know Sony has claimed that the PS5 will be backwards compatible with PS4 games at least. With a PC this is a non issue. I can still play Half Life 2 on any PC and it is a 15 year old game from the PS2 era.
The other issue is game prices. With PC you get huge discounts. It's common these days to find 50% discounts on a 3 year old AAA game during Steam sales, sometimes even 75%. The Sony Store never gets great discounts, except on very old or not so great games.
IMO the only clear advantage of console gaming is that it's "plug and play".
Xbox has been trying its best to move to the "PC model" where they don't break past games if they can avoid it in a generation change. The Xbox One at this point has a giant library of Backwards Compatible downloads for Xbox (OG) and Xbox 360 games, and the promise for the next generation (Project Scarlett) is that 100% of the games that run on the Xbox One will run on it.
Admittedly, the Xbox One still has a number of games that, instead of showing up in the Backwards Compatibility library, encouraged repurchases of "Remastered" versions. Microsoft left that as a per-game decision to support BC or not, and whether or not to release a "Remastered" version. In some cases, Microsoft worked with the publishers to help upgrade would-be BC owners to "Remastered" versions for cheap or for free (though often only through time-limited deals and sales, so obviously not all owners benefitted in every case). Arguably one of the benefits in the case of some of the "Remastered" versions has been that publishers were given the chance to "upgrade" from custom PowerPC builds for the 360 to nearly the exact same x64 builds they run on Windows, further moving more of the library directly (and literally) to the "PC model".
Microsoft has a lot of good reasons to want the Xbox and PC to be friends and to increasingly apply the PC model to the Xbox. (Xbox Play Anywhere where you can buy a game once and play it both on PC and Xbox, often with cross-play between the platforms, is magical.)
The Xbox Store / Microsoft Store usually has good deals on par with Steam rotating every week, with the usual big sales around the holidays. They also generally have bonus discounts for gamers with Xbox Live Gold and/or Xbox Game Pass subscriptions.
I'd say, use a site/service to check prices and get notifications when they change. I use psdeals.net (no affiliation or such). There I have a list of games I'm interested in and get an email whenever the price goes down. That is for playstation, but I'm sure there are options for xbox too. Nintendo seems not as eager to do deep sales, disappointingly enough. Deep sales are imho great for those games you hesitate on.
On that site, I can also see price history to see what the game normally sells for, or what the common sale price for it is. Meaning, I'm less trigger happy just because I see a "sale!"-sticker.
Console generations typically last 5+ years too, though. If I buy a console, I know that I'll be able to play all new games for the console. PC will last longer, for sure, especially if doing some upgrades along the way (I had my gaming PC for 6 years, with a GPU upgrade along the way), but the cost of consoles is still much lower (without inflation adjusting, my PC cost 4x what my PS4 cost).
> The Sony Store never gets great discounts, except on very old or not so great games.
I disagree. I rarely buy full-priced games, so the bulk of my library was bought discounted, including a bunch of not-that-old AAA titles. Sure, if I want to buy most games within a few months of release, I'm probably out of luck, but it's never been "very old", a year or so at worst. But even if not, given that my (at time of purchase, mid-range) gaming PC cost 4x what my console cost, that's a lot of money to spend on more expensive games ;)
Laziness is my reason for being a console gamer as well. Over the last 15 years I've dabbled in and out of PC gaming, usually spending the money every time I buy a computer to make sure it has a fairly high-end graphics card. The end result is usually that things will go great for a while, then I'll try to play some game that will either not run properly or have some weird issue with my setup, and I'll get frustrated and head back to console. My day job is figuring out why computers aren't doing what they are supposed to; at night I just want to be able to stick a disk in something and have it work correctly.
The Last of Us series, with the upcoming 2nd installment, and Uncharted, among others, are just amazing. There are amazing exclusives for Xbox too. Way to go, brother.
A few more PS4 exclusives worth mentioning: God of War (2018), Horizon: Zero Dawn, Detroit: Become Human, Spider-Man, Shadow of the Colossus, The Last Guardian, and of course I've already mentioned Bloodborne.
I've been a bit disconnected from the Xbox exclusives this console generation, so I likely missed some fantastic games there.
It seems that Epic has begun offering studios a lot of money to port former exclusives to PC to sell on the Epic store; recent examples include Journey, which was an exclusive for the PS3, and Detroit: Become Human, which is listed as coming soon.
We have since learned[1], after the disastrous launch of this game, that this video was how even the dev team learned what the game was supposed to be about. The game did not even exist yet as an official entity until after E3 2017 (heck, it was going to be called "Beyond" until the week-ish before E3), and the team then had just over a year to build and launch it, with predictable results.
Is this the same Anthem that’s plastered in ads all over Origin? I thought it was just meant to be EA’s knock-off version of Fortnite/PUBG? It actually looks like it would be kind of cool as a single-player game; I wonder why they went through the effort of making it multiplayer if they only had a year to build it?
You may be confusing Apex Legends (published by EA, made by the same people who did Titanfall) which is a rival to Fortnite & PUBG, and Anthem (published by EA, made by Bioware) which is a rival to Destiny.
I thought Destiny 2 was already dead? It seems it still has its following, although I always thought Warframe was the leader of looter shooters, even though these aren't my type of games.
It had a fairly major resurgence after the "forsaken" expansion was released. Judging by the size and activity level of the two separate subreddits dedicated to it totaled together, quite a lot of people.
I've yet to see a game that's as smooth and polished as Warframe. I've tried like 10, and I'm amazed how a small studio like DE basically raised the bar for everybody else.
That is sadly true. My wife and I spent 3 months in a haze last year while starting from scratch and it was fun but it eventually gets to you.
Truthfully though, I'm not bothered. I've played very little in the last 9-10 months (not only Warframe but in general) and only when I'm in the mood. If I miss out on progress then oh well, much worse things can happen. :)
That sounds like marketers deciding the direction of the product.
I guess you could relate that to the Steve Jobs quote about how when product focused companies get their decisions made by the sales/marketing team, they begin failing.
If developers made the call they probably wouldn't do any marketing before the release of the game - "Just wait until it's finished, otherwise the marketing material might be inaccurate!"
Their game would fail, but they wouldn't know why.
So the gaming community reiterated loudly and clearly that it doesn't like being lied to with "gameplay" demos crafted specifically for E3 that share no code with the actual game being developed (looking at you, Halo 2), and the devs' takeaway was "Don't show them any gameplay."
I would hope we could collectively rally against this sort of behavior by refusing to buy games that refuse to preview any actual gameplay. But I doubt that will work, given that Dead Island sold five million copies[0], based on what I can only assume was infatuation with the trailer.
The issue is that it is presented as if it is actual, current gameplay.
I don't think anyone would have an issue if there was a giant watermark that said "CONCEPT" across ones that are nowhere near being representative of the final product yet, but it's obvious that these companies are being intentionally deceitful.
I don't think anyone would have an issue if there was a giant watermark that said "CONCEPT" across ones that are nowhere near being representative of the final product...
I'm struggling to think of a diplomatic way to express how naive I feel this statement is, so I'll just leave it at that.
The online community doesn't even need real reasons to be angry; they'll make them up if they have to. It's somewhat ridiculous to suggest they aren't capable of ignoring a "CONCEPT" watermark to get riled up about "broken promises".
> The online community doesn't even need real reasons to be angry; they'll make them up if they have to.
The online community consists of millions of people; of course some of them will always be angry. However, when you have millions of people hating you and very few defending you, there is usually a good reason for it; otherwise they wouldn't unite like that.
Example: Nintendo announcing that the first game to use their paid-subscription Pokemon storage system won't actually be able to use all Pokemon, and that no game in the future will either, meaning that some Pokemon people paid to store and reuse later will never be usable again. This led to a lot of hate for the game, including some over-the-top criticism of the graphics which otherwise wouldn't be an issue, but now most of the energy goes towards hating the game, so nobody bothers to defend it.
However, when you have millions of people hating you and very few defending you, there is usually a good reason for it; otherwise they wouldn't unite like that.
There is usually a "reason" people get riled up online; whether that reason is "good" is usually pretty debatable to most rational people. And claiming that any "good" reason can be stretched into all sorts of other criticisms, as you state, is an even shakier stance to take.
Personally, I'm not convinced that the online gaming community is worth caring about in the aggregate. The community is so dwarfed by people not interested in participating in arguments about their hobby online that the separation between people who have legit criticisms and people just looking to be angry about something isn't meaningful.
Say you have a million players for a game you've created, and 30,000 of those players participate regularly in online discussions, with 25,000 of them being reasonable and 5,000 being nonsensical (and the 5,000 is largely going to be at least the same if not more vocal than the 25,000). How much time and effort do you really want to spend trying to separate those 25,000 from the 5,000 given your playerbase as a whole?
The tree (and background) is from the new Pokemon game, yet it looks perfectly at home in a game released nearly 21 years ago. Game Freak has been pumping out cut rate Pokemon games for years and they deserve every last bit of the hate.
We might get a case study in this when Cyberpunk 2077 launches (possibly early next year!). CDPR, initially hesitant to reveal to the public the gameplay slice they produced for the media at last year's E3, eventually caved and released it with a prominent disclaimer watermark across the top of the screen:
"WORK IN PROGRESS - DOES NOT REPRESENT THE FINAL LOOK OF THE GAME"
As well as having similar messages reiterated by the narrator throughout.
Previously CD Projekt Red was roasted for perceived "downgrades" in The Witcher 3, notably in lighting which CDPR had to change to accommodate the day-night cycle in the game.
That said, hype for Cyberpunk 2077 is incredibly strong right now. We'll get to see whether players really do take disclaimer messaging into account when the final product launches, no doubt with "downgrades" evident in many facets of the game.
> You seem to think it's a very sinister practice but imagine making a video of what the product you're working on now will look like in two years.
The solution to this is showcasing your product when it's much closer to release. I'd welcome the idea of revealing games months before release and some developers already practice this approach, like Borderlands 3.
Hell, I'd even go as far as suggesting a true announce-when-it's-done approach: take something like Cyberpunk 2077 - just imagine Keanu Reeves mentioning that it's available... today.
I think this is fair for games that are a long way away, but I often see games for sale right now where all the videos on the Steam store page are fake renders.
We as software engineers aren't great at predicting what we'll have in two weeks.
I think that's because our development is supposedly agile and we iterate a lot because we don't know what we're actually building very well (that's not a bad thing). Iteration in game development is much slower and lends itself to a waterfall approach because the developer is the client most of the time. In theory the game should look like a very close approximation of the final version years before its launch.
> Iteration in game development is much slower and lends itself to a waterfall approach
Given that much of game development is an iterative creative process, you would hope that iteration is much faster than in other development work. Maybe not for the programmers, though.
I don't see how it's any more suited to waterfall than any other industry. Sounds like something game devs tell themselves so they feel better about their bad habits, like how crunch is necessary or how automated testing is too hard.
Game devs do not tell themselves that games are suited to waterfall; in fact, it's the opposite. It's the "testing is hard" argument you mentioned that is much more common.
Do you mean it's way more waterfall in practice, or do you mean that it's much more suited to that style of development process?
If the former, then I would say a lot of large companies end up that way (they tell themselves they’re “doing agile” but in reality are really just doing some form of modified waterfall). If the latter, then I will ask: why is that?
Waterfall might make sense if you know up front exactly what you will build, which as an engine developer may well be the case, but for anybody else, I don't see how a more agile approach with short iterations and tight feedback loops wouldn't produce better results.
In software development, 1.0 is the version number you assign to what's ready at the end of the last day you are permitted to continue working on your release.
Before online gaming, games were required to deliver a 1.0 and had no further chance to patch or improve it. They would do so in silence — like Nintendo does — and essentially work for years on things that no one even knew existed until they released their 1.0 gold master and began producing marketing material from it.
Nintendo continues to follow this policy today — we get demos when they've finalized the game, and not before — and Nintendo refuses to market games until they've finished them, so we get years of "will there ever be another Super Smash Bros" and "OMG TELL US WHAT'S COMING" which they silently ignore.
Now you declare openly that game developers who tell us what's coming while it's not yet finished will be punished if:
1) What's not yet finished doesn't match what's finished on 1.0 day in any way that is personally meaningful to you and/or anyone;
and/or,
2) What's announced is announced only with a complete finished 1.0 product from which marketing materials are exclusively developed and provided immediately upon announcement.
So, you're championing the Nintendo model, which is fine — we will hear nothing about any upcoming games until they're ready to be released, and if that means years of silence from the owners of properties such as Halo, GTA, Pokemon, and so on — then that's completely acceptable and we should expect to hear no complaint from the gaming community as a result.
I am not led to believe by your response that "we will hear no complaint" is the outcome that will occur if and when the entire gaming industry complies with your demands by stopping all pre-release announcements altogether, which leaves me wondering: How, precisely, do you expect a valid outcome to occur from your requirements that both satisfies your demands without triggering complaint over the lack of updates over time from studios who are busy complying?
EDIT: To pre-address the most obvious reply — "they should only demo what's done" — I guarantee that nothing is ever done until the final day of development, as critical features can be cut literally hours before gold master if there's an undiscovered flaw that QA discovers at the last minute — so since "nothing" is ever done until the final 1.0 release, "they should only demo nothing until the final 1.0 release is done", and we're back to the above question.
"Lie to us" and "don't mention the game until it's done" are not the only two options. There's plenty of middle ground, including "show us a cutscene video that is obviously a cutscene video" like Cyberpunk 2077 did. Don't say "this is actual gameplay" if it's not actual gameplay. Don't show videos simulating actual gameplay if it's not actual gameplay. Don't have people on stage mimicking actual gameplay while a video plays that is not actual gameplay.
"Don't lie" is a very low bar, and there is a lot of room to exist above it.
Changing gameplay between the E3 demo and the final release isn't the problem; it's when the preview gameplay is of vastly better quality than the final product. In the 12-24 intervening months one would expect a project to improve in quality and focus, not to devolve.
It's the curated experience intended to oversell the state of the current offering (i.e. pre-release) that raises gamers' ire.
Have you worked on a product that has had to show a vertical slice 2 years before the product is released? I've worked on a few E3/E3-like demos for games and there's a whole bunch of issues. It's not always quality - things that were considered "easy" or "we can solve that later" can come back and bite you. Things that are fun in a vertical slice demo might not actually be engaging for more than 15 minutes or so (or an hour, or three). It can be _very_ difficult to extract a 10 minute piece of gameplay that you have some context for to show people, even if you have the entire game made, which means you might have to put in some temporary "features" (add a power/remove a pathway) to make the experience fit. All of these changes can have a butterfly effect. Adding X to simplify a demo can make Y seem more important than you want it to, which means Z is now redundant, but you've already spoken about Z.
There has to be some realisation that things change in games over a 2 year period (heck, things change massively over a 2 week period in some games), and some acceptance of that fact.
If you want a good example of what people mean, here's a comparison of the trailer for Atlas, the new game from the Ark developers, and actual launch gameplay.
Atlas was so bad that people actually found menus from Ark in it, i.e. it was a skin. They actually left promotional models in the production game, so at one point cheaters were attacking other pirates with WW2 fighter planes left in from a demo.
Don't use a video that promises things your players won't get.
Years (decades?) ago, it was standard practice to include a watermark text that said something like "Alpha version footage, not representative of final release" in trailers, and everybody was ok with it. It was honest and straightforward.
But the AAA game industry doesn't do honest anymore; they do marketing instead.
Halo 2's E3 presentation could not possibly have been turned into a real game; it was built from the ground up to deceive, with the intention of throwing it away afterwards.
How is that good intentions? Halo 2 is not an outlier.
Okay, but that means that the era of vertical gameplay slices is effectively over, more-or-less exactly as described in the article. It further means that if a lack of gameplay demonstrations at E3 has left you feeling a bit unfulfilled (and maybe it doesn't, but it's a common takeaway among players following the event), then you only really have yourself to blame.
Even the ones that do have gameplay lie or exaggerate the limits of their game (looking at you, Todd Howard, Peter Molyneux, and Sean Murray).
It's simply marketing spin, gameplay or no gameplay, which is why you can never really trust such hyped-up demonstrations; they can only tell you about their product, not show it. That also applies to further interviews on various news outlets.
I don't know, though; I think it has to be done with a LOT of trust between the creator and the public. CD Projekt currently has that trust with the fan base; however, Elder Scrolls 6 [1] last year was not welcomed when announced with only a cinematic trailer, given how Bethesda has behaved in the past.
I think another view is how the FF7 remake has been treated for years since its initial announcement. There was hype, skepticism, and finally support.
This is particularly relevant today, where a whole group of players have decided that Pokemon Sword and Shield will have terrible graphics, based on the texture of a tree in a single E3 pre-release clip.
Random weird opinions will always exist. Remember the water puddle issue in Spider-Man? Most likely not, unless you actually read the reports about it. And in the end it was entirely irrelevant.
Random weird opinions, amplified using crowd-coordinated social pressure techniques, are what’s driving them to stop doing gameplay demos in the first place. Attempting to minimize the minority’s opinion when that’s literally what led to this is a bit odd.
Also copy-pasted 3D models from previous games, flying iron snakes, and floating seagulls. And they didn't even bother copy-pasting all of the models from the previous games, so people are really in the mood for nitpicking.
> Before online gaming, games were required to deliver a 1.0 and had no further chance to patch or improve it. They would do so in silence — like Nintendo does — and essentially work for years on things that no one even knew existed until they released their 1.0 gold master and began producing marketing material from it.
There was some pre-release publicity for many games in development before online gaming. Peter Molyneux was working the press with his famous enthusiasm before 1990: https://www.youtube.com/watch?v=DAuvaAGnqQw . Maybe Rise of the Robots was the first real debacle of pre-release hype, but it was probably as bad as anything since: https://www.youtube.com/watch?v=E4udiQq5gpY .
You don't remember Nintendo showing everyone on E3 a few years back the new Zelda game and then when it was released as Breath of the Wild it looked far inferior to that? https://pbs.twimg.com/media/DmLfhhgXsAMMwbp.jpg
I would certainly champion that Nintendo model. I feel like people are even willing to pay a premium for it. Nintendo is to video-games what Apple was to computers through about 2014. I seldom play video-games, but my outsider's perspective is that Nintendo is the only major company which puts serious effort into making its products appear consistently polished and professional.
Also like Apple, Nintendo has this reality distortion field that causes people to excuse all their shortcomings and elevate mediocre products to exalted status. It's probably their single greatest advantage over their competitors.
CD Projekt did an amazing job with Witcher 3, but the fact is that Witcher 2's E3 video is one of the more infamous examples of graphics being severely downgraded once the final product ships. Witcher 3's E3 video contains very little gameplay, but the cut scenes it includes have noticeably worse graphics in the actual game. Gwent's E3 video barely contained any gameplay at all, and it wasn't representative of the released game at all.
CD Projekt retained trust because Witcher 3 is an amazing game - not because they delivered games with graphics matching their E3 videos.
> the fact is that Witcher 2's E3 video is one of the more infamous examples of graphics being severely downgraded once the final product ships
I just watched a W2 side-by-side comparison and I really don't see it. Yes, some textures look less sharp in the release version, but only some. The rest seems to just be a colour grading thing. In any case, the two look so similar to me that I don't see what the controversy was about. Also, the enhanced edition, which everyone who bought the release version got for free some months later, included a ton of graphics improvements which make W2 IMHO surpass the E3 demo's visual quality easily.
Of course -- in most cases it's about texture quality and the quality of the lighting effects, and I honestly believe many E3 shots look better because they're taken at a point where the game is done enough to show off, but before the QA team starts reporting things like bad performance on 95% of consumer hardware (or, in Witcher 2's case, the Xbox 360) or that the game is too big to fit on X discs.
I was just pointing out that CDPR retains their reputation based on the quality of their games, and not because they deliver a final product that matches gameplay videos presented at E3.
The Xbox version came out about 9 months after the PC version, so while they designed it with consoles in mind, I doubt much of the "downgrade" on PC was because of the console version.
> I was just pointing out that CDPR retains their reputation based on the quality of their games
Yes, absolutely. CDPR aren't immune to criticism and get plenty of it; they just deal with it well and don't suffer from it because they've built a reputation for amazing games and for being consumer friendly. Most of us can overlook the criticism since it's quite minor in the grand scheme of things.
In all honesty, I don't know how anyone can make $60 purchases on games without seeing what they look like. I can't comprehend it at all. Pre-ordering was never even a question for me, even before all of the recent issues with it. I just can't understand the mindset for anything other than cult-like franchises.
I can only imagine, with contempt, the cube jockey who figured out that distributing game demos was bad for their bottom line.
I can relate to this, and feel the same way about hardware, like an iPad or PS4. That's why I almost exclusively buy from vendors that take returns regardless of the reason; like Amazon, Steam and Apple. Now the full product is the demo and I can't remember the last time I spent money on a product I didn't want to spend money on.
Pre-ordering was, before digital distribution, a way to ensure that you got the game on day 1.
I grew up in a tiny little town in Texas where the nearest, non-Walmart video game store was 45 minutes away - this is back in the tail end of the NES, the middle of the SNES, and the dawn of the PlayStation era.
The store I went to would often only get between five and ten copies (beyond what was pre-ordered) of a video game on launch day, with a typical waiting period of two to three weeks to get more copies.
I vividly remember pre-ordering video games that sounded interesting, some for as little as $5 that could be transferred to another video game pre-order if I changed my mind. Back then (and still today for physical games in some stores), you could pay the balance when you picked up the game.
Today, aside from bullshit pre-order bonuses that the video games companies make exclusive to each retailer, there is very little reason to pre-order - Steam/Origin/GOG/Epic Game Store will not run out of copies - they can copy the bits an infinite number of times. In fact, in the case of some flops (looking at you, Fallout 76, with your nearly 50% price cut within a month of release), it makes no sense to pre-order.
The real mind-melter for me is Kickstarting games. Crowd funding is nice and helps projects that would otherwise not be funded, but why would you want to pay for a game that isn't even completely conceptualized yet?
> In all honesty I don't know how anyone can make 60$ purchases on games without seeing what it looks like.
It depends on the game, I'd say. While I'd often like playable demos, I try to inform myself using what's available (videos - not trailers - and previews) and decide whether it's interesting to me (but again, I don't really get most of the AAA games, although some of the ones I own could be classified as such).
Also, at least releases in Japan and in Asia have actual perks with preordering (and by actual perks I don't mean useless digital items), although that comes as an additional cost.
Some studios are also able to keep making great (if not their greatest) titles. I do think, however, that AAA games are generally too broad in scope and have a much greater probability of failing to keep their reputation and/or live up to expectations.
The problem there is that there's almost zero chance the game you see at E3 will be the exact same game that gets released 6 months down the line. Sure, you could just stop development, but then you're wasting 6 months you could be spending making the game better. That could mean adding new mechanics or story, but sometimes you decide that a really cool thing just doesn't fit into the story you're trying to tell, so you cut it and it's a better game for it. It feels like a have-your-cake-and-eat-it-too demand.
That's not to mention the industry feedback you get that you could be incorporating.
The hardcore gamers in E3's audience can and will smell BS. The problem is that generally all developers have to go off of is the vertical slice, usually created right around E3 and before a lot of the game's systems are in place. If you go and tell the whole truth about a game in the development stage, it's like showing a trailer for a movie without a lot of the VFX actually completed, or even worse. The business wants to know how much hype they can build around a game, and how much money to spend on marketing. And the big titles tend to treat their audience as idiots, because the marketing department tends to treat them as such. There's a better way, but it goes against how so many companies, especially big publishers, do things.
I thought Dead Island was fine from a gameplay perspective, which is what I bought it on, because I learned early on in life that you shouldn't buy games before reviews drop.
What? I don't feel owed gameplay demos at a trade show months or years before release. I'm fine with one a month out or not at all, they're selling me a game not nine months of anticipation.
Just wait until it is out, read reviews, and watch some YouTube videos, then make your purchasing decision. What do you need a pre-release gameplay video for?
Pre-release gameplay is primarily there to fuel the demand for pre-orders. As I mentioned in a previous comment on this article, this is a vestige of the days where limited supply of video games was more common (due to physical media shortages, unknown demand, and the desire to have a bit of word of mouth and exclusivity drive some of the marketing) - with digital distribution for practically all games, unless you want a super special limited edition, there is no reason to pre-order - wait for the game to release and watch some YouTube/Twitch of the game and make up your mind, as you have said.
> the dev's takeaway was "Don't show them any gameplay."
The average developer has almost no input into what is and isn't shown at E3. They can provide some source material, but E3 decisions like these are made by a marketing company or the publisher.
I would trace it to a few factors; most important is the rise of the influencer advertising strategy.
It is vastly cheaper to simply pay streamers to hype up your game on Twitch or YouTube than it is to give an honest, sober demo live on an E3 stage. The audience at these press events has shifted from enthusiast reporters and critics to hyperventilating influencer personalities. It was so bad this year that they were obnoxiously screaming and interrupting the presenters every few seconds, to the point where the people on stage were losing their train of thought.
The other aspect is that putting together a demo takes time out of development. Almost every game shown this year had a release date of either Fall 2019 or Spring 2020. That means they're either on crunch or ramping up for it and can't afford to set aside a few months to make a vertical slice for E3. This is partly due to the end of this console generation with new hardware coming next holiday season.
The longer term trend is that marketing has caught on to the irrelevance of E3. They can run their own Nintendo Direct style live stream whenever they want to speak to their audience and the press will disseminate that info to the wider community.
You're even seeing companies like EA experiment with dropping a new title with zero advance notice as they did with Apex Legends (their take on battle royale, from the Titanfall developers). They just had an influencer preview event the week before and dumped it to the public with pretty wild success.
The average build for a AAA game coming out in 2020 is barely in playable shape and certainly not ready to be shown at E3. Making one polished enough to show off is both time consuming and not representative of the final product. I really wish people were more open to seeing blocked out levels and unpolished animations to have an idea of where these projects are going but unfortunately even the most pristine presentations get picked apart online so I don’t think that’s happening anytime soon.
That's more or less what Steam Greenlight was, and one reason why I love it, even if it goes by a different name now. I can pick up something interesting, play around with it, and then catch up on its progress over the years.
I've done this with Subnautica, Starbound, Rimworld, and The Long Dark. I don't regret going on a journey with the developers at all.
Yes, Steam Greenlight (whatever it's called these days) is a cute idea, but for every gem like the games you mention, there are hundreds if not thousands of crap, minimal-effort "games", many of them in a barely working state, which are released there just to make a quick buck or two, either via direct sales or via Steam card sales.
You're talking about early access. Greenlight was just a community based moderation process to allow anyone to submit their game without depending on a well known publisher. A greenlit game doesn't have to be in early access. Nothing prevents someone from submitting a completed game.
One of the things that I wish was available to customers on a regular basis is the playable demo that was pitched to a company for a game that gets picked up. An example of this would be id software's demo for Super Mario Bros 3 on the PC[0] (even though that didn't get picked up).
> The average build for a AAA game coming out in 2020 is barely in playable shape and certainly not ready to be shown at E3.
Yet Nintendo let people play Breath of the Wild at E3 2016 and it came out in 2017. Everything from that trailer are things from the actual released game as well: https://www.youtube.com/watch?v=6LIq8ryhG9c
So some companies clearly manage to do this. Another example is Blizzard; they typically have WYSIWYG games as well.
Blizzard and Nintendo are also the companies that have scrapped the most games that I know of.
Even if you dislike their games, you have to admit they 1) have the best polish, 2) always communicate clearly what's in the game. If I don't like a Nintendo or Blizzard game I would know much before I buy it.
Overwatch was a buggy mess in its first year and had some thoroughly weird input handling. SC2: WoL and a previous season of LotV had the multiplayer balance most skewed towards a single race in the franchise's whole 20-year history.
> always communicate clearly what's in the game
Like building up hype in the Diablo community ahead of BlizzCon 2018, only to ask "You don't have phones?"
Other companies give you a 3 minute video to get you hyped about a game.
Nintendo gives us 3 minute videos apologizing for a previous video announcing that a game was in development and that they're restarting development because they don't like where it was going.
One of these companies is worth your money, the others aren't.
>where at times developers appeared afraid of their audience
Good. The amount of abuse the gaming industry has heaped upon consumers the past ten years is almost beyond belief. Microtransactions. Day-1 DLC. Season Passes that don't include all the DLC. So many versions of games you need a literal spreadsheet to keep track of how to get what feature. Broken piles of shit that rely on a "90-day roadmap" to tell us when they'll be playable. Using gambling mechanics to make children rob their parents to pay for loot boxes. The list is endless. Every second game dev could have night terrors for the next decade and it still wouldn't make up for how they have mutilated their own art form in the pursuit of unlimited profit. Almost every major games publisher, from EA to Activision-Blizzard to Ubisoft to Warner Brothers has behaved absolutely reprehensibly and in a just world would be burned to the ground by an angry mob.
> Good. The amount of abuse the gaming industry has heaped upon consumers the past ten years is almost beyond belief. Microtransactions.
What's funny about this is that part of the press actually encouraged microtransactions and what not when they were still not widespread (I remember a piece on FFXI/XIV arguing to drop subscriptions and go microtransactions).
Cue a few years later, everyone in the press is screaming about how microtransactions and pay to win ruin everything. Quite ironic.
I believe this cannot honestly be blamed on artists and developers. These are just people being exploited by corporate practices. The only real boss of a publicly traded company is greed. And even a CEO isn't really free anymore in today's world.
It's a mystery to me how the gaming industry exists in its current form. I subscribe to Humble Monthly and occasionally buy a Humble Bundle. This all costs me $20-$30/mo and I get my hands on enough good games to last me a lifetime. I very occasionally will buy a cool looking new indie game, usually for $30 or less, when they come out. In general if a game looks like it has some interesting/novel gameplay I'll give it a shot. Most of them come from small indie developers, aren't super well known, and will run just fine on a 7 year old gaming laptop.
Meanwhile there is this enormous multi-billion dollar world of dozens of AAA publisher FPS titles that all look basically the same. People apparently line up in droves to pay $60 for these titles and the publishers are now so confident about their revenue stream that they don't want to release gameplay footage before they sell you the game? The #1 thing I do before buying a game or installing one I already own (since I own hundreds, thanks to Humble) is look at gameplay videos on Youtube. No gameplay video = not touching it.
What insane alien planet is this? It's certainly not the Earth in the dimension I live on, and it sounds much lamer.
Setting aside the Skinner Box aspect of the gaming industry for a moment, I think you're missing the social dimension of gaming.
A lot (if not most) of these new AAA games that are coming out have a strong social aspect. The game is a way to hang out with friends, the same way going to see a movie in the theaters might be, or checking out a new restaurant, or trying a new board game, etc. The novelty of the new game helps to decorate the shared experience.
Partially that, but as an example of the social aspect of video games: my college friends and I have scattered across the country for work and relationships. The cheapest way for us to stay in touch is a weekly video game night. We play old stuff, emulator stuff, new stuff we all like, all on PC connected over Discord for voice and occasionally video chat. It is our primary way of socializing with our friend group, often digressing from the video game we're playing into what is going on in our lives.
It's much cheaper than plane tickets, we can all drink the booze we like and have whatever dinner we want, all for the low price of one $60 game a quarter, at most.
While I do mostly the same (except I'm cheaper), what you said implies you live in a bubble. Same as TV presenters belittling MMOs or Twitch for being fringe, despite both of those being a bigger market than their own shows (ironic).
I'm personally very glad the industry exists in its current form, even though I don't participate. It means that in 5 to 10 years, I'll have great, beautiful games I'll be able to play for cheap (lately I enjoyed Vanquish, for example, which used to be a AAA game). They're contributing to the growth of the market, which means more indies, whose games I enjoy, get to try their luck. They contribute to the industry's improvement.
>Meanwhile there is this enormous multi-billion dollar world of dozens of AAA publisher FPS titles that all look basically the same.
I'm also not someone who buys many new titles unless they are offering something beyond the norm, but what fps titles do you mean?
I guess I'm a bit disconnected from the hype cycle, but I'm struggling to think of dozens of AAA FPS titles released in the past year. It's mostly Battle royale games that are FOTM, and there are only 3 AAA titles that do this that I can think of. (And the answer to those ones are: kids.)
That said, I agree with you that there is a lot of rubbish being released prematurely without gameplay vids and embargoed reviews that rely on preorders and the uninformed to purchase before people catch on.
Many games are sequels; if you've played the previous iteration, you already know what the gameplay is going to be like. Also, all the review sites that get early copies will have their own gameplay videos out by release day, so you can easily look those up before purchase.
The problem is publishers are not managing expectations. If you're showing gameplay footage of a product that is still a year out or more in development, you had better follow up with new trailer releases as major elements of the game change. Look to [Hideo] Kojima Productions' Death Stranding for an example of hype tempered with gameplay footage that is released in drips and drabs as development of the game progresses. You can get a realistic sense of how the game will play.
The corresponding drawback of trying to reap the business benefits, preorder- and hype-wise, of an overly ambitious or misleading (intentionally or otherwise) vertical slice demo or trailer is the backlash and loss of consumer trust when you under-deliver or cut back.
It can often be really hard to make a demo or trailer that isn't a mess without polishing it as much as possible. Development is messy. But sometimes you really can say "that's a lie" or "no, it [game feature] won't do that", and those are the cases that exhaust the benefit of the doubt.
The worst kind of lie is a lie mixed with true facts: even if we believe Ubisoft that they didn't downgrade quality, this article should not use The Witcher 3 as an example - there's even an official patch to restore the downgraded quality, and an official explanation of "we just tried to make the game look the same on consoles and PC".
The gaming community has been trained not to trust publishers. This article tries to say it's "because gamers are too stupid and suspicious", but in reality, as usual, it's because publishers lie too often.
Ubisoft didn't say that they didn't downgrade quality, they said “the notion we would actively downgrade quality is contrary to everything we’ve set out to achieve”
Sometimes you set out to achieve something and you fail, ending up with something that's contrary to what you set out to achieve.
Avoiding the topic at hand, which it sounds like gamers brought on themselves, I've got to say that that Doom screenshot brought back some memories.
I remember way back whenever, being excited to pick up a copy of Doom 3 for my new top-end system to show off what it could do. And being amazed that Id had spent so much time and effort to build this incredible engine to show effects that you couldn't ever actually see.
Because they'd chosen a color palette consisting of only Black, Dark Grey, and Dark Brown, and set the game in the dark.
It's nice to see they haven't backed down from their stance that "Nobody should be able to see anything in our games". In the linked screenshot, the entire world is literally on fire and the things standing in front of you are still too dimly lit to see.
Doom 2016 was only dark enough to be atmospheric. Important things like incoming fire are bright and loud.
The muted backgrounds and bright targets are needed though. Unlike Doom 3, Doom 4 and now 5 are so fast paced you'd have a bad time if the whole game looked like a pride parade.
> In truth, gameplay demos at events like E3 are subject to change. They’re often built as vertical slices – self-contained examples of what a sequence will look like once every feature and asset in the game has reached its highest level of polish. They can help studios accurately schedule the rest of development.
Yeah, but they never deliver. They show a carefully remastered video and then release a game of worse quality.
It's not about what an E3 demo could be used for. It's not a corporate event; it's a consumer show. Show us gameplay.
It's kind of a running joke. Any "in-game" footage shown at a conference like E3 is almost guaranteed to be running on a high end modern PC even if the game itself will be released on a five year old console.
It's no coincidence that the OGs who focus on gameplay -- Nintendo, id Software, Rockstar -- show the least BS.
They've all had bad experiences in the past when they demo'd graphics, and learned from it. Nintendo's 2000 Zelda video sent expectations the entirely wrong way for Wind Waker. Then there was id's Doom 3 mishap (no one wants a sneaky, shadowy Doom Guy).
CGI trailers and other filler content don't build hype to nearly the same extent as gameplay does. Don't forget that the lack of gameplay shown was one of the reasons this years E3 was considered a big disappointment by most.
The reason Sony's 2016 E3 press conference is widely regarded as the best press conference ever, in addition to the live orchestra, was that they blasted the audience with long gameplay demo after gameplay demo. It had very little filler speeches by executives and few empty CGI trailers, and it made a big difference.
It seems to me that a brave decision would be to let gamers into the dev process openly - a regular vlog-like approach by an optimistic old hand showing skin texture experiments or changes to explosion algorithms and so on - not lying, not cut scenes... just openness.
But then again, a couple of cut scenes every six months and secrecy otherwise is probably safer.
Please. John Carmack did that, and then he spent ages trying to teach people about the process of development. Everyone assumed that he was just a greedy bastard who didn't want to hire people so he could keep the money for himself, and that if he wanted to give them the game faster he'd just hire more developers.
I'm pretty sure I read a plan file from the '90s where he wrote a massive screed against this Mythical Man Month shit. And this was John Carmack, actual mythic hero. No name game developer will get super shit on.
I've followed a couple games like that, and the end result is disappointment. Every feature that didn't work out feels like something I really wanted. So, really, it's a lot of effort on the developer's part to more thoroughly disappoint the potential buyers.
In my outsider's opinion, spending that time focused on building something fun without the outside world intruding would be better. When it gets to the point of only needing polish, then start showing it to the world.
The problem with most popular games is that they are not made by hardcore gamers but by cash-grabbing businessmen and huge companies (you can find out more here: https://bit.ly/2ILPdHw). Most indie games, if not 99 percent of them, die unseen. Fortnite, for example, although popular and addictive, can only hook people who are new to such mechanisms (to games similar to Fortnite) or 8-year-old kids who have never played. But does it make money? Oh boy.
E3 is a marketing event for big publishers, if cinematic trailers create more consistent hype than gameplay trailers then that's what they will do. Critics very frequently point out the formulaic, rehashed gameplay of big budget titles and they probably want to limit that.
Every E3 presentation should be able to use as much on stage acting and audio as needed but limited to a single still frame of 80s style NES box art of the game.
The main issue is that sometimes the entire team would stop to produce a demo for E3 or other shows, which in retrospect would delay the game from actually coming out. Some would spend so much time making sure their demo looked good that it became an engine show-off, and they knew the game wouldn't ship complete like that.
Games are so high quality these days, but gamers are insufferable. I can't read any content about games any more because it's hysterical children screaming at the top of their voice about some inconsequential difference from some concept video.
There was an interesting quote about how demos are expensive to produce - I think this is something a lot of people dismiss. People seem to think that a demo is easy: just show what you have right now! But a lot of the time it is a huge distraction from what you would otherwise be working on.
The amount of turd polishing that goes on for an E3 demo is massive. I had a friend that worked in the AAA games space, and he said that between half and three quarters of the team would halt work on moving the game forward and focus for three to four months on polishing the demo shown at E3.
I don't get it. How do players feel that something was stolen from them? Do they make their purchase decisions based on a trailer aired 2 years ago?
There are tons of journalists and streamers out there who review finished games. At that point it is pretty obvious how the game actually looks and plays. If the preview were anything like the real game, they would be shipping it.
Those covers were obvious cover art. That's completely different from showing things that you could reasonably expect, as if they were actually included.
Exactly. I just looked at the tape cassette of Defender for the Sinclair ZX81 (circa 1982): a nice color illustration on the cover, and a black and white 64x44 pixel game inside.
However, I didn't see a higher-quality B/W gameplay movie of that game before buying it. People eventually tricked the ZX81 into displaying 256x192 pixels [1] [2], and if I had seen that kind of footage in a commercial, bought the game, and gotten the standard 64x44, I'd be pissed off too.
You could make the argument that this behavior precipitated the '84 game crash, of which a major cause was consumers reducing their purchases and expectations after experiencing low quality products.
In any case, it took a big downturn, so the industry agrees/agreed with you, contrary to the downvotes. Big publishers went off and did their own events at the same time in a different location.
I think E3 is still important, though, because it pulls the industry together. Now whether or not something like GDC could replace it is another thing but I don’t think E3 is any more rote than CES.