Sunday, May 31, 2015

The tech arms race in AAA - and why I'm abandoning it

When I was 13 years old, my dad finally fulfilled my dream of owning a PC. He brought me a woefully underpowered 10 MHz 8088 box, but with an upgraded VGA card and a color monitor. I was too excited to care about the comically mismatched package: I could finally play games in my own house, just like many of my friends and cousins were doing.


Unfortunately for me, I couldn't find many good games to play on that machine. This was already well into the '90s, and the games coming out then needed a faster processor to run. After quickly getting tired of Tetris, Prince of Persia, and the very early Sierra adventures, and unable to afford a better PC, I found myself playing fewer games and instead browsing magazines or visiting friends with better PCs to get a glimpse of the latest games I couldn't run. Those games were impressive, to say the least: I'll never forget seeing Doom run for the first time.
Those magazines came with a huge number of full-page ads showcasing the latest processors, video cards, and hard drives. Their message, combined with the overall tone of the reviews and other articles I was reading at the time, was clear: You absolutely need the latest hardware or you are missing out on games. How could that be false? I was seeing it firsthand: I was stuck with a weak machine, and as a result I couldn't play any of the great-looking games my friends were enjoying.

The possibility space enabled by my original PC box was pretty limited. By possibility space, I'm referring to all the potential games that could run on the hardware, whether those games were actually made or not. On that computer all games were very simple 2D affairs, with very few sprites on the screen, usually played one static screen at a time. While there is still a huge number of games that could be made within these constraints, the overall space is small compared to what today's hardware has enabled.
Funnily enough, some 3D games did run on that ancient PC. They ran terribly. I vividly remember Test Drive 3:
[Image: Test Drive 3 - the first 3D game I played. I didn't care whether it was any fun.]


This kind of primitive 3D graphics ran on that computer at around 5-10 seconds per frame. Yet young me was still having actual, non-ironic fun. Not with the game (it was unplayable), but with the glimpse of what "actual" gamers get to enjoy. My enjoyment came from cheating the possibility space my PC enabled: I got to "enjoy" something *better*. I was right at the edge of the possibility space my PC offered, even a little beyond it, and it felt great.
All these experiences led me to conclude from very early on that good games live right at the edge of the possibility space. "If a game does not take advantage of all the processing power, it must not be good, or at least not as good as it could have been", I thought. It was a fundamentally wrong conclusion that would take decades to be challenged.
What I was blindly ignoring back in my teen years was games like Elite, Ultima IV, Zork, MUD - games that fit just fine within the tiny possibility space my original PC enabled. I didn't know about them, and my overall environment didn't encourage me at all to seek them out. I was instead encouraged to seek the latest tech fix, which, in my mind, would automatically give me the optimal gaming experience. As a result, I missed out on one-of-a-kind, ground-breaking gaming experiences that could have helped me understand early on what actually makes a good game.
Fortunately, later on, with an upgraded computer, I was exposed to games that I kept playing for a long time, not because of their tech specs, but because they were fun. Games like Civilization, Warcraft, and Master of Orion. But even as I was enjoying those games for their gameplay, my tech bias still crept in. While playing Warcraft, I couldn't help thinking, "How much better would this game be if they had thousands of units on the screen?" In my mind, the lack of more units on screen was always caused by a tech limitation. The only valid reason they didn't have more units was that they couldn't technically achieve it - if they could, the game would automatically be better off for it. The careful balance between the number of units and the number of things a player's brain could possibly track at any given moment was completely lost on me. So even though I enjoyed those games for their gameplay, I still didn't truly understand what made them so good.
My early obsession with technology largely shaped how I got into the gaming industry. For years I taught myself programming, constantly working on some demo or prototype. But instead of looking for any sort of fun in those prototypes, I was focusing on technical details like how many sprites I could simulate without the framerate dropping, or yet another infinite terrain demo. For some reason, the question of how to fill that infinite terrain with interesting things to do seemed less important than generating the peaks and troughs. I'm not quite sure why I had this ridiculous attitude. Obviously, I wanted to make games, and in a game, it's more important to do interesting things than to stare at an infinite terrain. So why did I focus on the wrong things? It's an interesting question that I can't answer fully, but I suspect the answer is closely related to my tech obsession. Filling the terrain with interesting things seemed like the "easy" problem that could probably be done in a day. No, the "interesting" problem was how to use the processor efficiently to create a giant world - and then everything else would somehow fall into place magically.
During my first years working in AAA, this skewed notion of what's important in good games was not significantly challenged. Both my studio and the other studios I interacted with, at least their technical departments, had very similar ideas about the need to push hardware to its limits and drive technical innovation. The isolation between departments that tends to happen as team sizes get larger also contributed: I was mainly exposed to similarly minded colleagues, who viewed their job as making sure there was further technical innovation in each game. Making the game fun is some other department's job. As long as everyone does their job, the thinking goes, things will work out fine and the game will be better off on all fronts.


The first doubts

My "better tech makes better games, unconditionally" theory came under the microscope much later, from observations I made outside the AAA industry. If I had to point to a single moment where the theory started falling apart, it would be the Sims Social moment.
I always liked The Sims series, and played all the PC iterations up to the third. In 2011, The Sims Social came out on Facebook. I was never a big fan of the "spam your friends to progress" social "innovation", and the content seemed light compared to the PC versions, but I tried it nevertheless and quickly saw a golden opportunity to introduce my wife to the world of The Sims. Since she likes decorating houses, my theory went, it should be the perfect game for her. It worked like a charm! Soon enough, she was running a full-scale operation, her main Sim helped by three secondary accounts made for each of our pets (an annoying hack to get around the atrocious, anti-social "help me out" mechanics of Zynga-style games).


[Image: The Sims Social]


Very pleased with myself, I proceeded to Step 2 of my master plan: introducing my wife to an honest-to-god REAL video game! The hard part was already done; there was no way I could fail now! Quickly, I brought out the big arguments.
"The Sims 3 is kind of like what you're playing on Facebook, but so much better in every way!"
"There are so many more options in house types, furniture, sims personalities, neighbours!"
"You can build multiple houses in many lots and walk around an entire town you helped create!"
"The graphics are so much more realistic!"
"It's 3D!"
There was no doubt in my brain: No rational person would ever prefer any aspect of The Sims Social over the Sims 3. So after the initial shock, I used my wife's immediate and unconditional rejection of The Sims 3 as an opportunity to understand a different way of thinking about games. Let's look at her arguments in detail.


"The 3D graphics are UGLY - compared to nice hand drawn 2D the Sims Social has".

I got that one immediately. Apparently, growing up with an obsession for 3D graphics had made me oblivious to all kinds of artifacts like polygonal edges, aliasing, and texture stretching. And despite all the progress we continue to make, the free-form rotate/zoom camera ensures there will be angles from which even pretty games can look ugly. In contrast, a simple 2D engine and a good art director can create something consistent and beautiful throughout, using well-established offline tools. And if the art style calls for it, the 2D engine can even use pre-baked techniques like soft shadows and detailed lighting that are prohibitive in real time on certain hardware. Simple technology, beautiful results. And what does a game like The Sims lose, gameplay-wise, by abandoning 3D altogether? Does the house really look that much better because you can rotate and look at it from all possible viewpoints? On the contrary - rotating and zooming the camera adds complexity and creates more potential views from which the art looks ugly.
[Image: The Sims 3]
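To make the tradeoff concrete, here is a minimal sketch (hypothetical C++, not from any real engine) of why the 2D path is cheaper and more predictable: all the expensive lighting work happens offline, so at runtime a "lit" pixel is a single texture fetch that looks identical every time, while even the cheapest real-time 3D shading must be recomputed per pixel, per frame, for whatever camera angle the player picks.

```cpp
#include <algorithm>
#include <cstdint>

// 2D path: soft shadows and bounce lighting were baked into the sprite
// offline by the art pipeline, so runtime "lighting" is a single lookup
// that looks identical from the game's one fixed viewpoint.
struct BakedSprite {
    int width = 0, height = 0;
    const std::uint32_t* pixels = nullptr;   // RGBA, lit offline
    std::uint32_t shade(int x, int y) const { return pixels[y * width + x]; }
};

// 3D path: lighting is recomputed per pixel, per frame, for any camera
// angle the player chooses -- and this Lambert term is only the cheapest
// possible model, before shadows, bounce light, or materials enter.
struct Vec3 { float x, y, z; };
inline float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

float shadeRealtime(Vec3 surfaceNormal, Vec3 lightDir) {
    return std::max(0.0f, dot(surfaceNormal, lightDir));
}
```

The first path ships exactly what an artist approved once; the second re-derives its result millions of times per second, from viewpoints nobody ever reviewed.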


I really don't like that in most AAA studios the decision to go 3D is predetermined, instead of letting the team decide what makes sense for a given title. And this is just one simple example of how predetermined, "must-have" tech components take flexibility away from the team and add complexity.
This culture of "3D makes things better" is so ingrained in some developers that they'd never challenge it even when they have complete freedom to do so. From an interview with Faceroll games:
"Clash of Clans is a beautiful game. I understand that using 2D sprites of 3D models allows the game to run smoothly on older devices."
Like young me, these developers would never even consider the possibility that the 2D choice was made intentionally, to make the game better looking, easier to play, and even more fun. They assume it must have been a decision the team grumbled through because of technical limitations.
From the same interview:
"Also, the 3D approach allows us to do things that are not possible in 2D games. One feature that really impresses a lot of people is the ability to be able to rotate the camera around the base and the action. Aside from the strategic gameplay benefits, this feature gives the game an extremely immersive experience, almost like you are looking into a three-dimensional snow globe on your mobile device."
This is basically saying: "The 3D approach makes gameplay better/deeper. I won't explain how. But it certainly is impressive. There are also strategic gameplay benefits, which I won't get into. Again, it looks really cool." This flow is very representative of conversations I've had with technical people about how, specifically, some advanced feature will help gameplay. Very often, it won't.


"The animations are not as cute"

How can a few frames of cartoony animation possibly look better than 30 FPS, 4-bone skinned meshes with a skeleton that has at least as many bones as the human body? Easily. The abstract cartoon animation has a completely different frame of reference, and unlike the hyper-realistic version, it is never compared to real-life motion. The question of how to make an appealing/cute/memorable animation has everything to do with art direction and almost nothing to do with technical details. Creativity matters, knowing the audience matters, consistency matters; realism does not.
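For a sense of how different the two workloads are, here is a rough sketch (hypothetical code, not from any shipped game): the 2D game picks one of a handful of hand-drawn frames that an animator tuned for appeal, while the 3D game evaluates a 4-bone blend for every vertex of every character, every frame - and the result is then judged against how real bodies move.

```cpp
#include <array>
#include <cstddef>

// 2D: playback of hand-authored frames. Appeal lives entirely in the art.
int spriteFrame(float timeSec, float framesPerSec, int frameCount) {
    return static_cast<int>(timeSec * framesPerSec) % frameCount;
}

// 3D: 4-bone linear blend skinning, evaluated for every vertex, every frame.
struct Vec3  { float x, y, z; };
struct Mat34 { float m[3][4]; };        // affine bone transform

Vec3 transform(const Mat34& t, Vec3 v) {
    return { t.m[0][0]*v.x + t.m[0][1]*v.y + t.m[0][2]*v.z + t.m[0][3],
             t.m[1][0]*v.x + t.m[1][1]*v.y + t.m[1][2]*v.z + t.m[1][3],
             t.m[2][0]*v.x + t.m[2][1]*v.y + t.m[2][2]*v.z + t.m[2][3] };
}

struct SkinnedVertex {
    Vec3 position;
    std::array<int, 4>   boneIndex;
    std::array<float, 4> boneWeight;    // weights sum to 1.0
};

Vec3 skin(const SkinnedVertex& v, const Mat34* bonePalette) {
    Vec3 out{0.0f, 0.0f, 0.0f};
    for (std::size_t i = 0; i < 4; ++i) {
        Vec3 p = transform(bonePalette[v.boneIndex[i]], v.position);
        out.x += v.boneWeight[i] * p.x;
        out.y += v.boneWeight[i] * p.y;
        out.z += v.boneWeight[i] * p.z;
    }
    return out;                          // repeat for ~10k verts, 30 times/sec
}
```

All that extra machinery buys realism, not appeal - and appeal is what my wife was responding to.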


"It's too complicated. There are too many buttons. And why do I have to go out in a city? I just want to make my own house pretty."

It's funny how much you take for granted in games until you watch a non-gamer try to rotate a 3D camera. And it's also funny how The Sims 3, by trying to do everything (no doubt many man-years of work), has turned away my wife as a potential player (and possibly many more). Sure, the extra features might be a response to the established audience's needs, and I have no doubt many people would similarly stop playing if the game became too simplified for them. But why can't AAA studios experiment more with simplified versions of their franchises, trying to appeal to gamers of different tastes? (No, The Sims Social doesn't count as experimentation. The decision to simplify was an attempt to take a quick profit by cloning Zynga, and it was abandoned at the first sign of trouble. Interestingly enough, the good-looking 2D art styles of The Sims Social, and even more so of SimCity Social, were abandoned on mobile because there was no successful reference game on those platforms yet. So these two franchises naturally fell back to the AAA way of 3D graphics - which on mobile hardware look extremely ugly compared to their 2D social versions.)


Strategy games and the “less is more” approach

Soon after, I started playing League of Legends. It was fun. It was also another clear indication of how wrong I had been when playing Warcraft all those years ago. While I was obsessing over unit counts in RTS games, thinking the obvious way for the genre to get better was to put more units on the screen, the smart teams behind DotA and League of Legends distilled the experience down to its basics and created a true evolution of the genre - one with appeal not because it uses the hardware better, but because it removed all the complexity and doubled down on the essentials: simple, fun strategic gameplay made orders of magnitude more effective by its multiplayer and social nature.
[Image: League of Legends]


This again led me to rethink my previous assumptions. Why are more units in an RTS game automatically a good thing? Sure, it might look spectacular, and it might create a sense of awe the first time someone sees a very large-scale battle. But does it actually add something significant to gameplay, or something that will keep players playing for longer? Isn't it harder for the player to keep track of so many units, adding complexity and frustration?
I haven't been able to find a good argument to support my old "more units is better" thesis. Very often, tech-focused teams simply assume that more units make the game fun, automatically and without further explanation. Here's NaturalMotion:
"We challenged the team with the following: Are you able to create battle scenes with thousands of characters on-screen at the same time, in real-time, with you having full real-time touch control over all of these troops, and make all of these battles resolve within thirty to sixty seconds?" Reil recalled. "And they said no, that's not possible." He laughed as he launched into the game. "Eventually we managed to get it to work."
The suggestion is clear: They just knew that having battle scenes with thousands of characters on screen at the same time would be fun. They knew this even before they implemented it. The only thing that could possibly ruin this were technical limitations. The word fun itself is never mentioned in the interview, though it’s heavily implied that the technical achievements are a big part of what makes the game fun.


Rethinking what was taken for granted

I remember the Sims Social incident quite well because, at work, at that same time, I was going through a long, brutal, foolishly self-inflicted crunch period. I worked very hard to perfect a new rendering feature - a contribution that largely (and rightfully) went unnoticed by the vast majority of players, for whom incremental visual improvements just weren't that noticeable or important to their experience. Having spent so much effort on this feature, on a game that could have used a lot of attention in other areas that did affect the players' experience significantly, was another sign that I needed to rethink my priorities and my model of what makes a good game and a good experience for players. This was the clearest example of the "work smart, not hard" rule being violated. I was let down by my obsession with the tech side.
Shockingly, I was alone in this conclusion. The feedback on my contribution to the game was positive and encouraging at all stages. The other areas I could have helped with instead were brushed off as outside my area of expertise - I was a graphics programmer, and I had done the best I could from my position to make the game as good as it could be. As comforting as that thought was, I gradually rejected it over the next few years. No excuse can change the fact that my effort was wasted when it could have gone towards something useful. Obviously, I needed to do some rethinking. What do you do when you suddenly realize the specialized skills you have been developing for years are largely irrelevant to making good games? Part of this rethinking involved re-examining everything I had been taking for granted all those years. At the top of the list was one of AAA's biggest arguments for pushing technology: the "believable world", plus variations (immersive world, relatable characters, etc.).
The self-stated goal of many games is to create a "believable" world. For years, the AAA argument for pushing the tech envelope has been that the bigger, better looking, better sounding, and more detailed our worlds are, the more believable they become. But is this true? Good books have created believable, immersive worlds and relatable characters for centuries, without a hint of audiovisual stimuli. Even in games, is the world of Grand Theft Auto more believable than the world of Bastion? To me, they are equally believable - and Bastion's world is much more interesting than a mundane contemporary setting, thanks to its backstory, unique visual appearance, and the narrator.
For believability, consistency is far more important than literally matching what our senses would experience had we been in that world. And for making a world interesting, realism is similarly of secondary importance.
Even though good AAA games do use good art direction to make their worlds interesting, most of them put more effort into high-quality visuals and audio than into consistency. This obsession with the senses does not make sense to me, if we're talking about games. What happens when extremely high-quality visuals and audio are not fully consistent? As the tech bar goes higher, it becomes easier for inconsistencies that kill believability to creep in. There's nothing worse for believability than playing a great-looking game, then at the corner of level 8 seeing something like this:
[Image: a graphical glitch breaking an otherwise realistic scene]

Bugs like these are increasingly common, and they are largely a direct consequence of the increased complexity we subject our development to as a result of the tech arms race. But at the other extreme, good MUDs can have believable worlds too, without these kinds of inconsistencies. There must be a better way than the brute-force appeal to the senses.
(An aside to this focus on senses: if VR goes down the realism path, then at some point I'd expect AAA VR games to keep pushing for other senses, too -- smell, touch, taste -- when hardware allows. All of this, of course, assuming the race to cross the uncanny valley doesn't bankrupt them in the first place.)


The tech arms race in AAA

If you look at the rendering improvements we got with each new console generation, a clear pattern emerges: the improvements are becoming marginal. In certain cases from the last console transition, you need side-by-side still screenshots and a magnifying glass to see the difference. Meanwhile, the cost to achieve those improvements keeps going up, at an accelerating pace.


Some AAA studios seem happy to ignore these increasing costs and decreasing payoffs in visual and audio fidelity. They will try to outspend their competition and call any exclusive marginal improvement a "competitive advantage." Like a literal arms race, the tech arms race is very expensive, as each company's progress forces everyone else to increase their spending. It is also likely to cause a bloodbath when companies that are forced to spend too much release a game that doesn't become a hit.
Even though people understand the escalating costs of chasing marginal tech improvements, I don't think there's any chance the tech arms race will stop any time soon. There are many reasons why AAA studios will continue participating:

Because of the politics of numbers

Many large organizations, despite their best intentions, fall prey to bad politics (which I define as certain individuals pursuing their personal interests, or their department's interests, at the expense of the project's best interests). In politics-heavy environments, there is nothing more effective in pursuing a personal agenda than using "facts" and numbers - even when those facts and numbers are hand-picked to support a certain story.
For tech-focused projects, it's very easy to find impressive-sounding numbers. "We've improved the number of polygons on the screen by 20%." "We have a competitive advantage against this other company because, unlike them, our engine is using all SPU power." The underlying suggestion is that the game itself has become better because of these improvements.
Tech-focused people will often use such numbers either to justify the large cost that went into achieving them, or to defend themselves, depending on the sales of the final game. If sales are good, they were good because of the improved technology. If they were not good, it must be someone else's fault -- the technology, after all, was demonstrably better than before, or better than that of the competing game that outsold them.
The simplicity of numbers gives great peace of mind to technically minded people. They don't have to bother themselves with messy, subjective, uncomfortable questions of what actually makes a game better, what provides actual value to the players. They can focus on their little corner of making tech slightly better, at a huge expense, and point to small but tangible, measurable improvements when asked about their contribution.

Because of the perverse incentive structures

AAA games are a risky business - the budgets are high, so the cost of failure is also high. The incentive for everyone on the team should be to reduce cost, and therefore risk, to the project. However, that does not always happen in practice. Often there is a competing personal or departmental incentive to increase spending within the department, to shield it in case of failure. Nicholas Lovell describes this well as operational vs. financial risk.

Because it's always been done that way.

During the '90s and early 2000s, the publisher gatekeepers had all the power in choosing what got released to the market. With heavy marketing promoting ever-improving hardware, they convinced many players, and themselves, that only games that push the tech envelope can ever be good. In a self-fulfilling prophecy, they only published such games. A game like Minecraft could never have been a success back then, because nobody would ever have picked it up for publishing. So if they've always believed that games need ever-improving tech, why stop now? Radical change is very hard to push on large organizations, and any attempt is likely to be abandoned at the first sign of trouble.

Because their people are too specialized

The skillset of existing employees, and often their inability or unwillingness to specialize elsewhere, limits the kinds of games AAA studios can make, and forces studios to deploy those people on tech improvements regardless of priority.

Because of an existing audience that demands ever-increasing visual fidelity

Many tech-hungry PC and console players seem to care more about using magnifying glasses to prove the game looks better on their platform than about actually enjoying the game. They will definitely buy the next Call of Duty and the next Battlefield just because they offer incremental rendering improvements. They will even complain loudly about the mere existence of low-fidelity games on their platform.
But outside of certain established franchises, it makes little sense to chase expensive incremental tech improvements merely to satisfy a vocal minority that plays games to get a short-term tech fix.

Because of the ongoing pursuit of Hollywood

Some AAA studios subscribe to the idea that games can deliver maximum emotional impact the way Hollywood does: by using actors in heavily scripted sequences to tell the story of someone the player/viewer relates to. Instead of playing to their own medium's strengths, these studios go to great pains to emulate what Hollywood gets naturally: emotive characters, good-looking lighting, spectacular locations. It's a very literal attempt to imitate another established, successful medium, and because it gets some results, it's popular - despite being very expensive and brushing aside many of the advantages that games get naturally.

Problems that can occur from tech focus

Like my personal failure with the rendering feature, the tech arms race can cause issues that affect the project at large. The escalating, often out-of-control budgets are an obvious one, but there are others. Very often the team is subjected to unstable technology bases and engines for far too long, severely limiting its ability to iterate and experiment when it matters most: early in the project's lifetime. In such an environment, innovation can be mistaken for incremental technical improvement, leading to the same games over and over with slightly better graphics. Because of the increasing pressure to create more content at a higher quality bar, team sizes get larger, and there is often pressure for that growth to happen very fast. When there is pressure to grow fast, any good team suffers: shortcuts are taken, non-optimal people are hired, and process creeps in in hopes of keeping things under control. The end result is a much more complex environment, both technology- and process-wise, which is less efficient to get things done in, and also much less fun to work in.
But if I had to pick the single problem that worries me most about tech-obsessed teams, lack of focus would easily be at the top. During the PS3/360 to PS4/Xbox One transition, I watched teams forcibly lose focus and work on features like motion control and companion apps, not because those features made any sense for the game, or because the developers wanted to try them, but because tech executives were nervous about what might happen if those features gained traction and they were left behind in the race. (Interestingly, when the execs feel safe from such a threat, they are happy to announce that the team itself wants to focus away from such needless features.) And this is nothing compared to internal team arguments about whether the game really needs that shiny new feature that will make it look better while having no impact on gameplay. The number of meetings and stakeholders one has to go through is frustrating, even when there seems to be a clear cost/benefit argument.


There are smarter ways

While there's no doubt many AAA companies will continue participating in the tech arms race for the above reasons, smaller teams would be foolish to follow suit. Whether it's the pursuit of realism or any other manifestation of the "more is more" attitude (more sound channels! more sprites! more explosions!), the tech obsession is destructive - not only because it is expensive, but also because it hides smarter ways to achieve similarly powerful, or even more powerful, game experiences. Examples are everywhere.
Minecraft has won fanatical players who keep playing for years, even though it didn't create a high-fidelity audiovisual environment.
Transistor offers an excellent, memorable audio experience, even though it didn't use more sound channels than the average AAA game.
Papers, Please creates an extremely close personal connection between the player and their in-game family, without even showing the names or pictures of the family members.
Neko Atsume evokes intense emotions in cat lovers, even though it doesn't have high-fidelity models or animations of cats.


[Image: Neko Atsume]


MUD games in 2015 still have a dedicated audience that has played them for decades, even though they are all text.
One final realization about games that have only good technology to offer is that they are inherently short-term experiences. Are many people still playing Ryse: Son of Rome? Do they fire it up every single day just to look at its pretty environments and hyper-realistic character models? No - they've gotten their tech fix from the title and moved on to the pursuit of the next fix. I know, because I was exactly like that 10 years ago, moving from FPS to FPS, playing the single-player campaigns on easy mode, mostly just staring at the pretty scenes. What are the games that people play for years that have only pretty environments or some other form of impressive tech to offer? I can't think of any. Reversing the question: are there games that people play for years despite outdated technology? Absolutely - with far too many examples among both mainstream and niche games.
We are already at a place where we have the technology to display gorgeous images, with more colors and at a higher resolution than the eye can distinguish, even on devices people carry around in their pockets. Instead of blindly maintaining the arms race momentum, maybe it makes sense to think about what we can accomplish with mature tech instead, and free ourselves from the tyranny of "we have to use 100% of the hardware we're running on".
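A quick back-of-the-envelope check of that claim, with my own illustrative numbers (20/20 vision resolves roughly 60 pixels per degree; assume a 5.7-inch 2560x1440 phone held 30 cm from the eye):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Assumptions (illustrative, not authoritative): 20/20 acuity resolves
    // ~60 pixels per degree; the phone's short side is ~7.1 cm and spans
    // 1440 pixels; it is held ~30 cm away.
    const double kPi          = 3.14159265358979;
    const double acuityPpd    = 60.0;    // pixels/degree the eye can resolve
    const double screenCm     = 7.1;     // short side of the panel
    const double viewDistCm   = 30.0;
    const double pixelsAcross = 1440.0;

    // Angle the screen subtends at the eye, then pixels packed per degree.
    double degrees = 2.0 * std::atan(screenCm / 2.0 / viewDistCm) * 180.0 / kPi;
    double ppd     = pixelsAcross / degrees;

    std::printf("screen spans %.1f degrees -> %.0f pixels/degree "
                "(eye resolves ~%.0f)\n", degrees, ppd, acuityPpd);
    // Prints roughly: screen spans 13.5 degrees -> 107 pixels/degree (eye resolves ~60)
}
```

Under those assumptions, the phone already packs nearly double the pixel density the eye can resolve at that distance - which is exactly the sense in which further resolution increases stop paying off.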
For me personally, the problems in game development that I want to focus on going forward are completely unrelated to technology. All the games I want to make fit perfectly well within the possibility space of current hardware. I am more interested in approaching problems like:
- What are the qualities of games that will keep both players and developers interested in the game in the long term (not weeks or months, but years to come)?
- How do we create interesting, consistent worlds and characters that give the players just enough motivation to spark their imagination and fill in the blanks left behind by the lack of super-detailed models and the inability to look at every corner of that world via a 3D camera?
- What is the one single thing that our game will be remembered for, and how do we focus the entire team around that concept?
With this post, I am leaving behind my frustrating obsession with tech - an obsession that cost me a better gaming experience in my youth, and the opportunity to make better AAA games. Having admitted all this, I am finally free to continue learning and experimenting with what makes truly great, long-term game experiences. The next time I hear someone telling me, "Games will be SO AWESOME when cloud computing is realized and allows millions of explosions and objects flying around you!" I'll make sure to ask, "Why?"
