If you’re part of video game fan communities on the internet, you’ve probably heard the word “crunch” tossed around. As large game studios’ practices have come to light in recent years, crunch has become a much more openly discussed part of making games. As fun and exciting as playing and watching games can be, developing them is often a far less enjoyable process. The practice of crunch at game studios is a difficult, unethical, and unnecessary part of development that could be avoided altogether. It can be hard to see bad company practices and systemic issues like crunch behind the shiny veneer and hype of big, glossy AAA titles, but it’s important that we do so in order to better understand what developers are going through and learn how to support them.
What is crunch?
“Crunch” is a practice in which game developers work large amounts of overtime for weeks, months, or even years on end in order to meet a particular deadline, usually a game’s ship date. The normal full-time work week is 40 hours, but some developers have worked 60-, 80-, or even 100-hour weeks to finish a game. Others work late nights or weekends to get more time in. It’s mostly a practice within the world of AAA games, a term used to describe the large developers and publishers that make some of each year’s biggest titles. (While crunch can and does happen within the indie world, it’s much more common in the AAA sphere.) Many stories have come out of major studios recently describing the personal sacrifices that went into some of the most popular recent releases. BioWare’s struggling game Anthem, which the company has since announced it’s giving up on, was built in part on hours and hours of crunch from its developers. CD Projekt Red, the developer behind such large franchises as The Witcher, promised that finishing its huge game Cyberpunk 2077 wouldn’t require crunch from the team – and then it did. Crunch is responsible for a significant amount of burnout among game developers. Some even leave the industry altogether; this is particularly common among new and junior developers. These stories paint a bleak picture for those who want to get into making games, something that many consider a dream job.
What causes crunch?
If crunch is so terrible, then why does it happen so frequently? Games like Anthem and Cyberpunk 2077 are indicative of the direction the AAA game industry has been heading in for a long time: toward big, bombastic, open-world experiences. Flagship games are bigger, prettier, and more complex than ever before; that means they’re also more expensive and take longer to develop than ever. As detailed in The Chronicle, development studios frequently work with large publishers, like Sony and 2K, to help them release their games. These publishers invest large sums of money to give development studios the resources they need to create a game. In return, the publishers expect a return on their investment as quickly as possible. To facilitate this, they set deadlines for studios to meet while developing the game. These deadlines are often stringent and inflexible – the bigger the game, the more intense the deadline.
Another cause of crunch is bad management and unforgiving development timelines. In the race to get a game out by an announced ship date or keep a promise to investors, management can force developers to crunch for weeks or months in order to save their own skins. In trying to appease investors, some of whom may know little to nothing about how games are actually made, developers have to work significant amounts of overtime that they never planned for. The same thing can happen when management refuses to adjust a development timeline to account for external issues. The rise of the COVID-19 pandemic this year threw many development studios into disarray, causing them to delay and even cancel games. If management doesn’t account for this or refuses to adjust a project’s timeline to address it, the burden of crunch ultimately falls on the shoulders of developers. Remember the famous quote attributed to Mario creator Shigeru Miyamoto: “A delayed game is eventually good, but a rushed game is forever bad.”
In a similar vein, social pressure from management can cause or exacerbate crunch. Some managers may encourage developers to crunch “for the company” or “for your fellow employees”, shifting the blame from their own shoulders to those of other developers. This creates a disastrous work environment in which developers feel as though they must keep up with the output of the person next to them or risk looking like they don’t care about the project or the company. Even when crunch is optional – as some studios claim it is for them – developers may still feel as though they could lose their jobs if they don’t crunch, which is a dangerous motivator and a fast track to burnout. It’s ultimately management’s responsibility to ensure a healthy work environment – one that isn’t built on fear, competition, or a false sense of company loyalty.
What can be done about it?
As terrible as these conditions sound, there are ways that crunch can be reduced in game development – and it’s happening already. Many indie and mid-level studios are declaring themselves “crunch-free”, resolving to adjust their projects’ timelines or launch dates in order to foster a healthy work environment. Some developers are calling for unionization, which would allow employees to collectively bargain for better hours, higher pay, and the elimination of crunch at their studios. A group of game writers working with the company Voltage Entertainment went on strike for 21 days, calling for better pay and more transparency in their workplace. They won the strike, with Voltage agreeing to their requests. While Voltage is far smaller than the AAA giants discussed here, this successful strike sets a precedent that will hopefully give other developers the confidence to follow in these writers’ footsteps.
Developers have also begun calling out studios on their own. Many have taken to Twitter to discuss long hours, terrible management, and the other horrors that come along with crunch. Bringing public attention to these development conditions forces large studios to reckon with their own practices. It also brings a new awareness of crunch to game journalists, streamers, and players, potentially changing the way they buy and consume games. Would you be less likely to purchase a glossy, hyped-up game if you knew its developers had to work 100-hour weeks to ship it by a certain date? Many are hoping that these calls for industry examination and reflection on poor work environments will help improve developers’ work-life balance, which in turn will lead to happier employees and better games.
Underneath the shiny, exciting surface of AAA games, the crunch monster is waiting to strike. It’s a problem that no developer should have to face, but many will face it in their careers anyway. We as fans and gamers can fight it by standing up for developers when they speak out about bad work conditions, treating game delays with empathy and understanding, and learning more about crunch conditions and what can be done about them. You’ve already taken the first step by reading this article! Spend some time in the game development community on Twitter to learn about what really happens during development, particularly if you’re interested in working in games. If you want to go even deeper, the books Blood, Sweat, and Pixels by Jason Schreier and Significant Zero by Walt Williams both tackle the topic of crunch and give a more nuanced portrait of game development as a whole. By learning about and fighting crunch, we can make the game industry a better place for everyone, developers and fans alike.