The Bane of Perfection


Providing a challenge is important to many gamers, but can developers push this concept too far?

Video game difficulty has always been a tricky subject, and a tough balancing act for developers. Like many, I enjoy a challenge and increased difficulty when I feel it is warranted. I may argue in favor of normal difficulty as the default experience, but that doesn’t mean I’m against stacking the odds against myself. If a game features a good story, relatable characters, a rich world and solid gameplay, I will come back for more, often seeing how the game changes as enemies become harder to defeat and resources become even more scarce.

Sometimes that increased challenge turns the game into a whole new experience – both BioShock Infinite and Fallout 4 are solid games on normal difficulty, but on their hardest settings, the combat truly shines, and in the case of BioShock Infinite, it improves so dramatically that it really should be the default, normal mode.

That said, a higher difficulty works at its best when it still allows the player to fail and recover from a mistake. When a game enters the territory of demanding perfection, however, it loses me. This was one of my biggest gripes with the critically acclaimed title Bloodborne. The combat was fluid, responsive, even fun; but if the player made one mistake, it could instantly spell the end. In part two of a series titled “The Faults of Bloodborne,” I wrote:

“Bloodborne features a mechanic where, if the player gets hit, they have a short window of time to inflict damage back onto the enemy and regain some of their lost health. Again, in theory it sounds like a great idea.

However, I often found that enemies could kill me in a matter of seconds, and if I recovered my health, another enemy would run up and steal it right back. This is because Bloodborne takes an outdated approach to difficulty – the game is designed in such a way that players must be perfect in every encounter. There is no room for error – one mistake can be costly, so much so that the player can lose a significant amount of progress simply because an enemy was able to land one hit on the player. I lost count of the number of times I would come across a group of enemies, kill all of them except the last one, and do so without taking a hit. Yet on the final enemy, right before I would land my final strike, the enemy would sneak one shot in and reduce my health meter to a sliver. I would then land that final blow, and recover just a fraction of my health, all because I didn’t perfectly execute against a series of enemies.”

The emphasis on executing every enemy encounter perfectly eventually wore me out, and I never finished the game, despite sinking countless hours into similar titles such as Dark Souls and Dark Souls II: Scholar of the First Sin. If a title is going to demand I play nearly every moment of it perfectly, then it’s a title I will pass up. Yet Bloodborne is far from the worst offender, and over the last five years there has been an increase in video games that offer the “ultimate” challenge – games that push the player to beat the game on the hardest setting, without dying at all. Recent examples, such as DOOM and Deus Ex: Mankind Divided, feature this mode, and games such as Dead Space 2 and Ori and the Blind Forest offer twists on this formula, but are still just as demanding. This “one life difficulty” is a growing trend, and not only does it play into the worst aspects of higher difficulty, but it perpetuates a mindset among gamers that their ultimate goal, the main takeaway from experiencing a game, should be perfection above all else.

The Inability To Learn


Games such as The Binding of Isaac manage to balance increased difficulty with the ability to let players fail and learn from their mistakes.

Increased difficulty has long been associated with quality in the world of video games, and this was born out of necessity. In our article “Defining Normal: Difficulty in Video Games,” we discuss how this came to be:

“For most of the 1980s through the mid-1990s, developers lacked the necessary resources, knowledge and technology to craft a game that would provide the player with an epic storyline best consumed over the course of days, if not weeks. Games were meant to be mastered in one sitting, again due to a lack of resources – codes in Metroid were workarounds for a lack of storage to save games on. In order to prevent gamers from beating a game within hours, the default experience was one of intense difficulty, requiring skill to master. Acquiring such skills took time and patience, which meant that each purchased video game essentially paid for itself over time. Therefore, in the early days of the industry, difficulty was closely linked to value.”

As the saying goes, old habits die hard, and today difficulty is still associated with quality, despite the fact that the medium has grown in leaps and bounds in other areas. There are signs that developers are focusing more on the experience instead of making their games difficult just to appeal to an old-school mentality – games like The Stanley Parable, Spec Ops: The Line, Gone Home, Papo & Yo, and Journey all put a premium on story and character over gameplay. And it’s not as if skill-based gaming is disappearing overnight – games like The Binding of Isaac, Race the Sun, Spelunky, and even to an extent games like Super Mario Maker and Metroid Prime force players to hone their skills in order to take on the highest challenges. And the aforementioned BioShock Infinite and Fallout 4 offer the best of both worlds – a more laid back, story-focused experience or a series of ever mounting challenges.

Regardless of whether or not you agree with the belief that difficulty ties into a game’s value, one thing we can agree on is that, when executed properly, higher difficulties provide players the opportunity to learn new strategies and adapt to new tactics. When ammo is plentiful and enemies weak, BioShock Infinite feels like a shooting gallery with pretty sights, but play the game on the aptly named 1999 Mode and tactics that may have been helpful occasionally on lower difficulties are necessary to survive even the smallest enemy encounter. Where these games succeed is that even on these higher difficulties, the player can make a mistake, learn from it and improve. Naturally, this also means that players can still exert some control over how the game plays, since it is their choice of skills and tactics they are improving. Jumping over to Fallout 4, it’s entirely possible that a player has spent their time on normal difficulty learning the ins and outs of a melee-focused character. When they decide to take that character into Survival Mode, the title doesn’t force players to adapt to an entirely new playstyle just to accommodate the harder difficulty – they can still employ the same general tactics, but have to spend more time strategizing and managing resources, and even take advantage of mechanics they might have ignored on a lower difficulty.

The ability for higher difficulties to teach players means that they are engaged and invested in the challenge ahead of them, because the player feels a sense of accomplishment from choosing a playstyle, learning the mechanics of that style and mastering it. It’s the same premise found within the RPG loop that is the foundation for games like Borderlands – players start out lacking knowledge (akin to being a low level starting character), learn the game’s mechanics and improve their skills (seen as leveling up in RPGs), and overcome obstacles previously thought impossible (defeating a boss or clearing an end-game raid).

The issue is that all of these benefits are lost when a developer increases difficulty by demanding perfection. When DOOM and Deus Ex: Mankind Divided ask players to beat the entire game on the hardest setting with only one life, it forces the player to abandon their unique playstyle and skills they’ve learned and play the game a very specific way, one that best increases their odds of completing this challenge. This has long been a problem with games like World of Warcraft and Borderlands 2 – eventually, the community discovers the best possible combination of weapons and gear, and instead of being able to take advantage of the roleplaying aspect of RPGs, everyone eventually creates the same character with the same specifications and items. But unlike World of Warcraft and Borderlands 2, one life difficulty is a product of the developer, not the community. It’s one thing for fans to find the best possible combination of items to be the most efficient – players are still exerting some form of control and creativity over the title. It may reduce games to being calculated on spreadsheets instead of the much more enjoyable methods of experimentation and trial and error, but at least it comes from within the community. When developers implement these difficulty settings, they are endorsing a mentality that should have been abandoned after the fifth generation, stripping the challenge of all of the engagement, creativity and progression that typically comes with the experience, and forcing gamers to play the game in a narrow, specific way.

Lost In Translation


The one life difficulty in Deus Ex: Mankind Divided creates a disconnect that can hinder the impact of the game’s story, characters and gameplay.

This is where discussing this issue gets a bit tricky – in the past we have called on developers to both take ownership of their games, and to let gamers break them. Couldn’t these one life challenges be a form of compromise, a way for developers to say “here are two ways to play our game – your way and our way”?

They could be, but it comes at a cost. First, by eliminating player creativity and forcing specific developer-approved tactics, one life challenges end up feeling like interactive films stuck on a loop. Players may only get part of the way through a game, fail at a specific point, and must start the game over completely in the hopes that this time they don’t make that one wrong move at that boss fight five hours into the game (or at some point along the way there, despite the fact that they’ve already bested earlier sections). This means that players must either be perfect on their very first attempt, or adopt a mentality that their attempts are like waves crashing against a cliff – they may just bounce off, but eventually they’ll erode enough rock away to send the whole thing crashing.

But in no way should this be an acceptable tactic for increasing difficulty. Imagine watching Citizen Kane, and every time the viewer was puzzled by a piece of the central mystery or a vague line of dialogue, the film restarted until the viewer picked up on every single clue and could piece together the ending themselves. After having to watch a dying Kane whisper Rosebud for the tenth time, you’d be forgiven if you shut the film off, accepting that knowing who or what Rosebud is isn’t worth the hassle (which, incidentally, is one of the lessons of the film, but it doesn’t need the viewer to piece that together themselves – the film does a wonderful job of that on its own).

The second issue this raises is that it creates an internal disconnect, splitting a title up into two drastically different games. This may sound like a benefit at first, but in all actuality it’s an amped up version of a problem that has hounded online FPSs with single player campaigns. I sunk countless hours into Call of Duty: Modern Warfare 2, both single and multiplayer, and the one element that always bothered me about the title was that my experiences in single player never translated into multiplayer. I could take on the craziest challenges in the campaign, perfecting certain tactics that ensured survival, but they would be useless in multiplayer. This created a disconnect that made the game frustrating to play at times – in one mode, the game was rewarding me for playing a specific way, but that same strategy would get me wrecked in another mode. I had to put in double the effort, double the time, just to master one game. The saving grace for games like Modern Warfare 2 is that, despite it feeling like two disconnected games packaged into one, each version of the game at least encouraged experimentation with numerous playstyles and allowed me as a player to make mistakes.

With online FPSs, this disconnect can be attributed to the fact that other human players are unpredictable, to say the least. In a single player campaign I can’t expect an enemy to take themselves out by throwing a grenade at their own feet, and I also don’t have to worry that my allies will betray me (as is often the case with any online multiplayer game). It also helps the case of online FPSs that this disconnect occurs over two separate modes, meaning that players shouldn’t be too shocked by the differences in playstyles. But in single player games like DOOM and Mankind Divided, this disconnect is not only present, but a direct result of the developer’s actions that negatively impacts the exact same mode (the single player campaign). And when a player has to put in double the time, double the effort, to play one game a very specific way just for the sake of completing a challenge, it begins to feel an awful lot like old-school design in the worst way. It’s puzzling in games like DOOM, which proved that developer id Software understands what made old-school design work, and meshed it with contemporary design to near perfection. To see them be so aware of this in one area, and so blind to it in other areas only strengthens that disconnect.

To Min-Max, or Not to Min-Max


Games like Fallout: New Vegas can tap into a gamer’s obsessive tendencies, leading to countless hours min-maxing in the pursuit of perfection.

Playing my own devil’s advocate, this issue of one life challenges stripping away all meaningful experiences could be seen as making a mountain out of a molehill. After all, these modes are optional, and it’s easy to ignore them entirely. If a player is intrigued by the challenge, at least it’s there for them to undertake, right?

But this mentality has an impact on how we all think about video games. We’re still holding onto this outdated notion that a video game is meant to be mastered and perfected, and not a piece of art which can shed some light into what it means to be human, to feel love and loss, to be consumed with rage, to sacrifice greatly for the benefit of others. When gamers chase perfection, they miss out on the conveyance of ideas and beliefs, they miss out on a discussion that has the opportunity to impact them long after they’ve stopped playing. We don’t learn when we chase perfection – we care only for checking items off a to-do list, until that arbitrary list says we have completed 100% of things to do.

Obtaining 100% completion in a game can be a fun challenge, but many games take this to an extreme that reduces even the best games to a grind, and taps into an obsessive compulsion some gamers feel when presented with a list of a million and one things to do. Grand Theft Auto V is one of the biggest recent offenders – every time the player pauses the game and goes into one of the many menus, a stat reminds them of how much of the game they have completed. Even after players finish the main story, that meter is well short of 100%, leaving players to pursue such “fun” activities as… competing in triathlons and flying planes under bridges, just to achieve what developer Rockstar Games considers fully completing their game. At best, this is a case of padding out the length of a game artificially; at worst, it perpetuates this notion that the only way gamers can enjoy a game and take anything away from it is to perfect it. Again, obtaining 100% completion is not mandatory, and evidence suggests that many players don’t even finish games, but when developers and publishers continue to focus on this in their games, it furthers this outdated mentality.

Even games that don’t track a player’s completion progress can bring out an obsessive, compulsive nature in gamers. For many gamers, myself included, a part of any game that features roleplaying or skill customization elements is to engage in min-maxing. For those unfamiliar with the term, Giant Bomb covered this topic, defining the concept as:

“The concept of minimizing undesirable qualities of a character so as to maximize desirable qualities in order to achieve the most powerful character possible in an RPG.”

Citing both the Giant Bomb article and an article titled “On the Vocabulary of Role-Playing” by Phil Masters, Wikipedia offers an even broader definition:

“The practice of playing a role-playing game, wargame or video game with the intent of creating the “best” character by means of minimizing undesired or unimportant traits and maximizing desired ones. This is usually accomplished by improving one specific trait or ability by sacrificing ability in all other fields. This is easier to accomplish in games where attributes are generated from a certain number of points rather than in ones where they are randomly generated.”

To give an example of what this looks like, I recall spending an entire evening reading up on all of the perks and skills in Fallout: New Vegas. This was well after the game and its DLCs had been released, and well after I had beaten the game a couple times. I decided I wanted to min-max a sniper build, and laid out a roadmap of what skills to increase at each level, and which perk to take for all 50 levels. During that playthrough, I kept that sheet at my side at all times, referring to it every time I leveled up, going against my instincts to take a perk that might result in more interesting situations in favor of one that would ensure that, statistically, I was playing the very best possible version of that character.
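In point-buy terms, the kind of min-maxed build described above can be sketched in a few lines of code. This is a minimal illustration of the concept, not anything from an actual game – the trait names, point pool and caps below are all hypothetical:

```python
# A minimal sketch of min-maxing as a point-buy problem. All trait names,
# point totals and caps here are hypothetical, not taken from any real game.

def min_max_build(traits, points, desired):
    """Allocate a fixed pool of points so the desired trait is maximized
    and every other trait is left at its minimum allowed value."""
    # Start every trait at its floor ("minimizing undesired traits").
    build = {name: low for name, (low, high) in traits.items()}
    remaining = points - sum(build.values())
    # Pour every leftover point into the one trait that matters,
    # capped at that trait's maximum ("maximizing the desired trait").
    low, high = traits[desired]
    build[desired] = min(high, low + remaining)
    return build

# Hypothetical sniper-style build: dump everything into perception.
traits = {"strength": (1, 10), "perception": (1, 10), "charisma": (1, 10)}
build = min_max_build(traits, points=15, desired="perception")
print(build)  # perception maxed at 10, everything else left at the floor
```

The joyless part is that once the math is written down like this, there is exactly one “correct” answer – which is precisely the spreadsheet-driven play the rest of this section argues against.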

If this sounds like a nightmare that sucks the joy out of games, then you’re correct – min-maxing reduces gameplay to a spreadsheet, which ignores the amazing ability of video games to surprise us and allow us to travel down paths we didn’t anticipate. Again, no developer is forcing me or others to min-max, or obtain 100% completion status, but at the same time few developers are condemning these practices. By including these features and encouraging them, the gaming industry is inadvertently (or maybe even purposefully) telling gamers that perfection is tantamount to quality. This is just as outdated a concept as linking difficulty to quality – demanding perfection from players and encouraging it in no way adds value to a game. If anything, as demonstrated, it detracts from it.

It also has the potential to go beyond being bad game design and turn into something that can actually be damaging – it doesn’t take long to find self-reflective articles on how people with obsessive tendencies can lose years to perfecting video games. This concept even has its own listing on TV Tropes, which is critical of the practice, accurately recognizing it as a trope. To be clear, it is the responsibility of each adult to recognize when any activity, from drinking to eating to playing video games, has moved from pastime and hobby to a serious problem that needs moderation. It’s not the responsibility of the developer, publisher or even the Gamestop clerk who sells games – it’s up to the player to recognize when gaming has taken up too much of their life. But when it’s so easy to find essays online from former gamers, who discuss their motivation to leave no stone unturned, to see everything in every game they play, to chase perfection at the cost of not just their enjoyment but their personal health and relationships, it should be enough to make us all collectively stop and ask how this is happening. When testimonies from former gamers read nearly identical to those of former drug addicts, it should be a cause for concern. If demanding perfection already removes many of the positive elements of video games, limits the medium’s capacity to grow and evolve, and might also reinforce negative behaviors, then we need to ask why we’re still dragging this mentality with us.

Do Well, but Not Perfect

We recently discussed some concerning issues facing the video game industry, and one of the solutions to these oncoming problems was to decrease the barrier to entry for new gamers. But it won’t matter if consoles move away from platform exclusives and stop demanding potential gamers throw down thousands of dollars just to play the games they want if, when those newcomers finally arrive, they’re greeted with a community that absolutely demands perfection and asks them to invest hundreds of hours in obtaining that level of perfection. For many, video games are seen as a chance to unwind after a long day, or to get together with friends and play a few rounds of multiplayer. To some, they will be seen as a way to communicate ideas and emotions, to transport players to new worlds and take them on incredible journeys with unforgettable characters. And yet others will see the chance to hone a skill, much in the same way someone practices a martial art, table tennis or card games.

All of those pursuits are perfectly fine, and they can live in harmony with one another. What doesn’t fit anywhere into that equation is the perfection demanded of gamers when developers rely on one life difficulty to provide an increased challenge. Demanding such a high level of perfection that one slip up means hours of progress lost is not an enticing prospect to potential new gamers, and even though these difficulty settings are optional, their inclusion feeds into a mentality that is unwelcoming at best, destructive at worst. It’s an outdated concept that taps into the obsessive nature of humanity, and detracts from the experience by reducing works of art to numbers on a spreadsheet, stripping away all meaning and any chance to learn. It’s time for this mentality to use up its one life, and never come back.