Defining Normal: Difficulty in Video Games
by Josh Snyder
I have a confession to make – despite the numerous awards that made it the consensus 2013 game of the year, I thought The Last of Us was just a good game. A game definitely worth playing, one that anyone who views this medium in a serious light should own. But just a good game.
After finishing the story, I discussed the title at length with fellow gamers, who universally loved it and didn’t hesitate to shower it in praise. “What’s wrong with you?” they asked, when I confided that I thought developer Naughty Dog’s latest was a respectable game, but not on par with, in my opinion, the highly underrated Metro: Last Light. As the discussions intensified, I found myself circling back to the same argument – the characters were great, the story was fantastic, but the gameplay, and by extension the game itself, didn’t carry the same weight. In short, the world of The Last of Us never truly felt like a post-apocalyptic world, which in my experience had a negative effect on the story and characters. I never had to carefully count my bullets, and resources were plentiful.
Without fail, the following question was asked: What difficulty did you play it on?
Unbeknownst to me, The Last of Us should apparently always be experienced, even on a first play-through, on the hard difficulty setting. Anecdotal evidence from message boards, and even articles from reputable gaming sites, suggests as much. Yet here I was, having beaten the game my first time through on the normal difficulty setting, something I do for every game, and hearing that I did it wrong. I was told to play it again, this time on hard.
Was this really an issue with me as a gamer? Or is this, as I contend, an issue with the game? In my view, the biggest failing of The Last of Us is that its default experience, the normal difficulty setting, offered an experience that the developers perhaps did not intend – one where resources were plentiful and enemies were mostly harmless. As games grow to include more styles of gameplay, genres and cinematic storytelling elements, a definition of what normal difficulty means becomes more important. Much like the hidden language already found in video games, the industry needs to once and for all come to a consensus, and define normal as the default setting for all video games.
A Difficult History
To debate difficulty in video games, it helps to understand why the concept is so closely linked to the medium. With the advent of arcade games, punishing difficulty ensured that gamers could not easily complete a game, and would most likely spend more of their quarters in a vain attempt to do so. In the first home console generation, a high level of difficulty was the norm for a simple, practical reason – to artificially increase the length of a game.
For most of the 1980s through the mid-1990s, developers lacked the necessary resources, knowledge and technology to craft a game that would provide the player with an epic storyline best consumed over the course of days, if not weeks. Games were meant to be mastered in one sitting, again due to a lack of resources – the password codes in Metroid, for example, were a workaround for the lack of storage needed to save games. To prevent gamers from beating a game within hours, the default experience was one of intense difficulty, requiring skill to master. Acquiring that skill took time and patience, which meant that each purchased video game essentially paid for itself over time. In the early days of the industry, then, difficulty was closely linked to value.
Perhaps the most classic example of a game using punishing difficulty to artificially lengthen the experience is Battletoads, the side-scrolling game that many gamers tried to beat, but could never master. The game is so infamous that it has spawned its own sub-genre of YouTube video, with tutorials on the dreaded Level 3: Turbo Tunnel being the most popular. To this day, Battletoads remains one of developer Rare’s most beloved titles, simply for its brutal difficulty.
Influence on the Medium
Because a high level of difficulty was such a strong part of video games in their early days, the practice of beating games at their highest difficulty levels has become ingrained in the medium. It is an aspect not unlike the visual language of film – the typical modern-day Hollywood film features a cast of attractive heroes fighting unsightly villains, and this is no mere trend; it is part of what defines film. So, too, is difficulty in video games – if a video game provides no challenge, then what is the point of experiencing it?
A typical argument is that, without high levels of difficulty, the medium turns into a landscape of casual games meant more for short distractions, rather than respectful consideration on their artistic merits. The classic video games were not meant to be played in short doses and tossed aside – Battletoads was designed to encourage the player to master its systems and mechanics.
Of course, that argument holds little weight today, as games with lower difficulty levels (such as Telltale Games’ The Walking Dead) are often regarded as highly as games with traditionally steep learning curves (such as From Software’s Dark Souls). But the practice of implementing punishing gameplay still stands. Games like the 2004 version of Ninja Gaiden harken back to the “good old days,” pitting the player against wave after wave of relentless enemies and forcing them to fight their way out of impossible situations. Perhaps showing how closely the idea of difficulty is still linked to modern video game design, a re-release of the game, titled Ninja Gaiden: Black, offers two additional difficulty settings: Master Ninja, the hardest, and Ninja Dog, the easiest. The naming of the easy setting was no coincidence – series creator Tomonobu Itagaki said of the lowered difficulty in an interview:
“In other words, there are some people who want to beat the game, even if it means being reduced to the level of a dog; people who are not afraid to shame themselves to accomplish their goals.”
What’s surprising is that, with modern game design, video games do not need to be difficult to have value. Story, characters, solid gameplay mechanics and other production values are often the criteria used to judge a game’s worth. That’s not to say games can’t still be difficult – literature and film offer works with varying degrees of difficulty. Watching the film 2012 does not task the brain the way watching There Will Be Blood does – the latter is methodical in its approach and demands the viewer pay close attention to characters and how their relationships with others and the world around them crumble, while the former is an action film about somehow outrunning the end of the world. For video games, Dark Souls fills that role – a game that demands the player pay close attention to subtle context clues in order to master difficult mechanics. The difference is that literature and film offer a variety of difficulty options, allowing consumers to choose which type of book or movie best suits their needs in the moment. By doing this, these mediums not only became more inclusive, but separated difficulty from worth.
Unfortunately, the video game industry, as highlighted by Itagaki’s comment above, has yet to fully divorce itself from the notion that challenge and skill equal value. Some steps have been taken, however, such as the introduction of the difficulty modifier. A standard, contemporary video game comes with three basic difficulty settings – easy, normal and hard – and many include a very hard mode that unlocks once hard mode is completed. These difficulty settings alter the game world and mechanics in a way that best suits the player’s needs. In nearly every instance, normal is the default setting – the player has to actively go into an options menu and change it to easy or hard. And while these terms may seem self-explanatory, playing through games like The Last of Us illustrates that clearer definitions are needed.
A common approach to modifying difficulty is to think of player action and enemy action as a mathematical formula. On normal difficulty, the player and enemies are balanced – for every point of damage/offense/score the player creates, the enemy does the same; the ratio is 1:1. On easy, the formula shifts in the player’s favor – something like “player action = 2X, enemy action = 0.5X.” On hard, it is reversed – “player action = 0.5X, enemy action = 2X.”
The problem with this approach is that it only takes basic gameplay mechanics into account. If the player has 100 points of health and a gun that does 10 points of damage, then the formula works, since the parameters of offense and defense are clearly defined. In survival horror games, resource management can be handled the same way – the rate at which resources are distributed can be altered depending on difficulty. And in role-playing games, this simple formula can factor in the skills of party members, or the availability of gear that alters base stats.
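The multiplier model described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the function names and the exact multiplier values (2.0, 1.0, 0.5) are my own assumptions for the sake of the example, not taken from any shipped game:

```python
# A minimal sketch of the damage/resource multiplier model of difficulty.
# Multiplier values are illustrative assumptions, not real game data.

DIFFICULTY_MODIFIERS = {
    # difficulty: (player_damage_mult, enemy_damage_mult, resource_drop_mult)
    "easy":   (2.0, 0.5, 2.0),
    "normal": (1.0, 1.0, 1.0),  # the 1:1 baseline the essay calls "normal"
    "hard":   (0.5, 2.0, 0.5),
}

def resolve_attack(base_damage: float, difficulty: str, attacker: str) -> float:
    """Scale an attack's base damage by the current difficulty setting."""
    player_mult, enemy_mult, _ = DIFFICULTY_MODIFIERS[difficulty]
    return base_damage * (player_mult if attacker == "player" else enemy_mult)

def resolve_drop(base_amount: float, difficulty: str) -> float:
    """Scale an ammo/health drop by the current difficulty setting."""
    _, _, resource_mult = DIFFICULTY_MODIFIERS[difficulty]
    return base_amount * resource_mult

# A 10-damage shot and a 4-round ammo drop on each setting:
print(resolve_attack(10, "easy", "player"))    # 20.0
print(resolve_attack(10, "hard", "enemy"))     # 20.0
print(resolve_drop(4, "hard"))                 # 2.0
```

The point of the sketch is how little it touches: three numbers per setting, applied uniformly to damage and drops – which is exactly why, as discussed next, this approach has nothing to say about story or character development.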
However, what happens when a game chooses to focus heavily on story or character development? What can this mathematical formula do there? Games like Beyond: Two Souls present fantastic stories, but as I noted in my review, that particular game was almost painfully easy. The standard formula can still be applied to story-heavy games, but doing so misses an opportunity to weave the story and the challenge of the game together, instead separating them entirely. With The Last of Us, the story doesn’t change in any meaningful way when the difficulty is increased. The world may feel more post-apocalyptic, but difficulty and story remain separate. Or, as is the case with Beyond: Two Souls, difficulty is simply tossed aside.
Because of the variety of game types on the market, and the variety of reasons people play them, modifiers are necessary – but they also need clear definitions, so as not to alienate or mislead players.
Instead of the mathematical approach outlined above, a new standard should incorporate story, the acquisition and management of resources, character development and gameplay mechanics. And most importantly, the normal difficulty should be seen as the standard experience. If a developer wants their game to take place in a post-apocalyptic world with scarce resources, then normal difficulty must reflect this.
So what does normal look like? Normal is the balanced blend of story and challenge, with a game world reflective of what the developers envisioned. If the developers envision a harsh, brutal world, normal difficulty should reflect this. If they want the player to feel powerful, normal should reflect that. Ultimately, normal should be what the developers consider the definitive experience for their game. Easy is for those who want to focus on the story with few distractions – the game world is a bit friendlier to the player, and resources are plentiful. Hard is for those who may have already experienced the story, or who seek out challenge as a top priority – the game world is unnaturally cruel, and resources are scarce. In any case, normal should be the one setting that most accurately reflects the artistic goals of the developers.
This does not mean that playing a game on easy or hard is wrong or incorrect – each difficulty setting that alters the game should do so in a way that allows the player to continue to experience the game in a new light. And these difficulty settings can still allow the developers to tell their story, but in a way that is inherently altered from their core vision. These altered versions can still be artistic expressions on the part of developers, but should not be considered the definitive experience for the game. By making normal the default setting, video games are officially divorcing themselves from the past, shaking off an unnecessary hold-over from a generation in which princesses being kidnapped by giant lizards was the equivalent of Shakespearean story-telling.
One Step Forward, One Step Back
These definitions of difficulty, and the emphasis on normal as the default, are not new concepts to the industry. There are games that already utilize this system brilliantly, and stand as examples for other developers to emulate. Yet there are still plenty of games, even high-profile, beloved games, that fail to adhere to these standards.
Surprisingly, the franchise that best displays the notion that normal is the default is the God of War series. This hack-and-slash franchise, with a two-dimensional story about revenge (and little else), somehow found a way to be the industry leader in the proper implementation of difficulty. At its default setting, the player controls Kratos who, depending on the entry, is either a powerful Spartan warrior or the literal God of War. As such, the games start players off with basic skills, which are more than enough to handle low-level enemies. But as the games progress, new enemies are introduced that require more powerful weapons and strategies to beat. On normal difficulty, the progression of the player’s skills and resources rises at a near 1:1 rate with the difficulty of the enemies, meaning the player can recognize an increase in challenge, but is always equipped to deal with it. This, as one can imagine, is exactly how someone fighting Zeus and his armies should feel – always fighting an uphill battle, but never knocked out.
The difficulty modifiers alter the game world and mechanics in exactly the ways one should expect. A page on God of War II’s Titan Mode, the franchise’s equivalent to very hard mode, details the changes:
“In this Mode, lower class enemies provide tougher challenges, and bosses become stronger and even harder to beat, while Kratos himself becomes much more prone to injury. Some attacks or traps become instant kills, like Theseus’ ice spikes, and some of the strongest attacks of the Colossus.
“One notable difference is the value of orbs. Yellow orbs, that fill the Rage of the Titan meter, will likely max out the meter, while opening Green and Blue Chests only restore as much as half the meter. Some Red Orbs, on the other hand, will raise the orb counter very slightly while some won’t. For example, some Red Chests won’t give Kratos any orb, as well as breakable pots and objects.”
To summarize – the harder the difficulty, the tougher the enemies and the scarcer the resources; the easier the difficulty, the less damage enemies do and the more plentiful the resources. At its hardest, God of War presents a version of the game that is not reflective of what the developers intended – Kratos is never powerful, a trait not typically associated with the toughest of Spartans. At its easiest, the Gods of Olympus never really feel like Gods.
A game that could have learned this valuable lesson from God of War is, surprisingly, my pick for the 2013 game of the year – BioShock: Infinite. Although The Last of Us served as the catalyst for this essay, and even though I gushed over the game in my review, BioShock: Infinite is the biggest offender when it comes to skewing normal difficulty. As I typically do on a first playthrough, I started my venture into Columbia on normal difficulty. And halfway through a game that features a civil war and an unstoppable mechanical enemy bird, I felt invincible. I had all the ammo I needed, and the Vigors, BioShock: Infinite’s equivalent of spells, ensured that no enemy had any chance of seriously harming me.
After its release, BioShock: Infinite was widely criticized for its simplistic shooting mechanics. I always argued that the shooting and gunplay felt great, so long as you played the game on the hard difficulty. And on my second playthrough, setting the game to its ultra-difficult 1999 Mode (a reference in itself to the difficulty of old-school games), I saw just how great the gunplay was, and how wonderfully crafted the story and struggle of protagonist Booker DeWitt played out when this challenge was present.
At its hardest difficulty, BioShock: Infinite is the perfect balance of story and challenge, and it truly comes across as the vision developer Irrational Games intended. Unfortunately, that leaves three difficulty settings in place of what should be one – easy, normal and hard all became, essentially, easy. By lowering the difficulty on the default setting, Irrational Games devalued the story and characters, and ultimately their product. The struggle to save Columbia wasn’t so much a struggle, but a formality that stood in the way of a great twist ending.
In the End, the Journey Matters Most
As the medium of video games becomes recognized more and more as an art form, rules and guidelines need to be clearly established. Video game semiotics have already been established, proving that, yes, video games are art and demand to be treated as such. But not all of the pieces have fallen into place, and defining difficulty in games – especially defining the default experience – is something developers still struggle with.
But it is a worthwhile struggle – after all, developers want their games to be played and enjoyed by gamers around the world. Setting these standards ensures that developers can spend less time worrying about the difficulty of their games, and more time creating detailed worlds and memorable characters. At the beginning of the eighth generation, we shouldn’t be so focused on reinventing the wheel, but instead refining it.