The Video Game Industry: At A Crossroads

[Image: PlayStation 4 Pro]

With the announcement of the PlayStation 4 Pro, Sony raised more questions than it answered about the future of the video game industry.

On June 10, prior to E3 2016, we published an essay titled “At A Crossroads: The Home Console.” The essay was based on rumors and conjecture, anticipating what Sony and Microsoft would announce at E3 2016 and how those announcements could fundamentally change the way we purchase and play video games. Three months later, both Sony and Microsoft have played most of their cards, and we now have a clearer picture of where the industry is heading.

Or do we? When Microsoft discussed Project Scorpio at E3, they were light on details, assuring us only that it would be the most powerful gaming console ever – a strange goal to aspire to, since the best-selling console of each of the fourth through seventh generations was considered the technologically inferior machine of its time. Even more unclear was how long current Xbox One owners could wait before being forced to upgrade to the Scorpio. With Sony choosing to focus on their virtual reality (VR) headset at E3, and not on their then-rumored equivalent of the Scorpio, gamers everywhere were forced to wait just a bit longer, in the hope that at least Sony would provide some clarification.

Of course, Sony almost made things worse. The PlayStation 4 Pro is just as confusing as the Scorpio, and its most impressive features, 4K resolution and HDR images, are impossible to show off unless you are either there to see them in person, or happen to have a 4K monitor and an insanely fast internet connection. That Sony and Microsoft’s new hardware strategies led to more questions than answers isn’t entirely surprising, although it is disheartening, and it shines a light on a unique problem the video game industry faces – despite its monetary success, the cost of gaming is rocketing skyward, threatening to alienate early adopters and people on a budget, while the benefits of all this progress remain unclear at best and pointless at worst.

Show Me The Benefits

In an effort to promote their own VR headset, Samsung released a commercial to show off their fancy new piece of tech and how people reacted to it.

Running just over a minute, the video shows people putting the Samsung Gear VR headset on, sitting in a chair and reacting wildly to what they are experiencing. Titled “All the Feels,” the commercial attempts to demonstrate just how powerful an experience VR is.

But allow me to be cynical for a moment – when I watch that commercial, all I see are strangers in a chair freaking out. I don’t know what they are seeing or hearing – all I know is that they’re screaming, laughing, crying. That gives me nothing to relate to, no motivation to experience whatever it is they are experiencing, and that’s because I can’t see what they’re seeing. And that is a major problem with the direction the industry is heading in.

Right now, there are three buzzwords being tossed around by companies like Samsung, Sony, Microsoft, HTC and Oculus – basically everyone who makes the hardware that enables us to play games. They are VR, 4K resolution and high dynamic range (HDR), all of which will allegedly change the way we play games for the better.

A brief primer on what these terms mean: VR is the easiest to explain, and has been around the longest. It involves a headset the user wears that allows them to look around in a virtual world, and in some setups it includes hardware that tracks the user’s body and movements, enabling them to “walk” around and engage in other activities. Next is 4K resolution, sometimes referred to as “Ultra High Definition,” which simply refers to the number of pixels shown across a number of lines on a display (the higher the numbers, the sharper the picture). With 4K resolution, those numbers are 3840 pixels by 2160 lines, a significant step up from the most common HD displays, which are capped at 1920 pixels by 1080 lines. Last is HDR, which displays images with a range of brightness and color contrast much closer to what the human eye actually processes, leading to far more lifelike images. Paired with 4K resolution, images take on a whole new life, with no visible image degradation and smoother movement.
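To put that difference in concrete terms, here is a quick back-of-the-envelope comparison (a simple illustrative sketch using only the resolution figures quoted above, not anything from Sony or Microsoft’s spec sheets):

```python
# Compare how many pixels each resolution pushes per frame.
full_hd = 1920 * 1080    # "Full HD" / 1080p -> 2,073,600 pixels
ultra_hd = 3840 * 2160   # 4K / Ultra HD     -> 8,294,400 pixels

print(ultra_hd / full_hd)  # 4.0 -- a 4K frame holds four times the pixels of a 1080p frame
```

That fourfold jump is exactly the kind of difference that looks impressive on a spec sheet but is nearly impossible to appreciate through a compressed trailer or a still screenshot.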

[Image: Samsung Gear VR (2015)]

Nothing about this image entices me to spend hundreds of dollars on VR, because I have no frame of reference for what the user is experiencing.

In theory, these technical advancements will lead to better gaming, whether it’s through better visuals or new, immersive experiences. But the problem, as highlighted in the Samsung commercial, is that these advancements are difficult to demonstrate, unless you are physically present and have the tech in your hands. That might work for journalists, but for the vast majority of gamers, the only time those conditions will be met is when they pony up and buy this new tech for themselves (or have a friend who doesn’t mind gambling hundreds of dollars on new tech that may or may not be worth it).

So why is this a problem? Can’t we just take the industry and journalists at their word? Unfortunately, we cannot – or at least, we should not. It’s not that journalists can’t be trusted – many are great at cutting through the corporate speak and getting to the heart of an issue – but readers are still getting information about a personal experience secondhand, one they are unable to see for themselves. A journalist might love VR, or claim that HDR is the future, but a reader might not be able to overcome motion sickness when playing a VR title, or might not see much of a difference between HDR and standard HD images. Or they might play VR, see the difference 4K resolution makes, and simply not care. When technology is so closely tied to personal preference, it’s impossible to take someone at their word – it has to be shown off in a way that helps gamers understand what the technology is capable of and why they need it. If it can’t be shown off, it’s almost destined for failure.

For proof that consumers follow the “seeing is believing” mentality, one needs to look no further than the success of the Wii and the failure of the Wii U. When Nintendo unveiled their new, odd-looking controller for the Wii, they were met with puzzled looks and skepticism. That is, until Nintendo showed off Wii Sports. Want to bowl? Swing the controller like you would a bowling ball. Playing golf? Swing the controller like a golf club. Tennis more your speed? Just swing the controller like a racket. The concept clicked instantly, because it was easy to demonstrate to people who weren’t physically there to play it for themselves.

Fast forward to the Wii U, which featured asymmetrical gameplay thanks in large part to its new controller, a giant tablet with a touchscreen in the middle. When Nintendo first unveiled the new hardware, there was instant confusion – some members of the press (and plenty of consumers) initially thought it wasn’t a new console at all, just a new peripheral for the Wii. Once that was cleared up, Nintendo tried to show off how the controller worked, but it was difficult to display action taking place across two screens. This led to a collective shrug of indifference from the gaming community, and Nintendo took a financial hit, posting losses to their bottom line they hadn’t seen in years. If Nintendo’s inability to effectively demonstrate the Wii U isn’t enough proof that people need to understand or experience a product in order to buy it, then look no further than the now-abandoned use of 3D on the handheld 3DS console, or the entire existence of the Virtual Boy.

The new direction the industry is heading in is one that’s difficult to sell, which is a problem in itself, but it also highlights a concerning trend the industry doesn’t seem to care to address – the rising cost of gaming.

More Money, More Problems

[Image: Mass Effect: Andromeda 4K wallpaper]

As a fan of Mass Effect, I’m looking forward to the next title, but I’m unsure why I need a PlayStation 4 Pro to play it.

Video games are expensive, and there’s no sign of that trend reversing itself anytime soon. If I were a novice gamer looking to make the leap into the realm of dedicated gaming, and wanted to upgrade my setup so I could take advantage of VR, 4K resolution and HDR while playing the majority of titles available to me, it would set me back $3,300, broken down as follows:

  • 4K TV – $800
  • Xbox One S – $300
  • PlayStation 4 Pro – $400
  • High-end gaming PC – $1,000
  • VR headset – $800

Keep in mind, this assumes that I already have all of the necessary cables to get the benefits of 4K and HDR, that I have the furniture to mount all of these electronics with enough airflow that they don’t overheat, and that they are connected to the internet at the best possible speed (a decent gaming-focused router would set me back approximately $200) – and it doesn’t include taxes. On top of that, I’m out well over $3,000 without a single game to show for it (and the point of this setup is to be able to play any game I want). Going one step further, this also assumes that I don’t care to play the latest Mario or Zelda titles – which happen to be some of my favorites across all platforms (but with the rumored NX console being such an unknown, it’s hard to account for it in this cost analysis). In short, this is a conservative estimate, and yet the price is still way too high.
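For the sake of transparency, here is that math spelled out (a rough tally using the list prices above; actual prices will vary by retailer, and tax is excluded):

```python
# Rough cost-of-entry tally using the list prices quoted above (pre-tax, zero games included).
setup = {
    "4K TV": 800,
    "Xbox One S": 300,
    "PlayStation 4 Pro": 400,
    "High-end gaming PC": 1000,
    "VR headset": 800,
}

hardware_only = sum(setup.values())   # 3300
with_router = hardware_only + 200     # 3500, after adding a gaming-focused router

print(f"Hardware only: ${hardware_only}; with router: ${with_router}")
```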

This trend isn’t limited to hardware; it applies to software, too. Games are already expensive at $60 a pop, but now it’s more common for a game to come with a season pass than without one. In reality, I paid $85 for The Witcher 3: Wild Hunt and $90 for Fallout 4 (in the case of Fallout 4, if I were to buy the season pass today, the game would actually cost $110). If I wanted to buy just one game for each platform listed above, it could in theory set me back $300. And if you have a friend or partner who wants to play with you? That’ll be $60 per extra controller, thank you very much.

This presents a huge barrier to entry into gaming, one that publishers and console manufacturers don’t seem in any rush to solve. Each company thinks it has the best platform, so it pushes its own as if it’s the only one you’ll need. But what if I really like Halo, want to try out mods for Fallout 4 without having to do a ton of research, play Mario Kart 8 with my friends and check out this indie title No Man’s Sky that I keep hearing about? I could cut out the PlayStation 4 in that (very real) scenario, but then I’d be left without Uncharted 4: A Thief’s End or the upcoming The Last Guardian. If I’m already investing thousands of dollars into this, why should I be prevented from playing every title?

This high cost barrier is the reason mobile gaming is so popular – everyone already has a smartphone, so why not also use it to play games? Getting into mobile gaming is far cheaper and easier, and the reality is that dedicated gaming could follow suit. Most people have a TV whose intended use is to watch movies, sports or TV shows, so why not play games on it, too? Except, unlike smartphone owners, potential gamers have to jump through increasingly expensive hoops, even if they want to limit their purchase to one console, a few games and an extra controller.

When costs are this high, most gamers simply accept that they can’t play every title, which eventually leads to the pointless fanboy wars seen nearly everywhere on the internet. Consumers want to feel good about the expensive purchase they made, and when they have a choice between competing products, it’s easier to believe they made the right choice by putting down the competition. This isn’t exclusive to video games (Ford vs. Chevy is a long-running example of the same bickering), but that doesn’t mean we should accept it blindly, because it leads directly into the most damaging aspect of the modern video game industry, the one that’s been a thorn in the side of progress for decades – fragmentation.

Everyone Can Have A Little Bit Of Fun

[Image: Destiny: Rise of Iron]

The latest expansion for Destiny abandons seventh-generation consoles in an effort to reduce the negative impact of fragmentation.

Let’s pretend for a moment that the current video game landscape looked the same, but somehow through the process of magic, prices plummeted and everyone could afford to buy every console and piece of hardware. Wouldn’t that solve all of the problems? Not quite, because we would still need to contend with market fragmentation.

This year marks the third year of Destiny, the sort-of maybe-it-is maybe-it-isn’t MMO from Bungie, Inc. To celebrate, Bungie is releasing a new expansion, Rise of Iron, and for the first time the content will only be available on PlayStation 4 and Xbox One. That means owners of Destiny on a seventh-generation console will miss out on all the fun. Want to guess the reason why? Bungie got tired of developing for so many markets, especially when some of those markets prevented them from creating the game they wanted to create. In an article published by VG247, Bungie community manager Eric Osborne notes:

“In order to add new content at this time, and especially on the scale and scope of Rise of Iron, we would have to take away from those older consoles, you would have to lose something.”

In other words, the older consoles were holding the experience back, and Bungie finally said enough is enough and went ahead with a less fragmented approach.

The effect this has on developers and publishers is very real, and when a developer has to account for numerous consoles, it can negatively impact the final product. This was explored in a piece we published earlier this year on Assassin’s Creed. For a title of that scope to release on a near-yearly basis is in itself a monumental task, but to also release it on as many platforms as Ubisoft did at one time is asking far too much. At its height, with Assassin’s Creed IV: Black Flag, the title released on Microsoft Windows, PlayStation 3, PlayStation 4, Wii U, Xbox 360 and Xbox One. If we’re being generous, we could lump the Microsoft platforms into one, since Microsoft has done an admirable job of keeping the architecture similar across all three (in reality they have their differences, but for the sake of argument let’s assume Ubisoft could consolidate development of those three versions into one). The same cannot be said for the PlayStation 3 and PlayStation 4, which feature such vastly different architectures that Sony itself hasn’t been able to get backwards compatibility working on the PlayStation 4, instead relying on a streaming service that price-gouges players (the only upside for developers is that the PlayStation 4 isn’t radically different from the Xbox One). Add the Wii U’s touchscreen into the equation, and at best Ubisoft had to make four different versions of the same game. This eventually caught up to Ubisoft, which is now focusing more on a film adaptation than on another yearly installment.

All of this doesn’t even consider what initially prompted this essay – that Microsoft and Sony are gearing up to release considerably more powerful versions of their existing consoles, without an upgrade plan in place. Both companies claim that every game will be playable on the original consoles as well as the new ones, but if that were the case, why upgrade? The answer can be found in the smartphone market – eventually, consumers will need to purchase the upgraded hardware in order to use new software. But as of today, Sony and Microsoft are simply shrugging this concern off, hoping gamers will be wowed by the possibilities of 4K and HDR, even though gamers can’t see those benefits for themselves.

[Image: Fallout 4 mods]

Thanks to a fragmented market, a large portion of Fallout 4 players will never get to run across the Commonwealth wielding a lightsaber. For shame.

This issue of fragmentation only gets worse from here – in the span of a week, Sony announced the PlayStation 4 Pro, touting 4K and HDR gaming, then dropped a bombshell on the industry: the Pro cannot play native 4K Blu-ray discs, something the already-on-the-market Xbox One S can do. To further demonstrate just how absurdly fragmented the industry is, Bethesda Softworks finally announced that, thanks to some outdated internal policies at Sony, mod support would not be coming to the PlayStation 4 for either Fallout 4 or the upcoming remaster of The Elder Scrolls V: Skyrim. So if you bought into the 4K hype and happened to buy into Sony’s ecosystem, and you don’t want to spend $1,000 building your own PC (or even more buying a premade one) and tinkering with it when things inevitably go haywire (which they will, regardless of whether you or a company built it), then congratulations – you’re either the proud new owner of an Xbox One S, or your fancy 4K TV will only ever take advantage of half of what it’s capable of.

The kicker in all of this? Save for a few discrepancies here and there, the architectures of the Xbox One and PlayStation 4 are not vastly different. Gamers aren’t even getting unique experiences from each – if Sony and Microsoft were willing to meet halfway and make one console that mixed their latest offerings, not much would have to change. As it stands, the only reason there is any difference between the two is because… one was made by Microsoft? The logic is just as unclear as the future of the industry, seemingly different just for the sake of it, with no real benefit to the gamer.

A Ticking Clock

It’s unclear how much longer the gaming industry can sustain itself with this business model. I would consider myself a very dedicated gamer, and even I don’t have a PC capable of running any VR headset, nor do I have a TV that can do 4K and HDR. Even so, I had to spend an arm and a leg to get the setup I do have, and I had to make sacrifices in many other areas of my life just to afford it. I’m perfectly willing to part with a large chunk of change just to play video games, so it should raise alarms inside the halls of Sony and Microsoft when gamers like me are wondering aloud whether it’s worth this much money.

Gamers need to be able to see the benefits of these changes in order to buy into them, and the industry needs to find a way to drastically lower costs. Otherwise, we’ll eventually reach a point where video games are a luxury item that few can afford, and considering the potential the medium has to engage players and tell dramatic, impactful stories, that would be a real loss. Art is at its most powerful when everyone can experience it, and video games shouldn’t be an exception to that rule.