It’s October, which means the biggest video games of the season are upon us. Call of Duty WWII, Star Wars Battlefront II, Middle-earth: Shadow of War, Super Mario Odyssey, and Assassin’s Creed Origins are just a few of the many heavy hitters vying for the attention of consumers this holiday season.
The biggest games of the year have the potential to sell tens of millions of units, raking in literally billions of dollars in revenue. At the same time, development teams have grown larger, game creation itself takes longer, and budgets have become greater than ever. After all, it takes a lot of effort to make today’s games look and feel so amazing. Through it all, however, the price of a retail game has not changed.
Back in the PS2 era, new games for home consoles would cost consumers $50. With the advent of the Xbox 360 and PS3 in 2005 and 2006, respectively, that price rose to $60. Despite the rise of a new generation of consoles and over a decade of inflation, the average PS4 or Xbox One game still costs only $60. In fact, some titles, like Knack 2 and Everybody’s Golf, cost just $40, an aggressive price point for fully featured retail games.
Naturally, there’s a catch. Many games (most, really) feature downloadable content: additional items, levels, and features which can only be accessed by paying more money on top of the initial cost of the game. The latest wrinkle in the DLC formula is the inclusion of “loot boxes,” a form of microtransaction designed to keep players paying extra money for months and even years after they first buy a game. In 2017, is this trend a problem, or are loot boxes and microtransactions a necessary evil?
What Are Microtransactions?
Typical DLC (downloadable content) comes in the form of levels or expansions. Back in the old days, PC games would receive “Expansion Packs,” such as StarCraft: Brood War or Diablo: Hellfire; these aren’t sequels but, as the name suggests, expansions of the original game’s content, built within the same framework and engine.
Barring atypical examples like Sonic 3 and Knuckles, expansions were, for the most part, the domain of the PC gaming space. That all changed with the launch of the Xbox 360 and PS3 and their easily accessed internet capabilities. Almost immediately, nearly every game had some sort of DLC avenue, from extra multiplayer maps in titles like Call of Duty and Battlefield, to new areas to explore in narrative-driven games like Fallout 3 and Borderlands.
On the opposite side of these hearty and expansive offerings, we have microtransactions. Basically, a microtransaction is when the player pays a small amount of money for something extra in a game, such as an XP boost, a piece of equipment, or even a pile of in-game currency (paying real money for more fake money). That last method is often the main source of income for free-to-play Massively Multiplayer Online RPGs. Games like Star Trek Online and The Lord of the Rings Online rely on a thriving in-game economy to make money, rather than on monthly subscription fees and paid expansions.
What Is a Loot Box?
One form of microtransaction is the loot box. This isn’t the same as finding a chest in The Legend of Zelda or Final Fantasy; while those are literally boxes filled with loot, a loot box in this sense is an item purchased with real money (though one can sometimes be earned without paying real dollars). Sometimes they’re called Crates, War Chests, Lockboxes, or something else, but the principle is always the same: the loot box is opened, awarding the player a semi-random selection of items. It’s akin to buying a booster pack for collectible card games like Pokémon, Yu-Gi-Oh!, or Magic: The Gathering. The consumer isn’t buying an item, but the possibility of an item they want, while incurring the risk of being stuck with something worthless to them.
Games like Overwatch use loot boxes to distribute new emotes, profile customization options, and costumes for characters to wear. These items are purely cosmetic and have no effect on gameplay, and loot boxes can be earned just by playing the game or bought with real money. Reception has been positive because of the way the game doles out a reasonable drip of loot boxes just for playing, fueling an addictive cycle of “play the game, get loot, show off loot by playing, repeat.” There’s always the temptation to spend real money; the way the trickle of free boxes slows the more you play is intentional and insidious, and it incentivizes paying for more boxes. Regardless, many fans consider loot boxes an integral part of the gameplay loop, for better or worse.
Loot Boxes In 2017
Middle-earth: Shadow of War, published by Warner Bros. Interactive Entertainment, is out now on PS4, Xbox One, and PC, and this predominantly single-player game features loot boxes. Throughout the game, players recruit an army of Orcs, enough to see them through to the end of the main story. Orcs can also be purchased with real money via loot boxes, but that’s not immediately necessary. After the main quest is completed, the player is presented with an endgame story arc which blocks progression until they amass an insanely powerful army of high-level Orcs. The player is essentially given a choice of how to reach Shadow of War’s true ending: pay up for premium followers, or face the grueling grind to conquer Middle-earth, slowly leveling up Orcs in a seemingly endless loop.
Despite the critical acclaim Shadow of War has received from major industry outlets, the conversation around the game has been overshadowed by accusations that it is “Pay to Win.” To many, the consequence of not buying loot boxes is a slog which drains all the fun from the experience.
This year’s other huge game which locks player progression behind loot boxes is EA’s Star Wars Battlefront II. Many multiplayer shooters feature loot boxes, including the previously mentioned Overwatch as well as Call of Duty and EA’s own Battlefield. However, while those games’ loot boxes only ever contain cosmetic items, the recent beta for Star Wars suggests that Star Cards, the crux of the game’s entire progression system, are tied to opening loot boxes. Of course, those boxes can be purchased with real money. When loot boxes are purely cosmetic, the argument can be made that it doesn’t matter how expensive they are, since they cannot affect gameplay balance. But someone who puts down a ton of money on Battlefront II right away will have a distinct advantage over someone who doesn’t.
Following the negative reception to a system fans have rightly decried as shamelessly Pay to Win in a way that ruins gameplay balance, EA promised that things would be different when the game releases on November 17. The publisher has assured players that the system will be more balanced than it was in the beta, but time will tell whether that ultimately pans out in the players’ favor.
The New Normal?
It’s been said that DLC and microtransactions are a necessary evil. The price of a video game in 2017 is the same as it was in 2005, and the secondhand market continues to eat into profits. DLC and loot boxes allow some money to ultimately find its way into the hands of publishers and developers, even from a player who purchased their copy of Shadow of War used from GameStop or a local game store.
Originally, microtransactions were only seen in MMOs, which are usually Free to Play. Even cheaper games, like LawBreakers and Drawn to Death, have an easy time getting away with an overt reliance on microtransactions. The question then becomes: do microtransactions have any place in a $60 game? After all, the consumer just paid $60!
The most expensive Season Pass DLCs cost $50. At launch, with the Pass, the first Star Wars Battlefront cost $110; Batman: Arkham Knight cost $100. A perceived benefit of loot boxes is that microtransactions take the place of a traditional DLC system. Some games which bring in money from loot boxes actually eschew paid map packs: Titanfall 2, Killzone Shadow Fall, Uncharted 4, Halo 5, and even Star Wars Battlefront II have released (or will release, in the case of Star Wars) all additional maps and modes without charging players. That allows everyone to play together no matter how much money they put in after buying the game, which keeps the community alive and raises the chance that more players will spend money on microtransactions. If more players are playing, then more players are paying.
It has to be noted that loot boxes (and the rewards they grant) are significantly cheaper to create than map packs and more traditional forms of add-on content, which leads to greater profit per sale for the developers and publishers. Are loot boxes the price the industry has to pay in order to keep the cost of games from rising above $60? Or is it just a ruse to keep on sucking money from gamers’ bank accounts?
Even if microtransactions are incentivized by publishers and thrust onto consumers, the vast majority of gamers can ignore them and still enjoy the games they love, often with the added bonus of not having to buy maps for titles like Star Wars Battlefront II. However, for players with tendencies toward addiction, the temptation can be too great, and much of their disposable income will go toward spending more on games they already own. A player who bought the first Battlefront on launch day with the Season Pass spent $110. Someone who buys Battlefront II and invests heavily in loot crates could potentially spend many hundreds of dollars before moving on to another game. Is that an unfortunate side effect of enabling reckless spending, or is it the goal of having loot crates in the first place? The dubious ethics of loot crates have led to them being regulated in some countries, though they remain completely legal, and technically not a form of gambling, in the United States.
Loot Crates: Good or Bad?
Loot crates are everywhere. Overwatch uses them in a way which keeps players coming back, month after month, to enjoy themselves and try to win exclusive goodies, but Shadow of War pokes and taunts the player with a slog of endless boredom until they either persevere or pony up to see the game’s true ending. Last year, Gears of War 4 found itself in hot water with fans after its own overpriced loot box system left Horde Mode effectively Pay to Win. Meanwhile, Forza Motorsport 7 is in trouble for changing the system from Forza 6 to lean harder than ever on its Prize Crates and Mod Cards. Even Destiny 2 has received flak for changing its shaders from unlimited-use items to consumables which can be purchased in crates.
Publishers are learning what they can and cannot get away with when nickel-and-diming consumers with microtransactions, much like they had to learn what they could get away with in terms of larger DLC content. Capcom’s fighting games became infamous for including DLC characters on the game disc while keeping them locked behind a paywall. These days, consumers are savvier to such tactics, and the practice of slicing off part of a game and selling it later is far less prevalent than it once was (although pre-order bonus missions remain the bane of many a gamer’s existence). In a way, loot boxes are a workaround, as the content granted by the crates can usually be earned within the game itself. It’s still locked away from most players, but in a different way than Jill Valentine was in Marvel vs. Capcom 3.
Loot boxes are here to stay. If publishers can implement them in ways the gaming community feels comfortable buying into in large enough numbers, while ironing out the issues seen in titles like Shadow of War, Forza 7, and Gears of War 4, then maybe gamers won’t feel as cheated as they currently do by the less savory titles making use of the practice in today’s market.