Where Do Microtransactions Actually Belong?
Microtransactions... Say it out loud. Say it slow. It feels almost disgusting, right? The ultimate dirty word in the gaming industry at the moment, especially now, in the wake of the whole Battlefront 2 loot box controversy.
Ever since their introduction, microtransactions have been a topic of growing contention among developers and gamers. Some think they're just a natural step in how developers make money; others view them as an encroachment on the industry. Some developers vehemently defend the practice. Others, like good ol' CD Projekt Red, have slammed it as unnecessarily greedy.
I think it's important to figure out the place of microtransactions (MTs) because they're such a common practice now. And if we don't define their appropriate role in the industry, they're likely to spread where they don't belong. So where do MTs actually belong in the gaming industry?
The question has reached a boiling point now that even some U.S. government officials are speaking out against loot boxes, calling them "predatory practices" in video games. That invites a whole Pandora's box of issues regarding government interference that I won't get into here.
Do MTs really belong in the trash? Well, maybe they do. But I think the fatalistic reality is that we're stuck with them. It's not a practice that will ever completely go away, not so long as people continue to spend money on games that have them and on the transactions themselves. And are you really willing to boycott every single game that contains some semblance of paid content? My guess would be no. And if you are, then more power to you.
Some people argue that developers need MTs to actually turn a profit on a game. The problem is that there's not enough evidence that this is the case. Plenty of games get by perfectly fine without them. Naturally The Witcher 3 comes to mind, along with plenty of other titles like inFamous, Cuphead, Horizon Zero Dawn, etc.
So if we’re stuck with MTs but they’re not absolutely necessary to the industry, where do we draw the lines? What is and isn’t acceptable?
My opinion is this: Games should not be modified to encourage microtransaction purchases.
"Pay to win" is a garbage concept that should be done away with. Loot boxes, too, should arguably be cast aside in favor of good old in-game grinding. Games are about gaming and the sense of fun and fulfillment that comes with progressing. If you can just pay to move forward, much of what's at the heart of playing through the challenges is lost.
Then what kind of MT is acceptable? I don't think the average gamer minds a couple of DLC weapons that are cool and don't tip the balance of the game. Aesthetic items like masks, patches, and costumes are also perfectly fine. It's always nice to get some little artistic upgrades for your multiplayer character with that last dollar of your store credit.
And I don't believe it's morally reprehensible in and of itself to attempt to further profit off your work. Car dealers charge extra for custom features, and we consider that perfectly acceptable. But if the car doesn't work the way it should "out of the box," so to speak, without extra payments, then we have a problem. In the same way, if a game is tilted to the point that spending money on MTs is necessary to make it feel complete, we should take issue with it.
So where do microtransactions belong? Ultimately, their place is off to the side as an optional add-on, with the key word being "optional." I don't believe they're the moral blight on the industry that some people make them out to be, but nor are they a practice that should be encouraged.