Gaming is a multi-billion dollar industry, and more people play games on a daily basis now than ever. Some AAA titles cost as much as Hollywood blockbusters to produce and often generate even more revenue than movies.
And yet recently, the notion that gaming is dying has gained traction among gamers, as well as prominent streamers and YouTubers. Many complain about the lack of creativity in new games, their state at release, and the nauseating number of microtransactions.
On the one hand, the game industry is doing better than ever. On the other, more and more players grow dissatisfied with once-beloved studios letting them down time and time again.
So, is the game industry really going to plummet soon? Have things really gotten that bad, or is it just an overblown sentiment popularized by nostalgic gamers?
The big question: why is gaming dying?

Gaming may not be on its way out just yet, but those who are concerned about the state of the industry do have some compelling arguments.
Microtransactions and battle pass systems in full-priced games, formulaic open-world titles, and the general hesitancy to innovate among AAA companies are all very real issues. They are part of the reason why so many gamers don’t trust mainstream publishers and proclaim the impending death of gaming.
Microtransactions
In-game purchases and microtransactions are not a new concept. They’ve been around since the early 2000s. Although the “Horse Armor Pack” in The Elder Scrolls IV: Oblivion is considered the first mainstream implementation of microtransactions, players had been paying for in-game content long before that.
Expansion packs, for example, are one type of paid additional content that’s been around for decades. No one has a problem with these for one simple reason: they’re fair. Paying $10 or $20 for more hours of gameplay is an arrangement that benefits both the player and the developers.
YouTuber and streamer Josh Strife Hayes distinguishes between fair and “anti-player” microtransactions quite well in “What Went Wrong With Gaming?” In this video, he points towards the popularization of online games as the turning point in how companies charge for extra content.
In MMORPGs and other always-online games, players can see what items their peers have equipped, leading to competition over who has the best, flashiest gear. Developers and publishers quickly realized they could exploit this rivalry by charging small amounts for exclusive gear.
The financial success of MapleStory, World of Warcraft, and other online titles with small in-game purchases led to microtransactions appearing in single-player games, like the infamous Oblivion horse armor.
Unlike full expansion packs, microtransactions are cheap to produce and sell at a much higher volume.
With comparable revenue and much lower development costs, it shouldn’t be surprising that microtransactions became the go-to option for monetizing games after release.
Microtransactions today
Loot boxes came shortly after cosmetic items and improved gear, and they took things a step further. Instead of paying a few dollars for an actual piece of equipment, players were made to pay for a chance at getting a rare item. The best examples are FIFA Ultimate Team packs and the loot crates in Star Wars Battlefront II.
With the release of Fortnite Battle Royale in 2017, the battle pass concept was popularized further still. It granted buyers access to a steady stream of attractive rewards in exchange for roughly $9.50 per season.
Fortnite became so lucrative that you can now find battle passes in nearly every online shooter, free-to-play or otherwise.
Are microtransactions always bad?

Today, microtransactions are ubiquitous in gaming. Full-price, single-player games like Assassin’s Creed Valhalla sell XP boosts and cosmetic items, whereas gacha titles like Genshin Impact are notorious for being never-ending grind fests for F2P players.
The list goes on and on, and the subject of predatory microtransactions deserves its own article. But are all microtransactions wrong?
I don’t believe so. Studios that create free-to-play games need a source of revenue to keep going. The only other viable alternative is ad revenue, and I’d much rather shell out a few bucks on an F2P title from time to time than be pestered by intrusive ads.
With that said, microtransactions have no reason to exist in $70 games. Whether they speed up progress, give you an advantage over other players, or are purely cosmetic, a full-priced title should let you unlock every item and level just by playing it.
AAA publishers often quote rising development costs to defend their choice to include microtransactions in new releases. It’s definitely true that games cost more than ever to make nowadays, especially in the case of massive AAA blockbusters.
In that case, why not raise game prices even higher? If adding another $10 to $15 to the price tag erased the need for microtransaction systems, it would be a fair trade-off.
Of course, that would be quite a bold move. Microtransactions are a low-risk monetization strategy, whereas a higher price tag would force studios to take the time to polish their games to perfection instead of relying on hype and marketing to drive sales.
Lack of innovation
One of the main arguments in the “gaming is dying” debate is the risk aversion of AAA studios in recent years. Companies like Ubisoft, EA, or Activision, once known for creating some of the most immersive, innovative titles around, are now synonymous with rehashed ideas and annual releases of similar franchise entries.
Let’s take Ubisoft as an example. When Far Cry 3 came out, critics and players alike praised it for its open-world design and variety of things to do. Over ten years later, the French giant’s open-world games still stick to the same exact formula.

Head to a high spot on the map to reveal activities. Clear out enemy camps, find collectibles, and gather materials to upgrade your gear. Repeat. We’re all too familiar with the so-called “Ubisoft formula.”
Ubisoft may have perfected it, but it’s far from the only studio that leans on this type of open-world design. Titles like Metal Gear Solid V, Middle-earth: Shadow of Mordor, and the more recent Hogwarts Legacy have all taken pages out of Ubisoft’s playbook.
This is not to say that the games I mentioned above are bad or have nothing to offer besides their open worlds.
MGSV excelled at giving players the freedom to approach missions exactly how they wanted. Shadow of Mordor’s Nemesis system was so good that even Ubisoft borrowed from it for some of its own open-world games, and Hogwarts Legacy’s unique combat and spellcasting let players get truly creative with their fights.
If it ain’t broke, don’t fix it
Admittedly, I don’t mind the Ubisoft formula. In fact, I enjoy it quite a bit. However, the design blueprint first introduced in Far Cry 3, implemented time and time again in so many open-world games, is bound to tire many gamers out.
Open-world fatigue is something that has been talked about plenty and is part of the reason why lots of hardcore gamers turn their backs on AAA publishers. But the lack of innovation isn’t unique to the open-world genre.
First-person shooters, linear RPGs, and MMOs are all guilty of this as well. The Call of Duty franchise is a great example. After the release of Call of Duty 4: Modern Warfare in 2007, nearly every subsequent CoD followed its multiplayer formula with little to no deviation. The series became a ghost of its former self.
In 2020, Warzone revitalized CoD, and players returned to the franchise in droves. But even that came on the heels of 2019’s Modern Warfare reboot and was a take on battle royale, a mode popularized by PUBG and Fortnite.

AAA studios consistently put out games that many consider to be minor updates of previous franchise entries or of popular gaming trends. After all, change is risky and puts millions of dollars on the line.
With the stagnant state of the AAA game industry, it’s no wonder that long-time gamers begin to think that gaming is dying. To quote American theologian Leonard Sweet: “Stagnation is death.”
Unfinished releases
The omnipresence of the Internet added another problem to the long list of issues plaguing the video game industry: games that are broken at release.
We’ve all been there. You preorder a game, and instead of playing it right away, you’re met with a screen telling you to update it first. Worse still, sometimes the patch isn’t even out yet, and you’re left playing the buggy launch version.
Back in the day, the idea of having to update a game before playing it on release day would have been outrageous. But in 2023, Day 1 patches, roadmaps of upcoming fixes, and delayed game modes are the new normal.

It’s easy to mock Cyberpunk 2077’s disastrous launch, but it’s far from the only major title guilty of releasing in an unplayable state. Nearly every modern Bethesda game comes out as a buggy mess and remains full of glitches even after dozens of patches.
Overwatch 2 was dubbed a “glorified patch” by gamers at release, and it hinged on the upcoming PvE mode to make good on its promise of becoming a fully fledged sequel. Earlier this year, Blizzard officially confirmed that the mode had been scrapped.
You wouldn’t patch a car
I could go on and on. Studios are often praised for delivering on their promises months or years later, as with No Man’s Sky or Fallout 76. Should they really be applauded for simply doing their jobs long after the release date?
We tolerate Day 1 fixes and buggy launches, but I don’t think we should. When you buy a car, you expect it to work right away. If the manufacturer came out with a roadmap detailing how they’ll get your vehicle to run in six months’ time, you’d laugh in their face and demand your money back.
It should be the same with games. Studios carelessly release unfinished titles and charge us $70. Sometimes they take the time to fix them, and sometimes they don’t. They get paid either way. So are gamers really in the wrong for asking: is gaming dying?
Consumers share the blame

As much as I’d like to pin it all on them, the current state of the AAA game industry isn’t 100% the fault of studios and publishers. We consumers, myself included, also help create an environment where these companies can freely disappoint us time and time again.
Despite all the previous launch failures, predatory monetization techniques, and uninspired game design, pre-orders for upcoming AAA releases continue to sell like hotcakes.
The likes of Genshin Impact, Diablo Immortal, or FIFA keep making billions. Each new Assassin’s Creed or Call of Duty game sells millions of copies even as it reuses the same assets and mechanics as its predecessors from a decade ago.
By continuing to preorder AAA releases, buying power-ups in mobile games, and rewarding studios for fixing their broken games, we’re sending the industry a signal.
A signal that tells them that no matter how predatory their monetization systems are, a sizeable portion of the market is going to buy into them anyway.
No matter how boring and repetitive their open worlds are, we’ll still play them because we love the franchises they’re a part of.
No matter how many broken games they put out, they still have our trust.
“Vote with your wallet” is an overused cliche that’s not always true, but in the case of gaming, it’s the only way to make a change. Unless enough gamers refuse to put up with these practices, AAA studios will continue to perpetuate them. And make a boatload of money doing it.
You don’t even have to sell your gaming system and take up another hobby instead. There are still plenty of titles worthy of your attention and money, and that includes some big-budget releases.
Gaming is alive and well
Gaming isn’t dead, and it’s not headed for an early grave anytime soon. The quality of most AAA releases has plummeted, and that’s a fact. Companies like Ubisoft or Blizzard no longer define genres like they used to.
Independent developers have taken on that responsibility instead. They don’t have the budget to make massive titles with photorealistic graphics, and that’s actually one of their greatest advantages.
Without millions of dollars on the line and shareholders to answer to, indie developers are free to take all the risks they want. In fact, they have to innovate. If they fail to grab gamers’ attention by thinking outside the box, they won’t have a chance to compete in today’s oversaturated market.
If you look back at the last decade in gaming, small studios and solo devs are often the ones who introduce solutions that later go mainstream.

Minecraft is arguably one of the most influential games of the century so far. PUBG grew out of a battle royale mode built by a single modder, and it went on to revolutionize the multiplayer shooter scene.
BattleBit Remastered could run on a toaster, and it’s currently filling the Battlefield-shaped hole in the market that even Battlefield itself couldn’t fill for years.
Big studios also make good games (sometimes)
It’s not just indie games that keep gaming alive. A handful of top studios keep delivering quality titles, going against the trend of mediocrity and sometimes even setting the standard for upcoming releases.
FromSoftware’s Elden Ring proved that the open-world genre still has a lot of potential for innovation, whereas PlayStation Studios delivers big-budget, cinematic productions of unmatched quality like Marvel’s Spider-Man or God of War.
And then there’s Nintendo. For decades, they’ve been introducing new ideas, mechanics, and concepts, and the industry followed. The Japanese juggernaut owns some of the most recognizable IPs and never hesitates to shake things up with every generation of consoles.

The Legend of Zelda franchise by itself is arguably responsible for more game-changing ideas than any other series. In 1998, Ocarina of Time introduced lock-on targeting and a camera system built around it, both of which we take for granted nowadays.
More recently, Breath of the Wild’s open world provided an alternative to the Ubisoft formula that inspired countless copycats. I have no doubt that in the coming years, we’ll get plenty of games trying to recreate Tears of the Kingdom’s building and traversal mechanics.
Is gaming dying? Not just yet!

Gaming is alive and kicking. In fact, 2023 may just be one of the best years for video games in recent memory. Titles like The Legend of Zelda: Tears of the Kingdom, Hogwarts Legacy, or Star Wars Jedi: Survivor offer a glimmer of hope for the AAA industry, and there are plenty of big-budget titles still to come.
The indies are also going strong. Dave the Diver is one of the best games I’ve played this year, and judging by the demo, Lies of P is shaping up to be a fantastic Soulslike experience.
The AAA industry has become stagnant and too risk-averse, but gaming, in general, is doing just fine. There’s plenty to look forward to, and with the increasing popularity of the medium, it’s surely going to stay relevant long into the future.