The cost of making triple-A games is beyond ludicrous. In the last generation of consoles, the PlayStation 2 era, the average cost of developing a blockbuster game was between one and four million US dollars. By 2010, the average cost of development had risen to above $20 million – and that’s discounting games with ridiculous budgets, like Grand Theft Auto 4 ($100M), Gran Turismo 5 ($80M), L.A. Noire ($50M) and APB ($50M). That’s without factoring in marketing budgets.
That is a staggering amount of money – and it explains why some games fail to turn a profit even after selling a million-plus copies. Asset creation, voice recording, engine development, graphics rendering and human capital all cost a fortune – and those costs are only going to climb with the next generation of hardware. We could see games costing in excess of $100 million – at which point the only way for publishers and developers to make money is to sell multiple millions of copies of their games, then pad them out with DLC, or go the free-to-play route, nickel-and-diming consumers for everything they’re worth.
Taking risks and launching new IP is already the sort of thing that can leave a developer in tatters if a game underperforms. At current game prices – assuming the publisher nets somewhere in the region of $27 of each $60 sale, and not taking marketing into account – the average $20 million game would have to sell around 750,000 copies just to break even. A $100M next-gen game would have to sell just under 4 million copies before it started making any money – and sales like that tend to be reserved for sequels to established blockbusters, like Call of Duty. It’s incredibly likely that next gen we’ll see nothing but the sort of stuff publishers think will sell – first-person shooters, shoehorned multiplayer and poor Skyrim clones.
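The break-even arithmetic above can be sketched in a few lines. The ~$27 net-revenue-per-copy figure is an assumption chosen to be consistent with the article’s own numbers ($20M ÷ 750,000 ≈ $27), not a quoted industry statistic:

```python
def break_even_copies(dev_cost, net_per_copy=27.0):
    """Copies a game must sell to recoup its development budget.

    Marketing spend is excluded, matching the article's framing.
    net_per_copy is an assumed publisher take per $60 copy after
    retailer, platform-holder and distribution cuts.
    """
    return dev_cost / net_per_copy

# A $20M game needs roughly 740,000 copies to break even.
print(f"{break_even_copies(20_000_000):,.0f}")
# A $100M next-gen game needs roughly 3.7 million copies.
print(f"{break_even_copies(100_000_000):,.0f}")
```

Nudging `net_per_copy` down (deeper retail discounts, used sales the publisher never sees) pushes the break-even point toward the "just under 4 million" figure cited for a $100M game.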
The question is – will next-gen development costs make videogames unsustainable? Are we set for another giant crash, like the one the industry saw in 1983?
Last Updated: June 29, 2012