
Of all the bargains to be had in the Harrods New Year sale, none shines quite so ostentatiously as the store’s 24-carat gold-plated Xbox One, sat in gaudy resplendence under thick, presumably ram-raid-proof Perspex. While the console (purportedly the only one of its kind) has endured an ego-shanking £3,500 discount to its original £5,999 price tag, it remains one of the most expensive pieces of video game hardware in the world. This will be of small comfort to the sulking internet commentators who, in the past few weeks, have bemoaned the launch price of the Oculus Rift, Facebook’s forthcoming virtual reality headset. The technology, which will lead the VR charge in March, will cost £499 at launch (or around £1,000 for a package that includes a capable PC), much more than was previously expected.
The device’s inventor, 23-year-old Palmer Luckey, who had previously stated that if the Oculus cost $600 “nobody would buy it”, felt compelled to visit Reddit, that online pool of tranquil discourse, to apologise for “misleading” consumers on the issue of cost. “The unfortunate reality… is that making a good enough VR product… was not really feasible at lower prices,” he wrote. In part, consumers’ consternation derives from the gap between their expectations of the technology’s cost, inflated by their excitement for it, and the dampening reality of the final price tag. The rage also springs from the fact that, in the video game industry perhaps more than any other, our sense of value is primarily set by market norms, rather than more meaningful metrics: material costs (Facebook will initially lose money on every Oculus Rift sold), labour, or the sunk investment of research and development. We expect video games, and the machines on which they run, to fall within a relatively narrow band of pricing, regardless of what went into their creation.
Similarly, the price of games themselves has remained fairly constant for the past 20-odd years: blockbusters cost around £40-£50, budget and independent titles around £10, and phone and tablet games go for a couple of quid or nothing at all. The pricing reflects little beyond consumer expectations; it often doesn’t account for the number of hours that went into a game’s production, or for the value and quality of the game itself.
The Witness, the major game of the moment, is an independent title that’s bucked the trend. When its creator, Jonathan Blow, announced that the 3D puzzle game would cost £29.99, there was an outcry on social media and forums. Blow argued that while The Witness is “independent in the artistic sense”, it was an expensive game to make, built by a small team over a seven-year period (Blow spent close to $6m on its development). “I picked a price point that I felt was fairly reflective of what the game is,” he told me last week.
Blow says that once people had played the game, there were few complaints about its cost, but the pre-lash nevertheless demonstrates how the games industry struggles to adapt pricing to reflect the vibrancy and variety that’s now on offer in the medium: everything from short-form biographical titles through to lingering epics.
But by what measure should these prices be set? Unlike films, where the narrative progresses at a rate set by the director, video games have no fixed running time; what takes one player an hour to complete may take another five – especially in a game such as The Witness, where one’s rate of progress is dictated by aptitude or attention. Less scrupulous developers will charge whatever they can get away with. The conscientious fix a price that reflects the complicated cat’s cradle of costs that go into making a modern video game. Regardless, as any struggling artist will tell you, price and value don’t always correlate. Hopefully, however, as pricing becomes ever more flexible, the gap between the two will close.
