We're nearing the end of the longest console cycle since consoles had distinct cycles. But to hear some tell it, as hardware and packaged software sales have stagnated, the ability and drive to innovate have similarly hit a silicon wall. Alarms have been raised about when new consoles need to arrive and what sort of magic they will need to work.
Would Borderlands have been better if it were photorealistic?
2K Games executive Christoph Hartmann claimed modern games are stuck in a rut of mindless action because they cannot sufficiently portray emotion and human drama without photorealistic graphics. DICE general manager Karl Magnus Troedsson said gamers are tiring of first-person shooters because most fail to find meaningful new ways to render the action. Ubisoft CEO Yves Guillemot said developers are being penalized by the long console life span, and "it's important for the entire industry to have new consoles because it helps creativity."
These statements from AAA publishers and developers deflect all blame for the middling state of AAA gaming away from the speakers themselves. They encapsulate the tunnel vision so woefully common among the established ranks of the industry. New technology is not the only or best way to drive creativity, and taking that approach is dismissive of the advancements games have made that weren't driven by technology.
The association between new tech--most often in the form of consoles--and fresh thinking in games is not invalid. The numbers show that companies tend to launch many more new franchises at the beginning of hardware cycles than in the middle or end. But this is only for business reasons: new systems create a market full of consumers eager for any title to justify their expensive purchases. This means publishers can afford to let developers throw their new ideas against the wall and see which ones stick. Those which work out well can be turned into lucrative franchises, and those which don't probably still moved decent numbers thanks to a smaller field of games to compete against.
Maybe we haven't bought too many new ideas from the neighborhood GameStop and fed them into our consoles recently, but that doesn't mean there are none. Original ideas are springing up everywhere games can be made and (relatively) easily distributed--observe Minecraft and its successful Xbox Live Indie Games clones! Unabashed intellectual theft is the sincerest form of flattery, after all, and a great metric for how successful any given game is at carving out a new market.
But trying new things is a dangerous business, and for every Minecraft there are dozens of innovative efforts languishing in digital indices. So risk is a game for the young and the small. That's why much of the new vocabulary of games is coming from independent developers. It's not only the indies inhabiting Xbox Live or Steam, either. Mobile phone and Facebook titles exploded because their creators gave different kinds of games to different kinds of people in different places than before, without losing much sleep over competing as graphical powerhouses. Current technology seems to handle most of their good ideas just fine.
It wasn't always this way. In previous generations, new hardware did fundamentally change our games. It expanded our perspectives from two dimensions to three. Then it let us play and compete with other people half a world away. The gaming industry grew accustomed to those leaps, and to the spike in sales and interest that came with them. But, in a beautifully succinct demonstration of the law of diminishing returns, eventually the polygons get so small they can no longer be told apart. And yet some people, chiefly those most deeply entrenched within the AAA game industry, insist they cannot succeed without more. From all this a likely conclusion emerges.
The canvas is ready. Perhaps we just need more inspired artists.