The three console makers’ big E3 press conferences brought to the fore a tension that has been lurking behind much of the commentary on recent console and peripheral releases. All three touted their major technological gimmickry continually throughout their media briefings (Microsoft had voice recognition, Sony 3D, Nintendo whatever the hell they are up to,* and of course everyone had motion control).
Most observers have, implicitly, a preferred trajectory in mind for the video game industry. Some people expect that the industry will progress similarly to, say, the film or television industries, where developers will naturally become more adept with the language of games and allow the medium to expand into richer, denser, and more satisfying narratives. Another common notion is that continual technological upgrades will improve graphical fidelity, AI quality, and interactive immersiveness until games regularly approach something resembling the Holodeck. What goes too frequently unsaid is that these two aims, and many other plausible endgames, are irreparably at odds.
At its outset, film seemed to offer incredible potential for technical experimentation and growth, not unlike gaming. However, the development of the movie industry as we know it was predicated on the rapid standardization of most of these elements. Edison himself set the film gauge, American and Soviet propagandists popularized the pre-eminence of editing and quick cuts, and the industry itself established a regular frame rate and duration. Notwithstanding advancements like sound, color, and now digital film, these have remained largely unchanged to this day. Likewise, television, though originally conceived of as an all-purpose content delivery device, quickly adopted the prevailing model set by radio commercial breaks and stuck with it until the ubiquity of cable access exploded viewer options (and even now, most networks follow the tropes established by wireless). In both cases, this consistency provided a stable environment in which directors and writers could fiddle solely with the content, which in turn allowed the mediums to expand artistically.
Likewise, experimentation with the narrative aspects of games requires, at minimum, technological standardization for a lengthy period of time. As it did for other mediums, this process allows development teams to get their bearings and successfully build off each other’s and their own efforts within a common framework. Additionally, unobtrusive controls (which keep the player focused on content rather than on their inputs) and sufficiently low development costs (which lower the barrier to entry for creative would-be designers) also seem quite conducive toward allowing gaming to explore its narrative potential. Basically, the ideal would be for the PS2 (or Gamecube or X360 or whathaveyou) to become an unchanging standard for the next fifty years. This technological stagnation, though, is obviously anathema to anyone who prefers to imagine gaming as evolving toward simulacrum rather than art. Regular console hardware upgrades and increasingly complex input schemata are transparently necessary for this process, and the regular software resets it entails might even be a feature, not a bug, in this mindset.
Though my preferred alternative can probably be inferred from my other posts on gaming (if not this one itself), I am not all that interested in advocating for one option or another. Instead, I simply wish gaming press and critics recognized and voiced these conflicts more openly and more candidly. Those outlined above are hardly exhaustive, either. They do not account, for instance, for those who would prefer to see gaming go the route of professional sports (as with Starcraft), companies who herald the “social media” free-to-play model as the inevitable future, or the likes of Jane McGonigal and her proposed gamification of everyday life (in McGonigal’s case, for the betterment of human nature). All of these are, to varying degrees, mutually exclusive end results. Gaming cannot be about both competitive death matches and harmonious cooperation. But, too often, the relevant media celebrate all of these competing visions, often simultaneously or in quick succession, out of some sort of misplaced enthusiast pride for all developments in the medium. Other times, you encounter people with different visions talking past each other about the desirability of certain developments without any realization of why they differ beyond the fact that they do. The future of gaming would be far better if different actors and commentators were more upfront with their ideal “end game” and, more importantly, how others’ visions interfere with its realization. Otherwise, we will end up muddling through with everyone pursuing their own ends (and lauding them the whole way along) but ending up in a position where no vision can be even remotely attained.
*It seems that Nintendo, at least since the DS (or maybe the purple lunchbox), designs its hardware based solely on what will produce the most confused gasps when revealed at E3, and names it based solely on what will elicit the most pained groans.