
Everyone’s invited … so why aren’t more of us gaming?

Do confusing videogame controllers prevent you from gaming? They shouldn’t. Jeff the Trojan

To the uninitiated, videogames have a bit of a reputation for being a difficult medium. A control pad, for such people, can be a cold and grey device covered with alien buttons, intimidating and unwelcoming to unfamiliar hands.

Otherwise-familiar television screens dissolve into frenetic swathes of colour and movement, too rapid and overwhelming to be inclusive. Games for computers, meanwhile, can involve difficult installation and configuration, downloads of drivers and tools, and always-on internet connections, among other impediments.

Yet at some point in the last decade, things began to shift.

Nintendo, the Japanese games-industry veteran, led an assault on typical assumptions about how videogames interact with their players and vice versa. First came the portable Nintendo DS in 2004, with its familiar stylus interface. Playing was frequently a simple matter of pointing and touching, of using a pen-like tool to interact with the DS software.

Motion-based controllers, such as the Wii Remote, have made gaming more inclusive. Ludovic Maillard

Then came the Nintendo Wii in 2006, which quickly found its way into news stories remarking, with increasing incredulity, on the seemingly new demographics that had discovered it.

It was, by all accounts, a true family videogames console, embraced by mothers, fathers and children alike. It was being used in retirement homes to keep the elderly active without having to venture outside. It was being used by MasterChef judges to while away the hours between shoots. Even the Queen now played videogames (or at least, she did once for the media).

The popularity of the Wii’s body-as-controller interface has had a lasting influence on the games industry, as can be seen in Microsoft’s Kinect and the PlayStation Move.

But the biggest change came with Apple’s iPhone. The “natural” interface of the touchscreen and the meagre price of apps quickly enabled videogames to approach ubiquity. Commuters, tradespeople, uni students and office workers might have come for the iPhone, iPad or Android’s cool or ease of use, but they stayed for Angry Birds and Fruit Ninja.

That videogames are approaching ubiquity, then, is no coincidence. Despite some prior technological impediments (such as the blocky imagery and low processing power of early videogame machines), particular strands of videogame design have always reached for pervasiveness.

From the holodeck-like urge for photorealistic graphics to the videogame’s appropriation of the language of cinema, visual literacy has historically been the clearest manifestation of the desire for universal appeal.

For a brief period in the 1990s, film itself was even incorporated into the videogame, with small-scale backlots being set up at game design studios across the United States.

The recent movement towards reduced barriers to entry in interface, control and price represents a final step towards an open technological foundation for videogames.

But it’s difficult to imagine what further steps could be taken on this path. A one-button interface on an iPhone (as in games such as Canabalt or Jetpack Joyride) is as simple as videogames can possibly get.

This simplicity translates elsewhere.

Big-budget console games such as L.A. Noire and Mass Effect 3 have options for toning down reflex-based segments to a negligible (or even skippable) level, allowing players so inclined simply to engage with the game on narrative terms instead.

To a large extent, the videogame is now as welcoming as it can possibly be. A videogame will always ask action of its players. It will always require a certain assertiveness on the part of its users. It is hard to imagine things getting easier than they currently are: it is now up to those reluctant would-be players to grasp the medium.

As Dr Jeffrey Brand of Bond University argued at the Freeplay Independent Games Festival in 2011, non-gamers can suffer a crippling fear of failure. Yet that must be tempered by the fact that videogames, as a creative form, have come far enough for that fear to be revealed as baseless. The excuse of barriers to entry now rings hollow.

This is not to say videogames are never exclusionary. Indeed, when it comes to writing and representation, they can be highly exclusionary.


Videogames frequently present a limited outlook for non-white, non-male, non-straight and disabled characters and players, though there are growing pockets of dissent.

Just as problematic is the tendency for videogames to get caught up in established rhythms and genres. The domains of the military (think Call of Duty), of science fiction (think Star Wars games) and of fantasy (think Skyrim) are resoundingly over-catered for. Alternative experiences are possible and can be highly rewarding, but they are sometimes muffled by the stentorian blare of military first-person shooters.

If videogames are to become truly ubiquitous, thematic barriers to entry must be dismantled along with technological ones.

Though it might seem counter-intuitive, given the popularity and iconography of such videogames, moving beyond the currently dominant genres is the obvious next step in securing the videogame’s place as a central creative form of the 21st century.
