June 28, 2007 8:01 AM | Leigh Alexander
[The Aberrant Gamer is a weekly column by Leigh Alexander, dedicated to the kinks and quirks we gamers tend to keep under our hats. However, this special column deals, head on, with the subject of violence in games.]
The year is 1993, and a few kids are at the arcade, playing two-player Mortal Kombat. Sub-Zero versus Sonya, and the ninja’s winning. One of the kids is so small she has to hop up and down to watch the fight, and she often does.
Sub-Zero’s being played by the oldest of them, and the kid’s practiced at this. The onlookers know exactly what’s coming. “Finish her!” they cheer in unison at Sonya’s dizzy swaying. The oldest kid bites his lip, steeling himself on the controls. Everyone’s watching, and he’s gotta do it right.
As Sub-Zero wrenches Sonya’s head from her body with the spine still attached, the oldest emits a guttural cry of triumph. The littlest child hops extra high to view the blood-drenched words spattered across the screen. Fatality.
All the kids squeal with the delight of the kill. My cousins and I.
Gamers had their psychological stability challenged several times in succession this past week. First, the American Medical Association decided to review whether “videogame addiction” should be classified as a psychiatric disorder (it ultimately declined to classify it as such). Second came the ubiquitous Manhunt 2 debacle: a game so violent that ratings boards in multiple countries essentially deemed it unfit for play by anyone, and the manufacturers of its intended consoles refused to carry it. Last, the ESRB effectively yanked montage trailers of the upcoming third-person shooter Dark Sector from various websites, including Filefront.com, due to content it deemed “excessive” and “offensive”—despite the fact that the video in question had been available for public view since the end of 2006.
Such moves may have wider-reaching implications beyond the immediate—after all, human history has repeatedly demonstrated the peril of the oft-cited slippery slope. The dual implication of the ruling—first, that future games will be policed with close scrutiny to be sure they’re family-friendly, and second, that a ratings board is judging propriety for an entire variegated demographic en masse—raised the hackles of gamers worldwide (as well, perhaps, it should have). Vitriolic message board rants, online petitions and incensed editorials were the norm, as the Manhunt 2 ban brought the concept of de facto censorship right to our door.
But before we make family a dirty word, before we organize ourselves on the defensive, if we are to mount a successful counter-attack, we must come to the discussion table with our minds completely in order.
Back to the nineties—1997, to be exact. I’m fifteen years old and Final Fantasy VII is, for the moment, the living end. My little sister, quite savvy for nine years old, is watching me tap my way through some battles against enemy SOLDIERs, critiquing all the while.
“How come he shot you five times, but Tifa can kill him right away by punching?” I don’t understand that either, but since it works out in my favor, I don’t mind.
“How come when you kill them they just turn red and disappear?” she asks.
“How come there’s no blood?”
Momentarily confounded for an explanation, the best I can do is, “I guess they can’t show that, or something.”
She ponders this, and agrees. “It looks stupid, though,” she says.
It did look stupid. We were no longer in the days when a hop on top of a monster’s head would do the trick. Characters had facial expressions now. We had brimstone and ash; we had dirt and torn clothes. The absence of wounds was simply out of place. When we asked for more blood, it was only such incongruities we were referring to. It wasn’t so much that we wanted viscera; we merely wanted realism. The confetti of red pixels that erupted from a Resident Evil dog bite, once considered gory, eventually came to seem artificial enough to interrupt our experience.
Now, we go for the throat, for the head. We’re eviscerating and vivisecting our enemies in increasingly elaborate ways. We’ve got an ice pick. A flamethrower. A chain saw. And melee games evaluate our brutality as we play. Vicious! Insane! Awesome.
The argument in favor of Manhunt 2—and many other more graphic games—is that we’re free to play them without the extra execution. The New York Times, when demoing the game, pointed out that the gratuitous kills are optional.
But then, what would be the point?
The Times also revealed that, beyond the basic bat-whacking and glass-stabbing in Manhunt 2 (this is what passes for generic these days), “you can stab him, wrap a cord around his neck, stuff his head in a toilet and smash him on the back of the head.” As the reviewer, Seth Schiesel, pointed out, on the Wii this means approximating the mutilation gestures with your own hands as you play.
Nowadays, game makers would never think of putting something like the bloodless death of Aeris in front of audiences. Many games in that decade provided an alternate play mode for those with a sensitive stomach: control over the level of gore. But those choices are fewer and farther between these days. Could you imagine the commercial success of God of War without the “brutal kill”? True, you don’t need to do them. And aside from the occasional blood-washed excesses, it’s a well-scaled, enjoyably rendered and lovingly designed game. But haven’t you ever deliberately executed the most gratuitous combo to finish an enemy? Because you were frustrated, maybe? Furious?
Or because dismemberment, skull crushing and mutilation killings are just fun?
Games are not reality. What we do in fantasy doesn’t necessarily come to bear on life. But to observe the trend in our medium, our fantasies are getting darker. We say that games don’t increase our aggression, our violent tendencies, and maybe they don’t; maybe they never will. There are plenty of hugely successful games—perhaps even the majority—that have no inclusion of violent conflict at all. But when you’re throwing a man into a wall, twisting a neck, scything someone in the face—and your heart rate is up, and your eyes are wide, and you’re utterly gratified by the pop-crunch-splat—we’ve all been there—it calls for an honest evaluation as to whether we are being affected, and how—if for no other reason than the fact that justifying ourselves blindly does us no credit.
We don’t want abstracted violence, do we. We want our headshots messy and our weapons realistic.
Are we crazy?
Are you sure?
The predominating issue here, of course, is that our fundamental right to choose what we consume should not be infringed. Those uncomfortable with violence in games are free to opt not to play them. But lest we forget, even in the First Amendment, there are limitations—think about, for example, what types of pornography are permitted and which are not, and the reasons behind those restrictions. There is far more permitted in cinema these days than there is in games—as Schiesel points out, the worst scenes in Manhunt 2 were not as “bad” as the Saw series of films. But then, movies don’t require participation, personification.
The idea of anyone else—“families,” ratings boards, et cetera—setting limits for us without our input is patently provocative. Much of it simply comes down to the fact that a stance against our industry is a hot trend in an unstable political climate. But before we defend ourselves with righteousness against the idea that there might be a line somewhere that we’re fast approaching—if the defense we mount is to be at all effective—it’d behoove us to take honest stock of the blood on our hands.