A few years ago, I found myself on a blind date with an English professor. At some point after the second drink, one of us mentioned a feature in the Times that day about a recent slew of steamy, pulpy young adult novels whose sudden popularity had incurred the wrath of both protective mothers and knuckle-rapping critics.
“But at least the kids are reading,” said my date, raising her glass. “That’s got to count for something.”
Does it?
The gut says yes; the brain isn’t so sure. Both make a compelling case. On the one hand, there’s the cumulative experience of reading, a lifetime of plunging into novels, skimming newspaper articles, browsing wall captions, identifying road signs, and the rest of it; on the other, there’s the temptation to brush all that aside and feign a sort of semiotic relativism: “Who are we to say whether certain systems of codification are intrinsically ‘good’?” It’s hard to know which to trust.
I should mention that no one was arguing on behalf of the Y/A books’ literary merit–and let’s assume for the sake of argument they were as crude and morally sketchy as the parents attested–but there was indeed a rallying cry among the literati for reading as an end in itself, as though the activity alone possessed some indefinable, self-justifying virtue.
It’s not an absurd claim. Reading does engage and exercise lexical circuits in the brain, and probably improves certain cognitive faculties, however frivolous the content. (When kids stop reading, their IQ scores tend to drop.) Of course, the same argument has been made for video games: that they strengthen hand-eye coordination and strategic reasoning, and therefore carry just as much “inherent” value as reading.
The crux of the issue has to do with “passive” versus “active” stimulation, or at least what is perceived as such.
Many have pointed out, however, that in actuality, there is no such thing as “passive” engagement of any kind. Even perception itself, the neurologists tell us–the apparently simple act of looking or listening–involves an unceasing, multivalent interplay between the sensory information we receive and the cortical nodes that unpack it. On the most basic level, to perceive is to decrypt, and all media–text, images, numbers, sounds–deliver information in coded packages. The question, then, is why value one type of package over another?
Probably the best way out of this numbing debate is to stop justifying our beloved activities and pastimes on the basis of what they supposedly “do for us” and start recognizing why we find them valuable in the first place.
I once attended a lecture given by Don Campbell, author of The Mozart Effect, a popular book which sought to advance the fashionable theory that listening to classical music at a young age improves test scores later in life. I didn’t have much stake in the hypothesis one way or another. (For the record, I’m inclined to think that there probably is a correlation–though not a causal one; a household that happens to value sophisticated music is more likely to provide a cognitively enriched environment for a child to grow up in.) But what irked me about Campbell’s approach was the idea of prescribing music as a tool of social advancement, as if its sole value lay in how effectively it influenced other, more measurable factors like short-term recall and conceptual organization.
It’s a bit like encouraging your child to play a sport because it ups his chances of getting a scholarship ten years down the line. It certainly could, but far more valuable is the joy the child will have simply playing the sport today.
Yes, music may nudge our “pattern recognition” capacity up by some nominal degree, just as books might enhance our verbal aptitude and sports, our spatial orientation skills, but these are not and should not be the chief reasons why we engage with them. As with so many things that resist quantification–friendship, compassion, beauty–to validate them on the basis of how they affect future performance is to miss the point.
Indeed, each delivery system presents its own suite of rewards, and each is limited by what it alone can offer. Robinson Crusoe cannot compete with “Guitar Hero” on its own ground, nor should we expect it to. (Writes Annie Dillard: “The people who read are the people that like literature… I cannot imagine a sorrier pursuit than struggling for years to write a book that attempts to appeal to people who do not read in the first place.”)
The only conceivable value of trashy books is the dubious but not unthinkable possibility that they might go some of the way towards engendering in young people a love of reading as an end in itself, which in turn might whet the appetite for better books. For many, that’s the only way in. They’ll read Sweet Valley High or Twilight at thirteen, lose their taste for it by fourteen, and demand something richer and more challenging at sixteen. Or so the thinking goes.
If the argument applies to one form of entertainment, though, it should apply to all. Why is it that when kids become enraptured by some idiotic program, no one says, “Well, at least they’re watching TV”?
The answer is obvious: we don’t expect much from television. Call to mind the act of channel-surfing across a virtual sea of mediocrity–the officious network anchors, the blaring car commercials, the interminable daytime talk shows. It’s no wonder HBO established its high-brow reputation by defining itself in opposition to its own medium.
But is the literary marketplace really all that different? Step into a Barnes & Noble, with its endless shelves of celebrity hagiographies, its window full of diet books by suspiciously photogenic doctors, its rack of movie novelizations, and ask yourself if publishing is a classy industry.
It may be that the reason we’re so quick to defend the Written Word, to pedestalize its power and grandiosity to the detriment of all other media, is that it’s been here the longest. We can chart its evolution from primitive iconography, to ideograms and glyphs, to alphabets and punctuation, up through epic poetry and drama and novels. It’s earned its place as civilization’s poster boy. Where was The Sopranos when Homer, Cicero and Shakespeare were shaping the Western Canon?
This is a prejudice, though. It’s the default position of literary folk to stand by their heritage, even if The Da Vinci Code is its progeny. Like true believers, we’ll come up with ingenious justifications for the innate merit of typographic symbology before accepting that text is just one more delivery system. Which, at the end of the day, it is.
A word about bias. I was brought up to believe there was something wholesome and virtuous about looking things up. Usually when I asked my father, a writer, to define a word I didn’t understand, he’d nod to the American Heritage Dictionary with a slightly punishing look, as if I’d committed a minor sin by not consulting the printed object first. Perhaps he was trying to instill in me a respect for the written language, or maybe he just didn’t want me pestering him, but as a result I’ve carried this freighted association with dictionaries all these years. To this day, I feel just a little bit guilty every time I dial up a word on my desktop widget instead of getting out of my chair and flipping through the hefty tome on the other side of the room. Is there really some secret value to be found in manually turning pages and scanning for particular words? Other than sharpening your ability to recall the letters of the alphabet, there isn’t much to be said for it as an activity. (Though the expert dictionary reader Ammon Shea might disagree.) And yet two decades later I can still sense the weight of tradition in the memory of my dad’s disapproving glance.
This is what I mean by “prejudice.” For better or worse, we all have our hard-wired associations–some of us capitulate to them and others rebel against them–but there they are. For a lot of people, the mere appearance of black-and-white film might signify sophistication. Something about the scratchy, silvery tint, its time-capsuled resistance to contemporary fashions, prompts an automatic sense of reverence, regardless of how many cinematic duds the studios churned out before Technicolor.
Do we do the same for the Written Word? Do we grant it Goldmember status out of respect for its breadth and longevity?
The truth is that, while all delivery systems have particular histories and particular limitations, they are equally capable of delivering meaningful content, just as all cuisine has its delicacies and its slop, its caviar and its gruel, each bound to its own range of flavors and textures. Snobbery, after all, is not measured by a “well-cultivated” palate or a table-pounding demand for “quality,” but by a deliberate unwillingness to consider that quality takes many forms and often abides by unfamiliar standards.
What kids actually need, what we all need, are higher standards across the board. Not more books but better books; not fewer movies or comics or pop songs, but fewer bad ones. This worthier goal won’t be achieved by blandly extolling the virtues of one medium or lambasting another, but by developing a stronger, richer, more vibrant culture all around.
That I’ll drink to.
Gamers Anonymous
I am not a gamer.
I do not consider myself a gaming enthusiast, I do not belong to any kind of “gaming community” and I have not kept my finger on the proverbial pulse of interactive entertainment since my monthly NES newsletter subscription ran out circa 1988.
Save a few momentary aberrations–a brief fling with “Doom” (’93), a torrid encounter with “Half-Life” (’98), a secret tryst with “Grand Theft Auto III” (’01)–I’ve worked to keep my relationship to that world at arm’s length.
Video games, I’d come to believe, had not significantly improved in twenty years. As kids, we’d expected them to evolve with us, to grow and adapt to culture, to become complex and sophisticated like the fine arts; rather, they seemed to remain in a perpetual state of adolescence, merely buffing out and strutting their ever-flashier chops instead of taking on new challenges and exploring untapped possibilities. Maps grew larger, graphics sharpened to near-photorealistic quality, player options expanded, levels multiplied, and yet the pastime as a whole never advanced beyond a mere guilty pleasure.
Every time a friend would tug my sleeve and giddily drag me over to view the latest system, the latest hyped-up game, I’d find myself underwhelmed. Once the narcotic spell of a new virtual landscape wore off, all that was left was the same ossified product game producers had been peddling since 1986. Characters in battle-themed games still followed the tired James Cameron paradigm (tough guy, funny guy, butch girl, robot); stories in “sandbox” games were as aimless and hopelessly convoluted as ever.
This is to say nothing of the interminable interludes that kept appearing between levels, clearly designed by wannabe action-movie directors. Fully scripted scenes populated by broad stereotypes would go on for five or even ten minutes at a time, with the “camera” incessantly roving about, punching in, racking focus, jump-cutting, as though an executive had instructed his team to “make it edgier, snappier, more Casino.”
Where was the modern equivalent to the Infocom games, those richly imagined text-based worlds that put to shame any dime-a-dozen title from the Choose Your Own Adventure series? This isn’t nostalgia talking. Infocom, like its predecessors in BASIC, put out games written by actual authors; not only did they know how to construct engaging stories and fleshed-out characters, they foresaw the opportunities presented by non-linear narratives and capitalized on their interactive potential.
Was it me, or had “refinement” in the subsequent years become a dwindling pipe-dream, like accountability in broadcast journalism?
Recently, however, I had a change of heart. On a trip upstate to visit a friend, I was somewhat reluctantly introduced to the latest installment of the “Fallout” series, third in the sprawling, post-apocalyptic trilogy, only to emerge three days later, transfigured.
Here’s the gist: your character has been born into an alternate reality, one in which nuclear war has ravaged the planet at some point immediately following World War II. Subsequent generations have grown up inside elaborate subterranean fallout shelters where culture, if not technology, has remained frozen in the ’50s–faded pastel colors and lollipop iconography share space with rusting robots and exotic weaponry, almost as a form of collective denial. Those who have ventured out into the radioactive wasteland have cobbled together ersatz settlements from the ruins, à la Mad Max, forming intimate scavenger communities that subsist on scraps. You enter the game as an infant, grow up in an underground vault, and eventually embark on a journey that takes you deep into the perilous outdoors.
So far, a familiar setup. But a few things set the game apart from the standard fare. For one thing, the relationship between the player and the character is mediated by something called the “Pip-Boy”–a digital interface strapped to the character’s arm which holds all the information relevant to your status: health points, radiation levels, weapons & ammo, etc., plus a working map of places you’ve explored and the details of your current quest.
As far as I know, this is the first time a game has come up with anything like this. The Pip-Boy acts as a bridge between the ‘diegetic’ and ‘non-diegetic’ worlds, a thing rooted in and motivated by the artificial construct of the game, yet positioned w/r/t the player such that he has a lifeline to the virtual realm at all times. This simple step–providing an internally justified means of communication between player and character–makes a crucial psychological difference. It’s a bit like having a “Dungeon Master” along with you, only this time it’s not an extremely annoying child.
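For readers who think in code, the idea is simple to sketch: all player-facing state flows through a single object that belongs to the fiction. What follows is a minimal sketch under that assumption; the class, its fields, and its readout are my own illustrative inventions, not anything drawn from the game’s actual engine.

```python
from dataclasses import dataclass, field

# A toy sketch of a "diegetic interface": the object belongs to the
# fiction (it is strapped to the character's arm), yet it is also the
# player's menu. Every name here is illustrative, not the game's own.
@dataclass
class PipBoy:
    health: int = 100
    radiation: int = 0                                  # accumulated rads
    ammo: dict[str, int] = field(default_factory=dict)  # weapon -> rounds
    explored: set[str] = field(default_factory=set)     # map locations seen
    quest_log: list[str] = field(default_factory=list)

    def readout(self) -> str:
        # Checking your status never breaks the fiction, because the
        # "menu" is an object the character is looking at too.
        return (f"HP {self.health} | Rads {self.radiation} | "
                f"Quests {len(self.quest_log)}")
```

The point the sketch tries to capture is that consulting the interface never pulls you out of the story; the menu is itself a prop.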
The overall effect on one’s consciousness is unnerving. That strange, not-quite-real sense of space that follows a day spent in a museum, or even an amusement park, permeates the outside world for long stretches of time.
Broadly speaking, this is something we ask of all art: to tweak and enrich our subjective experience of reality. (Good stuff does this for a day; great stuff does it for a lifetime.) But we also ask it to introduce us to concepts, to construct microcosms that allow ideas to take shape and find a sort of aesthetic cohesion–and this is where video games, indeed all games, have historically fallen short.
“Fallout 3” is a totally different animal. It’s a game, yes, but only insofar as it adheres to a set of specific gameplay rules; beyond that, it’s a nest of integrated narratives more in keeping with Julio Cortázar’s novel Hopscotch than, say, a game of hopscotch.
Indeed, the playing of the game is merely the entry point, a framing device that allows you access to a furiously detailed world. Is this in itself new? To some degree, the same could be said about last year’s “Grand Theft Auto”–the player enters the alternate New York as Niko Bellic, a Slavic thug just in from Eastern Europe, and the story unfolds more or less in the manner of one’s choosing. Missions are accepted or denied, bad guys are mowed down or spared, items are acquired or neglected.
The difference is that the game “doesn’t care.” Like “The Sims,” “Grand Theft Auto” does not attach meaningful consequences to irrevocable actions. Getting a prospective girlfriend to invite you upstairs after a date simply results in an opportunity for another date; outrunning a cop means only that you will no longer be chased by him.
Conversely, a particular course of action in “Fallout 3” actually affects the way in which the story is told. Defusing a bomb in the center of town doesn’t just reward you with karma points; it opens doors in the story while closing others. Enslaving a citizen doesn’t just turn a once-friendly community against you; it puts you in good standing with the slavers you encounter later on, which in turn enables a set of otherwise unavailable choices. The game “cares” what you do, though it does not “judge” you–again, like a Dungeon Master.
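To put the contrast in concrete terms: a game “cares” when actions write to a persistent world state that later content reads back. Here is a toy model under that assumption; the event and quest names are invented, and nothing below reflects how “Fallout 3” is actually scripted.

```python
# Toy model of consequence-aware branching: irrevocable actions mutate
# a persistent state, and later content is gated on that state.
# All event and quest names are invented for illustration.
world_state = {
    "bomb_defused": False,
    "town_hostile": False,
    "slaver_standing": 0,
}

def act(choice: str) -> None:
    """Each action writes to state that the story will read back."""
    if choice == "defuse_bomb":
        world_state["bomb_defused"] = True      # opens the town's storyline
    elif choice == "enslave_citizen":
        world_state["town_hostile"] = True      # closes the friendly branch
        world_state["slaver_standing"] += 1     # opens slaver-only branches

def available_quests() -> list[str]:
    """What you did determines what you can do next: the game 'cares'."""
    quests = []
    if world_state["bomb_defused"] and not world_state["town_hostile"]:
        quests.append("help_the_townsfolk")
    if world_state["slaver_standing"] > 0:
        quests.append("slaver_contract")
    return quests
```

In “Grand Theft Auto,” by this analogy, available_quests() would return the same list no matter what you had done; the writes happen, but nothing downstream ever reads them.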
In Aristotelian terms, the dramatic action ultimately takes precedence over the “obstacles.” No matter which choices you make, or in what order you make them, the game is predicated on an ingeniously organized narrative architecture that presents a nested series of dramatic events and corresponding consequences, the constellation of which determines the “plot points” of your particular quest. Like life, what you do is who you are.
Which is not to suggest that we begin judging games by the standards of drama proper. Equating the two raises the same red flags we find ourselves facing when we start calling jazz “America’s classical music” and comic books “graphic literature.” Neither idiom seems to benefit from the association. On the contrary, it suggests that we continue evaluating them on their own terms, for what they can accomplish given their own advantages and constraints–only with the bar set much, much higher.
It also means that those of us too snooty to accept certain terms for ourselves might have to buck up and swallow our pride.
Hell, I’m a gamer.
—
Alex Rose is a co-founding editor of Hotel St. George Press and the author of The Musical Illusionist and Other Tales. His work has appeared, most recently, in The New York Times, Ploughshares and Fantasy Magazine. His story, “Ostracon,” will be included in the 2009 edition of Best American Short Stories.