by Jon Katz
23 September 1997
"So the whole question comes down to this: Can the human mind master what the human mind has made?" - Paul Valery
Ever since I wrote about the great Beavis & Butt-head controversies for Rolling Stone in the early '90s - a time when animation's most repulsive geeks were being blamed for unleashing a wave of violence on the vulnerable, screen-addicted young - one of my bizarre hobbies has been trying to figure out why technology is so unnerving for so many people.
For most of my adult life, from the advent of rock and roll to the rise of Nintendo and MTV and the spread of cable, the belief that technology is undermining our values and corroding our civilization has taken on the proportions of a religion.
These previous anxieties, it turns out, were all child's play, minor warm-ups for the great trauma sparked by the advent of the digital age. The rise of the Net and the Web is causing a collective nervous breakdown among our academic, political, media, and moral guardians. I'd like to know why, but I'm coming to believe that I never will.
Mostly, my research over these past four or five years has taught me how woefully ignorant and poorly educated I am about history and technology. Reading techno-scholars like Marshall McLuhan and Albert Teich has helped, but it's been a tussle.
The problem, I've found, is that people who write well often don't know much history, and people who know history often don't write well. And people who know technology often don't write well or care much about history.
Of the scholars I've stumbled across, one of my favorites is political scientist Langdon Winner, author of Autonomous Technology, subtitled Technics-out-of-Control as a Theme in Political Thought.
Winner has taught me that the great unease over technology isn't a new thing at all.
"Technology," he wrote, "is a word whose time has come. Its rise as a conscious problem in a wide variety of social and political theories requires some explanation. We are now faced with an odd situation in which one observer after another 'discovers' technology and announces it to the world as something new. The fact is, of course, that there is nothing novel about technics, technological change, or advanced technological societies."
This seems clearly true. Medieval Europe, the Enlightenment, the Industrial Revolution, and the digital age all triggered highly sophisticated political and cultural change of different sorts, and sparked widespread unease as well.
In fact, says Winner, freakouts over the advent of technologically advanced societies are so pronounced as to constitute religious upheavals in themselves, perhaps an explanation for William Bennett, Joseph Lieberman, Tipper and Al Gore, Newt Gingrich, and the Religious Right - all of whom have landed like a dread plague smack in the middle of American culture and politics with their ratings, moralistic books, V- and E-chips, and blocking software.
Analogies to religious crises help to show the outrage present in contemporary criticism of technology, wrote Winner, but "they fail to capture an important characteristic of the discussion - its pervasive sense of puzzlement and disorientation."
Much of this stress, suggests Winner, comes from the idea of what he calls "autonomous technology" - the belief that somehow technology has gotten out of control and follows its own course, independent of human direction.
That this notion is patently bizarre, he wrote, hasn't kept it from becoming a central obsession in 19th- and 20th-century literature and political thought.
"For some time now," he wrote, "the writings of many of our most notable poets, novelists, scientists, and philosophers have been haunted by the fear that technology has 'run amok,' is 'no longer guided by human purposes,' is 'self-directing,' or has 'escaped all reasonable limits.'"
I would add to his list of the angry and alarmed: boomer parents; many intellectuals, especially of the middle-aged male variety; liberals; and moralists - all of them convinced that technology is not only out of our control, but is corroding our value system and that of our children.
This is what Mary Shelley was getting at, of course, in Frankenstein, not a horror story but a highly relevant tale about technology and morality. Shelley's point is that we give little thought to the technology we acquire and deploy.
In her book, the only entity that seemed to have given any thought to the technological issues at stake in creating life was the monster, who was bewildered that somebody would go to the trouble of creating him without considering what kind of impact he might have. Confronted with the monster's demands that he take responsibility for what he'd done, his creator, Victor Frankenstein, could only sputter that his bastard offspring was a fiend, and then wish him dead.
If you substitute the computer for the monster, you have a very timely tale.
Victor himself seems flabbergasted at the very idea that his creation might pose some complicated problems or responsibilities.
This is a wonderful metaphor for our modern relationship with technology such as computers and cable systems and VCRs. We rush out to buy these things, stick them in our homes and give them to our kids, but like Victor, we seem to have given absolutely no thought to what might happen next. We seem flabbergasted that when Johnny has access to much of the information in the world, some of it might be sexual or unsavory.
When we get in trouble, as thoughtless people invariably do, we rush out to buy blocking software and V-chips, and we install ratings systems in desperation. Since these can't help us any more than such shallow remedies would have helped Victor Frankenstein, technology seems like an unmasterable and menacing trauma, a monster.
Is technology autonomous? Does it have a mind and life of its own?
To me, technology is both amoral and neutral, no better or worse than the people who create and deploy it, reflecting their values rather than making any of its own. Technology is a tool, like a wrench, but one that needs to be understood, considered, and monitored. People who won't think about technology, sadly, will get what they deserve, just as poor Victor did. People who do think about and consider it can reap great rewards.
"That which men have made," says Winner, "they can also control."
Winner poses three questions he says are central to understanding the autonomy, and thus the morality, of technology. They should be posted in libraries, living rooms, and schools, not to mention the halls of the US Congress. Only when they are asked, discussed, and considered will people feel that technology is something they can master for their own good ends and purposes, instead of a raging beast menacing them and tearing their homes apart.
Such questions get almost no consideration in media or politics, where the depth of discussion about technology seems to go no further than dirty pictures on the Playboy Web site. They are good questions, and I pass them along here, hopeful that they'll be of use and perhaps get the discussion they deserve.
1. How thoroughly do people know their own technology? How much does an individual understand about the range of technologies that affect his or her life?
2. To what extent do men and women control technology? Once created, does technology take on effects and consequences of its own?
3. Is technology a neutral tool for human ends? Can our sometimes mammoth technical systems accomplish the lofty goals set out for them?
Having the enviable advantage of hindsight, maybe we can avoid Victor Frankenstein's sorry fate. Maybe we can recognize that it is technology unconsidered that is an inherently monstrous thing.