College Quarterly
Fall 2004 - Volume 7 Number 4

Cybertimes: A Response to Elizabeth Charters

by Michael Whealen

I have a shameful confession to make. Even though I have taught in Canadian colleges and universities for close to twenty years now, I only learned to use a computer some six years ago. (Undoubtedly, I was my own worst enemy, as one of the consequences of this sin of omission was that I had to grind out three theses and endless term papers on an electric typewriter: Remember those?) Now, admittedly, in his critique of information technology, "Silicon Snake Oil: Second Thoughts on the Information Highway", Clifford Stoll has carefully pointed out that the frequently maddening ancillary operations (system setups, repeated calls to technical support, manuals the size of the OED) and the counterintuitive prompts involved in negotiating hardware and software while browsing and composing often make PC word processing ultimately as time-consuming as using an old-fashioned typewriter. And in defense of "primitive" ways, some of the most astonishingly prolific writers I have known—Joyce Carol Oates comes to mind—still write all their text out in longhand. Yet, these caveats considered, I still think I could have saved myself a tremendous amount of time and frustration by learning to use a Mac or a PC back in the 1980s.

But there may also be something of a more troubling nature going on here; something that has little to do with prosaic concerns and debates over productivity and efficiency. I was reminded of it when I came across Elizabeth Charters' intriguing piece in the Winter 2004 edition of CQ, titled "New Perspectives on Popular Culture, Science and Technology: Web Browsers and the New Illiteracy." Charters has a consuming interest in cognitive processes. More specifically, and following in the steps of Marshall McLuhan et al., she is interested in how the culturally and historically dominant technologies by means of which we acquire our information shape the qualitative and quantitative nature of the messages we receive. In accordance with the subtitle of her speculative article, and in agreement with Neil Postman ("Amusing Ourselves to Death: Public Discourse in the Age of Show Business") and other critics of CRT/VDT-delivered information systems, Charters doesn't like what she sees as a major outcome of these technologies: a "dumbing down," or a new illiteracy, among her college students.

In this opinion piece, I want to suggest that there may be good reasons to believe that the impacts of these new technologies on cognition, culture and societies are at once better and worse, and more complex, than the author imagines. In other words, I will try to use Charters' piece as a point of departure for questioning technology, as Martin Heidegger did in his essay "The Question Concerning Technology."

As a writing instructor in a large Canadian university since the 1980s, I empathize with Charters to a degree. While I am certainly not an authority in cognitive science, I would surely agree that my students in general seem to be seeking out and processing information in cognitively different ways today than they tended to a decade or two ago. Let me hasten to add that my insights in this matter are highly impressionistic (although I do base them on working with thousands of students on their assignments over the years in question).

If I get the gist of her argument, it is, by analogy, that readers are becoming browsers. Personally, I find this a brilliant and succinct way of describing how the technology may be rewiring our "wetware." Traditional readers (those accustomed to "the culture of the book") paraphrase. They tend to read slowly, to reflect on context, and to confront texts recursively. Readers who adopt the cognitive style of web browsers (whether they are browsing or reading black-letter text online) skim or scan, looking for data, much as one does when browsing through a book in Chapters or Indigo before deciding to buy it. In the process, context, depth and subtlety are lost, or displaced.

She presents an interesting generic model from information processing theory of how it is thought that our brains process information: Sensory input goes into working, i.e., short-term, memory. From there, it is encoded and decoded into long-term memory in processes known as cognitive transformations, or thinking. Browsers—whether mechanical or human—remain "stuck" in the realm of short-term memory (although it must be admitted that browser software is becoming more "intuitively" reflective and user-interface responsive all the time: see Turkle, Ullman, below*). In other words, browsers tend to privilege quantity over quality, and (somewhat) sacrifice depth and reflection. (As I write this, I note that Google boasts it searches sites representing more than eight billion web pages. How many of our college and university students know which sites are reliable? How many know the differences between URLs that end in .com, .org or .ca? Judiciously used, it's a wonderful research tool; in many other ways, it's decidedly not.) From the perspective of how it may be subtly transforming thinking, reading, and writing practices, browsing may well represent a sort of intellectual amnesia, or a gradual dumbing down of the sort first described by Rudolf Flesch in his seminal book "Why Johnny Can't Read".
Not coincidentally, perhaps, this is a study that was published at the dawn of the TV era (1955). As Charters also perceptively suggests, browsers and the Internet may do a lot to facilitate plagiarism: students, overwhelmed by a sea of information written by author(itie)s who may, or may not, be reliable, panic and cut and paste large chunks of the online texts produced by their full-text relevance searches into their essays. Is she—am I—suggesting that these new technologies are causing plagiarism? Certainly not. But I think that both Charters and I would find common ground in the proposition that they encourage a shift to a cognitive style of processing and reproducing information that is more congenial to plagiarism.

Another reservation, one that Charters doesn't mention directly: Increasingly, many of my students identify themselves as dyslexic. One frequently encountered potential indicator of this tendency is the repeated transposition of sets of letters in patterned ways in written English (the vowels "a" and "e" are common victims). Although I pick up on these idiosyncrasies quite quickly when confronted with unvetted samples of their writing, I usually don't say anything about it. It usually takes these students several meetings with me before they "confess"—invariably in camera—that they have a "learning disability." I don't like this kind of stigmatization, and I go out of my way to say that this is probably little more than a minor glitch in their wetware as it adjusts to new ways of processing information. Another reason I say this is that I have noticed that since I've picked up my keyboarding speed and generally increased my level of comfort and duration of contact with these new technologies, I've started to commit these cognitive glitches myself. And I virtually never did so before. Granted, this may be no more than a harbinger of premature-onset dementia, at my age.

Am I saying that browsing for information is responsible for what by most accounts is a cognitive glitch that became much more prevalent in the western industrialized nations after television and other forms of electronic knowledge acquisition made their appearance after World War II? Again, I would suggest that the answer is probably no. Cognitive science—which today usually involves mapping how we process sensations with the assistance of imaging technologies such as PET scans, which trace the path of radioactive tracers through the brain after a discrete stimulus has been "received"—is a discipline that is still relatively new. In most cases, it is still difficult to map the "conventional" paths by which the simplest of stimuli route through mammalian neurons and synapses. More complex ("atypical") routings, such as one might reasonably expect to encounter in the process of mapping occasional dyslexia, would seem to present challenges of an almost unthinkable magnitude, given what we do know about the exquisite complexity and flexibility of how brains process information. Are dyslexic episodes caused by random misfirings and/or reroutings of synaptic impulses? Are they caused by too much exposure to TV? By living near high-voltage hydro corridors? By an indolent life of privilege and affluence? All of these? Some of them? We don't know, and probably won't for a long time.

Which brings me to a more general historical and cultural concern I have with what Charters has to say, to the effect that IT may be deleteriously spawning a new class of illiterates. In a time of increasingly fierce global competition, we unarguably need citizens who are able to find and process information quickly, critically and reflectively, even if these imperatives are sometimes contradictory. But there may also be tradeoffs here, where the benefits outweigh the shortfalls. At one point, Charters writes "computers are insidiously eroding a generation's opportunities to develop higher level thinking. . . ." Frankly, when I see a pathological trope like "insidiously," I get nervous. This summer, I reread "The Republic" where, some 2,500 years ago, the elitist Plato had Socrates moaning on and on about the intellectually and morally bankrupt state of Athenian youth. And I thought of some of our more recent conservative culture critics like E. D. Hirsch and Harold Bloom who, despite the inevitable demographics of contemporary cultural diversity, nostalgically long to restore the vanishing "cultural literacy" of the (white male) canon.

I also recall reading a very interesting recent piece on globalization by "New Yorker" columnist Katharine Boo. She was interviewing one of the growing number of well-educated white-collar knowledge workers in Kerala, India who edit e-mailed copies of texts for organizations in North America. When she pointed out to him that he was doing ten times the work of one of his North American counterparts for a fraction of the salary, he remarked that he was earning ten times what he would normally earn in a comparable job working for an Indian firm. Wasn't this what the gradual equalization of wealth across national boundaries promised by globalization was all about, he wanted to know? He then went on, gesturing to his cell phone, and asked Boo to imagine a world where time zones—and, who knows, perhaps even national boundaries—had disappeared due to "fast," simultaneous digital forms of communication. A smaller, more united, more egalitarian world, perhaps? Or a neoliberal world of a few privileged "haves" and many suffering, exploited "have-nots" living in perpetual immiseration? I don't know.

Relatedly, looking at those clever little icons on my desktop (which are comprehensible irrespective of the user's first language), I am reminded of something that the German philologist Wilhelm von Humboldt wrote nearly two centuries ago: "And every language builds a circle around those who use it, a circle that is only transcended when they step outside of it." Allow me to unpack this. Humboldt is of course suggesting that, ironically, languages have what cultural anthropologists call strong "in-group" and "out-group" effects—and that, by extension, we can only begin to truly empathize with out-group members (and overcome our own parochialisms and misunderstandings) when we learn to "speak their languages." So, imagine a global village where the peoples of the world communicated with one another much as you "communicate" with the icons on your computer's desktop. Would there be fewer misunderstandings? Perhaps. The result could, conceivably, be a more globally communitarian humanity. But it is also possible that it might be like "communicating" with a plasma screen—eerily flat, depthless, affectless, and decontextualized. I don't know. As I suggested earlier, I have only questions.

Perhaps we need to think of some of the potential utopian advantages of these new technologies for global solidarity, while still remaining sensitive to some of the possible dangers their potentials may pose. As the poet Hölderlin wrote, "But where danger is, grows / The saving power also." I can live with the ambiguity.

*I've found that books and articles that explore the emancipatory potentialities of IT intelligently (while still remaining sensitive to their dangers—cognitive dumbing down, privacy issues, knowledge in the service of disparate power relationships, environmental problems, among others) are rather hard to come by. Two that I've found helpful are Ellen Ullman's "Close to the Machine: Technophilia and Its Discontents", and Sherry Turkle's "Life on the Screen: Identity in the Age of the Internet". If any of my readers know of any others, please let me know.

Michael Whealen teaches in the Centre for Academic Writing at York University in Toronto, Canada.


• The views expressed by the authors are those of the authors and do not necessarily reflect those of The College Quarterly or of Seneca College.
Copyright © 2004 - The College Quarterly, Seneca College of Applied Arts and Technology