I am not sure what scares us more about scientists: the fact that they might be wrong, or the fact that they might be right.
(Post)modern society is literally built on the theoretical discoveries of pure science and the application of the resulting knowledge to what we deem useful inventions. Life without electricity, polymers, pharmaceuticals, internal combustion engines and fast food is imaginable, but only barely. Socially and spiritually, we are largely made in the image of the technology we have created. It would therefore be a huge blow to our self-image and esteem to learn that something has been very wrong with the way in which contemporary science conceives the world, goes about trying to explore it, and ultimately to bend it to our purposes.
By being wrong, I don’t mean that scientists may be mistaken about sub-atomic particles, or the origin of life. Who knows? Maybe primitive organisms actually came from hitchhikers on an asteroid rather than arose in a primordial chemical soup. And, maybe “string theory” is just so silly that, as Wolfgang Pauli famously remarked: “Not only is it not right, it’s not even wrong!” I suppose these eventualities are possible; but, if so, they amount merely to errors in the details of application, not in the foundations of the scientific enterprise.
More important would be to discover that the philosophy and practice of science, as rooted in the promise of progress from the European Enlightenment onward, was irredeemably pathological, and that our species might perish from the abundance of toxic materials that our big brains have allowed us to construct or acquire. For scientists to have waited until now to mention that the entire industrial project, begun in Britain a couple of hundred years ago and now in full spate in Brazil, Russia, India and China, is threatening a global ecological holocaust seems a tad negligent on their part. For them only recently to have learned that all those medicines that were intended to ensure increased longevity and good health have resulted in uncontrollable new strains of viruses, bacteria and assorted “bugs” that threaten unprecedented global pandemics seems to indicate a failure to exercise due diligence. What, I am asking, if science has been fundamentally wrong in encouraging our narcissism and the vanities of progress? What if we are not as smart as we think we are? Or, worse, what if we are too smart for our own (or anyone else’s) good? What if science was morally wrong from the outset, and the price to be paid for our super-Promethean ambition is our extinction?
In the alternative, by being right, I mean that the scientific frame of mind and the wondrous knowledge it has generated may provide increasingly accurate accounts of the nature of the universe and all that exists within it—the stars and planets and plants and animals (including those that “creepeth upon the earth” — Genesis 1:25) and, of course, us. This verifiable (and falsifiable) scientific understanding has already been validated to such a high degree of probability that it would take willful ignorance or perversity to deny it. Combined with the technological mastery of human and non-human nature, it has transformed us and our environment. It has, however, come at the cost of what some once charmingly called our “innocence” and others labeled the childhood of our species.
Science has already managed to dislodge many of our earlier ideas about who, what and why we are. Prehistory, antiquity and contemporary American presidential hopefuls are replete with vivid stories and legends about human origins. Despite their demonstrable scientific absurdity, these myths have been retained for their inherent entertainment value, and for their practical value in encouraging us to think about our mortality and morality. Thus, although they have been discarded by people who prefer fact over fantasy, they may not be completely useless. Nonetheless, one pertinent consequence of science has been the enormous blow it dealt to the literal content of these mythologies.
We can, for example, no longer imagine ourselves to be at the centre of the universe, if indeed it makes any sense to speak of the universe even having a centre. Astronomers and cosmologists have located our galaxy, our most proximate star, our particular planet, and therefore our species in a remote and otherwise insignificant corner of the known universe. Our naïve notions of gods, our singularity as a species and our mighty powers of action and reason have been seriously compromised where not utterly abandoned. Galileo, Darwin, Friedrich Nietzsche (for some) and Sigmund Freud (for others) have combined to deflate our overblown pride. Whatever we may think about our personal “souls” and such, we now know that we are a marginal life-form and, thanks to contemporary evolutionists, we now know that our existence is, at best, an historical fluke, an evolutionary accident attributable, perhaps, to Pikaia gracilens. Little Pikaia is the first known chordate. It miraculously (so to speak) survived the massive Precambrian extinction, flourished in the subsequent Cambrian explosion of speciation, and is a good (in fact, the only known) candidate for the title, the “mother of all vertebrates.”1
Homo sapiens is evidently not specially designed by God in His own image (Genesis 1:27), and we do not even possess a privileged “human nature” that separates us from all other creatures and makes us uniquely capable of self-improvement, never mind achieving something akin to perfection. The lessons of science have been hard on our species-pride. What if the scientists are right, both physically and metaphysically? What if all the energy we have put into theology turns out to be an exercise in self-delusion? What if there is no external criterion for justifying our very existence? How can we deal with the notion that we are, at most, a contingent and wholly unnecessary collection of organisms at the periphery of existence and that, as Jean-Paul Sartre so depressingly put it, humanity “is a useless passion,” or a sort of noble cosmic “absurdity” according to the still-spirited Albert Camus, but in either case (as Edgar Allan Poe suggested) “only this, and nothing more.” We may, in short, know more, but be less confident than ever before. What if science is ontologically and empirically right, and we must therefore pay the existential price for obeying the ancient admonition often falsely attributed to Socrates: “Know thyself”?
Our dilemma is nicely captured in the case of one of the most charming scientists working today, a man who is also a heck of a story-teller. His name is Oliver Sacks. He is a man of medicine, and also a man of music. A neurologist by trade, he has spent much of his professional life in the company of people we once called “lunatics.” Living and working among patients with diagnosed mental defects, disorders and diseases, he sometimes began to question the sanity of the world. To reassure himself that there was some ultimate order in the universe, he carried around a small object about the size of a credit card. On it was printed the Periodic Table of Elements. So, no matter what he observed in terms of bizarre human behaviour, he could at least remind himself that at some level, it all made sense.
That’s part of what scientists do. They reveal to us what most of them strenuously believe to be the inherent order of the universe expressed in the “laws of nature.” Scientists have displayed the merits of their ideas not only by building, demonstrating and explaining complicated theories in physics, chemistry, astronomy and the like, but also by showing us how to use this knowledge to build refrigerators and television sets, windmills and hydro-electric dams, submarines and airplanes. They have also shown us how to make atomic bombs and plastic water bottles. Sometimes, as when they gave us thalidomide to treat nausea in pregnant women, their knowledge seems to be something we could have done nicely without; nonetheless, they keep on making new discoveries, and we respond by refusing to keep our hands off them. We turn their research into computers and fibres (for communications and clothing), and into skyscrapers, genetically modified organisms and Velcro, Viagra and Valium. In short, they provide us with the intellectual means to master the universe, or at least parts of our part of it, but they do not guarantee that we will do so with wisdom. They are helping us to fulfill the Biblical injunction to “subdue” nature and exercise “dominion over every living thing that moveth upon the earth” (Genesis 1:28), but they cannot help us much as we strive to exercise this power with prudence or consistent good taste.
Some religious literalists demur. They reject such obvious scientific facts as biological evolution. They imagine that the world was made in six days about 6000 years ago by an anthropomorphized deity with a beard fashioned by William Blake, a distinct preference for people of the Jewish faith, and, in the words of J. B. S. Haldane, “an inordinate fondness for stars and beetles.” Most of these people nonetheless rely on scientific knowledge to move them from place to place in automobiles, keep their houses well lighted at night without excessive candle wax, and spread the gospels using satellite dishes and the Internet. Science is the basis of our technological achievements. It helps define our social arrangements and our cultural understanding of ourselves and others. The dominance of the scientific model is virtually complete.
And yet, maybe it isn’t all that simple. Sometimes science changes abruptly and profoundly. For example, scientists who used to advocate the mechanistic model of matter in motion have come to see the limits of their ideas. The results are a little scary. It was shown, for example, that even Sir Isaac Newton (never mind Ptolemy and Aristotle) missed some things, and got other things wrong. Newton’s limits were revealed by early twentieth-century minds like those of Einstein and Planck (who, by the way, didn’t even agree with each other). Now, while most of us have not yet really understood much about relativity and quantum mechanics, our scientific leaders are fast pursuing radically different notions about what constitutes space, time and energy than either Newton or the twentieth-century geniuses understood. Today, thinkers such as the ever-popular Stephen Hawking and the various physicists in pursuit of the “theory of everything” are alerting us to the possibility of a multi-dimensional universe and to the notion that the essential beliefs we clutch so tightly in order to make common sense of our existence are, at the very best, expedient illusions—fanciful fairy tales for the very afraid.
What’s more, while standing in awe or fear of knowledge far greater than our own, we try to defend ourselves by distancing ourselves from the brightest, if not always the best, among us. We exhibit bemused detachment or deploy unconvincing disdain regarding those who can “wrap their minds around” complex technical matters and nonetheless manage the existential terrors they seem able to unleash. Mostly, we try to avoid thinking about such things at all and, when we do, we try very hard to resist thinking about them deeply.
No wonder we invent diverse popular images of scientists. There are, of course, heroic figures—astronauts, médecins sans frontières and the like—but we more often invent fictional characters such as the “mad scientist,” or the “absent-minded professor.” These imagined personalities are, by degrees, inspiring, terrifying or benignly reassuring. The fear that they engender is seldom about who they are, but what they are telling us; even Dr. Frankenstein had no obviously evil intent. As Mary Shelley told us: he was a modern Prometheus eager to give a great gift to humanity, and fated to suffer the consequences.
Scientists, it is plain, need to be tamed. Although the recent flurry of television programs featuring forensic scientists replacing tough private detectives and both good and bad cops has helped to bridge the gap, scientists are still portrayed as eccentric side-kicks for men of action (think of Dr. Temperance Brennan on the popular television program “Bones” beside the less brilliant but far more comprehensible and seriously square-jawed Agent Booth, Dr. Spencer Reid in “Criminal Minds” in the wake of the even more square-jawed Agents Hotchner and Morgan, or Dr. “Ducky” Mallard on the even more popular TV show “NCIS” doing the lab work for Agent Gibbs). No matter how clever they are, their arcane concerns are lightened and leavened by generally gentle mockery.
Part of the process of taming scientists consists of creating caricatures of scientists at their best, their worst and their silliest. Part of the establishment of the ideal image of a scientist includes frequent displays of piety. So, we regularly invoke the antique adage about seeing far by “standing on the shoulders of giants,” a saying attributed to Newton in the seventeenth century and to Bernard of Chartres in the twelfth century, as well as being traceable back to Greek mythology. Whatever we claim as its origins, the meaning is clear. The scientific process and method consists of the slow accumulation of information that some towering figure turns into a major advancement in understanding. However slowly or quickly science advances (and it always advances), its development is considered to be inherently orderly, progressive and (at least in retrospect) predictable.
If, therefore, new scientific knowledge is sometimes unsettling, we can at least find order in its evolution from myth and magic to solid utilitarian knowledge. In fact, more than anything else, the modern political economy has tamed science most effectively by bonding it to technology and, therefore, to the production of salable goods.
One consequence of the increasing commodification of scientific inquiry, the definition of science as intellectual investment in the production of new and improved articles for sale, is that the idea of the lone scientist working diligently, passionately and perhaps obsessively in an ill-lit and poorly equipped laboratory has pretty much vanished. No one produces a work of genius ex nihilo, and no one does so all alone.
Few scientists can come up with an absolutely new idea out of their imagination, and fewer could prove it if they did. Scientific inquiry requires vast investment in personnel and equipment. Teams of collaborators and support staff are required. Superconducting supercolliders aren’t cheap; neither is testing new pharmaceuticals or new materials with which to construct bridges and stealth bombers. Not for nothing are even the shortest published research articles laden with several and sometimes many co-authors—frequently led by someone who played no part in the actual work, but who was successful in winning the research grant. So, we find that most scientists are put to work on practical projects intended to yield a profit rather than “pure research.” Innovations that are not linked to a demonstrable “pay-off” seldom get started and even more rarely come to a good end.
Anyone with a jot of scientific insight, however, already understands that the overall account of scientific progress and practice as the stable accumulation of information and the methodical refinement of theory is pretty much hogwash. This was amply demonstrated in 1962 with the publication of Thomas S. Kuhn’s iconic volume, The Structure of Scientific Revolutions. It is true, of course, that a great deal of scientific research follows what we can call the standard model of what Kuhn himself described as “normal science”; but, the paradigm-changing discoveries or, more often, re-interpretations of existing data are something else.
To help non-scientists come to grips with the realities of scientific discoveries, Michael Brooks, a British science writer, has produced a very readable and often insightful book that seeks to alter the popular understanding of science and scientists. He has constructed a narrative that emphasizes the anarchy of the actual scientific enterprise. Much of the groundwork, of course, had already been developed on the periphery of scientific endeavour. Contemporary observers of science will see the influence of such critical philosophers of science as Paul Feyerabend, whose immensely controversial book, Against Method: Outline of an Anarchistic Theory of Knowledge, originally published in 1975, set the tone for much of the “postmodern” approach to the deconstruction of scientific discourse.2 Of course, if the grammar of the patient accumulation of facts, occasionally punctuated by the insight of a singular genius, fails to describe scientific work adequately, the incessantly relativistic impulse, energized by a pervasive irrationalism, doesn’t improve matters. What is needed is an accessible introduction to the history of science that falls into neither trap.
Free Radicals is a useful stimulant. Brooks makes a compelling argument for breaking from the naïve objectivistic and positivistic history of science. He calls for courage of the imagination and a sort of “anything goes” approach to procedure. He encourages daring experiments and wild hypotheses of the sort that, Brooks intimates, lie behind the sort of genius that unlocked new doors and opened windows, letting the sunshine in and encouraging us all to stride confidently out of the darkness and claustrophobia of the dark laboratories and into an expansive world, in Carl Sagan’s words, “free of demons and full of light.”
Brooks also indulges in gossip. Now, gossip is not always mean-spirited and it certainly is not always false. Indeed, sometimes rumours and whispers don’t hint at half of what’s been going on. The point here is that it is too easy to confuse peccadilloes and personal eccentricities with the kind of open imagination and anarchic spirit that is sometimes associated with creative genius. So, Brooks brings forward the curious case of Kary Mullis, who insisted that his use of LSD contributed to his work on the polymerase chain reaction, which led to the ability to mass produce DNA. That work won him the Nobel Prize for Chemistry in 1993. The same is said of Francis Crick, who seems to have been high when he and James Watson untangled the double helix of DNA forty years earlier.
And, of course, there is the similar instance of Ralph Abraham who may have used “acid” to help him along with his development of chaos theory and fractal geometry, but who probably gave more insight into the “scientific method” when he described the relations between physicists and mathematicians in about 1960: “The romance between mathematicians and physicists ended in divorce in the 1930s. These people were no longer speaking. They simply despised each other. … By 1968 this had completely turned around.”3 This may also count as gossip, but it at least rises to the level of interdisciplinary jealousies and not personal idiosyncrasies.
As to whether credence should be given to the creative potential of chemically induced altered states of consciousness, Timothy Leary might have agreed and so, we are now learning in ever greater detail, might Sigmund Freud. For Brooks, however, such behaviour marks not only the truly independent mind, but also the kind of anti-authoritarian personality that is needed to stand up to convention—whether in physics or biology, politics or religion and, indeed, ethics and morality. Alas, the definition of anarchy, admittedly elastic, stretches to the point of breakage when he lumps together under that label revolutionary microbiologist Louis Pasteur, revolutionary anti-colonialist Frantz Fanon, revolutionary cell biologist-cum-Gaia enthusiast Lynn Margulis and revolutionary revolutionist Karl Marx. Marx, whose active political life consisted largely of a running battle with real anarchists, would not likely be amused; his anarchist opposite, Mikhail Bakunin, would be apoplectic.
Peter Forbes, writing in The Independent, calls Brooks “the canniest science writer currently plying his trade.” He is highly regarded as an entertaining writer. He is “breezy and fun.” He is “childlike” in his enthusiasm. There is, however, a decent limit, even to anarchism. Brooks crosses it. Writing in the American alternative Internet news source Truthout, Jeff Sethness makes the case forcefully that Brooks goes a quark too far when he links “anarchism” to everything from fraudulent manipulation of data which inconveniently disproved an hypothesis to sexual infidelities involving at least one scientific mastermind.
According to Sethness, Brooks’ worst blunder “is his treatment of Werner Forssmann, a German medical researcher who came to invent the cardiac catheterization procedure by means of rather unconventionally attempting it on himself: Brooks claims him to have ‘promulgat[ed]’ anarchy of a ‘darker hue’ in his subsequent role as surgeon general of the Nazi regime, a position he infamously used to perform horrific medical experimentation on prisoners. There can,” he continues, “be nothing remotely anarchic in such acts; anarchy is not simply scandal or the rejection of established limits, against Brooks’ implications. Not all expressions of the repressed should be considered rational or humane—that is to say, anarchic.”
It is well and good to promote the notion that science isn’t all of a piece. Certainly there are a number of eccentric (not to say mad), preoccupied (not to say absent-minded), and intellectually rebellious (not to say anarchic) characters in the scientific community. There may also be a few monomaniacal, dishonest and treacherous careerists who would happily steal colleagues’ ideas and try to ruin their reputations in the bargain. But to confuse irreverence, methodological unorthodoxy and theoretical radicalism with gross violations of both scientific and humanistic ethics is to sink the boat he hopes to float.
There is also a political purpose that needs a small comment. Anarchism is, above all, a political doctrine—perhaps the most generous and optimistic among the range of philosophies-cum-ideologies available today. Sometimes coyly, more often brazenly, Brooks ties political dissidence to scientific brilliance. Sometimes it works and sometimes it doesn’t. Environmental issues have almost forced biologists, climatologists and others to become political, usually in opposition to politicians from Stephen Harper to David Cameron to Rick Santorum. They are required to enter the public arena to defend their knowledge of global warming. Likewise, religious commitments to “intelligent design” (aka “creationism”) compel engagement by evolutionary biologists. And, of course, those even slightly embarrassed by the application of physics to the making of nuclear weapons (including some who worked on the Manhattan Project) have become forceful peace advocates (as J. Robert Oppenheimer famously cried: “Now I am become Death, destroyer of worlds”) and inspirational models for the publishers of the Bulletin of the Atomic Scientists.
Still, there is no necessary relationship between scientific innovation and political radicalism. So, while it is true that superior scientific minds are apt to tilt to the left (as are intellectuals of all sorts), there are enough on the side of authoritarianism to make a sizeable case that scientific and political acumen are not cut whole from the same cloth. Though many would question his brilliance, if not his success, the mere name of Trofim Lysenko or, better, Edward Teller at least hints that there is both more and less to anarchistic scientists than can be found in Free Radicals.
1. Stephen Jay Gould, Wonderful Life: The Burgess Shale and the Nature of History (New York: W. W. Norton, 1989), pp. 321-323.
2. Paul Feyerabend, Against Method: Outline of an Anarchistic Theory of Knowledge, 4th ed. (New York: Verso, 2010).
3. Quoted in James Gleick, Chaos: Making a New Science (New York: Penguin, 1988), p. 52.
Howard A. Doughty teaches political economy at Seneca College. In a previous life, he worked for Ontario Hydro, Atomic Energy of Canada Ltd., and the Canadian Nuclear Association. He can be reached at firstname.lastname@example.org