As a boy growing up in rural Ontario, Canada in the wake of World War II, I was open to all the popular mythologies about what some people now call the “last good war” (Weber, 2008). From John Wayne’s film “The Sands of Iwo Jima” to a cursory reading of the initial six-volume edition of Winston Churchill’s Nobel Prize-winning history, both the moral rationale for the conflict and the outline of its main events, causes and consequences seemed perfectly clear. Only later did I allow myself doubts, and even now I am vaguely discomfited by those who declare that it was an immeasurable human tragedy and one that might have left the world no worse if it had not been fought. A primordial reaction of limitless disgust at Nazi atrocities remains, no matter how much I learn that the Allies were far from blameless. Even many otherwise principled pacifists have difficulty denying that it was a “just war.”
Over the past seven decades, some of the factual issues about the war have come under close scrutiny and the more-or-less official versions have been found wanting. We have learned, for example, that there was more to the lead-up to the Japanese attack on Pearl Harbor than was originally discussed in polite company, and that Japan may have been provoked to make the first strike by an American administration which knew that no less an event would be needed to pull American citizens out of their preternatural isolationism. In the other theatre of the war, North Americans have come to recognize that the defeat of Nazi Germany was more a matter of the huge Russian sacrifice on the Eastern Front than of eloquent Churchillian resolve in the Battle of Britain, General Patton’s tanks, Jimmy Stewart’s bombing missions, the Normandy invasion and American heroism in Italy and at the Battle of the Bulge. Moreover, we have grown used to discussing openly the morality of the fire-bombing of Dresden and the use of atomic weapons to bring the war in the Pacific to a dramatic halt.
Although such issues are not the focus of the books under review here, they do set the context. It may be time to re-examine the Second World War and the way it is remembered. It is certainly time to review what really happened, why it happened and, of course, how we should start to behave as a species to ensure it doesn’t happen again. The United Nations was supposed to ensure the peace; it has, however, spectacularly failed to accomplish that goal: first, by failing to deter the USA and the USSR from conducting the “Cold War” in an international system based on the so-called MAD policy, which can be said to have been a success only if success is defined as avoiding a global Armageddon; and second, by accepting as normal an unrelenting series of hostilities in what the late American essayist Gore Vidal (2002) called “perpetual war for perpetual peace”.
In the process of re-examination and potential revisionism, both large themes and compelling individual stories have emerged. One concerns Bletchley Park, the role of Alan Turing in the work that gave Allied commanders foreknowledge of German military tactics and movements, and the bizarre, discreditable fate of the man who, as much as anyone (Eisenhower or Montgomery, Roosevelt or Churchill), might take credit for the Allied victory on the Western Front and in the Atlantic.
Thanks mainly to the recent film “The Imitation Game,” the socially transformative gay rights movement and the current dominance of electronic communications, big data manipulation, social media and the ballooning of constant surveillance in the fast-developing national security state, at least a small number of “Millennials” are becoming aware of the extraordinary life and the improbable death of Alan Turing (1912-1954).
I find this notable because World War II is even more distant from the lives of the young people in my classes than the Boer War was from mine. After all, I spent some youthful time with a Boer War veteran, something few of my students have done with military survivors of the conflicts in Europe or in Asia from 1939 to 1945. As it happens, Alan Turing and my mother were born just a month apart, now more than a century ago, in what must seem like ancient history to college students today.
What’s more, because of the denigration, devaluation and deterioration of history as a subject worthy of attention in contemporary schools, colleges and universities, the mere fact that anyone can be made to be interested in events prior to last month has actually become reassuring. Who knows? Perhaps, despite the best efforts of the authorities (Giroux, 2014), a sense of coherence and chronology has not been entirely and irretrievably flushed down our collective memory hole by the corporatist educational specialists and curriculum consultants who seem to dominate college education in the early decades of the twenty-first century. Maybe it takes the example of an exceptional individual in exceptional circumstances leading to an exceptional tragedy to spark young minds―provided, of course, that the spark leads to an open light of inquiry and not just to another indulgence in celebrity culture.
The case at issue here is especially gripping. It is now better understood that Alan Turing’s rather incredible brain was largely responsible for igniting the development of the modern computer, the enduring appeal of artificial intelligence and the techniques of encryption upon which not only military strategy, but statecraft, international diplomacy, the larger part of (post)modern commerce, finance and industry, and even the crafting of personal identities and the amassing of networks of “friends” now appear to depend. Although he cannot plausibly be blamed for it, Turing also stood at the beginning of the process of total surveillance, the pervasive and invasive information environment and ubiquitous world of big data analytics in which every step (or misstep) in a person’s life is made part of a permanent digital record available for retrieval in a social environment without space or time.
This ought to be a sobering thought for educators who build electronic files on students (and themselves) by noting every absence, tracking every quiz and commenting on every “inappropriate” remark made in the course of a school day. As Edward Snowden so elegantly put it, “a child born today will grow up with no conception of privacy at all” (Pitas, 2013). On that score, Alan Turing may not be held accountable, but we surely will be.
In the ecology of data mining, data storage, data manipulation and global decision making by immaterial algorithms, we may soon find that our limited historical awareness will be shaped to conform to the wishes of opportunistic authorities eager to explain and justify their modes of governance. If so, then Alan Turing will have had a larger influence than he might have imagined and, I like to think, one of which he might not have approved. Meanwhile, the vision of recombinant data, arranged in artificial narratives, is already with us. It provides the background against which Alan Turing can now be revisited and reinvented.
So, “The Imitation Game,” the successful Hollywood confection which purports to tell his story, has become the mechanism through which the long repressed memory of Alan Turing has been awakened. A long-time fan of Benedict Cumberbatch, who plays Turing in the film, I was quite charmed by the performance. Though I knew it got some things wrong, distorted others and invented still more, I was provisionally willing to let certain inaccuracies slide for dramatic effect. Such examples of cinematic licence as putting Turing together with the Russian spy John Cairncross in what critic Caryl (2014, 2015) correctly calls “an entirely superfluous subplot,” one that falsely turns Turing from a hero into a plausible (temporary) traitor, are normally explainable, if not entirely forgivable. Historical romances are common enough in world literature, so they should obviously be expected in modern cinematic entertainment. However flawed the script and direction, I surmised, the movie at least succeeded in making Turing a temporary household name and may have inspired viewers to look further. Yet, something more insidious takes place when distortions and inventions predominate in what is billed as a virtual (so to speak) “docudrama.”
Anyone wishing to take up the challenge of finding out what really went on at Bletchley Park and in the nine years following V-E Day would profit from reading Andrew Hodges’ book, the document upon which the film was based. It rescues Turing from what Caryl describes as “a caricature of the tortured genius,” an epitome of “robotic oddness,” and a bit of a fop (whereas, in truth, Turing was a “bit of a slob with a chronic disregard for personal hygiene”). As well, Cumberbatch plays him as a man unable to understand, much less to tell, a joke, whereas Turing’s friends knew him to have a “sprightly sense of humour” (Caryl, 2015). These might be dismissed as trifling misrepresentations, but the reformulation of both the persona and the political history of Alan Turing shifts the emphasis of the narrative too far. It dramatizes the important issue of Turing’s persecution for homosexuality in a way that stresses his victimization, while missing the more important question of the oppression that led to it. It may also pass over real questions about his death.
Andrew Hodges, himself a prominent mathematician and gay rights activist, was “alarmed by the inaccuracies” and regarded the fictional Turing/Cairncross relationship as “ludicrous” (Williams, 2013); so, it is safe to say that he does a better job and treats Turing with considerably more respect. Among the parts of Turing’s life that (perhaps understandably) do not translate well into film are the tremendous scientific and mathematical skills that were crucial to his success as a cryptographer and therefore to the entire war effort. Had it not been for the ability of Turing’s team (part of a Bletchley Park operation that eventually grew to about 9,000 people) to break the German codes produced by the almost impenetrable “Enigma” machine, it seems that the invasion of France on June 6, 1944 would not have been possible or, if possible, would almost surely have failed. These elements of Turing’s life and contribution are handled in the movie in a more or less desultory manner. As in many similar cases, the story is all about personal experiences and not about the circumstances that framed them, or even the way in which the personal experiences altered the circumstances in much more profound ways.
One particular point of interest, of course, is Alan Turing’s alleged suicide. The “official story” from the British authorities is that Turing was demoralized by the harassment and discrimination he suffered after the war. Depressed and dispirited, especially by government-mandated hormonal “therapy” to “treat” his sexual orientation, he is said to have killed himself. Hodges accepts this version. B. Jack Crawford is more skeptical.
Andrew Hodges’ book is a reissue of a volume originally published in 1983, but with a commendable new foreword by American cognitive scientist Douglas Hofstadter. The original edition played an important part in the gradual, painful process of rehabilitating Alan Turing’s reputation and revealing the full extent of the hideous humiliations heaped upon him in a more explicitly unenlightened age. It was rightly praised for its insight and empathy, its rich store of detailed knowledge about Turing’s “leaping genius” and its “full-throated celebration of Turing’s brilliance, unselfconscious quirkiness and bravery in a hostile age” (McKay, 2012). It remains an outstanding account.
Hodges probes the personal dimensions of Turing’s actual life with admirable honesty and sensitivity. The result is not quite hagiography, but it is clearly a book with a mission. It broadens the scope of Turing’s genius beyond his exemplary wartime work to include consideration of Turing’s larger intellectual project, the scientific investigation of the mind (Turing, 1950). In so doing, it produces an unrelieved melancholy at the lost prospect of having Turing’s voice heard in the larger discussion of the capabilities and limitations of thinking machines, cybernetics and the meaning of meaning (cf. Dreyfus, 1972; Bateson, 1979; Haugeland, 1985; and many others).
Crawford’s task is different. He is somewhat more interested in Alan Turing, the revolutionary designer of the electronic computer, than in the drama of his personal life and his fate within a repressive culture; having already published a very useful collection on Turing’s contribution to computing (Crawford, 2005), he has now moved on to a fuller, more comprehensive study.
A curious result is that Crawford’s more distanced account may actually be more insightful in some ways than those that conflate Turing’s fate with the issue of homosexuality. He disputes the common argument that Turing’s death came about at a time of extreme psychological depression, reporting that Turing’s friends insist that he had stoically endured his treatment at the hands of British criminal justice (he had pleaded guilty to gross indecency, was given the option of imprisonment or “chemical castration” and chose the latter) and was, at the time of his “suicide,” excited about a variety of new projects. Instead, Crawford considers two alternative explanations.
Crawford convincingly undermines the coroner’s verdict in the Turing case and offers persuasive evidence that Alan Turing, who was conducting experiments using cyanide at the time, might well have died accidentally as a result of inhaling cyanide fumes. On this point, Crawford and Hodges part company, with Hodges preferring to believe that Turing took his own life in a peculiar re-enactment of the Disney animation of the poisoning of Turing’s favourite fairy tale heroine, Snow White.
Crawford also opens the door to the possibility of murder. The British authorities had already deemed Turing a “security risk,” not for the reasons the United States invoked when it unleashed anti-communist hysteria and went after its own wartime genius J. Robert Oppenheimer, but because of Turing’s sexual orientation. It was widely believed that homosexuality left people open to exploitation and extortion, and that information vital to national security might be the price of silence on the part of ordinary criminals and international espionage agents alike. Considering the seal of secrecy that was placed on Bletchley for decades after the war and considering Turing’s familiarity with British intelligence, Crawford does not strain credulity excessively by hinting at a high-level conspiracy.
As in many murky matters involving official “intelligence,” no final ruling on the coroner’s verdict is expected. What can be found in the two books under review are fascinating perspectives on a unique individual, on an historical period that is foundational for our own, and intriguing intimations of a future that has become our present and portends much for our students, whether they can be enticed to know it or not.
Bateson, G. (1979). Mind and nature. New York, NY: Dutton.
Caryl, C. (2014, December 19). A poor imitation of Turing. NYR Daily. Retrieved February 16, 2015 from http://www.nybooks.com/blogs/nyrblog/2014/dec/19/poor-imitation-alan-turing/
Caryl, C. (2015, February 5). Saving Alan Turing from his friends. New York Review of Books 62(2): 19-21.
Crawford, B. (2005). Alan Turing’s automatic computing engine: The master codebreaker’s struggle to build the modern computer. Oxford, UK: Oxford University Press.
Dreyfus, H. (1972). What computers can’t do: A critique of artificial reason. New York, NY: Harper & Row.
Giroux, H. (2014). The violence of organized forgetting: Thinking beyond America's disimagination machine. San Francisco, CA: City Lights.
Haugeland, J. (1985). Artificial intelligence: The very idea. Cambridge, MA: MIT Press.
McKay, S. (2012, November 9). On ciphers and codebreakers during World War II and after. The Wall Street Journal. Retrieved February 16, 2015 from http://www.wsj.com/articles/SB10001424052970203707604578095243031215954
Pitas, C. (2013, December 25). Snowden warns of loss of privacy in Christmas message. Reuters. Retrieved June 5, 2015 from http://www.reuters.com/article/2013/12/25/us-usa-snowden-privacy-idUSBRE9BO09020131225
Turing, A. (1950). Computing machinery and intelligence. Mind 59(236): 433-460.
Vidal, G. (2002). Perpetual war for perpetual peace: How we got to be so hated. New York, NY: Thunder’s Mouth Press.
Weber, M. (2008). The ‘good war’ myth of World War Two. Institute for Historical Review. Retrieved June 6, 2015 from http://www.ihr.org/news/weber_ww2_may08.html
Williams, A. (2013, June 3). Film about WW2 codebreaker Alan Turing is attacked by biographer for exaggerating love affair with woman because he was gay and says Keira Knightley is ‘too glamorous’. Daily Mail. Retrieved June 11, 2015 from http://www.dailymail.co.uk/news/article-2346828/Film-WW2-codebreaker-Alan-Turing-attacked-biographer-exaggerating-love-affair-woman-gay-says-Keira-Knightley-glamorous.html
Howard A. Doughty teaches Cultural Anthropology and Modern Political Thought at Seneca College in Toronto, Canada. He can be reached at firstname.lastname@example.org