RATIONAL FOG
Rational Fog
Science and Technology in Modern War
M. SUSAN LINDEE
Cambridge, Massachusetts
London, England
2020
Copyright © 2020 by the President and Fellows of Harvard College
All rights reserved
Printed in the United States of America
First printing
Jacket design: Jill Breitbarth
Jacket photograph: Tape: LICreate/E+/Getty Images; Area 51, mushroom cloud: spxChrome/E+/Getty Images
9780674250222 (EPUB)
9780674250239 (MOBI)
9780674250246 (PDF)
The Library of Congress has cataloged the printed edition as follows:
Names: Lindee, M. Susan, author.
Title: Rational fog : science and technology in modern war / M. Susan Lindee.
Description: Cambridge, Massachusetts : Harvard University Press, 2020. | Includes bibliographical references and index.
Identifiers: LCCN 2020011117 | ISBN 9780674919181 (cloth)
Subjects: LCSH: Science—Moral and ethical aspects. | Technology—Moral and ethical aspects. | Science and state. | Technology and state. | Military research—Moral and ethical aspects. | Military weapons—Moral and ethical aspects.
Classification: LCC Q175.35 .L56 2020 | DDC 355/.07—dc23
LC record available at https://lccn.loc.gov/2020011117
To Dot
CONTENTS
Introduction 1
1 To Hold a Gun 21
2 The Logic of Mass Production 43
3 Trenches, Tanks, Chemicals 63
4 Mobilized 86
5 Unforgettable Fire 110
6 Battlefield of the Body 132
7 Battlefield of the Mind 159
8 Blue Marble 178
9 Hidden Curriculum 203
Conclusion: Reason, Terror, Chaos 224
Notes 239
References 253
Acknowledgments 273
Index 277
Introduction
Military machines and sciences are often beautiful, seductive, creative. Submarines, fighter jets, missiles and even tanks can have an enchanting ferocity, a quality sometimes captured in stirring World War II images from the sky of “thousand bomber raids” or Cold War photos of astonishing nuclear detonations with palm trees waving in the foreground. Contemporary marketing images of drones and fighter jets depict gleaming metal surfaces that are almost sexual in their allure, so-called techno-porn. Many military technologies truly do look amazing, and they can draw us in with their cleverness, their lines, their remarkable capabilities. They are interesting to look at, visually arresting, sometimes fascinating.

At some point my own personal fixation became tanks. I visited the Armed Forces Ordnance Museum when it used to be at Aberdeen Proving Ground in Maryland and brought my students there for visits with the unforgettable Dr. William Atwater (now retired but still an active speaker and scholar). Atwater knew a lot about weapons. He would walk us around the old tanks from all over the world—Russian, British, Japanese—telling us how they worked, why they were vulnerable, and how they changed over time. I learned that women drove early Russian tanks because the spaces were too small for men, and I wanted to learn to drive a tank. My students gave me toy tanks presumably intended for male children (certainly not for an adult female scholar with feminist and pacifist inclinations) (Figure 1). But tanks—pudgy, awkward, and not really all that safe—do have their seductive qualities. In a tank, it seems to me, one could move forward through life without risk. Perhaps like other military technologies, they seem to promise security, power, and safety in an unsafe world.

FIGURE 1. At the Aberdeen Proving Ground, Spring 2004, with a group of PhD students enrolled in my class that semester: From left, Christopher Jones, me, and Dr. William Atwater; in rear, Eric Hintz, Perrin Selcer, Damon Yarnell, Roger Turner, and Matt Hersch; and leaning down, Daniel Fehder and Corinna Schlombs. Photo by the author.

This book is a study of the rise of technoscientific war, and beautiful technologies play a key role in the story I tell. This is because they seduce us, pull us in, and often promise more than they can deliver. We have to start with seduction because as a culture, the industrialized “West” has attended endlessly to the wonders and glories of military technique and technology. At times these machines have seemed to justify their own existence by virtue of how clever and beautiful they are—how “sweet” the technical details and
the physical form are. And to be fair, for me, the cleverness is a crucial part of the historical story. Military technologies and sciences are products of intense human intelligence, often produced by the most skilled thinkers of their age, and they demonstrate remarkable talent. In this book, I ask readers both to notice the seduction—to permit it to matter as they contemplate this story—and also to become analytically indifferent to its force and power, that is, to cease to be seduced. We might call this a kind of disenchantment project, with tanks and missiles in the middle. The seduction is “inside” the story I tell, something worth noticing but not accepting.

I propose that part of the technophilia playing out around military technologies reflects their status as extreme products of human intelligence. They have often depended on the systematic pursuit of knowledge, and they reflect the potential of human ingenuity and thought. We make them up, conjure them like spells, to solve problems. At the same time, they are forms of evidence. What scientific researchers notice about the natural world can sometimes tell us what social worlds they inhabit and what problems they understand as central, and therefore what they see as marginal. What experts notice and attend to has reflected their social and historical place, their standpoint, their situation. This is particularly overt and transparent in scientific studies, for example, of socially charged topics like race, ethnicity, gender difference, criminality, and mental illness. But social and political knowledge can also be excavated in much more abstract studies in biology, chemistry, physics, and mathematics. Historical and social circumstances and problematics shape what scientists and engineers take for granted, what they assume, what they place outside the boundaries of attention, and what they see as satisfying or credible solutions. While this doesn’t necessarily indicate that the insights produced are flawed, it does suggest that scientific ideas often mirror the context of their production.

The ideas and technologies themselves therefore can serve as forms of intriguing historical evidence of the workings and structure of social and political systems in the past. Just as novels (fiction) or rules of etiquette (elaborate social conventions) can help us understand the social experience and value systems of earlier times, scientific ideas and technological innovations can help us understand cultures and systems of power in the past as well. In other words, this book does not take the position that context explains the content of science (the old “externalism” which was once controversial in my field of history of science). Rather, it is that the content of science and
technology can illuminate critical aspects of culture and social order over time.

FIGURE 2. A photographic image produced in E. Newton Harvey’s laboratory during the Second World War. James Boyd Coates, ed., Wound Ballistics (Washington, D.C.: Office of the Surgeon General, Department of the Army, 1962), figure 69.

This book in practice, then, explores one central question: Why do we know what we know? Why do we know how much plutonium it takes to destroy a city? Why do we know how to aim a projectile around the curvature of the earth? Why do we know the exact speed at which a bullet makes the brain of a cat disintegrate?

By the way, that last question refers to a real equation. In 1942, during the Second World War, in a laboratory at Princeton University, a wound ballistics research group began shooting anesthetized cats. The cats were surrogate soldiers, their bodies stand-ins for the bodies of male soldiers, the bullets used reduced in size to mimic the proportional relation between an average male body and a standard Army bullet (see Figure 2). The goal of the research was to determine how exactly to construct and fire a bullet so that it would do the most bodily damage. That’s what wound ballistics is: the study of how to modify ballistics so that bullets do more damage. In the course of this work, the team developed a retardation equation that mathematically characterized the impact of a bullet on the tissue of a cat.1
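For readers curious what a “retardation equation” of this kind looks like, the following is a sketch of the generic drag-law form that penetration equations in wound ballistics typically take; the formula and symbols here are my illustration of the genre, not necessarily the Princeton team’s exact result.

$$ m\,\frac{dv}{dt} = -\tfrac{1}{2}\,\rho\,C_D\,A\,v^{2} \quad\Longrightarrow\quad v(x) = v_{0}\,e^{-\alpha x}, \qquad \alpha = \frac{\rho\,C_D\,A}{2m} $$

Here m is the projectile’s mass, A its cross-sectional area, ρ (rho) the density of the tissue, C_D an empirically fitted drag coefficient, v_0 the velocity at impact, and x the depth of penetration. Because the retarding force grows with the square of the velocity, the projectile’s speed, and with it the rate at which it deposits wounding energy, falls off exponentially as it moves through tissue.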
Why do we know the exact speed at which a bullet shatters the leg of a cat? Why do we have this knowledge and not other knowledge? Scientists routinely say that there remain more questions than answers in their own areas of expertise, many proposing that we know only about 5 percent of what could possibly be known in geology, astronomy, biology, chemistry, and physics. Medical knowledge is notoriously uncertain, with many gaps in knowledge. Why do we know what we know and (therefore?) not know other things?

The answer at least begins with a consideration of political structures that supported and made sense of scientific and technological knowledge: technologies matter in the ways that they are understood to matter, made to matter. Social and political order produces, in part, the effectiveness of technical objects. And most technologies are in practice assemblages of human and nonhuman elements.2 For example, an electrical system involves wires and energy, rules and protocols, governmental organizations that build the system, workers who maintain the system, consumers who use it, and legal experts who negotiate its safety. The relationships between human and nonhuman actors differ in different systems, but to speak of a technological system without attending to these relationships is to miss something important.

Over and over again when we look at the history of military technology, we see the ways that social belief and context mattered to the use of technologies. The matchlocks that worked fine on European battlefields were unappealing to Native Americans in New England, who preferred flintlocks. The difference reflected their ideas about how to use the technology. The chemical weapons that all participants used freely—almost carelessly—in the First World War were then never officially used again by the vast majority of nations, a circumstance that has many explanations, none definitive. The political barriers to using even tear gas—not to mention deadly gases like sarin developed after the First World War—remain high. Offenses happen, but are internationally condemned.3 And many technologies—for example, many forms of artillery—worked best when they were the center of social cohesion, of a group consensus, so that their efficacy, their utility on the battlefield, was a direct product of a social alliance. Essentially, artillery teams came to constitute what science studies scholars commonly call a sociotechnical
system. The people in the sociotechnical systems I explore stand in many different places. Physicists, chemists, engineers, and other experts, for example, built nuclear weapons. Career civil servants, elected leaders, consultants, private industry insiders, and even journalists managed nuclear weapons as chess pieces in diplomatic and political power struggles. Laborers handled dangerous materials and cleaned up messes. Members of the Armed Forces guarded, maintained, and moved them. Weapons have producers, workers, many kinds of users, and consumers of several kinds, including those who experience their intended effects as battlefield technologies of war. All these people are important to our historical understandings of science and war.

Among the people who matter most for any historical or medical understanding of nuclear weapons are those who were exposed at Hiroshima and Nagasaki and also those who directly experienced the estimated 2,000 nuclear weapons tests that were detonated around the world—US, Soviet, British, French, Chinese—in programs of atmospheric weapons testing in the 1950s. I call these end-users exposed to radiation the ultimate consumers of nuclear weapons. I recognize that this is an unconventional use of the idea of the consumer. We think of a consumer as someone who desires what is consumed or purchased. But by situating those affected by weaponry in general (not just atomic bombs) as consumers, I mean to bring them completely into the industrialized “supply chain,” as legible participants who must be considered whenever the technology is assessed and must be a part of any historical account of the weaponry.4 I propose that the people whose bodies experience weapons know most intimately what it means that such technologies have been produced, stockpiled, tested, and used. In that sense, they are (unwilling) consumers of such technologies, and their experiences are central to the reconstruction of the history of science, technology, and war.

Modern science was to some extent born militarized. Experts were asked to solve practical problems of weaponry, ballistics, chemistry, mapping, and health as soon as they were officially recognized as experts. Galileo was known best in the Ottoman Empire for his essay on gunnery. Militarization was not an external force imposed on the new natural philosophy but an integral part of its rising legitimacy, authority, and relevance from the scientific revolution to the present. Not all knowledge was relevant to state power, but states sought out knowledge that was. And science and the modern state grew up together in Europe. As the great historian Paul Forman observed in 1973, the relationship had contradictory tensions: “By the middle of the 17th century
we have in highly developed form that apparently contradictory union of the notion of a Republic of Science—of an activity and body of knowledge which transcends national boundaries and loyalties—with the most acute consciousness of the national origin or affiliation of individual scientists and scientific achievements.”5 Some of that acute consciousness reflected the high value to the state of technical expertise, from the earliest beginnings of science through to the present.

For the past century, at least since European powers self-destructed in the course of the First World War, the United States has been the dominant military power in the world. It has also been a scientific and technological powerhouse, leading the world in Nobel Prizes over the last 70 years (375 winners as of 2019 vs. 129 for the runner-up, the UK). The United States economy has supported vast industrial laboratories, federally funded centers of scientific research, leading global universities, and wealthy foundations with commitments to the production of new knowledge. That much of this knowledge has been oriented around state military priorities is perhaps insufficiently recognized.

In 2018 the United States spent $649 billion on defense. This was more than the next thirteen countries combined: 36 percent of world defense spending was by the United States, with only 14 percent of world military spending by China and less than 4 percent each by Saudi Arabia, Russia, the UK, India, France, and Japan. In terms of spending as a share of GDP, the United States is also unusually high, with about 3.2 percent allocated to defense. There are a few countries that exceed this percentage. These include Algeria, Angola, South Sudan, Bahrain, Armenia, and Oman. But most stable, wealthy nations spend 2 percent or less of their total GDP on defense. (All these numbers come from the impressive databases compiled by the Stockholm International Peace Research Institute, which carefully tracks and updates arms markets and defense spending.)
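As a quick back-of-the-envelope check of scale, using the figures above (the arithmetic here is mine, not the book’s): if $649 billion represents 36 percent of world defense spending, the implied world total is

$$ \frac{\$649\ \text{billion}}{0.36} \approx \$1.8\ \text{trillion per year}. $$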
The 2018 numbers reflect longstanding historical trends: During the so-called American Century, from the Spanish-American War in 1898 to the 9/11 attacks in 2001, the United States developed and sustained global military dominance with aggressive strategies focused on new technical knowledge. It created well-funded programs in chemical weapons, biological weapons, nuclear weapons, psychological warfare, computing and information sciences, and many other areas. This confluence of both scientific and military dominance is not a coincidence. The two domains overlap. Together they have affected modern life both in intimate, everyday ways and in the grand worlds of geopolitics and trade.

Many citizens in the United States seem to have little understanding of the daily impact of this defense funding. Let me suggest one internet exercise. Every day at 5 p.m. the US Department of Defense posts all contracts awarded that day with a value of more than $7 million: https://www.defense.gov/News/Contracts/. The 16 such contracts awarded in one day and announced at 5 p.m. the afternoon before I wrote these words (on the morning of May 25, 2018) included $969 million to a coalition of academic and health-care industry organizations, $558 million to Lockheed Martin, $416 million to Boeing, $19 million to Motorola, and $28 million to Ocenco Inc. for further development of an emergency escape breathing device. The total awards for May 24, 2018, came to about $2.5 billion, for one day. That left out any contracts for less than $7 million. The absence of academic awards to schools of engineering, medicine, and arts and sciences in this particular daily listing is probably explained by that $7 million daily limit: many awards to university researchers would be for less than $7 million.

In 2018 the Department of Defense (DoD) was the third largest sponsor of academic basic science, behind the National Institutes of Health and the National Science Foundation. For schools of engineering and for computer and information sciences, DoD is the top funder. Life sciences are heavily supported by the Defense Health program. Roughly one-fifth of DoD research funding is set aside for internal projects at defense sites and laboratories. Much of the rest goes to private industry and academe. DoD allocated $70 billion to defense research and development in 2016. Most important for academe are those projects supported through category 6.1, “far-sighted, high payoff research with no obvious application.” But as my story suggests, knowledge with no obvious application has often become relevant to defense.

Prior to the Second World War, US federal defense support of scientific research at colleges and universities in the United States was minimal. Federal surveys that tracked all sources of research support for science began in the US in 1938—a sign of systematic government interest in how knowledge was funded. The range and specificity of these surveys escalated after 1940, with particular attention to industrial research that received government support. Before the mobilization of the Second World War, many scientists studied questions that were of interest to the Army or Navy and became
involved in military programs, but defense funding was not an important source of academic support. During the war, however, campuses were rapidly transformed. Scientists of all kinds were recruited to help with defense projects and new institutions were created to support military needs. The Office of Scientific Research and Development, created on presidential orders in 1942 and run by the engineer Vannevar Bush, helped to generate useful new knowledge during the war (and has been the focus of significant scholarship).6 After the war, in 1946, the Navy created the Office of Naval Research to carry on the successful alliances between university scientists and naval leadership. In 1951, both the Army and the Air Force Department created research offices with similar goals, and by 1950, the Office of Naval Research was supporting 40 percent of all basic science research underway in the United States. Most accounts recognize the Second World War as a turning point in the military funding of science.7 As a result, a massive system of national laboratories and research centers of all kinds grew out of the war effort.8 Laboratories at Oak Ridge, Tennessee; Hanford, Washington; Los Alamos, New Mexico; and Livermore, California, became critical employers of scientists, engineers, and mathematicians of all kinds. As Michael Dennis has suggested, this transformation in science funding animated policy debates that circled deftly around core questions about the nature of science. It threatened to reduce scientists to hired technicians and to make them trained workers rather than highly trained knowledge producers. Such a status might undermine their professional claims to universality, neutrality, and autonomy.9

Defense funding came to dominate research support for basic science. While some other civilian federal agencies like the National Science Foundation (created in 1950) and the National Institutes of Health (formally created in 1930 but with a longer history in federal public health programs) began to play a larger role in supporting scientific research, defense funding dominated research support through the 1950s and 1960s. In 1958, 41 percent of all basic research undertaken at universities in the United States was funded through Pentagon agencies and programs. But in the 1960s, with the rise of campus unrest and faculty protests relating to US participation in the war in Vietnam, things changed. At many universities, secret defense research came to be seen as inconsistent with the mission of the institution. Faculty Senates voted to ban Pentagon funding, and universities dropped their institutional association with Department of Defense funded laboratories like the MIT Draper
Laboratory and the Stanford Research Institute.10 The Department of Defense withdrew support from 16 other such federally funded research centers.11

The close relationship between defense and university scientists roared back to life after the election of Ronald Reagan to the US presidency in 1981. Faculty resistance to defense funding was mitigated by policies that limited how much classified work could be done on campus. Department of Defense funding agencies also began to more explicitly support research projects even if they had no obvious or immediate practical applications. Secrecy restrictions eased as a result of pressure from universities. Scientists were more likely to be able to present their work publicly at scientific meetings and to publish. However, after the US Department of State began investigating some foreign scientists on US campuses in 1980, new tensions emerged. These led to a 1984 statement by the presidents of Caltech, Stanford, and MIT that their universities would refuse to do certain kinds of research if the Pentagon restricted publication.12

Some of this response reflected long-simmering tensions around the special place of scientists and science in any political system—communist or capitalist, democratic or fascist. The liberal Western response to fascism and communism in the twentieth century included the promotion of the idea that science was antithetical to coercion and violence and that science could thrive only in a capitalist democracy. As David Hollinger eloquently shows, the sociologist of science Robert K. Merton’s wartime 1942 formulation of the “scientific ethos” reflected the notion that science and democracy were expressions of each other. Merton saw both as threatened by fascism.13 And fighting fascism would require inculcating the core values of science in every citizen. The “fellowship of science,” Merton and other social scientists suggested, could be imitated by any citizen who had commitments to honest and free inquiry, critical evidence-based knowledge, and antiauthoritarian values. Science exemplified a free society and was critical to sustaining democracy. As philosophers like Michael Polanyi argued, a truly free society needed science.14 James Conant, a chemist who had worked on chemical weapons in the First World War, and who became the President of Harvard, also argued that the practice of science exemplified what a free society should be: a collection of people driven by reason, persuaded by evidence, and able to act in the world from the perspective of what was really true. Conant’s “red book,” General Education in a Free Society, identified science as the foundation of the
“spiritual values” of democratic humanism. It was not scientists who belonged on a pedestal, he proposed, but those who acted like scientists in the absence of the social supports of the scientific community. The impartiality of science was an “unheroic routine” for those disciplined by peer review and critical commentary. The heroism was in those who could think and act like scientists in other domains.15

These observers were responding to the carnage of twentieth-century warfare, and to an emerging recognition that rational thinking could lead to tragedy. The wonders of science, by the 1940s, included technologies of mass destruction. Producing these technologies, in turn, brought technical experts to new kinds of field sites, where damage became a guide to further damage. We refer to “collateral damage” as the unintended consequences of the chaos of war. These commonly include noncombatant deaths (children, women, the elderly) and destroyed transportation systems or urban neighborhoods that were not the intended targets. Collateral damage is used to refer to intent, to those forms of destruction that were not the purpose of the bombing raid. My concept of collateral data is to some extent the same. It is the “unintended” production of opportunities to collect and assess new knowledge, as a result of human or environmental damage produced by war.

Modern battlefields at least since the 1940s have been field research sites on a grand scale. Real-time shock research on the Italian front during the Second World War, for example, involved soldiers so badly wounded they were expected to die—so they were turned over to scientists to study.16 Hiroshima and Nagasaki became postwar field sites for understanding physics, cancer, psychological trauma, and heredity, as both the ruins and the bomb survivors were subjected to a wide range of long-term scientific studies.17 The Korean War, begun in 1950, started with a plan for scientific and technological field research, where clothing and shielding, evacuation protocols, and medical field practices could be tested. More than earlier battlefields, the fields of Korea were understood as a scientific opportunity, a chance to collect data in real time on an active front. Increasingly, battlefields and ruined cities came to be interpreted as rich experimental sites, to be mined by military experts and scientific teams. Scientific research could be built into invasion plans, and knowledge could be one downstream result of violence, just as violence was the downstream outcome of knowledge production.

For scientists, these new landscapes of knowledge production changed the experience of being a technical expert, and this book is as much about the
impact of militarization on the scientific community as it is about the impact of science on the practice of war. I suggest that evolving relationships between socially sanctioned violence and sophisticated technical expertise are as fundamental to human history as is the rise of the sovereign state, the European conquest of much of the rest of the world, or the general unfolding of international disputes that usually falls under the rubric of military or political history. Indeed, the events, people, objects, and narratives to which I attend in this book are central in all these domains. Mainstream historical accounts too often present science and technology as deterministic and autonomous forces, as “coming on the scene” somehow (by magic?) rather than purposively constructed by sustained and complex human actions and choices. My work explores those human actions and choices, and their consequences.

The historical story I track is one of irony, tragedy, brilliance, and creativity. The core irony involves human intelligence. The human ability to think—the mind—has long marked human exceptionalism, and differentiated “man” from “ape.” Over the last three centuries, this ability has been weaponized. At the same time, the mind has become a new kind of battlefield, a place where geopolitical forces and technical destruction intersect. Human intelligence is a resource for producing further human injury, as experts, supported by public funding, continue to devise more effective ways to damage bodies, minds, cities, and landscapes. And simultaneously, the mind itself is a highly vulnerable target, now more important to war in the twenty-first century in many ways than factories or military installations. Through terroristic-style warfare, the human mind has become a weapon itself. Fear and anger, produced through propaganda, can generate social and political damage.

This book may at times seem counterintuitive. It is oriented around morality but does not make moral judgements central. I view modern militarized technoscience as a moral disaster, in which the finest properties of the human mind have been leveraged to maximize human suffering. But I do not present a list of cost-benefit calculations or of enlightened and unenlightened experts. This partly reflects my conviction that we are all inside the worlds that made this state of affairs possible. As the science studies scholar Donna Haraway long ago encouraged us to see, in modern technoscientific society, there is no innocent place to stand. Like other feminist scholars in the 1980s, Haraway struggled to come to terms with the epistemological power of modern science. Science promises reliable truth about the world—a valuable
commodity by any measure—but also plays a well-known role in various projects of oppression, for example through scientific racism and sexism. She sought to construct an account of science that reconciled “radical historical contingency for all knowledge claims” and a “no-nonsense commitment to faithful accounts of a ‘real’ world, one that can be partially shared and that is friendly to earthwide projects of finite freedom, adequate material abundance, modest meaning in suffering, and limited happiness.” Feminists did not need, she proposed, a “doctrine of objectivity that promises transcendence.” Rather, they sought ways of making knowledge that were “answerable for what we learn how to see.”18 In technical systems focused on maximizing mass injury, scientists, engineers, and physicians with an overt professional commitment to human welfare learned to see how to damage people and societies more effectively. Everything I explore here is therefore intrinsically morally charged, whether those involved choose to notice this or not.

But for me, historical accounts that track technologies in terms of beneficence and destruction (the downstream civilian benefits of military technologies, for example, or the ways that war is “good for” medicine, or the moral scientists versus the immoral ones) are quietly proposing a cost-benefit calculation in which war has downstream benefits. Nuclear weapons lead to (supposedly cheap) nuclear power, for example, and the experience of trauma in military medicine leads to heroic civilian trauma care. It is true that some technical knowledge produced as a result of the bodily and material damage of war has had downstream civilian benefits. I am reluctant, however, to weigh the consequences of the militarization of knowledge on that particular scale.

One other perspective is important to this analysis. It is important to begin from a position that does not divide the world analytically into enemy and friend. So much military history inclines toward nationalism—toward the systematic explanation of victory, or the elucidation of brilliant strategy and leadership. Such work can be informative and sometimes fascinating—I have read my share of it. But I have different goals here and so invite the reader to follow a different thread with me, one that might be legible and informative whether you have benefited from the military power of the United States or been a victim of this or any other military system. My critical line in this story is not the line between good and evil, or right and wrong, or friend and enemy. Rather, I am tracking the line between reason and violence. That is the blurry boundary that this project circles. While reason might connote
what is good, and violence what is evil, the practices I explore were often both at the same time. Scientific knowledge often heals and injures at the same time, and in order to see this simultaneity clearly, it will be productive to set aside questions of nationalism or even just war. Those questions of military success and national dominance matter a great deal. But adopting a slightly different standpoint, I am proposing here, might make it possible to understand some new questions about how and why war and science look the way they do.

My title Rational Fog refers to ideas about the power of scientific rationality and about the “fog of war.” Carl von Clausewitz famously said that in war “all action takes place, so to speak, in a kind of twilight, which like a fog or moonlight, often tends to make things seem grotesque and larger than they really are.” The nineteenth-century Prussian military strategist was concerned here not only with strategy but also with the uncertainties, rationalities, and emotions of war. He proposed that war was a “fascinating trinity” of violence, chance, and calculation. Not romantic about fighting, he tended to assess battles in economic terms, in which even idealized notions like honor and genius were slotted into the “balance sheet of war.” Clausewitz borrowed language from commerce that made war both rational and financial—a calculation of cost-benefit in the “strategic budget,” with bloodshed serving as the occasional cash transaction in a business normally operating on credit.19

Peter Paret suggests that Clausewitz has been misinterpreted and misunderstood in large part because his ideas have been abstracted from their time and repurposed for later debates. This is unquestionably true: Clausewitz has been “read as though he were a late twentieth century defense analyst.”20 And yet this reading, however historically suspect, is part of what makes him relevant to me. The idea of the “fog of war” was much discussed in military circles in the United States at the height of the Cold War, and Clausewitz was perhaps less influential during his own times than he was 130 years after his death (in 1831, of cholera, a circumstance that forced his wife Marie to prepare the final manuscript of On War for publication in 1832).21 Herman Kahn’s chilling 1960 text On Thermonuclear War, a cool calculation of surviving a nuclear war, was titled in homage to Clausewitz.22 As the Cold War escalated in the 1950s, Clausewitz became one of the most frequently quoted military theorists of the era, his insights invoked to explain war and make sense of policy. This was partly because he proposed that the destruction of any enemy in war would logically be absolute and
complete. It is important to point out that Clausewitz did not “advocate” such destruction. He rather pointed out that this was the logical theoretical goal of any armed force, unless and until the enemy surrendered. Scientific and technological developments of the mid-twentieth century made the idea more than theoretical and gave his words a new, disturbing resonance.

The complete destruction that made his ideas so resonant in the twentieth century was produced by laboratory science and by people with an explicit commitment to the power of reason and rationality. Many of those I consider here, the scientists, engineers, physicians and other experts, believed in the power of disciplined human thought. Experts are generally people who have oriented their professional lives around the potential of rationality. But like the generals Clausewitz so shrewdly observed, they have often operated in the fog and the twilight of war. At times, their communities were torn apart and their professional lives shattered by the consequences. This then is the “rational fog” that I explore. I try to bring knowledge and violence together in historical terms with the same sense of force, intensity, and complexity that has characterized actual practice over the last three centuries. In the process, I map the gray zones where people who trained to pursue truth became agents of downstream violence.

I have here focused primarily on events in the United States. I begin with a very rich case study of the use of the gun in Europe and elsewhere after 1500, but then turn to industrialized science and technology after 1800, with particular attention to science and technology in the United States in the twentieth century. The United States is the country and citizenship into which I was born and on which most of my own scholarly research has focused. Many related trends and tendencies appear in the history of science and technology in Russia / the Soviet Union, Germany, the United Kingdom, France, and other nations. I refer to and cite some of this literature, but this book is oriented around the history of militarized sciences in the United States. I explore how experts tried to negotiate their relationships with the state; how secrecy and security have shaped what it means to be a scientist or an engineer; how battlefields were reconfigured by technologies; and how masculinity and courage were gradually operationalized in terms of discipline and training rather than moral character. I draw on both primary and secondary sources and particularly on a rich and compelling scholarly literature in the history of science, technology, and medicine. I show that the transformations of war produced through technical expertise absorbed human genius
and human creativity—they involved the application of the finest properties of the human mind to the production of human injury. They were not “inevitable” or “natural” but contingent and historical. And they are deeply implicated in modern history in general, in ways that have often been neglected in historical accounts.

Many of the finest minds of the last few centuries have turned their talents to the efficient production of human injury. Objectively brilliant human beings—some of the most impressive thinkers ever to live—have knowingly and purposively set out to construct ever more devastating ways to destroy human bodies, minds, cities, and societies. And they have succeeded: We truly have available what Mary Kaldor once called a baroque arsenal, full of diverse ways to damage people, including many kinds of missiles, bombs, tanks, drones, land mines, chemical and biological weapons, submarines, psychological torture programs, propaganda, the internet, and information surveillance methods.23 Today this arsenal is distributed through both legal and illegal markets, widely available to all buyers. It has significant consequences in virtually all global relationships.24 And it is functionally both secret and public, the focus of what Peter Galison calls an “antiepistemology,” with a “staggeringly large effort devoted to impeding the transmission of knowledge.” Epistemology asks how knowledge can be uncovered and secured, he notes, but “antiepistemology asks how knowledge can be covered and obscured. Classification, the antiepistemology par excellence, is the art of nontransmission.”25

It is not only that science and technology have decisively changed warfare. Engagement with war has also changed science. The historical trajectory that binds knowledge production to the production of violence—that threads these two enterprises together with such intensity—has also generated modern guerilla warfare, terrorism, and cyberwar. Emotions rather than factories are the critical targets in much twenty-first-century conflict, and this is one result of the incredible technological superiority conferred by science in prosperous nations. The forms of warfare called today “terrorism” are technoscientific workarounds that respond to the efficacy and superfluity of sophisticated weaponry.

How did we build this assemblage of brutality and pure truth? This book is an attempt to frame that question clearly. It is not a canonical history of science, technology, and war. It is rather a reflective exploration of technical violence.26 It is informed by feminist theory, science and technology studies, and ethnographic and sociological scholarship. Many of the topics I explore have been
examined in great detail in extraordinary, sometimes riveting, scholarly studies of particular nations, technologies, scientific disciplines, and military campaigns. I draw on this scholarly literature to reconstruct events, reflect on their connections, and provide guidance to readers who wish to learn more. I also draw on my own scholarship, which has focused on science in the United States after 1945, and particularly on interpretations of the atomic bomb produced by the scientific community.

In the chapters that follow, I track how key technologies and sciences mattered in the history of science and war. My first case is the very rich history of the gun, a simple technology that helps us understand the idea of the sociotechnical system and the critical importance of various kinds of “users.” Some people asked to fire a gun in battle apparently simply did not do so, and while “mock firing” was discovered only in the twentieth century, it could later be reconstructed as a profound historical reality. I then turn to the processes of industrialization—interchangeable parts, efficiency, rational management—and their roles in technoscientific warfare. I suggest that the logic of mass production was also the logic of total war and eventually the logic of indiscriminate urban bombing. By the 1940s, civilian workers produced the airplanes that legitimated air power strategies that bombed civilian workers’ neighborhoods. By the First World War, my third case study, the battlefront was a technoscientific achievement engaging the labor of Nobel Prize-winning chemists and physicists, a site of mud and knowledge, brutality and truth. It also shattered the international scientific community, as German chemists were blamed for the first uses of gas and not welcome at scientific meetings for almost a decade. That war was in many ways the “training ground” for the generation of scientists, engineers, and physicians who played leadership roles after 1939 in Europe and the United States. As I show in Chapter 4, they had learned the brutal lesson that science and technology were crucial military resources, and they applied that lesson with skill and creativity not only in the US and the UK but also in Germany, Italy, and Japan. Both Axis and Allied nations weaponized their experts, as mass mobilization reshaped careers and lives and scientific agendas.

One of the most important mobilization projects produced the atomic bomb, and I explore how the bombs were built and how the damage they generated was used to produce new knowledge. I show in Chapter 5 that Hiroshima and Nagasaki after the bombings were the focus of significant scientific research by Japanese and US physicists, geneticists, psychologists, botanists, physicians, and other
experts, as well as Strategic Bombing Surveys and a survey by the Manhattan Project scientists. But as I suggest, those studying Hiroshima and Nagasaki selected what they would notice and problematize and what they would not notice or not see.

In my next chapter, I consider how militarily relevant scientific research constructed an image of the human body that was oriented around violence. In 1943, Yale physiologist John Fulton described the brain to a colleague as “a semi-fluid substance, suspended by fairly inelastic attachments in the cerebrospinal fluid in a rigid box.” Fulton selected those properties of the brain relevant to its demolition by firearms. I propose that his perspectives reflected a general emergence, after 1900, of a suite of biomedical sciences that saw the body as a target and a battlefield. At the same time, and over roughly the same period, the human mind, too, became a special focus of studies that were intended to elucidate how the mind could be broken down, destroyed, or manipulated. In Chapter 7, I consider how changing minds became a key state project. Scientific and sociological research on propaganda and communications, psychological warfare, brain-washing and mind control, and obedience to authority, often with military support, helped establish ways of scientifically controlling feelings and thoughts in order to control economies and political relationships. This research made the mind a critical battlefield in technoscientific war.

But there were other emerging battlefields, too, and by 1980, the entire earth had become a battlefield literally “filled” with technological achievements. In “Blue Marble,” Chapter 8, I explore how the Cold War arms race brought technologies, weaponry, people, air bases, missiles, and detonations to places once invisible, inhospitable, irrelevant, unknown; to tropical paradises, frozen landscapes, deserts, islands; and to airless, cold places in space and the upper atmosphere and under the sea. These were often places seen by military planners as empty, valueless, owned by no one, occupied by no one, remote, and expendable. And they became sites of engineering and scientific feats of astonishing scale and cost.

I close with a consideration of what the militarization of knowledge meant for those who made the knowledge. In “Hidden Curriculum,” I consider how experts in fields from physics to sociology found their research calibrated to empower the state, and scientists trained to see themselves as creating knowledge as a social good found themselves engaged in something that felt very different to them. I track their discomfort and their efforts to negotiate a solution and their professional and personal decision making as they tried to navigate a system they did not control. I end with a consideration
of our current vexed circumstances, as all technical knowledge has become a resource for violence. As my account suggests, everything humans know about nature can become a resource for state power, and every form of knowledge can cut both ways. If you know how an economy works and what facilitates its growth, you also know how to bring that economy down. If you understand what the human mind needs to sustain a sense of safety and order, you also know how to destabilize that mind. If you understand the engineering of a bridge, you know how to bring it down. And if you know how to intervene to stop a pathogen, virus, or bacterium, you also know how to maximize its spread.

Over the last century, scientists and engineers figured out many ways to produce human injury. It was not the most obvious use to which human intelligence could have been applied, but it has been a very important one. In characterizing how and why this occurred, I have invoked efficiency and reason—ideas central to the very models of rationality that I describe—to suggest that at least some of this scientific effort has involved significant waste of human ability and talent. I do not see an easy way to reorient knowledge around “the welfare of mankind,” though I think that seeing the problem clearly is a first step.

I recognize and acknowledge that this is perhaps an awkward historical moment—with the rise of troubling new forms of antiscience politics—to call attention to the roles of technical knowledge in state-sponsored international violence. The legitimacy of science as a system of accurate and reliable knowledge production that can be trusted to be correct is under attack in the United States and elsewhere. Business, religious, and political leaders and the general public have challenged and even rejected scientific assessments of vaccines, evolution, climate change, and other natural phenomena. Critics invoke changes over time in dietary advice to suggest that all science is untrustworthy. Implied in this rejection is a general skepticism about the legitimacy of the scientific method and the reliability of technical knowledge. All I can say in my defense is that if there is any domain where the scientific method has proven its practical value, a billion times over, it is in the domain of war. My story demonstrates the incredible power of the systems of thinking and research that we call science. It tracks the overwhelming legitimacy and authority of technical knowledge systems that truly work. The history of science, technology, and war amply demonstrates that the scientific method can generate powerful and trustworthy insights. Anyone skeptical about the legitimacy of the scientific method might benefit from
contemplating scientific studies of the chemistry of gunpowder, the physics of the atomic bomb, the geology of proposed nuclear storage at Yucca Flats, or the mathematics of drone trajectories and geospatial maps. These are systems that have reshaped the global order. At the very least, they suggest that the methods of science can be effectively leveraged to produce new knowledge. While military science and technology are not generally celebrated as contributing to the “welfare of mankind”—that much-repeated goal of modern science—they establish without question its practical legitimacy. In an utterly straightforward way, they prove that science can produce truth.

Modern war brings together violence and knowledge, brutality and truth. It therefore links technical knowledge—science, medicine, and engineering—to the state’s war-making capacity. This well-recognized intersection is a point at which certain features of the modern social and political order come into particular focus. The goal of this book is to elucidate those features. I reflect on why both warfare and science look the way they do today. In their brilliant economic audit, Schwartz and his co-authors tracked down exactly how much money was spent on nuclear weapons systems.27 My work is an audit of a different kind: I explore how much skill, intelligence, and insight have been devoted to military violence. Hence, my audit begins.
1 To Hold a Gun
Historians have granted no other single technology a larger role than the gun in shaping human destiny over the last five centuries. While I am skeptical about the ways that this causal framing generally works, I recognize its ubiquity. The gun has appeared as the engine of modernity and the key to the rise of “the West,” the modern state, the slave trade, imperialism, and European cultural and economic dominance. Even the printing press has not been accorded the centrality of the gun. Historians have seen gunpowder and the gun as transformative, determinative, and the causes of diverse and seemingly unrelated historical shifts of global importance.

In many ways, the gun provides a powerful example of how technological determinism unfolds as a form of explanation. It is a relatively simple technology, at least at its core, involving a small contained fire, the gases from which put a projectile in motion through a metal tube. In ways that may not be obvious, it was modeled on the bow and arrow, substituting a controlled explosion for the human force of muscles to launch a projectile. But despite the core simplicity of the technology itself, using guns in a systematic way to sustain a long-distance, large-scale military campaign came
to be extraordinarily complex. Any large army dependent on guns needed constant and reliable access to dry, high-quality gunpowder. It could not be mixed or manufactured in the field. There had to be supply lines, transport vehicles, and ways of keeping the gunpowder from spoiling or breaking down. Making gunpowder in turn required chemical expertise, and states needed new research institutions to support it, new sources of the critical ingredient saltpeter (an organic substance found in manure, human excrement, and bird droppings), and new experts (chemists).

Making reliable guns themselves was also challenging. As the historian Ken Alder has demonstrated, state interest in producing more reliable guns led directly to the first uses of interchangeable parts in late eighteenth-century France. As things turned out, guns were uniquely suited to industrial mass production. They began as craft products of individual skill, each one unique. But they “invited” innovation. Standardization solved many military problems. Soldiers expected to use guns effectively in battle, furthermore, needed to be trained. The complex process of loading, reloading, and firing under difficult, life-threatening circumstances needed to be (emotionally) automatic. Systematic drills made such performances possible. Eventually it even became clear that soldiers needed training to help them overcome human resistance to aiming a gun and shooting and killing another person, an act that in practice could be extraordinarily difficult. There were therefore technical, material, institutional, and emotional systems built around this technology. Making it work required attention to all.

Historians of science and technology call something like this a “sociotechnical system.” It’s a shorthand way of capturing the ways that technical objects—things—shape the social and political order not only by virtue of their technical properties or physical qualities as objects, but also by virtue of the systems and institutions that come into being to manage them, make them useful, and realize their potential. The only way to understand the gun is to pay attention to the complex systems that made it a functional and productive military technology.

When historians talk about the gun causing the rise of the modern state, the Atlantic slave trade, or the rise of European empires, they are folding into the technology itself (“the gun”) these sociotechnical systems that performed its political and social force. It is sometimes productive to tease such elements apart. This can help us make sense of an object that by any measure has had
profound, enduring consequences. Teasing apart the elements shaping the (undeniable) power of the gun is therefore one goal of this chapter. The history of the gun illuminates technologies as systems of meaning, embodiment, practice, and social order. The gun is much more than a small explosive device. It is a powerful symbolic object that performs its effects, like so many other technologies do, through social relationships. It is also a critical technology in the history of the body. With the rise of the gun, different and more damaging wounding at a distance became possible.

Gunpowder came first. It began as a Chinese alchemical curiosity, possibly as early as the ninth century of the Christian era and probably part of Daoist alchemical studies of immortality. It might have been used and studied in places other than China as well. There were very early references to an explosive powder in texts from India. We know that by the eleventh century, it was regularly manufactured for use in Chinese weaponry. A fire spear, developed in China, was a tube filled with gunpowder that was intended to set fire to a target. Fire spears were heavy and awkward and could not send fire very far, but enough of them could ignite a target. At some point after around 1200, Chinese gunsmiths developed the true gun. A true gun is defined as having a barrel, high-nitrate gunpowder, and a single projectile. Around this time, guns, grenades, rockets, and other incendiary weapons became regular tools of war in China. These technologies were used extensively in siege and naval warfare by vast armies and navies in a period of chaotic conflict between the Mongols, the Song Dynasty, and the Jurchen Jin Dynasty. Chinese battlefields were filled with gunpowder technologies and Chinese military forces used novel strategies to exploit them. Essentially, then, early modern warfare was invented in China between about 1200 and 1400, as the historian Peter Lorge has conclusively shown.1

It is important to say this very plainly because much of the literature produced by European historians over the last two centuries about the rise of the gun has left out or minimized the story of gunpowder warfare in China. The gun has played both an actual role in European history that matters a great deal and a (somewhat tiresome) symbolic role as a sign of European superiority. There is a similar tendency in stories about other kinds of military technology that came later, reflecting the force of technological seduction. In my view the celebration of technological differences—particularly as markers of national or continental superiority (and here tone can be important)—is the opposite of serious historical analysis. But it is common even in
many very respectable and serious historical accounts. It plays a particularly discouraging role in much of the literature on military revolutions, a literature that more or less tracks European dominance of most of the world by explaining European mastery of heavy artillery, volley fire, big-gun sailing vessels, and so on.2

It is nonetheless true that European inventors, kingdoms, and nascent modern states embraced gunpowder and guns, modifying them aggressively and expanding their styles and their uses. Chinese gunpowder technologies came to look "backward" even though the gun appeared in China first and migrated on early trade routes out of China around 1320 to Europe, India, and Africa. Arriving at around the same time, along the same trade routes from China, was the plague, the Black Death. European nations lost about half their populations in a few short years, 1346–1353. It is intriguing to wonder how reactions to the gun as a potentially transformative technology might have been mediated by this profound social and biological trauma.

Guns were not universally admired when they first became a presence on European battlefields. Martin Luther in the early sixteenth century proposed that cannon and firearms were "cruel and damnable machines" possibly produced by the devil. Shakespeare around the same time has a character in King Henry the Fourth complaining of "villainous Salt-peter" and proposing that "but for these vile Gunnes, He would himselfe haue beene a Souldier." Guns seemed, to him and to others, to make war less glorious.3 Machiavelli's treatment in his 1521 The Art of War (a text that has not aged as well as The Prince) was dismissive. The great Italian political observer ignored and despised gunpowder weapons partly because they were not found in the recorded history of the ancients. One historian has proposed that he could not accept firearms as a significant military and political innovation, even as that innovation unfolded all around him, because this would have undermined his model for military virtues.4 And guns were not seen as particularly virtuous: some commanders hated musketeers and punished captured enemy gunners by cutting off their hands. Unlike hand-to-hand combat and knightly warfare, guns caused death in ways that reflected neither character nor social or moral status.5 Any peasant could kill with a gun. By the sixteenth century gunfire began to kill more and more people, particularly as guns grew larger and armies adopted practices like enfilading, which is firing at an angle across slow-moving formations of disciplined marching troops.
FIGURE 3. Saltpeter works in central Europe. Francis Malthus, Pratique de la guerre contenant l'usage de l'artillerie, bombes et mortiers, feus artificiels & petards, sappes & mines, ponts & pontons, tranchées & travaux (Paris, 1681), following p. 150. The Huntington Library.
The gun is at least two technologies. It is a firing tube that eventually had a trigger, and it is gunpowder, an explosive used in the Arab world and in China centuries before guns were produced. Making the tube and the trigger was relatively simple. Making gunpowder was not. Indeed, the need for gunpowder reshaped trade and incentivized empire-building. Gunpowder is a finely calibrated mixture of charcoal, sulfur, and saltpeter (Figure 3). There were many gunpowder recipes and many claims about what made the mix perfect. Some called for the charcoal to come from burning
particular types of wood, but generally both charcoal and sulfur were easily acquired. The saltpeter was the hard part.6

Saltpeter, a natural product of the decay of organic substances including manure, rotting plants, and even the bodies of animals, formed about 60 to 75 percent of black powder recipes. It had long been employed in human production systems, such as the bleaching of textiles, soap-making, metallurgy, and even as a preservative in cheese. It was also used in fireworks in various places. Presumably its combustibility had been witnessed many times. With the rise of gunpowder weapons, acquiring saltpeter became a higher priority for European powers. Saltpeter could be "farmed" using animal manure, human and animal urine, bat and bird guano, and other natural sources that could be induced to undergo forms of bacterial decay. And the need for saltpeter made excrement valuable in ways that provide us with insight into the shifting status of individual property rights during the rise of the modern state: states could and did boldly seize privately held animal manure, human latrine waste, pig dung, and pigeon droppings. Such materials were the military equivalent of contemporary oil or uranium—central to maintaining state power.

David Cressy's vivid account of the roaming saltpetermen of Elizabethan England, for example, highlights the ways that collecting dung was an exercise in royal power. The king's subjects were required to turn over their urine and manure to the state and to permit their houses, barns, and churches to be ruthlessly excavated (sometimes with no warning) for the extraction of these precious materials. The "incivility" of the saltpetermen, who sometimes broke into houses to dig up rooms and tore up church pews where women supposedly urinated in place, enraged landowners. The practices of raiding and extracting manure from English households, churches, and farms did not ease until new sources of saltpeter in India made them irrelevant after about 1630.7 But the record of these intrusive activities demonstrates how the need for gunpowder could justify oppressive—even shocking—disruption of local properties and rights.

Between 1400 and 1900, then, European states depended on saltpeter for military security. Some states that were powerful when gunpowder weapons came to dominate European battlefields lost their offensive advantage when they could not keep up with saltpeter supplies (Sweden, for example). Other seemingly "backward" states like England expanded their imperial ambitions and their technological range and strength through saltpeter.8 Most European
states could not make enough gunpowder on their own. To get it, they had to make deals and alliances, buy guano from sources in the global South, and establish contracts with large saltpeter farms that could process manure and urine. And increasingly, guano was needed not only for gunpowder but also for fertilizer to increase food production as populations began to soar.9 Quality gunpowder production depended on the expertise of chemists, and most European states began supporting chemical laboratories studying gunpowder after 1700. This was a significant form of state support for scientific research. Gunpowder analysis focused on finding the perfect recipe and assessing why different chemical combinations performed the ways they did. Chemists published scientific treatises about gunpowder, which was a substance both politically important and chemically interesting.10

One critical innovation was called corning. Transporting gunpowder over rough roads could cause the various constituent elements to settle by weight, and thereby reduce the effectiveness of the mixture. Corning involved moistening the perfectly mixed gunpowder, permitting it to dry, and then mechanically breaking it up into kernels. These kernels, or corns, preserved the right proportions in every sample.11

In the seventeenth century, armies in Europe grew larger. Forces had generally been relatively small, perhaps 5,000 men, but with the rise of gunpowder armies and with escalating tensions in Europe between emergent states, they grew as large as 100,000. Such massive armies required significant supplies of both gunpowder and guns. These technologies came under more and more centralized production in state-sponsored armories. State defense needs became huge and no feudal lord could sustain them. Guns therefore in theory led Europe out of feudalism. With the gun emerged a new form of modern state, with standing armies, nationalist identities that subsumed different ethnic or linguistic groups under a single civic order, systems of formal taxation, and the mass production of military technologies like guns, artillery, cannons, and gunpowder.

The literary scholar Sheila Nayar has even proposed that gunpowder and guns provoked the emergence of the early modern chivalric romance, as writers tried to come to terms with threatened constructions of heroic masculinity and invoked a nostalgic past of jousting, dueling, and dressage.12 "How did one perform aristocratic manliness in an era when the cavalryman was a diminishing presence on the battlefield? Moreover, how did one assert knightly courage in the face
FIGURE 4. Gunlock for a musket, St. Etienne, 1777. Musée de l’Armée (#16640).
of the proverbial 'meaner' sort with a loaded musket?" Technologies, she suggests, like morality or education can drive literary fashions.13

Around 1450, the common musket of Europe weighed between fourteen and seventeen pounds. It had a trigger and could be held and aimed (Figure 4). The modern gun was more or less complete by 1600, and a skilled musketeer could fire off one round about every two minutes. Matchlocks and flintlocks were both in use around the same time, differing only in how they produced the energy to ignite the gunpowder. Matchlocks had a slow-burning wick that was lowered to the firing pan to ignite the powder. Flintlocks worked by the production of a spark with a flint stone.
These were then the two forms of the gun that migrated out of Europe on trade routes and with colonizers. They were not particularly fast or accurate or easy to use, but they transformed violent conflict, first in Europe, then in North America and Africa, and eventually around the world. People interacted with the gun in different ways in different places. While it might seem as though a gun would "naturally" dictate its own use—as though a gun has an automatic, transparent, proper way of being held, aimed, fired, and understood—this is not what happened.

Systematic drill was one European response to the problems posed by the new gun technology. The procedure for loading a matchlock involved something between eight and forty-two steps, depending on how you divided the motions. Having a physical plan for the motions of hand, arms, and head, consistently reinforced by drill, enhanced the likelihood that young soldiers could keep their composure under fire. Drill was an old idea, used in ancient Rome. It was reinvented and repurposed in the late sixteenth century by Maurice of Nassau, Prince of Orange. He was the captain general of Holland and Zealand between 1585 and his death in 1625, and in that role he proved to be an unusually effective battlefield commander. It was not entirely new to train recruits. Armies had always done so when recruits first arrived for service. But Maurice viewed drill as a regular and continuing duty to be engaged in at all times, keeping soldiers prepared for each step. His soldiers made their movements in unison in response to shouted commands. Constant drill and superior command-and-control under Maurice's guidance eventually made possible what historian Geoffrey Parker has called "a production line of death."14

The Roman past provided Maurice's models. He read texts that had been translated into Latin in the late fifteenth century that were based on second-century Greek accounts of how Roman armies were organized and trained. Greek authors focused on the efficient handling of weapons and on the importance of well-planned, sequential word commands that could enhance discipline. Maurice adapted these ideas to discipline the use of guns on open European battlefields. The practices he promoted made portable firearms more efficient and reliable. Maurice was also deeply concerned about spare time, which he considered dangerous: he had protocols for keeping young men busy digging, drilling, and cleaning. Larger armies required more systematic training, and controlling so many young men required at the very least keeping them busy. Drill also facilitated
a technique called the volley, in which a front line of troops fired in unison and then quickly fell back to let the next line, loaded and ready, fire quickly again.15 In a fully reformed Maurician force, armies were organized in small tactical formations in groups that could be easily controlled by a single voice. Pikemen, musketeers, and horsemen were integrated on the field, with the pikemen surrounding and protecting the relatively vulnerable musketeers (who took a while to reload). Troops were expected to maintain their formations in active battle and to execute commands without reflection.16 It was a kind of choreography of technological efficiency, with social and bodily control understood as a regular part of military service. Drill made individual soldiers cogs in a tightly controlled machine, accepting commands with (ideally) blind obedience. The rules of drill even dictated stern facial expressions. In the course of manual drill, every soldier was supposed to look and perhaps feel a certain way.17

Manual drill in the Maurician style became the norm in forces from Russia, Spain, Sweden, and Italy in the seventeenth and eighteenth centuries. Familiar images of European battlefields, with soldiers all in a row confronting the enemy on open ground, reflect this style of warfare. But relationships between the gun, the body, the enemy, and the field of battle were not fixed by the technology itself and could be understood in very different ways in different places. As Patrick Malone and David J. Silverman have shown, Native American groups in the seventeenth century interpreted the gun in ways that reflected their experiences with bow and arrow.18 While European gunners marched across open battlefields performing drill in unison and probably not aiming their weapons (there was no command to aim in Maurician drills), Native Americans hid behind trees and rocks, waited in ambush, and aimed carefully to hit individual people. The colonists were shocked and criticized their "skulking way of war" as cowardly and unmanly. Later, the colonists themselves adopted these same strategies of forest warfare when they faced the British in the American Revolution.

It is perhaps a standard observation in introductory American history textbooks to note the British complaint that the unruly Americans in the Revolution were hiding behind trees and aiming their guns at individual persons. This was not consistent with the standards of European warfare. It was seen as cowardice. But these were exactly the same complaints that the
early colonists had lodged against the Indians. Essentially, Native American ways of war were incorporated effectively into the military strategies of the new nation.

Malone's intriguing study of the exchange of technologies between British colonists and Algonquian residents explores how Indians adopted European technologies and the British in turn adopted Native American ones. Snowshoes and nocake (a form of ground corn useful in the field) had practical advantages for colonists in the New England environment. For their part, Native Americans took a keen interest in guns and proved to be far more proficient in their use than most of the colonists were. The Indians were skilled at warfare with bow and arrow and their marksmanship translated quickly to the musket. Conversely, for reasons of class, most of the colonists coming from Britain would never have owned a gun. Selling guns to the Native Americans was of course against the law in French, Dutch, and English territories in New England, but traders sold them to Indians anyway. And as Malone shows, Indians quickly demonstrated their preference for flintlocks over matchlocks. They would pay more in trade for flintlocks, and choose not to buy matchlocks. Matchlocks produced a burning smell that could lead to the detection of a person hiding behind a tree or crouching behind a rock. Such a smell mattered not at all on an open European battlefield in disciplined (and highly visible) formation. But in forest warfare, if one wished to be concealed, it was a disadvantage. This was a form of technological choice that those selling guns to the Native Americans noticed and commented on.

Some colonists recognized Native American hunting skills and came to rely on Indians for game and furs. But the fur trade and the European flintlocks that fueled it had devastating effects on indigenous groups in North America. As Silverman shows, they led to increased violence between native groups and to a new slave trade in which captured Native American enemies were sold by other Native American groups to colonists.19 Many smaller and less well-armed Native American groups were culturally erased by other Native American groups, using some of the same methods common in European conquests. These included forced migration, loss of community, loss of language, and social reconditioning. Heavily armed Iroquois enemies, for example, forced the Susquehannocks in Maryland and Pennsylvania out of their territory along the beautiful river now named for them, the Susquehanna. As their numbers collapsed in the wake of smallpox epidemics
and threats from surrounding Indians, survivors were moved to Iroquois communities in what is now New York State, to be given new identities as Iroquois. Essentially, Silverman shows, they ceased to exist as a cultural group. Firearms violently transformed internal Native American warfare and facilitated such erasure.20 But even well-armed Indian groups found that their skills and marksmanship were not enough to save them. Estimating population numbers for pre-Columbian North America is notoriously difficult, but there is general agreement that at least 90 percent of Native Americans in North America had been wiped out by disease (mostly smallpox) and violence by about 1700. Europeans brought guns, germs, and steel, as Jared Diamond proposed.21 They also brought a commitment to control the wealth and resources of the new lands and to remove from those lands anyone who already lived there and threatened that control.

The gun had a somewhat different, though equally interesting, fate in Japan. Firearms were almost certainly known in Japan almost as early as they were known in China, but they were not widely adopted in Japanese warfare until the sixteenth century. The European-style arquebus, an early matchlock firearm, came to Japan on a Chinese trading vessel in 1543. Three Portuguese adventurers, picked up by the Chinese at the Portuguese colony established in India in 1510, brought two muskets and ammunition to Japan, and they soon shot and killed a duck. This impressed the local chief, who asked for shooting lessons and bought the guns. He ordered his chief sword maker to start making guns, and within a decade, guns were being made all over Japan. A Japanese general in full armor died of a gunshot wound in 1560, only seventeen years after the European gun arrived in Japan.22 By 1550, training and drill with the new weapons were being practiced by Japan's peasant soldiers. Presumably, shaping Japanese responses to the gun were the internal dynamics of a prolonged war, with many feudal lords vying for control of the country and with the shogun and the Emperor struggling to stabilize power.

It's important to recognize that Japan was a major weapons producer. It sold its remarkable swords to China and other countries throughout the sixteenth century. But the European-style gun solved an internal problem at a critical moment in Japanese history. At the famous Battle of Nagashino in 1575, Lord Oda's 38,000 men included 10,000 with matchlocks. Lord Oda himself fought with a spear, as befitted his high status. Only peasants fought with guns. Less than a decade later, by 1584,
several battles in Japan had even taken on a World War I–type stalemate. No lord wanted to send his troops into massed volley fire, so both sides dug in and stayed put.23 But guns relatively quickly fell out of favor. Japanese gunmakers had adapted the European technology and mastered its complexities. And in a revealing twist, controlling the gunmakers became a way to control the technology. In 1586, the regent of Japan, Lord Hideyoshi, undertook a form of covert, socially mediated arms control. He announced that he was going to build a statue of Buddha that would be enormous and that he would need iron from all the weapons in Japan. He asked all citizens to donate both their swords and their guns to this spiritual project. The Buddha, to be built in Kyoto, provided a reason to collect the weapons.24

As Japan abandoned its imperial ambitions in Korea, the Tokugawa shogunate began to stabilize relations between the feudal lords and to pacify Japan. Controlling guns became a part of that stability. Guns became a state monopoly, and when the state reduced the number of guns that it ordered, it compensated the gunsmiths by having them make swords instead. Modest orders for guns each year kept a few gunsmiths busy, but guns became almost entirely ceremonial in Japan, used in processions. The gun did not exactly disappear, but its status and meaning changed. Like the foreigners who were simultaneously expelled from Japan, a foreign technology was more or less expelled as well. Kleinschmidt attributes this change to the high "social cost" of the gun, including the demands of drill and supply, and points out other situations where the gun was relegated to a lower status, reduced in numbers, or removed from battlefields. "The social cost of the deployment of portable firearms seems to have been considered too high in Japan, for, after extensive use of portable firearms for about two generations in the sixteenth century, they were banned from the arsenals," and "similar evidence has been recorded from late sixteenth-century China. Again, portable firearms were used together with manual drill, the rules for which were laid down in drill manuals. But portable firearms failed to achieve tactical significance in warfare [and] other weapons existed in East Asia which offered tactical and strategic alternatives."25

The elements of the Japanese case that bear emphasizing are as follows. First, Japan was perfectly capable of producing a large number of high-quality and effective guns. Second, the elimination of the gun was part of a broader effort to remove outside, foreign technologies and ideas (including, for example, Christianity) from Japan. And finally, the process of
removal was handled by the control of the individual experts who knew how to make guns. In the short term, paying inflated prices to support the gunmaking families as they made swords might have been costly. But in the long term it helped eliminate a technology that for various reasons was no longer congruent with state goals and social order in Japan. In the nineteenth century a newly "opened" Japan (opened in July 1853 by US Navy Commodore Matthew Perry, who was seeking to establish regular trade relations there and whose guns played a role in Japanese willingness to talk) rapidly took up industrialized military technology with a vengeance. Japan became a world naval power in a very short time. But for almost three centuries, a time of long peace, guns were virtually unknown in Japan. The story of the rejection of the gun, as told by Noel Perrin in his engaging 1979 book, Giving Up the Gun, has been reinterpreted by later historians.26 The lessons Perrin draws out—that the Japanese rejection of the gun means that it can perhaps be rejected elsewhere, for example, in the contemporary industrialized world—may be too simple. But the details of his story hold up. Japanese armies used European-style guns extensively, and then stopped doing so when state policies changed.

As Brenda Buchanan has noted, the idea of the "gunpowder empire" was introduced nearly forty years ago by Chicago historians Marshall G. S. Hodgson and William H. McNeill. It involved relatively little explicit attention to gunpowder itself. "It has become a useful shibboleth," Buchanan proposes, "a password requiring little or no explanation."27 I admit to having wondered myself why the British Empire was not a gunpowder empire. But the term is not applied to European empires that were forged with guns. It refers only to Islamic military patronage states—the Ottoman Empire in Turkey, the Safavid in Persia (Iran), and the Mughal in India—that drew on gunpowder technologies to control large, diverse tracts of land and populations.28 All were empires that brought together many languages, religions, and ethnic groups, and they enforced strict hierarchies in their military services. Keeping these populations under control was critical to imperial order. And keeping the large multiethnic armies they depended on supplied required money, administrative support, and consistent training. All emerged between 1400 and 1600 as the European gun came to be embraced around the world.

The Turkish Ottoman Empire was one of the earliest and longest lasting. Like the Japanese, the Ottoman army rapidly adopted and mastered
European military technology. Handguns, field guns, and siege guns were all in use by Turks from about 1520. The Ottoman Empire was built around its army and engaged in rapid expansion. It was at its height in the sixteenth and seventeenth centuries when it controlled Egypt, North Africa, the Balkans, and Eastern Europe. It was a stunningly successful empire that endured for 400 years, during which the Ottoman army was the most dangerous military that European powers faced.29 By one theory, what brought it down was the cost of trying to run a dispersed, polyglot empire like a modern state. The difficulties of providing education, establishing mass production, and meeting the needs of citizens perhaps weakened the Empire, and by the late nineteenth century the Ottomans were borrowing money from the British to sustain their hold on vast territories that were restless and ready to be independent. After the 1914–1918 war, World War I, the empire fell apart.

The Safavid Empire in Persia also relied on the use of gunpowder and quickly adapted European technologies to meet its own priorities. During the first half of the sixteenth century, Shah Ismail I led his Safavid warriors to found a new Persian empire in Iran. The Safavid Empire had characteristics identified as classic in a gunpowder empire: a highly centralized state, fully industrialized production of arms, and highly trained and disciplined armies. It fell apart after about a century of expansion and conquest.

Similarly, after 1483, the Mughals in India adopted muskets, heavy artillery, and cannon technology, as well as styles of drill common in European armies. Eventually the Empire included all of northern India and part of central India. It was a conquest based on military discipline: competing powers in India were not yet organized or wealthy enough to control and supply a large army or infantry, and like the British, Mughals seized land at gunpoint. Russia followed a related pattern of expansion in the same period, from 1440 until about 1750. This pattern included the rise of a highly centralized state, heavy dependence on artillery, the control of massive territory, and significant economic power. The Russian army became a permanent professional force, trained in the use of artillery and muskets, which led to victories against Sweden and Poland under Peter the Great from 1682 to 1725.

Perhaps the gun promoted European social and political order no matter who was holding it. Systematic large-scale, defensive use of this weapon required reliable, consistent gunpowder production, storage, and transportation. It also required standing armies, large numbers of men who could be
consistently called on for the virtually permanent pursuit of war. These men had to be cared for, clothed in appropriate uniforms and boots, and trained in drill, discipline, and especially nationalism.30 Gross has suggested that clothing can be understood as a military technology that reflected the authority of scientific expertise.31 Certainly the new clothing protocols of modern standing armies reflected industrial technological capabilities—the weaving and milling practices of the industrial revolution. The experience of serving in the Armed Forces even came to be seen as part of a kind of civilizing mission: young men from diverse regions could be brought together and taught a similar vocabulary of meaning, similar values, discipline, and a faith in the modern state. They learned how to walk, dress, speak, and believe. Even into the twenty-first century, military service has been understood to produce "modernity" in places around the world, one soldier at a time. While these gunpowder empires differed in many ways, they all reflected the impact of the new technology. Some administrative practices were more conducive to the effective use of the gun, and these practices in turn shaped social and political order.

The British gun industry also fueled a different kind of empire: the slave trade. After 1690, traders in Africa were often paid in "slave trade guns" made in Britain—in Birmingham, Bristol, and Liverpool. Like other so-called gunpowder empires, the slave exporting states on the Gold Coast and the Slave Coast became highly militarized, with relatively large armies able to seize their enemies in terrifying night raids and sell them into slavery. By 1700 European demand for slaves was skyrocketing, and the Akwamu, Denkyira and Dahomey, and Asante groups took advantage of this demand to acquire guns. African traders demanded guns, powder, and ammunition in exchange for slaves and favored firearms over almost all other trade goods. The gun increased the efficiency of gathering enemies, who could be seized with flintlock muskets loaded with shot instead of a lethal ball. The goal was surprise, disruption, and taking victims alive and unhurt. The Dutch West India Company by 1700 could not meet the demands of traders along the Gold Coast who wanted guns. By one estimate, British guns amounted to about one half of the total firearms imported into West Africa.32 This flood of guns into the Slave and Gold Coast after 1650 brought about dramatic changes in interstate warfare within Africa. It shaped the political reorganization of African states, such that historians can reconstruct a high
correlation between the increase in gun imports and the increase in the slave trade. Some African states began to develop relatively large armies—perhaps 12,000 or even 20,000 musketeers—and states with more firepower could build on this advantage to dominate slave exporting and profit from eliminating their enemies.

What I want to suggest is that the European-style gun was adopted around the world roughly from 1450 to 1700, and each group that came in contact with this style of weaponry took it up in ways that reflected their culture, their priorities, and their unique geographical and military challenges. Some groups were reluctant but felt compelled to acquire European guns for their own defense, and in the process of attempting to repel the European, they became more European. Other groups quickly recognized the value of the technology but used it differently because of their own traditions. The musket of the sixteenth century was a mobile piece of culture, a manifestation of European technological skill, and a critical element of a system that had many consequences that were probably not obvious to those living through this arc of tremendous change. It's not that the consequences were the same everywhere, but the machine had intrinsic power to damage flesh and kill game—it was a biological technology. It produced a bodily experience of wounding that performed political power. Rejected or embraced, desired or feared, the gun acted in the world.

Given this power, it is interesting to contemplate what holding a gun meant to those who were asked to use it to kill. It is now clear that many of those holding guns, for centuries, probably fired their guns above the heads of the enemy. Mock firing was not discovered or named until the mid-twentieth century. But after its existence was understood and recognized as a possibility, evidence of its earlier presence could be reconstructed in the complaints of generals, the rules for training soldiers, correspondence records, and the physical evidence drawn from guns dropped on active battlefields. While the quantitative data that originally animated an institutional response in the 1950s was probably not accurate, the phenomenon was real enough to provoke massive changes in training, and therefore changes in firing rates in military forces and security forces around the world.

The discovery of mock firing was a surprise. It was the result of a survey during World War II of US soldiers who had just experienced fighting on an active front. Army historian General S. L. A. Marshall was attempting to
document the experience of war in real time. He had a series of questions for soldiers who had seen active battle, and his interviews were done shortly after the battlefront experience, sometimes only two or three days later. One of his standard questions was: When did you begin to fire your gun?

Marshall had been asked to conduct these interviews as part of the (astonishing) record-keeping apparatus of the US Army. He was interested in the experiences at the frontline of those who had been in contact with the enemy. In modern warfare the teeth-to-tail ratio is generally imbalanced. About two-thirds of those who served in World War II did not actually engage in active combat. The experiences of the relatively few who did were of obvious and compelling historical and strategic interest.

Marshall and his team conducted interviews with thousands of soldiers in more than 400 infantry companies in Europe and the Pacific, all after they had been in close combat with German or Japanese troops. His later book about these interviews proposed that only 15 to 20 percent of these soldiers who actually saw combat fired their guns at the enemy.33 These non-firing soldiers, he said, did not run or hide. In many cases, they took significant personal risks to rescue fellow soldiers, bring forward ammunition, or deliver messages to the front. But they did not fire their weapons at the enemy even when faced with repeated charges.34
Later analysts and critics were skeptical about some of Marshall's quantitative claims and concluded that his record-keeping practices were problematic.35 His research and findings have therefore been questioned, and he may not have had the data to back up his statistics. Yet his findings had significant, practical results. Leadership in armed forces around the world began studying what Marshall called "ratio of fire"—meaning the percentage of combatants in any active military engagement who actually fired their guns, whatever type of weapon they were holding. His observations provoked questions about troop training that led to policy changes in world militaries and even police forces.

Questions of troop training had of course been relevant for centuries. In the nineteenth century, the switch to breech-loading firearms led to more dispersed lines and an exponential increase in firepower. Military leaders and theorists worried then that soldiers who were not shoulder to shoulder—who
were spread out rather than packed together—might not charge forward in a disciplined way. They promoted the idea that soldiers had a particular, powerful "elan" or "esprit de corps" that kept them under control even on a chaotic battlefield. Yet even their concerns suggest that the possibility of quiet, covert resistance was recognized. While it seemed as though the necessity of killing would be obvious to a soldier who was attempting to defend his own life and to protect his comrades, some soldiers resisted. The usual two options—fight or run—were joined by a third. This was a socially invisible form of technological choice: a soldier could seem to fire the gun without doing so, or perhaps actually fire the gun while aiming above the heads of enemy soldiers. It was an astonishing and unexpected result. Why had they not fired?

Marshall's findings were taken very much to heart by the US Army, and a number of training measures were instituted as a result of his suggestions. According to later studies, these changes resulted in a firing rate of 55 percent of soldiers in Korea, and 90 to 95 percent in Vietnam. The training methods that increased the firing rate from an estimated 15 percent in World War II to 90 percent in Vietnam (and 100 percent today) have been referred to as programming or conditioning. The FBI became interested as well and began to study non-firing rates among law enforcement officers in the 1950s and 1960s—and found similar results.36

And as a result of Marshall's work, historians and Army researchers (in the US and elsewhere) began to try to understand just how long this kind of invisible resistance, for whatever reason, might have been a problem. Commanders as early as the seventeenth century complained about soldiers who didn't fire at the enemy. Similar suggestions came up in a survey of French officers in the 1860s, who said some soldiers did not fire their guns. Indeed, military historians had reported ineffectual firing throughout history, and in the late nineteenth century a careful reconstruction of surprisingly low killing rates in the Napoleonic wars suggested at least the possibility that many soldiers had not been firing their guns, through many major battles.

A 1986 study by the British Defence Operational Analysis Establishment examined the killing effectiveness of military units in more than a hundred nineteenth- and twentieth-century battles. They compared real historical mortality data with hit rates from simulations of the same battles using pulsed laser weapons. Modern test subjects were using simulated weapons and could neither inflict nor receive actual harm from the enemy. Concerns about their
own vulnerability, or about hurting another person, were presumably absent in such a trial. The test subjects killed many more enemy soldiers than had actually been killed historically. The research group in 1986 concluded that battlefield fear alone could not explain such consistent discrepancies across time, place, and circumstance. They proposed a widespread unwillingness among soldiers to fire guns at other soldiers. This unwillingness, this quiet and false performance of battle, kept the actual historical killing rates significantly below potential levels.37

Perhaps the most remarkable data on firing rates came from Gettysburg. After the Battle of Gettysburg, 27,574 muskets were recovered from the battlefield. We know this because of the record-keeping propensities of the Army. Of these, nearly 90 percent (24,000 muskets) were loaded with shot when they were recovered. Of these loaded muskets, about half were found to have been loaded more than once. Of these, 6,000 had between three and ten rounds stuffed into the barrel. One weapon had been loaded and not fired twenty-three times.38

The Civil War weapon was usually a muzzle-loading, black powder, rifled musket. To fire that weapon, a soldier would take a paper-wrapped cartridge consisting of a bullet and gunpowder packaged together. He would tear the cartridge open with his teeth and pour the powder down into the barrel. He would then set the bullet into the barrel and push it all down with the ramrod in order to prime the weapon. This was a lot of work. So why were there so many loaded weapons left on the battlefield? Why did at least 12,000 soldiers misload their weapons at least a second time during combat? Some soldiers would of course have been shot while charging the enemy, and they might have been running with a loaded gun. But loading a gun required about 95 percent of the time involved in managing a gun, and firing it required about 5 percent of the time. Most guns left on the battlefield should have been empty.
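These numbers reward a quick back-of-the-envelope check. Below is a minimal sketch in Python, assuming (my assumption, not a claim from the sources cited here) that a musket dropped in battle is roughly a random sample of weapon states, and reading the 95/5 time split as an upper bound on how often a weapon would be found fully loaded and ready to fire:

```python
# Rough check of the Gettysburg recovery figures quoted above.
# If ~95% of a soldier's handling time went to loading and ~5% to firing,
# a weapon dropped at a random moment should only rarely be found fully
# loaded and ready to fire.
recovered = 27_574        # muskets recovered after the battle
loaded = 24_000           # recovered fully loaded with shot
multiply_loaded = 12_000  # about half of the loaded weapons

observed_loaded_share = loaded / recovered  # ~0.87
expected_loaded_share = 0.05                # generous upper bound from the 95/5 split

print(f"observed share loaded: {observed_loaded_share:.0%}")    # ~87%
print(f"expected share loaded: {expected_loaded_share:.0%}")    # 5%
print(f"loaded more than once: {multiply_loaded / loaded:.0%}") # ~50%
```

The gap between the observed and expected shares is the anomaly: weapons stalled in the loaded, ready-to-fire state far more often than random interruption of the loading cycle can explain.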
Marshall proposed that soldiers in battle became conscientious objectors, passively and quietly refusing to kill or try to kill enemy soldiers. I suspect that the reasons for individual choices might have varied quite a bit, and rather than resulting from some fundamental and intrinsic resistance to killing, could have been the result of many different calculations, including but not limited to a reluctance to fire a weapon at a person much like oneself—young, male, possibly forced into service. But in any case the fact that so many soldiers did choose not to fire, over centuries, provides some insight into the differential impact of technologies and the crucial subtlety of technological choice. Individual users of any technology make choices about how the technology will be used by them. And sometimes invisible choices are best of all because they conceal from those in power what is really happening.

If we think about the story of the gun and its historical impact—its roles in the creation of empires, the slave trade, and military chaos in Africa and North America, the Industrial Revolution, and even in the moral negotiation of the individual confronted with a battlefield choice to fire or not to fire—we gain a perspective on our own complicated uses of the technologies we take for granted, ignore, are captured by, or leverage individually in unexpected ways—perhaps to mislead those around us.

Did gunpowder and the gun lead to the rise of the modern state? Did they provoke colonial empires? Did the gun produce the slave trade? Jared Diamond in his popular study proposed that "guns, germs, and steel" shaped political relationships around the world.39 Diamond's intent was to undermine the idea that European conquest can be taken as a transparent sign of European biological and cultural superiority. He attributed European successes in taking land and resources from others partly to differences in the geographical resources available in the region, particularly the kinds of animals that could be domesticated and the options for crops and food. He also highlighted the impact of infectious disease and the biological power of the gun. While Diamond's account is persuasive, it does not generally consider the many different ways that the gun acted in the world, and he therefore elides questions of technological choice. As I have noted here, guns could be used by people who were hiding behind trees or standing together in an open field. They could be fired in a volley or in single shots, aimed at individual persons or fired generally in the direction of massed enemy troops. They could even be fired so that the bullet was more likely to pass above such enemy troops, apparently a relatively common choice made by individual soldiers. Using a gun could involve rules about posture or about class. Some gun manuals in Japan in the fifteenth century suggested that those holding a gun should stand elegantly, elbows pulled in to avoid appearing coarse or common. In this prescription it is clear that a person holding a gun was expected to be seen by someone. European drill manuals placed men holding guns shoulder to shoulder with other men, all of them carrying out stylized motions of
loading and firing. Drill established a relationship between the gun and the body, and all fired at once in a volley and then dropped back. Native Americans, as they gained access to European guns through trade, had rather different ideas. They crouched and hid when they were holding a gun. Accustomed to forest warfare with bows and arrows, they embedded the new technology in their own systems of practice. The proper way to hold a gun, for the Indians, was to hold it so that it could not be seen.

There were also different rules about who should hold the gun at all. Those who sold or traded guns to Native Americans in the early colonies were breaking the law. In Japan, guns were allowed for peasant fighters who did not have the skills for the real bravery involved in battle with swords. In Britain, the gun was only for wealthy landowners. Most of the British colonists who came to North America had never handled or owned a gun. Even poachers in Britain didn't use guns because they were noisy and called attention to the activity of poaching. At the same time, elites in Britain initially considered the use of the gun in hunting to be shameful.

All these shaded nuances of responses to the gun are technological choices. They involve decisions about what aspects of the gun are important, valuable, or problematic. There's a solid, real thing in the middle of these choices, a heavy, difficult-to-load, inaccurate, and dangerous firing machine. But this thing, despite its solidity and heaviness, does not wholly dictate how it will be seen and used. Variations in contemporary gun laws and practices in different nations today reinforce the point. How the object will act in the world and what its consequences will be depend on how it is managed in any culture. Some states carefully control access, some do not. The story of gun technology suggests that even a relatively simple military technology is socially complex, negotiated in ways that shape its consequences. If the gun caused the rise of the modern state, it did so as a result of its interaction with established, pre-existing fields of culture and power. It was a technological intervention through which European powers reorganized themselves, and eventually, the rest of the world.
2 The Logic of Mass Production
THE PROCESSES CENTRAL TO INDUSTRIALIZATION—INTERCHANGEABLE PARTS, EFFICIENCY, AND RATIONAL
management—were also central to the rise of technoscientific warfare. The logic of mass production was also the logic of total war and eventually, the logic of indiscriminate urban bombing—the logic of thousand-bomber raids and city firestorms. By the 1940s, civilian workers produced the airplanes that legitimated air power strategies that bombed civilian workers' neighborhoods.

Industrialization is a broad and vague term, first used around 1906, roughly describing the shift after 1780 from an agricultural to a manufacturing economy, and from hand-made craft production in the home to machine production in the factory. Most historians locate its origins in Britain's mills or in armories for gun production. Henry Ford's early twentieth-century assembly line is often the narrative end-point: Ford's car production practices and strategies fully realized the potential benefits of industrialized production. Generally seen as a process that brings people and societies into modernity and into global networks of capital and manufactured goods, industrialization produces economic winners and losers. Its deeper costs are commonly
calculated as psychological: anomie, alienation, hopelessness in the face of competition. Its benefits are material prosperity (for some).

Historians have been attentive to industrialization for good reasons. It was not an innocent project of human advancement. Rather, I would characterize it as a process of entanglement. It played a role in moving every individual further and further into complicated networks of rules, machines, labor management, social discipline, emotional investment, and bodily discipline. Along the way it effected massive changes in human experience, birth to death, hearth to factory, statehouse to battlefield. While it was roundly criticized by some (notably Karl Marx in his writings after 1867), it also seemed to animate a babble about productivity, progress, and improvement—a constant commentary by elites who appreciated its various wonders. This babble is best understood not as a metric of real benefits but as one of the ways that industrialization generated political power. Industrialization bound individuals closer to the state and to imagined communities that extracted certain emotional investments (consumer desire, loyalty, nationalism, and some forms of courage). It also produced total war and the concomitant end of what became a legally negotiated social role, "the civilian."

In this chapter, I explore these somewhat messy threads. My account here is not definitive. It is rather an effort to ask some different questions about a very large-scale process that has had profound consequences for modern life.

Most accounts of industrialization focus on a few well-recognized technologies including steam power, railroads, the telegraph, the rifled musket, and the Minié ball, and on practices of production such as interchangeable parts, the assembly line, and mass production. The nineteenth century was the period of the most rapid change and the most rapid development of new systems and technologies. For the armed forces, these changes reshaped administration and supply. European armies grew still larger; promotions in military forces became more egalitarian and meritocratic; and the geographical range of armies expanded with new railroad systems that made moving large numbers of people and supplies easier. The shift from hand-made, artisanal gun production to mass production made it easier to rearm large armed forces in shorter time periods. Bousquet, in his account of the rise of scientific warfare, emphasizes the rise of a mechanistic world view after 1700 (the idea of a clockwork universe) and the emerging idea that war was "the principal vector for the realization of historical destiny."1 He emphasizes the critical roles of concentrated energy,
what he calls "thermodynamic warfare,"2 and the profound problems of logistics that new ways of industrialized warfare produced after 1800.

For much of the history of armed conflict, an army in motion would have had limited contact and communication with any kind of political center. The center might establish goals and develop plans. But commanders in action, confronted with critical decisions in real time, were often on their own. Supplying these traveling armies was similarly vexed. Gunpowder sometimes went bad, or ran out. Food for both men and horses had to be acquired locally. Pillaging was part of the supply plan and was expected (and feared) by the locals.

By 1850, most European states had regular taxes and established systems for paying troops. Rules emerged for promotion by merit rather than birth. Most armies in Europe began to wear uniforms of one kind or another, commonly made with mass-produced textiles, in systematic, reliable, meaningful colors and fabrics and designs. Weapons also became more or less standardized. Relatively large numbers of weapons could be produced in shorter times. Train supply lines and telegraph communications linked the moving army to the political center. Professional arms dealers met the needs of growing armies.

Eventually these new systems of warfare, with their complexity and their scale and their high communication levels, came to be seen by European powers as posing new problems for the rules of war. In a series of meetings and negotiations after 1874, European nations began to openly discuss limitations on technology; new protocols; and rules for prisoners, civilians, spies, prisoner exchanges, and arms races in particular. These discussions reflected the eighteenth-century movement to humanize war through the application of reason.3 At the Brussels Declaration of 1874 and the Hague Declarations of 1899, leading European powers struggled to collectively manage the highly destructive powers of modern technology. A crucial part of these discussions focused on the nature of civilians. The idea of the negotiated, formal, legal civilian—the person who possessed some inherent consensus right not to be a target in war—emerged with the same modern technology that facilitated the civilian's greater vulnerability. It is important to remember that in many military systems, over the span of human history, noncombatants were routinely targeted. In early Roman warfare, noncombatants within the Roman world were commonly spared, but noncombatants elsewhere were not. These distinctions were preserved roughly in some forms of European warfare (where civilians in neighboring European countries might be spared) but almost never in colonial wars where European
forces often sought to exterminate people whose land or resources they wanted to control. And even within Europe, in early European warfare, villages were burned and castles crowded with women and children destroyed. In places all over the world women and children were routinely killed or taken as slaves or captives in the course of violent conflict. It is not possible to argue that human warfare, however defined, has had some overarching or consistent tendency to exclude certain kinds of people from becoming the focus of violence.4 This makes the nineteenth-century debate more interesting, for it occurred at the very moment when killing factory workers (who made the technologies that permitted an enemy to continue to wage war) was beginning to make strategic sense. In fully industrialized, "total" war, the factory worker, the farmer, and the cook are all working every day for the state's military goals. All citizens provide the materials needed to continue the war, from bread to fabric. Therefore all are effectively mobilized and perhaps for this reason, legitimate military targets. The civilian as a protected legal entity emerged at the very moment when targeting civilians was a rational strategy.

Just as the meanings of civilians were shifting, so too were the meanings of soldiers. Coincident with the rise of industrialized war, the meanings of the soldier began an important shift, from expendable, cheap, and unimportant to recyclable and valuable. The formal structures of military medicine were rudimentary until the nineteenth century and were not really fully mobilized and institutionally sophisticated until the First World War. A wounded soldier in earlier times might be helped by his comrades, but his care was not necessarily the business of the army itself. Death of soldiers on the field was commonly viewed as the natural cost of war. But rational war was expected to be cost-efficient, and proponents of military medicine—most decisively the British nursing pioneer Florence Nightingale but also many others—argued that it was not only more humane but also more cost-effective to treat and heal an injured soldier than it was to waste the resource or to retrain a new one. The humanitarian benefits of battlefield care were intertwined with rational resource management. The body of the soldier had to be recycled efficiently just as materials were recycled and other resources were effectively and rationally exploited. The rise of professional military medicine produced the "two masters" problem: any physician on a front line could be expected to experience at least occasional tension between medical obligations to a particular patient and institutional obligations to the armed forces to quickly send soldiers back into battle.
It is also not a coincidence that the Unknown Soldier was recognized first in the twentieth century, in the wake of the devastating carnage of the First World War, the first war in which many combatants wore stamped metal identification tags marked with their name, identification number, and even religion. Many soldiers in earlier wars in Europe had been relatively unknown, in the sense that they were less likely to have a formal state identity, a centralized, standardized textual record of existence, and a paper trail attached to their own embodiment. The Spanish empire tracked individuals more closely than some other states. But in many places in Europe and around the world, births of individuals were recorded only in clerical sources, parish records, family bibles, or tax records prepared by landowners. Most populations were not the focus of systematic state record keeping prior to about 1750, and such practices escalated in Europe only after about 1830.

Drew Faust has noted that the naming and counting of military dead are of recent origin, emerging in the United States only in the wake of the US Civil War. In the 1860s, the policies of both Union and Confederate forces did not require formal notification of family members about wounds or death. Civil War soldiers wore no dog tags or other forms of identification; military forces kept no records of next of kin. At the end of the war, almost half of the Union dead and more than half of the Confederate dead were unknown, unrecorded. But in the wake of the war, the United States moved toward a policy that led to a massive effort to identify and rebury the Union dead. "By 1870, the bodies of almost 300,000 soldiers had been located across the South and reinterred in 73 national cemeteries."5

As the state management of citizenship shifted in the eighteenth and nineteenth centuries, the formal apparatus of stabilizing individual identity, birth, and death flourished. Someone born in 1600 might well have been institutionally invisible—unrecorded in state archives or school records, without an official number, proper name, or legal documentation. Some people could live their entire lives outside of formal record-keeping systems. But gradually a complex system of identity documentation made each person a legal entity, recorded and known in ways that earlier populations were not. The monuments to unknown soldiers produced around the world after 1918 reflected this twentieth-century social and political reality: by 1914, virtually all soldiers participating in the war were known. They had an established, documented identity. Their disappearance in the mud and violence of the
Today unknown soldiers are again, in new ways, rare. The collection of DNA in many armed forces means that identity persists even when very little bodily material remains. Technologies of identity have created the contemporary state of being known always and everywhere. In industrialized society, identity is industrialized as well. The process of industrialization then involved feelings, machines, and labor. The efficient management of bodies took many forms. These included systems of formal health care in the armed forces, of stabilized identity, and of more efficient production and destruction. The history of the rise of industrialized war can be understood as a part of the history of the body. It has involved discipline, record-keeping, and healing and destroying human bodies in planned and systematic ways. Highly educated armed forces also became the norm in the nineteenth century, as a part of this industrialization process. Military academies trained engineers and artillerists, and then the establishment of War Colleges produced trained staff officers. These new forms of military education paralleled the development of other forms of education such as universal and mandatory public schooling. Conscription and military service came to constitute a form of socialization and identity formation beneficial to the sovereign state. This was particularly true in national contexts characterized by dramatic geographical differences. Young men (men only, until very recently) participated in the armed forces as a part of a "modernizing" process, with individuals recruited from one area of the nation, sent to be trained in another, posted in a third, and maybe exercised or posted once more in a fourth before release to the reserves. In this process, regional dialects and cultures could be left behind in favor of one (dominant) national language. Through military service, individuals were thus taught to see beyond their geographic origins and to become loyal to the nation as a whole, conforming in dress, attitude, and feelings such as patriotism.
Here I consider four thinkers whose lives and ideas illuminate the logic of mass production.
The French military engineer Jean Baptiste de Gribeauval (1715–1789) imagined gunnery as the practice of cool rationality. As Ken Alder has suggested, engineering rationality is not a set of timeless, pre-existing abstractions. Rationality in gun production emerged historically, awkwardly, in fits and starts, and partly through negotiations around both labor and expertise. The French artillerists under Gribeauval's direction adopted strategies that increased the rate and accuracy of fire on the battlefield. But the interchangeable parts that promised to streamline or rationalize French gunnery in general posed problems of labor that in the end undermined the production program, as managers and artisans negotiated and disagreed.6 For Gribeauval, the pursuit of rational planning in war reflected the culture of the French artillery service to which he belonged. The service was oriented around emerging scientific ideas of reason. It was also the administrative center for acquiring new weapons. This gave him and his allies a special perspective from which to explore technological change. Engineers with the service were trained in the best European schools of military engineering, with skills in mathematics, technical drawing, and administration. The service also managed military relationships with private French cannon foundries and musket makers, and state-owned workshops. In theory, this group should have been able to realize their vision of rational gun production, a vision that included interchangeable parts and even, to some extent, interchangeable workers. Most historians track the rise of interchangeable parts to the United States in the mid-nineteenth century, but Alder explores its earlier appearance in late eighteenth-century France and its eventual rejection in a critical French network. The case of Gribeauval and his allies, and their shifting strategies of rational violence, permits us to see how modern mass production transforms social and political relationships. Beginning in 1763 the artillery service undertook a series of radical reforms under Gribeauval's leadership. One key reform focused on French cannon. This reform was more than technological. It involved reconfiguring how men cared for and used cannon, what tactics they pursued, and even how they were organized socially as a team. They learned to be bound tightly to the machines they managed, in a fusion of organism and gun. In the process Gribeauval and his team dramatically increased firing rates and kill rates for French artillery.
In 1777, the Gribeauvalists began a program to build a better musket. They invited armorers to submit designs, and chose a design submitted by a state-supported arms production workshop that was run by Honoré Blanc. Blanc claimed that his guns were the result of the experimental method. Every part had been assessed and modified through a process of testing and even discussion. Nothing was left to chance or to tradition. Thus Gribeauval and his group, in choosing Blanc's workshop, demonstrated their commitments to reason, rationality, and scientific experiment. Interchangeability is usually taken to be a way of describing parts that are so precisely made that they can be assembled without a final hand-fitting. The guns produced in late eighteenth-century France in Blanc's workshop came close. Their production was standardized through steel dies, filing jigs, milling machines, and gauges that measured proper tolerances. Each gun still required a final hand-fitting by a worker now obliged to produce pieces that matched the expected tolerance. As Alder suggests, this production process also involved standardizing the labor and the laborer. It changed the work of the artisan. Artisanal guns had been individually produced—the result of craft skills and personal strategies and decisions. The new gun production system, with its gauges and tolerances, defined conduct and workmanship. The detailed and proper "fit" of the artifact required new social relationships and new rules for the behavior of workers. This French swerve around 1777 toward interchangeable parts and the de-skilling of the labor of gunmakers was part of a larger Enlightenment project that valued technological innovation. What the Gribeauvalists accomplished was incomplete and contested, and it was driven by state bureaucrats rather than capitalist "entrepreneurs." It does not look like modern Fordist mass production—it was not oriented around increasing profits. It was also, in the end, a failure. The gun production systems promoted by the Gribeauvalists were rejected as too costly, and much of what they accomplished was historically forgotten. Interchangeable parts were "reinvented" decades later in the United States. The French engineers had developed novel techniques for drawing designs for new technology, and they had produced novel physical tools and machines to make new kinds of guns. But they ran up against social resistance on many levels, from the workers to the generals. A certain kind of technocratic ideal assumed that professional engineers would control workers through drawing, machines, and measurements.
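The underlying logic of gauge-and-tolerance inspection is simple enough to sketch in a few lines of modern code. What follows is a hypothetical illustration of the principle only, written in Python; the feature names, dimensions, and tolerance values are invented for the example and are not drawn from Blanc's workshop records.

# Gauge-and-tolerance inspection, sketched: a part counts as
# "interchangeable" if every measured dimension falls within its
# allowed tolerance band. All names and numbers are invented.

TOLERANCES = {
    "plate_length": (152.0, 0.5),   # (nominal mm, allowed +/- deviation)
    "pin_diameter": (4.0, 0.1),
    "screw_spacing": (38.0, 0.2),
}

def passes_gauge(measurements):
    """Return True if every measured feature lies within tolerance."""
    return all(
        abs(measurements[feature] - nominal) <= tol
        for feature, (nominal, tol) in TOLERANCES.items()
    )

# Two parts that both pass the gauge can, in principle, be assembled
# without hand-fitting; that is the essence of interchangeability.
part_a = {"plate_length": 152.3, "pin_diameter": 4.05, "screw_spacing": 37.9}
part_b = {"plate_length": 151.6, "pin_diameter": 3.95, "screw_spacing": 38.1}
print(passes_gauge(part_a), passes_gauge(part_b))  # True True

The point of the sketch is that the standard, not the mating part, becomes the reference: a worker's output is judged against the gauge rather than against the particular musket it will join.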
But in practice, workers resisted. Interchangeable parts manufacturing aroused powerful opposition from both merchants and artisans. It threatened established practice. Precision manufacturing was also a social discipline. Engineers committed to it had a practical interest in the improvement of muskets, but also aesthetic and philosophical commitments to mathematical reason and the scientific method. Blanc's early iteration of interchangeable parts did not lead directly, in a straight line, to mass production. But Alder persuasively shows that it did reflect the intriguing connections between mass production and military culture, something many historians have noticed. Gribeauval and his allies understood the battlefield as fundamentally rational and manageable through engineering acumen. By the nineteenth century, the humanitarian benefits of battlefield health care came to be understood as part of efficient and rational resource management. British nursing pioneer Florence Nightingale (1820–1910) was a key proponent of statistical order, rationality, and quantitative analysis. She valued efficiency in all matters and spent most of her life providing statistical proof of its value.7 The Crimean War (1853–1856), where Nightingale did her important fieldwork, was one of the earliest industrialized wars, in which railroads and the telegraph played key roles. It was one of the first wars with real-time press coverage, and reports of tragedies on the front page had an impact on strategic and political decision-making, and on reactions and support at home. Provoked by Russian attacks on Turkish fleets in the Black Sea and by fears in Europe that the Russians were planning to advance and take over the region, Britain and France came to the defense of the Ottoman Empire. The result was a brutal slaughter on all sides, in battles at Gallipoli, Sebastopol, Balaclava, and other sites. In the end, the Russian Empire was defeated. One of the journalistic reports that most enraged the British public was that French soldiers were being cared for by skilled French nursing nuns at the Turkish center at Scutari in Istanbul. Injured French troops reportedly had good food, clean blankets and bandages, and round-the-clock medical care. British soldiers by comparison, this report said, were getting no care at all, lying on the floor and dying, with no one even tending to their wounds. This portrayal provoked the Secretary of War to order Florence Nightingale to bring a team of trained nurses to the front in Crimea.8
Nightingale was a wealthy, elite young woman who had a mystical vision in her twenties that said she should serve the world. Despite objections from both her family and male suitors, she pursued this public service and rejected marriage. Her campaign to professionalize nursing—a type of morally vexed bodily labor commonly done either by nuns or prostitutes—was the form that this service to the world took. When the Crimean War began in 1853, Nightingale had acquired some medical training in Germany and was superintendent of a clinic for gentlewomen in London. She was close friends with the Secretary of War Sidney Herbert and his wife, and after news accounts suggested the desperate situation for British troops, they asked her to take over the management of nursing in Crimea. She spent most of her time during the war at a military hospital in Istanbul. Injured soldiers arriving there faced a miserable sea crossing from the battlefield to the hospital, where Nightingale found the conditions deplorable. She exploited her wealthy networks to encourage donations of needed supplies and funds. She also demanded and enforced levels of cleanliness that apparently had a direct impact on survival. The record of correspondence and reports generated during her time there is astonishing. She wrote every night, tracking every policy option and every mistake or problem. Late at night she sometimes walked through the wards checking on soldiers. She became known as "the lady with the lamp." The romantic image of her caring labor—her concern for soldiers—made her a popular heroine. She brought to all her work an intense drive, administrative skills, and the charisma that her name conferred after her work in the Crimean War, becoming a respected statistician of military medicine and gradually of public health in general. Her "rose diagrams" or polar area diagrams provided a way to see complex data from multiple sources in a single image (Figure 5). These pie charts, she proposed, were a way of making difficult-to-understand facts obvious for what she called "the vulgar public," in which she included Queen Victoria. She sent one report to the Queen, commenting "she may look at it because it has pictures."9 Nightingale embraced the new tools and methods of statistics that were becoming popular. While her formal training in mathematics was apparently superficial, her father's social connections in mid-nineteenth-century Britain exposed her to some of the leading thinkers of her day. It was a period of vibrant scientific debate and new approaches to mathematics and data. The young Nightingale met mathematician and engineer Charles Babbage, who conceptualized the computer in the mid-nineteenth century, and was a houseguest of his friend and patron, the mathematician Ada, Countess of Lovelace, who worked with Babbage's ideas to propose ways of programming information—hence the common idea that Lovelace was the first computer programmer.
FIGURE 5. British nursing pioneer Florence Nightingale developed the rose diagram as a way of picturing risks and mortality in the Crimean War. Wellcome Collection / CC BY 4.0.
In 1847 the Nightingales attended a meeting of the British Association for the Advancement of Science in Oxford, where they heard lectures by Michael Faraday, a prominent and important English scientist who worked on electromagnetism and electrochemistry. She also became close friends with Charles Bracebridge, an early enthusiast for statistics and sanitary reform.10 While it was not a disciplined education in statistics, it seems to have prepared her to learn what she later needed to learn. After her transformative management of the hospital in Istanbul, Nightingale was publicly revered. She began trying to reform the medical practices of the British Army. Army officials were suspicious of her motives and her expertise, but she effectively used statistics to allay their suspicions, developing arguments about how to keep armies healthy and then expanding to explore how to keep the public in general healthy. One of her most important special projects focused on sanitation in India. She completed statistical studies of soldiers' health in India and demonstrated that bad drainage, contaminated water, and poor ventilation were responsible for high death rates in colonial outposts. Here again she applied statistics to chart a path forward. One of her mottoes was drawn from Goethe: "It has been held that the world is governed by numbers: (be that as it may) this I know that numbers teach us whether the world is well or ill-governed."11 She had no germ theory of disease. But she had evidence nonetheless that keeping patients, nursing staff, and water and facilities clean had an impact. She was a pragmatic persuader, willing to use whatever tools might be expected to work. But she was interested in the study of numbers as an intellectual discipline in itself. She also had an almost mystical belief that through statistics the laws of the world could be determined, and that these laws constituted the laws of God. This was not unlike the ideas that animated Natural Theology, the discipline oriented around deciphering the mind of God by studying nature. Mostly homebound in her later years, she continued her work on statistics and efficiency through her eighties. She was an unusual person in many ways, though in her attraction to reason and evidence, she exemplified the rising influence of scientific rationality in the efficient management of war.
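Her polar-area construction is easy to reproduce with modern tools, and reproducing it makes the design decision visible. The sketch below, in Python with matplotlib, uses invented monthly mortality figures rather than Nightingale's actual Crimean data: each month gets a wedge of equal angle, and the area of the wedge, rather than its radius, encodes the death count.

import numpy as np
import matplotlib.pyplot as plt

# Invented monthly mortality counts, for illustration only.
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
          "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
deaths = np.array([120, 200, 350, 630, 800, 750,
                   600, 520, 900, 1100, 850, 400])

# Equal angular wedges; area (not radius) encodes the count,
# so the radius grows as the square root of the value.
angles = np.linspace(0.0, 2 * np.pi, len(months), endpoint=False)
radii = np.sqrt(deaths / np.pi)

ax = plt.subplot(projection="polar")
ax.bar(angles, radii, width=2 * np.pi / len(months), edgecolor="white")
ax.set_xticks(angles)
ax.set_xticklabels(months)
ax.set_yticks([])
ax.set_title("A Nightingale-style polar-area ('rose') diagram")
plt.show()

Encoding the count in area rather than radius keeps the deadliest months from visually swamping the rest, which is part of what made the diagrams legible to the "vulgar public" she had in mind.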
Nightingale leveraged emotional sympathies for injured soldiers to legitimate a rational plan for managing them in war. Similarly, US naval theorist and historian Alfred Thayer Mahan (1840–1914) leveraged emotional and patriotic commitments to justify his vision of naval warfare and its importance. Mahan was considered in his own lifetime to be one of the most influential and important thinkers of the day. He is barely remembered today, yet his ideas are made manifest in the imperial practices of the twentieth-century United States. The empire that historian Daniel Immerwahr characterizes as the "greater United States"—the pointillist collection of islands and bases from the Philippines to the Marshall Islands to Puerto Rico that are not officially US soil—was in large measure Mahan's idea.12 He conjured up this empire, however, in order to demonstrate how important navies were. The sea, in Mahan's vision, was for empires. The United States needed one, and he provided the maps. Mahan has not usually been interpreted as an emotional resource for the promotion of industrialization.13 I seem to be virtually alone in seeing him this way. But I suggest here that his language, tone, vivid enthusiasms, and faith in both commerce and Navy had an inspirational, almost evangelical quality. Industrialization depends not only on transforming production processes or consumption, but also on engaging political leaders and the general public in an enthusiastic vision. Mahan provided the language and the moral justification for the American Century. He was a persuasive visionary, a believer in the importance of surplus goods and ocean commerce. He concerned himself with economic futures and market limitations, and his naval forces were intended to keep the factories going. It was, for many of his readers, a powerful and compelling vision.14 Mahan came from a military family—his middle name honored a founder of West Point, Sylvanus Thayer—and attended the Naval Academy at Annapolis, completing his studies in 1859. He fought for the Union in the American Civil War and after the war served briefly in the Pacific. In 1885 he was appointed instructor at the Naval War College in Newport, Rhode Island. He served some years as president of that college, but he also taught courses that led to his most important books. In 1890, he published The Influence of Sea Power upon History.15 This text made him famous. Five more related volumes were to follow. All emphasized how important sea power was. Most said more or less that Britain's empire depended entirely on it.
It might have helped his reputation in some ways that he had become close friends with Theodore Roosevelt when Roosevelt was a visiting lecturer at the Naval War College (in the end they did not always agree). But it also helped that he was an engaging, even flowery, writer, persuaded that his perspectives were crucial to the future of the United States. He used history to demonstrate the value of sea power and the role of the sea in all aspects of progress and success. And he produced a vision of the United States and its future.16 His work enthralled Roosevelt, Henry Cabot Lodge, John Hay, Benjamin F. Tracy (Secretary of the Navy, 1889–1893), and Hilary A. Herbert (Secretary of the Navy, 1893–1897). To him and his supporters, the need to create an American empire seemed self-evident, almost biological—a nation that was not expanding would die. His observations about timeless laws of history were consistent with a general style of late nineteenth-century thinkers, reflecting a positivist mind-set and a tendency to eagerly accept "general principles" that explained the world. This was the style of Karl Marx and Herbert Spencer, a style mixed with vague loyalties to "scientific" reasoning and a conviction that unchanging laws could be discerned, in Mahan's case, through the detailed explanation of sea battles. But his understanding of his own time was profound. He told leaders of the United States what they must have wanted to hear. He pointed out years before the Panic of 1893 that without foreign markets, the United States could face a crisis around domestic labor and overproduction (which it promptly did). And three years before Frederick Jackson Turner famously analyzed the end of the American frontier, Mahan predicted its impact. "Whether they will or no, Americans must now begin to look outward. The growing production of the country demands it. An increasing volume of public sentiment demands it."17 It was Mahan's idea for the United States to build the Panama Canal. The Isthmian canal, he said, was the first necessary step to the control of the Pacific. The "three seas" of the United States made crucial an "outlet by them and access to the regions beyond," and the control of the Pacific was part of a "natural, necessary, irrepressible" American expansion. Mahan also said that the United States needed bases in Samoa, St. Thomas in the Virgin Islands, Puerto Rico, Hawaii, and the Philippines. In his view naval operations supported the commercial empire of the United States, and national greatness, Mahan said, was nautical power. The arguments he outlined thus provided Congress with powerful justifications for building new battleships and participating in naval arms races.
Mahan zeroed in on places where American interests were at stake and catalogued what those interests were. The United States had been a third-rate power considered to have little influence in the councils of the world, but in the late nineteenth century various forces converged to thrust the country into the international limelight. When the McKinley administration declared war on Spain in April 1898, it was seen as a victory for American industry. The war lasted only five months. At the end, Spain handed over Guam and the Philippines in the Pacific and Puerto Rico in the Caribbean. The building of the Panama Canal began in 1903, under President Theodore Roosevelt, and was completed in 1914. The United States purchased St. Thomas and the other Virgin Islands from Denmark in 1917. Mahan's vision—the dream of expansion and power that he built around navies—became a reality in less than thirty years. He is sometimes called the most influential thinker of the nineteenth century. I would suggest that his influence was not just pragmatic. It was a concoction of belief and desire that men in powerful places could invoke and use. It was emotional at its core. If Mahan's vision of the purpose of a navy was emotional and transcendent, so too were some naval technologies—machines that in practice performed the emotional exuberance of industrialization. As I have suggested, seduction is part of the history of military technology, and sea-going machines have had a particular allure, sometimes enough to make their actual battle performance irrelevant to their military value. The 1862 Union ironclad Monitor functioned for less than a year before sinking in a storm at sea, but it was a powerful symbol of the machine age and, as David Mindell has demonstrated, more important for its symbolism than for its battle readiness. In resonant ways, the 1906 HMS Dreadnought provoked a worldwide arms race and inspired a generation of massive big-gun battleships, many of which were lost in action in both world wars. Critics blamed these losses on bad design and flawed assumptions.18 In both cases, these ships brought together efficiency and exuberance. The first fight between two ironclads—steam powered, armored, and seemingly glorious ships—took place on March 9, 1862, at Hampton Roads, Virginia. The northern Monitor was specially designed as an iron ship and filled with innovative new mechanisms for underwater warfare. The southern Virginia was a standard wooden steam frigate coated with iron by Confederate engineers. The battle was short and not particularly conclusive. Both sides claimed that they won.
But the confrontation at Hampton Roads occupied a powerful symbolic place in the war at the time and later. Celebrated as a Union victory, it provided hope at a difficult time in the war. And the Monitor itself seemed to be a sign of the potential of US industrial power. It was both amazing and frightening, a sign of dehumanization and a sign of the wonders of the future. In David Mindell's exploration of the human experience for those who lived and worked inside the Monitor, he draws on the unforgettable letters of one insightful crew member to his wife. William F. Keeler, the Monitor's paymaster, shared with his wife Anna his feelings and experiences inside a machine that both protected him and threatened him. Keeler wondered about heroism and masculinity in war, when men were protected behind iron plates. Had science done away with courage? And could warfare be conducted without risk, or with asymmetrical risk, in which some were safe and others vulnerable?19 As Mindell makes clear, Keeler admired his ship and its mechanical wonders, but also grew weary of the cold, darkness, and noisiness. During the summer, conditions on the ship became nearly intolerable, with the heat inside reaching 130 to 150 degrees Fahrenheit. There were even mosquitoes in the ship. With others, Keeler wondered if the Monitor was a welded tomb. Was he in as much danger from the ship itself as from the enemy? Life in an enclosed space underwater was taxing. Protection could be entrapment. He and the crew knew that the new technology had flaws. On its first trip out, before the battle at Hampton Roads, the ship encountered a gale that forced water through its hatches and deck seals. Seawater entered the blowers, then leaked onto the leather belts that drove the steam machinery. This caused the engine-room blowers to fail, and the engine room filled with noxious gas from the coal fires. In the open ocean, such a situation was life threatening. Engineers could not stay below to repair the ship without becoming ill, and the crew was frightened. The eventual solution was a classic sociotechnical one: all the engineers gathered on deck and worked through a plan for repair. Then they went down one by one, each to do one specific task and come right back up. Working together, with social cohesion, they solved the problem of the machine. While the Monitor was being built, in the winter of 1861–1862, Union leadership heard rumors that the Confederates were modifying an old Union frigate, the Merrimac, with iron plates. The Union ship had been taken by Confederate troops at a naval yard in Norfolk, Virginia, in the spring of 1861.
Retreating Federal troops burned and scuttled it, but Confederates raised, salvaged, rebuilt, and rechristened it—as Virginia—using materials left behind at the yard. Leadership in Washington feared that this new ironclad ship could challenge the all-wood Yankee blockade fleet at Newport News. And if it succeeded, they feared, it could steam up the Potomac and attack Washington. The Virginia arrived at Hampton Roads, near Newport News, on March 8, 1862, one day before the Monitor. The Union fleet was caught unprepared. The day's battle seemed to prove that ironclads would destroy wooden navies. The Virginia rammed and sank the Cumberland, killing 121 members of its crew. It also successfully set the Congress on fire. That ship exploded. The ironclad Virginia fought with impunity, and finally withdrew largely intact. By nightfall a third Union frigate at Hampton Roads was trapped on a sandbar and vulnerable to a morning attack. Late on the same day the Monitor arrived at Hampton Roads. The area was strategically central to the war. It was also a setting suited for a public battle. The events were in full view of spectators from both sides, taking place in a natural naval amphitheater. Part of the reason that the battle had such an impact on the American public was that it was witnessed live, and described by witnesses and participants in such vivid terms. The battle on the morning of March 9 took about four hours. The boats were very close to each other. Shot and shells bounced off the iron surfaces of both ships. The Monitor took twenty-two hits but suffered only minor damage. The Virginia was also hit and most of its extremities were shot away. In mid-afternoon, the two vessels disengaged. There were no fatalities. Both sides claimed victory. The two ironclads never encountered each other again. Keeler, in letters to his wife, wondered if the heroism of the Monitor crew lay not in their performance in battle (because they were protected), but in their willingness to live in such a strange environment. "I think we get more credit for the fight than we deserve—anyone could fight behind an impenetrable armor." Certainly the crew and the ship became celebrated heroes. In the process, the ship became too valuable to expose to any future battle. Gradually the crew realized that they were being kept away from conflict, put on display to encourage the public and celebrate Union technological skill. The ship was worth more as a symbol than as a technology of war. In an observation that captured the gendered tensions about the ship and its impact on heroism or masculinity, Keeler compared the ship to fine china: "The government is getting to regard the Monitor in pretty much the same light as an over-careful housewife regards her ancient China set—too valuable to use, too useful to keep as a relic, yet anxious that all shall know what she owns."
After sitting largely idle through the summer, the vessel sank off Cape Hatteras on New Year's Eve 1862. Sixteen crew members died. The remaining forty-seven, including Keeler, were saved. As Mindell's analysis suggests, the new ironclads signaled change, for the Navy and for the nation. They also exemplified the qualities of industrialization more generally. They were part of a rational plan for resource management, for the efficient pursuit of naval warfare, and for maintaining public enthusiasm for the war. They inspired both the powerful and the public, but for those asked to fight inside them, they delivered a mixed experience of safety and self-doubt, wonder and skepticism. The HMS Dreadnought, also a symbolic vessel, had a very different destiny. The first all-big-gun ship of its size and specifications was christened on February 2, 1906, at Portsmouth Harbor in Britain. Later this moment came to look like a divide—from pre- to post-Dreadnought. The new turbine engines made the difference. The Dreadnought was planned as a model of efficiency and effectiveness. It had a six-mile shooting range, multiple watertight compartments, and oil-fired rather than coal-fired burners. Its armor was thirteen inches thick, and it could reach a speed of twenty-one knots, faster than any other battleship of the time. It was indeed something impressive to behold—the photographs make that clear—and it provoked a panicked arms race in all major navies.20 The US Navy's first dreadnought was the South Carolina in 1910. By 1914 the British had twenty-two dreadnoughts, and the Germans fourteen. In 1921, the United States had ten dreadnoughts and a plan to build a dreadnought for every state in the nation. The ships were symbolic of global power. They were also understood by journalists to be male. Many ships are given female names or described using female pronouns. But journalists found dreadnoughts to be masculine, "hard, tough, shooting fools" and "he-man battlewagons" with "strange masculine beauty." The public, witnessing American dreadnoughts steaming into harbor, were "near tears" at the sight of the ships.21 Such responses reflected emerging support for battleships as the key to naval strength and the definitive measure of naval power. Some navies, including the United States Navy, had supported more mixed use of smaller ships in the late nineteenth century. But the Dreadnought and all its progeny signaled changes in naval thinking, and made the large battleship the measure of a nation's strength.
Gradually, and definitively by World War II, the symbolic power shifted to the aircraft carrier, and then later to the nuclear submarine. During the war, the United States built about one hundred aircraft carriers and about eight battleships. Eventually the cost of the original generation of dreadnoughts began to strain the economies of Japan, France, Britain, Italy, and the United States—the countries with significant dreadnought fleets. Some doubts about the efficacy of the big ships had also begun to emerge. Significant combat losses of dreadnoughts during the First World War seemed to suggest that there was a problem with the ship itself. The 1922 Washington Naval Treaty and the 1930 London Naval Treaty resulted in an international consensus in favor of "battleship building holidays" for most nations, though these agreements imposed few to no limitations on either submarines or aircraft carriers, the emerging naval technologies of the era. By the 1930s, compliance began to erode—partly because many nations were gearing up for another war. The modern all-big-gun battleship never lived up to the expectations that had produced it. But as the earlier battleship era came to a close, new vessels with a strong family resemblance to the dreadnought continued to be built. In size, if not in teeth, the modern aircraft carrier is at least a cousin of the dreadnought. I have constructed industrialization as an emotional domain—one in which even rationality and efficiency invoked feelings that legitimated military policies and practices. I suggest that every element of modern capitalized statecraft played a role in the rise of technoscientific war. Markets and capital, mass production, interchangeable parts, systems thinking, and the ideological promotion of nationalism ("imagined communities," as Benedict Anderson called them) were all important to the changing structures of military conflict.22 The logic of mass production was also the logic of total war, and eventually the logic of indiscriminate urban bombing. The eighteenth-century French military engineer Jean Baptiste de Gribeauval, nineteenth-century British nursing pioneer Florence Nightingale, and turn-of-the-twentieth-century US naval historian Alfred Thayer Mahan each enacted emerging ways of thinking about reason and violence. Ironclads and dreadnoughts were technological expressions of these ways of thinking. By the First World War, military conflict had become an industrial machine. Directly linked to this process of industrializing war was a related process of controlling what industrialization made possible.
International efforts to establish rules for war—to reach a consensus between "civilized" nations—began in the nineteenth century in Europe and expanded in the twentieth century to become part of an ongoing global process. The efforts reflected general concerns about changing technologies of war. The Naval Conference of 1922 reached agreements about stopping the dreadnought arms race that was costly for all involved. Only a few years later the Geneva Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare was proposed in 1925 and approved in 1928. It was originally signed by thirty-eight nations—but not by the United States until 1974. Atmospheric weapons testing and arms limitations agreements in the 1950s and later were oriented around controlling production and risk. Major programs for biological weapons were developed in many countries beginning in the 1920s, but the Biological Weapons Convention of 1972 became the first weapons convention to ban development, production, and stockpiling. It was not only a rule about use but also about possession. Since the 1970s, there have been at least twenty-six international agreements focused on controlling military technologies. Nuclear weapons top the list, but rules about mines, space, the oceans, and arms trading have also been established and negotiated. The Chemical Weapons Convention of 1993—supported even by the chemical industry in the United States, and one of the most powerful conventions in effect today—requires that stockpiles be destroyed. It also requires challenging inspections and commitments by all signatories to defend any nation attacked with chemical weapons. Science and technology were deeply implicated in these international agreements, which reflected the extreme violence and risk of industrialized war. The idea of the existence of a recognizable and clear "civilian" was promoted most assiduously at the very moment that air power made the existence of such a person incoherent. If the state war machine depended on factories and factories needed workers, then destroying the neighborhoods where workers lived was a legitimate part of strategy. Industrialization, indeed, made the bombing runs as they unfolded—the firestorms of Tokyo and Dresden—logical. I do not equate logic here with morality. My point is only that industrialization made sense of the new uses of extreme violence. Reason, logic, efficiency, and mobilized emotion were intellectual resources for technoscientific war. They were all fully brought together in the First World War.
3 Trenches, Tanks, Chemicals
THERE HAS BEEN NOTHING LIKE IT BEFORE OR SINCE. THAT TROGLODYTE WORLD OF THE WESTERN FRONT WAS
technologically sophisticated and strategically disastrous. Those two facts are related. Fought primarily in a vast trench system that zigzagged across France and barely moved for four years, the war saw the first military uses of many new technologies. Chemical weapons came straight to the trenches from the massive laboratories of the German chemical industry. The military tank, modeled after tractors, was developed as a response. Fragile airplanes, less than twenty years after the Wright brothers' first flight at Kitty Hawk, were experimental additions to the arsenal, dropping bombs but also providing reconnaissance and strategic support. Enhanced radio communications, more powerful machine guns, oceangoing submarines, and sophisticated aerial photography were also reshaping strategy and human experience. In World War I scientific experts enlisted for active duty. They didn't just supply expertise or ideas. They joined the Armed Forces. Some died at the front. Others served in laboratories in the production of chemical weapons or on studies of psychological warfare. World War I is commonly known as "the Chemists' War" because of the rise of chemical weapons. But it was more than that.
It was the anthropologists' war, and the psychologists' war, and the physicists' war, and the engineers' war. The First World War brought modern science, engineering, and medicine into the trenches, and in the process tore the scientific community's dreams of internationalism apart. There is a literary iconography of the trench, immortalized in memoirs, poetry, fiction, and even official military reports.1 Much of it describes mud. But this mud-filled world of human suffering was also shot through with the products of scientific reason. There was a veritable cornucopia of knowledge, medicine, and truth in those muddy trenches (Figure 6).2 The war produced mass psychological trauma and equally terrible physical trauma. For social scientists, including psychologists, it was an opportunity to prove their immediate, practical value to the state. For physicians and surgeons, it provided access to many destroyed bodies that could be studied and perhaps healed—an example of how war is "good for" medicine, and of the value of the collateral data routinely generated in military action. And for scientists in general, it became both an opportunity for new research and a crisis of conscience: the First World War led to a fracture of the international scientific community. For more than a decade after the war, German chemists including Nobelist Fritz Haber were shunned by their peers for their roles in the uses of poison gas and for their vocal support of the aggression of the German state. The nineteenth-century dream of internationalist, pure science began to fall apart.3 The great historian of warfare William McNeill at one point described the chaotic beginnings of World War I and noted, "the reasons for such bizarre behavior can only be surmised . . . World War I remains unusually difficult to understand." Many other historians have expressed similar perspectives.4 Some have suggested that the war was "natural," a result of the biological tensions produced by modernity. Among the factors historians have invoked to explain the war are a cult of masculinity and heroism reflecting late nineteenth-century responses to economic and social disruption; internal strife and tension in many of the nations involved, such that an external enemy was unifying and useful; the costly dreadnought arms race that threatened to bankrupt several nations; and the psychological adjustments produced by the general population shift from rural to urban life in many European states. All of these possible explanations seem to propose that the war was closer to a seizure or a trance than to a rational political calculation.
FIGURE 6. Flanders field seen from the air, suggesting the size and complexity of the trench systems of the First World War. Royal Museum of the Armed Forces and Military History, Brussels.
For whatever reason, many leaders involved in sustaining and prolonging the war seemed to avoid empirical cost-benefit analyses. It was one of those (many?) wars that was all cost, for all sides. Of the seventy million people who served in active duty in the war, ten million died and millions more suffered grievous wounds. In 1918, a deadly flu epidemic that began among young men in army camps killed about thirty million more, spreading from the camps to civilian populations in both urban and rural centers, all over the world. The disaster of the war and all that it produced can barely be elucidated here. Indeed, it is difficult to absorb it or to keep it all in one's head at the same time. In this chapter I follow the technical elements of the war, considering how science, medicine, and engineering shaped strategy, and what the war meant for the experts who participated. But I cannot explain the war itself. It began in June 1914, when a group of young Serbs, trained and organized by the Serbian government, assassinated the heir to the Austro-Hungarian throne while he was visiting a newly annexed section of Bosnia. Various assassins followed him through his publicly announced route for the visit and eventually one of them succeeded in killing him. By late July the declarations of war began. Austria and Germany were allied against Serbia; Russia declared war on Austria; Germany declared war on Russia and invaded Belgium and France; England declared war on Germany. And so on. Within weeks, Europe was a battleground. But it wasn't just Europe. Colonial troops were at every front, drawn from European empires, mobilized to defend lands and societies in which they had no direct stake. Men came from Australia, Africa, Asia, and Latin America to fight in the trenches. The First World War was genuinely a world war, because Europe collected the world. Many of the declarations of war were delivered by telegram, so we might say that new communication technologies played a role in how quickly the violence began. Telegrams also made it possible to move troops quickly and simultaneously. They changed the pace of military action. And they played a role in diplomatic interventions, which were widely expected to end the war as quickly as they sparked it.5 Many senior military officials in Europe knew that modern weapons were more lethal and brutal than any weapons used in earlier wars. Some assumed that modern societies would not support or tolerate the cost of a prolonged struggle that was likely to be so traumatic. A knockout blow, many hoped, would end things quickly, with troops home by Christmas.
But the technologies that might have seemed likely to quickly end the war somehow did the opposite. Machine guns prolonged the war rather than ending it.
By Halloween, October 1914, the Western front had congealed. It ran 475 miles from the North Sea to the Swiss border, with 15,000 miles of trenches, rarely more than 500 yards apart. As the war dragged on, the proximities and semi-stabilities—the stalemate—had unexpected consequences. In studies of the sociology of trench warfare, it is clear that on some sections of the line, informal norms limited violence. Some sections were only a few yards apart. At one point Canadian troops dug in on one side of a ruined barn, and German troops dug in on the other, in easy hearing distance. While troops were encouraged to engage constantly in offensive activity, and commanders reported routinely that they did, many sources suggest otherwise. Diaries, letters, memoirs, and even reports from commanders described the practice of "Live and Let Live" on some sections of the line. This was a principle that defined relations between opposing armies and limited risk. These informal and collective agreements between frontline soldiers were nonverbal, tacit, covert. They inhibited offensive activity to a level that was mutually defined as tolerable through social practice.6 It could mean aiming weaponry to the side or down toward the ground, so that the opposing troops were not hit. Or it could mean a routinized activity—the "evening strafe"—that was expected and understood to provide formal evidence of "activity" on the line. Mealtimes, easily overheard in the enemy trench, were by consensus respected. And opposing teams on night patrol sometimes quietly ignored each other.7 A visitor unfamiliar with the policy would see this front line as reasonably active, noisy, and compatible with official requirements. For those in the trenches, however, the occasional bombs and bullets were not indicators of animosity but of consensus, even collusion. Those seeking to explain these behaviors usually draw on theories of alienation. Soldiers expected to engage in violent labor can and do sometimes reject it, and construct alternative ways of relating to opposing soldiers. They may use technologies to achieve their own goals—to appear to be properly engaged in warfare without actually killing anyone—rather than those of their commanders.
Like those who aimed their guns too high or low in earlier wars, they were practicing a protective form of technological choice, preserving the appearance of formal compliance without actually complying. Trench warfare then involved a technological and social system of immense complexity. It was not just a series of muddy ditches but a huge network of supply, strategy, communication, and social performance. The incredible maps of trenches and the trench system—and even the parks where sections of trench are preserved—suggest how modern the trench was. It was architectural in its complexity. Many trench systems consisted of three parallel zigzag lines. The front was called the fire trench, from which soldiers would fire on the enemy. Typically it was six or seven feet deep and about six feet across. It had a fire step, running laterally along the front, which could be mounted by any soldier attempting to fire or observe. Trenches had traverses, which were sharp turns at ten-yard intervals. These were intended to localize the effects of any attack, whether blast, shrapnel, or gunfire, should enemy troops land in the trench itself. Fire trenches, as the first line, often had barbed wire piled on the enemy side. Next was the support trench for supplies and rest, and finally the reserve trenches. Linking them all were communication trenches. No man's land was the unclaimed space between the trenches. It was unoccupied and technically "disputed," though that is probably not the right word (Figure 7). It could be as narrow as twenty yards, as wide as 1,000 yards, but was usually about 200 yards—less than an average block in New York City. Despite this proximity, both sides saw the enemy only infrequently. French and British troops were moved around a lot. German troops could spend two years essentially at the same small section of trench. The immobility of the trenches, as the lines stabilized and the war dragged on, provoked desperate technological solutions. One of these solutions was chemical weapons. The 1914–1918 war was the only major global conflict in which chemical weapons were routinely used by all sides. Chemists initially promised that chemical weapons would break the stalemate. But the stalemate did not break, and all sides continued to use chemical weapons in great volume and much variety. The strategic use of mustard gas, chlorine, phosgene, and other asphyxiating and poisonous chemicals in the First World War has never been repeated, for reasons that remain unclear.8 In some ways, chemical weapons were not new in 1914. Arsenical gases were used around 1500, possibly earlier.
FIGURE 7. A vertical photograph of Thiepval village, and German front-line and support trenches, while under bombardment by British artillery. Image © Imperial War Museum.
Ancient recipe books collected from around the world suggest that noxious-smelling gases were recognized as a possible aid in warfare, and sulfur and smoke were a standard part of siege warfare. But the First World War was different. It involved the use of sophisticated, laboratory-based chemical expertise to produce injury that did not bluntly open the body but that exploited reactions specific to human biology, with a full scientific knowledge of that biology. Guns and swords were brutal and bloody. They were crude. Chemical weapons were sophisticated, grounded in laboratory science. And they killed people as though they were insects.9 The prominent German chemist Fritz Haber, who later won a Nobel Prize for the synthesis of ammonia, led the German effort. Haber was head of the Kaiser Wilhelm Institute for Physical Chemistry in Berlin. He was a powerful and influential scientist who had played a critical role in the recruitment of Albert Einstein to Berlin from Prague in 1912. Utterly devoted to the promotion of German chemistry, he saw the war as an opportunity to promote the value of chemistry to the state. When World War I formally began, in July 1914, Haber was on vacation. He immediately volunteered for war duty but was rejected because of his age. Instead, he was appointed head of the Chemistry Department of the new Board of Wartime Raw Materials, a part of the Ministry of War. He had experience with investigating alternative substances for use in various industrial processes, and could perhaps help deal with the shortages of raw materials in Germany for military production. He had also worked extensively on new types of explosives (one of those experiments resulted in an explosion that killed one of his colleagues). The most important question Haber tackled in the first months of the war was about nitrogen: how could enough sodium nitrate be manufactured as raw material for both explosives and nitrogen fertilizers?10 But his attention soon turned to chemical weapons. It was a choice he made himself. His research explored with increasing intensity all relevant aspects of the use of asphyxiating and poisoning gases. He brought together qualified scientists and technicians from various fields to talk through the problems. It was not just a matter of advanced mass production or efficacy but also of delivery. How could these weapons be brought to the battlefield and used in ways that did not damage one's own troops? He needed engineers, physicians, chemists, meteorologists. He needed to predict both environmental and biological reactions.
Haber’s program to develop chemical weapons was quickly recognized by the German army as a high priority. Any officer Haber selected was assigned and transferred to his organization. He recruited a circle of young German scientists who included Otto Hahn, who later won a Nobel Prize l ater for his insights into nuclear fission; James Franck, a physicist who later immigrated to the United States, also won a Nobel Prize, worked on the Manhattan project, and was the key author of the Franck report in May 1945; Hans Geiger who later constructed the Geiger Mueller counter for radiation; and Gustav Herz, a physicist who also won the Nobel Prize in 1925 for his work on ionization. As this partial list suggests, it was an impressive group.11 Haber believed that chemical weapons w ere legitimate means of war and that they w ere perhaps even more humane than guns. He was not the only person to reach this conclusion, as the postwar debate about chemical weapons made clear. Even some combat veterans preferred chemical munitions to machine guns. Chemical munitions, they said, did not mutilate.12 In early 1915, Haber began to think about a possible way to use a liquefied chlorine gas for an attack. His initial idea was that cylinders filled with chlorine could be opened simultaneously along a wide stretch of the front. This liquid chlorine would immediately turn into gas upon contact with the air. It would form a yellow green or white cloud containing a relatively low level of chlorine. But since chlorine is heavier than air, the cloud would roll forward and down into the e nemy trenches and foxholes. It would force the e nemy up and out, create disarray, and make it possible to seize the trench. In this plan, German troops would be wearing gas masks. They would follow the cloud, penetrate e nemy positions, take prisoners, and break the line. Perhaps conveniently, Haber and other German leaders took the formal position that the French use of tear gas in the fall of 1914 had constituted the first use of chemical weapons. This was an important way of justifying his own research. When Otto Hahn objected that using chemical weapons would be a violation of international law (referring to late nineteenth c entury prohibitions on shells containing noxious chemicals), Haber invoked the French action and said that Germany would not be the first. Tear gas (a broad category that refers to irritating lachrymators—they produce tears) of course is one of t hose interesting chemical weapons that is still legal t oday to use domestically, against one’s own citizens in case of riots or other public unrest, but that remains illegal in war.
On April 22, 1915, at 6 p.m., along six kilometers of the front, the Germans opened 5,730 cylinders, releasing 180,000 kilograms of chlorine (roughly thirty kilograms per cylinder) in a yellowish-green cloud. This was near the Belgian city of Ypres. For a brief period a nine-kilometer break was created in the line, but the Germans did not realize that the line had been broken and did not take advantage of it. At least 7,000 victims were sickened, and probably between 350 and 500 people were killed. The British troops along this section of the line were colonial troops with very limited experience. They were not prepared for a chemical weapons attack, and they ran. The Germans did manage to take about 1,600 prisoners in the midst of the attack. The next day, newspapers in London announced that the Germans had used an asphyxiating gas, but they did not mention that the line had in fact been broken. The newspapers in Berlin mentioned that German troops had advanced and captured new territory, but they did not mention the use of chlorine gas. The Germans immediately used gas again, and Germany's actions provoked all nations involved to retaliate. Thus began an arms race of new agents, new gases, and new defenses. Gas masks like the Black Veil, the Helmet, the Large Box Respirator, the XTX Respirator, and the PH Helmet evolved along with the weapons. They were stuffed with dirt, sand, cotton, and other kinds of filtering materials to keep the chemicals out. There were masks for horses and masks for dogs, both animals that played a key role in the war. Some dogs noticed the smell first and were kept in trenches as sentinels. Sgt. Stubby, for example, was the most decorated dog of the First World War. The Boston bull terrier began his military career as a mascot but became a combat dog after he was injured in a gas attack (Figure 8). The injury made him sensitive to gas, and he warned soldiers in the trenches when he detected it (by barking).13 Higher and higher concentrations of gas were used as the war progressed. Defense strategies and practices evolved. Eventually both sides understood that merely forcing opposing troops to put on their cumbersome and uncomfortable gas masks could be effective. Gas masks were tiring and unpleasant; fear of gas warfare was demoralizing and exhausting. Moreover, gas and traditional explosives were often used in conjunction with each other. Explosive shells forced soldiers down into trenches, where some gases like chlorine tended to settle. Different technological systems worked together to increase the misery of the war. By the end of the war 3,000 chemicals had been tested for potential battlefield use by various parties. A total of thirty-five different formulas were actually tested in battle, and commanders concluded that twelve of these "worked."
Trenches, Tanks, Chemicals
73
FIGURE 8. Sergeant Stubby, the most decorated war dog of the First World War. The Boston bull terrier began as a mascot and became a full-fledged combat dog. Wikimedia Commons.
manders concluded that twelve of these “worked.” In total, 125,000 tons of chemicals w ere deployed during the war. Chemical weapons caused about 90,000 fatalities and 1.3 million injuries. Mustard gas was the most impor tant chemical weapon. It had relatively low mortality rates, but it worked fast and was irritating. Chlorine and phosgene were also widely used. Phosgene was highly toxic but it took one or two hours to kill. Chlorine was less toxic but faster. The use of chemical weapons had already been rejected in various forms of international agreements about war adopted a fter 1874. A fter the First World War, chemical weapons w ere rejected again by international law and also by general taboo, which survived more or less despite slow signatories (the United States waited u ntil 1974) and some violations, particularly against colonies.14 Allied authorities were surprised when German troops did not use chemical weapons in the Second World War. Some governments in recent years have used some of the more horrible nerve gases against their own citizens. The taboo is incomplete, mottled. But it is real.15 In 1919 when the G reat War was over, the New York Times reported that an American chemist had developed a new chemical weapon that smelled like geranium blossoms. The New York Times called it “the climax of this country’s achievements in the lethal arts.” Lewisite had been discovered by
accident by a priest chemist in a lab at Catholic University of America. It was developed into a weapon by Winford Lewis, just in time for the end of the war. This end came as the first shipment was crossing the Atlantic Ocean. With the announcement of the armistice, the crew dumped 3,000 tons of lewisite into the open sea.

When we think about the impact of technologies of war, it is important to also think about their long-term consequences. The lewisite dump is the first known case of the United States systematically dumping chemical weapons into the ocean, but it was not the last. Barrels of mustard gas, phosgene, and other agents were and are dangerous to store, and the United States by 1918 had a large stockpile. From the 1920s through to the 1970s—for half a century—US armed forces routinely dumped unwanted, expired, nonstrategic, or unstable chemical munitions into the sea, often in US waters not far from the North American coast. One notorious program was called CHASE, for cut holes and sink 'em. This was a quick description of the way that Navy ships due to be scuttled were loaded with chemical weapons and sunk at sea. At the time, the military personnel engaged in planning, approving, and carrying out this dumping must have imagined an ocean that could make anything disappear.

In the late 1960s, the US Department of Defense publicly acknowledged that the US Armed Forces had records of seventy-four instances of disposal of chemical weapons in the ocean, with thirty-two of these instances off US shores and forty-two off foreign shores (Figure 9). According to the Army, the last ocean disposal was in 1970, 250 miles off the coast of Florida. In total, weapons were dumped off eleven states, with no records kept of the exact locations.16 In 1972, Congress passed the Ocean Dumping Act to prohibit the disposal of waste into ocean waters of the United States. The bill included provisions explicitly prohibiting offshore disposal of chemical warfare agents.

The development of chemical weapons had many long-term consequences. These underwater dumping grounds, somewhere in the oceans, coincidentally expanded their targets to include sea life, as dolphins appeared on Atlantic beaches in the 1980s with burns from still-active mustard gas. The case of chemical weapons exemplifies a much broader trend. Modern scientific warfare has produced wastes and toxins and radioactive materials that have contaminated, literally, the entire world. The scale of the toxicity, the scope, the range of damage, and the enduring legacies are only now beginning to be
FIGURE 9. Dumping mustard gas into the ocean off the coast of New Jersey as part of the Cut Holes and Sink 'Em program, 1964. US Army.
recognized and explored by historians.17 In some ways twentieth-century warfare can be seen as a history of the systematic, industrial generation of devastating environmental damage. It has been a sustained war upon the earth itself.

Chemical weapons were a result of the stalemate of the trenches. Profound emotional trauma was another result. By late 1914, soldiers in the trenches began to experience something that was called shell shock. The name originally reflected the idea that the brain had been rattled, shaken up, by the intensity (noise, force) of modern shelling. The explosions themselves, it suggested, had caused brain damage. Later the diagnosis changed. Those affected could then be interpreted as victims or as psychologically weak, faking it, feminine or too tenderhearted, or as narcissists who were overly dependent on their mothers, or perhaps spineless, lacking in courage. There were also arguments that only the finest soldiers experienced shell shock. To some observers, the yokels at the bottom of the class hierarchy had no fear, but overly educated young men seemed vulnerable to mental collapse in the face of war because of their intelligence.
Of the many medical diagnoses produced by modern warfare, those associated with the emotional response to violence have most clearly reflected the gender system. This system as it is understood in feminist scholarship divides the social and biological world into two clear categories, male and female. Each of these categories is also associated with a range of other binaries that are supposedly mutually exclusive: thought to feeling, objectivity to subjectivity, logic to intuition, mind to body, culture to nature, aggression to passivity, public to private, political to personal, and so on. In each of these cases, the first property listed is associated with men and masculinity, and the second with women and femininity. These associations transcend biology. A man can be read as intuitive, emotional, passive—and therefore feminine. A woman who is logical, intellectual, and aggressive can be socially interpreted as masculine. The gender system, as it has been practiced over the last century in Europe and the United States, assigns people to strict categories that are difficult for any individual to completely avoid. In general, those properties assigned to men, whatever properties they are, are more highly valued.18

Shell shock, battle fatigue, war stress, and other diagnoses of their kind over the course of the century were often seen as forms of male weakness. The emotional responses of soldiers in war suggested that ideals of masculinity were not being met. These diagnoses therefore operated somewhere between medical rationality and moral order. Just as expert physicians were called to the miraculous shrine at Lourdes in France to testify to the reality or falsehood of miracles (because only a doctor could determine if a cure was beyond the expertise of doctors), so too were physicians in the First World War called to draw the line between coward and victim, deserter and patient, mind and body.19

The first use of the term shell shock was in the British medical journal The Lancet in February 1915. Psychologist Charles Myers (1873–1946), who was part of a volunteer medical unit in France, proposed that he was seeing something new. The name itself suggested that shell explosions (rather than feelings) caused the disorder. Myers did not explain exactly how this happened. He saw the symptoms as similar to a diagnosis commonly applied to women—hysteria—and said that they could be caused by physical damage to the nervous system or psychological damage as a result of the technologies of modern war. Myers might have been the first to put the term in print, but there is some historical evidence that he borrowed it from men in the trenches: those experiencing shelling may have come up with it.20 The following year,
British psychiatric physician F. W. Mott proposed that the condition was organic, meaning that it was explicitly the result of brain damage from exploding shells, or perhaps from carbon monoxide produced by shells and inhaled.

The symptoms of shell shock were surprisingly diverse. Many kinds of people and stories could be accommodated by the category, which was open and flexible. One soldier who developed depression and a tremor after experiencing an intense four-hour bombardment could not walk and was admitted to a field hospital, where he died. Another became mute and deaf five days after a shell explosion. He was cured six months later as the result of a religious vision. Another experienced temporary leg paralysis, a symptom that has long been seen as a canonical manifestation of trauma. Eventually the diagnostic categories built around shell shock had consequences even for veterans' benefits, as those disabled by enemy action might qualify for pensions as battle casualties, whereas those brought down by their emotional reaction to war did not. The name attached to a soldier's suffering therefore determined where he was sent when diagnosed, what medical treatment was chosen, and whether he qualified for a pension.21

The anthropologist and psychologist W. H. R. Rivers saw shell shock as more than psychological. It was cosmic, eschatological, and engaged with questions of judgment, heaven, hell, and the meanings of human existence. Meanwhile, the British Army executed soldiers for cowardice—306 total—and tried thousands more for desertion.22

In psychiatric diagnoses, and perhaps in all medical diagnoses, it is important to recognize that categories, symptoms, and meaning are not timeless or transhistorical. The symptoms of earlier forms of "soldier's heart" (a nineteenth-century diagnosis) and other responses to war did not match those of shell shock, battle fatigue, or contemporary post-traumatic stress disorder. While all these diagnostic categories captured meaningful experiences for those using them, they did so in different ways that reflected the specifics of time and place. The psychological pain of the First World War was genuine and powerful—the record is clear. But it had unique properties. Much later, PTSD became a category that included the traumas of everyday life, such as being a victim of crime or being injured in a traffic accident. As historians of medicine have demonstrated over and over again, categories of disease shift over time. The "real" biological experience attracts different kinds of attention, signs, explanations, and recommended protocols for
intervention. One can suggest that war trauma is both entirely real (a real form of human suffering) and a historical product of social consensus and selective performance shaped by culture and belief.

That the institutional recognition of such conditions has depended on reporting statistics and diagnostic standards is further suggested by their variability across different armed forces. Apparently, no soldiers in the Soviet Union ever experienced the suite of symptoms and suffering that were called shell shock or battle fatigue. It was not an available diagnosis. They were alcoholics or mentally ill before the war, or depressed because of their own problems. British troops in the First World War experienced up to a 40 percent rate of shell shock. In the Second World War different services reported between 25 and 30 percent stress-related incapacity. In Vietnam, there was almost no desertion, but later, after troops returned to the United States, rates of mental distress (by this time called post-traumatic stress disorder) were as high as 31 percent. Mental distress of various kinds seems common in war, but the forms and meanings of this distress vary over time and place.23

Like so many other battlefield conditions of the twentieth century, shell shock was technoscientific on both sides. It resulted from new kinds of warfare, and from the work of scientists and engineers who produced chemical weapons and made new systems of artillery. And it was institutionally and even morally managed by other kinds of experts, including physicians and psychiatrists, who had the authority to decide what it should mean for any soldier. Shell shock, like so many other products of modernity, was produced by new forms of science and technology, and managed by other forms of science and technology. It was an experience, a diagnosis, a psychological state, and an administrative problem, and it reflected at every stage the centrality of technical knowledge to the First World War.

This centrality had profound consequences for the scientific community. The war shattered hopes for internationalism in science, hopes that were just beginning to blossom. By the early twentieth century, many institutions promoted a romantic idea of science as uniquely neutral, universalistic, and benevolent. Science seemed almost spiritual to many practitioners in its high-minded promise to aid mankind. It was a classic vocation—a calling, not a profession. Those drawn to science were drawn to a meaningful human quest that transcended nationalism or individuality. They worked for "mankind." Some of the emerging institutional arrangements of the late nineteenth century reflected these ideas.
Between 1870 and 1910, for example, experts reached consensus agreements at an international level about naming and standardization in science, medicine, and engineering. Scientists also proclaimed that they exemplified a meritocracy. In science, they proposed, class, ethnicity, and nationality did not matter. Science knew no national boundaries, and anyone with talent and insight could succeed (except women, of course, who were excluded from virtually all PhD programs at the time). The premise of internationalistic science was that experts in all nations would conceive of their problems in the same ways and pursue their investigations with the same practices, leading them to be able to reach conclusions that reflected a shared set of assumptions and values. National characteristics and cultures would not shape what was true—this was the key point. Borders were irrelevant to scientific knowledge, which could and should travel freely across international communities of expertise.

This appealing story of purity and egalitarianism was animated by some forms of practical experience. Natural philosophers in Europe had long communicated across national boundaries. Increasingly after 1860, international congresses, societies, institutes, and standards seemed to institutionalize these relationships. National academies began to award medals and even honorary memberships to persons who were not citizens of their own nations. Core disciplinary interests also encouraged these attitudes. Cooperation was critical in fields in which information gathered around the world had to be brought to bear on pressing scientific problems. Thus were founded the Potsdam Institute of Geodesy (1875), the French Weights and Measures Office (1875), the Paris Health Office (1893), the Institute for Marine Investigation in Copenhagen (1902), the Strasburg Institute for the Study of Earthquakes (1903), and the International Institute of Agriculture in Rome (1905). For disciplinary groups there was a similar trend. International societies were founded for botany in 1864, astronomy in 1865, meteorology in 1873, and geology in 1878.24 At the same time, international agreements were negotiated by scientists and engineers for standardized electrical units, botanical names, disease names, statistical methods, railroads, radiation units, and chemical nomenclature.

In many countries significant public support began to be available for the pursuit of natural knowledge. In the United States, new state agencies focused on geology, agriculture, anthropology, meteorology, biology, botany, physics, and astronomy. In Europe there were similar new agencies and new institutions supporting a highly professionalized cadre of scientists. And at
the center of all this international activity was the dazzling scientific community in Germany. Germany was the center of European high culture. It was admired for its art, literature, music, science, and philosophy. Nothing compared to a good German PhD. By the 1820s any person interested in science went to Germany to study in Berlin, Munich, or Göttingen. Foreign study in Germany peaked in 1890, and German was the international language of science.25

Symbolizing this sense of a coherent international community of knowledge production was the creation of the Nobel Prize, with the first prize awarded in December 1901. Financed by the Swedish businessman Alfred Nobel, who held a patent for dynamite and made part of his fortune as an arms dealer, the prizes were intended to be conferred based only on the quality of the scientific work honored. Nobel's 1895 will stipulated that no consideration should be given to nationalism in decisions about who would receive the award. The prizes were symbols of the celebration of internationalism (although nationalist interests routinely shaped nominations).26

Solidifying these international bonds was the social experience still so familiar in academe: the international society and its annual meeting where new ideas and discoveries could be shared. Between 1870 and 1900, about twenty international scientific meetings were held in Europe each year. Between 1910 and 1914, that number had risen to forty per year. But from 1914 to 1918, as the war devastated Europe, only seven international scientific meetings were held. The war disrupted scientific networks and challenged internationalist ideals. By the end of the war, many scientists saw a darker vision of the relationship between science and the state.

The Manifesto of the Ninety-Three Intellectuals, signed by prominent members of the German scientific community in October 1914, shocked many scientists in other nations. It was a full defense of German actions, including the destruction by German troops of a magnificent library in Louvain, Belgium. The library at Louvain had been built in the fourteenth century and contained rare and priceless manuscripts. It was burned by German soldiers in August 1914. This was widely viewed as an assault on culture itself, a sign of barbarism. But if international observers were hoping that the German scientific community, with its commitments to knowledge and learning, would condemn the actions of these soldiers, they were disappointed. Those signing the manifesto defending German aggression included luminaries of science like Max Planck, Paul Ehrlich, Wilhelm Ostwald, Wilhelm Roentgen, and
Walther Hermann Nernst. All were leading German scientists considered crucial thinkers in their fields. And all had chosen to defend country rather than to defend the core values of international reason. Soon Fritz Haber's chemical weapons program added to the international outrage. Haber was a renowned chemist respected internationally. The war made him something akin to a war criminal—though the Swedes awarded him the Nobel Prize in 1919.

The German physicist Albert Einstein did not sign the Manifesto of the Ninety-Three Intellectuals. He was nonetheless viewed with suspicion by some non-German physicists. When his theory of relativity was published during the war, in 1915, it was at some risk of being disbelieved because of the nationality of its creator. The theory was a coherent and revolutionary explanation of space and time that drew on stunning mathematics. Because of the work of a British Quaker professor of astronomy at Cambridge University, Arthur S. Eddington, the physics community began plans to organize an eclipse expedition that would eventually validate Einstein's work—during the war itself. As Matthew Stanley's work shows, Eddington translated Einstein's paper and persuaded others of its importance, in a rare example of internationalism during the war. A favorable opportunity to validate the theory would occur in 1919, when an eclipse would make it possible to detect the displacement of the stars that the theory predicted. It would be visible in only a few places in the world, and arranging the trip and equipment would be costly and time-consuming. Eddington engaged support from the Royal Academy of Sciences and the Royal Society, and collected teams of observers, including himself, to make the trip in time for the May 1919 eclipse. In the midst of a brutal war with Germany, Eddington committed himself and his colleagues to a project that would prove that a German scientist with a path-breaking theory was right.27 The expedition succeeded, and when the formal results were presented at the Royal Astronomical Society later that year, it was reported as a revolution in science: the stars were indeed displaced.
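The displacement the expeditions went looking for was minute. For starlight grazing the solar limb, general relativity predicts a deflection of

\[
\delta = \frac{4GM_{\odot}}{c^{2}R_{\odot}} \approx 1.75''
\]

twice the value derivable from Newtonian reasoning. This formula and figure are the standard general-relativistic prediction, given here for orientation; measuring so small an angle meant comparing eclipse photographs of the star field with photographs of the same field taken at night.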
But Einstein's brilliance did not erase skepticism about all things German, or about German science. After the war, to many German scientists, science was all Germany had left. The physicist Max Planck said that "if the enemy has taken from our fatherland all defense and power there is one thing which no foreign or domestic enemy has yet taken from us: that is the position which German science occupies in the world."28 But German science no longer occupied that position. The International Research Council established after the war in 1919 had a formal policy that German scientists were not to be welcomed at international meetings—indeed even admitting scientists from neutral countries that might be sympathetic to Germany was a vexed issue for those planning the new group. As Daniel Kevles shows in his study of the creation of the new IRC, many scientists reacted emotionally to the idea of attending meetings with German scientists. Emile Picard of the University of Paris, an eminent mathematician whose son had recently been killed in action, made it clear to one US correspondent that French scientists would no longer be willing even to sit down at the same table with German colleagues. "Personal" relations of any kind, Picard said, would be "impossible" with men whose government had committed such atrocities and who themselves had "dishonored" science by exploiting it for criminal ends. The US astronomer George Ellery Hale was equally willing "to cut loose from them altogether," and British mathematical physicist Arthur Schuster, his nephew dead at the front, said he could not conceive of attending postwar meetings with enemy scientists.29

In practice, many international meetings between 1918 and 1930 did not include German scientists. By the summer of 1920, fifteen nations were members of the new International Research Council, which was aligned against Germany. Feelings gradually shifted, and in 1926 the rules against admitting Germany to the IRC were relaxed. But Germany refused to join either then or in 1931, when the IRC was renamed the International Council of Scientific Unions. The First World War brought international scientific activities, including the award of the Nobel Prizes, to an almost complete halt. Haber proposed that "in war-time, the scholar belongs to his nation, in peacetime to mankind," and many of his peers agreed.30 Scientists in Germany supported the war and denigrated the achievements of scientists from the other side. "Such a massive intrusion of politics into the supposedly non-political realm of science naturally left scars. Even today, the boycott—the term coined for the series of measures advocated by the IRC—remains a sensitive issue for many scientists, to be evoked primarily as a warning or an object lesson to show what happens when the norms of universality and organized skepticism are set aside."31 The prewar organization of international science was a casualty of the war, and even through the 1970s, the events of the 1920s and early 1930s
could still provoke angry words. Indeed, the boycott might have been implicated in some of the more romantic and exaggerated claims about neutrality and honor during the Cold War.32

As the legacies of the First World War became the subject of public debate in the 1920s—as participants, scientists, and public observers began to look back on the devastating carnage of the war—many anticipated future wars determined by science and technology. Will Irwin's 1922 apocalyptic book, The Next War, imagined chemical weapons rained down upon cities, with massive bomber raids, bacteriological weapons, and general terror. He also portrayed scientists as akin to war criminals. Like others, he wondered if air power made war obsolete, and too terrible to contemplate. "Here is a projectile, the bomb-carrying aeroplane, of unprecedented size and almost unlimited range; here is a scheme of warfare which inevitably draws those who were hitherto regarded as non-combatants into the category of fair game."33 He proposed that "we must try to repair this world machine" and said war had "died its spiritual death" through new technology.34

As Tami Davis Biddle has demonstrated, as early as 1905 British experts valued attack from the sky (at this point, bombs from hot air balloons) in terms of creating fear in the bombed people on the ground. Morale effects were seen as the key advantage of air power, perhaps partly because attacking the "will" of the people was a well-recognized and traditional military goal, and one that valorized a British way of understanding the British people. National ideals of courage, resourcefulness, tenacity, and willpower, as they would presumably be made manifest under the stresses of urban attack, reflected perfectly the values of upper-middle-class Victorian and Edwardian societies. And some air power theorizing made invidious comparisons between states, based on ideas about race and class. Different nations could be expected to "yield" more quickly in the wake of an air attack—and this would be a sign of the inferiority of the people in that state.35

The debate about how to use air power might be seen as a debate about definitions. What was an airplane? An eye? A delivery device? A fighting platform? A support technology for ground troop activities? During the First World War armed forces from Germany, France, Britain, and the US explored all these possibilities. It was clear almost immediately that air space was valuable space—not least because enemy resources could be seen more easily from the sky.36 During the war, tactics for the employment of aircraft were worked out just as ground combat tactics were worked out,
through intensive trial and error, and by 1918 a fairly sophisticated body of doctrine existed for the battlefield uses of aircraft.37

From 1918 through the 1930s, this debate only intensified. Interwar air theorists like Billy Mitchell in the United States, Hugh Trenchard in Britain, and Giulio Douhet in Italy proposed innovations in both strategy and technology. In general, they expected air power to be decisive in the next war—if not apocalyptic (e.g., ending all wars, by virtue of the fear that air weapons would produce). Books predicting ruin as a result of air power included titles like The Poison War, The Black Death, Menace, Empty Victory, Invasion from the Air, War upon Women, Chaos, Air Reprisal, and What Happened to the Corbetts.38 The real-time attacks in the 1930s—at Guernica, Abyssinia, and Manchuria—suggested how terrifying air power could be. Gen. Giulio Douhet's 1921 book, The Command of the Air, proposed that air power would lead to societal collapse. Postulating that vast destruction could be wrought by fifty squadrons of bombers, he asked his readers, "How could a country go on living and working under this constant threat, oppressed by the nightmare of imminent destruction and death?"39 In the United States, air war planners focused on the potential of accuracy. Engineer Carl Norden's bombsight, first produced in an early form in 1924, promised precision, and therefore efficacy. Flight promised both psychic terror and rational efficiency.

In 1915, a mournful Sigmund Freud in Vienna described the impact of the war. It had broken out, he said, and "robbed the world of its beauties." It had shattered European pride in the achievements of civilization and tarnished "the lofty impartiality of our science." The war "revealed our instincts in all their nakedness and let loose the evil spirits within us which we thought had been detained. It robbed us of that we have loved and showed us how ephemeral were many things we had regarded as changeless."40 As the poet and mystic Rainer Maria Rilke described the "unnatural and terrible wall of the war," it was a dividing line between past and present, the product of new scientific ideas and technologies.41 It was not just industrialized or technoscientific war, but a form of unnatural nature—of nature leveraged to undermine society.

In 1933, as the possibility of another war began to loom over Europe, Albert Einstein and Freud published a series of exchanges between them in a small book entitled Why War? Sponsored by the League of Nations International Committee on Intellectual Cooperation, this book was proposed by the
pacifist Einstein as a pressing issue of the day. In his comments Einstein presented himself as a humanist, not a physicist, trying to understand the human acceptance of violence. Freud proposed that men took pleasure in war. This did not mean that eliminating war was impossible, but aggression was a natural part of the human psyche that had to be recognized. The two very famous men commented on the causes of war with an eye to both past and future.42 Nationalist passions were growing stronger. Fascist states were solidifying their power. Science and technology made possible new forms of attack and new vulnerabilities. It was a terrifying future to contemplate.
4 Mobilized
AT A 1946 MEETING OF THE NATIONAL RESEARCH COUNCIL IN WASHINGTON, D.C., THE PHILADELPHIA
physician Malcolm Grow (who had worked in aviation medicine during the war) proposed that scientists had let the country down during the Second World War: "We have gone through this war without any real knowledge of what kills people," he said.1 He thus suggested that scientists had a special obligation to produce knowledge of what kills people. However wrong Grow might have been about the killing powers of science and technology in 1946 (they were substantial), his complaint was an accurate reflection of a new way of understanding scientists' obligations to the state.

The mobilization of the scientific community during the Second World War was a remarkable success and a problematic transition for the scientific community. It involved an almost counterintuitive fungibility of expertise. Geneticists were deployed to plan harbor defense, mathematicians to plan propaganda, paleobotanists to help build bombs at the University of California at Berkeley. In these networks of emergency mobilization, being trained as a scientist conferred broad legitimacy, regardless of the specific details of one's training.
Such practices celebrated scientists in precisely the ways they might have celebrated themselves, as uniquely rational thinkers with skills that could be applied anywhere, to any problem. They reflected an image of a generic all-purpose "scientist" who had special contributions to make to the war effort—and to the political and social order. At the same time, this fungibility brought experts into sometimes uncomfortable networks of military hierarchy and state-mandated research.

Beginning in 1939, and escalating rapidly in the United States and in other nations engaged with the war unfolding in Europe, new and existing scientific institutions were asked to orient research agendas around military needs. This led to the mass production of penicillin, radar, the proximity fuse, DDT, the atomic bomb, computing, better boots and sleeping bags, new instrument dials, napalm, new rockets, torpedoes, chemical agents, methods of blood transfusion, antimalarial drugs, sonar, and many other discoveries both major and minor. A list of mobilization projects in the 1947 Office of Technical Services report includes reports on human hearing, rangefinders, rain-repellant glass coatings, heat-resistant metals, neurotic inventory scales and IQ testing, replacements for brass, studies of German gun tubes, sunshades, flame thrower fuels, and smoke cloud testing in Brownsville, Texas.2 Military priorities included the development of improved scientific perspectives on human capabilities, the environmental world, materials sciences, and clothing. Research took place in private industry, academe, and military facilities.

In the end, the main agency charged with organizing this research in the United States, the US Office of Scientific Research and Development (OSRD), signed 2,300 contracts with 321 universities and 142 nonprofit and academic institutions. It spent $500 million, a remarkable budget in the 1940s. This did not include the $2 billion (in 1940s dollars, today about $32 billion) spent on the Manhattan project to build the atomic bomb, which was "hidden" in the Army Corps of Engineers budget.3 While it was not a part of the OSRD agenda, the B-29 was also the focus of significant engineering research during the war, and in the end became the most expensive weapon system to be developed in World War II, costing more than the atomic bomb project.4 Every scientific agency of the United States government played some role in this mobilization, from the US Geological Survey to the US Department of Agriculture. Elite universities, especially in the Northeast and especially MIT, benefitted from an infusion of cash that transformed their campuses. Mobilization also reshaped technical and engineering industries large and small.
It was a system shock, sudden, escalating, on-demand, and running at a high pitch throughout the war. It was both a success and in some ways an ironic failure.5 In some ways it accomplished the opposite of what those who had planned and enacted it had hoped: the successful mobilization of scientists and science during the war legitimated the permanent government mobilization of technical expertise for military purposes. Scientists faced a new risk of becoming, as National Academy of Sciences President Frank Jewett put it, "intellectual slaves of the state."6

For participants, it was an amazing opportunity and a vexing professional challenge. Many of the projects created new industries, for example, around antibiotics, computers, electronics, synthetic rubber, rocketry, and pesticides. And those involved were often calculating benefits that would be realized in a postwar world, in which their patriotic service could be leveraged to financial advantage. They assessed how much to share and how much to hide as they imagined future industrial profit. In the penicillin project, for example, pharmaceutical firms met frequently to discuss their work. They shared data and methods; however, internally, they worried about sharing too much with future competitors.7 Projects also often operated at the secret-open borderland. Scientists indoctrinated in the values of open publication and sharing of data faced a challenge to their core professional values. Secrecy could discourage productive communication across laboratories and in practice delay technical progress, but projects during the war were often both secret and open, public and concealed, offensive and defensive, all at the same time.

It was also a world of acronyms: NDRC (National Defense Research Committee, the first group appointed to begin mobilizing scientists), OSRD (Office of Scientific Research and Development, the presidentially created oversight group that grew out of NDRC), NRC (National Research Council, the working arm of the National Academy of Sciences), CMR (Committee on Medical Research, which was part of the OSRD), MED (Manhattan Engineer District, which was the formal group charged with building the atomic bomb), WPB (War Production Board, which identified production needs and worked with mobilized industries to negotiate relationships), RRC (Rubber Reserve Corporation, which was created to figure out how to make synthetic rubber when rubber supplies were disrupted by the war), USDA (United States Department of Agriculture, which managed the development of pesticides like DDT and oversaw the mass production of penicillin), NRRL (Northern Regional Research Laboratory in Peoria, Illinois, where submerged fermentation
methods using corn steep liquor to make penicillin were tested), BPI (Bureau of Plant Industry, which oversaw aspects of synthetic rubber production), and many more.

Mobilization swerved the professional lives of those pulled into it, producing anguish, excitement, success, patriotism, and moral compromise. Scientific training in the 1920s and 1930s had not prepared this generation of experts for the tsunami of opportunity and risk.

The mobilization of science in the Second World War differed from that of the First World War. In World War I, scientists often left their academic labs behind to join the war effort. They were not civilians but scientist-soldiers serving as reserve officers and enlisted men. So, for example, the mathematician Oswald Veblen helped to develop ballistics research as he worked as the head of the office of experimental ballistics at the new Aberdeen Proving Ground. Veblen recruited young mathematicians from academic centers around the United States and brought them to Aberdeen as enlisted men.8 Kevles, in The Physicists, details the differences between scientific mobilizations in the two wars quite clearly.9 College students and professors in the United States were among the strongest supporters of US involvement in the First World War. The Princeton recruiting station was among the busiest in the nation, and Princeton sent a larger fraction of its student body to train for the war than any other university. In addition, 138 Princeton faculty served in uniform during the war. Meanwhile, the National Research Council, the practical arm of the prestigious National Academy of Sciences that had been created in the nineteenth century, was placed in charge of the US Army Signal Corps, the technical arm of the Army in World War I. The Army also gave council members military commissions, including the NRC Chair, the Caltech physicist Robert Millikan. Thus in 1914–1918, the Harvard chemist James Conant (a future president of Harvard University) became a lieutenant and worked on chemical weapons at an Army laboratory located on the campus of American University. Norbert Wiener, later to become famous for his conception of cybernetics, attended military training camps in hope of commanding a battalion in France. More than 150 mathematicians served in uniform, along with many more engineers, psychologists, physicians, physicists, and chemists.

But by the late 1930s this model of active military service for the intellectual elite was anathema to Vannevar Bush, then President of the Carnegie Institution, who had himself worked in uniform on antisubmarine research in
the First World War. As things turned out, it was Bush who seized control of the mobilization of science twenty years later.

Some natural and social scientists did serve in uniform in the Second World War. For example, 1,700 of the 4,400 members of the American Psychological Association (APA) worked directly for the military during World War II. Thousands of other psychologists consulted for war-related government agencies.10 But many more scientists served in the civilian networks of knowledge production engineered by Vannevar Bush.

While questions have legitimately been raised about the relationships between civilian and military knowledge systems—and the moral culpability of those who built bombs or refused to build them—when you look closely at the history of these scientific development programs, in many cases the knowledge itself had no inherent qualities as either military or civilian. This could be true regardless of where the funding came from, or whether the person making the discovery believed that what they were doing in that moment would help the war effort or any future war effort. A discovery or an insight could float along as a strictly civilian or "pure science" question, perhaps even for decades, and then be leveraged at a crucial moment to address critical military needs. An idea or a technology could start as the result of a military initiative and then become an important civilian technology. A project could also be supported by military funding and prove useless to the war effort. And apparently civilian (indigenous!) technologies, for example Inuit-style kayaks, could become part of military systems and of weapons delivery, as they did during stealth kayak raids in the Mediterranean in World War II.

The notion that military technology is a special domain outside the normal run of expertise cannot be sustained. This is not because military knowledge is in any sense innocent. Rather, it is because many forms of knowledge, expertise, and technology are fungible, unstable, polyvalent. "Civilian" and "military" are interesting categories worth interrogating, particularly because they matter to historical actors and play a role in justifying policies and programs. But the historian cannot take them as self-evident or transparent, or morally meaningful in some transcendent way. As I show here, many of the technologies and scientific ideas produced for the purposes of the war were by any standard humanly valuable. They focused on medical innovation that could help to save injured soldiers (and later many others), on improved food production in agriculture that later increased yields for others, or on understanding weather more accurately in real time.
In some cases, military research was even more "objective" than civilian, because of the practical and straightforward institutional needs that drove it: climate change, for example, has been studied seriously by US Armed Forces since the 1950s because it was recognized early as a strategic threat. Having accurate information about such a risk was "beyond" political concerns (e.g., contemporary concerns about the impact of remediation on industry and economic growth). Conversely, civilian sciences could become violently militarized, used to damage people and environments, regardless of the intentions of those who made new knowledge—which is exactly what happened with the herbicide Agent Orange. A botany graduate student at the University of Illinois in the early 1940s, Arthur Galston found a compound that could cause the weakening of cellulose at the juncture of leaf and stem, which caused plants to shed leaves. His intent was not to produce something that could be used as a defoliant in a future war, and the rest of his research moved in different directions. But scientists at Fort Detrick later discovered his 1943 dissertation and built a research program around the possibility of using what he had discovered. Galston became an activist when he discovered that his own work had been used in this way.11

The mobilization of science and technology during World War II makes the ambiguity of "civilian" and "military," when applied to science and technology, even more obvious. All kinds of knowledge and expertise were shifting from one category to another. Things were mobilized and in motion. And as it turned out, almost everything about nature had a potential defense (or offense) dimension. Planners at the OSRD recognized this simple fact, and exploited it.

Vannevar Bush was the key architect of the OSRD. Years later, in assessing his own role, he noted that "there were those who protested that the action of setting up the National Defense Research Council was an end run, a grab by which a small company of scientists and engineers, acting outside established channels, got hold of the authority and the money for the program of developing new weapons. That, in fact, is exactly what it was."12 This "end run" was defined fundamentally by the values of Vannevar Bush. He didn't trust the federal government and didn't want science to be under government control. That is why the postwar consequences of his very successful mobilization of science in World War II are what historian Larry Owens called ironic: Bush brilliantly led science and technology somewhere he did not want it to go. He was explicitly afraid of scientists as beholden to
the state, and his own success facilitated a transformation in government support and military investment that made scientists more, rather than less, dependent on state support and more, rather than less, embedded in military systems.

Bush was an electrical engineer trained first in mathematics, with a PhD in electrical engineering from MIT. His laboratory in the 1920s and 1930s began to design and build analog computers, that is, computers that represented data with some kind of physical system rather than digitally. His Differential Analyzer, operational in 1931, used a system of gears and cams driven by steel shafts to obtain approximate but practical solutions to problems that were seen at the time as difficult. It began to be employed to solve engineering and physics problems, and it was used in World War II to produce ballistics tables.13
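The firing tables the Analyzer helped produce amount to numerical solutions of the differential equations of projectile flight. A minimal modern sketch of that kind of computation appears below; the quadratic drag model and every constant in it are illustrative assumptions rather than the historical ballistic formulas, and the machine itself integrated continuously with rotating wheels and shafts rather than in discrete steps.

```python
# Illustrative sketch only: the kind of computation behind a firing table.
# The drag model and all constants are simplifying assumptions for
# illustration, not the formulas actually used at Aberdeen.
import math

G = 9.81      # gravitational acceleration, m/s^2
K = 0.00005   # assumed drag constant per unit mass, 1/m

def horizontal_range(elevation_deg, muzzle_velocity, dt=0.01):
    """Integrate projectile motion with quadratic air drag (Euler steps)."""
    theta = math.radians(elevation_deg)
    vx = muzzle_velocity * math.cos(theta)
    vy = muzzle_velocity * math.sin(theta)
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= K * speed * vx * dt          # drag opposes horizontal motion
        vy -= (G + K * speed * vy) * dt    # gravity plus drag
        x += vx * dt
        y += vy * dt
    return x

# A miniature "firing table": range in meters at several elevations.
for elev in (15, 30, 45, 60):
    print(f"{elev:2d} degrees: {horizontal_range(elev, 600.0):8.0f} m")
```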
He became an engineering dean at MIT in 1932 and in that role began to be involved in national politics, serving as chair of a committee that examined the patent system for President Franklin Roosevelt. In 1939, when German troops invaded Poland, Bush proposed that Roosevelt should create a structure to help bring scientific expertise to the attention of military planners. In June of 1940, Roosevelt appointed Bush to lead a new National Defense Research Committee (NDRC) "to coordinate, supervise, and conduct scientific research on the problems underlying the development, production, and use of mechanisms and devices of warfare." A year later, it became the somewhat grander Office of Scientific Research and Development (OSRD). The OSRD was the agency most involved in mobilizing science for war in the United States. Other government and military bureaucracies were also involved, and sometimes they operated independently of OSRD oversight. But no other single institutional nexus matched the scope and scale of the OSRD program.14 While the Manhattan Engineer District spent more than OSRD, it was both more focused (on building a particular kind of bomb) and more decentralized (with thirty-seven different institutions and 120,000 employees).

The problems that science seemed likely to solve were wide-ranging. The original NDRC (created in 1940 and continued in 1941 as a subcommittee of OSRD) had divisions focused on research in armor, fuel, communication, ordnance, and bombs. The OSRD, with more funding and a broader agenda, added an expansive Committee on Medical Research (CMR) and divisions on insect control, ballistics, subsurface warfare, radar, camouflage, metallurgy, and fire control, as well as panels on applied psychology, applied mathematics, problems of the tropics, and vacuum tubes. After the war, the OSRD generated dozens of specialized reports about the research undertaken during the war, most of them book length. These texts document the remarkable technical yield of the project.15

As head of the OSRD, Bush was the dealmaker, an idiosyncratic, essentially conservative critic of the New Deal, building mobilization around a deep suspicion of the federal patronage that made mobilization possible. Opposition to his views could also produce a dislike of him personally. He was extremely self-confident, in ways that sound elitist, masculine, and privileged. Bush said that a professional is a person (he would have said "a man") who can and should take charge of society because the professional has superior specialized knowledge. A small, intelligent minority benevolent in intention, he proposed, should rule everyone else. Elites were (by definition?) disinterested and authoritative and could safeguard others who knew less. He was convinced that service, rather than financial return, motivated the true professional.16

In his management of the OSRD he was therefore cautious and restrained. He viewed its mission in narrow terms. The social sciences, for example, were not a top priority for him because he viewed them as less relevant to combat priorities. He created an Applied Psychology panel, but the leadership at the Social Science Research Council viewed Bush as uncommitted to supporting research in the social sciences. Even many biologists found Bush narrow in his views of what constituted militarily relevant biological research. And, predictably, given his priorities and his worldview, the OSRD favored elite institutions, with MIT benefiting the most.

As Owens has demonstrated, the OSRD worked well partly because of the contract, which Bush used as a bureaucratic way of establishing a proper relationship between scientists and the state. Bush didn't want scientists to be in active duty as enlisted men or officers. Nor did he want them to be subservient in the ways that those receiving grants or fellowships (fundamentally, "gifts") might be seen, needing to win the approval of government officials. His solution was to use the contract as a way to equalize power. In Bush's eyes, a contract established an equal relationship between two independent parties. It was closer to a marketplace arrangement than a sponsorship, and it protected everyone. Most particularly, it protected those involved from possible arbitrary demands from politicians, bureaucrats, or generals. All contracts were on a no-profit, no-loss basis in order to avoid the image of
wartime profiteering. Contracts could clearly define what each of the partners in any agreement would be expected to do. The goal was to keep the arrangement honest. Contracts could be made only between independent and equal bodies or individuals. As Owens has put it:

The OSRD's ultimate output was weapons, but the index of its day-to-day operations was the thousands of contracts mass-produced by its administrative machinery, like products off the assembly lines of the nation's factories. Given the unprecedented business of this crypto-federal scientific organization, the large numbers of political amateurs who worked for it, the prickly relationships that existed between agencies, and the rapid expansion of its scientific business, OSRD was a managerial accomplishment of the highest order.17
The question of what exactly was being contracted for was, however, a bit unclear. Gradually, the OSRD chose to write contracts with built-in flexibility. Contractors should agree to work hard to make a discovery, rather than to deliver a specific item by a specific time. This meant there were no deadlines. Such contracts were easier to extend or amend as the realities of discovery, innovation, and failure evolved.

Bush's strategy of the contract helped to fuel the engine that made the Cold War state, perhaps even the rise of the military-industrial complex that later vexed President Eisenhower. In terms of making new knowledge, and applying it rapidly in real time to the problems of a global war on two fronts, it was a stunning success. In 1943 the New York Times called the Office of Scientific Research and Development a vast test tube army dedicated to winning the war, with "more than 100,000 trained brains working as one." Those trained brains working as one genuinely transformed not just war, but the world.

It wasn't always a matter of new knowledge. In many intriguing cases, things that had been discovered, described, and published earlier were leveraged into practical, large-scale achievements because of the demands of war. At the very least, these cases suggest how important context is for discovery. Things can be known but not important, so much so that they are virtually ignored. Alexander Fleming, for example, won the Nobel Prize for penicillin after the war, but his original paper on the antibiotic properties of the mold he identified
as Penicillium notatum in 1929 did not seem to herald a medical revolution to anyone, Fleming included. And the incredible pesticide DDT was synthesized and described by a PhD student in 1874 and not used to kill bugs until the late 1930s, when it became a lousicide used on war refugees to control disease. Only after 1943 did it become an environmental pesticide. Similarly, the insights into radioactivity that began with the French physicist Marie Curie in 1903 were not seen as relevant to national security until the 1930s. In this section, I explore how knowledge was repurposed to play a critical role in Allied pursuit of the war. All three of these initiatives—penicillin, DDT, and the bomb—were profound technical successes. All had unexpected, problematic, long-term consequences.

The mass production of penicillin became one of the most important medical innovations of the twentieth century. It fueled the development of many other antibiotics, as a systematic effort to test molds, bacteria, and fungi collected all over the world revealed other microorganisms with antibiotic properties. The availability of penicillin transformed the treatment of many diseases. And its successful production depended on the OSRD Committee on Medical Research and its careful navigation of an international Allied scientific and industrial network.

The story of the discovery and mass production of antibiotics has sometimes produced what I would call narratives of resentment. Participants and historians have reconstructed credit, carefully parsing out who did what and who should have been given more attention or credit, or which country should be seen as primary. The core narrative of resentment focuses on the UK vs. the United States. Scientists in Britain (including Fleming) figured out that penicillin would be important to the war effort. Scientists, engineers, and industrialists in the United States turned that promise into mass production by submerged deep-tank fermentation. In the most resentful version of this narrative, the United States "stole" penicillin from Britain.18

The Scottish-born bacteriologist Alexander Fleming, working at St. Mary's Hospital in London, discovered penicillin by accident. Having served as a physician at the front during the First World War, Fleming had long been engaged with questions of infection and sepsis. In 1923, he discovered the very mild antibacterial properties of human saliva and mucus. He found that human saliva contained an enzyme, lysozyme, which broke down bacterial cell walls. It was a form of organic defense (mouths are full of bacteria). In 1928, he was growing staphylococci colonies, as part of writing a textbook on bacterial
infection, and a number of his plates, necessarily exposed to the air in the course of being frequently examined, had become contaminated by various micro-organisms. On one of these contaminated plates, the bacterial colony was becoming transparent and was not growing near a large colony of the common grey-green bread mold of the genus Penicillium. Fleming began to investigate what was happening. He grew the bread mold and collected the liquid it produced.

Fleming's "mould juice" was difficult to extract and seemed to be slow-acting. He did not test its efficacy against bacteria in living organisms, though he did inject it into a rabbit only to test for toxicity. He published a paper about his findings in 1929 in the Journal of Experimental Pathology. In that paper he named the new substance penicillin, after the mold genus Penicillium, from which it was derived. He seems to have seen the name as an editorial convenience, as he said in his paper. In order to avoid "the repetition of the rather cumbersome phrase 'mold broth filtrate' the name penicillin will be used. This will denote the filtrate of a broth culture of the particular Penicillium with which we are concerned." Fleming closed his paper by suggesting that penicillin might be useful as an antiseptic for surface wounds.19

Many physicians and scientists then in leadership roles had seen first-hand the risks and costs of infection at the front in the First World War. Perrin Selcer's work on wound treatment in the front lines captures the intensity and complexity of the debates during and after the war.20 In 1938, as another war loomed ominously ahead for Britain and the world, a group in Oxford, at the Sir William Dunn School of Pathology, began looking systematically for antibacterial substances. A Jewish refugee from Nazi Germany, the biochemist Ernst Chain, was newly employed in the laboratory. He had arrived in England with no resources or family, and Howard Florey, the prominent Australian-born pathologist who ran the laboratory, hired him. This was at least partly because Florey was persuaded that the life sciences and the chemical sciences were or should be in productive conversation.21 Chain started looking at all published work on antibacterial substances. He read Fleming's earlier paper about lysozyme, the substance found in tears, mucus, and saliva that had antibacterial properties. Chain was able to figure out the chemistry of how it worked and, concluding that he had solved the scientific problem there, moved on to other antibacterial agents. He then read Fleming's 1929 paper about penicillin and persuaded Florey (who was much more powerful and able to lead a significant
scientific investigation) that penicillin might be important. Eventually Florey agreed. By May 1940, they had collected enough active penicillin to test it on eight mice infected with strep. The account of the British biochemist Norman Heatley, one of the key team members who figured out how to extract and purify penicillin, tells the story well: "After supper with some friends, I returned to the lab and met the professor to give a final dose of penicillin to two of the mice. The 'controls' were looking very sick, but the two treated mice seemed very well. I stayed at the lab until 3.45 a.m., by which time all four control animals were dead. It really looks as if penicillin may be of practical importance."22

The next step was human trials, but Florey's group did not have enough penicillin to test it on people who were actually ill. At this stage the only option for growing the mold and producing penicillin was surface fermentation, which involved growing the mold on the surface of a nutrient broth, generally with sugar of some kind, and thereby inducing the production of the yellow liquid that appeared on its surface and had antibiotic properties. The mold grew best in shallow containers, and Heatley used flasks, bottles, trays, dishes, and even old-fashioned bedpans with a lid and spout borrowed from the Radcliffe Infirmary. He eventually designed and managed to obtain 400 stackable rectangular ceramic vessels in which the medium could be easily changed. This was the first effort at the "mass production" of penicillin.23

The group's 1941 paper on trials of penicillin with human subjects was entitled "Further observations on penicillin." It was published in the Lancet and described results with a total of only ten patients. Two were suffering from serious systemic infection and died after showing modest improvement with the administration of penicillin. Eight others were healed. The results were promising enough to justify a major program of mass production, but efforts to interest pharmaceutical firms in England were unsuccessful. There were probably many reasons for the lack of interest from leadership in the British pharmaceutical and fine chemicals industries, but certainly among them were the already pressing demands of the war in Europe.

Florey reached out to the United States. The chairman of the OSRD's committee on medical research, University of Pennsylvania chemist A. N. Richards, was a personal friend. He helped Florey and Heatley make the connections they needed in the United States. And here is where the story begins to
get more interesting, in terms of understanding how mobilization worked in the United States. The development of the mass production of penicillin by deep tank fermentation involved three major companies (all of which agreed in advance to make no patent claims), at least two government committees, scientific researchers in private labs, scientists at several universities, and even the passive cooperation of a fruit vendor in Peoria, Illinois, who sold a rotting cantaloupe to a USDA lab assistant. This Peoria cantaloupe, a chance recruitment to national defense, yielded the most productive strain of the mold.

The key technological element was deep tank fermentation. This is the process of growing a bacterial agent, fungus, or mold submerged in a constantly stirred nutrient liquid, with controlled temperatures and different forms of sugar or salt, in order to cause the living organism to produce something desirable. Deep tank fermentation was already in use by the 1920s for the production of citric acid. A fungus, Aspergillus niger, could be induced to make citric acid, which is found in many plants including lemons and limes, at commercially productive levels. Pfizer, a small chemicals company in Brooklyn, had mastered this process and was by the 1930s one of the world's major suppliers of citric acid. The USDA's Northern Regional Research Laboratory in Peoria, Illinois, also had extensive experience with submerged fermentation. USDA scientists thought that it might be possible to grow the penicillin mold in this manner. They created a nutrient mix of sugar, milk, salts, and minerals, and adjusted the temperature and the rate of stirring to maximize the amount of the active agent that was produced. Growing penicillin by submerged fermentation eventually solved the problem of mass production. Pfizer, already successful with citric acid production, became the main producer of penicillin by deep tank fermentation during the war.

Penicillin transformed Pfizer into a pharmaceutical giant. No other company was able to ramp up the production of penicillin as quickly.24 The company already had the practical, scientific, and craft knowledge needed to succeed at deep tank fermentation.25 Other forms of producing penicillin—by synthesis or by surface growth—were less effective. But Pfizer's success depended on many other institutions and people. The USDA lab at Peoria worked out the first calculation of how to use corn steep liquor as a nutrient broth for penicillin. A geneticist at the Cold Spring Harbor laboratory, Milislav Demerec, irradiated the mold from that Peoria cantaloupe to extract a more
powerful, mutated strain of Penicillium. The OSRD's committee on medical research kept the fragile alliances going, even as tensions arose between Merck, Pfizer, and Squibb; between the Committee on Medical Research and the War Production Board; and between individuals at every level.26

The first major public trial of penicillin came after the Cocoanut Grove Fire in Boston in late November 1942. Cocoanut Grove was a basement nightclub in Boston's Bay Village. When the fire broke out, patrons could not escape, partly because of the way that exits were constructed. The final death toll was 492, and many more were injured and burned. Some survivors were treated for infection with penicillin in an early, public use of the drug that attracted significant publicity. The dramatic success of penicillin led to appeals, sometimes heartbreaking, to President Roosevelt and particularly to First Lady Eleanor Roosevelt, asking that penicillin be made available to sick family members including infants and children. But penicillin was being rationed as a military resource and was not generally available to civilians. Boston University School of Medicine physician Chester Keefer, the head of the War Production Board's Committee on Chemotherapeutic Agents, became the "penicillin czar" who managed rationing. Keefer reviewed every potential subject who could be given penicillin, and oversaw the medical assessment of the new drug. In that sense, he sometimes decided who would live and who would die.27

In June 1944, penicillin was available to all Allied troops for the invasion on D-Day, and less than a year later, in April 1945, it was made available in pharmacies in the United States as a civilian medical resource. The worldwide search for other antibiotics produced by molds, bacteria, and fungi led to bacitracin in 1942, streptomycin in 1944, cephalosporin in 1945, chloramphenicol in 1949, terramycin in 1950, erythromycin in 1952, vancomycin in 1956, and rifampin in 1957. It was a transformative moment in medicine. For those reading this today, there is a fairly good chance that you have benefited from antibiotics more than once. You might also be threatened by antibiotic resistance, as many disease-causing germs have evolved quickly to survive exposure to antibiotics.

Just as penicillin was discovered long before the war and then repurposed in the midst of mobilization, so too was dichloro-diphenyl-trichloroethane (DDT). In 1874 a PhD student studying in Germany, Othmar Zeidler, synthesized DDT. He published a report and did no further work on it. In 1939, the Swiss
chemist Paul H. Mueller found that DDT killed insects. He filed for a basic patent in Switzerland in 1940. When the United States began sending troops to fight in the Pacific after the December 7, 1941, attack on Pearl Harbor by Japanese forces, troop exposure to malaria, typhus, and dengue fever became a critical military problem. Some divisions suffered far more from diseases carried by insects than from combat wounds.

This was not a particularly new phenomenon. From at least the eighteenth century until World War I, disease often caused more troop loss than combat. For combatants, the ratio of disease deaths to wound deaths was 110:15 in the Mexican-American War, 65:33 in the American Civil War, 27:5 in the Spanish-American War, and 19:53 in WWI (excluding deaths from the 1918 influenza epidemic); the sketch below converts these ratios into the share of deaths caused by disease. This metric partly reflects the brutality of World War I weaponry. However, it also reflects improvements in medical treatment and an increasing commitment to disease prevention. By the Korean War, the official US Army count was one disease death to 126 wound deaths. One important element in reducing this disease burden was the control of insects, and it was here that DDT mattered.

Early in the war the OSRD asked the Bureau of Entomology and Plant Quarantine of the US Department of Agriculture to explore new ways to prevent insect-borne diseases. The Bureau devoted its Orlando, Florida, laboratory to this effort. Entomologists searched for quick rather than ideal solutions to the problem. Typhus control was the first priority for the Army. Until 1942, the primary control mechanism for killing lice, which carried typhus, was to steam clothing and bedding. This worked well enough in times of peace, in stable military bases, but at the front it was impractical. It also did not affect lice already attached to the human body. The Army wanted a louse-killing powder that soldiers could carry to the front, and Orlando entomologists sought something that would kill lice quickly, work at low concentrations, produce very high mortality, and endure as long as possible. The combination of rapid and complete killing, low-dose concentration, and persistence was a difficult standard to meet. The laboratory screened 8,000 chemicals and then focused on 400 for further experimentation. Only one chemical emerged as promising for killing lice. This was pyrethrum, which came from the dried flower heads of the chrysanthemum. The bureau recommended a pyrethrum powder to the Army in 1942, and the US Armed Forces adopted it as their standard louse killer.28
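These ratios are easier to compare when converted to the percentage of combat-related deaths caused by disease. A minimal sketch in Python, using only the figures quoted above (the percentage conversion is mine, not the Army's own metric):

```python
# Converting the quoted disease:wound death ratios into the share of
# combat-related deaths caused by disease. The ratios are those given
# in the text above.

ratios = {
    "Mexican-American War": (110, 15),
    "American Civil War":   (65, 33),
    "Spanish-American War": (27, 5),
    "World War I":          (19, 53),   # excluding 1918 influenza deaths
    "Korean War":           (1, 126),   # official US Army count
}

for war, (disease, wounds) in ratios.items():
    share = disease / (disease + wounds)
    print(f"{war}: {share:.0%} of deaths from disease")
```

Read this way, disease fell from causing the overwhelming majority of such deaths (88 percent in the Mexican-American War) to about a quarter in World War I and less than one percent by the Korean War.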
Pyrethrum powder worked against typhus but not against malaria, because it did not kill mosquitoes. The Orlando group began trying to find something that would kill mosquitoes that carried malaria. Usual mosquito control practices involved draining, oiling, or poisoning breeding areas, but again, these practices were not practical on an active military front when an island was to be invaded or in forest warfare. Meanwhile, supplies of chrysanthemums were threatened by the war itself. The United States had imported pyrethrum from Japan, Dalmatia, and Kenya, but the war cut off supplies from all but Kenya. By July 1943, the situation was becoming critical.

There was a defined problem. Insects that carried typhus and malaria had to be killed. An insecticide of some kind was understood to be the solution, and the very high risks of disease on rapidly moving battlefronts justified high-risk pesticides. Most of those engaged in studying and assessing chemical insect killers used criteria that reflected the wartime crisis. Questions of long-term effects and toxicity were more relevant in peace than in war, and if a chemical produced more toxicity or more irritation, its use might still be justified in a military emergency.29

The Swiss manufacturer Geigy sent the USDA a sample of DDT in October 1940. DDT was clearly toxic to insects; it worked at small doses and killed insects for a long time. These were the exact, desired qualities of a new anti-malarial insecticide. Another advantage of DDT was that it was possible to manufacture it in the United States because Geigy had a factory there. But it was not clear that DDT was safe. When it was given to guinea pigs and rabbits in high doses, it could cause convulsions or death. High doses, however, were not necessary. Very low doses could kill insects and larvae, and it seemed that human skin absorbed little or no DDT when it was applied as a dust. In May 1943 entomologists at the USDA recommended that DDT, already in use in Europe, be the Army's official louse powder.30 As Russell notes, this decision did not mean that the Army or the USDA had decided that DDT was harmless. Rather, the hazards had been weighed against the advantages, and the ability to rapidly produce DDT, its persistence in the field, and its high mortality for insects justified its use under these critical conditions. Tests at the National Institute of Health's Division of Industrial Hygiene suggested that DDT was safe for humans when used as an aerosol dust or mist.

During the winter of 1943 and 1944, it proved its field value. Typhus appeared in bombed-out Naples, and Allied health organizations dusted more
than one million civilians with louse powder made with DDT. The epidemic was halted in the middle of winter, and DDT received much of the credit. Publicity about the new chemical promised civilian uses after the war. Reader's Digest proposed that insects would no longer be a problem for farmers and ranked DDT as one of the greatest scientific discoveries of the war. It would wipe out disease-causing insects, of course, but also other pests like flies, cockroaches, and bedbugs.31

Journalists seem not to have noticed that evaluating a pesticide for civilian use in peacetime would and should have different rules. The traits that made DDT ideal for military use during an active war—persistence and broad killing power—were the same ones that legitimately concerned scientists at the Bureau of Entomology and Plant Quarantine. Any persistent chemical, if used on food crops, could and probably would leave poisonous residues. And a pesticide that killed everything would kill both pests and useful insects that preyed on them and helped to control them. Entomologists knew during the war that DDT posed risks, to crops and to the environment. After the war, of course, DDT was widely used for crops and for insect control in cities until it was identified as having environmental consequences (the earliest scientific papers were in the late 1940s) and posing a fundamental public policy risk that could damage all forms of life (after Rachel Carson's book Silent Spring in 1962).

While the wartime production of penicillin and DDT are well known, perhaps the most famous mobilization story of the Second World War is that of the construction of the atomic bomb. This was accomplished by a group of engineers, mathematicians, and physicists at Los Alamos, New Mexico, and at other installations across the United States and in other Allied nations. Like penicillin and DDT, the bomb project involved knowledge that was repurposed for defense, in the circuitous path from x-rays and radium to nuclear weapons.

In 1895 the German physicist Wilhelm Roentgen produced and published an x-ray image of his wife's hand—the clearly visible bones and wedding ring, and all the soft tissue a blur. This astonishing image appeared in newspapers all over the world. Roentgen had been working with the energies produced by cathode-ray tubes when he found x-rays (so named because he did not know what they were). Their existence suggested provocative possibilities of other unknown and unseen sources of energy, and when Henri Becquerel in Paris found natural radioactivity in uranium salts, he realized that it differed
from x-rays. Following Becquerel's insights, the Polish physicist Marie Curie and her husband Pierre, in Paris, worked to isolate two new elements, radium and polonium, from uranium ore. In 1903, Becquerel and the two Curies shared the Nobel Prize.32 Ernest Rutherford, the New Zealand–born physicist then at McGill University in Canada, thought the results suggested that the atom was not stable. Rutherford managed to produce the first artificially induced nuclear transmutation, which raised more questions about the nature of the atom.33

Studies of radioactivity earlier in the century were thus part of exciting new insights into the nature of matter. Mass and energy were clearly related, and energy was something people needed. By 1933, "atomic energy" was the subject of some public speculation, with proposals for "atomic street lamps" and even for explosives. The novelist H. G. Wells mentioned atomic energy in his 1914 novel The World Set Free. But there was no systematic program to build a bomb, no deadline, no national directive. That happened only after a critical scientific paper in 1939 demonstrated that the energy released by splitting atoms vastly exceeded, atom for atom, the energy released in the chemical reactions used in explosives (a rough comparison of the scales appears below). This paper, by the Jewish Austrian-born physicist Lise Meitner and her nephew Otto Frisch, proposed that the uranium nucleus underwent fission (a word borrowed from biology)—essentially, splitting. It appeared in Nature in March 1939. As Meitner's biographer Ruth Sime put it, "In just over a page in Nature, they pictured the fission of uranium as the division of an 'essentially classical' liquid drop, estimated a vanishingly small surface tension for large nuclei, calculated the energy release from uranium fission, predicted that thorium also undergoes fission and accounted in all major respects for the four years' work that Meitner, Otto Hahn and Fritz Strassmann had done as a team and that had culminated in the finding of barium and the discovery of fission."34 By 1939, Meitner, who was Jewish, was a refugee from Nazi Germany, newly settled in Sweden. Like many other women scientists, both in the past and in the present, she was not generally given the credit she deserved for her path-breaking work on fission. The Nobel Prize went to Otto Hahn in 1944.35

Shaping the consequences of the publication of this 1939 paper was the recent migration of refugee scientists from Germany in 1933 and later, in the wake of new laws that forced the removal of Jewish scholars, scientists, and intellectuals from professional positions. Albert Einstein had simply decided not to return to Germany in 1933. Enrico Fermi fled Italy with his Jewish wife.
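The disparity of scales can be made concrete with standard textbook values; none of these numbers appear in the chapter itself, and the 200 MeV figure is the conventional energy release per uranium fission:

```latex
% Standard textbook values (not figures given in this chapter):
% energy per uranium-235 fission vs. per molecule of a chemical explosive.
E_{\mathrm{fission}} \approx 200\,\mathrm{MeV} = 2\times10^{8}\,\mathrm{eV},
\qquad
E_{\mathrm{chemical}} \approx 5\,\mathrm{eV}

\frac{E_{\mathrm{fission}}}{E_{\mathrm{chemical}}}
  \approx \frac{2\times10^{8}}{5} = 4\times10^{7}
```

Since a uranium-235 nucleus (about 235 atomic mass units) and a TNT molecule (about 227) weigh nearly the same, the advantage per unit mass is also roughly ten million-fold, which is why a single bomb could do the work of thousands of tons of conventional explosive.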
Leo Szilard, Hungarian-born, left Germany in 1935. Many other outstanding scientists lost their positions and found new institutional homes in the United Kingdom, the United States, and other countries. These refugees from fascism and the murderous Nazi state had a particular stake in preventing a German atomic bomb.

A few months after the Meitner-Frisch paper, in August 1939, Leo Szilard, with support from others, persuaded Einstein, then established at the Institute for Advanced Study in Princeton, that he alone could reach President Franklin Roosevelt. Einstein duly wrote to the president, who received the letter in October 1939. This letter explained how high the stakes were. It suggested that a bomb was possible and that the United States needed to be ahead of its enemies in the coming war. While the United States was not yet a formal participant in the war, it was gearing up. Roosevelt created a committee to explore resources for building such a bomb. By June 1940, there was a general scientific and policy consensus that whoever possessed this hypothetical bomb would have a tremendous advantage. The United States leadership, scientific and political, committed to build it. On the day before the Japanese attack on Pearl Harbor in December 1941, Roosevelt increased funding for the new weapon. It had been $6,000 in 1939 and was $1.2 million by the end of 1941. In June 1942 Roosevelt told his inner circle that the bomb would be built—with a rough prediction of completion in 1945. He turned it all over to Leslie Groves, the US Army Corps of Engineers general who was then overseeing construction of the Pentagon for the Army.36

The cast of characters who led and participated in the program to build the atomic bomb from 1941 to 1945 has attracted a good deal of public interest, led to several made-for-television and Hollywood movies, and inspired much scholarly and popular history, some of it good, some less so. Known for security reasons as the Manhattan Engineer District (intended to sound boring), and run by the S-1 Committee of physicists, the project was goal-directed science on a grand scale. Its most famous research site was the mesa at Los Alamos, eighty acres in New Mexico, forty miles northwest of Santa Fe, also known as Site Y, The Hill, and PO Box 1663. But there were also laboratories run by Harold Urey at Columbia University, by Arthur Compton at the University of Chicago, and by Ernest Lawrence at the University of California. And there were critical production facilities at Oak Ridge, Tennessee, and Hanford, Washington.
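The scale-up implied by those budget figures is worth pausing over; a trivial check of the arithmetic, using only the dollar amounts quoted above:

```python
# Growth of US atomic bomb funding, from the figures quoted above.
budget_1939 = 6_000       # initial exploratory funding, 1939
budget_1941 = 1_200_000   # by the end of 1941
print(f"{budget_1941 / budget_1939:.0f}x increase in about two years")  # 200x
```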
General Groves chose the theoretical physicist J. Robert Oppenheimer, then at the University of California at Berkeley, to run the scientific side of the project.37 Oppenheimer recruited his colleagues to come to New Mexico, promising unlimited funding and an untrammeled work environment. His spiel to those he sought to recruit conjured a mixture of brilliant science and boyish fun. High on the mesa would be gathered the best minds in the field. Those who participated would make great discoveries while also living in a beautiful setting far above the world of everyday common concerns. There would be pack trips into the mountains and breakthrough discoveries. Oppenheimer was a good salesman. In April 1943, fifty scientists assembled at Los Alamos to learn what they were doing there. They had been recruited with minimal explanation of the project.

Groves meanwhile was working very effectively to create the production facilities and the trained crews that would be needed to make and deliver the new weapon. He purchased land for a uranium production plant at Oak Ridge in Tennessee and signed agreements with Chrysler and Union Carbide. Under these agreements they would build and run a gaseous diffusion plant for the production of uranium 235. In early 1943 another facility for the production of plutonium opened in Hanford, Washington, on the Columbia River. Meanwhile the US Army Air Forces began training B-29 pilots with large dummy bombs at Wendover Field, Utah.

In the end, the Manhattan Engineer District involved thirty-seven institutions. It employed 120,000 people, including twenty of the world's top thirty-three physicists. Producing the bomb cost $2 billion (almost $32 billion in today's dollars) and took three-and-a-half years. There were facilities in nineteen states and in Canada. Organizations and private industries involved included DuPont, which built Hanford, now one of the most polluted places on the earth, and Chrysler and Union Carbide. The project was about the size of the modern US automobile industry, and it was secret. Producing the bomb involved the labor of physicists, chemists, mathematicians, and engineers, but also elementary school teachers, bricklayers, drivers, cleaning staff, and guards. Some of the research was conducted in elite research universities, some of it in private industry. It was a remarkable example of mobilization.

By the summer of 1944, after D-Day, Allied victory began to seem very likely, perhaps as soon as within a year. At that point, some of those working on the atomic bomb began to wonder if they should stop. The British physicist
Joseph Rotblat, later a winner of the Nobel Peace Prize, actually left.38 Like many others, he foresaw a possible arms race if the bomb were used and thought that international control and complete openness could perhaps slow that down. Einstein asked to speak with Roosevelt about this issue, and about risks of an atomic weapons race, in March 1945, but Roosevelt died on April 12 at Warm Springs, Georgia, before the meeting could be scheduled.

As Truman took over as president, he learned about the bomb. In the midst of a continuing world military crisis—Berlin surrounded and about to fall (Hitler committed suicide on April 30, eighteen days after FDR died), and Japan still a threat—Truman learned that the United States was about to have at its disposal a weapon of unparalleled potential force. Vice-President Truman had known nothing about the vast program to build an atomic bomb. FDR, with whom he was not particularly close, had not consulted him or provided him with any guidance about the appropriate use of the new weapon. Truman probably never understood the complex forces unleashed at Los Alamos. Certainly he never openly acknowledged his own role in making a difficult situation worse.

The atomic bomb training units of the 509th Composite Group of the Army Air Forces were sent to North Field, on the island of Tinian in the Marianas. Force teams and prepared combat planes arrived with an atomic bomb. They also had photographic equipment and radiation measuring devices. They dropped bombs on Hiroshima at 8:15 a.m. on August 6, 1945, and Nagasaki at 11:02 a.m. three days later. There was no specific order from the president to drop the second bomb. It was dropped because the weather cleared over Nagasaki.

The great mobilization of the wartime years produced many innovative technologies. It also demonstrated the productivity of a system of federal support for science. As my focus suggests, this productivity had civilian and industrial benefits. Both DDT and penicillin animated new industries and generated industrial profits. The atomic bomb promised a future of atomic energy. But all also had unintended effects. DDT and penicillin were embraced with an enthusiasm that ultimately reduced their value. It is almost as though they were valued in the wrong ways—seen as spectacular and wonderful, but not treated as such. The atomic bomb, perhaps seen as less wonderful, ultimately contaminated the world.

Both DDT and penicillin (and antibiotics in general) should have been used with extreme caution. The risks of DDT were not completely understood in
the 1940s, but scientists associated with the USDA urged caution and knew that the pesticide might have environmental effects that would, in the long run, prove costly. But industrial leaders and farmers wanted unrestricted use of the promising new chemical, and they drenched fields, cities, and beaches with it across the United States.

Similarly, the astonishing discovery of penicillin was fundamentally squandered, not valued enough to be protected or used carefully. The deep tank submerged fermentation of penicillin produced vegetative debris as the active mold juice was extracted from the broth of the tank. This vegetative leftover product, the plant itself, began to be sold fairly early as possible feed for cattle. Cattle are notoriously unpicky eaters, and selling cattle producers the mold debris, which the animals happily consumed, was a way for the penicillin producers to profit from the fermentation and for cattle breeders to acquire an inexpensive source of food. What none of those involved in this early phase perhaps understood was that the residual penicillin in the Penicillium mold waste would make cattle grow larger. Growers began to realize that cattle fed penicillin waste grew larger than other cattle. Whatever small trace amount of active penicillin was left in this vegetative matter, it was enough to have an impact. This led to the widespread use of Penicillium vegetative waste in cattle feed and indeed to the increasing use of antibiotics in general in industrialized agriculture. Some estimates propose that most antibiotics produced today are used in agriculture—not for animal diseases but to promote growth or prophylactically to lower the risk of future disease or injury.39 Animals fed antibiotics can be raised in smaller enclosures and more stressful and intensive circumstances. After 2000, some bans on antibiotic use in animal production were passed in the United States and other places, but dangerous forms of antibiotic resistance had already emerged.

Antibiotic resistance was documented fairly early after the war. Penicillin and other antibiotics placed evolutionary pressure on bacteria and shaped bacterial survival. The rapid evolution of many forms of deadly bacteria resistant to an ever-evolving arsenal of antibiotics was facilitated by the overexposure produced by human practices. And clinical decisions about antibiotics may have been less important than agricultural use. Massive efforts to find new antibiotics around the world, by collecting molds, bacteria, and fungi in dirt and in plant waste, succeeded as many new antibiotics were discovered, processed, mass-produced, and eventually
synthesized. Yet like penicillin, many of these antibiotics have become less effective over time, and some observers now predict a post-antibiotic era, in which bacterial infection again becomes the cause, as it once was, of an estimated 30 percent of all human deaths.

A similar phenomenon occurred as DDT was widely used on crops in the late 1940s. Insect resistance to DDT was already a recognized farm problem by 1952. Only those individual bugs most resistant to the chemical survived each spraying, and they became the dominant progenitors of the entire next generation. The few that did survive DDT, for whatever reason, suddenly had no competition. It only took a few years for resistant strains to appear (a toy model of this selection dynamic appears at the end of this chapter). Incredibly effective in fighting malaria during the war and in controlling insects in crops immediately after, DDT was a powerful and important technical discovery. Manufacturers wanted to sell DDT, and farmers wanted to benefit from it. Rumors during the war of its effectiveness and the promise that it would increase crop yields were seductive and enchanting. But DDT had effects on the entire environment, and biologists began to write about its negative consequences in the 1950s. In trying to kill pests, farmers were also killing beneficial insects and the predators that ate the pests: lady beetles, birds, and snakes. Early kills of robins, presumably because they had consumed insects or worms killed by DDT, dramatized the consequences. In 1962, Rachel Carson used DDT as one example of environmental damage caused by scientific and technological progress and by the indiscriminate use of pesticides.40 Eventually, the new Environmental Protection Agency, created in 1970, placed tight controls on the use of DDT.

A 1947 propaganda film still viewable on YouTube today shows an entomologist eating a bowl of porridge heavily sprayed with DDT (I routinely have my students watch this) in order to persuade viewers that DDT was safe. A generation taught to embrace this technology had to learn a different way to see it. Today the limited use of DDT in circumstances of high malaria risk is generally promoted and seen as acceptable, but the widespread and indiscriminate spraying of fields is no longer common practice in the United States.

The construction and use of the atomic bomb, however, had the most devastating long-term impacts. The anthropologist Joseph Masco has eloquently suggested that

by the mid-1950s, it was no longer a perverse exercise to imagine one's own home and city devastated, on fire, and in ruins; it was a core act of
governance, technoscientific practice, and democratic participation. Indeed, in the early Cold War United States, it became a civic obligation to collectively imagine, and at times theatrically enact through "civil defense," the physical destruction of the nation-state.41
Masco here captures the political and cultural dimensions of weapons testing in the Cold War and joins many other scholars in tracking the long-term physical devastation that nuclear weapons production and testing generated. This "nationalization of death," he suggests, built the nation through the "contemplation of nuclear ruins" and created a new citizen-state relationship mediated by nuclear fear.42

As these cases and many others like them suggest, the mobilization of technical knowledge had mixed consequences. Mobilization involved collaborations across institutions with very different cultures and priorities. Military necessity became a reason even for competitors to collaborate. National emergency justified the mixture of private industry, national laboratories, and government bureaucrats who managed mobilization. Leadership at OSRD was sometimes able to tame the tensions in the mobilization of science between potential future profit and patriotism, secrecy and openness, autonomy and control, immediate interests and the future, and conservative strategies and high-risk strategies.

Most of the scientific research conducted under the control of the OSRD was carried out by civilian experts. It involved the systematic use of scientific personnel who were protected and given deferments. Those engaged with OSRD projects were not asked to go to the front or die for their country but to discover for their country. Some scientists, engineers, and medical experts did work on or near the front lines in many different capacities as technical advisers, field overseers, and so on, but their contributions were understood to be technical. The skillful mobilization of science that the OSRD orchestrated suggested that it was possible to order science on demand. If you had the right people in the room and enough money to give them, you could get whatever you wanted. Scientists and engineers working with the OSRD produced a cornucopia of useful technologies, from radar to camouflage. They showed that experts could solve military problems. In the process, science was more thoroughly militarized, leading to problems of ethics, morality, and professionalism that came to haunt the scientific community after 1945.
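The selection dynamic described in the resistance passages above (each spraying spares mostly the resistant individuals, who then repopulate) can be illustrated with a toy calculation. This is a deliberately crude sketch: every parameter below is invented for illustration, and real insect or bacterial population genetics involves far more than a single survival ratio.

```python
# Toy model of resistance under repeated spraying. A rare resistant
# variant survives each treatment far more often than its susceptible
# neighbors; the survivors found the next generation.
resistant = 0.001          # initial resistant fraction (assumed)
surv_resistant = 0.90      # survival per spraying if resistant (assumed)
surv_susceptible = 0.05    # survival per spraying if susceptible (assumed)

for spraying in range(1, 6):
    r = resistant * surv_resistant
    s = (1 - resistant) * surv_susceptible
    resistant = r / (r + s)  # resistant share of the survivors
    print(f"after spraying {spraying}: {resistant:.1%} resistant")
```

With these made-up numbers, resistance rises from one in a thousand to nearly the whole population within about five treatments, consistent with the observation that only a few years of heavy use sufficed to produce resistant strains.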
5 Unforgettable Fire
MY TITLE FOR THIS CHAPTER, UNFORGETTABLE FIRE, IS TAKEN FROM THE TITLE OF A PUBLISHED BOOK OF
drawings and paintings by atomic bomb survivors that first appeared in English in 1977. Survivors were asked to recall what they remembered of the immediate aftermath. Their striking paintings, sketches, and drawings presented the bombings through visual memories and short verbal descriptions of what they saw and felt, including anguish, despair, and hope. The result captured the terrifying landscape of bodies, fires, and people who died or survived in the wake of the attack.1

These artistic creations provide one way of understanding an experience that occupies a special place in geography, time, and history. In the technical record and also in the significant artistic and literary record, Hiroshima and Nagasaki stand for a potential human future. The cities are exemplary, in the sense that what happened there might happen elsewhere. This possibility has been repeatedly rehearsed in science, science fiction, film, and civil defense planning, and in the historical, political, and humanistic writings responding to the bombings. How this future looked to different observers depended on their standpoints. The questions that mattered to different observers depended on what they expected the two cities to reveal, prove, or mean. Scientists
sought to extract one kind of knowledge, air power theorists another kind, and historians a different kind. When Japanese experts compiled their own accounts for public review some forty years after the bombings, they framed the damage and suffering in terms of a "profound longing for peace."2 The events in these two cities, for this Japanese group, suggested that the Cold War arms race had to stop.

It is important to recognize that while the weapons were unique, the procedures associated with the damages they produced were conventional. The cities were administratively managed like all other cities subjected to Allied bombing campaigns, and the bombs were treated exactly like other military technologies that were tested in the field in real time during the Second World War. The United States Strategic Bombing Survey (USSBS), created by the orders of US Secretary of War Henry L. Stimson in 1944, was charged with conducting a "scientific investigation of all the evidence" of the effects of bombing technologies and strategies in Europe and the Pacific. The research of the USSBS would help to evaluate "the importance and potentialities of air power as an instrument of military strategy, for planning the future development of the US Air Forces, and for determining future economic policies with respect to the national defense."3 By June 1945 the Survey had completed its field evaluation of Allied strategic bombing against Germany and had grown to more than 500 civilian analysts and 300 military analysts. These experts quickly turned their attention to Japan.

Some have suggested that the two cities were bombed as experiments, a word that would usually refer to a formal and controlled test of a hypothesis. I would suggest that if Hiroshima and Nagasaki were experiments, then so too were Dresden, Berlin, Hamburg, and Tokyo. I have proposed throughout this study that burned and bombed cities and destroyed human bodies became crucial scientific sites for the production of new knowledge—what I call collateral data—and certainly this was true for Hiroshima and Nagasaki. The two cities were the focus of significant scientific research by Japanese and US physicists, geneticists, psychologists, botanists, physicians, and other experts. Perhaps there are some elements in these activities that reflect the scientific method and the field experiment, but in practice, twentieth-century warfare was a grand experiment, a technically driven program of knowledge production in which damage became a resource for new insights. The interest in testing weaponry was a general property of bomb sorties, and the long lists of publications by the bombing surveys demonstrate record-keeping and
record-collecting practices of military data retrieval that were not unique to these two cities. Their productivity as places saturated with knowledge reflected the more general process of knowledge production in war. Damage becomes a guide to further damage (how should cities be bombed in the future?) and to protection (how could other cities and the people in them prepare for atomic warfare?).

Some have proposed that the bomb was dropped because of racialist ideas about the Japanese (and that it would never have been dropped on Germany),4 some that it was the first act of the Cold War rather than the last of the Second World War,5 some that it sped up the end of the war and saved the lives of many Allied soldiers,6 and some that it was unnecessary and uniquely cruel (as opposed to fire bombing and carpet bombing).7 The explanations fall into broad categories: orthodox (to end the war), realist (no more cruel than other bombing campaigns), revisionist (to scare the Soviet Union), fanaticist (a sign of technological fanaticism), and the modern synthetic consensus (to end the war, scare the Soviets, and justify spending $2 billion on the bombs).

Secretary of War Henry Stimson articulated the most orthodox version in his 1947 essay of justification, "The Decision to Use the Atomic Bomb." This essay, which appeared in Harper's Magazine, was actually ghost-written by McGeorge Bundy (Bundy's father was Stimson's Assistant Secretary of State) and vetted by others who wanted a full-throated defense of the decision. It was a consensus-building document that made claims not supported by archival or serious historical documentation.8 Meanwhile, the physicist P. M. S. Blackett argued for the revisionist view that the bomb was not the end of World War II, but the beginning of the Cold War.9

Modifying the requirement of unconditional surrender, so that the safety of the Emperor was guaranteed, might have made a difference, but despite support from both Undersecretary of State Joseph Grew, who knew Japan well, and Secretary of War Henry Stimson, this option was not pursued. Alternative options such as a noncombat demonstration of the bomb never got much traction with Truman and his inner circle (they feared it might not work and that a dud would embolden the Japanese). The Allies could also have continued to pursue various Japanese peace feelers. In July 1945, Grew publicly announced that Japan seemed to be trying to communicate a surrender plan. But Truman's closest advisers, and especially new Secretary of State James Byrnes, thought that using the bomb would put the United States in a position of considerable power at the end of the war.10 Truman probably saw no
good reason not to use the bomb. Japan's cities were already burning. By 1945 the annihilation of urban spaces was not remarkable. Controlling Japan without Soviet interference was one crucial goal (a few months of trying to manage the occupation of Germany with the Soviets made it clear just how undesirable it would be to share Japan with them). Ending the war quickly was another: Allied soldiers died every day.

It is impossible to say that the post-1945 arms race could have been avoided if the weapons had never been used. But it is clear that the military use of the weapons energized Soviet determination to rapidly develop and stockpile their own, and that continuing atmospheric weapons testing exacerbated global tensions and general contamination of land, sea, and human bodies.11 Carbon-14 is a radioactive form of carbon, the element crucial to all life; it occurs naturally in small amounts, but nuclear weapon explosions created it in enormous additional quantities. So much was produced by atmospheric weapons testing in the 1950s that it is still present today (its half-life is about 5,730 years), in every human body and indeed in ecosystems around the world, including the most remote sections of the Amazon River.12 A child born in 2019 will have incorporated in their body carbon-14 left over from the years of open-air weapons testing. It is medically harmless, but it is a sign of the enduring levels of earthly and bodily transformations that the bombs produced.

By August 1945, Tokyo had been firebombed for months with devastating human and material consequences. Japan's industrial production was in tatters. As the Strategic Bombing Survey report stated in 1946, "most of the oil refineries were out of oil, the alumina plants out of bauxite, the steel mills lacking in ore and coke, and the munitions plants low in steel and aluminum. Japan's economy was in large measure being destroyed twice over, once by cutting off of imports, and secondly by air attack."13 Soviet troops were also on their way to join in the Pacific war, as Stalin had promised months earlier at Yalta. Soviet engagement was expected to be a decisive factor in possible Japanese surrender, but the Soviets would not be a welcome presence in Japan.

All these circumstances combined to make sense of the first uses of the bomb on August 6 and 9, 1945, though as Michael Gordin has emphasized, the first and second bombs were used in very different ways. Bombing Hiroshima was the focus of a strategic debate. Nagasaki was bombed because a second weapon was available, and because a weather forecast suggested that bad weather was coming and therefore the unit should take advantage of clear skies immediately. It was bombed without further input from Washington, and not as the result of a strategic calculation or a process of careful review
and consideration. For some observers, Hiroshima's bomb could be justified as part of a military and diplomatic calculation. Nagasaki's could not.14

It would be hard to overstate the chaos in the two cities in August 1945. Survivors suffered losses of families, homes, businesses. The "consumers" of the atomic bomb—the ultimate consumers—were those killed and injured in the bombings. An estimated 48,000 corpses were disposed of, often burned, in Hiroshima. An additional 14,000 were missing. Dead within the first weeks were an additional 9,000, so that a basic minimum for those killed is more than 78,000 people. By 1946, this estimate grew to 151,042. Those who survived suffered both immediate and long-term effects.15

Japan surrendered in mid-August. Most serious scholarship today suggests that Soviet entry into the war the same week that the bombs were dropped was the decisive factor leading to the surrender. The public idea that the bombs ended the war was created by those who were responsible for their use. The internal archival record in both Japan and the United States, carefully studied by Hasegawa, demonstrates that Soviet engagement was more important to the inner circle in Tokyo.16

More than seventy years later, Hiroshima and Nagasaki are still the only two cities to have been subjected to nuclear attack in wartime, and the United States is still the only nation to have used nuclear weapons in active, declared war. Had the two cities been firebombed like almost every other city in Japan, their names would not be shorthand for anything. They would have been burned and destroyed as were Tokyo, Yokohama, Iwakuni, Nagoya, Kobe, Matsuyama, and Osaka. Instead, they came to symbolize almost a break in time.

In this chapter I look at what these two cities meant after they had been bombed. I consider the efforts by US and Allied authorities, and later Japanese scientists, to extract knowledge and lessons from the two destroyed cities. I consider how damage became a scientific resource with broad relevance to many kinds of risk and many purposes. Those studying Hiroshima and Nagasaki selected what they would notice and problematize, and what they would not notice or not see. The Strategic Bombing Survey in the fall of 1945 focused on what the two cities might reveal about the future value of air power. The Manhattan Engineer District (MED) report tracked signs of the bombs' energy and the physical damage caused by blast, radiation, and fire—and emphasized over and over again how well the physicists at MED had done their jobs (it is relentlessly self-congratulatory). Medical specialists
first with the Joint Commission and later with the Atomic Bomb Casualty Commission compiled records of who had died when and from what causes, and began studying possible long-term genetic effects in the offspring of the survivors. Even in the twenty-first century, experts at the Radiation Effects Research Foundation continue to mine the city for data about radiation risk to human populations. When Japanese scientists made sense of the bombings, they saw evidence of Allied inhumanity. Rather than lessons about air power, organizing science in war, or calculating biological risk in the future, they saw a moral crisis produced by science.

Experts of many kinds thus joined in forensic explorations of the damage caused by the atomic bombs. Physicists replayed the bombings in the Nevada desert in an effort to calculate radiation dosimetry. Psychologists tracked emotional responses in the survivors. Geneticists looked for biological changes in the next generation, the offspring born to survivors in the decades after the bombings. A military technology transformed these two urban spaces into unforgettable sites for making truths of many kinds. The two cities became places of future-oriented productivity, for prediction, risk assessment, and perhaps even forms of moral divination. What any group saw depended on its priorities. It would have been extraordinarily difficult for any study to capture all the events in those two cities, and my premise is not that anyone should have seen it all. Rather, it is that by tracking the process of selective noticing and attention, I can illuminate practices that are generally important for any understanding of technical knowledge systems. Why do we know what we know?

It is fair to say that in August 1945, Hiroshima and Nagasaki became experimental cities, regardless of the precise intentions of those who dropped the bombs. They became field sites and test battlegrounds. Their destruction could be documented, studied, quantified, and extrapolated to other circumstances. The collateral data they yielded could be used to guide urban planning in the United States, to protest the arms race, or to argue for an independent Air Force. And the effects on the survivors could be used to calculate the human risk of radiation, both immediate and long-term, not only in the survivors but in all the kinds of people who could be expected to be exposed to radiation: workers, medical professionals, patients, and victims in future accidents or nuclear attacks. The cities could even be used to model future nuclear wars, to plan attacks, and to plan defenses.
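The form such risk calculations eventually took can be suggested with a toy example. Survivor studies have long modeled excess relative risk (ERR) as approximately linear in radiation dose; the sketch below borrows that functional form, but the coefficient, baseline, and doses are invented placeholders, not actual survivor-study estimates:

```python
# Toy excess-relative-risk (ERR) calculation, linear in dose. The
# linear form echoes survivor-study models; beta and the baseline
# risk here are made up purely for illustration.
def excess_relative_risk(dose_gy, beta=0.5):
    """ERR = beta * dose: added risk relative to an unexposed person."""
    return beta * dose_gy

baseline = 0.10  # hypothetical lifetime baseline risk
for dose in (0.0, 0.5, 1.0, 2.0):
    risk = baseline * (1 + excess_relative_risk(dose))
    print(f"dose {dose:.1f} Gy -> lifetime risk {risk:.1%}")
```

The point is not the numbers but the logic: data gathered from one exposed population becomes a general-purpose instrument for predicting risk in workers, patients, and imagined future victims.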
When military experts engaging with questions about air power looked at Hiroshima and Nagasaki, they saw a debate about inter-service rivalry and technological superiority. The US Strategic Bombing Survey, created by Secretary of War Henry Stimson in 1944, was a program to document the effects of the air war then underway in Europe and the Pacific. Its findings could be used to plan bombing runs and also to assess the value and effectiveness of air power. Before it discontinued operations on October 8, 1947, the survey published 208 reports about the effects of Allied air power in Germany, and 108 reports about Japan, some as short as twelve or thirteen pages, some as long as 337 pages. The USSBS therefore published 316 reports documenting events in Europe and the Pacific. For Japan, these included reports on coal and metals, musical instrument manufacturing, Mitsubishi Heavy Industries, morale effects, electric power, standards of living, bombing results, and operations at every island involved in the war in the Pacific. The sheer scale of the work in Japan suggests that the entire country after the war appeared to be saturated with critical knowledge of many kinds.

Appropriately enough, Franklin d'Olier, president of Prudential Insurance, ran the USSBS survey, which Peter Galison has suggested was "the greatest damage-assessment program in history."17 Other officers in the survey included the Canadian economist John Kenneth Galbraith, businessman (and later defense planner) Paul H. Nitze, and research chemist Monroe Spaght, who later ran Shell Oil. Cumulatively, the reports that this group generated provide powerful insight into how US officials thought about Japan, how they made sense of the war itself, and how they imagined future relations.

The Pacific War USSBS submitted three summary reports to Truman. These were the thirty-two-page overall Summary Report (Pacific War), 1946; the thirty-six-page assessment of Japan's Struggle to End the War, 1946 (about the surrender); and the forty-three-page assessment of the Effects of Atomic Bombs on Hiroshima and Nagasaki, 1946 (on which I focus).18 I also consider here one of the reports about medical effects, the eighty-six-page report on The Effects of Atomic Bombs on Health and Medical Services in Hiroshima and Nagasaki, 1947; the two specialized reports by the Physical Damage Division (both significantly longer); the three-volume, 966-page Effects of the Atomic Bomb on Hiroshima, Japan, 1947 (Vol. I: 115 pages, Vol. II: 630 pages, and Vol. III: 336 pages); and the three-volume, 765-page Effects of the Atomic Bomb on Nagasaki, Japan, 1947 (Vol. I: 417 pages, Vol.
II: 348 pages, and Vol. III: 265 pages). I mention all these details to suggest the range and depth of these reports—and their potential for historical understanding of this critical moment in world history.

The survey team in Japan in the fall of 1945 consisted of 300 civilians with various skills, 350 officers, and 500 enlisted soldiers from both the Army (60 percent) and the Navy (40 percent). In September 1945, the survey, with its headquarters in Tokyo, began to reconstruct the economy, military planning, and social life of wartime Japan. One goal was to understand the negotiations leading to the acceptance of unconditional surrender. Another goal was to assess the health and morale of the civilian population under Allied control and occupation. More than 700 Japanese military, government, and industrial officials were interrogated by the Strategic Bombing Survey staff. The survey also recovered and translated many documents that were turned over for permanent archival use in the United States (this is one of the reasons that historical work on postwar Japan sometimes depends on US archives).

In many reports, policies and practices in Japan were compared to those in the United States. Observers compared building standards, the density of populations in urban centers, and so on. The summary report on the bombings closed with a discussion of how the two cities in Japan could be used to predict the consequences of future atomic bombings of cities in the United States. Indeed, many of the facts selected in the summary report were framed in terms of their relevance to citizens of the United States. The cities seemed to be filled with lessons for the United States. Possible future effects on US cities that would be bombed had

force[d] themselves almost inescapably on the men who examined thoughtfully the remains of Hiroshima and Nagasaki. These conclusions have a different sort of validity from the measurable and ponderable facts of preceding sections, and therefore are presented separately. They are not the least important part of this report however and they are stated with no less conviction.19
Engineers compared Japanese building codes to US ones; surveyors calculated how the same damages would look if superimposed on New York, Washington, Chicago, Detroit, and San Francisco. They generally concluded that buildings in American cities could not stand up to an atomic bomb. In 1946
the population density in Manhattan per square mile during the day, when workers were in town, was 145,000 people. This was the largest population density in the country. It compared to 12,750 people per square mile in prewar Hiroshima and only 7,000 per square mile in Nagasaki. "The casualty rates at Hiroshima and Nagasaki applied to the inhabitants of Manhattan, Brooklyn, and the Bronx, yield a grim conclusion."20 (A toy version of this scaling appears below.) The solution, the survey group concluded, was to decentralize cities in the United States (this led eventually to the interstate highway system). Industrial and medical facilities should be dispersed, shelters constructed, and lifesaving evacuation plans developed and rehearsed. The country should consider organizing the economic, transportation, and administrative life of the nation so that no single attack or small group of successful attacks could paralyze the "national organism." American cities could weather an atomic attack with "minimum casualties and disruption" if citizens and urban centers were prepared. "Since modern science can be marshaled for the defense as well as the attack, there's reason to hope that protective weapons and techniques will be improved. Even protective devices and vigilance however cannot be perfect guards against surprise initial attack, or against the unlimited choices of targets offered an enemy through the range and speed of modern weapons."21

All this theorizing occurred at a time when the United States was the only nation to actually possess an atomic bomb. Many of those in positions of authority in the United States expected that it would be five years, or more, before the USSR would have an atomic bomb (it was in fact a bit more than three). And some well-informed insiders even expected that Soviet capabilities in terms of atomic weapons would lag behind those of the US by decades. United States military and diplomatic officials theorized extensively about atomic destruction while the US held a full monopoly on the weapon, and when there was no present-day, immediate vulnerability. Their projections justified policies and practices that were not yet crucial.

In a series of studies of the US Strategic Bombing Surveys in both Germany and Japan, the historian Gian Peri Gentile proposed that the reports were as much about the organization of the US Armed Forces as they were about new weapons. The surveys were expected to resolve at some level the question of credit. Which services and which technologies had actually produced victory? Different evidence was used in different reports, and survey authors selected what to include and what to exclude. In some ways, he
All this theorizing occurred at a time when the United States was the only nation to actually possess an atomic bomb. Many of those in positions of authority in the United States expected that it would be five years, or more, before the USSR would have an atomic bomb (it was in fact a bit more than three). And some well-informed insiders even expected that Soviet capabilities in terms of atomic weapons would lag behind those of the US by decades. United States military and diplomatic officials theorized extensively about atomic destruction while the US held a full monopoly on the weapon, and when there was no present-day, immediate vulnerability. Their projections justified policies and practices that were not yet crucial.

In a series of studies of the US Strategic Bombing Surveys in both Germany and Japan, the historian Gian Peri Gentile proposed that the reports were as much about the organization of the US Armed Forces as they were about new weapons. The surveys were expected to resolve at some level the question of credit. Which services and which technologies had actually produced victory? Different evidence was used in different reports, and survey authors selected what to include and what to exclude. In some ways, he proposed, the bombing survey of Japan normalized the atomic bomb as though it were no more powerful than any other bomb. Every claim about these matters had a lesson: if in fact the naval blockade brought Japan’s economy down—and the bombings were incidental, a public excuse for surrender—then the Navy won the war. For Army leadership, giving too much credit to the atomic bomb was also a problem. Army Air Forces officials wanted a separate Air Force (it came in 1947). The bombings of Hiroshima and Nagasaki required only two bombers—one for each city—to carry the weapon. Each mission also included two support planes (for observation and instruments), though one of the Nagasaki support bombers was blown off course and never made it to the city. At the most, the missions required six planes. While technically the atomic bombings were an example of air power, it was anemic air power as compared to the thousand-bomber raids of the European theater and the earlier attacks in Japan. Thus for both Navy and Army Air Forces, there were institutional reasons to minimize the role of the atomic bomb in ending the war. Giving the bombs themselves credit appealed to neither service.

Gentile’s close reading of a much-quoted section of one of these reports suggested that historians have perhaps been missing the reasons for the way the argument was presented. The (speculative) proposal in one report of the Strategic Bombing Survey that “Japan would have surrendered even if the atomic bombs had not been dropped, even if Russia had not entered the war, and even if no invasion had been planned or contemplated” might have reflected this question of credit. By the spring of 1946 when the reports were being prepared, military leadership did not want the Soviet Union to get credit for ending the war (so Russia’s entry into the war would not have mattered, they said). Nor did Allied leaders relish the thought of shared Soviet Occupation of Japan, which such credit might have justified. Later historians, such as Hasegawa,22 documented the dramatic impact of Soviet entry into the war on those making decisions in the Japanese inner circle, but for the Strategic Bombing Survey in 1946, invoking the Soviet Army was unappealing. And giving all the credit to two single bombs would suggest that earlier air power strategies of massive raids and firebombing were perhaps ineffective—a charge that was indeed already brewing in a report on the economic impact of air power in Germany by John Kenneth Galbraith, who said roughly that air power made no difference to the economy in Germany. Any claim about the effectiveness of military technology should be understood at least
partly as an argument about proper future investments and priorities, as Gentile suggested.23

In their closing discussion, the authors of the atomic bomb summary survey report said that the investigators in the field in Japan, as they proceeded about their study, found an insistent question framing itself in their minds: “What if the target for the bomb had been an American city?” They pointed out that all major factories in Hiroshima were on the periphery of the city and escaped serious damage. At Nagasaki, plants in the valley where the bomb exploded were seriously damaged. But no single bomb could have destroyed all industrial works in either city because they were dispersed and spread out, and the survey authors proposed that such dispersal should be the policy for US cities as well. “The similar peril of American cities and the extent to which zoning has diminished it differ from city to city. Although a reshaping and partial dispersal of the national centers of activity are drastic and difficult measures, they represent a social and military ideal toward which very practical steps can be taken once the policy has been laid down.”24 Finally, in an intriguing spin, the report closed with a call for peace.

Our national policy has consistently had as one of its basic principles the maintenance of peace. Based on our ideals of justice in a peaceful development of our resources, this disinterested policy has been reinforced by a clear lack of anything to gain from war—even in victory. No more forceful arguments for peace and for the international machinery of peace than the sight of the devastation of Hiroshima and Nagasaki have ever been devised. As the developer and exploiter of this ominous weapon, our nation has responsibility, which no American should shirk, to lead in establishing and implementing the international guarantees and controls which will prevent its future use.25
This was the closing paragraph in a document almost entirely built around learning more about how atomic bombs could be useful in future warfare.

In the summer of 1946, the Manhattan Engineer District (MED), soon to become a system of national laboratories under the control of the new Atomic Energy Commission, published its own report about the bombings and the lessons they provided. General Groves was clearly the primary author of this report, though the formal head of the project was Brigadier General Thomas F. Farrell.
Whether he wrote the words himself or not, the report reflected Groves’ perspectives. For MED, the value of Navy or Army or air power was functionally irrelevant. What mattered was to provide proof that the Manhattan Engineer District had been right, apparently about everything.

The MED report, The Atomic Bombings of Hiroshima and Nagasaki, published in June 1946, said the bomb was “the greatest scientific achievement in history.” It gave the bomb full credit for ending the war. And after only sixteen days of field work in Nagasaki and four days in Hiroshima, the team concluded that there was no residual radiation in either city, that there had never been residual radiation, and that there were no health effects from radiation on early entrants or aid workers (this is still a controversial issue in 2019). The MED report also said the bombs had exploded at just the right spots, that the heights were correctly chosen, and that MED had correctly predicted both the time when the bombs would be ready and the ideal plan for the attack “in every detail.” For this group, the cities provided proof of the legitimacy of their labor: the MED report is striking for its openly propagandistic qualities. In subtle and direct ways, it justifies the use of the bomb, minimizes radiation effects, emphasizes the military importance of the two cities, and praises the accuracy and foresight of its own planning and predictions. “The bombs performed exactly according to design” and were “placed in such positions that they could not have done more damage from any alternative bursting point in either city.” Scientific assumptions about how long it would take to build the bomb were “correct” and despite the complexity of the work—“an almost infinite number of scientific and engineering developments and tests”—played out as expected. The experts who helped “select targets” were mathematicians, physicists, and weather consultants (but no one who understood Japanese history, cities, or social life). Targets were chosen to produce “the greatest military effect on the Japanese people” (though most Japanese cities had already been heavily bombed). The MED, the report claimed, had even wanted and expected a firestorm and chosen cities accordingly: “The targets should contain a large percentage of closely built frame buildings and other construction that would be most susceptible to damage by blast and fire.” Hiroshima was particularly favored because of its “small wooden workshops set among Japanese houses” and the many industrial buildings also of wood frame construction and “highly susceptible to fire damage.” The report stressed that the MED team knew in advance that the
firestorm would happen “months before the first test was carried out” as a result of calculations by the physicist Hans Bethe. MED report writers seemed upset, however, that some of the reinforced concrete buildings at the center of Hiroshima had not collapsed. This circumstance seemed to call into question the power of the bombs, and even the quality of MED planning. “Some of the reinforced concrete buildings were of a far stronger construction than is required by normal standards in America because of the earthquake danger in Japan. This exceptionally strong construction undoubtedly accounted for the fact that the framework of some of the buildings which were fairly close to the center . . . did not collapse.” Some of the bridges in Hiroshima also did not collapse, which MED survey writers sought to explain as a result of the height of the explosion.

Vigorously refuted in the report were claims by Japanese observers that the blast effects, which they characterized as “the pressure wave similar to normal explosions,” included gruesome physical injuries like ruptured abdomens and eyeballs.26 Such things might have happened, but they were not caused by the blast: “No such results were actually traced to the effect of air pressure alone.”27 The second Japanese claim rejected was that the cities remained sites of residual radiation. The presence of residual radiation was denied repeatedly. All casualties from radiation, the report stated, were produced in the first second of the bomb’s detonation. Radiation from scattered fission products after the bombings and induced radioactivity from objects near the center of the explosion “were definitely proved not to have caused any casualties.”28 It is unclear what form this proof could have taken in 1946, since medical surveys at the time were incompletely organized, and there had been no systematic effort to track early entrants or rescue workers to determine if they had experienced medical effects. Nor were longer-term effects of exposure to low-level radiation understood.

Unlike the Strategic Bombing Survey, the MED report concluded confidently that the atomic bombs ended the war. “The atomic bomb did not alone win the war against Japan, but it most certainly ended it, saving the thousands of Allied lives that would have been lost in any combat invasion of Japan.” Striking for its allocation of credit to the MED team and to the bomb, its praise for a “perfectly planned” attack with “crew and equipment [that] functioned perfectly” and a bomb that performed “exactly as expected,” the report legitimated everything MED had done.
It is perhaps commonplace that formal institutional reports celebrate the wisdom and foresight of institutions. But the MED report also illuminates what aspects of the bombings loomed large for those responsible for building the bombs. They stood in a position of moral culpability—the journalistic and theological debate about the legitimacy of the atomic bombings was ferocious within weeks. U.S. News published a critical essay on August 17, eight days after the Nagasaki bombing, and the New York Times reported on August 20 that a coalition of clerical leaders was protesting Truman’s decision. Angry letters to the editor appeared in local newspapers across the United States. At least some US citizens found the decision to use the bombs troubling right away. Yet the writers of the MED report vigorously claimed that everything turned out just as it should have.29

For biomedical scientists, the perspectives were very different. The information needed from the two cities was medical, biological, reproductive, and long-term. The survivors were surrogates for future populations that would be exposed to radiation. As one administrator involved in the studies put it in 1956, they were “the most important people living.” By this he meant that they were people whose suffering could become a resource for managing the new world of radiation risk. They were pioneers, and after the full-scale media event of the Pacific weapons tests, Operation Crossroads in 1946, their experiences seemed relevant to every living person all over the world.30

Medical surveys began in September 1945 (by the Joint Commission), continued with the creation of the Atomic Bomb Casualty Commission in 1947, and continue into the present with the Radiation Effects Research Foundation. The bodies of those exposed to the radiation released by the two atomic bombs in 1945 are still being studied by scientists at the time of the publication of this book. Some of those involved expect the RERF to continue doing research even after all the survivors, now in their seventies and up, are gone. Like the studies of the Strategic Bombing Survey and of the Manhattan Engineer District, the work of the Joint Commission and the Atomic Bomb Casualty Commission that followed it was future oriented and intended to play a role in defense and military planning in the United States. It has generated thousands of scientific papers over the years and collected more than a million biological samples now held in a biorepository in Hiroshima. Over the decades it became a much more general resource for radiation risk in the wake of nuclear power accidents like Three Mile Island, Chernobyl,
and Fukushima. It has also responded to the increasing recognition that the low doses delivered by routine medical irradiation could pose risks. The studies of the survivors were initially controlled by the United States. After 1975, the Atomic Bomb Casualty Commission was renamed and became a jointly funded project of Japan and the United States, the Radiation Effects Research Foundation.

The survivors in Japan suffered profoundly in their loss of families, homes, businesses, and health. They were also subject to social discrimination in Japan, considered unsuited for arranged marriages, often scarred with burns and keloids that marked them as sickly, and feared as possible carriers of genetic damage. The discoveries by fly geneticist H. J. Muller of the mutagenic effects of X-rays in Drosophila in 1927, and by agricultural geneticist Lewis Stadler of similar effects in barley and maize in 1928, together with follow-up research by many others, established the genetic damage-inducing effects of radiation. Muller won the Nobel Prize in 1946—the year of Crossroads—when these effects had a new urgency. The possibility that the bomb survivors might have children who expressed mutated genes was almost immediately part of postwar planning for the occupation of Japan.

University of Michigan geneticist James V. Neel directed the project to study genetic effects in the survivors. The work of his team, which included William J. Schull and dozens of Japanese physicians, nurses, and midwives, was initially seen as most important at the ABCC. Genetic effects, however, were never documented. For most of his life, until his death in 2000, James Neel searched for genetic effects in the offspring of the survivors. But despite all the methods of contemporary molecular genetics, the genetic effects remained undetectable at a statistically significant level. In 1991, after almost fifty years, Neel and his co-author William J. Schull wondered if they “could be just manipulating the noise in the system” as they tried to calculate a doubling dose for genetic effects. A 2006 summary of results noted that no genetic effects could be identified despite almost sixty years of analysis of birth defects (including untoward pregnancy outcome, malformation, stillbirth, and perinatal death), chromosome aberrations, and alterations of plasma and erythrocyte proteins, as well as epidemiologic study on mortality (any cause) and cancer incidence (the latter study is still ongoing). Even molecular biological techniques and human genome sequence databases have not been able to pin down these effects—though genetic effects of radiation are readily tracked in experimental organisms like mice and flies, and they are generally believed to have occurred in the survivors. The original purpose of mapping the human genome, which began with the Department of Energy, a successor agency to the Atomic Energy Commission, was to establish the impact of radiation exposure on the atomic bomb survivors. But genetic effects remain elusive. Many scientists believe they are there but difficult to detect, partly because human reproduction cannot be manipulated the way reproduction in flies or mice can be.
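Neel and Schull’s worry about “manipulating the noise” can be made concrete with a toy simulation. The doubling dose is the exposure that would double the spontaneous rate of some outcome, estimated here as the baseline rate divided by the observed excess per unit dose; when the true excess is small relative to binomial sampling variation, the estimate swings wildly from sample to sample. Every number below is invented for illustration; none are ABCC or RERF figures.

```python
# Toy simulation of the problem Neel and Schull described: when the
# radiation-induced excess in an outcome rate is small relative to
# binomial sampling noise, the estimated excess per unit dose, and
# hence the doubling dose, is unstable. All numbers are invented.
import random

BASELINE_RATE = 0.05        # assumed spontaneous rate of the outcome
TRUE_EXCESS_PER_GY = 0.002  # assumed (tiny) induced excess per gray
N_PER_GROUP = 10_000        # children observed per dose group

def estimated_excess_per_gy(dose_gy: float) -> float:
    """Compare one exposed group against one control group."""
    exposed = sum(random.random() < BASELINE_RATE + TRUE_EXCESS_PER_GY * dose_gy
                  for _ in range(N_PER_GROUP))
    control = sum(random.random() < BASELINE_RATE for _ in range(N_PER_GROUP))
    return ((exposed - control) / N_PER_GROUP) / dose_gy

random.seed(1945)
for trial in range(5):
    excess = estimated_excess_per_gy(dose_gy=1.0)
    # Doubling dose = spontaneous rate / induced excess per unit dose.
    dd = BASELINE_RATE / excess if excess > 0 else float("inf")
    print(f"trial {trial}: excess/Gy = {excess:+.4f}, doubling dose ~ {dd:.1f} Gy")
```

With these invented values the sampling noise is larger than the true signal, so repeated “studies” yield doubling doses ranging from implausibly small to infinite, which is the statistical trap the passage describes.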
The biomedical lessons to be drawn from Hiroshima and Nagasaki were therefore oriented around predicting human biological survival. The cities supplied clinical data, autopsy materials, and infants in whom genetic futures could be seen. The storage practices applied to biological materials taken from the survivors have sometimes seemed almost compulsory, even as the purposes and proper management of such storage were under interrogation. Experts in Japan saved atomic bomb survivor teeth, bloods, and tumors—for what seem at times to have been almost spiritual and philosophical reasons. The biological samples were a kind of talisman for translating risk. Produced at the very moment of fracture and collected in the course of biomedical research at the nuclear Pacific’s ground zero, the samples are permanently marked by a form of energy that threatens the human future. The marks could be deployed to solve problems of dosimetry (tracking gamma-ray and neutron dosimetry through tooth enamel), cancer risk, and escalating energy demands around the world. The flat technical accounts of genomic data can barely hold in check the “destiny of mankind.” Death, judgment, and heaven and hell are entangled with tooth enamel electron paramagnetic resonance.31

That the remains of the survivors are uniquely precious is a common theme in the scientific construction of their meanings. “Understanding the effect of radiation on the human genome remains an important challenge and RERF has precious biosamples to help mankind learn how sensitive the genome is to radiation and how much genomic variation affects the transport of radiation risk to other populations,” said a 2012 report on the RERF biobank. The precious biosamples promise an epidemiology without end—the chance to study the survivors forever. Indeed, the RERF continues to look forward to another twenty years of research: “Twenty years from now, there will be almost no survivors left. So does that mean no RERF?” RERF Scientific Director Ohtsura Niwa told a journalist in 2015: “No. We intend to keep on going, to do something for the next generation.”32
Both the institution and the data are imagined as endless.

The cities of Hiroshima and Nagasaki were destroyed and therefore treasured. Damage in these places became a scientific, political, and institutional resource, a form of valuable collateral data made by chaotic violence. No bombing run is structured to make knowledge in a controlled experiment. Such events are the opposite of controlled. Yet the ruins of cities in both Germany and Japan became critical field sites after the fact. Different research groups arrived in Hiroshima and Nagasaki with different agendas—to promote air power, justify their roles in building the bomb, or calculate future medical effects—and observers selected from the abundance of damage what kinds of data they needed.

Surprisingly missing from these formal reports was a systematic effort to understand the social and psychological consequences of the bombings. At first, Allied observers, most of whom knew very little about Japan, adopted the position that the Japanese were uniquely stoic and that their psychology was so different from that of other potential bomb victims (meaning those in the United States) that social and psychological studies would not be useful, mobile, or applicable to other places. While the physical data about buildings and rail lines could readily travel in policy and scientific circles, and the embodied data within exposed survivors could be relevant to all human bodies, the psychological and social data were apparently not seen by Allied observers as generalizable or mobile. The Atomic Bomb Casualty Commission staff, for example, focused on biological effects exclusively. It did not track the social and psychological effects, though US scientists working in Japan saw the social consequences all around them. Even the physicists at MED noted in their report that the “atomic explosion almost completely destroyed Hiroshima’s identity as a city . . . even if there had been no damage to structures and installations, the normal city life would still have been completely shattered.” The fact of social trauma was known, seen, and recognized, but US leaders and scientists did not immediately see that trauma as a resource for making new knowledge about the psychological and social impact of trauma.

While the social sciences were deeply involved in the US war effort, and the anthropologist Ruth Benedict’s “armchair anthropology” in her book The Chrysanthemum and the Sword (involving no work in Japan) was widely consulted by Occupation authorities, the social sciences other than economics
were not incorporated into postwar studies of the atomic bomb. I have some rough theories about why not, based on correspondence and formal documents relating to the ABCC. In these letters and reports, the Japanese mind (unlike the Japanese body) was portrayed as irrelevant to the American or even the Soviet mind. US observers tended to see Japanese victims as biologically mobile, able to stand in for other populations, but psychologically sui generis, unlike other people and therefore not likely to be productive as resources for psychological study. I cannot fully explain how this assumption mattered, but I believe it did. While the US Strategic Bombing Survey, Pacific Theater, Morale Division produced a lengthy report (256 pages) in 1947, The Effects of Strategic Bombing on Japanese Morale, this did not involve a systematic study of the psychological consequences of the atomic bombs, but rather an assessment of the effectiveness of bombing based on how much it shaped the outcome of the war. Morale was not, apparently, psychological.

But to some other observers, the atomic bombs posed unique psychological challenges. In April 1962, seventeen years after the bombings, former US Army psychiatrist Robert Jay Lifton began trying to understand the trauma of the atomic bomb. He had worked earlier with American POWs repatriated from North Korea in 1953, after they had been exposed to “thought control” or “brainwashing.” His interest in the ways that minds can be manipulated and changed expanded through the 1950s to studies of mind control and “totalism” in China and in Holocaust survivors. Lifton became a key international figure in the networks that defined trauma as PTSD, and broadened the definitions and meanings of the experience.33 The work with bomb survivors followed from his evolving views of trauma and psychoanalytic theory. Having lived earlier in Japan, and with some knowledge of Japanese, he returned in 1960 intending to study Japanese youth. He finished his study by visiting Hiroshima, and this led to his next project: he began meeting survivors. He realized that seventeen years after the bombings, no analyst had attempted a comprehensive psychological study of the experience of an atomic attack. Lifton decided to try to understand the “full human impact” of the bombings.34 His roughly two-hour interviews took place with interpreters. The interviews were part of his effort to think “scientifically” about trauma.

One of his first papers on his work, published in Daedalus in 1963, presents lengthy quotes from those he interviewed as they described what they saw
and felt in the hours, days, and weeks after the bombings. One common theme, he said, was that the bombings had been and continued to be a scientific experiment:

The dominant emotion here is the sense of having been made into “guinea pigs,” not only because of being studied by research groups (particularly American research groups) interested in determining the effects of delayed radiation, but more fundamentally because of having been victimized by the first “experiment” (a word many of them use in referring to the event) with nuclear weapons.35
Some survivors, suspicious of his status as an American scientist associated with Yale University (where many of those working at the Atomic Bomb Casualty Commission held faculty appointments), asked him if he was “selling the bomb.” Lifton concluded that his interests in fundamental understanding of human trauma were intertwined with his personal and professional ambitions.36 He was ambitious both for himself and for knowledge—like so many others who came to Japan to understand the bomb.

As I have suggested here, those studying Hiroshima and Nagasaki strategically selected what they would notice and problematize, and also what they would not notice or not see. For the authors of the Strategic Bombing Survey in the fall of 1945, the unfolding postwar debate about the relevance of air power to military victory was a powerful subtext. For different reasons, neither the US Army nor the US Navy wanted the bombings at Hiroshima to have “won the war.” For the authors of the report of the Manhattan Engineer District, the cities seemed to provide little more than justification for their effort and validation of their brilliant planning. In language that now seems particularly awkward, the MED authors characterized the attacks as perfect, unfolding exactly as planned, with effects that had been predicted all along. For medical experts, the problems were more complex and enduring. Detecting biological effects required recording “everything” over decades. The problems expected to be most important—genetic effects—were never shown at statistically significant levels, though leukemia emerged among survivors three or four years after the bombings. Later, even heart disease was associated with exposure to radiation. The biological scientists noticed the trauma but did not document it or write papers about it. Eventually, Lifton
wrote his much-praised book Death in Life about the interviews he undertook in 1962.

In 1985, the Tokyo publisher Iwanami Shoten published a report on the Impact of the A Bomb: Hiroshima and Nagasaki 1945–1985, prepared by the Committee for the Compilation of Materials on Damage Caused by the Atomic Bombs in Hiroshima and Nagasaki. The report marked the fortieth anniversary of the bombings. Parts of the report had been published earlier, but the 1985 edition was widely circulated in Britain and the United States, and published to coincide with a solidarity conference of mayors who supported peace. Physician Soichi Iijima, who organized the text, produced a concise, comprehensive account of the atomic bomb damages. But Japanese observers saw the cities as exemplifying the importance of peace. Indeed, from the 1960s moving forward, political and scientific leadership in Japan had invoked the bombed cities to argue for peace. The cities held peace ceremonies every year on August 6 and 9 (only 1951 was skipped), and beginning in 1968 the mayor of Hiroshima sent telegrams to the heads of all nuclear states to protest nuclear weapons.37 The 1985 text proposed that the experiences of those in Hiroshima and Nagasaki were the common possession of peoples the world over. It documented the bombing itself, with lavish illustrations including maps, charts, and photographs, and with compilations of data about the extent of the damage at various distances from ground zero. Photos showed the famous black rain in Nagasaki; charts tracked white cell counts and keloid formation. The text even included an account of the struggle to gain benefits for the atomic bomb survivors, who were not categorized as military personnel. The book closed with a chapter about the possible abolition of nuclear weapons. The experiences, scars, and burdens of those in Hiroshima and Nagasaki were leveraged neither toward military strategy, nor toward justifying a given scientific result, nor toward the production of abstracted and neutral knowledge that could be applied in many different circumstances, but rather toward the goal of peace. In documenting the suffering of the two cities, these reports suggested, every bit of data was making an argument for the elimination of war.38

The use of a military technology transformed these urban spaces into unforgettable sites for making scientific and institutional truths of many kinds. They were places of future-oriented productivity for career-building research (“selling the bomb”), military prediction, calculating risk, and measuring moral culpability. What any technical study group or research
group saw depended on priorities. Working in the same two cities, research groups extracted different lessons and data.

It is important to acknowledge that Hiroshima and Nagasaki are still sites of scientific research. The Radiation Effects Research Foundation (RERF), which is the successor agency to the Atomic Bomb Casualty Commission, is engaged in ongoing studies of long-term radiation risk. RERF scientists also played a key role in the assessment of populations exposed at Chernobyl and in studies of workers at the Fukushima Daiichi Nuclear Power Plant. Industrial interests in Japan and the United States sought to draw a sharp line between the risks of nuclear war and the risks of nuclear power, but the work of the RERF (which became the basis of worker protection standards for the industry), and the activism of atomic bomb survivors themselves, have drawn these two nuclear domains together. This is true particularly in the wake of the Fukushima disaster, identified by one atomic bomb survivor as Japan’s “third atomic bombing.” RERF is therefore a critical node in a complex global network of scientific institutions that adjudicate radiation risk and proclaim when it is present and when absent.39

Nuclear weapons are scientific at both ends—made by science and to be solved by science, in the terms first outlined by Ulrich Beck in his influential analysis of the Risk Society (1992). The sciences at the front end are elite physics, chemistry, and engineering (the sciences of atomic bombs and nuclear energy). And the sciences at the back end, when the consequences have to be tallied, include messy and slow epidemiology, psychological and social work with devastated survivors, and field biology. The survivors of both the atomic bombings and the nuclear meltdowns have been exposed to a form of environmental contamination that can be known to them only through technical expertise—only through the testimony of scientists. As Kuchinskaya has observed, “radiation is not directly perceptible to the unaided human senses. People cannot see, hear, or feel radiation. Their senses register nothing. As a result, formal representations become doubly important in defining the scope of what is considered dangerous contamination.”40 Those formal representations are technical products, research papers, international reports, and charts of dose-response curves. This quality of radiation risk—its manifestation only in scientific inscription—may play a critical role in public fears and responses.

In 2014, RERF statisticians Eric Grant and Harry Cullings completed a draft paper that reconstructed the physical maps of Hiroshima and Nagasaki as the cities might have been in the summer of 1945. The maps used to study the survivors for seventy years had been based on US Army aerial photographs taken before the bombings. These maps showed where each survivor had been at the moment of the bombs’ detonations, the moment when radiation exposure was most critical. They included distortions that were the result of the technological limitations of the cameras, the tilting to the horizon, how the cameras were held, what kind of lenses were used, and the exact height at which they were taken. The two cities were photographed in a series of overlapping images that in the aggregate produced a full but imperfect image. New digital techniques in the twenty-first century make it possible to correct these images, improve the irregularities, recapture some “lost” terrain, and stretch the city visually into a proper and accurate geographical format—a visual format that matched the actual ground. The resulting veridical map—a stunning reconstruction of a detailed ghost city full of houses and shops gone for seventy years—then had implications for how the RERF might calculate exposure to radiation for each survivor. Grant, Cullings, and their colleagues placed individual survivors on these new maps, and found that every one of the 93,000 survivors changed position. Survivors had long been placed on the old map on the basis of two intersecting numbers on a two-dimensional grid. In the original placements, the last two digits of each number after the decimal point had for convenience been dropped. This had the effect of pushing most survivors essentially directly onto the grid lines. When those extra decimals were added back in, the maps changed. People did not move far, but they moved. The changes made the RERF epidemiology group, Grant said, “more confident” about their assessments.41
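The grid-snapping artifact that Grant and Cullings corrected is easy to reproduce. In the sketch below, which uses arbitrary coordinates rather than actual survivor locations, truncating the trailing decimal digits collapses distinct positions onto the same grid intersection, and restoring the digits moves every point slightly, just as the remapping did.

```python
# Reproducing the grid-snapping artifact: dropping the trailing decimal
# digits of each map coordinate collapses distinct positions onto the
# grid lines, and restoring the digits moves every point slightly.
# Coordinates are arbitrary examples, not actual survivor locations.
import math

def truncate(coord: float, kept_decimals: int = 1) -> float:
    """Drop (not round) every decimal digit beyond `kept_decimals`."""
    factor = 10 ** kept_decimals
    return math.floor(coord * factor) / factor

survivors = [(132.4571, 34.3982), (132.4529, 34.3917), (132.4583, 34.3944)]

for x, y in survivors:
    tx, ty = truncate(x), truncate(y)
    shift = math.hypot(x - tx, y - ty)
    print(f"true ({x:.4f}, {y:.4f}) -> truncated ({tx:.1f}, {ty:.1f}), "
          f"displaced {shift:.4f} grid units")
# All three truncated points land on the same grid intersection, the
# pattern the 2014 remapping undid by restoring the dropped digits.
```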
Different experts extracted different lessons from the two cities. The same is true, of course, of humanists and social scientists who have tried to make sense of the events of the summer of 1945, including many historians like myself.
6 Battlefield of the Body
WAR ENGAGES WITH THE HUMAN BODY AS BOTH WEAPON AND TARGET. IT IS A DOMAIN OF BODILY INTENSITY,
a place where human bodily capabilities are pushed as far as they can be, and also where human bodily injury constitutes a key form of evidence, in both science and politics. The injured body is evidence of either victory or defeat. It is also scientific evidence of the extremes and the limits of embodiment. In the biomedical sciences focused on human extremes relevant to war, we can see very clearly the mutual nature of knowledge to heal and to injure. They are tangled up together and not easily pulled apart.

In the twentieth century, scientists and physicians seeking to solve problems of advanced and sophisticated armed forces began to build a detailed picture of the human body as a target. They assessed the best ways to destroy it and the best ways to keep it functioning so that it could continue to destroy other bodies. In some ways perhaps this is obvious. But it is not trivial for our understanding of war and modern biomedical sciences.1 To suggest how this process unfolded, I look at biomedical fields where this way of seeing has been important. I consider the rise of aviation medicine, the development of front-line field research with grievously wounded soldiers during the Second World War, studies on the experimental
battlefield of the Korean War, and the late-twentieth-century struggle to understand the biomedical effects of chemical exposures on US troops in Vietnam and Gulf War I. All those studied in these biomedical fields were US troops, not enemies. They were soldiers whose injuries and experiences provided many kinds of collateral data. I am interested here in how experts studied injury in order to enhance it.

Militarily relevant scientific research often constructed an image of embodiment that was oriented around violence. In 1943, Yale physiologist John Fulton described the brain to a colleague as “a semi-fluid substance, suspended by fairly inelastic attachments in the cerebrospinal fluid in a rigid box.”2 Fulton selected those properties of the brain relevant to its demolition by firearms. I propose that his perspectives reflected a general emergence, after 1900, of a suite of biomedical sciences that saw the body as a target, and a battlefield. The scientific studies of human injury I consider here are historical evidence of what the body was presumed to be—what meanings it carried—in increasingly violent industrialized and scientific warfare in the twentieth century.

War animated an interest in extreme physical circumstances of injury and stress, circumstances that could be studied “naturally” on the battlefield or in the laboratory with human subjects. These ideas reflected the gruesome experiences of the First World War. Many of those who were in leadership positions in the scientific community in the United States by the 1940s had been involved in the First World War, sometimes in scientific programs, for example to create new chemical weapons, and sometimes as frontline soldiers. That war, which produced bodies poisoned by gas, damaged by artillery, and mutilated in machine gun fire, was the war of their youth. They knew how technological change could change the experience of embodiment, even if they would not have put it quite that way. As their attention turned to bodily risk in the 1930s and later, they seemed to understand the body as a system that would be routinely placed under profound stress, opened, crushed, frozen, starved, poisoned, or riddled with bullets. Its limits had to be tested and its properties as a target fully understood. Just as Carolyn Bynum’s striking studies (of medieval notions of the resurrection of the dead and the immortality of the body itself) help us understand human social life in earlier times, so too can scientific notions of war’s injured bodies help us understand the twentieth century.3 Scientific studies
of blunt trauma, starvation, freezing, nausea, shock, wound ballistics, and high-altitude hypoxia leveraged controlled injury to (eventually) enhance the body of the soldier and to permit those healed to continue to injure the bodies of those identified as enemies. Knowing how long a human body can function during extreme cold, blood loss, lack of food, lack of oxygen, or nauseating physical movement was important in the context of technoscientific war. The questions of bodily injury that became so critical in biomedicine in the twentieth century reflected particular ways of waging war.

The airplane was a new technology that placed the body in novel circumstances of risk. Aviation medicine developed in response to the emerging medical problems posed by flight. Flight was a new opportunity rapidly adopted by world armed forces. Only six years elapsed from that first tentative flight of the Wright brothers at Kitty Hawk in late 1903 to the first US military purchase of a Wright airplane in 1909. That single plane launched the creation of the new Aeronautical Division of the US Signal Corps. Other military forces around the world were similarly enthusiastic. The airplane industry began in rapid response to military demand.

It was unclear at first what exactly would be done strategically with the new technology. Ideas about airplanes and their military value shifted through at least the 1940s (or perhaps in some sense even into the present). US air power theorist Billy Mitchell thought bombers made other forces obsolete. UK theorist Hugh Trenchard thought bombing could produce widespread “discontent,” which would end all wars (though it was not clear how). Italian theorist Giulio Douhet saw “civilization” as conferring greater vulnerability, such that air power could attack the “nerve systems” of enemy states and bring them down.4 Certainly at the beginning of the First World War, the new airplanes were expected to aid in reconnaissance and intelligence. They could also be used to drop bombs or chemical weapons, though that did not happen much. It was not clear how many aircraft were needed, what training pilots and crews needed to have, or what administrative structure was best for managing new air forces. By 1914 when the First World War exploded, all major combatants had nonetheless invested in air power and begun to explore its potential.

One consequence of this enthusiasm was that pilots and crews began to operate in poorly understood upper atmospheres, exposed with relatively
little protection to conditions of speed and acceleration unlike those normally encountered on the ground. Air power produced new biological experiences of nausea, hypoxia, disorientation, and airborne decompression sickness. Aviation medicine as a field grew out of these experiences. An airborne pilot was embedded in a machine, brought by that machine into unfamiliar and dangerous spaces. Like a soldier holding a gun, a pilot fused machine and body into a single, functioning entity.5

The risks were recognized as early as 1912, when a War Department memo provided detailed instructions for physical exams for aspiring pilots. The candidate had to have perfect vision, no problems with colorblindness, and excellent balance such that he could hop about with eyes open or shut. The US Army required aviators to meet higher physical standards than the average soldier, as though the physical risks of air flight could be modulated by the perfection of the young men sent out to engage in it. Thirty percent were rejected. Yet by the time World War I began, many air services actually had lower standards than they had for the infantry. Georges Guynemer, who became a great French ace, was deemed too frail and sickly for the infantry and, thus, went into the air service. Eddie Rickenbacker, in spite of his national status as a premier race car driver and his later status as a flying ace, had a corneal defect that hindered his depth perception.6

The first “flight surgeons” were appointed in 1918 to the US Army Medical Research Laboratory. Their original mandate was to figure out what pilots should wear on long flights when they would be very cold, probably hungry, and certainly deprived of oxygen. The Medical Research Laboratory, in Mineola, New York, offered eight-week courses to train physicians who would be working with pilots and crews. By the 1920s, Army Air Service Major Henry “Hap” Arnold, the future commander of the Army Air Forces in World War II, saw clearly the coming future risks. Pilots flying in single-seat pursuit aircraft at 20,000 feet would face new biological pressures as aircraft altitudes rose to 40,000 feet. They would need oxygen, air-tight compartments, and pressure suits. Flight surgeons began experimenting on pilots and found that even the most experienced pilots could become disoriented under certain flight conditions. Pilots had to learn to trust technology rather than their own instincts, but there was a “disconnect between those who designed aircraft and those who flew them,” Schultz noted. Some technologies could not be used without extreme discomfort or danger.7 The B-17 Flying Fortress, for example, exposed crews to bitter cold, hypoxia, and decompression sickness.
A bombardier struggling with a jammed machine gun might have to remove his gloves and risk freezing his hands. And the B-17 flew high to protect crews from anti-aircraft fire, but its design did not accommodate the biological needs of the crews it was supposedly protecting.8

One of the earliest historians of aviation medicine was the Yale physiologist John Fulton. Fulton oversaw the OSRD wound ballistics program and worked on nausea during the Second World War. He later founded the program in the history of medicine at Yale University. He was also one of the leading midcentury proponents of psychosurgery, especially lobotomy, in the United States. In his account of the history of aviation medicine, Fulton (apparently seeing all studies of oxygen as relevant) tracked its origins to Robert Boyle’s work in the 1640s with the air pump.9 But a more reasonable starting place for aviation medicine might be medical work with altitude sickness, including the work of French and British scientists in the nineteenth and early twentieth centuries. Mountain sickness was a military and a colonial problem, and in the International High Altitude Expedition of 1935, climbers were accompanied in the Andes by medical experts (including Ancel Keys) who tracked the consequences of prolonged low oxygen.

As the Second World War geared up in 1939 and crews began flying at altitudes of up to 35,000 feet, reports of decompression sickness began to appear in the scientific literature. In 1940, a new US Committee on Aviation Medicine was created by the US National Research Council. While the Navy had established aviation medicine laboratories at Pensacola, Florida, and Bethesda, Maryland, and at the Naval Aircraft Factory in Philadelphia, such laboratories addressed only immediate “flying problems encountered by pilots now.”10 The Army Air Forces also maintained aviation medicine laboratories at Wright Field and Randolph Field, where physiologists worked with airplane companies to help engineer new plane designs.11 But scientists at academic schools of medicine saw the problems of aviation medicine in terms of fundamental human biology, and trusted neither the airplane industry nor the defense forces to pursue the questions properly.

Decompression sickness was an early primary focus. The Subcommittee on “Decompression-Sickness” formed in May 1942 set out to understand embolism and “the bends.” The group also attempted to use chamber tests to identify recruits who were less susceptible to decompression sickness, finding that about half those in the younger age group (18–23) could tolerate exposure at 38,000 feet without symptoms. The group also considered how to pressurize
cabins. “The question of how far troops can be transported without oxygen and remain in a state of tactical efficiency is being analyzed for altitudes between 8,000 and 14,000 feet,” one report noted. The studies could help set specifications for pressure control in aircraft.12 Meanwhile other groups funded through the Committee on Aviation Medicine were studying rabbits and guinea pigs “explosively decompressed to 50,000 feet” producing deaths by “rupture of stomach or possible visceral hemorrhage,” devices to record blood content of the human ear to aid in the study of “greying and black-out,” the proper construction of decompression chambers, darkness and human vision, and the effects of acceleration on dogs and cats.13

In the summer of 1942, Flexner visited Fulton’s laboratory at Yale, and the two scientists went together to a Connecticut corset company to look at options for gradient pressure suits. The Spencer Corset Company proposed that they could build flight suits “in volume” that would cost about $325.14 That same summer, Harold Lamport of Yale visited amusement parks to find rides that could be modified for the experimental production of nausea. Lamport told Fulton that the “Spitfire” and “Rolloplane” seemed promising, and the Eyerly Aircraft Company of Salem, Oregon, was willing to produce a faster laboratory version of the Rolloplane for $5,000. This would reach a speed of fifty rpm in five to ten seconds. “The ready shifting of the pivoted car suggests its use for studying lateral and negative acceleration as well as the more usual positive direction,” Lamport said. He included with this letter his own sketch, from memory, of the ride.15 The mundane world of everyday technology, of corsets and amusement park rides, was thus leveraged to solve some technological problems of air power.
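To see why a fifty-rpm ride interested acceleration researchers, the rotation can be converted into a g-load. The sketch below assumes a rider about three meters from the pivot; Lamport’s letter gives only the rotational speed, so the radius is a guess for illustration.

```python
# Rough conversion of the Rolloplane's quoted fifty rpm into a g-load,
# to show why a modified amusement ride could serve as an acceleration
# laboratory. The arm radius is an assumption; the letter gives only
# the rotational speed.
import math

RPM = 50
ARM_RADIUS_M = 3.0   # assumed distance from pivot to rider
G = 9.81             # standard gravity, m/s^2

omega = RPM * 2 * math.pi / 60           # angular velocity, rad/s
centripetal = omega ** 2 * ARM_RADIUS_M  # centripetal acceleration, m/s^2

print(f"{RPM} rpm at {ARM_RADIUS_M:.0f} m -> {centripetal / G:.1f} g")
# Roughly 8 g under these assumptions, well past the accelerations at
# which wartime researchers reported greying and blackout.
```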
In 1942, Carl Schmidt at the University of Pennsylvania was placing human subjects in refrigerated decompression chambers to study the effects of low oxygen and low temperature on respiratory, cardiovascular, and visual function. Robert Wilkins at Evans Memorial Hospital was stressing circulation systems and human subjects until they blacked out. Wallace Fenn at the University of Rochester was placing subjects in a tank with their head protruding through a rubber collar, while pressure was applied to the body to test blood pressure effects. Henry Ricketts at the University of Chicago was keeping people in low-oxygen conditions for six hours every day for six weeks in order to test the long-term consequences of prolonged anoxia. (It is perhaps more accurate to say that Ricketts was trying to do this. He found it hard to secure subjects willing to continue in his program.)16

Research in aviation medicine included studies not only of pilots’ bodies but also of all the technologies in which those bodies were placed, such as helmets, goggles, specialized clothing for cold, and antigravity suits. Engineers reconfigured control panels to minimize the effects of “deceleration trauma” (accidents). A 1944 collaboration between the NRC, the Army Air Forces Office of Flying Safety, and the American Society of Mechanical Engineers produced a consensus for standardized engine controls and cockpit instruments. A protocol for “checklists” for each plane reduced pilot error.17 Psychologists sought drugs that could keep pilots alert and calm. The pilot was technologically reconfigured inside and out, almost made into a machine himself.18

Meanwhile the Subcommittee on Decompression Sickness of the US Office of Scientific Research and Development was placing undergraduates at Yale into decompression chambers to figure out whether some individuals were resistant to the bends. They found that about half could tolerate exposure at the equivalent of 38,000 feet for three hours without symptoms. The next step, of course, would be a predictive test to assess which individuals had higher tolerances. In another unforgettable project, the University of Virginia research group filmed the facial expressions of subjects who were brought to a physiological state of crisis by radial and linear acceleration to the point of unconsciousness (Figure 10). They created a visual record of physiological trauma.19

Other groups began to study what to do about “the anxiety state in combat flying.” Capt. Eugene DuBois of the US Office of Naval Research summarized the problem in a 1945 report about what was variously called “flying fatigue,” “flying stress,” lack of moral fiber, or cowardice. The likelihood of pilots’ experiencing such stresses, he proposed, followed a standard Gaussian curve. He had not constructed such a curve on the basis of any data, but he came to his materials with the assumption that it would be there. Among the factors that could produce this stress, DuBois proposed, were enemy action and the sight of friends being killed.20

Midrange crashes also became the focus of studies during the war. There was no point, study organizers noted, in tracking trivial accidents in which no one was hurt, or in tracking “accidents of such severity that the plane was completely disintegrated.” What mattered were the accidents that involved serious but survivable injury. Research organized by Hugh deHaven at the Cornell Medical College found that in these midrange crashes the most
FIGURE 10. Pilot in Physiological Crisis: The face transformed by motion. Eugene M. Landis, The Effects of Acceleration and Their Amelioration, in E. C. Andrus et al., Advances in Military Medicine, Made by American Investigators, vol. 1 (Boston: Little, Brown, 1948), page 251, figure 33.
serious injuries were to the head and the face. The cockpit in 1940 was filled with hazards, including an instrument panel bristling with projections and poorly designed control wheels. The Crash Injury Conferences that began in 1943 focused on bringing together Air Force, air industry, and biomedical scientists to determine how to make cockpits safer. The three-point safety belt, still used in automobiles and other vehicles, originated with this research.21

In these wide-ranging studies, a visual, x-ray, quantitative, biochemical, and psychosocial record of the stresses of high altitude and high-speed air travel was created as a resource for air power planning. Human subjects (soldiers, undergraduates, medical volunteers, divers, and climbers) modeled the experience of long, cold, uncomfortable, and dangerous flights. Their bodies provided guidance as engineers reconfigured cockpits and clothing for pilots and crews.

These studies at many different institutions were forms of what we might call normal science. This is the term for scientific studies undertaken within an established paradigm for understanding phenomena. The idea of “normal science” was first articulated by Thomas Kuhn in his influential 1962 book, The Structure of Scientific Revolutions.22 Important to the concept is the idea that scientific research commonly draws on an established, broad consensus that certain kinds of questions are important and relevant. In the case of the nascent field of aviation medicine, creating controlled human injury under laboratory conditions was clearly a matter of broad consensus. Both the airplanes and the people placed inside them were seen as malleable. Some people might be less vulnerable to decompression sickness. Some psychological states might be controlled with drugs. And the airplane itself could be modified to reduce the damage of midrange accidents.

The ultimate goal in aviation medicine was to keep pilots and crews alive and functional for as long as possible under as many conditions as possible. This would permit them to continue their bombing missions, that is, keep them healthy so that they could fly high enough and long enough to injure and kill persons on the ground in targeted areas. While the emphasis in the formal texts of aviation medicine was on the survival and safety of crews, that survival and safety by definition facilitated death and injury to others on the ground. Aviation medicine was simultaneously knowledge of healing and of injuring.

Those already grievously injured also became the focus of field research during the war, for example in Henry Beecher’s studies on the Italian front (Figure 11). Beecher (1904–1976), a Harvard anesthesiologist, was one of the
FIGURE 11. Harvard anesthesiologist Henry Beecher and his team in the Italian Alps, 1944–45. Board for the Study of the Severely Wounded, North African–Mediterranean Theater of Operations, The Physiologic Effects of Wounds: Surgery in World War II (Washington, D.C.: Office of the Surgeon General, Department of the Army, 1952), frontispiece.
most influential figures in twentieth-century medicine. He wrote a widely discussed paper in the 1950s that defined the placebo effect. His papers on unethical research in the 1960s essentially launched the modern bioethics movement. He also later helped to redefine death as brain death, a policy that facilitated organ transplantation. Whatever one makes of Beecher as a person—and he was a complicated person, as both Laura Stark and Susan Lederer are helping us to understand—he certainly had an impact on the practices of biomedicine.23 He was also among those who played a key role in field studies of those injured in real time in war. His group worked with severely wounded soldiers on the Italian front to assess possible treatments for shock in soldiers so severely injured that they were not expected to survive. In the system of triage, Beecher and his team were given for field study those soldiers considered unlikely to survive.

Beecher’s experiences in his field laboratory on the Italian front reappeared over and over again in his later work. His interest in placebos, informed consent, and brain death all reflected his years as a peripatetic Army medic with
the Board for the Study of the Severely Wounded. In the field, in constant motion, and under tremendous stress himself, he experimented with the treatment of shock, the nature of pain, and the use of anesthesia. When Beecher’s service ended in the summer of 1945, he wrote to his commanding officer that “work in this theater under your guidance has been an experience that will influence the whole course of my life.”24

What was that work? Beecher had long been interested in shock, but he began doing laboratory research at Harvard on the treatment of shock in early 1940. By the time he finally arrived at Anzio, he had been campaigning to be allowed to do battlefield research for two years. Perhaps characteristically, Beecher was ahead of his peers in recognizing the value of the wounded soldiers found on a modern battlefield. His early appeals were merely generalized proposals to be sent to Europe if he could be of use. But he later settled on the problem of anesthesia and shock, and in his appeals noted that most research on the problem had been carried out with animals. He said he was not criticizing this work, but suggesting “that many of the urgent practical problems concerned with the relationship of anesthesia to shock can only be settled on human subjects where there is an abundance of material, namely, at one of the active fronts. In civil practice so few people are operated on in shock that they do not constitute an adequate testing ground.”25 Here the battlefield provided a superfluity of research subjects that could not be encountered in civilian life. Keeping him in a civilian setting at Harvard would constitute wasting this opportunity for important research. Beecher characterized it as “an opportunity of a generation, perhaps many generations” that was “slipping through our fingers.” The things that could be learned by a good observer on the battlefront could be of real value “not only in military matters but in civil practice for many years to come.” He wished to join the armed forces, he said in his appeal to the Chair of the OSRD Committee on Medical Research, and be sent to the front to do biomedical research on shock and anesthesia.26

In the summer of 1943, Beecher’s wishes came true. He was commissioned Major Beecher in the US Army and sent to North Africa as a consultant in resuscitation and anesthesia. He spent a total of twenty-five months in active service in Africa, Italy, and France, doing his most important work in Italy as a member of the Board for the Study of the Severely Wounded. This group (composed of six surgeons, one chemist, two nurses, and ten clerks, along with drivers and technicians) had a mobile laboratory and seven
pyramidal tents. They traveled throughout Italy following violent activity or "maximum military activity," collecting 186 severely wounded men for study. Of these patients, sixty-five died. This was a case fatality rate of 35 percent, about double that of the field hospital average. But the Board was sent only the most desperate cases, considered non-transportable regardless of how they had been injured (e.g., in battle or as a result of a car accident or other accident). Beecher's group was looking at wounds that were intended to be abstracted and neutral. The bodies in question were marked not by history or war but by physiological response to blood loss. Beecher's group found that "suitable clinical cases were abundant and the laboratory was often hard-pressed to keep up with the material."27

Most of those sent to Beecher were Americans wounded in action by gunshot, shell fragmentation, grenades, mine explosions, or the collapse of buildings. Two had been injured in truck accidents. One was injured in a plane crash, another in a tent fire, and three had been accidentally shot or stabbed. Six were civilians wounded during combat, and thirteen were wounded German prisoners of war. The range of people and causes of injury suggests the opportunities that modern warfare created. His project depended on collateral data, the unintended production of opportunities to collect and assess new knowledge, as a result of human and environmental damage produced by war.

Beecher's most intriguing field decision was to ask the grievously wounded soldiers with whom he interacted whether they were in pain. In the end, over the course of two years, he interviewed 225 men who had severe injuries from the Anzio, Venafro, and Cassino fronts and a few from southern France. Of these, he selected fifty men with extensive peripheral soft tissue damage and life-threatening injuries. He eliminated men who had severe head injuries because he wanted his subjects unaffected cognitively. He asked each man, "As you lie there, are you having any pain?" A surprising 32 percent of those he interviewed said they were having no pain at all. Around 24 percent said that their pain was severe. These were soldiers with life-threatening wounds, but most of them were not in severe pain.28 The interviews led Beecher to become interested in placebo effects and in the roles of mental states in medicine.29 In ways that must have gratified him, Beecher's example on the Italian front was invoked as the model for future field research with injured soldiers when the Korean War broke out.
From June 1950 when the war began until July 1953 when it ended with an armistice agreement that established a permanent demilitarized zone between North and South, the battlefields of the Korean War were more or less constantly functioning as field research sites for scientists and physicians from the United States. Korea became a grand experiment in gastric secretion, adrenal function, muscle metabolism, wounding, the psychology of battle, and the absorption of glucose and circulatory homeostasis after massive injury. Combat was a "unique opportunity" in which "healthy young adult males in excellent physical condition" were "severely injured by high velocity missiles." The 1955 report of the Surgical Research Team noted that trauma was the central medical experience of war, and "trauma initiates a dynamic process" in which "the wound is more than a wound," producing complex systemic changes throughout many bodily systems. This complexity, the report's authors proposed, made an active field research program essential to effective battlefield care.30

Some of the most important work in Korea focused on wound ballistics, the study of wounds as a way to understand how to produce more destructive weaponry. This was very different from Beecher's field work. He was studying wounds in the field as a way to figure out how to treat shock and preserve the lives of more of those who were severely injured. But wound ballistics was and is an enterprise built around using wounds produced in the battlefield or the laboratory to modify bullet and weapons technology. It is almost a form of back-engineering, from the destroyed flesh to the technological concepts. The goal is to figure out how and why bullets can be induced to cause more damage. We might call wound ballistics the opposite of health research. We could even call it a form of "public health in reverse," the term usually used for biological weapons.

Wound ballistics originated in the mid-nineteenth century, when new small arms technologies produced unfamiliar and devastating wounds. Eric Prokosch's study of the sciences of wound ballistics tracks the evolving idea that new industrialized bullets produced something akin to "an actual explosion within the body."31 Tissue not directly hit could be destroyed by the energy of the projectile, and beginning around 1848 scientists began shooting animal organs and tissues trying to understand what was happening. One key model was hydrodynamics. The human body is mostly water, and US scientist Charles Woodruff drew on marine engineering to explain "cavitation."32
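In modern wound-ballistics terms, the quantity behind this "explosive" effect is the kinetic energy a projectile sheds inside tissue, which drives the temporary cavity. The following first approximation is a standard modern summary, not a formula found in the nineteenth-century sources discussed here:

\[
E_{\text{deposited}} = \tfrac{1}{2}\, m \left( v_{\text{entry}}^{2} - v_{\text{exit}}^{2} \right),
\]

where $m$ is the projectile's mass and $v_{\text{entry}}$, $v_{\text{exit}}$ are its speeds entering and leaving the tissue. A bullet that tumbles, deforms, or fragments sheds more of its energy in the body, which is part of why bullet design mattered so much to the researchers described below.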
Two British researchers, Prokosch notes, even drew on an image of a "picturesque seaside scene" to explain what was happening:

In the summer time when the harbors of the Northeast coast are crowded with fishing craft, one can see the explosive effect exemplified. If a little tug enters the shoal of boats gradually it can push its way through them, disturbing only the boats that touch its bows; but were it to steam wildly through them it would scatter them right and left, transferring the shock straight to the harbor wall. A bullet entering the brain at a low velocity drives the contents against the walls of the cranial cavity, but it has not momentum enough to drive them with a rupturing force. A flat nose bullet, like the dumb dumb, is able to transfer its momentum more rapidly and more effectually than a fully mantled bullet—hence its greater explosive effect.33
A range of "flesh simulants" including modeling clay and soap permitted scientists to carry out controlled trial shots that varied all the factors (size, shape, and speed). So, for example, a US Army doctor working in 1916 and 1917, Lewis B. Wilson, fired into gelatin embedded with black threads. The threads permitted him to see the tangling up of fiber and flesh in a simulated wound.34

After the First World War, US Army ordnance and medical department teams began systematic studies of wounds. They shot anesthetized pigs and goats, seen as preferable to cadavers or carcasses because physiological effects of wounding could be studied. They also calculated the loss of velocity, and the rapidity of that loss, as a bullet passed through a target. By World War II, there was an established body of literature on bullets, speed, the nature of the wounding they produced, the cavitation problem, and the ways that different parts of the body were affected. Experimental techniques had become more accurate and sophisticated. Observers then realized how much velocity mattered. There was a breakpoint at about 2,500 feet per second. Any bullet moving faster produced significantly more devastating wounds.35

In December 1940, the primatologist Solly Zuckerman and his colleagues published a paper in the British Medical Journal that tracked bullets with a spark shadowgraph—a device that could cast a brief shadow of the target onto photographic paper. The method produced astonishing images of tissue change. Animal limbs swelled momentarily, in a distortion that the authors said "can only be likened to that of an internal explosion."36
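Why a velocity threshold should matter is suggested by simple arithmetic: kinetic energy grows with the square of speed. The bullet mass and speeds below are illustrative round numbers, not figures from the wartime studies:

\[
E_k = \tfrac{1}{2} m v^{2};
\qquad
E_k\!\left(0.010\,\text{kg},\ 550\,\text{m/s} \approx 1{,}800\,\text{fps}\right) \approx 1{,}500\,\text{J},
\qquad
E_k\!\left(0.010\,\text{kg},\ 760\,\text{m/s} \approx 2{,}500\,\text{fps}\right) \approx 2{,}900\,\text{J}.
\]

A roughly 40 percent increase in velocity nearly doubles the energy available for transfer to tissue, consistent with the observers' sense that wounds became qualitatively worse above the breakpoint.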
In 1943, E. Newton Harvey at Princeton University began shooting cats in order to test how bullets affected flesh. He had a team of five biologists, along with ballistics and x-ray technicians, at the Biological Laboratories. The group considered shooting great apes, which were closer in size to human soldiers, but they were large, expensive, and difficult to procure. For reasons of space and expense they settled on shooting cats and dogs, and eventually mostly cats. Harvey's team reduced the size of both missile and target in proportion so that the missile striking a small animal represented the situation "so far as mass of missile and mass of target are concerned analogous to those of a standard Army rifle ammunition and the human body." The cats then being shot at Princeton were surrogates for enemy soldiers. Mounted on a stand, anesthetized, and shot while being filmed and photographed, they represented future victims of US military action (Figure 12).

The group at Princeton used high-speed cameras to take pictures at a rate of 8,000 frames per second of the "changes which occur when a high velocity bullet enters soft tissue." Such wounds occur in a few thousandths of a second, but Harvey's team could make these rapid events visible and analyzable with high-speed and x-ray photography. Different parts of anesthetized cats' bodies were shaved and marked with grids. Images of heads, thighs, abdomens, and femurs documented the damage. Harvey's team calculated the law of force which retards the missile, developing a retardation coefficient of living muscle that measured the loss of velocity that a sphere experienced when going through a cat's thigh. In this way wound events became technical abstractions. The cats were not specific animals, but exemplary ones. What happened to them was generally relevant to wounds in all situations. Equations captured the bodily effects on any type of tissue at any energy. The laws revealed in these interactions between surrogate enemy soldier (a cat) and surrogate battlefield bullet could, in theory, be applied everywhere. Increasing the ability of weapons to produce injury was a technical and quantitative problem. Harvey's group at Princeton understood the importance of getting the equation right.
FIGURE 12. Roentgenogram (#288) of the thigh of a cat made after the femur was struck by a 1/32-inch steel sphere with an impact velocity of 3,000 fps. Note the shattered femur and the manner in which the fragments are clustered around it. James Boyd Coates, ed., Wound Ballistics (Washington, D.C.: Office of the Surgeon General, Department of the Army, 1962), figure 107.
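One way to read Harvey's "retardation coefficient" is through the standard quadratic drag law; the sketch below is an assumption about the general form, not a reconstruction of his actual equations. For a sphere of mass $m$ and cross-sectional area $A$ passing through tissue of density $\rho$ with drag coefficient $C_d$:

\[
m v \frac{dv}{dx} = -\tfrac{1}{2}\,\rho\, C_d\, A\, v^{2}
\quad\Longrightarrow\quad
v(x) = v_{0}\, e^{-\alpha x},
\qquad
\alpha = \frac{\rho\, C_d\, A}{2m}.
\]

Velocity decays exponentially with penetration depth $x$, so a single measured constant $\alpha$—the retardation coefficient—summarizes how a given tissue slows a given sphere. That is precisely what made the cat-thigh measurements generalizable abstractions.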
Wound ballistics studies in Korea were among hundreds integrated into war plans from the beginning. The Wound Ballistics Survey of the Medical Research and Development Board of the Surgeon General's Office of the Department of the Army tracked wounds and the effectiveness of body armor. From November 1950 until May 1951, Carl M. Herget, an Army PhD who had been working for several years on body armor; Captain George Coe, a member of the chemical corps; and physician Major James Beyer of the Medical Corps worked together in Korea to characterize wounds. In the charts and diagrams that made up most of their final report at the end of the war, the wound ballistics team in Korea presented data on 700,773 wounds and 4,600 persons. They concluded that much of the ammunition on the battlefield was basically wasted. Most fragments from most bombs hit no one. Small arms killed or wounded very few soldiers. A full 92 percent of casualties were the result of mortar and grenade fragments.37

Like geologists or ornithologists, they collected field objects that could be placed in relationship to each other and to their natural and biological consequences (Figure 13). Fragments of bombs and grenades from killed soldiers were placed in a format that mirrored how birds' eggs, plant seeds, or archaeological fragments might be placed, by ascending size or clumped in terms of shape or origin. Experts catalogued wounds and mortar fragments using the field methods of natural history. They classified, compared, measured, and named the fragments and wounds that formed the basis of their results. Fragments collected from a World War II German shell, for example, were laid out in a sequence, bits of missiles linked only by their outcome. All had been found in persons fatally wounded by other projectiles, not these. These were secondary missiles.

FIGURE 13. Placed in a format reminiscent of natural history collections, fragments recovered from a German 75 mm high-explosive shell. James Boyd Coates, ed., Wound Ballistics (Washington, D.C.: Office of the Surgeon General, Department of the Army, 1962), figure 27.

In another image, a drawing of a human face was marked with the locations at which fragments of Plexiglas had caused injuries, "Location of 85 Wounds due to Plexiglas Fragments."38 And in one particularly striking composite image, the anatomic locations of 6,003 hits on 850 killed in action due to shell fragments were superimposed on a drawing of a single male body (Figure 14). The patterns on this cumulative body revealed where a hit was most likely to kill: the front of the throat. Such data could provide guidance to snipers and to ballistics experts in the development of weapons and training of personnel. It could also be important in the development of battlefield body armor. Again, the map of the 850 killed in action inscribed knowledge to heal and to injure in a single image.

FIGURE 14. A single male body marked by 6,003 hits, as a guide to vulnerability. James Boyd Coates, ed., Wound Ballistics (Washington, D.C.: Office of the Surgeon General, Department of the Army, 1962), appendix H, figure 1.

Prokosch's powerful study of the history of the development of antipersonnel mines tracks the dramatic growth in the killing power of antipersonnel weapons—mines like the M18A1 Claymore, standardized by the Army in 1960 and the first US high-explosive munition in which steel balls were used as fragments. Claymore mines are constructed with "pre-fragmentation" that produces a "high level of mortality," exploding at precisely the right level to maximize bodily damage. The Claymore was developed in response to the
experiences and research data in the war in Korea. It was one practical result of the field studies of the Army's Wound Ballistics Survey.39

There is at least one other way that the bodies of US soldiers became contested sources of knowledge. The term "friendly fire" usually refers to injury from one's own artillery or stray bullets, in the context of actual warfare. People are sometimes hit by bullets or bombs intended for the enemy. But here I use the term to refer to other kinds of injury. In this case, I (somewhat idiosyncratically) invoke the idea to describe injury to one's own troops produced by scientific, medical, and technological sophistication. The friendly
fire I consider is the friendly fire of military dominance, deriving from the privileged status of the United States Armed Forces, in troops that fight with significant scientific advantages. I suggest that sophisticated armed forces face risks directly tied to that sophistication.

About 12 million gallons of the herbicide called Agent Orange (it was not orange; the stripe on the barrels was orange) were used in Vietnam between 1961 and 1971. Estimates today suggest these 12 million gallons included about 375 pounds of dioxin. Dioxin is one of the most dangerous chemicals known. Millions of Vietnamese civilians, tens of thousands of Vietnam veterans from several nations, and thousands of workers and other personnel were exposed to these herbicides. The country was drenched with Agent Orange and other herbicides as part of a strategy for managing forest warfare. The long-term health effects have been profound.

Interest in chemical herbicide development began in World War II. The chemical industry in general was trying to expand the options for agricultural chemicals for weed control and to increase food production, but all crop destruction technologies had possible military uses. By the mid-1940s chemists had produced effective herbicides in sufficient quantities to use them against Japan's rice crops. It is perhaps telling that there were concerns about destroying the rice crop in 1945 because the United States expected to occupy Japan sooner rather than later, and occupiers would need the rice. By 1950 there was a large and growing domestic market for herbicides. The Air Force secretly contracted with Hughes Aircraft to design a spray system for military use.

The first test of herbicides in Vietnam was in August 1961, but serious use began in January 1962. One of the earliest practices was to spray Agent Orange on foliage around the perimeter of US military installations with hand-held sprayers. The code name for the larger project was Operation Ranch Hand. Controversial from the outset, the program raised immediate concern in the scientific community about the ecological effects in jungles and mangroves. There were also ethical and legal questions raised about using any form of chemical weapons at all, even for tactical purposes, and even against plants. The crop destruction program was particularly sensitive, and the Kennedy White House chose to demand direct approval for each of the specific targets. Generally, the fields belonged to noncombatants. Spraying deprived both local residents and enemy soldiers of food. Stellman and Stellman, and colleagues, in their important work, have reconstructed the spraying missions and mapped where
levels of dioxin are most dangerous. Drawing on careful archival work with flight records and an understanding of the practices around spraying during the war, they identified at least 19,905 sorties between 1961 and 1971. After 1966, Agent Orange was joined by other herbicides including Agents Purple, Pink, and Green.40

Concerns about the ethics of using herbicides, the possibility that they could be seen as chemical weapons, and the uncertain ecological consequences were taken just seriously enough by US leadership to animate a camouflage program. The United States signed a pact with the Republic of Vietnam that stated that all herbicides belonged to the Vietnamese once they entered Vietnamese territory. Vietnamese nationals managed all inventory control and transfer of herbicides. And missions were legally framed as requested by the Vietnamese. While the US Air Force was actually spraying herbicides, the formal "responsibility" was with the Vietnamese. American crews wore civilian clothing when they were spraying, and US aircraft carried out the missions with removable identification insignias. Every mission included a Vietnamese crew member. All of this suggests how thoroughly US officials understood the nature of these activities. Some of the aircraft loaded with Agent Orange crashed, and hundreds of flights were aborted. Forty-two of these aborted flights that were intended to spray herbicides over large areas ended in emergency herbicide dumps, in which the entire chemical payload was jettisoned in about thirty seconds.

Early in 1967 more than 5,000 scientists delivered a petition asking President Johnson to end the use of Agent Orange in Vietnam. They were supported by the Federation of American Scientists and the American Association for the Advancement of Science. The spraying program ceased in May 1970. By that time the Environmental Protection Agency was reining in domestic use and this put pressure on the Air Force to remove Agent Orange from its arsenal. When the program was abruptly discontinued, there was still plenty of Agent Orange stored in the United States. The Air Force had tens of thousands of gallons in storage at Kelly Air Force Base outside San Antonio, Texas. In addition, manufacturing sites where Agent Orange had been produced were already seriously contaminated by dioxin. One area in New Jersey is one of the most heavily dioxin-polluted regions in the United States, entirely attributed to the manufacture there of Agent Orange by Diamond Shamrock.
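The figures given earlier allow a rough sense of the average contamination. Assuming the herbicide's density was close to that of water (an approximation; the actual mixture was somewhat denser), 375 pounds of dioxin in 12 million gallons works out to:

\[
\frac{375\ \text{lb} \approx 170\ \text{kg}}{12 \times 10^{6}\ \text{gal} \approx 4.5 \times 10^{7}\ \text{L}}
\approx 3.7\ \text{mg per liter, on the order of a few parts per million}.
\]

An average in the parts-per-million range is substantial for a compound whose biological effects are typically measured at far lower concentrations, and actual dioxin levels varied widely from batch to batch.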
It was true that the health effects and toxicity of dioxin were not well understood even within chemical manufacturing circles in the early years of herbicide manufacture in the 1930s and 1940s. By the 1960s, the companies knew they had a problem. In 1965 Dow Chemical shut down its production line for nearly a year to construct a new control process to minimize the level of dioxin in the final product. In the end, millions of people were exposed to high levels of dioxin—three to four million Vietnamese and tens of thousands of US, Australian, New Zealand, South Korean, and Vietnamese troops. US soldiers returning from Vietnam suffered from rashes, nausea, headaches, and birth defects in their children.

By the late 1970s many veterans were demanding help, suspecting that chemical exposure was implicated in their problems. The first response of the VA was to characterize their problems as psychological. Some doctors and administrators working in the Veterans Administration system, however, began to wonder if Agent Orange was the cause. One benefits counselor for the VA in Chicago, Maude DeVictor, began trying to correlate the sicknesses she was seeing with the spraying maps from Vietnam. She began to talk to representatives from chemical companies, trying to figure out what exactly was in the herbicide and how it could have made the veterans sick. Eventually, after the VA attempted to silence her, DeVictor contacted Chicago television reporters. A report on her ideas by Bill Kurtis, a reporter for Chicago's CBS affiliate WBBM-TV, aired in 1978. "Agent Orange: Vietnam's Deadly Fog" provoked a formal Air Force response, and studies began the next year. Eventually the CDC documented that veterans who had served in Vietnam were dying (of cancers) at a rate 45 percent higher than veterans who had not.41

In 1991 Congress moved studies of Agent Orange from the VA to the National Academy of Sciences. The goal was to provide a more neutral institutional setting for the controversial problem of Agent Orange exposure, and also to expand attention to include not only cancer but neurobehavioral disorders, respiratory disorders, immune system problems, gastrointestinal diseases, and thyroid problems. Meanwhile some of the most grievous consequences of Agent Orange were found in Vietnam itself, where people were continuing to live in a heavily contaminated dioxin environment. Studies in Vietnam began in 2002. In the end the class-action lawsuit that was successfully settled in the United States meant that Agent Orange effects were constructed as a medical
problem through law rather than through medical research proper. They have never been formally recognized by the scientific community. Dioxin is well understood. But the systems through which so many people in Vietnam, both soldiers and residents, were exposed to high levels of dioxin have prevented official recognition of its effects. Like the missions camouflaged as managed by South Vietnam—the shipments transferred, the falsely marked delivery jets—the long-term health effects are camouflaged by law, made liabilities rather than realities. It is a common enough practical strategy in technoscientific warfare.

In a case that shows similar qualities, the first Gulf War in 1991 placed US troops in a climate of technical and medical toxicity from many directions. It was a strange mix. Soldiers received multiple vaccines in the short build-up to the war. They sprayed their tents with organic phosphate pesticides to kill bugs. They experienced the smoke and debris from the explosion and burning of an Iraqi chemical weapons depot that contained sarin nerve gas. They were given a drug to protect them against nerve gases. By any measure their prepared and exposed bodies were unusually technological, scientific, and cyborgian. The vaccines and drugs they received were intended to protect them, but might have played a role in making them ill.

Almost 700,000 US troops were sent to the Persian Gulf at the end of 1990. Their assignment was to drive the Iraqi forces of President Saddam Hussein out of Kuwait. The military campaign was swift and widely supported by the US public. It was also successful. Shortly after the war, in March 1991, an ammunition depot in southern Iraq was purposely blown up by American combat engineers. This would be a fairly standard strategy to deprive the enemy of resources. But this ammunition depot, called Kamisiyah, contained sarin nerve gas. Those in the region exposed to the smoke from the explosion were almost certainly exposed on some level to the pyrogenic toxic products of one of the most deadly, rapidly acting nerve gases ever produced.42

The story of Gulf War syndrome follows a familiar trajectory. By 1996, about 100,000 of the 700,000 veterans of the first Gulf War were reporting a range of medical problems. They had gastrointestinal distress, malformed children, chronic fatigue syndrome, memory loss, autoimmune disorders, double vision, joint pain, and other symptoms. Now called Gulf War illness, the condition affects about 250,000 of the 697,000 veterans who served. The definitions and symptoms have varied over time. The VA's Research Advisory Committee on Gulf War Veterans' Illnesses (2014) defined it as fatigue,
pain, neurologic or cognitive problems, gastrointestinal problems, and skin and respiratory problems. The US Centers for Disease Control and Prevention's (CDC) definition overlaps but adds nasal congestion and excessive gas. Numerous studies have shown that Gulf War veterans have a higher prevalence of such symptoms than do non-deployed Persian Gulf War veterans or other control groups, but the illness category remains contentious.43

For decades veterans were told by doctors at the Veterans Administration, and by scientists on a VA committee, that their problems were psychological, the consequence of wartime stress or pre-existing psychiatric illness. In 1997 the Pentagon admitted that 100,000 American service members might have been exposed to nerve gas when the ammunition depot was blown up. In a familiar ritual, a new panel of scientists in 1998, convened by the Institute of Medicine of the National Academy of Sciences, also cast doubt on the connection between neurotoxin exposure and veterans' illnesses. In 2002 a new panel appointed by the Veterans Administration found that Persian Gulf veterans had double the risk of disease compared with veterans who did not go to the Persian Gulf.

There is a large literature on Gulf War syndrome, which has been the focus of several lengthy scientific reports. In these reports, there are technical details and there are underlying assumptions and values about the proper roles of institutions, the nature of biological evidence, and the problems of liability and risk. These are all issues subject to debate and disagreement. Like many other complex medical states, Gulf War syndrome occupies a liminal space, a borderland. If it was produced by the bombing of the weapons depot, it is the result of technoscientific war. And the same systems of knowledge production that brought it into being are now charged with testifying to its existence.

In some ways the history of military medicine is an awkward subject. Historians of medicine have consistently been less interested in questions about wartime experiences of health. In 2015, then president of the American Association for the History of Medicine Margaret Humphreys proposed that many historians of medicine have virtually ignored wars. She noted a convention in which a story of medical thinking or therapy progresses until "the war happened." The war, whichever war it happened to be, then ends the story. There is some outstanding work on the history of military medicine, trauma, health care and disability, and war, and particularly on shell shock
and battle trauma. As I have suggested here, active battlefields, at least in the twentieth century, have been medical hotspots of tremendous energy and productivity. The findings on these battlefields have transformed emergency care, trauma, surgery, shock treatment, and many other fields. Defense funding has supported critically important medical studies that benefit anyone who ends up in an emergency room. It is also important to recognize that the medical effects of modern warfare are technoscientific in two ways. They result from industrialized and scientized war, produced by chemistry, physics, mathematics, and other scientific fields. And they are in turn characterized, contained, or managed through the labor of scientific experts, including statisticians, physicians, epidemiologists, geneticists, and so on. This latter group provides the evidence of the effects of war, and may be called on to determine if any condition is legitimate, real, and biological. The production of experimental and battlefield injuries has often fused these two dimensions of modern technoscientific war. Knowledge of healing and knowledge of injuring work together simultaneously, both in the field and in the battlefield.

One final example: In January 1948, thirty-two heat-acclimatized men were transported from MacDill Air Force Base in Florida to Camp Shiloh in Manitoba, Canada. The field research project was an effort to understand the effects of severe cold on metabolism, dietary requirements, and the adrenal system. The men were provided with limited caloric intake for twelve days in –35°F weather. They stayed in a region chosen to assure isolation, desolation, and open exposure to the wind. Their urine and blood were monitored. Sleeping in tents, with only standard cold-weather gear, they were also divided into four groups of eight, with each group on a different ration. "As a psychologic strategy to ensure the continuity and reliability of the last days' observation, the subjects and staff were led to believe that they would be 'rescued' on the 14th day at the earliest." Instead, they were suddenly rescued on the evening of the 12th day, and were quickly transported to a warm building in Camp Shiloh. A report on this experiment appeared in the United States Armed Forces Medical Journal in 1950. It was a classic example of the production of experimental injury in order to understand how soldiers could function under conditions of extreme cold and limited dietary intake. It pushed the human body to test the limits of hunger, cold, and psychological stress. It even included a vague trick that
would guarantee that the last few days' results were consistent, not affected by the expectation of "rescue."

My approach is informed by the work of the literary scholar Elaine Scarry, who has explored the language used to describe war. She proposes that while it is true that the goals of any military action are formally political, diplomatic, or moral, the action itself is intended to produce human injury. In her 1985 study, The Body in Pain: The Making and Unmaking of the World, Scarry suggested that "injury is the thing every exhausting piece of strategy and every single weapon is designed to bring into being: it is not something inadvertently produced on the way to producing something else but is the relentless object of all military activity."44 Descriptions of technical capabilities, she proposes, for example a particular helicopter's ability to hover, are essentially assessments of the object's power to injure. In the case of a helicopter, this would involve the ability to see the opponent, to approach and reach the opponent, and to damage the opponent, as well as the helicopter crew's relative degree of safety, and its ability to go on injuring, either immediately or at some future time.

Scarry explicitly raises important questions about the common idea that injured bodies are "byproducts" of war. The language so commonly used in war in which injury to civilians is a "byproduct" (or collateral damage) suggests that such injury is accidental or unwanted. But all of the deaths on both sides of any war are centrally useful to whatever it was that was being sought through the war's activity. She proposes that the human community has uses for death, and that deaths in war are needed or required. Soldiers understand that it is this use to which they have been summoned, this to which they have consented, and that they are either going to die for their country or kill for their country. In the end, she proposes, all deaths validate the beliefs of the winning side. Once the war has ended the physical alterations no longer belong to two different sides but seem to belong to the war itself. In ways that I find persuasive, she proposes that war involves a simple and startling blend of the real and the fictional. The reality of the body—the body in pain, maimed, or dead, perhaps hard to dispose of—is separated from its source and conferred on an ideology, issue, or instance of political authority that lacks other, more benign, sources of substantiation. There is no straightforward advantage to settling international disputes by means of war rather than by song contests or chess games, except that the legitimacy of the outcome outlives the end of a war, because, as she puts it, so many of its
participants are frozen in a permanent act of participation. By "participation" she means the bodily nature of their own deaths or injuries. The winning side achieves for a time the force and status of a material fact by the "sheer material weight of the multitudes of damaged and opened human bodies."45

Her attention to injury and its usefulness suggests some of the questions raised by the mass production of both technology and injury. As I have suggested here, the human body in the twentieth century became a battlefield. It appeared in scientific studies as both weapon and target, as vulnerable, malleable, flexible. Its limits were tested in the air, in the cold, and in shock on the Italian front. Human bodily injury in the twentieth century came to constitute a key form of evidence, in both science and politics. As Scarry suggested, the injured body can be evidence of both victory and defeat. It is also scientific evidence of the extremes and the limits of embodiment. In the biomedical sciences focused on human extremes relevant to military violence, the mutual nature of knowledge to heal and to injure is illuminated.
7 Battlefield of the Mind
IN HIS FAMOUS 1947 ARTICLE "THE ENGINEERING OF CONSENT," THE PUBLIC RELATIONS AND PROPAGANDA
expert Edward Bernays proposed that communication networks were shrinking the world. In the United States, he said, "words hammer continually at the eyes and ears of America," and the country had become "a small room in which a single whisper is magnified thousands of times."1 It was an observation that became even more cogent as the decades unfolded.

Bernays was (as he often emphasized) twice a nephew of Sigmund Freud (his mother was Freud's sister; his father was the brother of Freud's wife). Born in Vienna, he moved to New York as a child. Drawing sometimes on his uncle's ideas, he became one of the most important and successful promoters of public relations and propaganda. His approaches and strategies informed not only the Nazi propaganda machine but also the management of public morale in the United States through the Second World War, Korean War, and even Vietnam War.2 Generally seen as cynically amoral and happy to persuade people of anything no matter how false or dangerous, Bernays was nonetheless recognized in his own life and later as influential and effective. His 1923 book, Crystallizing Public Opinion, drew on psychological ideas about an inner life and
described ways to animate subconscious desires that could be manipulated by images and hints, code words, and subtle claims that might be resisted if stated overtly. People could be persuaded, he suggested, only by those who understood how they thought, and Bernays proposed that they did not think clearly. For the average citizen, he said, "his own mind is the greatest barrier between himself and the facts . . . His own absolutism prevents him from seeing in terms of experience and thought rather than in terms of group action."3

Bernays knew that propaganda had risks. In 1942 he noted that appeals to the common man could be made to prejudices, hatred, and unfulfilled desires. "Manipulation of symbols by unscrupulous leaders against a background of post-war psychological and economic uncertainty, led millions to follow new leaders and ideologies in the 'twenties and 'thirties." The rise of communists, Nazis, and fascists was "obviously" accelerated by this manipulation of symbols. "Hitler used symbolism. The Hitler salute is political symbolism." The Nazis, Bernays said, had deployed "totalitarian brutality" with "threat, intimidation and censorship."4 Even things like national holidays and festivals were oriented in Nazi Germany around party needs and priorities. "Here we have the totalitarian apotheosis of morale building, carried on in total psychological warfare—offensive and defensive. Only the Nazis practiced this advance building of morale at home—false and demagogic though it was." Abroad they also pursued propaganda goals which Bernays identified as a "strategy of terror."5

Bernays was only one of many thinkers trying to come to terms with mass culture, nationalism, war, and propaganda in the twentieth century. Their work reflected a new public landscape of communications after 1900. Literacy scholars estimate that about 12 percent of the world population could read in 1820, with rates in Europe higher (at about 50 percent). By 1900, in the United States, 89 percent of people self-identified as being able to read.6 Increasing access in the United States to a high school education after about 1910 was shifting family dynamics in unexpected ways, as a new generation became more educated than their parents. Meanwhile, audio broadcasting by radio began in the 1910s, at first mostly for weather, but by 1920 it was providing news, educational programming, music, and dramatic entertainment. When the worldwide depression began in 1929, many households experienced it and understood it through radio, from FDR's "fireside chats" to news broadcasts about disturbing world events and the threat of another war as the 1930s unfolded.
The radio linked households and communities to a broader world. Like television in the 1960s and the internet in the 1990s, it changed the pace at which information about distant places and events could reach an expanding global audience.

Bernays and his peers recognized in this new mass audience a particular form of political, economic, and military power. In 1919 the psychologist G. Stanley Hall, surveying the brutal consequences of the First World War, said that "there is a large sense in which psychological forces play the chief role in all wars."7 As the infamous World War I propagandist George Creel put it, "the mind of a nation must be mobilized no less than its manpower."8 Political scientist Harold Lasswell said that democracies "need propaganda to keep the less informed members of the public under control" and published a set of strategies for doing this in wartime, including "strengthen the belief of the people that the enemy is responsible for the war, with examples of the enemy's depravity" and "make the public believe that unfavorable news is really enemy lies. This will prevent disunity and defeatism."9

This way of thinking made every human mind a potential outpost—a military space—to be undermined, recruited, or enrolled. "Terror bombing" pursued by Allied air forces in the Second World War constructed an image of an enemy civilian's psychology (fear) through which empires could fall. "Brainwashing" in the 1950s imagined the potential of a radical disjunction between the body and the mind: a seemingly American POW could in fact be a Communist agent, camouflaged in both an identity and a body. Studies of propaganda and communication quantitatively tracked how influence worked and how powerful it could be, and explored how it could function in the management and strategy of violent conflict. Even anthropology as a discipline engaged with the control of "culture" and the project of bringing isolated groups into "modernity" as consumers and allies of "freedom." Their work constructed the minds of isolated "primitives" as a resource for the power of the United States (and other powers). Scientific constructions of the mind as a battlefield interpreted mental states as defense resources. Changing minds became a key state project. Scientific and social scientific research on propaganda and communications, psychological warfare, brainwashing or mind control, and obedience to authority was often supported with military funding, especially in the decades roughly comprising the Cold War from the 1940s to the 1980s. It was often oriented
around establishing ways of scientifically controlling feelings and thoughts in order to control economies and political relationships. This research made the mind a critical battlefield in technoscientific war. While most observers might imagine that individual identity is stable and inviolable, psychology, psychiatry, anthropology, and even political science have flirted over the last century with conceptions of identity that are unstable and easily manipulated. Experts in these fields noticed those elements of the human mind that made it plastic and malleable. They even produced instruction guides for how to access this plasticity for political, military, and economic purposes. Their work suggested that while identity might be subjectively experienced as core, solid, and integral to the self, it could be left behind under conditions of coercive sensory deprivation, isolation, starvation, and manipulation—or training, modernization, and economic growth. By extension, such methods could be weaponized.

Psychological warfare accomplishes most of its work with words and arguments, images and propaganda. Leaflets distributed to enemy soldiers encouraging them to surrender were an early (and effective) form of psychological warfare in the First World War. Many other programs in industrialized nations followed, bringing social scientists into military programs. If war is politics by other means, as Clausewitz suggested, propaganda is perhaps war by other means. Psychology, sociology, political science, and anthropology are generally understood to be "soft" sciences, low status, and unable to extract reliable, law-like rules about natural truth or society. The practical applications of psychology and other social sciences to military needs appealed to practitioners at least partly because practical applications were linked to higher status in some scientific fields. Physicists benefited from their success building bombs. Why could not the social sciences do the same? Many of the leading social scientists of the twentieth century were involved in some way in psychological warfare programs or in defense needs, and US defense funding supported research that could be used to shape opinions, loyalties, and views of both friend and enemy. And in the United States, the CIA became an important source of funding in communications and psychological research.10

Much of this research was conceived in the half-light of militarized knowledge. By this I mean that in many social science projects, aspects of knowledge acquired for secret purposes could also be shared and publicized, their origins in defense projects occluded. The resulting scientific work was both open and secret at the same time. As Joy Rohde shows in her work on the
Pentagon's social science programs, experts often had both a public and a private or secret account of their work, and these accounts did not necessarily conflict; they just differed.11 And Christopher Simpson beautifully demonstrates in his Science of Coercion the mixed-up and entangled qualities of communications and political science research. Scholars in these fields routinely disappeared the defense origins of their theories. Ideas nurtured with CIA or Department of Defense funding were published as secret defense reports and then recycled as neutral, academic social scientific research, its defense origins and purposes expunged. Scholars often simply reworded or retitled projects for public consumption so that their military relevance disappeared. And partly because of this professional tendency, scholars whose thinking called into question the propriety or morality of these engagements with defense interests were defined out of the field.12 This Cold War quality of open scientific data—publicly known, with origins that were fuzzy, concealed, or vaguely disappeared—was also present in many other scientific domains. It is part of the reason that the influence of defense interests on knowledge production in general has been so poorly mapped and understood. It was meant to be invisible because defense work sometimes seemed inconsistent with the pursuit of pure knowledge.13 The social sciences were, perhaps, more vulnerable to this mixing.

Bernays was a uniquely powerful figure in early propaganda. He was famous for his ability to rally community leaders to his cause—whether that cause was Lucky Strike cigarettes or Ivory Soap. In his work for the American Tobacco Company he managed to persuade physicians to issue public findings that smoking tobacco was healthy. One of Bernays's best-known campaigns was intended to encourage women to smoke, an activity not socially acceptable for women at the time. It was part of his tobacco work in 1928. In an effort to expand the market for cigarettes, he skipped the direct pitch and focused instead on "freedom." Lucky Strike cigarettes, he proposed, were forms of female freedom. As Tye noted, up until World War I, "the pattern had been for firms to alter their product line or pitch to fit changing consumer tastes; Bernays believed that, approached the right way, consumers themselves could be made to do the adjusting." Bernays consulted psychologists to plumb women's fears and desires and then enlisted "opinion leaders" like medical experts and media stars to give testimonials about the benefits of smoking. He urged hotels to add cigarettes to their dinner menus. He proposed that smoking could save women from the "dangers of overeating."14
There were problems with such a roundabout and unfocused pitch. He might be successful in persuading women to smoke, but could he guarantee that they would smoke Lucky Strikes? Bernays's own research showed that women disliked the green and red packaging of these cigarettes. When the company's president refused to change the color and design, Bernays launched a campaign to promote the color green. Bringing aboard art professors, clothing designers, and society matrons, his campaign culminated in a lavish and well-publicized "Green Ball." Was it effective? Bernays thought so. American Tobacco's revenues rose by $32 million that year.

Like others in his profession, Bernays had little interest in truth with a capital T. Truth was merely a tool, something to be found or invented and used to convince people to buy a product or support a policy. His goal was to animate and engineer desire. In one of his most famous papers, "The Engineering of Consent," he proposed that experts skilled in policy and persuasion could engineer consent by covertly channeling public opinion. The social order appeared almost as a predictable machine that could be controlled if you knew where the levers were. As Bernays explained in his 1928 book Propaganda: "In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons . . . who understand the mental processes and social patterns of the masses." Bernays saw himself as one of the elite few who "understood."15

Following in the tradition of Bernays, and heavily influenced himself by Freudian thought, the political scientist Harold Lasswell came to play a crucial role in ideas about propaganda in the mid-twentieth-century United States. During the Second World War he was chief of the Experimental Division for the Study of Wartime Communications. This was a group housed at the Library of Congress and funded by grants from the Rockefeller Foundation. Lasswell helped to create an interdisciplinary group of scholars who developed communications research as a field. This group emphasized the power and importance of behavioral sciences, which could provide insights into human motivation and persuasion.16

After the war, Lasswell continued to study symbolism and propaganda at the Hoover Institution and RAND Corporation. Throughout his work he argued that democracies needed propaganda. The reason was that an intelligent elite should make decisions about public policy. These elites should then use the
tools of communication to convince the public that they were right. "We must put aside democratic dogmatisms about men being the best judges of their own interests." Such advice was particularly important during wartime, when the US should always blame the enemy for the war and claim unity and victory in the name of history and deity. Lasswell said pivotal events, such as the rise of Nazism in Germany, could be understood only by invoking psychological theories. In his Psychopathology and Politics, he said that people did not behave according to logic. The basic premise that people acted in their own best interest was wrong, Lasswell said. People could and did support policies that undermined what was most important to them. This happened, he said, because people responded to emotional and symbolic manipulation, particularly in a crisis, rather than to data and reason. Freudian notions of the id, ego, and superego could explain these dynamics, he suggested, as individuals succumbed to primitive impulses.17

Lasswell's imagined voters were sleepwalkers rather than thinking participants in democratic deliberation. They had to be led to think properly, because on their own they would be swept up in emotion. "White" propaganda (persuasion), Lasswell said, was perfectly legitimate in a democracy. "Black" propaganda, which involved the presentation of something with a false source—for example, information that pretended to be from a government critic when the source was in fact being paid by government agents—was much less so. But the most common form of propaganda, he said, was gray, with some mixture of openness and disinformation.18 It could work by invoking glittering generalities—for example, "freedom"—that bypassed reason and thought. Vague language about virtue could be effective, he said, as could oversimplification and the appeal to "plain folks." Propaganda could also draw on stereotypes, scapegoats, and well-known slogans. Lasswell noted that unstated assumptions could sometimes be more powerful and persuasive than openly acknowledged ones: a concept that would be rejected if stated explicitly could be accepted if it were simply implied.

Like many communications and political science scholars, Lasswell was receiving funding for his research in the 1950s and 1960s from the Central Intelligence Agency. The CIA provided full support for the Center for International Studies at the Massachusetts Institute of Technology. For the MIT group, the CIA underwrote publication of formal studies in both classified and nonclassified editions. It also funneled support for researchers at other
institutions through the Center. For example, the Ford Foundation funneled CIA funds to scholars at MIT and elsewhere.19

These networks suggest the intriguing combination of openness and concealment in psychological warfare and communications research. The CIA was concerned just enough about public response to CIA-sponsored studies of communications research to camouflage their lines of support. Funding through an apparently neutral or autonomous organization like the Ford Foundation protected both the scholars involved and the CIA itself. Open military funding was common in many of the sciences, of course. Scholars in physics, biology, chemistry, and even anthropology were supported by military sources ranging from the Office of Naval Research (a key supporter of oceanography) to the Atomic Energy Commission (a nominally civilian agency responsible for the nation's nuclear weapons programs that also supported environmental and biological research on a grand scale).20 The CIA was more secretive, and scholars might fear that CIA support would call into question the legitimacy of their research. The psychologist and communications theorist Hadley Cantril, associate director of the famous Princeton radio project in the late 1930s, was essentially running a broadcast service for the CIA. Cantril was recognized as a serious scientist, the first to attempt to collect and systematically assess communication theory and data survey findings. His surveys of public opinion around the world defined international public opinion studies for more than two decades.

Like many other scholars of the time, Cantril believed that the United States should work to bring everyone around the world, as rapidly as possible, into a modern capitalist economy. For Cantril, there could be no intellectually valid critique of Western interventionist development policies. There were, in his eyes, no legitimate questions about the Western assumption that what everyone wanted or needed was to become a consumer of mass-produced goods. Capitalism was inherently good and the only right answer for groups all over the world, no matter their current circumstances.21

This idea guided US policy for decades in sometimes devastating ways. It played out with particular violence in isolated populations in the poor world. It was a form of warfare, with economic and social goals that privileged the values and beliefs of elites in the United States. If people could be persuaded to support wars, vote in certain ways, purchase cigarettes, or use beauty products, could they also be persuaded to become modern, capitalistic,
democratic citizens, rather than revolutionary communists? Thwarting the labor of agitators or radicals, in what was then called the Third World, engaged many kinds of science. US observers feared that (naive, poor) people in traditional societies were easily swayed by the appeals of radical nationalists, socialists, and extremists.22 It was therefore important to attract them to capitalism—by fairly draconian means, if necessary.

Many nations such as Brazil, the Soviet Union, Peru, India, and China were in practice diverse and polyglot. They consisted of lands, cultures, and linguistic groups cobbled together, configured as nations, but with different kinds of land claims, agricultural practices, religious traditions, languages, and social organization. How could these diverse groups be made loyal to the modern state of the twentieth century? Anthropologists could study differences, calculate their impact, and perhaps suggest solutions. The United States had its own history of punitive and murderous practices to manage indigenous groups, as did Australia and many other places. But open, violent cultural erasure and genocidal policies were generally understood after 1945 as violations of human rights.

The goal of projects like the Cornell-Peru Project at Vicos was to establish a benevolent protocol for bringing indigenous groups into modernity. The Cornell group promised Peruvian officials a low-cost blueprint for rapid cultural transformation that could be applied in other places. The University would document the process and develop plans that could lead to its use in places where "backward" people with their "stagnant" lifestyles undermined economic growth. The Vicos Project was a field experiment where these ideas were applied to an isolated and impoverished area, one of the most celebrated, closely watched, and simultaneously vilified social science development projects of the period. The "Indian problem" in Peru focused on how to integrate millions of poor indigenous people into the fabric of the modern state, and the work at Vicos was supposedly providing a model for how to do this. Jason Pribilsky's compelling study of the project explores in practical terms how the project worked on the ground.23 Conducted between 1952 and 1966, the Cornell-Peru Project was a modernization experiment in applied social change. It took 2,250 indigenous people as research subjects and made the highlands of Peru a social science laboratory. The Cornell team brought new crops, pesticides, DDT, a modern health clinic, and new latrines to the community at Vicos. The project also encouraged new values including faith in modern science and medicine, a rejection of native healers, and consumerism.
FIGURE 15. Cornell Vicos Team in Peru, 1950s. Division of Rare and Manuscript Collections, Cornell University Library, RMC2004-5228.
It created the social change that it promised to monitor, record, and analyze (Figure 15). Widely seen as an audacious social experiment that moved an isolated group from “the Stone Age to the atomic age,” Vicos was also widely criticized. Some observers in Peru saw it as an effort to co-opt indigenous people and to prevent socialist revolution through material goods and social indoctrination in the values of capitalism. In 1955 Allan Holmberg, the Cornell anthropologist who led the project, held a prestigious fellowship at the Center for Advanced Study in the Behavioral Sciences at Stanford University. There he encountered a network of Cold War social scientists who were working to promote US interests around the world, including the political scientist Harold Lasswell, who held an appointment at the CIA-funded MIT Center for International Studies, and the RAND Corporation psychologist John L. Kennedy, who was doing research on ways to reduce human error in air defense systems. Holmberg’s discussions with these two men had an impact on his thinking about the Vicos project, as he began to see modernization as an anticommunist opportunity.
The Cornell researchers maintained an open door policy at Vicos, inviting other anthropologists, undergraduate students, Peace Corps volunteers, and even journalists to visit the site and learn more about it. In 1963 a journalist from the Christian Science Monitor described the project as proof that a community “however backward can enter the 20th century peacefully and quickly on the Democratic side of the fence.” The same year, Reader’s Digest called it “the miracle at Vicos”: “While the Soviets talked about improving the lot of underdeveloped peoples, the Vicosinos shook off the yoke of feudalism in a single decade.” Saturday Review in 1962 said that the experiment had “lifted the human spirit across 400 years within one decade.” As such exuberant press coverage suggests, Vicos stood as an exemplar of the potential for the transformation of indigenous groups not only in Peru but elsewhere.24 Pribilsky emphasizes the ways that the project seemed uniquely scientific to those involved. Anthropologists saw it as an unprecedented opportunity to develop a field experiment that was as close as possible to a laboratory setting for the transformation of society. The case provides a powerful way of seeing how the social sciences worked in the Cold War. Persuasion, propaganda, and social manipulation were brought to bear in the pursuit of world dominance. Knowledge of people and social systems could become a guide to manipulation. Military support of many different kinds traveled through the networks of academic social scientists in sociology, political science, anthropology, and economics. There was no innocent place to stand, and even those who wanted to help indigenous groups could be enrolled in these kinds of projects, which might increase access to food and health care for people who desperately needed them, but were also part of an agenda that granted limited agency to those being scientifically and politically transformed. David H. Price has meticulously documented the FBI’s extensive surveillance of activist anthropologists at the height of the Cold War, proposing that fears of McCarthyism “shaped and dulled what might have been a significant and vital anthropological critique of race, class and the inadequacies of global capitalism.”25 “Free inquiry” was quietly constrained by fears of state retaliation, as progressive intellectuals were systematically purged from policy roles and abandoned by academic institutions. In turn, knowledge itself was shaped by both opportunity and oppression. Social scientists seemed to have insights
that could be leveraged to produce social and political change: They were making a theory of mind, relevant to a key battlefield of the twentieth century: the mind. One domain where these questions became most overt, and most strange, was “brainwashing.” In the early twentieth century, the American psychologist and behaviorist John Broadus Watson (1878–1958) developed theories of human social control that emphasized the utter plasticity of the mind. He claimed that if he were given twelve healthy infants he could apply his behavioral techniques to make them into anyone he wished, any kind of person—musician, professional, or criminal. Identity in Watson’s theory of behaviorism was utterly formed by environment. It implied that all human behavior could be modified by scientific techniques of controlling the environment.26 Watson’s dissertation at the University of Chicago in 1903 was a study that correlated rat learning with the rat nervous system. Then he turned to seabirds. In 1913 he applied what he had learned to human beings. Having studied animals, which cannot be asked what they think, he proposed that consciousness was irrelevant to human behavior and certainly unimportant to psychology. The whole point of the psychological sciences, he said, was not to move inside the brain, but to predict and control human behavior. This, he thought, made it an objective science, one that could pursue questions with mathematical rigor.27 Watson’s career as an academic psychologist ended in 1920, when he was forced to resign from Johns Hopkins University over an affair he had with a graduate student. He went on to a successful career in marketing and PR.28 But his early ideas of an empty consciousness—a brain that does not know itself—were influential, particularly in the 1950s, when ideas about mind control became a pressing Cold War problem. One of these key ideas came to be known as brainwashing.29 The journalist Edward Hunter publicly used the term in September 1950, in a Miami News article. He said it was a literal translation from the Chinese hsi-nao, and that it combined modern science and ancient Chinese practices of coercion. In fact the CIA had been using the term well before the Miami News story, and Hunter knew of this use because of his work as a psychological warfare specialist for the US Office of Strategic Services and later the CIA.30 The term initially referred to Chinese methods of retraining uncooperative peasants after the Chinese Revolution of 1949. Raised in feudalistic circumstances, peasants in China had to be reconstructed as communists. Like the indigenous
peasants at Vicos, they had to accept new ways of life and new political loyalties. In China, those who resisted were imprisoned and tortured until they agreed to change. Some American POWs in the Korean War experienced similar forms of torture and manipulation. In response, some cooperated with the Chinese, signing confessions that falsely accused the United States of having engaged in bacteriological warfare. Chinese interrogators even induced twenty-one US prisoners to remain in Korea as committed Communists.31 The interrogators were especially interested in pilots, who could describe false biological weapons missions in seemingly persuasive detail. These confessions by captured POWs were used in worldwide propaganda campaigns denouncing the United States. Public response in the United States to these “turncoat GIs” captured the racial and sexual tensions of the 1950s: three of the twenty-one were black, possibly recruited to highlight racial inequality in the United States, and some reports characterized “half” of the GIs as homosexuals. Zweiback’s account of those who were interviewed either in China or after their eventual return to the United States suggests that there was no brainwashing involved—people stayed for a variety of reasons, including a fear of prosecution in the United States because they had “broken” and confessed to war crimes.32 But the idea of brainwashing had tremendous symbolic and political power. CIA officials began to worry about possible “Manchurian candidates.” If trained pilots could be persuaded to undermine their own country as a result of systematic torture and mind control, could they come back to the United States posing as loyal citizens, but actually bent on destroying the country and overthrowing democracy? This has of course been the plot of various films and novels. In 1956, the US Air Force Personnel and Training Research Center at Lackland Air Force Base published a study that assessed Communist techniques of coercive interrogation. It drew on interviews of repatriated prisoners who had resisted, and suggested that a combination of sleep deprivation, malnutrition, isolation, and extreme physical threat and pain could shift prisoners’ sense of identity. It all led “to a sense of terrible weariness and weakness.”33 Robert J. Lifton, who had been an Air Force psychiatrist in service in Korea (1951–1954), worked with some of the prisoners as they were repatriated and explored their experiences and the nature of what he called “thought reform” in his 1961 book on the process.34 He proposed that the human sense of identity and selfhood could be changed in several ways. Prisoners could be
induced to experience feelings of guilt and shame, perhaps with forced denunciations of friends and loved ones. Isolation and exhaustion could eventually produce a breaking point, a crisis. When the crisis came, captors commonly offered leniency and an opportunity to survive by essentially becoming someone else. Lifton noted that at this stage people would commonly tend to confess something. They might blame some rejected element of their identity for their flaws: capitalism, for example, or democracy. Then the captors would offer the option of rebirth as someone who followed the rules of the new order.35 His theorizing about these matters came at just the moment when the philosopher Hannah Arendt was attending, as a journalistic observer, the (televised) trial of the Nazi administrator Adolf Eichmann in Jerusalem. The trial began in April 1961, ended with a guilty verdict in December 1961, and led to Eichmann’s death by hanging in May 1962. Her five articles about the trial appeared in The New Yorker in February and March 1963. They appeared in book form as Eichmann in Jerusalem: A Report on the Banality of Evil later that same year.36 Her report was controversial and is still the subject of ongoing debate. To some critics, she blamed the Jews for the Holocaust, because her text described how effectively Eichmann worked with Jewish leaders in the management of trains to the death camps. It also portrayed Eichmann as anti-Semitic, but suggested that his anti-Semitism did not explain his participation in mass murder. Her search for an explanation settled elsewhere, on the power and importance of individual choices, hour by hour. Eichmann wanted to be promoted, she suggested, and like so many others (those who cleaned the floors at the Wannsee Conference Center where the Final Solution was planned, or those who drove the trains or sold supplies to the camps) he participated as a worker who wanted a promotion and wanted to do a good job. This is what made his evil banal.37 This deeply terrifying vision of totalitarian states and their ability to configure the behavior of individuals also intersected with the unfolding research of a twenty-seven-year-old assistant professor at Yale, Stanley Milgram. Milgram referenced the experiences of Lifton’s Chinese-held POWs in his grant proposal for these field studies, which were to become some of the most famous human psychological experiments ever conducted. In his later public pronouncements he tended to emphasize the relevance of his findings to an understanding of human behavior during the Holocaust. He was like others coming to terms with signs that the human mind was vulnerable to manipulation
in troubling ways. But the historian Ian Nicholson has persuasively argued that Milgram was also concerned with white masculinity and the troubling rise of the effeminate “organization man,” a theme that also comes up in Zweiback’s account of public reactions to the POWs who stayed in Korea.38 Milgram’s own research is so well known as to barely need description. But here it goes: Milgram presented the experiment to recruited subjects as a test of memory. Subjects were told that they would be interacting with a second research subject who was a “learner.” The recruited subjects were teachers. In a series of simple word trials, learners were punished by the “teacher” with a small electric shock if they answered incorrectly. The learner was, of course, a confederate in the experiment and had rehearsed dramatic responses to the rising voltage of the nonexistent shocks. The real subject of the scientific study was the behavior of the recruited teacher, and the real question was whether the teacher would continue to administer shocks as levers indicated increasing pain and danger to the supposed learner: “Intense Shock,” “Extreme Intensity Shock,” “Danger Severe Shock,” and “XXX.”39 Many subjects did protest as the level of shock went up, but an experimenter in a lab coat would urge them on with unrelenting directives to finish the experiment. The white lab coat was apparently important: trials without it produced a lower response. The Milgram experiments, conducted in various venues with variations in the performance, setting, sequence, and language, came to be widely seen as a touchstone for understanding submission to authority. The submissive response was common and even transnational. Published reports emphasized the submission of white males. The few women tested were not mentioned in the first three blockbuster papers that made Milgram’s fame and reputation, and in some ways Milgram’s experiments seemed to channel concerns about masculinity that permeated Cold War public culture, where a stream of popular and academic literature linked “consumerism, feminism, and Communist infiltration to the demise of the strong ‘inner-directed’ American male.” Nicholson points out that much of this popular literature expressed concerns about the “other-directed” conformist, a trend that spoke to a “worrisome lack of distance between the United States and the Soviet Union.” Milgram’s work, he suggests, presented not a “timeless” experiment on “human nature” (however much it might be true that people often submit to authority) but, rather, a historically contingent performance of American masculinity at a time of heightened male anxiety. This, he suggests, is part of what gave the experiments such an
immediate, charismatic quality, a legibility or social force that made Milgram’s career. His work seemed to confirm masculine fears about workplace conformity, and perhaps even about new forms of feminism. Much of the public and journalistic response, too, seemed to reflect concerns about disappearing masculinity and the decline of John Wayne–like “independent thinking” in US society.40 Milgram presented his work as demonstrating that under the right conditions anyone—Eichmann, or any white suburban father—could be induced to engage in cruelty. It was partly about Eichmann, the average Nazi, and even Chinese mind control. But it gained force from more domestic problems of gender identity. One lesson commonly derived from Milgram’s work was that the mind was vulnerable to manipulation in ways that mattered for war and social control. When the CIA began studying mind-altering drugs, its work reflected this notion. After the Second World War, and in response to fears of potential Soviet competition, US defense agencies, particularly the CIA, actively and aggressively pursued studies of ESP, psychokinesis, map dowsing, divination, and supernatural or paranormal phenomena. CIA programs tested many kinds of hallucinogenic and psychotropic drugs on human subjects in an effort to see how they might be useful for extracting information or manipulating subjects. CIA programs studied people who claimed to have unusual powers—psychics, spoon benders, mediums—who were subjected to laboratory and field tests of their abilities. Scientists at forty-four US universities and colleges participated in these CIA projects focused on hallucinogens and special powers, though often without knowing that their funding came from the CIA. The goal was to determine whether supernatural abilities were real, replicable, and potentially useful in war. Much of the research involved elite scientific communities, including work at the Stanford Research Institute, at universities like Harvard and Princeton, and in laboratory facilities supported by the Navy, Army, Air Force, and Atomic Energy Commission. It was a shadow world of spectacular paranormal events, LSD, magic mushrooms, and unlikely intersections of science and divination, operating in the usual half-light of secrecy and publicity. Beginning in 1953, in response to reports of Chinese “brainwashing” techniques, CIA Director Allen Dulles approved the creation of the CIA’s MKUltra Project in “mind control.” The CIA aggressively tested LSD and other drugs as aids to interrogation and the control of human behavior
(many agents took LSD themselves). It involved thousands of subjects, most of whom were drugged without consent and many of whom were permanently damaged by their experiences with powerful psychotropic substances. In one of the most infamous cases, the Fort Detrick biochemist Frank Olson was secretly given LSD in his coffee as a part of MKUltra, and nine days later died in a fall from a hotel window. His death has been generally assessed as a suicide brought on by LSD, though his family views it as a murder. Most of the MKUltra records were purposely destroyed in 1973, on the order of the then-director of the CIA. But some that were stored incorrectly survived the purge, and in the 1970s were part of Congressional investigations into the CIA programs. Several lawsuits by those affected and their families followed.41 Many of the protocols and practices in MKUltra did not remotely conform to rigorous scientific methods or ethical research practice—to say the least—but they certainly reflected the idea that scientific research and experimentation could lead to a better military understanding of controlling the human mind. Ideas about mysterious mental powers have not entirely disappeared. In 2014, a $3.85 million Office of Naval Research program began studying “Spidey sense.” This was a way of describing the intuitive premonition of danger that some soldiers had demonstrated in Iraq and even in some earlier wars. Some individuals were known by their peers as particularly good at avoiding enemies or IEDs and could report a “premonition” of danger that helped them protect themselves and others. It seemed to be a form of risk awareness that came without conscious logic. The Navy began developing training to help troops in active combat situations “understand connections . . . and anticipate trajectories.” Peter Squire, a program officer in ONR’s Expeditionary Maneuver Warfare and Combating Terrorism Department, told the journalist Annie Jacobsen (who has written several fascinating books about these remarkable borderline worlds of science and war) that “we have to understand what gives rise to this so-called ‘sixth sense.’ If the researchers understand the process, there may be ways to accelerate it—and possibly spread the powers of intuition throughout military units.” The goal is to teach more troops premonition. Note that DoD no longer calls it ESP. It is now “sensemaking,” defined as “a motivated continuous effort to understand connections (which can be among people, places, and events) in order to anticipate their trajectories and act effectively.”42 If the brain has predictive capabilities, defense interests are eager to exploit them in war.
Fear was presumably always a part of managing war. Generals, soldiers, and theorists like Clausewitz described it and thought about it. What was different in the twentieth century was the rise of a network of trained (authoritative) experts who could help produce strategies of fear, mental distress, and mind control. They could apply experimental methods to calibrate how fear and desire could be used for state goals. In the process, the mind became a scientific battlefield, a terrain to be conquered. Some of the theories that social scientists produced matched specific political fears—of the Soviet Union, revolutionaries in the developing world, and indigenous groups in the United States and elsewhere. Over the course of the twentieth century, sophisticated human sciences—psychology, sociology, and political science—became technical resources for the production of militarily important mental states: hopelessness, fear, morale, and courage. The brain or mind became a place where revolutions could be thwarted, dictators brought down, and capitalist modernity sustained. Propaganda could prevent communism in vulnerable populations, and it could mobilize one’s own civilians. The new technical armamentarium drew on social scientific research, theories of the mind, and an expectation that emotions could be studied, monitored, and controlled as a path to victory. The human mind remains one of the most important battlefields in scientific warfare. As terrorism makes clear, the mind is a recognized agent of social order and disorder. Mental states undergird social and political systems. The mind continues to be a critical defense territory to be mapped. The Defense Advanced Research Projects Agency (DARPA) is one of the key funders of the brain-mapping project now underway. This seemingly interior mental space—organ, imagination, and identity—is in practice constructed and deployed widely across fields of knowledge, practice, politics, and social order. In a provocative 1938 paper in a political science journal, the psychoanalyst Gregory Zilboorg proposed that scientifically understanding propaganda would require an understanding of the “obscure psychological processes which take place within the individual.” A man’s “instincts,” he proposed, are the “only source of his emotions,” and emotions, “visible or invisible, are the source, the cause, and the meaning of his living, his behaving.” Propaganda could mobilize any population. It was also, he said, “cheaper than violence, bribery or other possible control techniques. In times of peace we have a multitude of civilized ways of expressing our hatred,” for example in political campaigns, violent sports, vicarious participation in crimes, and
participation in the criminal justice system. But in times of war there were no civilized ways to hate. “We must become primitive, and we do.” Dictatorial states, he said, were particularly oriented around “brewing hatred,” and “aggressive though these countries appear to democratic communities, their people as well as their governments are sincerely convinced that they do nothing more than defend themselves.”43 Zilboorg thus proposed that the mind was indeed the real location—the core terrain—where wars, revolutions, racial hatreds, and genocides happened. He gave violence a fundamental psychological role and constructed it as almost natural, flowing from human needs. It was necessary in his account. As he contemplated the seething world order in the late 1930s, this must have seemed like a persuasive claim.
8 Blue Marble
A MILITARIZED GEOGRAPHY CONGEALED AROUND THE WORLD BETWEEN 1945 AND 1975. THE COLD WAR ARMS RACE
brought technologies, weaponry, people, air bases, missiles, and detonations to places once invisible, inhospitable, irrelevant, unknown; to tropical paradises, frozen landscapes, deserts, islands; and to airless, cold places in space and the upper atmosphere and under the sea. These were often places seen by military planners as empty, valueless, owned by no one, occupied by no one, remote, and expendable. Geographical spaces seen this way came to be “filled” with technical achievements. They were sites of engineering and scientific feats of astonishing scale and cost, including missile silos in Greenland’s moving ice (Camp Century), massive bunkers underground (Greenbrier), and ambitious new satellites (CORONA) that circled in space and took surveillance photographs even of the most inaccessible landscapes of the USSR.1 With a sense that every location on the globe posed some kind of potential threat or offered some strategic advantage, US military planners expanded the empire through science and technology. Knowledge was built into the structure of this geographical expansion. The “shadow libraries” of the Cold War literally preserved underground the “human knowledge” and record of
the “American way of life,” a way of life to be reconstituted after World War III on the basis of books, photographs, and recordings. Meanwhile, a constant stream of new knowledge made by scientists, engineers, medical experts, and social scientists provided new ways of reconfiguring the geographical world. Many of these projects also fused military and civilian purposes and operated in the half-light of both publicity and secrecy: A lucky Boy Scout won a visit to a nuclear village under the ice at Camp Century, but its mobile, under-the-ice warheads were secret. Nuclear weapons made this militarized world geography in two ways. First, nuclear weapons were formally secret and therefore ideally kept, tested, and produced in out-of-the-way places. Nuclear strategy required silos in bases that could be hidden or were less visible to enemy surveillance. Plutonium production and nuclear testing also required empty and isolated spaces where secrecy of a certain kind could be sustained. Such facilities and weapons testing programs were physically dispersed in places that were at least thought to be isolated or remote. Isolation was part of nuclear production in many places. The second way that nuclear weapons made this new world was through the mobility and detectability—with specialized monitoring devices—of radiation itself. Radioactive wastes and dust moved around the globe, sending “messages,” so to speak, about testing and plutonium production, and in practice challenging any idea of isolation, emptiness, or secrecy. Eventually, there seemed to be no place that was not linked to nuclearity: traces of radiation appeared in the upper atmosphere, children’s baby teeth, fish, topsoil, photographic plates, reefs, and rain. Soviet detonations could be detected almost in real time in the wind. Radiation spread through water and air, invaded human and animal bodies, traveled through the food chain, settled in soils, and accumulated in streams and rivers. Gradually, unevenly, its presence became known as a global problem. Environmentally disastrous, this remaking of the planet also created places that became “natural” in a different way. It is now possible to “dive Bikini” (it is still radioactive, so there are time limits) in order to see what appears to be a thriving undersea world. Like Chernobyl and other places free of human habitation because of radioactive contamination, Bikini has become a special site for “nature” in which wild living things are safe from human interference. While the field biologists Timothy Mousseau and Anders Moller have documented the negative impact of radiation on life in such contaminated
places—in sperm abnormalities and unusual growth patterns—other scientists propose that even if radiation is damaging wildlife, human presence would be even worse.2 The bombs and other nuclear disasters have created contaminated purity, made by damage. Nature is protected by what destroys its ability to support human life. As the anthropologist Joseph Masco has observed, human populations have transformed their environments for as far back as we can read human history, but this global nuclear economy “does represent something new. For the first time, the effects of industrial transformation are both worldwide and nationalized through a discourse of state security.”3 The effects were material, social, ecological, and political. They were state sponsored and embedded in everyday life. Nuclear weapons as technologies thus physically performed the mixtures of publicity and secrecy that characterized much of the Cold War knowledge enterprise. They remade the world as a bomb site and missile base, and as a radioactive, contaminated, environmentally linked whole. The experience of this new geography had class and political dimensions. Poor regions of the world were pulled into a vast “slow” nuclear war: the detonation of more than 2,000 nuclear weapons in atmospheric weapons testing programs by the superpowers was, as Bo Jacobs has so vividly reminded us, in practice a limited nuclear war waged against people with no political or technological power.4 Communities in the Marshall Islands, Algeria, Aboriginal territories in Australia, French Polynesia, and Kiribati in the Pacific Ocean were repeatedly bombed. Their populations were exposed to radiation, blast, and burn, and their lands contaminated, abandoned, and confiscated. Although Siberia and Nevada were more frequently bombed, by the Soviet Union and the United States, respectively, in territorial domestic testing, many of the remote island bomb sites were colonial, extraterritorial, and uniquely marginal. They were, Jacobs says, “the location of a nuclear war fought out of sight of the developed world,” and while scholars in the developed world repeatedly described nuclear weapons as “unusable,” in the poor world they were in fact used repeatedly. The commonly repeated premise that testing preserved “peace” was incoherent. Testing, Jacobs proposes, was itself a war.5 The geographical transformations of the Cold War were beautiful, arresting, and tragic. The earth became a global battleground.6 I notice here the energy and the creativity that were brought to bear in these projects. This attention is not intended to valorize either the projects or the skills, or to suggest
that the people and ideas involved should be seen as demonstrative of human progress. Rather, it is to call the reader’s attention to just how much human capital has been expended on vast systems of this kind. I want readers to notice what the systems used up, in a finite world of human talent and a finite world of natural resources and life. I want them to notice the labor involved, the hard work, the inventiveness, and the resources expended. Noticing it is a way of understanding and seeing its consequences. I close with a consideration of The Blue Marble, the photograph taken of earth from space in 1972. It was a beautiful planet already bound together by networks of technical expertise, and already deeply compromised. I will start with the Columbia River. The visitors who chose the Hanford site in central Washington State on the Columbia River in December 1942 spent four hours there before they decided that it was ideal for a wartime plutonium production facility. In their initial report, those identifying the site noted the “terrain generally flat, sloping to river. Sandy soil with no vegetation other than sagebrush. Known locally as ‘scab land’ and considered of no value [my emphasis]. Total population and area estimated at less than 1,000 . . . Remainder of land practically worthless.”7 The idea of worthlessness and emptiness, of a valueless place that could be filled with wartime and Cold War technology, appeared over and over in site selection processes for nuclear testing, production, and waste. The sense of Hanford (Figure 16) as remote and unpopulated persisted long after the site was in fact filled with people, including Hanford workers and their families, and the ranchers and Native Americans who had been there all along.8 This prevailing sense of emptiness made many of the plant’s policies and practices seem logical. The plant needed water, energy, and emptiness. The first serious study of the impact of the radioactive materials generated by the plutonium plant began before the plant was operational and was inspired by salmon, a part of nature understood to have commercial value. Wild salmon had been the focus of a massive extraction industry in the region in the nineteenth century, as canning technologies made it possible to ship and market the fish broadly. The dams built in the 1930s destroyed up-river salmon breeding. Only the lower dam had a fish ladder. By 1943, when construction at the plutonium site began at Hanford, commercial salmon fisheries were compensating for the disrupted natural salmon runs. They kept salmon in pens, drawing on the river water to keep their fish alive and healthy.
FIGURE 16. The B-Reactor at Hanford: 1945 view from above the entire facility. United States Department of Energy.
Gen. Leslie Groves, who ran the entire Manhattan Engineer District that was building the bomb, later reported that one of his colleagues told him that “whatever else you may accomplish, you will incur the everlasting enmity of the entire Northwest if you harm a single scale on a single salmon.” Perhaps in response, an Applied Fisheries Laboratory was included in the original construction of the site. It was a classified, secret fish laboratory run by scientists from the University of Washington. Concern about fish as commercial products (not actors in an ecosystem) was built into the Hanford site. The contract with the university said that the purpose of the lab was to investigate the use of x-rays in the treatment of fungal infections in salmon. This, however, was not the point. The laboratory staff was in fact using x-rays as a source of radiation to study the effects of radiation on salmon and trout. The key question was whether the Manhattan Engineer District could be held accountable in court for possible future damage to the valuable fisheries.9 The same legal concerns shaped the first scientific monitoring programs for fallout, for example, studies of the fallout from the Trinity test in the summer of 1945. Army personnel used radiation survey meters to track fallout
up to 200 miles from the test. The distance of 200 miles was a legal distance, rather than a biological or a physical one. It was the point beyond which legal counsel believed that litigation was unlikely.10 Soon after, radiation from the war began to show up everywhere. The July 1945 Trinity test in New Mexico fogged photographic plates at Eastman Kodak’s Indiana plant. By late 1945, radioactive dust was known to be circling in the upper atmosphere around the world. After the Soviet detonation of an atomic bomb in fall of 1949, those involved in monitoring nuclear risks realized that the technologies that detected radioactive dust could also reveal the state of the Soviet arsenal.11 The Hanford facility therefore began purposely releasing highly radioactive iodine-131 into the atmosphere in late 1949. The goal was to create an experimental circumstance that might help the US Air Force assess any airborne radioactive materials that were detected and believed to come from Soviet sources. These releases were called green runs, because the iodine was released while it was “green,” still highly radioactive, instead of being cooled and filtered first. In December 1949, the plant released roughly 8,000 Ci of iodine-131 from the smokestack at Hanford. This was probably the largest single-day release in the history of the plant.12 The experiment in 1949, the details of which were made public only in 1986, contaminated a large area. It threatened regional wildlife and livestock, and it posed a significant risk for civilians and for workers at Hanford. Many scholars have explored how and why Hanford was managed the way it was. Over a few decades (between 1943 and 1983), engineers, scientists, physicians, and administrators, all highly trained, well-educated, and at least as well-informed as anyone about the risks of contamination, systematically created a costly disaster. Hanford is now one of the most polluted and contaminated places in the world, 586 square miles of Superfund site. Some of the damage done at Hanford is basically impossible to undo, and some sections can be managed only with containment—tunnels that will hold radioactive waste “forever.” How did this happen? Specialized and selective ignorance was probably crucial. During the war itself, rapid production to meet a wartime emergency was a higher priority than safety. While those managing Hanford wanted workers to be protected, the entire facility was rushing to produce enough plutonium for the war effort. In addition, the regional ecosystem was perceived superficially. Those choosing the site did not consider the larger environment, did not think about
forms of life that were not economically valuable, and did not imagine the “eternal” quality of the waste that the plant would produce. They continued to think of the location as isolated, when it was connected in the usual ecosystem webs that are more widely recognized today. They continued to think the waste could be buried safely when it could not. Economic interests and the personal safety of workers were higher priorities than environmental damage. Like those who dumped chemical weapons into the sea, those managing Hanford imagined a natural world that was almost infinite, able to disperse anything until it no longer mattered. The site seemed massive, a place where burying waste and releasing radiation would somehow never matter. The same attitude seemed to prevail in many other places that were gradually militarized. Deserts, too, became nuclear. The first atomic test ever carried out took place in New Mexico in July 1945. Eventually, the American West became a kind of nuclear colony. Lands belonging to Native American groups were contaminated. Federal lands gained missile silos and testing grounds. At the very moment when Hollywood films celebrated the glories of the American West—with John Wayne starring as the masculine ideal in films like Red River (1948), Rio Grande (1950), Hondo (1953), and Rio Bravo (1959)—the West was becoming a nuclear territory. Ironically, as John Terino pointed out to me, John Wayne’s The Conqueror (1956), a movie in which he played Genghis Khan, with Susan Hayward as his Tartar love interest, was shot partially in St. George, Utah, down range from the Nevada Nuclear Test Site. Some observers referenced in the Wikipedia entry about the film have blamed the testing for high-profile deaths from cancer among those who participated in the filming.13 The first test of an atomic weapon seemed to require a desert (Figure 17). The Harvard physicist Kenneth Bainbridge was project director for this first atomic test. He was charged with figuring out how to prepare for the blast, debris, and radiation the bomb was expected to produce. He also had to choose a location. Los Alamos, where the bomb had been developed, was ruled out from the beginning. It was too close to an urban center, and the test would be loud and visible. At this stage, in July 1945, publicity was not wanted. Bainbridge and his colleagues favored an area that was flat, with little rainfall, low winds, and few or no people. They considered deserts in New Mexico and California, and even sandbars off the coast of Texas. Eventually they settled on the Alamogordo bombing range, in bleak territory near White Sands, New Mexico.
FIGURE 17. Trinity Test, 1945, Alamogordo (later White Sands). Atomic Heritage Foundation.
It was already owned by the US government. It was flat, barren, isolated, and unpopulated. The closest town was twenty-seven miles away. It was unfortunately also prone to intense winds, which might spread fallout. But despite these risks, Alamogordo became the site of the test that was codenamed Trinity. The first device was detonated on July 16, 1945, on continental American territory, twenty-seven miles from an American town.14 The five-ton gadget exploded that day was mounted on a platform at the top of a 103-foot-high tower. In essence, the goal of the test was to see whether the plutonium implosion–type bomb would work. The team at Los Alamos was confident that the bomb type to be used at Hiroshima—the gun type with a uranium core—would work. Planners contemplating the test worried about the glare from the explosion. Observers were given suntan lotion and sunglasses. The weather at the site was clear by about 5:30 a.m., and the bomb was detonated. Another weapon would not be fired on continental American soil for more than five years. In 1947, after the first Bikini test in the Marshall Islands, the Atomic Energy Commission, the new civilian organization overseeing the nuclear arsenal, began looking again for a testing location within the continental United States. Pacific sites posed problems of transportation for troops, observers, and materials, and were inherently less secure than domestic lands. AEC officials worried that only a national emergency could justify testing within
the continental United States, but soon the Soviet Union supplied one. In August 1949 it detonated a nuclear weapon that was detected within about a week by US observers. Then in October 1949, the Chinese Communist leader Mao Zedong declared the creation of the People’s Republic of China. In the summer of 1950, the United States became involved in the Korean conflict. Suddenly, three emergencies seemed to have arrived. On December 18, 1950, President Truman approved a domestic bomb testing site for nuclear weapons in southern Nevada. It was part of the already established Tonopah Bombing and Gunnery Range in Nye County, about sixty-five miles northwest of Las Vegas. It was not the only pre-existing military location where a new domestic bomb testing site might have been established. Other sites considered included Alamogordo White Sands in New Mexico (the site of the Trinity test), Dugway Proving Grounds in Utah, and an area near Eureka, Nevada. The site actually chosen was already Air Force property and was promptly turned over to the Atomic Energy Commission. Eventually, it included 850,000 acres or 1,350 square miles—a vast, empty space for practicing with nuclear weapons. When it announced in 1950 that atomic testing would begin again in the continental United States, the AEC proclaimed that no effects would be felt outside the testing range. In January 1951 the first testing began. There were five nuclear detonations between January 27 and February 6, 1951. Like all such tests, these had a codename—in this case, Operation Ranger. The February 2 test shattered windows in Las Vegas. Eventually, Ground Zero was moved twenty-nine miles north to Yucca Flats, an area surrounded by mountains. While testing continued in the Pacific until 1958, the Nevada Proving Grounds at Yucca Flats became the most important site for atomic warfare maneuvers with US ground troops. People were brought to the testing zone in order for them to experience the atomic bomb. Having seen and experienced a nuclear weapon, this thinking went, soldiers would be better prepared in the next war. They would be able to manage their own fears. Soldiers participating in these tests were given a pamphlet warning about the dangers of desert reptiles and insects. They were given no health information about exposure to radiation from the bombs.15 In the course of the Cold War more than 1,000 nuclear weapons were detonated in Nevada. One above-ground test was carried out in Colorado, and there was early testing in New Mexico. But the Nevada Proving Ground was
the most important domestic nuclear testing site until atmospheric testing ended in 1963. The Soviet Union, engaged in its own testing program, carried out almost 500 tests at the Semipalatinsk Test Site in what is now Kazakhstan, with a similar disregard for the environmental or medical consequences.16 It was a region, like Hanford, that was presented by Soviet authorities as “uninhabited.” The practices at such testing grounds might be implicated in emerging ecological ideas. Anthropologist Joseph Masco has proposed that the 1954 film Them!, which involved very large mutant ants produced by weapons tests in Nevada, depicted atomic bombs as an ecological rather than strictly military threat. The ants were evidence of the mutation produced by radiation and of the possible biological and environmental risks of the bombs. One CDC report estimated that above-ground nuclear testing in the continental United States has produced 11,000 cancer deaths and somewhere between 11,300 and 212,000 thyroid cancers among US citizens. This, Masco proposes, constitutes a radioactive ecology both in the environment and in individual bodies.17 Makhijani and Schwartz have identified seven classes of people who must negotiate the health risks resulting from US nuclear testing and production. The first six are traditional: workers, soldiers who participated in weapons tests, human subjects in research projects, DoD personnel who help maintain weapons, and survivors at Hiroshima and Nagasaki. The seventh group, they propose, consists of “the world inhabitants for centuries to come.”18 The site also became relevant to the redemptive technology of nuclear energy. In 2002, the area at Yucca Flats was chosen for a nuclear waste disposal site for spent nuclear fuel and high-level radioactive waste from nuclear energy plants in the United States. The Yucca Mountain Nuclear Waste Repository—which at this point holds no nuclear wastes—is adjacent to the Nevada Test Site. Valued in 1950 for its surface qualities, the region’s geology did not initially matter. But as atmospheric testing was banned and underground testing seemed likely to take its place, the geology of the region became important. Significant scientific studies were carried out of its groundwater hydrology, geological properties, and stratigraphy. The knowledge gained from these same studies later facilitated plans for possible underground waste storage, which required still more geological research. Yucca Mountain is now one of the best understood geological regions in the United States.19
And the nuclear waste and environmental damage initially identified by AEC insiders as “unimportant” has arguably become the most important issue for the nuclear industry today.20 The Yucca Mountain nuclear storage plan has been controversial and unpopular. Candidate Barack Obama promised to abandon it and, once in office, basically did. Opposition in Nevada is particularly intense. The 104 operating US nuclear energy plants today therefore have nowhere to store their wastes—a critical problem in the industry. Those who favor the use of Yucca Mountain as a nuclear waste repository seem to commonly dismiss opposition to the site as political rather than scientific. Of course, the same would be true of the original creation of the Nevada Test Site, the historical circumstance that made Yucca Mountain a possible waste repository. The selection of that place in Nevada was always political rather than scientific. The American West, with its blooming cacti and stunning mesas and bluffs, holds a special place in the iconography of the United States. It is a geography and a terrain worth seeing, popular with tourists, beloved by citizens. It is the American frontier that disappeared. It has also been the site of massive engineering and scientific feats. The money spent studying the geology of Yucca Mountain alone—some estimates say $9 billion in geological research—suggests the technical complexity of this natural and political terrain. Why do we know what we know about the natural world? Why do we have this geological knowledge and not other knowledge? Defense priorities made it important to completely understand this small section of the earth’s crust. Tropical islands in the Pacific—places of romantic and touristic beauty—also became militarized outposts for nuclear testing and scientific field laboratories.21 In the winter of 1945–1946, US Naval planners decided they needed a deep-water site distant from the continental United States to test naval vulnerabilities. The Marshall Islands, which had been seized by the United States from Japanese control at the end of the Second World War, became the preferred site. Adm. William Henry Purnell “Spike” Blandy led the Task Force charged with planning the tests, and Blandy called them “a few miserable islands of insignificant value, but won with precious blood of America’s finest sons.” He suggested that no one cared about them. “All that can be raised on most of these islands is a few coconuts, a little taro, and a strong desire to be somewhere else.” These were ways of justifying the seizure and destruction of the homeland of Marshall Islanders, an indigenous group with remarkable open-ocean navigation skills, a complex social and political structure, and a love of their home islands.22
The legal status of the Marshall Islands was unresolved when testing began there. This did not seem to matter to the military planners. Crossroads, the first test conducted in the Marshall Islands in the summer of 1946, took place while US control of the Marshall Islands was still being negotiated. US scientists and political leaders were also still arguing about whether atomic energy would be under civilian or military control, and about whether and for how long the United States could sustain an atomic monopoly. The islands were seized as testing grounds in a context of uncertainty about the arms race, the future structure of the atomic program, and even their own legal status. US Pacific testing had its origins in the Navy’s fears about air power. Navy officials wanted to test the vulnerabilities of naval technologies to atomic weapons. The question was of some practical relevance. Some critics suggested that the Navy was irrelevant in the age of air power. Could naval forces actually be destroyed by atomic bombs? The question had ramifications for the US Army Air Corps, because atomic bombs might make the “thousand-bomber raids” of Curtis LeMay’s attacks on Tokyo obsolete. Delivery required only a few B-29s, and eventually not even that. Intercontinental ballistic missiles removed pilots entirely from the equation. Between 1946 and 1958, the United States detonated sixty-seven nuclear weapons at Bikini and Enewetak Atolls in the Marshall Islands, producing 80 percent of the total explosive energy ever released into the atmosphere by American nuclear tests. Although testing ended there in 1958 and atmospheric weapons testing in general ended by international agreement in 1963, contamination and radiation-related illness continue into the present. The Proving Grounds “established the relationship between America and the Marshall Islands as one of science by way of violence: Islands and islanders became sites at which violence and mass destruction could be offshored, routinized, and rationalized in order to be understood.”23 Throughout this testing a distinct set of international laws applied to the Marshall Islands. Islanders later sued the United States government for the damage to their homelands. As Mary X Mitchell shows, the testing program and the legal and political apparatus that supported it constituted a new form of US imperialism.24 Contemporary commentator and antinuclear activist Norman Cousins described the island testing as the “standardization of catastrophe.” Medical doctor David Bradley, who observed the nuclear testing at Bikini, saw it as
modeling other kinds of damage: “Bikini is not some faraway little atoll pinpointed on an out-of-the-way chart. Bikini is San Francisco Bay, Puget Sound, East River. It is the Thames, the Adriatic, Hellespont, and misty Baikal. It isn’t just King Juda [hereditary leader of the Bikinians] and his displaced native subjects about whom we have to think—or to forget.”25 Before the opening of the Nevada Test Site in 1951, the two atolls in the Marshall Islands were America’s only testing grounds. Other islands in French Polynesia in the Pacific were the sites of French bomb tests after 1962, and the UK conducted tests at Kiribati. Between and after the bomb tests, scientists studied the islands and their plant and animal life, including devastated coconut palms and burned fish. Bikini and Eniwetok (like Hanford and Hiroshima, Nagasaki and Chernobyl, and Fukushima today) became living field laboratories of bomb effects and traces. Scientists found direct effects on the islands’ corals, sponges, crabs, rats, coconuts, and local crops. Studies of the people exposed there revealed heightened cancer risks and other health effects. The beautiful atolls became places where local destruction could produce universal truth. Cold regions joined deserts as crucial military spaces. In a shrinking Cold War world, shaped by bomber ranges and intercontinental ballistic missiles, the Arctic played a special role. These northern regions were beneath the shortest air routes between the United States and the USSR. Greenland, Canada, and Alaska, the friendly northern territories, became home to Ladd Air Force Base, Goose Air Force Base, Labrador Base, Thule Base, and a network of Early Warning Radar Stations around the Arctic Circle. As Lt. Col. Elmer F. Clark put it in his 1965 report on Camp Century for the Cold Regions Research and Engineering Laboratory: “With the advent of such weapons as the atomic bomb, the supersonic long range bomber, and the intercontinental ballistic missile, it was inevitable that military attention should be drawn to the remote arctic regions which lie athwart the shortest air routes between the major land masses of the Northern Hemisphere.”26 Clark’s comments reflected ideas about nuclear exceptionalism—the idea that nuclear weapons constituted an entirely novel form of warfare. And the Arctic was a new front, Alaska a strategic base, and cold weather training a critical new skill for the Armed Forces. Once a place of no particular interest to military leadership, the far north was transformed. Alaska, not yet a state but a US territory, became a natural field laboratory for winter warfare exercises, a site for communications and surveillance
technologies, and a strategic location for bombers and missiles. Defense expenditures in Alaska totaled more than $1 billion between 1941 and 1945, and spending continued to escalate after the war. By 1952, the Department of Defense employed more than half the Alaska territory’s workforce, including many civilians. While the mining and fishing industries declined, military projects and facilities drove economic growth, with defense investments averaging about $250 million per year from 1949 to 1954. The militarization of Alaska also transformed the lives of indigenous workers, who had new opportunities and new challenges. When a forty-ninth star was added to the American flag in 1959, it represented the country’s first “defense state.”27 Greenland, with its massive ice sheets and intense isolation, also became a new focus of US interest during the Second World War and the Cold War. The United States wanted to buy Greenland just after 1945, but Denmark did not want to sell. Eventually, in 1951, Denmark signed a defense agreement with the United States. This gave the US the right to develop military installations on the ice. The collaboration ended after the 1968 crash of a nuclear-armed B-52 near Thule Air Base. But for sixteen years, United States interests in Greenland coalesced with Danish interest in security and sovereignty. Greenland became a space for the exploration of ideas about “ice worms” (missiles that could be hidden in glaciers). It was also a site for studies of life in space, because the cold at Camp Century seemed to mimic space. By 1966 the reactor at Camp Century was threatened by rapidly moving ice sheets. It had to be removed. The camp was abandoned.28 Camp Century thus lasted only six years, from 1960 to 1966 (Figure 18). It was promoted in the press with the “gee-whiz” tone often accorded to technological wonders. And to be fair, it was a visually exciting project. The earliest photos of the camp in official Army reports show massive ice walls, hallways, and under-ice living quarters that seem both familiar and bizarre. The camp was a city under the ice, with its own sources of water and even food (Figure 19). With room for 225 residents, it had a library, a movie theater, a chapel, and regular meals that seemed like 1950s versions of home cooking. It was also home to Project Ice Worm, an army plan to install 600 mobile Iceman missiles (these were like Minuteman missiles but intended for use under ice) across Greenland’s icecap. These would be constantly moving on rails through permanent tunnels in the ice, able to be launched from 2,100 different spots. The entire plan was shelved, for practical and strategic reasons, not the least being that permanent tunnels in a moving glacier were not possible.
FIGURE 18. Camp Century ice tunnel. Elmer F. Clark, Camp Century Evolution of Concept and History of Design Construction and Performance (Hanover, N.H.: US Army Materiel Command, Cold Regions Research and Engineering Laboratory, 1965), page 16, figure 15.
possible. The much-heralded nuclear reactor that fueled the whole enterprise had to be hauled out after less than three years. It was being squeezed and crushed by the moving ice, and it had become a nuclear hazard. Camp Century began collapsing less than four years after it was built. By the mid-1960s the problems that were supposed to be addressed by under-ice missiles were turned over to nuclear submarines, seen as the strategic equivalent.
In the process of this brief and unsuccessful experiment, however, Camp Century became the first place from which a deep ice core was extracted. The core drilled there in 1966 went all the way through to bedrock. It provided scientific insights going back more than 100,000 years. It is perhaps important to mention that US Army scientists were drilling through the ice cap and collecting ice cores in 1966 as a way of studying climatological history and assessing the risks of global warming. Polar warming was recognized by Army
FIGURE 19. Cutaway view of Camp Century. Elmer F. Clark, Camp Century Evolution of Concept and History of Design Construction and Performance (Hanover, N.H.: US Army Materiel Command, Cold Regions Research and Engineering Laboratory, 1965), page 48, figure 37.
leadership as a potential defense problem as early as 1947. Even slight warming would threaten military installations, runways, and roads. It could possibly open up new transportation routes and therefore new avenues of attack for the Soviet Union. Warming research therefore constituted a critical part of the Arctic program of the US Army. Greenland was a scientifically productive site of research in meteorology, geology, and oceanography. A new Arctic Institute of North America, created in 1957, sponsored research in Greenland with funding from US corporations that had interests of various kinds in the far north. The US Army Corps of Engineers also had a significant postwar research program oriented around glaciology.
In the Arctic, as elsewhere, scientific projects were often both secret and public at the same time. Those interested in promoting the remote, isolated places in the far north as sites of impressive technoscience could invite journalists and even Boy Scouts to observe their wonders. Some aspects of these projects would be concealed or misrepresented, but their existence was not a
secret. This quality, of intense publicity and intense secrecy, coexisting and simultaneous, took a particularly interesting form in the first satellite system, codenamed CORONA. As John Cloud’s fascinating studies have demonstrated, the satellite CORONA was secret in ways that atmospheric weapons testing, the Hanford plant, and even Camp Century could not be. CORONA (top secret code words were capitalized by convention, not as acronyms) was not itself publicized. But it played a key role in collecting dense mapping data that became public. It exemplifies the mixed military-civilian qualities of these earth-transforming projects, as Cloud’s account of its internal practices makes clear.29
The CORONA satellite reconnaissance program, which ran from 1958 to 1972, grew out of CIA “black projects” in the early 1950s. These were intended to collect information about Soviet installations and facilities. Sputnik heightened US concerns about surveillance from space and led to ramped-up efforts to construct a usable and reliable system of satellite surveillance. In 1962 CORONA became operational. It was the first US technological system for secret photography from space. It was a film return system: exposed film was wound into a reentry capsule, called the bucket, which was then returned to earth independently of the satellite itself. The satellite could stay up for about nineteen days, and it might send several buckets back to earth during that run; each returning bucket was protected by shields and parachutes that slowed its descent. The satellite had an elaborate security apparatus. It could keep its secrets, in theory. But it also depended on very practical earth sciences research that was public and open. Later satellite remote sensing systems grew directly out of the CORONA project, and global positioning systems depended on the masses of data it collected—data that were both top secret and readily available in plain sight.
CORONA involved contracts with private industry: Lockheed; Eastman Kodak, which produced the films; and Itek Corporation, which designed the cameras. Participants in the CORONA project were also drawn from many kinds of institutions, including university scientists from many disciplines. In addition, high-ranking CIA officials coalesced around CORONA, interested in its potential. And those who worked on the satellites became members of what was known as a code word community, a specialized group of experts with varied stakes in the project. As Cloud puts it, this code word
community was (and is) “the institutional embodiment of the convergence upon which CORONA was built.”30
Reconnaissance satellite systems like CORONA and its progeny have remained the main source for general US map revisions. Revised maps that drew on the CORONA data stated that they were “based on aerial photography and other source data.” That other source data, for all US maps, for the last third of the century, was CORONA and its successor reconnaissance satellite systems. This means that mapping professionals in the United States routinely had critical access to top-secret data, though the origins of that data had been disappeared.
As this brief description suggests, global geopositioning was crucial for collecting both the film and the satellite. When the project began, earth scientists could not fully characterize what is known as “the figure of the earth.” This is the precise shape of the earth, or the earth’s degree of oblateness, its exact curve. Problems with existing models were recognized during the war. The German rocket scientist Wernher von Braun, who was recruited to US rocket projects after the war as a part of the secret Project Paperclip (which recruited scientists who had worked for the Nazi state), told US officials that aiming V-2 rockets at London revealed data errors hundreds of meters out of alignment, even across such a small distance as the English Channel. Pacific islands later used in bomb tests were also found to be mapped to the wrong places on the earth, sometimes by dozens of miles. With the imagined accuracy of new ICBMs, even small errors could be important. Also desperately needed were better images of lands inaccessible in any other way: those controlled by the Soviet Union and its allies. But positional accuracy depended on an accurate map of the earth itself and a more perfect understanding of its shape. This kind of information was not needed in earlier wars. The range, scale, and scope of missiles and bombs produced a “shrinking” of the world that generated new fundamental scientific questions. It was one of many ways that the planet became a single entity, bound more closely together.
More than any other projects, the satellite projects incorporated every aspect of the planet in a militarized vision of risk. CORONA linked top-secret government initiatives to federal and university laboratories but also to map production factories, professional and trade journals, and scientists and engineers in many institutions and disciplines. The data from CORONA had so many potential uses. It was saturated with value, useful for bombing maps
and hiking maps. It was a way of seeing the world, invented for military purposes and then leveraged in many other ways.31 The satellite projects begun with CORONA evolved quickly from something expected to be interim and tentative to a major complex of remote sensing. They mobilized geographic and geodetic scientists, and expanded cartography in every imaginable way. In the process they collected intelligence that shaped debates about nuclear capabilities and state options. Close observation of many kinds of activities around the world in almost real time became possible. One result was a sociotechnical system that made the earth knowable, strategic, and also secret in new ways.32 Eventually this system also made surveillance from space part of the human condition: Google Earth has images of my new-ish car parked in front of my house in Philadelphia, an example of the compression, from global to local, political to personal, and military to civilian, that space surveillance has produced.
There was one other unfamiliar and alien domain that attracted new scientific attention in the heart of the Cold War. For the things and the people that had to survive an expected World War III, new kinds of geographical spaces came to be built underground, carved into mountains, and protected by stone. These bunkers created an artificial world of bleak, long-term survival. Like the desert, the sky, the ocean, and the ice, excavated stone was newly annexed into military planning.
One of the smallest bunkers still in use today is the Presidential Emergency Operations Center, on the White House grounds, built during the presidency of Franklin Roosevelt and not intended as a long-term nuclear survival site. Other bunkers built in the Cold War include Mount Weather in Virginia, for the President and the Supreme Court; Raven Rock Mountain Complex in Blue Ridge Summit, Pennsylvania, a “shadow Pentagon” for directing war; Peters Mountain near Charlottesville, Virginia, for the intelligence community; and the Greenbrier Resort in West Virginia, for members of Congress. Greenbrier was closed in 1995 as a nuclear bunker but reopened as a tourist site. It is now considered “the Graceland of Cold War Tourism.”33
Built underneath the legendary Greenbrier Resort in White Sulphur Springs, West Virginia, from 1957 to 1962, the bunker was intended to provide sixty days of food, water, and protection to one hundred senators and 435 representatives. Here they would continue to govern a country engaging in a full-scale nuclear war. Greenbrier had a twenty-five-ton blast door, a 112,544-square-foot bunker, twelve beds for decontamination, eighteen
dormitories, a cafeteria, a power plant, and meeting rooms for the House and the Senate. Excavated 720 feet into the hillside under the resort, it also had a large waste incinerator for the bodies of those who died on their way out of Washington—a five-hour drive away. Every bed had a constantly updated nameplate as members of Congress changed. There was a small hospital, a large supply of antidepressants, and two straitjackets (just in case).34 There were practical issues with this bunker, including distance from Washington, and in 1995 it ceased to be the designated emergency shelter for Congress. In 2006, it was opened for public tours.
Underground facilities also came to hold documents rather than people. Library and records management, as Spencer has suggested, was critical to “US victory in a Third World War.”35 Indeed, so-called shadow libraries like the Iron Mountain Records Center expressed how nuclear planners saw knowledge as part of survival. Protecting the country’s information, a record of its scientific data, and its cultural “heritage” was a way of imagining a future shadow United States. It seemed that the nation could be reconstructed with the help of library filing instruction manuals, technical data, and revered historical documents such as the US Constitution. In the early 1950s, the Committee for the Protection of Cultural and Scientific Resources was created to plan strategies for storage and protection. Librarian Scott Adams became the leading proponent of remote storage for atomic protection. He coined the term “shadow libraries” in a report to the Association of Research Libraries (ARL) in 1954. He also called on scientific societies, industry, and government to identify what needed to be saved.36
While many everyday libraries did not see their holdings as rare or precious, because they were available in hundreds of other libraries, government agencies, corporations, and other groups recognized the problem. They began developing strategies to protect their materials, drawing on data produced by a series of extraordinary bomb tests. Just as live animals, naval vessels, and Japanese test houses were placed in harm’s way in atomic tests in the Cold War, so too were texts and film records. In the Nevada desert in 1956, during Operation Teapot, AEC staff gathered bookshelves, vaults, microfilm cabinets, and filing cabinets, and filled them with books, paper records, microfilm, and movie reels. These were positioned at various distances from ground zero, with some units left in the open air and others placed inside buildings. Meters with each cache measured thermal, blast, and radiation effects.37
When a thirty-kiloton A-bomb was detonated, much of this library material was destroyed and burned. But storage units and vaults inside buildings, especially basements, and those located at least 4,700 feet from the explosion fared better. Data from the Teapot records test helped the US National Archives evaluate strategies for preserving the government records it held. A year later a similar test for Operation Plumbbob used a new Mosler Safe Company “supervault” that proved to be atomic bomb–proof. Other tests focused on microfilm and found that it could usually still be read, even if it became fogged from radiation. For their mass duplication programs, special librarians could thus rely on microfilm. Microfilm use exploded as a result of the new standards for atomic protection, and US spending on microfilm doubled in 1952, the year when many institutions began their duplication programs.38
For some institutions, a daily exercise in protection became important. The National Archives and Records Administration (NARA) chose a fifty-five-ton Mosler supervault to protect the Charters of Freedom (the Declaration of Independence, the Constitution, and the Bill of Rights). For much of the Cold War, security staff used robotic arms to lower the documents into the vault every night. Many banks and financial institutions turned to new cooperative libraries, which would hold their daily business records in locations distant from likely attack. In terms of their numbers, size, and structural strength, as well as the advanced level of their information management systems, these corporate libraries provide a striking demonstration of the importance of information defense to America’s imagined future.
One of the most famous of these libraries was inspired by the stories of European immigrants who had experienced the chaos unleashed when their records were destroyed in World War II. Herman Knaust converted the empty shafts of iron mines and mushroom farms, in an area about one hundred miles from New York City, into a giant vault, proposing that the magnetite iron ore walls made the new “Iron Mountain” facility “about the safest spot in America.” By 1961 Iron Mountain held 200,000 microfilm reels, about $100 million in precious paintings, many government records, and daily records of the banking and financial industries. It became a general emergency storage center. Meanwhile, other such libraries were created across the United States in abandoned mines, quarries, and other locations.39
These Cold War facilities now provide protection for documents and records against natural disasters, deterioration, theft, vandalism, terrorism, and war. And in the internet age, underground archives in the United States have
grown larger than ever. The Cold War is long over, but long-term records preservation is still a high priority. Indeed, the entire enterprise highlights the critical value of information in modern society. It is even possible, Spencer suggests, that the cooperative shadow libraries kindled the programs that eventually were intended to protect people. One manager from one of the underground installations proposed that “a corporation starts with a records protection program, somebody brings in a few cots and some food, and the next thing you know, they want a full-scale operating office.”40
Greenbrier today is a tourist attraction. So, too, are the B Reactor at Hanford; the Minuteman Missile National Historic Site in South Dakota; Warren Air Force Base in Cheyenne, Wyoming; and Missile Site Park in Greeley, Colorado. They are monuments to a transformative moment in US history, when a network of installations was created to launch a full-scale nuclear war, and when the entire planet became a battlefield. These contaminated places have also become wildlife preserves, with set-aside land around Los Alamos; at the Savannah River plant, which produced plutonium and tritium; at the Idaho National Engineering and Environmental Laboratory, which operates fifty-two nuclear reactors; and even at the Hanford Nuclear Superfund site. The DOE announced in 1999 that 89,000 acres were being set aside at Hanford for the benefit of a bird, the long-billed curlew, and two plants, a desert parsley and Columbia yellow cress. Places where people cannot go, because they are too dangerous, become refuges for other living things. The radiation is of course not good for the wildlife, but it is better, apparently, than human beings.
The photograph taken by Apollo 17 astronauts and released by NASA to the public on Christmas Eve, 1972, is commonly called The Blue Marble (Figure 20). It is the most widely reproduced image in history. It was not the first image of Earth from space, or the first high-altitude image. It was not even the first image of the entire Earth fully lit by the light of the sun. But it was a photo of breathtaking beauty. It animated a public discussion of “Spaceship Earth.” Many commentators have linked this single image to the general rise of public sympathy for modern environmentalism. It is seen as shaping alternative religious movements and supporting theories of Gaia, because it shows the whole Earth as almost a single living organism. It is both a scientific and a philosophical image with a strong emotional charge.
It was released exactly four years after a similarly coded image. On Christmas Eve in 1968, the crew of Apollo 8 broadcast to the world an image
FIGURE 20. The Blue Marble (AS17-148-22727), 1972. NASA.
of the earth rising over the horizon of the moon. That photo was called Earthrise, and it showed the earth half lit with a desolate lunar landscape in the foreground (Figure 21). Everything in space was black, gray, or tan—but earth was brilliant white and blue.
Earthrise and The Blue Marble were scientific images that performed the Cold War message of peaceful militarism. Space program leadership in both the US and the USSR recognized space as strategic. The chess game was always in progress, and it always involved emotions, feelings, and loyalties, as well as missiles and bombs. Space was a part of the struggle between the two superpowers. But the beautiful images of a single gleaming world vaguely promised a hopeful future. The very technology intended to create military
FIGURE 21. Christmas Eve 1968, Earthrise. NASA.
advantage had produced images that could be strategically used to suggest unity, a shared purpose, peace. If radioactive fallout and nuclear risk bound the world together around fear, The Blue Marble and Earthrise visually captured the potential of hope.41
The geographical reconfiguration of the planet that unfolded in the Cold War was scientific to its core. The Blue Marble and Earthrise were scientific images. The bombs, bomb shelters, and satellites were technical. The ice worms in Greenland absorbed the labor of engineers and physicists and required considerable skill to plan and build. CORONA brilliantly validated the potential of information processing techniques. Human labor oriented around science and technology remade the entire planet in only forty or fifty years. It was a massive and devastating system of knowledge, with legacies that are functionally endless. The nuclear exclusion zones established all over the world are monuments to brilliant and intensive human labor that could have been devoted to something else.
In Benjamin Lazier’s thoughtful reflection on philosophical responses to the images of earth taken from space, he references the philosopher Hannah Arendt’s distress about the Soviet satellite Sputnik in 1957. Arendt saw Sputnik as one more troubling manifestation of the growing replacement of the living organism, or the living earth, with technical artifacts. Sputnik was a machine in the sky, and it was, she thought, the first ideological or technological step toward leaving the planet behind. It was almost, for her, a machine that claimed that the planet did not matter, for Sputnik seemed to imagine a future new planet, with this old, tattered planet to be left behind. It promised heedlessness, indifference to the present and to the limitations of lives on earth. Arendt saw in this imagination the echoes of totalitarianism, which always requires new lies and new worlds.42
It may have been less clear to her at the time that Sputnik was only a small, visible, even trivial sign of the brutal scientific and technological transformations that had been unfolding by 1959 for more than a decade. The general emergence of “global” awareness came out of the half-light, the shadow world, of bunkers and ice worms. The earth from space, however beautiful it appeared, was a militarized earth, bristling with weapons that held it like a vise.
9 Hidden Curriculum
EXPERTS IN THE UNITED STATES IN THE COLD WAR FACED A CHALLENGING PROFESSIONAL WORLD. THEIR
training had prepared them to see science, medicine, and engineering as benevolent enterprises oriented around “the welfare of mankind.” In practice, however, there was no obvious way to avoid contributing to militarized knowledge. Even if a scientist did not do defense work, they were training people who would or could. Even if a discovery was entirely civilian in intent, it could be mobilized and militarized, years or even decades later. Some scientists found themselves unwilling intellectual contributors to defense initiatives they deplored. They were also newly vulnerable to possible prosecution or fines, even deportation, because their expertise itself situated them as security risks, the bearers of secrets that could bring down states. McCarthyism cost many scientists their jobs and their careers: more than half of those investigated by the federal government between 1947 and 1954 were scientists.
Some found the tension so overwhelming that they dropped out. Others became historians, sociologists, activists, or critics of science itself, or of other scientists. Or they defensively oriented their research entirely around philosophy or high theory—around anything that could seem safely remote from
practical military application. Others oscillated comfortably throughout their careers from civilian to military research projects, apparently with a sense that this was normal science in twentieth-century America. They sometimes needed security clearances and defense funding and sometimes did not; they did both theoretical work and applied work, in national labs, the defense industry, and academe. And some embraced their roles as political and military players with enthusiasm and enjoyed the access to funding, influence, and power.
In autobiographical accounts and archival records of the early Cold War years, one can trace the imperfectly managed tensions, the back and forth. For many experts just after the end of the Second World War in 1945, there was a withdrawal, a vow never to do military research again. Some constructed a “line in the sand” that allocated a certain limited percentage of time (20 or 50 percent) to military projects or a certain number of years of service. This could be followed by a perhaps uncomfortable surge of patriotism (the Korean War seemed to animate many, but so did Sputnik and Vietnam) that pulled them back into a system of knowledge production that violated their own ideas about the core values of pure science. Many more, however, were passively enrolled, resolutely understanding themselves as “apolitical” even as they supported programs to make biological weapons or build atomic bombs.
As David Van Keuren observed in his study of “Science Black and White,” the ties between “pure” science and the practical needs of national security operated across institutions, in academe, in private industry, and even within the military’s own research organizations. The culture of “basic, unclassified science and the world of classified research related to national security sometimes co-existed within the same laboratory quadrangles. They led separate intellectual lives, but because of shared institutional affiliations, they sometimes existed in symbiotic relationship to one another. Indeed, the parallel pursuit of basic research and of national security possibly reached its fullest development within these [military] laboratories.”1
Such spaces provided an education in new forms of professional life. In a July 9, 1954, letter, the Yale University biophysicist Ernest Pollard described to a powerful official at the Atomic Energy Commission how he learned the craft skills involved in keeping secrets. “Many of us scientists learned the meaning of secrecy and the discretion that goes with it during the war,” Pollard said. “We had very little instruction from outside.” When
the war was over, he said, he made a conscious decision to avoid secret research. He “thought carefully through the problems of secrecy and security, and made the decision to handle only material that was entirely open. I returned one or two documents I received concerning the formation of the Brookhaven Laboratory, in which I played a small part, without opening them.” But the outbreak of the Korean War, and his own concerns about the Soviet Union, led to a change of heart. He came to feel that “I as a scientist should pay a tax of twenty percent of my time to do work that would definitely aid the military strength of the United States.” In the process, as he engaged in secret research during the Cold War, he learned a form of extreme social discipline:
I have learned to guard myself at all times, at home, among my family, with the fellows of my college when they spend convivial evenings, with students after class asking me questions about newspaper articles, on railroad trains and even in church. It has been a major effort on my part, unrelenting, continually with me, to guard the secrets that I may carry.2
Pollard’s practices of secrecy constituted a form of Hochschild’s emotional labor.3 It was bound up in his identity as a human being, and it unfolded at church, in the classroom, even among his family. His commitments to the state were threaded through every aspect of his life—and he knew it. He was aware of the struggle. Others were perhaps less so.
In his 1962 interview with the psychologist Anne Roe, who was then working on her second edition of The Making of a Scientist, the paleobotanist Ralph Works Chaney introduced an answer to a question she did not ask. The answer was about his work as Assistant Director of the Radiation Laboratory at Berkeley during World War II. “At that time I learned to lie,” he told her. “I regularly told people and advised our staff to tell people that they were doing things other than what they were doing. I, for example, was concerned with radar, which anyone in the world would have known was a lie because that was left to MIT, but even so most people didn’t know and I was talking just the night before last to a man who had a high position in the laboratory, he told me that he didn’t know what we were doing.”
After this admission, the tactful Roe did not directly ask him to explain what he had in fact been doing at the radiation lab. Instead, she suggested
that it was unclear what roles a paleobotanist would play in a radiation laboratory. He changed the subject. A few pages later in the typescript of the interview, Chaney brought it up again: “I told you a little while ago I learned to lie. What I did in the Radiation Laboratory is nobody’s business, particularly around here. And I have regularly lied, and people don’t really know. And I’m not even going to tell you, though . . . this is no occasion to, you don’t care.”4 Almost twenty years after the war, Chaney both wanted to talk about it and did not. He had learned to lie and the lesson still troubled him.
Chaney learned to lie, and Pollard learned to keep secrets. Their experiences shed some light on the struggles and strategies of the rank-and-file experts who fueled economic growth and facilitated national defense in the heart of the Cold War in the United States. They learned to keep secrets, lie, and pass polygraphs. They shared tips about what to say in security clearance hearings, how to burn trash, manage selective service requirements, conceal or exaggerate the military relevance of a project, and manage the anger and frustration of their peers. They became vulnerable to science swerved by defense interests that could hijack their own “pure science” goals. They also became vulnerable to formal investigations into their loyalties. In their professional worlds, then, there was a new hidden curriculum, a set of skills learned as a necessary part of their professional training—to lie, to keep secrets. This hidden curriculum provided scientists with some of the guidance they needed to navigate the demands that militarization produced in the Cold War.
The term hidden curriculum is drawn from educational theory. It reflects the idea that primary and secondary schools convey an overt curriculum of information and knowledge—the actual empirical content of what must be learned for tests and assignments. But schools also convey a hidden curriculum, one of social compliance, submission to authority, disciplined time management, and rule-following. The modern educational system, this idea suggests, teaches algebra and the three branches of government, of course, but also prepares students for conformity, obedience, and productive participation in an industrialized society. The hidden curriculum often has a moral dimension. It conveys to students how to be both good citizens and good workers.5
The generation I focus on here had learned in the course of their formal education after 1900 that science was open, universalistic, and internationalistic—an endeavor focused on the “welfare of mankind.” But in the heart of
the Cold War, for many scientists, their research was not open but secret, not internationalistic but nationalistic, and not conducive to welfare but engaged with the sophisticated technical production of injury to human beings—new weapons, new surveillance methods, new information systems, and even new ways to interrogate prisoners, bring down economies, and start epidemics. Rank-and-file experts in fields from physics to sociology found their research calibrated to empower the state, and scientists trained to see themselves as creating knowledge as a social good found themselves engaged in something that felt very different to them. Professional societies from the American Association for the Advancement of Science, to the American Society for Microbiology, to the American Chemical Society created committees on “social issues” and produced statements on science and the “welfare of mankind” through the 1950s, 1960s, and 1970s. Meanwhile, their members made weapons and worked in the defense industry.
For some, the situation became “disastrous.” The mathematician Serge Lang said he found it “unbearable to live under conditions of political repression, so I have to reconcile two contradictory desires. One is to do my beautiful mathematics, and the other is to preserve working conditions, living conditions which are acceptable philosophically, intellectually, humanly.”6 In response, some scientists became participants in organizations that addressed the militarization of science, like the nuclear arms control group Pugwash, the liberal Federation of American Scientists, and the more radical (and still thriving) activist group Science for the People. None of the scientists I consider had the option of opting out completely, at least not if they wished to continue practicing science and doing research. Both what they understood about their circumstances, and what they did not see or recognize, can illuminate the strategies and struggles of individuals as they navigate totalizing systems of political and economic power.
In his 1956 book, The Torment of Secrecy, the social scientist Edward Shils proposed that no other professionals had been asked to “sacrifice so much of their own tradition as the scientists.”7 What then were the outlines of these sacrifices? What strategies did the security state engender? What did scientists in midcentury America sacrifice? What did they need to know? What was the “hidden curriculum” of militarization?
Certainly it all involved keeping secrets.
Handling classified information involved more than silence. It required archival and library skills (what to keep, what to destroy); psychological and social skills (how to mislead people, including family and friends); and a knowledge of legal vulnerabilities and liabilities. Experts needed to know how to dispose of documents that had to be burned “by a custodian in the presence of a witness” with a certificate signed by both. Burning trash was a formal problem: Fred Rogers, an astrophysicist at the Naval Laboratory in Indianapolis, wrote a three-page memo in 1943 about how to burn trash from the lab.8
They needed to know how to survive a security clearance hearing. Scientists told each other to use certain key words (“intermittent,” “professional,” “now ended”) and to say “Of course I never gave them any secret information.” Relationships with suspect individuals were “not current,” “not close,” and “not continuing” in the legalistic phrasing in which they were coached, and these relationships had “no likelihood of renewal,” renewal being one of the security board’s criteria for meaningful relationships. Physicist E. U. Condon, in his hearing before the Eastern Industrial Personnel Security Board in April 1954, asserted that “in connection with various defense projects on which I worked early in the war, I was trained, as we all were, to handle classified information. That it was to be communicated only to persons who had the required clearance, and then only if there was need for them to have it as part of their duties. The lessons from that training I have always since observed.”9 Condon spent the next fifteen years fighting accusations of disloyalty, and eventually gave up defense work.10
They needed to know the legal worlds in which their technical expertise made them vulnerable. Grounds for having security clearances revoked included supporting Henry Wallace in his 1948 Progressive Party campaign for president, supporting labor unions, criticizing the Korean War, supporting socialized medicine, and even maintaining membership in the Federation of American Scientists and the American Association for the Advancement of Science. The bar was low for rejection, and a scientist who did not understand the rules could lose a lot. A Department of Defense pamphlet produced for researchers, dated July 1964 and filed in the papers of the mathematician Barkley Rosser, laid out the criminal penalties for various offenses. Conspiring to damage materials or facilities crucial to national defense carried a fine of $10,000 and a prison sentence of ten years. Making defective tools or machines carried the same penalty.11
Scientists needed to keep good records of their personal lives. The mathematician J. Barkley Rosser’s own security investigation file captures this quotidian, commonplace experience: Rosser was, like many midcentury experts, subject to state surveillance relating to a wide range of issues. His file included information about his childhood, education, marriage, and every address he had ever occupied, in California, New York, Washington, D.C., and New Jersey. His employment record was outlined, but so too was a sightseeing trip to Quebec and an overnight visit to Nassau. A change in his expected hotel stay in Karachi, Pakistan, in March 1968 was explained (“the plane was very late”). His memberships in the Math Club, Sigma Kappa Phi Fraternity, Duval Country Club, and Princeton University Band were listed, as were his close friends. Rosser’s mathematical work, then, involved potential criminal penalties and high, even intimate, surveillance of his life. Being an expert was dangerous.
Rosser, presumably like many other experts, had trouble reconstructing all of his associations and social club memberships. He apologized at one point in the file, saying that he was sure that he had belonged to some others, but he could not remember them, and he closed by noting “I have held and do hold numerous clearances with DOD and its branches but do not have the details. I hold a current clearance with the Institute for Defense Analyses, Princeton, NJ; a clearance in connection with my position as Director of the Mathematics Research Center, US Army, University of Wisconsin, and a clearance through the White House in connection with the Office of Science and Technology. Also, on April 15, 1965, I was cleared for Top Secret by the Defense Industrial Security Clearance Office, Defense Supply Agency . . . and in 1966 my clearance with the National Security Agency was updated.” In other words, Rosser was cleared by many different groups, in many different ways.12
Experts needed to avoid speaking to the wrong people, and apparently even to avoid sitting in the same room with the wrong people. In December 1956, the American statistician John W. Tukey was observed by surveillance agents having “contacts” with Professor George Barnard, a prominent British statistician who had been a member of the Communist Party in the 1930s. These “contacts” consisted of attending a scholarly panel at a meeting at the Imperial College in London, during which Tukey sat in the same room, without speaking to Barnard, for a little more than an hour. The resulting security report on this incident is almost comical:
Doctors C. A. Bennett and John W. Tukey engaged in normal scientific activity on the afternoon of 5 December 1956 attending a statistical discussion at the Imperial College of Science and Technology and a gathering afterward for tea . . . The discussion was opened at 2:30 . . . Dr. Bennett arrived about 2:35, Dr. Tukey about 3:10 and Dr. Barnard about 4:00. Doctors Bennett and Tukey left at 5:15 without holding any private conversation with Dr. Barnard.
Tukey, a Princeton mathematician who had the highest possible Atomic Energy Commission security clearance in the United States (Q), was later required to sign this narrative, including a final statement saying that he had not in fact spoken to Barnard. He and other experts had to tolerate intense surveillance and formal monitoring systems that tracked their every move.13
At times, they also needed to know how to downplay or conceal the military relevance of a project. This was a form of “lying,” sometimes for the benefit of their more innocent colleagues, who could be expected to be more sympathetic to a proposed project if they could be hoodwinked into thinking that the research they were being invited to undertake had nothing to do with military interests. “In the scientific part (or appendices) originally intended combat applications had best be played down—adumbrated rather than emphasized . . . described as imaginative investigations of new techniques (e.g. millimicro chemistry, ‘intelligent’ computers, etc.) or novel combinations of known techniques (power for unattended sound sources, very high speed underwater bodies, etc.) of inevitable importance to recognized types of military problems.”14 This was in a discussion among the Jasons—a semisecret group of elite scientists who advised the government in the 1950s and later. It referred to a plan for the creation of a possible new National Institute for Science, where classified research could take place in a freewheeling, intellectual, university-like atmosphere. One plan was to have a classified scientific journal, so that scientists could “publish” their work.15
Experts needed to understand how defense funding worked. Would defense-oriented federal funders still be interested if the scientific results turned out wrong? When the geneticist Ernst Caspari, in his third year of AEC support for his mutation research with the moth Ephestia, discovered that one of his research assumptions was incorrect, he worried that AEC support was no longer appropriate:
There is another point which bothers me and which I would like to raise, with your permission. Contrary to my expectations, my results show that the effects of 5-bromodeoxyuridine are much less radiomimetic than is usually assumed in the literature. I am wondering whether these results would make the project an inappropriate one for support by the Atomic Energy Commission.16
At this point Caspari had been supported by the AEC for work on Research Contract No. AT(30–1)2902, “Somatic Mutations in the Moth Ephestia,” for more than a year. Edington responded on December 27, saying “Your second question concerning 5-bromodeoxyuridine confuses me somewhat. We are interested in basic research and the fact that 5-BUDR is less radiomimetic than usually assumed does not detract from our interest in your research program. We are interested in your approach to the study of somatic cell genetics and development in Ephestia.”17 For Caspari even to ask the question was to imply that the AEC only wanted certain kinds of findings—that its support was contingent on a particular, specific, technical result. Did Caspari really believe this? Did other scientists?
Scientists also generally needed to agree with US military interventions. Mathematician Stephen Smale, in 1966, almost lost his NSF grant after he publicly criticized US policy in Vietnam. A few years later, two statistics professors at Berkeley found their research grants (from the Office of Naval Research and the US Army) withdrawn after they publicly criticized the war. The mathematician Serge Lang said in 1970 that the mathematics community in the United States had become “intimidated” and afraid of FBI lists, harassment, job loss, and funding loss. Lang resigned from Columbia in 1971 to protest the university’s treatment of antiwar protestors.18
Some kinds of people were interpreted as inappropriate for scientific positions that required security clearance by virtue of their sexual orientation. Like women, who often were seen as more vulnerable to seduction or blackmail by Soviet agents, homosexuals were virtually locked out of the career paths that many scientists and engineers traveled in the Cold War.19 Richard Gayer, now a San Francisco lawyer whose practice focuses on appealing denials of clearances, had an earlier career as an engineer who did analog circuit design. His clearance was suspended in 1970 when he listed his membership in a gay organization. Before 1975, virtually no one who was known or believed to be gay was granted security clearance, at any level.20
In 1983, Gayer successfully represented High Tech Gays, an organization of 700 workers in the high-technology center south of San Francisco. The Department of Defense policy that the group contested called for an expanded investigation of workers who were known to have had homosexual activity within fifteen years of their application. As the policy stated, all homosexual activity was subject to investigation: “Sexual conduct can be a relevant consideration in circumstances in which deviant conduct indicates a personality disorder or could result in exposing the individual to direct or indirect blackmail or coercion.” The manual stated that “such behavior includes homosexuality.”21 Security clearances for known homosexuals were often delayed for years (though not necessarily denied), affecting their career options. A judge ruled in 1984 that there was no evidence of blackmail of homosexuals and no reason to subject homosexuals to more lengthy security reviews.22 At the height of the Cold War, being an expert involved compulsory heterosexuality.
Harassment could also come from other scientists eager to either reject or embrace the military goals of the United States. The University of Pennsylvania anthropologist Ward Goodenough had no trouble with security clearances or accusations of disloyalty. But his anthropological peers viewed his work on the Coordinated Investigation of Micronesian Anthropology (CIMA) as illegitimate, by virtue of its relevance to naval interests in the Pacific and its funding by the Office of Naval Research. CIMA was the “largest research project in the history of American Anthropology,” engaging about 10 percent of all the anthropologists in the United States in a massive fieldwork program in the late 1950s and early 1960s. Goodenough was a graduate student in anthropology at Yale, and, as a part of his dissertation, did research on Truk (now called Chuuk) under CIMA. Later, he said, in the 1970s he was pilloried by his fellow anthropologists as a collaborator: “I heard that people said my work (on language and social systems on Truk) was not credible, because it was funded by ONR.”23
Perhaps the anthropologists were particularly prone to such responses, even if the politics were flipped around the other way. Goodenough was criticized for taking ONR money, the implication being that his work was contaminated by defense funding. But when the prominent linguist Morris Swadesh was identified by the FBI in 1949 as a security risk, someone not suitable for defense funding, he became unemployable, unsupported by his anthropological colleagues in the American Anthropological Association and the
American Ethnological Society.24 Security issues stirred up strong feelings in every direction, and experts faced potentially angry responses from other experts over a wide range of choices.
Scientists were also harassed by the 1950s equivalent of Twitter trolls. Henry deWolf Smyth, who voted to permit the Princeton physicist J. Robert Oppenheimer to keep his security clearance (seen by some as an unpatriotic act), received a threatening letter from an “Angry American Family” that promised “some day we Americans will catch up to all of you traitors.”25 The geneticist Arthur Steinberg was also subject to shocking public attacks, losing a deal on a house and several jobs because of inaccurate reports that he was a Communist. Steinberg gave only thirty-five documents to the historical archives at the American Philosophical Society (APS) in Philadelphia. All of them chronicled the cruelty with which he was treated after being accused. A January 1954 letter from his attorney to a housing development where Steinberg and his wife had attempted to purchase a home described “the anonymous phone calls which my client received from some neighbors threatening dire consequences if they lived in the house.” A 1948 letter from a colleague openly stated that Steinberg had been removed from the list of viable candidates for a job because departmental faculty had heard about “the Communist charges.”26 In selecting what he chose to donate to the APS archives, Steinberg clearly intended that his painful experiences not be forgotten.
Meanwhile, sociologists documented the unhappiness of scientists working in industrial and military labs. Charles D. Orth in 1965 proposed that perhaps
This sociological scholarship does not look like “sociology of science” as it would be defined today. Rather, these scholars studied a community of experts in terms of organizational dynamics, workplace behavior, and business administration. Their goal was to help managers resolve problems with
scientists, engineers, and technicians in the workplace. They proposed that such problems began after the Second World War. Thus Orth, Bailey, and Wolek, in their 1964 study of “the behavior of scientists and engineers in organizations,” said “these new people bring with them new and deeply different needs” and “we are fumbling largely in the dark as we try to insure their motivation.”28 The transformation they tracked, they proposed, was novel:
These problems are relatively new. Before 1920 they were virtually unknown. At that time scientists off campus were nearly unheard of, while scientists on campus fell into small clusters of four or five specialists with virtually nothing to manage beyond a small amount of inexpensive equipment and their own free time. Those days of now-nostalgic myth abruptly vanished forever with the advent of World War II.29
Sociologist George A. Miller, in a 1967 paper in the American Sociological Review, said that industrial work conditions conflicted with the professional ideals of scientists and engineers.30 Other observers proposed that those working in such conditions were generally mediocre scientists: at a 1961 Office of Naval Research Manpower Meeting, participants questioned the quality of experts employed at the Johns Hopkins Applied Physics Laboratory, the Naval Research Lab, the Naval Ordnance Laboratory, Aberdeen Proving Ground, Taylor Model Basin, the Naval Electronics Lab, and Wright Field.31 In the eyes of some experts, the new professional experiences of the Cold War permitted mediocre scientists to find employment and led to the alienation of individual experts.32
Perhaps this general alienation explained why some scientists dropped out. In July 1971, Larry Wendt (his signature) sent a letter to the newsletter of Science for the People (a Boston-based organization of activist scientists including the physicist Charles Schwartz) addressed to “Dear Brothers and Sisters.” Wendt said that three years earlier he had discontinued his education in science, three courses short of a bachelor’s degree in chemistry, when he had “come to the sudden realization that my education was totally inane, dehumanizing and worthless. A degree in chemistry was only good for making a buck. From then on, I reoriented my life away from science and toward humanity.” Now, he said, he had made the right choice because “instead of doing something like inventing new nerve gases . . . or making lots of money, I am
awaiting sentencing for refusing induction.” Before he heard about Science for the People, he said, “I thought that most scientists were politically, socially and emotionally retarded.” Now he felt that he could not ignore the scientists. “Technology belongs to the people, not to an elite. The people who rule America use technology to turn the whole world into a servo-mechanism to feed their sick desires. That same technology can be used to feed, provide good health, and house everybody on this planet.”33 Thus this refugee from science summarized his objections: science was engaged only with money and violence, when it should be engaged with social justice. In his disgust for science, of course, Wendt was expressing a powerful professional ethos that science should be pure and unengaged with commerce and power, and focused on meeting human needs and on human justice. The almost-chemist had absorbed just enough of his introductory science training—his indoctrination into a specific historical construction of what science should be—to be disturbed by what he saw.34
The University of Washington sociologist Jeffrey Schevitz interviewed a group of scientists who worked in defense in the 1970s and asked them explicitly about their roles in weapons production during the Vietnam War, a war with which many of them disagreed. Schevitz’s published account of these interviews emphasizes the ways that scientists and engineers drew boundaries around their responsibility for what they were involved in making. Some emphasized that their own contributions were relatively distant from “napalm or anything positive.” They were only involved, for example, in making the “traveling wave tubes” that helped to confuse enemy radar and make it less likely that American planes would be shot down. Others portrayed themselves as helpless and unimportant: their own positions were too weak to have much influence either way (“My particular protest isn’t going to make too much difference”). Some said they worked on weapons but protested the war in other ways, for example by putting peace bumper stickers on their cars or by bringing up the antiwar movement at lunch with other scientists. One electrical engineer who helped to design “elegant” proximity fuses told Schevitz that he and his engineering team recognized that “the purpose of all proximity fuses is to increase the kill-ratio, which is a very ugly thing for all of my engineers to be thinking about, and yet, they’re thinking about it. It bothers us all, really, that such a clever device and technologically interesting phenomenon” has as its only customer the US Air Force. But, he noted, “we have our competitors. If we don’t make this device our competitors will.”35
Schevitz also interviewed “drop-outs,” scientists and engineers who left the defense industry. One engineer reported that while working on a new assignment that was “very enjoyable for me analytically,” he “began to be more and more troubled about the nature of my work,” which was to figure out optimal routes for aircraft to bomb cities without getting shot down. Another drop-out said, “It had never occurred to me that earth sciences was, uh, militarily oriented.” Schevitz interviewed a senior PhD chemist he called “Mary” who had made it into top management at Stanford Research Institute and who had finally, after many years of personal reservations about her work, submitted a letter of resignation. In her long career, which began during World War II, she had worked on atomic bombs, combustion, ballistic missiles, and explosives, all, as she put it, “work that can be used to kill people.” Now she wanted to get out and “do some good.” For this scientist, timing was the issue. She had given a certain period of her professional life to producing injury. Now, she told Schevitz, that time was over.36
Calculating the proper proportions—the balance between violence and the welfare of mankind—seemed to be one strategy. In March 1969, Ronald F. Probstein and his fellow researchers at MIT’s Fluid Mechanics Laboratory announced that they had made what they called a “directed effort to change” their research. They reduced the amount of military-sponsored research they were doing from 100 percent to 35 percent, with the remaining 65 percent explicitly devoted to “socially oriented research.”37
The physicist Brian Easlea, hired at Sussex University in 1964 to teach physics, became instead a prolific historian of science and science studies scholar. He switched departments after a two-year stint teaching physics in Brazil, where he was distraught by what he saw of the Brazilian state’s use of scientists and science. His books (four over about a decade) were informed by feminist theory and had a freewheeling, satirical, undisciplined style that presumably did not appeal to professional historians of science—he has not been widely cited in the field—but that raised provocative questions. Easlea attributed many social and political problems to misogyny and insecure masculinity. Liberation and the Aims of Science (1973), Witch Hunting, Magic and the New Philosophy: An Introduction to Debates of the Scientific Revolution 1450–1750 (1980), Science and Sexual Oppression: Patriarchy’s Confrontation with Woman and Nature (1981), and Fathering the Unthinkable: Masculinity, Scientists and the Nuclear Arms Race (1983) all presented novel and feminist ideas about the origins and practice of science.
In his account of the building of the atomic bombs, Fathering the Unthinkable, for example, Easlea dissected the language and social practice of the nuclear weapons community, to highlight a juvenile and misogynistic culture. He tracked the many ways that bombs were commonly seen as living things, and as male or female. When Livermore’s first bomb tests fizzled, other physicists said the lab was producing “girls.” And critics of Edward Teller, intending to insult him, said he was the “mother” of the hydrogen bomb rather than the father: Stanislaw Ulam had the idea and “inseminated” Teller with it, a way of putting things that made Teller the weak, “female” partner. Indeed, Easlea constructed the entire bomb program as an expression of insecure masculinity, which produced violent conquest of nature driven not by real human or even political needs, but by the deep rage of what he called uterus envy.38 Easlea’s reactions to the militarization of science were particularly broad-ranging and included a fairly substantive analysis of the origins of science in general. His only book that was widely reviewed was the 1980 Witch Hunting, Magic and the New Philosophy, and in a sizzling review, the historian of science and medicine Roy Porter (1982) expressed both enthusiasm and bewilderment, calling the book “as frustrating as it is stimulating.”39 This is probably a fair assessment of Easlea’s work in general. Easlea still has some devoted followers—he died in late 2012.40 Unlike so many historians who treated the atomic physicists with an emphasis on their brilliance, Easlea presented them (his own tribe, more or less) as war criminals. Samuel Stewart West also joined in the critique and called into question the culture of physics. Like Easlea, West changed careers, in his case from physics to sociology. He earned his physics degrees at Caltech, completing the PhD in 1934 and publishing several papers in geophysics volumes in 1941 and in 1950. He then earned a master’s degree in sociology at the University of Washington, Seattle, and began to publish papers on the social mores of the community from which he had decamped. Caltech in the 1930s would have been a fertile ground for the kinds of dissatisfaction he came to feel. Kaiser has catalogued how an older generation of physicists responded to the explosive enthusiasm for physics after 1945. Leaders in the field worried that physicists were no longer drawn to the science as a vocation. Graduate students wanted jobs, leisure activities, and suburban homes, and they trained to be managers and bureaucrats rather than path-breaking thinkers.41 West’s enraged 1960 paper calling into question the
“Ideology of Academic Scientists,” however, was about much more than suburban housing or intellectual mediocrity. West interviewed fifty-seven academic scientists. He asked them questions about their freedom to pursue research and reach conclusions based on evidence, as well as impartiality, absence of bias, and group loyalty. He also measured creativity. In his explanation of his research program, West proposed that it is “commonly believed that persons engaged in scientific research adhere to a set of moral values representing ideal types of behavior which facilitate the production of new knowledge. However, the many listings of these values which may be found in the literature range from intuitive to, at best, speculative.” He went on to list the values he found in sociologist Bernard Barber’s 1952 book, Science and the Social Order. These included emotional neutrality, faith in rationality, universalism, disinterestedness, impartiality, and (somewhat incongruously at that particular moment) freedom. West questioned these supposed values, noting that “faith in rationality is essentially irrational.” He concluded that the moral values associated with scientific research were perhaps mythological: “the classical morality of science” was not associated with increased productivity, and any lab manager needed to understand what was really going on in the lab. “Apparently, if there ever were a reasonably firm consensus with respect to scientific values, it was not maintained long after 1920. It may never have been more than a myth,” West concluded.42 The physicist Charles Schwartz also became an outspoken critic of physics. He was trained in physics at MIT (1954) and joined the physics faculty at Berkeley in 1960. He had no particular political interests in the early phase of his career. He participated in a visiting fellowship in Washington, D.C. that was widely understood to be a prelude to joining the Jasons, the secret group of physicists involved in advising the government. At the time he reported that he had no concerns about the militarization of science. But in the summer of 1966, his brother died in an airplane accident, and he read The Autobiography of Malcolm X. As Schwartz told the story in a 1995 oral history, these two events provoked him to reconsider his life.43 He began to be concerned about the war in Vietnam and particularly with the ways that physicists were contributing to the war effort. In a very short time, he became an activist. In 1967 he wrote a letter to the editor of Physics Today, the professional newsletter of the American Physical Society, that questioned the war. The editor of Physics Today refused to publish
it because the publication was “devoted to physics as physics and physicists as physicists.” Schwartz then proposed an amendment that would allow any APS member to present resolutions to the full membership for consideration. It was framed as a free speech question and did not mention Vietnam, but it was actually about scientists and the war. Later somewhat infamous, the proposal sparked a public debate in the letters to the editor pages of Physics Today. Even Edward Teller weighed in, proposing that physicists should never become part of “pressure groups.” As the journal Nature noted a few weeks later, “It is of course delightfully disingenuous that Dr. Edward Teller, who has been a one-man pressure group on defense policy for several years, should now maintain that pressure groups are a nuisance.”44 Teller, of course, was pressuring public officials to invest more resources in building more bombs.45 Schwartz’s efforts in 1969 galvanized a group of other physicists to help form a new, antiwar group, Scientists and Engineers for Social and Political Action (SESPA, later Science for the People, SftP). “We reject the old credo that ‘research means progress and progress is good,’” planning documents said.46 Mainstream scientific organizations had remained aloof from pressing problems that engaged scientists, but this new group would refuse to be aloof. “Why are we scientists? For whose benefit do we work? What is the full nature of our moral and social responsibility?”47 SESPA and then SftP were more radical than groups like the Federation of American Scientists, which promoted active engagement with federal policies as a means of shaping them. The more radical scientists were puzzled by their colleagues’ lack of interest in political and social questions. Physicists played leading roles in military advising and research, yet many seemed indifferent to the consequences of their work.48 Schwartz catalogued what he called the “rationalizations” of scientists when questions were raised about their engagement with the military, including “I am fooling the DoD by taking their money for my research,” “I take DoD money, but I am just doing basic research, not work on weapons,” and “If I don’t do this work on weapons, someone else will.”49 The physicists were not alone in their concerns. The American Society for Microbiology’s formal involvement with US biological weapons production began during World War II (many nations established biological warfare programs after 1925). For the microbiologists, this involvement had significant professional consequences: of the ninety-one presidents of the American Society for Microbiology, twenty-one spent some
portion of their time at Fort Detrick, the main US biological defense research facility.50 Microbiologists have not always agreed about whether such forms of labor are legitimate or suspect, and their halting efforts to reach a consensus made no progress during the Second World War. In 1967 a debate unfolded among members of the American Society for Microbiology about the society’s relationship to the nation’s chemical and biological weapons research programs at Camp Detrick, Maryland. Many biologists supported the close ties to Detrick, and leadership in the society often included those who had worked with or for Detrick. Others felt that the Advisory Committee of the society that was assigned to help Camp Detrick should be disbanded. In a first round vote, this proposal failed. One member said that disbanding the committee would be a “political” initiative that would “shroud the organization with moral or political views.” Another said it “commits our members by majority vote to an implied ethical or moral position that has no place in a professional society.”51 Supporting biological weapons research apparently had no implied ethical or moral position. In 1968, outgoing ASM President Salvador Luria dissolved the Advisory Committee, saying that the close ties of the society with the defense establishment were ethically incompatible with the proper roles and functions of a scientific society. But in a special plenary session, the membership overruled Luria and reestablished the Advisory Committee.52 These failed initiatives reflected professional realities: many scientists supported US military research and saw no conflict between defense projects and their professional values. But the discontent around the edges, I would suggest, was nonetheless revealing and important. Activist critics of defense spending and its impact included the cybernetician Norbert Wiener, former Los Alamos physicist Joseph Rotblat, and biochemist and Nobelist Linus Pauling. Fractures opened within the scientific community around defense-oriented science and particularly around the legitimacy of the war in Vietnam.53 When activist groups like Science for the People adopted showy tactics of public disruption, staging pranks at meetings of the American Association for the Advancement of Science, they were reacting to the fact that scientists were directly responsible for some of the most egregious technologies of modern warfare. Most radically, in 1972 Science for the People leaders proposed that doing good science would require restructuring society. Capitalism, racism, sexism, and imperialism led to forms of knowledge that reinforced hierarchies and social injustice.
It is possible to see the group as negotiating the proper role of science in a capitalist democracy. They were also interrogating what the individual scientist should do. What was the proper position of the expert caught in networks that were so heavily influenced by defense spending? Many recognized that individual choices were not likely to change what was happening to science in general. The structural and institutional forces must have seemed unchangeable, as the money and power that circled through scientific networks were unprecedented. In late 1945, the Association of Pasadena Scientists was founded as a response to the growing controversy over the use of atomic weapons. The group was intended to help experts meet the “apparent responsibility of scientists in promoting the welfare of mankind and the achievement of a stable world peace.” The same or similar phrasing appeared in founding documents for the Federation of American Scientists (1945), Atomic Scientists of Chicago (1945), Society for Social Responsibility in Science (1949), Pugwash (1957), Physicians for Social Responsibility (1961), Scientists Institute for Public Information (1963), Union of Concerned Scientists (1969), Science for the People (1969), International Physicians for the Prevention of Nuclear War (1980), Computer Professionals for Social Responsibility (1981), Psychologists for Social Responsibility (1982), High Technology Professionals for Peace (1984), and even, finally, in the ethics codes of the American Society for Microbiology, which concluded after several decades of debate and contention that microbiologists should as a group “discourage any use of microbiology contrary to the welfare of humankind.” The Advisory Committee to Detrick was finally terminated. Other societies, like AAAS and the American Chemical Society, also adopted provisions in the 1960s and 1970s that stated that science was intended to facilitate the welfare of mankind. Was it necessary to have codes of ethics, public announcements, and special statements by professional societies to enforce the idea that science was intended to facilitate the welfare of mankind and that scientists, engineers, and physicians should not be involved in making things that injured human beings? For the true insiders, the elite physicists at the top of the power structure, the Cold War could be an intoxicating, almost seductive time of influence, discovery, and relevance. Herbert York, who helped build the hydrogen bomb, saw himself as participating in one of the great events of history—a key player who as a mere thirty-year-old interacted with “legendary yet living heroes.”54
But for many others, it was a time of anxious reckoning with their roles in producing violent human injury. Science became socially and professionally a different kind of labor, with different rules and protocols and with central contradictions that generated tensions within the community. Experts trained in the deep benefits of science for the welfare of mankind could either struggle with the cognitive dissonance or ignore it. Many scientists ignored it. I would suggest that in the scientific community violence is a lot like gender. Gender is often “the ghost at the table.” Whether or not people articulate it, choose to notice it, or can be persuaded that it matters, gender is still there at the table in social networks. It still shapes social and emotional interactions and guides life paths. Human societies have organized labor so relentlessly in gendered terms that the gender system has been naturalized to the point of invisibility. It is socially, maybe even strategically, not seen. In terms of postwar science I use the image of the ghost in the same way. I want to suggest that violence has been similarly present and absent in the technical networks I consider. It was obvious from some perspectives and to some of the players. But others simply could not see it or could not notice it. At many times, it has been purposely invisible—invisible for a reason. To turn too directly to consider it would involve some difficulties for many experts. It is the awkward difficulties in practices, professional strategies, and emotional reactions that I have tried to outline here. The consequences I consider may be absent from many formal accounts of postwar science. But the deeply conflicted role of science in military violence was always there, and still is. Wang’s brilliant exposition of how the physicist Merle Tuve came to terms with both his romantic vision of pure science and his critical role in the production of military technologies captures the incoherence with empathy.55 For experts in the Cold War, this dark intersection of innocence and guilt could at times produce professional anguish. As Wang noted, the mid-twentieth century produced a seemingly endless array of bodies and gatherings devoted to earnest investigation of the nature of science and its human dimensions. These included the Conference on Science, Philosophy, and Religion (1939–1960s); the Conference on the Scientific Spirit and Democratic Faith (1943 and 1946); the Committee on Science and Values, organized by the American Academy of Arts and Sciences in 1951; the Institute on Religion in an Age of Science (1955 and later); and the 1959 “Symposium on Basic Research,” cosponsored by the National Academy of Sciences, the American Association for
the Advancement of Science, and the Alfred P. Sloan Foundation. Many other public meetings in this period addressed the nature of modern science and the moral implications of new scientific knowledge and new technologies. Popular texts like Jacob Bronowski’s Science and Human Values and Vannevar Bush’s Science, the Endless Frontier articulated links between science and “civilization.” My own discipline, history of science, was institutionalized in US universities at this moment, partly as a form of instruction that would validate and promote the power and importance (and even the purity) of science in Western civilization. After World War II, the chemist James B. Conant, who had promoted the importance of chemistry during both World Wars and was then President of Harvard, saw history of science as a civics lesson, a way to teach freedom and citizenship. But leaders of Science for the People more or less inverted Conant’s proposal and said that doing good science would require restructuring society. Powerful institutions could shape knowledge, they proposed, and proper knowledge could be produced only in a just society. This internal critique of science that unfolded in the 1960s has an ironic afterlife. If science was in fact always shaped by politics, belief, assumption—if it was or could be the product of militarism, racism, imperialism, capitalism, and sexism—why should it be trusted at all?56
Conclusion: Reason, Terror, Chaos
MILITARY TECHNOLOGICAL AND SCIENTIFIC ACHIEVEMENTS ARE SEDUCTIVE. THEY OPERATE IN VARIOUS
kinds of half-light, and they generate collateral data that can then serve as a guide to future violence. In the process of the rise of scientized warfare, the forces of reason have often operated in a fog in which reason can produce unreasonable things. Today, as we face new kinds of terroristic warfare, a mirror-like dynamic is in play. Those challenging extremely powerful nations leverage science and technology to disrupt the very systems of reason, rationality, and order out of which science and technology developed. Similarly, feelings have been a critical target in industrialized and scientific warfare. Air power, submarine warfare, satellites, the internet, defense-funded neurosciences, drone sciences, and psychiatry all engage with producing fear, generating uncertainty, and controlling emotions. Today, non-state actors hoping to undermine powerful states also target feelings. Generating fear, rage, confusion, hatred, or even forms of love can be a resource to undermine an enemy. In her classic ethnographic work with defense intellectuals, Carol Cohn was struck by how readily anger, rage, and frustration could be openly expressed by participants in defense planning discussions, while sadness and
FIGURE 22. August 6, 2015, the 70th commemoration of the atomic bombing of Hiroshima: a young boy helping place in the river my yellow floating lantern, on which I wrote a message of peace. Photo by the author.
empathy were virtually forbidden, off the table, and subject to profound social disapproval. In her unforgettable opening anecdote in a 1993 paper, she described a “true story told to me by a white male physicist.” This physicist was working with a group modeling counterforce nuclear attacks, trying to get realistic estimates of the number of immediate fatalities that would result from different deployments. “At one point we remodeled a particular attack, using slightly different assumptions, and found that instead of there being thirty-six million immediate fatalities, there would only be thirty million. And everybody was sitting around nodding, saying ‘Oh that’s great, only thirty million,’ when all of a sudden I heard what we were saying. And I blurted out, ‘Wait, I’ve just heard how we’re talking—Only thirty million! Only thirty million human beings killed instantly?’ Silence fell upon the room. Nobody said a word. They didn’t even look at me. It was awful. I felt like a woman.”1 His feeling “like a woman” reflected the powerful symbolism of the gender system and its real-world effects in terms of a discussion, in a room, about an imagined future nuclear war—and perhaps in terms of many other discussions, in many other rooms.
The early twentieth-century biologist and philosopher Ludwik Fleck suggested that those things seen to be neutral or rational—those things understood to be outside of the realm of emotion—are precisely the things around which crucial values and assumptions are expressed. Fleck’s 1935 study, The Genesis and Development of a Scientific Fact, barely noticed in his own lifetime, was taken up by historians of science in the 1960s as a powerful model for understanding science in general.2 One of his most important interpreters was the physicist turned historian Thomas Kuhn, who made Fleck’s ideas about thought collectives and thought styles a central part of his highly influential study of “paradigm shifts,” The Structure of Scientific Revolutions (1962). This book became one of the most frequently cited and well-known studies in the field. But when Kuhn drew on Fleck, he erased Fleck’s profound and compelling attention to emotion. As Uffa Jensen notes, for Fleck, scientists “enter a state of emotional chaos” when they encounter something that contradicts conventional wisdom or might lead (as Kuhn would have it) to a paradigm shift. This uncertainty and anxiety, Fleck proposed, persisted until a new explanation in a novel thought style (a different model of nature) could be constructed. Fleck therefore saw certain emotions (including confusion, reassurance, confidence, and bewilderment) as an integral part of the research process in the natural sciences and a critical element in any understanding of science as a knowledge-creation process. “These emotions are part of the very core of scientific observation and knowledge production,” Jensen notes. But Fleck’s “insistence on the important role that emotions play in scientific processes of observation, explanation, and theory building was lost on Kuhn.”3 Fleck himself would perhaps have appreciated the informative nature of Kuhn’s omission, for one of Fleck’s fundamental assumptions about the social world was that emotion is everywhere and in every act. If and when emotion seems to disappear, then that point of disappearance is a point of critical consensus of some kind. Fleck saw neutrality and rationality almost as cultural blind spots. They were notions around which the consensus was so thick that emotion could seem to be absent. Kuhn’s decision to selectively embrace Fleck’s ideas, but leave the emotions out, was therefore consistent with Fleck’s own insights. In the heart of the Cold War, and its many seething passions, with a scientific community in crisis and scientists even on his own campus (Princeton University) fighting among themselves about the nature and value
of science, Kuhn sought to construct a picture of science that sustained its cool neutrality. Historians and other humanistic and social scientific scholars have struggled with how to think and write about emotions. For some historians of science, attention to emotion seems to threaten to become psychological or psychoanalytic—the post hoc imposition of modern theories of personality on historical figures—or an explanation of the past in terms of the feelings of individuals rather than power structures or ideologies. Scholars of international relations in political science have also tended to ignore the explicit consideration of emotion. Political scientist Neta Crawford attributes this to the “assumption of rationality” in which state behavior is interpreted as intelligent and driven by reason—by a cognitive calculation of cost-benefit. Even those who study “irrational” behavior in international relations look for cognitive biases, not emotions, she notes. A few emotions (particularly hatred) are widely accepted as relevant to international relations, and these tend to be unproblematized by IR professionals: they seem obvious. But other emotions are less accepted and less noticed, even if they are in practice built into theories of world politics. Crawford suggests that “fear and other emotions are not only attributes of agents, they are institutionalized in the structures and processes of world politics.” But their presence and importance have been systematically disappeared.4 Emerging literatures in the history of emotion suggest that emotions act in the world, and for the purposes of social and historical analysis, it is not necessary to extract any data about “real,” interior feelings or mental processes. Rather, one can notice how emotions are invoked, understood, explained, and performed. Emotions can be approached as a language-game that follows generic and narrative conventions, Crawford suggests, and emotion-rules are encoded in grammars of representation. Seeing emotions in terms of a grammar of representation permits a certain agnosticism about interior mental states. It does not require a hypothesis about actual feelings or experienced feelings. At the same time, I want to make it clear that even the idea that there is a technologically visible, biological, interior correlate of emotion is a product of the rise of modern science. This bodily change that can be studied in a laboratory—this mechanized emotion of neurological circuits and organic brain states—comes into being with modernity and with modern science. It is a product of knowledge systems, known to exist because of the practices that track its interior mechanisms.
Most importantly, I want to emphasize that emotions are commonly expressions of power relations. They link the individual with the social in dynamic and revealing ways. They are always about what Crawford calls “enaction,” enacted through social and political order.5 I begin my concluding thoughts with this exposition on emotion because war is a domain of intense emotion. Clausewitz knew that, and I would guess that many active service people know that today. Machines and scientific ideas now structure how those emotions look, what they mean, and how armed forces and non-state actors deploy them. Even though the strategies of twenty-first-century terrorism may seem primitive, brutal, or pre-modern, they reflect technocratic rationality and the global force of reason. They are produced by the intensity of technocratic war: for those challenging powerful states which control scientific weapons of inconceivable potential, producing fear to cause damage is a technologically realistic option. In his 1946 catalog of words and phrases he identified as new slang, the linguist I. Willis Russell included “mass-produced,” “bamboo railroad,” and “incentive pay.” As chairman of the American Dialect Society’s New Words Research Committee from 1944 to 1984, he seems to have considered language a rich venue for social and political commentary. His 1946 report also described the new and common use of “terror bombing.” He defined it as bombing designed to hasten the end of the war by “terrorizing the enemy population.” Terror bombing of German cities, he said, was “deliberate military policy.” As Russell shrewdly observed, fear was an intended outcome of air power, and while many of those in leadership positions in the Army Air Forces proclaimed that the United States did not target “the man on the street,” the US and other air services did target not only the man in the street but also the woman and the child. In the process, they thought they were targeting how people felt.6 In 1943, US Army Chief of Staff George Marshall proposed that it was important to bomb Munich “because it will show the people being evacuated to Munich that there is no hope.”7 Air power was thus supposed to produce feelings—hopelessness—and those feelings would translate into Allied victory. Indeed the generation of feelings was what the bomber did best. As the war unfolded the technology increasingly came to be deployed by all sides to do what it could do: indiscriminate damage that scared people. The initial plan for careful and targeted bombing of factories or rail lines was not as effective as planners hoped (remember Galbraith’s suggestion that it had no
impact on the German economy). It gave way to indiscriminate carpet bombing and fire bombing at least partly because targets were very difficult to hit, and hitting them had limited effects on industrial production or on the German war machine. In the decades after the war, hitting targets accurately became a high priority for the Pentagon, and drone warfare became one solution. But perhaps surprisingly, drones also in the end came to produce political outcomes partly by generating fear. Drones were mass-produced during the Second World War. The young Norma Jean Dougherty, who became Marilyn Monroe, worked in a drone factory and had her first photo opportunity on the drone production line—she was “discovered” by a photographer documenting wartime production (Figure 23). But these were “target drones,” small remotely controlled planes that could be used to train pilots and artillery teams in the practice of aiming. The word “drone” was an explicit reference to male bees, which do not sting and which were seen as dispensable. Target drones were intended to be destroyed.8 In the 1950s, remotely piloted aircraft (RPAs) began to be tested as possible reconnaissance aids. The issue became more pressing after 1960, when a CIA spy plane piloted by Gary Powers was shot down over the Soviet Union. Powers was captured, tried, and sentenced to prison in the Soviet Union for espionage. The U-2 plane was supposedly flying too high for Soviet missiles. But in fact a missile brought it down and Powers parachuted out. The incident strained US-Soviet relations further. The idea of a surveillance vehicle that did not have a pilot inside became more appealing.9 Ryan Aeronautical Company had been producing jet-powered “Firebee” target drones, and they were adapted (more fuel, slightly larger) for possible long-range aerial reconnaissance. These new drones, code-named “Fire Fly,” were launched from beneath the wing of a DC-130, with the operator on the plane. This operator could tell the drone where to go and what to do. Work on drones was top secret. Flyovers violated international law and even some Air Force insiders objected to unmanned drone surveillance programs. General Walter Sweeney, Commander of Tactical Air Command (TAC), “emphatically refused to participate in a reconnaissance program conducted with unmanned” drones. As one participant recalled, “the meeting ended on a Sweeney exclamation: ‘When the Air Staff assigns eighteen-inch pilots to this command, I’ll reconsider the issue!’”10
FIGURE 23. The young Marilyn Monroe, making drones. David Conover / United States Army.
In 1964 the engineer John W. Clark produced a study of “remote control in hostile environments.” Like earlier engineers who theorized about technologies that could take the place of people in dangerous situations, he pictured autonomous deep-sea labor. In effect, the consciousness of the man operating such a robot is “transferred to an invulnerable mechanical body.”11 Clark called these “telechiric systems” in his April 1964 paper in New Scientist.12
It is perhaps not trivial that the very first radio-controlled autonomous bomb-delivery planes, proposed by RCA engineer Vladimir Zworykin in 1934, were theorized on the basis of Japanese suicide pilots. Kathryn Chandler’s 2014 study notes that Zworykin, a television “pioneer,” wrote a memorandum to the president of Radio Corporation of America (RCA), David Sarnoff, describing a possible “Flying Torpedo with an Electric Eye.”13 Zworykin said a camera could transmit images from an airborne torpedo to an operator who could remotely control what would in essence be a flying bomb. He was at least partly responding to newspaper reports in the 1930s of Japanese Suicide Corps training. “The solution of the problem evidently was found by the Japanese, who according to newspaper reports, organized a Suicide Corps to control surface and aerial torpedoes.” Chandler, whose work tracks how “unmanned” warfare was imagined and enacted through drones, points out that Zworykin saw technology as a way to get around the limitations of the body and culture. “We hardly can expect to introduce such methods in this country,” Zworykin said, “and therefore have to rely on our technical superiority to meet the problem. One possible means of obtaining practically the same results as the suicide pilot is to provide a radio-controlled torpedo with an electric eye.”14 Zworykin’s notion of an “intelligent” radio-controlled plane which could see targets in real time and return after a mission was very different from the missiles and rockets already being developed and later widely used in the Second World War. He imagined a vehicle that had a pilot, of sorts, even if that pilot was not in the plane. In 1972, the former Bell Laboratories engineer Robert Barkan—by then a member of the Pacific Studies Center in Palo Alto, California—predicted that the absent pilot would become a cost-saving strategy for the Department of Defense. “Much of today’s high fighter costs—over $3 million for the F-4 Phantom—are spent on increasing the probability that the human crew returns alive.” Unmanned aircraft, he pointed out, have no need for multiple life-support systems, ejection seats, reliable engines, and strong airframes. They can be lightweight, and “once the man is taken out of the plane,” more maneuverable and fast moving.15 The armed Predator drone widely used today is the twenty-first-century realization of these visions. It was developed for CIA use after Israeli success with targeted killing and armed drones in the late 1990s. It combines intense killing force with completely asymmetrical risk. The covert CIA program to kill individuals using unmanned aerial vehicles far outside war zones began
under President Bush and was expanded under President Obama. The first targeted killing using a drone was in June 2004. Since then hundreds of Predator attacks have been carried out in Pakistan, Yemen, Somalia, and Libya. In 2016 the Obama administration reported that the United States had killed 2,436 people in 473 strikes. The official report said only between 64 and 116 of those killed were noncombatants. Journalists on the ground have challenged these numbers, particularly the very low count for noncombatants. The philosopher Grégoire Chamayou has proposed that the drone looks like “the weapon of cowards” when considered in light of traditional principles of bravery and self-sacrifice. Pilots are in Nevada when they strike individual targets in Somalia or Libya. The risk is completely, utterly nonreciprocal. Many earlier weapons systems produced asymmetrical risk—it is a general trend in the rise of modern warfare from archery to guns to artillery to V-2 rockets—but the modern drone produces levels of asymmetry and extreme precision that are truly novel. As Barkan put it in a 1972 essay in The New Republic, “War will eventually become a contest between machines.”16 Machines, he said, did not bleed, die, get addicted, shoot their officers, or refuse to fight. While drone warfare might alter relationships within military institutions, as Barkan suggests, it also alters the relationship between citizens and the state. By eliminating even the possibility of US combat deaths, such technologies shift the equation for justifying violent state action. They lower the bar, so that US citizens risk nothing in pursuing violent attacks on others. This is destabilizing in the sense that it makes possible military action that has no “cost.” For some Pentagon officials, this complete elimination of risk to pilots is the key advantage of Predator and other drones, which will do “a better job” than humans. One told defense scholar Peter W. Singer that “they don’t get hungry, they’re not afraid. They don’t forget their orders. They don’t care if the guy next to them has just been shot. Will they do a better job than humans? Yes.”17 Drones produce new kinds of fear. One tribal leader in North Waziristan told a New Yorker reporter that the long buildup to a drone attack “turned people into psychiatric patients.” The drones circle for hours or even days before striking, and people below can see them, hovering at about 20,000 feet. They generate an insect-like sound, a flat, gnawing buzz. “The F-16s might be less accurate, but they come and go.”18
Chamayou proposes that the whole world has become a hunting ground and drones can be used wherever the CIA and the Pentagon choose, even outside war zones. But what is a war zone? “By redefining the notion of armed conflict as a mobile place attached to the person of the enemy, one ends up, under cover of the laws of armed conflict, justifying the equivalent of a right to execute suspects anywhere in the world, even in zones of peace, illegally and without further procedures, one’s own citizens included.”19 Air war theory from the 1930s into the 1980s imagined a stable enemy whose means of production could be destroyed with bombs. This destruction of the physical means of production would lead to state defeat, as it became impossible to continue the war effort due to lack of manufactured goods and technologies. But this no longer describes violent conflict in the twenty-first century. Fewer and fewer people and resources can now produce damage, even to a nation as powerful and well-defended as the United States. And while the means of producing damage are still often scientific and technological, the technologies do not have to be manufactured in factories controlled by the enemy. They need not even be weapons. They can be made for other purposes, and leveraged by those who wish to produce chaos: The civilian aircraft that brought down the World Trade Towers in 2001 were built for corporate use in commercial air travel networks. They became the physical equivalent of bombs because of the ways that they were used. And non-state actors like ISIS, with limited production resources and no stable or protected factories to manufacture weapons, can draw on the fringes and everyday edges of a global technical armamentarium. They can turn cell phones made for consumers into IEDs and use captured military technologies like US Humvees, US M-198 Howitzer artillery, Chinese field guns, and old Soviet AK-47s. Violent non-state groups benefit from the superfluity of weaponry, from accidents, abandoned materials, and captured guns. Weapons can start anywhere and end up anywhere. They can be manufactured in Ohio, taken to support US troops in Syria, abandoned during a firefight, captured and repurposed, and then used to kill US troops. New technologies also produce entirely new strategies for emotional damage. No one at the Orlando nightclub in June 2016—none of the forty-nine people killed, the fifty-three wounded, or the hundreds or thousands of others traumatized—was about to launch a military attack on anyone. Whether these particular people lived or died had no effect on any live military front or battlefield. They were killed in a war that they were not actually
participating in. And they were not “collateral” damage but the intended victims. They were involuntary participants in a brutal performance that depended on technoscience and virtual witnessing for its efficacy. The reason to kill and injure them was so that their murders could be seen and their trauma replayed on mass media. They were part of a psychological warfare program, their deaths intended to produce fear and anguish. The global web has become a challenging new military frontier in many ways. In her terrifying reflection on cyberwar, Chris Demchak, a professor of cybersecurity at the US Naval War College, proposes that we have entered an era of the “democratization of predation globally.” Cyberspace, “entirely man-made, man-owned, contracted out, man-maintained, man-updated, man-monitored, man-defended, and man-disrupted,” has made violence easier. It has “eased three historically daunting systemic obstacles to predatory behavior, namely scale, proximity, and precision.”20 War in the past required resources to organize and supply armies; cross long distances to carry out active battles; and support research, technology, production, and training. Now with cyber technologies of war, would-be attackers can get by with limited resources and small numbers of people, and can with impunity attack anyone anywhere. “The characteristics of the globally open, unfettered, opaque cyberspace have not only generated much wealth but have also democratized predation on strangers at will with little to no governance or social controls to curb appetites or success.” This “virtual anarchy” of the global web, she says, “has led not to war as we have known it but to a form of conflict in the interstices between peace and war that involves not only states but also anyone with access, time, and basic equipment.”21 Cyberwar requires no uniformed militaries, observable incursions, or physically evident military power. Rather, it is more like a convoluted systemic struggle “in which . . . campaigns may take years to unfold largely cloaked in multilayers of preferably anonymous deception and the slow, deep, systemic enfeeblement of adversaries rather than any identifiable, attributable, direct, physical strikes. This not-quite-peace-but-clearly-not-traditional-war” changes war for the foreseeable future. Cyberspace is now “a conflict-laden substrate shared globally across the critical systems of any connected society.” Referencing the traditional idea of state sovereignty and independence that more or less resolved the Thirty Years War in 1648, she predicts the necessary emergence of a new Cyber-Westphalian system to
manage the global web. She also proposes that the importance of this web for national sovereignty has not been generally recognized and understood. Invoking a rather simple form of technological determinism, she suggests that cyberspace now joins other technologies that have transformed war in the past—from “the stirrup, the long bow, gunpowder, the steam engine, telegraph, radar, to nuclear fission.”22 The Department of Defense has defined the global web as territory that is equivalent in terms of its security implications to the traditional territories of land, sea, and space. But military theorists and scientists wrestle with questions of just war and cyberattack. A physical invasion with troops is by consensus a legitimate reason for military counter-attack. Force and violence are justified responses. But if a cyberattack shut down a country’s banking system, producing chaos, and if it could be tracked directly to operatives in a sovereign nation, would a legitimate response by the attacked nation be physical invasion? Bombing? What if such an attack undermined democratic institutions by manipulating election results or public opinion? Attacks on democracy are in fact underway around the world. Aggressors using such methods recognize that controlling elections and public opinion can shape state survival. New technologies are pushing the limits of existing laws and theories, and as Demchak suggests, may even be implicated in new ways of understanding the modern state and its power and autonomy. Technical expertise is now so thoroughly militarized that there is no way out—not for experts, scientists, political leaders, or the public. An enterprise with explicit commitments to the welfare of mankind has also produced anticipatory and actual human injury on a grand scale, not as a matter of individual choice, but as a matter of structure. I proposed in my introduction that violence was central to twentieth-century knowledge systems, that the battlefield became a crucial field laboratory in this period, and that the wounds I consider (the experimental injuries documented in formal texts) are evidence of both nature and history. They help us understand how those producing them thought, what worlds they lived in, and what kinds of questions loomed large for them. Gaining greater insights into violence today is one primary job of highly trained chemists, physicists, computer scientists, oceanographers, and mathematicians around the world. Everything humans know about nature can become a resource for state power, and every form of knowledge can cut both ways. If you know how an economy works and what facilitates its growth,
you also know how to bring that economy down. If you understand what the human mind needs to sustain a sense of safety and order, you also know how to destabilize that mind. If you understand the engineering of a bridge, you know how to bring it down. And if you know how to stop a pathogen, virus, or bacterium, you also know how to maximize its spread. Over the last century, scientists and engineers figured out many ways to produce human injury. It was not the most obvious use to which human intelligence could have been applied, but it has been a very important one. In characterizing how and why this occurred, I have invoked efficiency and reason—ideas central to the very models of rationality that I describe—to suggest that at least some of this scientific effort has involved significant waste of human ability and talent. I do not see an easy way to reorient knowledge around “the welfare of mankind,” though I think that seeing the problem clearly is a first step.
NOTES
INTRODUCTION
1. Harvey, 1948; Owens, 2004. 2. Hughes, 2004; Cowan, 1983; Alder, 1997. 3. Malone, 2000; R. M. Price, 1997. 4. My use of “legible” is drawn from the work of James Scott, in his study Seeing Like a State, and refers to the ways that certain aspects of social and political life are noticed and attended to—legible—while others are ignored or seen as irrelevant. J. Scott, 1988. 5. Forman, 1973. 6. Zachary, 1997; Owens, 1994; Kevles, 1975. 7. Bousquet, 2009; Van Keuren, 1992, 2001; Rees, 1982. 8. See Westwick, 2003. 9. Dennis, 2015. 10. Rohde, 2009; Moore, 2008; Bridger, 2015. 11. Dennis, 1994; for a compelling sociological account of one early debate, see Nelkin, 1972. 12. Relyea, 1994. 13. Hollinger, 1995, 442. 14. Swartz, 1998.
15. All in Hollinger, 1995, 442–446. 16. Beecher, 1955. 17. Lindee, 1994. 18. Haraway, 1988, 579, 583. 19. Peter Paret’s work has helped us understand Clausewitz both in his own time and as his ideas acted in the world after his death. See Paret 2004, 2007, and 2015. 20. Paret, 2007, 9. 21. See Bellinger 2015 on Marie’s influence in the text. 22. Ghamari-Tabrizi, 2005. 23. Kaldor, 1981. 24. See particularly essays in Krige and Barth, 2006. 25. Galison, 2004, 237. 26. Other scholars, including Bousquet (2009) and Hacker (1994), have told much of the central story. 27. Schwartz, 1998.
1. TO HOLD A GUN
1. Lorge, 2008. 2. See Roberts, 1956, and, less egregiously, McNeill, 1982, and Parker, 1996. 3. See Evans, 1964. 4. Gat, 1988; also, Cassidy, 2003. 5. McNeill, 1982; B. S. Hall, 1997. 6. But see Buchanan, 2008, on the importance of charcoal. 7. Cressy, 2011, 2012; also, Frey, 2009. 8. Frey, 2009. 9. Cushman, 2013. 10. On the history of how chemists engaged with gunpowder production, the work of Seymour Mauskopf is particularly relevant. See Mauskopf, 1988, the edited volume in 1995, and the essay on gunpowder, 1999. See also Buchanan, 2014. 11. Buchanan, 2006. 12. Nayar, 2017. 13. Nayar, 2017, 521. 14. Parker, 2007, 353. 15. Kleinschmidt, 1999; Parker, 2007. 16. Kleinschmidt, 1999. 17. McNeill, 1995. 18. Malone, 2000; Silverman, 2016. 19. Silverman, 2016. 20. Silverman, 2016. 21. Diamond, 1997. 22. Perrin, 1979; Kleinschmidt, 1999.
23. Parker, 2007. 24. Perrin, 1979. 25. Kleinschmidt, 1999, 626. 26. Perrin, 1979. 27. Buchanan, 2012, 924. 28. Buchanan, 2012; see also Ágoston, 2005; and Gommans, 2002. 29. Ralston, 1990. 30. Ralston, 1990. 31. Gross, 2019. 32. Inikori, 1977, 2002; Richards, 1980; Hacker, 2008. 33. Marshall, 1947. 34. Marshall, 1947; Grossman, 1995. 35. Spiller, 2006; Strachan, 2006; Rowland and Speight, 2007. 36. Grossman, 1995. 37. Grossman, 1995. 38. Grossman, 1995. 39. Diamond, 1997.
2. THE LOGIC OF MASS PRODUCTION
1. Bousquet, 2009, 75. 2. Bousquet, 2009, 76. 3. Carnahan, 1998, 213. 4. See Kaempf, 2009. 5. Faust, 2005, 28. 6. Alder, 1997. 7. Small, 1998; Diamond and Stone, 1981. 8. Williams, 2008. 9. Diamond and Stone, 1981. 10. Diamond and Stone, 1981. 11. Diamond and Stone, 1981, 69. 12. Immerwahr, 2019. 13. Though see Carpenter, 1995. 14. Sumida, 1997. 15. Mahan, 1890. 16. Karsten, 1971; LaFeber, 1962. 17. Kennedy, 1988; Sumida, 1997. 18. Fairbanks, 1991. 19. Mindell, 1995; 2000. 20. Fairbanks, 1991. 21. O’Connell, 1993. 22. Anderson, 2006.
3. TRENCHES, TANKS, CHEMICALS
1. See, for example, Stichelbaut and Chielens, 2013; Shell, 2012; and Finnegan, 2006. 2. Travers, 1987; Selcer, 2008. 3. For a helpful comparative perspective, see Sachse and Walker, 2005. 4. McNeill, 1982. 5. Nickles, 2003; Winkler, 2015. 6. Ashworth, 1968. 7. Ashworth, 1968. 8. See R. M. Price, 1997; E. Jones, 2014. 9. E. Russell, 2001. 10. R. M. Price, 1997; E. Russell, 2001; Haber, 1986. 11. Haber’s son Ludwig Haber published an account of his father’s work that is definitely worth reading, Haber, 1986. 12. R. M. Price, 1997. 13. George, 2012. 14. D. P. Jones, 1980. 15. R. M. Price, 1997. 16. See Müller, 2016. 17. McNeill and Unger, 2010. 18. Cohn, 1993. 19. Szabo, 2002; Wessely, 2006. 20. Loughran, 2012. 21. Loughran, 2012. 22. Johnson, 2015. 23. See Shephard, 2000; Wessely, 2006; Winter, 2006. 24. Badash, 1979. 25. Gordin, 2015b. 26. E. Crawford, 1988. 27. Stanley, 2003. 28. Heilbron, 2000. 29. Kevles, 1971. 30. E. Crawford, 1988; 1990, 252. 31. E. Crawford, 1988, 164. 32. Doel, Hoffman, and Krementsov, 2005. 33. Irwin, 1921, 44. 34. Irwin, 1921, 44. 35. See T. Biddle, 2002, 264, 268. 36. T. Biddle, 2002, 265. 37. T. Biddle, 2002, 267. 38. All cited in T. Biddle, 2002, 268. 39. Cited in T. Biddle, 2002, 268. 40. In Freud, 1957, 307, from a 1915 essay.
41. Reprinted letter in Rilke, 1947. 42. Einstein and Freud, 1933.
4. MOBILIZED
1. May 17, 1946, Meeting of the National Research Council, Washington. Records of the National Research Council, Archives of the National Academy of Sciences, Washington, D.C. Grow’s autobiographical Surgeon Grow: An American in the Russian Fighting chronicles his experiences on the Russian front in World War I. 1918. New York: Frederick A. Stokes. 2. United States Department of Commerce, Office of Technical Services, 1947. 3. See Owens, 1994. 4. T. Biddle, 2002, 266; S. Biddle, 2004. 5. Owens, 1994. 6. Kevles, 1977, 11. 7. Daemmrich, 2009; Lindee, unpublished manuscript. 8. Grier, 2005. 9. Kevles, 1977. See also Feffer, 1998. 10. Herman, 1995. 11. Galston, 1972; Bridger, 2015. See also Anonymous, 2008. 12. Owens, 1994, 523. 13. Zachary, 1997. 14. Zachary, 1997. 15. See Stewart, 1948, and all the OSRD reports, listed in US Department of Commerce, Office of Technical Services, 1947 (Office of Technical Services, 1947; Navy Research Section, 1950). 16. Zachary’s biography captures this quality, 1997. 17. Owens, 1994, 530. 18. Liebenau, 1987. 19. Fleming, 1929, 227. 20. Selcer, 2008. 21. Bud, 1998; Neushul, 1993. 22. Quoted in Wright, 2004, 495. 23. Harris, 1999. 24. Hobby, 1985. 25. Daemmrich, 2009. 26. Daemmrich, 2009; Swann, 1983. 27. Adams, 1991; Keefer, 1948. 28. Rasmussen, 2001. 29. E. Russell, 1999; Perkins, 1978. 30. Dunlap, 1978; E. P. Russell, 1996; E. Russell, 1999. 31. E. Russell, 1999.
32. Quinn, 1995. 33. Siegfried, 2011. 34. Sime, 1996, 92. 35. Sime, 2012. 36. On Groves, see Bernstein, 2003. 37. On Oppenheimer, see Bird and Sherwin, 2005. 38. Andrew Brown, 2012. See also Veys, 2013. 39. Landers et al., 2012. 40. Carson, 1962. 41. Masco, 2008, 362. 42. Masco, 2008, 362.
5. UNFORGETTABLE FIRE
1. Japan Broadcasting Corporation, 1981. 2. Committee for the Compilation, 1981, xv. 3. See Gentile, 2000, 1085. 4. Dower, 1986; Hasegawa, 2005. 5. Blackett, 1949; and then Alperovitz, 1995, 1998. 6. Stimson, 1947; Fussell, 1981. 7. Miyamoto, 2005. 8. Stimson, 1947; R. P. Newman, 1998; Malloy, 2008. 9. Blackett, 1949. 10. See Walker, 1996; Sherwin, 1975; Alperovitz, 1995; Blackett, 1949; Stimson, 1947; Malloy, 2008. 11. On the Soviet bomb, see Holloway, 1994, and esp. Gordin, 2009. 12. Hedges et al., 1986. 13. US Strategic Bombing Survey Reports, http://www.ibiblio.org/hyperwar/AAF/USSBS/. 14. Gordin, 2015a. 15. Committee for the Compilation, 1981. 16. Hasegawa, 2005. 17. Galison, 2001, 8. 18. US Strategic Bombing Survey, 1946a; 1946b; 1946c; 1947. 19. US Strategic Bombing Survey, 1946c, 39. 20. US Strategic Bombing Survey, 1946c, 41. 21. US Strategic Bombing Survey, 1946c, 41. 22. Hasegawa, 2005. 23. Gentile, 1997. 24. US Strategic Bombing Survey, 1946c, 43. 25. US Strategic Bombing Survey, 1946c, 45. 26. US Strategic Bombing Survey, 1946c, 5. 27. US Strategic Bombing Survey, 1946c, 23.
28. US Strategic Bombing Survey, 1946c, 24. 29. On the early debate, see Willis, 1997; Yavenditti, 1974; Hopkins, 1966; Kaur, 2013; Boller, 1982. Pope Pius XII famously called the atomic bomb “the most terrible weapon that the human mind has ever conceived” in August 1945. 30. Lindee, 1994; 2016. 31. Lindee, 2016. 32. Lindee, 2016. 33. Zwigenberg, 2014, 163–175. 34. Lifton, 1968. 35. Lifton, 1963. 36. Lifton, 1963. 37. Committee for the Compilation, 1981, xvii. 38. Committee for the Compilation, 1981. 39. Lindee, 2016. 40. Kuchinskaya, 2013, 78. 41. Quoted in Lindee, 2016.
6. BATTLEFIELD OF THE BODY
1. See Bourke, 1996. See also Bourke, 1999 and 2015. 2. J. F. Fulton to E. N. Harvey, September 29, 1943. Papers of E. N. Harvey, American Philosophical Society, Philadelphia, Penn. 3. Bynum, 1991. 4. T. Biddle, 2002; W. Mitchell, 1930. 5. See Schultz, 2018. 6. Kennett, 1991. 7. Schultz, 2018, 7, 35. 8. Schultz, 2018, 35–36. 9. Fulton, 1948; Leake, 1960. 10. J. C. Adams to Admiral Smith, USN, September 2, 1942, “Development of the Sands Point Research Project (Guggenheim Properties).” In Box 9, General Records 1940–1946, CMR, OSRD, RG227, NARA. These reports were in the context of hearings chaired by A. N. Richards relating to a proposal by Cornell University physiologist Eugene DuBois for a new aeromedical research laboratory at Sands Point. This proposal was not approved, Richards told DuBois, because “neither the Surgeon General of Navy nor the Chief Air Surgeon has identified a need for the aeromedical laboratory which is not being filled by the service research establishments.” September 4, 1942, Richards to DuBois. In Box 9, General Records 1940–1946, Committee on Medical Research, OSRD, RG227, NARA. See also Mackowski, 2006. 11. Statements by Brigadier General DNW Grant, September 1, 1942, to the Committee on Medical Research, relating to the possible development of the Sands Point Research Project (Guggenheim Properties). In Box 9, General Records 1940–1946, Committee on Medical Research, OSRD, RG227, NARA.
12. Fulton, J. F. January 27, 1948. Subcommittee on Decompression-Sickness. Report filed in Box 46, Folder 689, Papers of John F. Fulton, Yale University Library. 13. Funding and projects listed in Louis B. Flexner, Report, 1942. Box 12, RG 227, NARA. 14. In Flexner, “Report on Flexner’s Visits to New Haven, The Fatigue Laboratory and Clark University, July 19–23, 1942.” In Box 12, RG 227, Committee on Medical Research, NARA. Flexner’s visit to the decompression chamber at Yale, then in construction, included a meeting with an FBI agent who provided recommendations about how to secure the facility, which the agent proposed needed twenty-four-hour security guards. 15. Lamport to Fulton, September 18, 1942, Box 105, Folder 1435, Papers of John F. Fulton, Yale University. 16. Schmidt, 1943, and Koelle, 1995. 17. Schultz, 2018, 90–92. 18. Much of this discussion is drawn from my 2011 paper on Experimental Wounds, Lindee, 2011. 19. Andrus et al., 1948, 232–262, 251. 20. See Dill, 1959; Aub, 1962. 21. Kehrt, 2006. 22. Kuhn, 1962. 23. Stark, 2016; Lederer discusses her ongoing work on Beecher’s ideas in her podcast “The Evolution of ‘Beecher’s Bombshell,’” available at https://www.primr.org/podcasts/may2/. 24. “Soon, I suppose, I shall return to the Harvard Medical School where I left a professorship quite a long time ago. I have profited by the opportunities I have had in ways that cannot be measured, and shall return to Harvard rich in these experiences.” Beecher to Maj. Gen. Morrison C. Stayer, US Group Control Council, US Army (Germany), dated July 17, 1945. Papers of Beecher, Countway Library. 25. Beecher to A. N. Richards, October 16, 1942, in Folder 10, Papers of Walter B. Cannon, Countway Medical Library, Boston, Mass. 26. Beecher to A. N. Richards, October 16, 1942, in Folder 10, Papers of Walter B. Cannon, Countway Medical Library, Boston, Mass. 27. Board for the Study of the Severely Wounded, 1952, 311–312. 28. Beecher, 1946. 29. Beecher, 1955. 30. Walter Reed Army Institute of Research, 1955, 24. 31. Quoted in Prokosch, 1995, 11. 32. Prokosch, 1995, 12. 33. Prokosch, 1995, 13. 34. Prokosch, 1995, 13. 35. Prokosch, 1995, 14–15. 36. Prokosch, 1995, 16. 37. Coates, 1962, 734–737, and Appendix H, 843–853.
38. Coates, 1962, 592.
39. Prokosch, 1995, 39, 41.
40. Stellman et al., 2003.
41. W. J. Scott, 1988.
42. Hannel, 2017.
43. Hannel, 2017.
44. Scarry, 1985, 73.
45. Scarry, 1985, 62.

7. BATTLEFIELD OF THE MIND
1. Bernays, 1947, 113.
2. Tye, 1998; Justman, 1994.
3. Bernays, 1923, 128.
4. Bernays, 1942, 240.
5. Bernays, 1942, 242; Tye, 1998.
6. Dillon and Kaestle, 1981; Kaestle, 1985.
7. G. S. Hall, 1919, 211.
8. Creel, 1941, 340.
9. On Lasswell, see Merelman, 1981.
10. Rohde, 2013; Solovey, 2013; Solovey and Cravens, 2012.
11. Rohde, 2013.
12. Simpson, 1996.
13. For helpful explorations of these contested boundaries after 1945, see Krige, 2006; also Wolfe, 2013 and 2018; Cohen-Cole, 2009.
14. Tye, 1998.
15. Tye, 1998.
16. Merelman, 1981.
17. On related research trajectories in the Soviet Union, see Gerovitch, 2002.
18. Ascher and Hirschfelder-Ascher, 2004.
19. Simpson, 1996.
20. On oceanography, see Hamblin, 2005; on the AEC, see Creager, 2013.
21. Simpson, 1996.
22. Santos, Lindee, and Souza, 2014.
23. Pribilsky, 2009.
24. Cited in Pribilsky, 2009.
25. D. H. Price, 2004, xi; see also D. H. Price, 2008; Nader, 1997.
26. Watson, 1924; Woodworth, 1959; Kreshel, 1990.
27. Kreshel, 1990.
28. Woodworth, 1959.
29. Farber, Harlow, and West, 1957; and see Lemov, 2011.
30. Lemov, 2011.
31. Zweiback, 1998.
32. Zweiback, 1998.
33. Biderman, 1956.
34. Lifton, 1961.
35. Lifton, 1961, 1963.
36. Arendt, 1963.
37. Arendt, 1963; Benhabib, 1996.
38. Nicholson, 2011; Zweiback, 1998.
39. Milgram, 1963.
40. Nicholson, 2011.
41. Jacobsen, 2017a, provides a gripping account of these studies and their aftermath; see also Moreno, 2006, and Albarelli, 2009.
42. Jacobsen, 2017b.
43. Zilboorg, 1938.

8. BLUE MARBLE
1. E. Clark, 1965; Wiener, 2012; Cloud, 2001; and https://www.cia.gov/news-information/featured-story-archive/2015-featured-story-archive/corona-declassified.html.
2. Møller and Mousseau, 2015; Webster et al., 2016; see also K. Brown, 2019.
3. Masco, 2004, 542, note 6.
4. Jacobs, 2010; 2014; Nixon, 2011.
5. Jacobs, 2014.
6. See Pearson, Coates, and Cole, 2010; Kirsch, 2005.
7. Stacy, 2010, 418.
8. K. Brown, 2013.
9. Stacy, 2010.
10. Bruno, 2003, 239; Hacker, 1987, 92.
11. Gordin, 2009.
12. Stacy, 2010.
13. https://en.wikipedia.org/wiki/The_Conqueror_(1956_film).
14. Bruno, 2003.
15. Bruno, 2003.
16. See Bauer et al., 2005.
17. Masco, 2004.
18. Makhijani and Schwartz, 1998.
19. Macfarlane, 2003.
20. Walker, 2009.
21. On islands as laboratories, see DeLoughrey, 2013. On nuclear families (the military families living on the islands), see Hirschberg, 2012.
22. On the process of taking over the islands, see S. Brown, 2013.
23. M. X. Mitchell, 2016.
24. See M. X. Mitchell, 2016; A. L. Brown, 2014.
25. M. X. Mitchell, 2016.
26. E. Clark, 1965, 1.
27. Farish, 2013.
28. Nielsen, Nielsen, and Martin-Nielsen, 2014.
29. Cloud, 2001.
30. Cloud, 2001, 237.
31. Cloud, 2001.
32. Barnes and Farish, 2006.
33. Wiener, 2012; Vanderbilt, 2002. Raven Rock, in Pennsylvania near Camp David in northern Maryland, is a similar "underground Pentagon." Graff, 2017.
34. Wiener, 2012.
35. Spencer, 2014.
36. Spencer, 2014.
37. Spencer, 2014.
38. Spencer, 2014.
39. Spencer, 2014.
40. Spencer, 2014, 166.
41. See Poole, 2008.
42. Lazier, 2011.

9. HIDDEN CURRICULUM
1. Van Keuren, 2001, 208.
2. Pollard to Thomas Murray, July 9, 1954, Papers of Henry DeWolf Smyth, American Philosophical Society, Philadelphia.
3. Hochschild, 1983.
4. Chaney transcription, Papers of Anne Roe, APS.
5. Giroux and Purpel, 1983.
6. Lang, 1971, 77.
7. Cited by Hamblin, 2005, 55.
8. Fred Rogers to Dr. Thompson, June 14, 1943. Papers of Fred T. Rogers, Rice University.
9. Condon, page 26 of draft testimony for scheduled hearing before the Eastern Industrial Personnel Security Board, April 1954, in Condon Papers, APS, File Eastern Industrial Personnel Security Board, Hearing—April 1954, #2.
10. Wang, 1992.
11. Papers of Rosser, Dolph Briscoe Center for the Humanities, University of Texas at Austin.
12. Papers of Rosser, Dolph Briscoe Center for the Humanities, University of Texas at Austin.
13. December 7, 1956, "Memorandum for the Record," signed by Dr. C. A. Bennett, Manager, Operation Research and Synthesis, Hanford Atomic Production Corp, General Electric Company, Richland, Washington; and by Dr. J. W. Tukey, Professor of Mathematical Statistics, Princeton University. In the Papers of John W. Tukey, APS.
14. From a proposal developed by The Jasons, 1960, in Papers of John Wheeler, APS, Pieces of #137 First Draft Report, "Suggestions on Presentation of Results," "An Editorial Suggestion."
15. Aaserud, 1995; Finkbeiner, 2006.
16. Ernst Caspari to Charles W. Edington, Geneticist Biology Branch AEC, December 11, 1963. Papers of Caspari, APS.
17. Edington to Caspari, December 27, 1963, Papers of Caspari, APS.
18. Chang and Leary, 2005.
19. See Johnson, 2006, on the general persecution of gays in the Cold War, and Rossiter, 1995 and 2012, on women scientists in the Cold War.
20. See oral history interview with Gayer, http://outhistory.org/exhibits/show/philadelphia-lgbt-interviews/interviews/richard-gayer.
21. https://casetext.com/case/high-tech-gays-v-disco.
22. https://casetext.com/case/high-tech-gays-v-disco.
23. Interview with author, Ward Goodenough, December 14, 2007, Haverford, Penn.; see also Kirch, 2015.
24. S. Newman, 1967.
25. Papers of Henry DeWolf Smyth, APS.
26. Papers of Arthur Steinberg, APS.
27. Orth, Bailey, and Wolek, 1965, 141.
28. Orth, Bailey, and Wolek, 1965, 5; Hower and Orth, 1963.
29. Orth, Bailey, and Wolek, 1965, 1.
30. Miller, 1967.
31. In discussions at Manpower Meeting, 1959, sponsored by ONR, November 27, 1961, in Box 1, E-@, RG359, Minutes of FCST Meeting, 1959–1973, NARA; see also William Bradley to John Wheeler, January 24, 1958, Papers of Wheeler, Jason (Project 137) #1, Papers of John Archibald Wheeler, APS.
32. See Kaiser, 2004.
33. Larry Wendt, July 1971 Science for the People newsletter, p. 31, "Dear Brothers and Sisters"—copy found in Van Pelt Library, University of Pennsylvania.
34. Larry Wendt, July 1971 Science for the People newsletter, p. 31, "Dear Brothers and Sisters." See also https://scienceforthepeople.org/.
35. Schevitz, 1979, 51–60.
36. Schevitz, 1979, 51–60.
37. Probstein, 1969.
38. Easlea, 1983.
39. Porter, 1982.
40. Millstone, 2012.
41. Kaiser, 2004.
42. West, 1960, 61.
43. C. L. Schwartz, 1995.
44. Cited in Charles Schwartz, 1971.
45. Teller's self-glorifying (and deeply problematic) account of his own role in promoting nuclear weapons is Teller and Shoolery, 2001. See also Bernstein, 1990.
46. In Moore, 2008, 151.
47. Moore, 2008, 151; M. Brown, 1971.
48. Moore, 2008, 153; Charles Schwartz, 1996.
49. Charles Schwartz, 1971.
50. Cassell, Miller, and Rest, 1992.
51. See Cassell, Miller, and Rest, 1992.
52. See Nelson, 1969; Lappé, 1990, 115.
53. Oreskes and Krige, 2014; Rubinson, 2016.
54. In Easlea, 1983, 138.
55. Wang, 2012.
56. Oreskes, 2019.

CONCLUSION
1. Cohn, 1993, 227.
2. Fleck, 1979.
3. Jensen, 2014, 264. On emotion in postwar science, see also Biess and Gross, 2014.
4. N. Crawford, 2000.
5. N. Crawford, 2000.
6. I. W. Russell, 1946, 295.
7. T. Biddle, 2002.
8. See discussion in Chamayou and Lloyd, 2015.
9. Chamayou and Lloyd, 2015.
10. In R. C. Hall, 2014.
11. Chamayou and Lloyd, 2015.
12. J. Clark, 1964.
13. Zworykin, (1934) 1946.
14. All recounted in Chandler, 2014, 36–39.
15. Barkan, 1972, 14.
16. Barkan, 1972, 15.
17. Singer, 2009.
18. Coll, 2014.
19. Chamayou and Lloyd, 2015, Kindle 978.
20. Demchak, 2016.
21. Demchak, 2016.
22. Demchak, 2016.
REFERENCES
Aaserud, Finn. 1995. Sputnik and the "Princeton Three": The National Security Laboratory That Was Not to Be. Historical Studies in the Physical and Biological Sciences 25 (2): 185–239.
Adams, David P. 1991. The Greatest Good to the Greatest Number: Penicillin Rationing on the American Home Front, 1940–1945. New York: Peter Lang.
Ágoston, Gábor. 2005. Guns for the Sultan: Military Power and the Weapons Industry in the Ottoman Empire. Cambridge: Cambridge University Press.
Albarelli, H. P. 2009. A Terrible Mistake: The Murder of Frank Olson, and the CIA's Secret Cold War Experiments. Walterville, Ore.: Trine Day.
Alder, Ken. 1997. Innovation and Amnesia: Engineering Rationality and the Fate of Interchangeable Parts Manufacturing in France. Technology and Culture 38 (2): 273–311.
Alperovitz, Gar. 1995. The Decision to Use the Atomic Bomb and the Architecture of an American Myth. New York: Alfred A. Knopf.
———. 1998. Historians Reassess: Did We Need to Drop the Bomb? In Hiroshima's Shadow: Writings on the Denial of History and the Smithsonian Controversy, edited by Kai Bird and Lawrence Lifschultz. Stony Creek, Conn.: Pamphleteer's Press, 5–21.
Anderson, Benedict R. O'G. 2006. Imagined Communities: Reflections on the Origin and Spread of Nationalism. Revised edition. London: Verso.
Andrus, E. C., D. W. Bronk, G. A. Carden, Jr., C. S. Keefer, J. S. Lockwood, J. T. Wearn, and M. C. Winternitz. 1948. Advances in Military Medicine, Made by American Investigators. Vol. 1. Boston: Little, Brown.
Anonymous. 2008. In Memoriam: Arthur Galston, Plant Biologist, Fought Use of Agent Orange. YaleNews (July 18). Available online at https://news.yale.edu/2008/07/18/memoriam-arthur-galston-plant-biologist-fought-use-agent-orange.
Arendt, Hannah. 1963. Eichmann in Jerusalem: A Report on the Banality of Evil. New York: Viking Press.
Ascher, William, and Barbara Hirschfelder-Ascher. 2004. Linking Lasswell's Political Psychology and the Policy Sciences. Policy Sciences 37 (1): 23–36.
Ashworth, A. E. 1968. The Sociology of Trench Warfare 1914–1918. British Journal of Sociology 19 (4): 407–423.
Aub, J. C. 1962. Eugene Floyd DuBois 1882–1959. Washington, D.C.: National Academy of Sciences.
Badash, Lawrence. 1979. British and American Views of the German Menace in World War I. Notes and Records of the Royal Society of London 34 (1): 91–121.
Barkan, Robert. 1972. Nobody Here but Us Robots. New Republic 166 (18) (April 29): 14–15.
Barnes, Trevor J., and Matthew Farish. 2006. Between Regions: Science, Militarism, and American Geography from World War to Cold War. Annals of the Association of American Geographers 96 (4): 807–826.
Bauer, Susanne, Boris I. Gusev, Ludmila M. Pivina, Kazbek N. Apsalikov, and Bernd Grosche. 2005. Radiation Exposure Due to Local Fallout from Soviet Atmospheric Nuclear Weapons Testing in Kazakhstan: Solid Cancer Mortality in the Semipalatinsk Historical Cohort, 1960–1999. Radiation Research 164 (4): 409–419.
Beecher, H. K. 1946. Pain in Men Wounded in Battle. Annals of Surgery 123 (1): 96–105.
———. 1955. The Powerful Placebo. Journal of the American Medical Association 159 (17): 1602–1606.
Bellinger, Vanya. 2015. Marie von Clausewitz: The Woman behind the Making of "On War." Oxford: Oxford University Press.
Benhabib, Seyla. 1996. Identity, Perspective and Narrative in Hannah Arendt's "Eichmann in Jerusalem." History and Memory 8 (2): 35–59.
Bernays, Edward L. 1923. Crystallizing Public Opinion. New York: Boni and Liveright.
———. 1942. The Marketing of National Policies: A Study of War Propaganda. Journal of Marketing 6 (3): 236–244.
———. 1947. The Engineering of Consent. Annals of the American Academy of Political and Social Science 250: 113–120.
Bernstein, Barton J. 1990. Essay Review—From the A-bomb to Star Wars: Edward Teller's History. Technology and Culture 31 (4): 846–861.
———. 2003. Reconsidering the "Atomic General": Leslie R. Groves. Journal of Military History 67 (3): 883–920.
Biddle, Stephen. 2004. Military Power: Explaining Victory and Defeat in Modern Battle. Princeton, N.J.: Princeton University Press.
Biddle, Tami Davis. 2002. Rhetoric and Reality in Air Warfare: The Evolution of British and American Ideas about Strategic Bombing, 1914–1945. Princeton, N.J.: Princeton University Press.
Biderman, Albert D. 1956. Communist Techniques of Coercive Interrogation. Lackland Air Force Base, San Antonio, Tex.: United States Air Force Office for Social Science Programs, Air Force Personnel and Training Research Center.
Biess, Frank, and Daniel M. Gross, eds. 2014. Science and Emotions after 1945: A Transatlantic Perspective. Chicago: University of Chicago Press.
Bird, Kai, and Martin Sherwin. 2005. American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer. New York: Alfred A. Knopf.
Blackett, P. M. S. 1949. Fear, War, and the Bomb: Military and Political Consequences of Atomic Energy. New York: Whittlesey House.
Board for the Study of the Severely Wounded, North African–Mediterranean Theater of Operations. 1952. The Physiologic Effects of Wounds: Surgery in World War II. Washington, D.C.: Office of the Surgeon General, Department of the Army.
Boller, Paul F. 1982. Hiroshima and the American Left: August 1945. International Social Science Review 57 (1): 13–28.
Bourke, Joanna. 1996. Dismembering the Male: Men's Bodies, Britain and the Great War. Chicago: University of Chicago Press.
———. 1999. An Intimate History of Killing: Face to Face Killing in 20th Century Warfare. London: Granta Books.
———. 2015. Deep Violence: Military Violence, War Play, and the Social Life of Weapons. New York: Counterpoint Press.
Bousquet, Antoine. 2009. The Scientific Way of Warfare: Order and Chaos on the Battlefields of Modernity. New York: Columbia University Press.
Bridger, Sarah. 2015. Scientists at War: The Ethics of Cold War Weapons Research. Cambridge, Mass.: Harvard University Press.
Brown, Andrew. 2012. Keeper of the Nuclear Conscience: The Life and Work of Joseph Rotblat. Oxford: Oxford University Press.
Brown, April L. 2014. No Promised Land: The Shared Legacy of the Castle Bravo Nuclear Test. Arms Control Today 44 (2): 40–44.
Brown, Kate. 2013. Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters. Oxford: Oxford University Press.
———. 2019. Manual for Survival: A Chernobyl Guide to the Future. New York: W. W. Norton.
Brown, Martin, ed. 1971. The Social Responsibility of the Scientist. London: Collier-MacMillan.
Brown, Steve. 2013. Archaeology of Brutal Encounter: Heritage and Bomb Testing on Bikini Atoll, Republic of the Marshall Islands. Archaeology in Oceania 48 (1): 26–39.
Bruno, Laura. 2003. The Bequest of the Nuclear Battlefield: Science, Nature, and the Atom during the First Decade of the Cold War. Historical Studies in the Physical and Biological Sciences 33 (2): 237–260.
Buchanan, Brenda J. 2006. Gunpowder, Explosives, and the State: A Technological History. Aldershot, England: Ashgate.
———. 2008. Charcoal: "The Largest Single Variable in the Performance of Black Powder." Icon 14: 3–29.
———. 2012. Reviewed Work(s): Islamic Gunpowder Empires: Ottomans, Safavids, and Mughals, by Douglas E. Streusand. Technology and Culture 53 (4): 923–925.
———. 2014. Gunpowder Studies at ICOHTEC. Icon 20 (1): 56–73.
Bud, Robert. 1998. Penicillin and the Elizabethans. British Journal for the History of Science 31 (3): 305–333.
Bynum, Caroline Walker. 1991. Material Continuity, Personal Survival, and the Resurrection of the Body: A Scholastic Discussion in Its Medieval and Modern Contexts. In Fragmentation and Redemption: Essays on Gender and the Human Body in Medieval Religion. New York: Zone Books, 239–297.
Carnahan, Burrus M. 1998. Lincoln, Lieber and the Laws of War: The Origins and Limits of the Principle of Military Necessity. American Journal of International Law 92 (2): 213–231.
Carpenter, Ronald H. 1995. History as Rhetoric: Style, Narrative, and Persuasion. Columbia: University of South Carolina Press.
Carson, Rachel. 1962. Silent Spring. Boston: Houghton Mifflin.
Cassell, G., Linda Miller, and Richard Rest. 1992. Biological Warfare: The Role of Scientific Societies. In The Microbiologist and Biological Defense Research: Ethics, Politics, and International Security, edited by Raymond A. Zilinskas. New York: New York Academy of Sciences, 230–238.
Cassidy, Ben. 2003. Machiavelli and the Ideology of the Offensive: Gunpowder Weapons in "The Art of War." Journal of Military History 67 (2): 381–404.
Chamayou, Grégoire, and Janet Lloyd. 2015. A Theory of the Drone. New York: New Press.
Chandler, K. F. 2014. Drone Flight and Failure: The United States' Secret Trials, Experiments and Operations in Unmanning, 1936–1973. (Unpublished doctoral dissertation.) University of California, Berkeley.
Chang, Kenneth, and Warren Leary. 2005. Serge Lang, 78, a Gadfly and Mathematical Theorist, Dies. New York Times (September 25).
Clark, Elmer F. 1965. Camp Century: Evolution of Concept and History of Design Construction and Performance. Technical Report No. 174. Hanover, N.H.: US Army Materiel Command, Cold Regions Research and Engineering Laboratory.
Clark, John W. 1964. Remote Control in Hostile Environments. New Scientist 22 (389): 300–304.
Cloud, John. 2001. Imaging the World in a Barrel: CORONA and the Clandestine Convergence of the Earth Sciences. Social Studies of Science 31 (2): 231–251.
Coates, James Boyd, ed. 1962. Wound Ballistics. Washington, D.C.: Office of the Surgeon General, Department of the Army.
Cohen-Cole, Jamie. 2009. The Creative American: Cold War Salons, Social Science, and the Cure for Modern Society. Isis 100 (2): 219–262.
Cohn, Carol. 1993. Wars, Wimps, and Women: Talking Gender and Thinking War. In Gendering War Talk, edited by Miriam Cooke and Angela Woollacott. Princeton, N.J.: Princeton University Press, 227–246.
Coll, Steve. 2014. The Unblinking Stare: The Drone War in Pakistan. New Yorker (November 17). Available online at http://www.newyorker.com/magazine/2014/11/24/unblinking-stare.
Committee for the Compilation of Materials on Damage Caused by the Atomic Bombs in Hiroshima and Nagasaki. 1981. Hiroshima and Nagasaki: The Physical, Medical, and Social Effects of Atomic Bombings. Translated by Eisei Ishikawa and David L. Swain. New York: Basic Books.
Cowan, Ruth. 1983. More Work for Mother: The Ironies of Household Technology from the Open Hearth to the Microwave. New York: Basic Books.
Crawford, Elisabeth. 1988. Internationalism in Science as a Casualty of the First World War: Relations between German and Allied Scientists as Reflected in Nominations for the Nobel Prizes in Physics and Chemistry. Information (International Social Science Council) 27 (2): 163–201.
———. 1990. The Universe of International Science, 1880–1939. In Solomon's House Revisited: The Organization and Institutionalization of Science, edited by Tore Frängsmyr. Canton, Mass.: Science History Publications, 251–269.
Crawford, Neta C. 2000. The Passion of World Politics: Propositions on Emotion and Emotional Relationships. International Security 24 (4): 116–156.
Creager, Angela. 2013. Life Atomic: A History of Radioisotopes in Science and Medicine. Chicago: University of Chicago Press.
Creel, George. 1941. Propaganda and Morale. American Journal of Sociology 47 (3): 340–351.
Cressy, David. 2011. Saltpetre, State Security and Vexation in Early Modern England. Past & Present 212 (1): 73–111.
———. 2012. Saltpeter: The Mother of Gunpowder. Oxford: Oxford University Press.
Cushman, Gregory T. 2013. Guano and the Opening of the Pacific World: A Global Ecological History. New York: Cambridge University Press.
Daemmrich, Arthur. 2009. Synthesis by Microbes or Chemists? Pharmaceutical Research and Manufacturing in the Antibiotic Era. History and Technology 25 (3): 237–256.
DeLoughrey, Elizabeth M. 2013. The Myth of Isolates: Ecosystem Ecologies in the Nuclear Pacific. Cultural Geographies 20 (2): 167–184.
Demchak, Chris C. 2016. Cybered Ways of Warfare: The Emergent Spectrum of Democratized Predation and the Future of Cyber-Westphalia Interstate Topology. In Cyberspace: Malevolent Actors, Criminal Opportunities, and Strategic Competition, edited by Phil Williams and Dighton Fiddner. Carlisle Barracks, Penn.: United States Army War College Press, 603–640.
Dennis, Michael A. 1994. "Our First Line of Defense": Two University Laboratories in the Postwar American State. Isis 85 (3): 427–455.
———. 2015. Our Monsters, Ourselves: Reimagining the Problem of Knowledge in Cold War America. In Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power, edited by Sheila Jasanoff and Sang-Hyun Kim. Chicago: University of Chicago Press, 56–78.
Diamond, Jared. 1997. Guns, Germs, and Steel: The Fates of Human Societies. New York: W. W. Norton.
Diamond, Marion, and Mervyn Stone. 1981. Nightingale on Quetelet. Journal of the Royal Statistical Society, Series A (General) 144 (1): 66–79.
Dill, D. B. 1959. Eugene F. DuBois, Environmental Physiologist. Science 130 (3391): 1746–1747.
Dillon, David A., and Carl F. Kaestle. 1981. Perspectives: Literacy and Mainstream Culture in American History. Language Arts 58 (2): 207–218.
Doel, Ronald E., Dieter Hoffman, and Nikolai Kremenstov. 2005. National States and International Science: A Comparative History of International Science Congresses in Hitler's Germany, Stalin's Russia, and Cold War United States. Osiris, 2nd series, vol. 20, Politics and Science in Wartime: Comparative International Perspectives on the Kaiser Wilhelm Institute. Chicago: University of Chicago Press, 49–76.
Dower, John W. 1986. War without Mercy: Race and Power in the Pacific War. New York: Pantheon Books.
Dunlap, Thomas R. 1978. Science as a Guide in Regulating Technology: The Case of DDT in the United States. Social Studies of Science 8 (3): 265–285.
Easlea, Brian. 1983. Fathering the Unthinkable: Masculinity, Scientists and the Nuclear Arms Race. London: Pluto Press.
Einstein, Albert, and Sigmund Freud. 1933. Why War? Dijon, France: International Institute of Intellectual Co-operation, League of Nations.
Evans, John X. 1964. Shakespeare's "Villainous Salt-Peter": The Dimensions of an Allusion. Shakespeare Quarterly 15 (4): 451–454.
Fairbanks, Charles H. 1991. The Origins of the Dreadnought Revolution: A Historiographical Essay. International History Review 13 (2): 246–272.
Farber, I. E., Harry F. Harlow, and Louis Jolyon West. 1957. Brainwashing, Conditioning, and DDD (Debility, Dependency, and Dread). Sociometry 20 (4): 271–285.
Farish, Matthew. 2013. The Lab and the Land: Overcoming the Arctic in Cold War Alaska. Isis 104 (1): 1–29.
Faust, Drew Gilpin. 2005. "The Dread Void of Uncertainty": Naming the Dead in the American Civil War. Southern Cultures 11 (2): 7–32.
Feffer, Loren Butler. 1998. Oswald Veblen and the Capitalization of American Mathematics: Raising Money for Research, 1923–1928. Isis 89 (3): 474–497.
Finkbeiner, Ann K. 2006. The JASONs: The Secret History of Science's Postwar Elite. New York: Viking.
Finnegan, Terrence. 2006. Shooting the Front: Allied Aerial Reconnaissance and Photographic Interpretation on the Western Front–World War I. Washington, D.C.: NDIC Press.
Fleck, Ludwig. 1979. Genesis and Development of a Scientific Fact. Edited by Thaddeus J. Trenn and Robert K. Merton. Translated by Frederick Bradley and Thaddeus J. Trenn from the German. Chicago: University of Chicago Press.
Fleming, Alexander. 1929. On the Antibacterial Action of Cultures of a Penicillium, with Special Reference to Their Use in the Isolation of B. Influenza. British Journal of Experimental Pathology 10 (3): 226–236.
Forman, Paul. 1973. Scientific Internationalism and the Weimar Physicists: The Ideology and Its Manipulation in Germany after World War I. Isis 64 (2): 150–180.
Freud, Sigmund. 1957. The Standard Edition of the Complete Psychological Works of Sigmund Freud. Vol. XIV, 1914–1916. Edited and translated from the German by James Strachey, in collaboration with Anna Freud. London: Hogarth Press and the Institute of Psychoanalysis.
Frey, James W. 2009. The Indian Saltpeter Trade, the Military Revolution, and the Rise of Britain as a Global Superpower. The Historian 71 (3): 507–554.
Fulton, John F. 1948. Aviation Medicine in Its Preventive Aspects: An Historical Survey. London: Oxford University Press.
Fussell, Paul. 1981. Hiroshima: A Soldier's View: "Thank God for the Atom Bomb." New Republic 185 (8) (August 22/29): 26–30.
Galison, Peter. 2001. War against the Center. Grey Room, no. 4: 5–33.
———. 2004. Removing Knowledge. Critical Inquiry 31 (1): 229–243.
Galston, Arthur W. 1972. Science and Social Responsibility: A Case Study. Annals of the New York Academy of Sciences 196 (4): 223–235.
Gat, Azar. 1988. Machiavelli and the Decline of the Classical Notion of the Lessons of History in the Study of War. Military Affairs 52 (4): 203–205.
Gentile, Gian P. 2000. Shaping the Past Battlefield, "For the Future": The United States Strategic Bombing Survey's Evaluation of the American Air War against Japan. Journal of Military History 64 (4): 1085–1112.
Gentile, Gian Peri. 1997. Advocacy or Assessment? The United States Strategic Bombing Survey of Germany and Japan. Pacific Historical Review 66 (1): 53–79.
George, Isabel. 2012. The Most Decorated Dog in History: Sergeant Stubby. New York: HarperCollins.
Gerovitch, Slava. 2002. From Newspeak to Cyberspeak: A History of Soviet Cybernetics. Cambridge, Mass.: MIT Press.
Ghamari-Tabrizi, Sharon. 2005. The Worlds of Herman Kahn: The Intuitive Science of Thermonuclear War. Cambridge, Mass.: Harvard University Press.
Giroux, Henry, and David Purpel, eds. 1983. The Hidden Curriculum and Moral Education: Deception or Discovery? Berkeley, Calif.: McCutcheon.
Gommans, Jos. 2002. Mughal Warfare: Indian Frontiers and High Roads to Empire. London: Routledge.
Gordin, Michael. 2009. Red Cloud at Dawn: Truman, Stalin, and the End of the Atomic Monopoly. New York: Farrar, Straus, and Giroux.
———. 2015a. Five Days in August: How World War II Became a Nuclear War. Princeton, N.J.: Princeton University Press.
———. 2015b. Scientific Babel: How Science Was Done Before and After Global English. Chicago: University of Chicago Press.
Graff, Garrett M. 2017. Raven Rock: The Story of the U.S. Government's Secret Plan to Save Itself—While the Rest of Us Die. New York: Simon & Schuster.
Grier, David Alan. 2005. Dr. Veblen at Aberdeen: Mathematics, Military Applications and Mass Production. In Instrumental in War: Science, Research, and Instruments between Knowledge and the World, edited by Steven A. Walton. Boston: Brill, 263–270.
Gross, Rachel S. 2019. Layering for a Cold War: The M-1943 Combat System, Military Testing, and Clothing as Technology. Technology and Culture 60 (2): 378–408.
Grossman, David. 1995. On Killing: The Psychological Cost of Learning to Kill in War and Society. Boston: Little, Brown.
Haber, Ludwig Fritz. 1986. The Poisonous Cloud: Chemical Warfare in the First World War. London: Clarendon Press.
Hacker, Barton C. 1987. The Dragon's Tail: Radiation Safety in the Manhattan Project, 1942–1946. Berkeley: University of California Press.
———. 1994. Military Institutions, Weapons, and Social Change: Toward a New History of Military Technology. Technology and Culture 35 (4): 768–834.
———. 2008. Firearms, Horses, and Slave Soldiers: The Military History of African Slavery. Icon 14: 62–83.
Hall, Bert S. 1997. Weapons and Warfare in Renaissance Europe: Gunpowder, Technology, and Tactics. Baltimore: Johns Hopkins University Press.
Hall, G. Stanley. 1919. Some Relations between the War and Psychology. American Journal of Psychology 30 (2): 211–223.
Hall, R. Cargill. 2014. Reconnaissance Drones: Their First Use in the Cold War. Air Power History 61 (3): 20–27.
Hamblin, Jacob Darwin. 2005. Oceanographers and the Cold War: Disciples of Marine Science. Seattle: University of Washington Press.
Hannel, Eric. 2017. Gulf War Syndrome. In The SAGE Encyclopedia of War: Social Science Perspectives, edited by Paul Joseph. Thousand Oaks, Calif.: SAGE Publications, 760–763.
Haraway, Donna. 1988. Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies 14 (3): 575–599.
Harris, Henry. 1999. Howard Florey and the Development of Penicillin. Notes and Records of the Royal Society of London 53 (2): 243–252.
Harvey, E. Newton. 1948. The Mechanism of Wounding by High Velocity Missiles. Proceedings of the American Philosophical Society 92 (4): 294–304.
Hasegawa, Tsuyoshi. 2005. Racing the Enemy: Stalin, Truman, and the Surrender of Japan. Cambridge, Mass.: Belknap Press of Harvard University Press.
Hedges, John I., John R. Ertel, Paul D. Quay, Pieter M. Grootes, Jeffrey E. Richey, Allan H. Devol, George W. Farwell, Fred W. Schmidt, and Eneas Salati. 1986. Organic Carbon-14 in the Amazon River System. Science 231 (4742): 1129–1131.
Heilbron, J. L. 2000. The Dilemmas of an Upright Man: Max Planck as Spokesman for German Science. With a new afterword. Cambridge, Mass.: Harvard University Press.
Herman, Ellen. 1995. The Romance of American Psychology: Political Culture in the Age of Experts. Berkeley: University of California Press.
Hirschberg, Lauren. 2012. Nuclear Families: (Re)producing 1950s Suburban America in the Marshall Islands. OAH Magazine of History 26 (4): 39–43.
Hobby, Gladys. 1985. Penicillin: Meeting the Challenge. New Haven, Conn.: Yale University Press.
Hochschild, Arlie. 1983. The Managed Heart: The Commercialization of Human Feelings. Berkeley: University of California Press.
Hollinger, David. 1995. Science as a Weapon in Kulturkampfe in the United States during and after World War II. Isis 86 (3): 440–454.
Holloway, David. 1994. Stalin and the Bomb: The Soviet Union and Atomic Energy, 1939–1956. New Haven, Conn.: Yale University Press.
Hopkins, George E. 1966. Bombing and the American Conscience during World War II. The Historian 28 (3): 451–473.
Hower, Ralph Merle, and Charles Orth. 1963. Managers and Scientists: Some Human Problems in Industrial Research Organizations. Boston: Division of Research, Graduate School of Business Administration, Harvard University.
Hughes, Thomas. 2004. Human-Built World: How to Think about Technology and Culture. Chicago: University of Chicago Press.
Immerwahr, Daniel. 2019. How to Hide an Empire: A History of the Greater United States. New York: Farrar, Straus & Giroux.
Inikori, J. E. 1977. The Import of Firearms into West Africa, 1759–1807: A Quantitative Analysis. Journal of African History 18 (3): 339–368.
———. 2002. Africans and the Industrial Revolution in England: A Study in International Trade and Development. Cambridge: Cambridge University Press.
Irwin, Will. 1921. The Next War: An Appeal to Common Sense. New York: E. P. Dutton & Co.
Jacobs, Robert. 2010. The Dragon's Tail: Americans Face the Atomic Age. Amherst: University of Massachusetts Press.
———. 2014. The Radiation That Makes People Invisible: A Global Hibakusha Perspective. Asia-Pacific Journal 12 (31): 1–11.
Jacobsen, Annie. 2017a. Phenomena: The Secret History of the U.S. Government's Investigations into Extrasensory Perception and Psychokinesis. New York: Little, Brown.
———. 2017b. The U.S. Military Believes People Have a Sixth Sense. Time (April 3). Available at http://time.com/4721715/phenomena-annie-jacobsen/.
Japan Broadcasting Corporation, ed. 1981. Unforgettable Fire: Pictures Drawn by Atomic Bomb Survivors. Translated by the World Friendship Center in Hiroshima from the Japanese. New York: Pantheon Books.
Jensen, Uffa. 2014. Across Different Cultures? Emotions in Science during the Early Twentieth Century. In Science and Emotions after 1945: A Transatlantic Perspective, edited by Frank Biess and Daniel M. Gross. Chicago: University of Chicago Press, 263–277.
Johnson, David. 2015. Executed at Dawn: The British Firing Squads of the First World War. Cheltenham, UK: History Press.
Johnson, David K. 2006. The Lavender Scare: The Cold War Persecution of Gays and Lesbians in the Federal Government. Chicago: University of Chicago Press.
Jones, Daniel P. 1980. American Chemists and the Geneva Protocol. Isis 71 (3): 426–440.
Jones, Edgar. 2014. Terror Weapons: The British Experience of Gas and Its Treatment in the First World War. War in History 21 (3): 355–375.
Justman, Stewart. 1994. Freud and His Nephew. Social Research 61 (2): 457–476.
Kaempf, Sebastian. 2009. Double Standards in US Warfare: Exploring the Historical Legacy of Civilian Protection and the Complex Nature of the Moral-Legal Nexus. Review of International Studies 35 (3): 651–674.
Kaestle, Carl F. 1985. The History of Literacy and the History of Readers. Review of Research in Education 12: 11–53.
Kaiser, David. 2004. The Postwar Suburbanization of American Physics. American Quarterly 56 (4): 851–888.
Kaldor, Mary. 1981. The Baroque Arsenal. New York: Hill and Wang.
Karsten, Peter. 1971. The Nature of "Influence": Roosevelt, Mahan and the Concept of Sea Power. American Quarterly 23 (4): 585–600.
Kaur, Raminder. 2013. Atomic Schizophrenia: Indian Reception of the Atom Bomb Attacks in Japan, 1945. Cultural Critique 84: 70–100.
Keefer, Chester S. 1948. Penicillin: A Wartime Achievement. In Advances in Military Medicine, vol. 2, edited by E. C. Andrus. Boston: Little, Brown, 717–722.
Kehrt, Christian. 2006. "Higher, Always Higher": Technology, the Military and Aviation Medicine during the Age of the Two World Wars. Endeavour 30 (4): 138–143.
Kennedy, Paul. 1988. The Influence and Limitations of Sea Power. International History Review 10 (1): 2–17.
Kennett, Lee B. 1991. The First Air War, 1914–1918. New York: Free Press.
Kevles, Daniel J. 1971. "Into Hostile Political Camps": The Reorganization of International Science in World War I. Isis 62 (1): 47–60.
———. 1975. The Debate over Postwar Research Policy, 1942–1945: A Political Interpretation of Science: The Endless Frontier. Social Science Working Paper No. 93. Pasadena, Calif.: California Institute of Technology, Division of the Humanities and Social Sciences. Available online at https://authors.library.caltech.edu/82789/1/sswp93.pdf.
———. 1977. The National Science Foundation and the Debate over Postwar Research Policy, 1942–1945: A Political Interpretation of Science—The Endless Frontier. Isis 68 (1): 5–26.
Kirch, Patrick V. 2015. Ward H. Goodenough, 1919–2013: A Biographical Memoir. Washington D.C.: National Academy of Sciences.
Kirsch, Scott. 2005. Proving Grounds: Project Plowshare and the Unrealized Dream of Nuclear Earthmoving. New Brunswick, N.J.: Rutgers University Press.
Kleinschmidt, Harald. 1999. Using the Gun: Manual Drill and the Proliferation of Portable Firearms. Journal of Military History 63 (3): 601–630.
Koelle, George B. 1995. Carl Frederic Schmidt: July 29, 1893–April 4, 1988. In Bibliographic Memoirs, vol. 68, National Academy of Sciences. Washington, D.C.: National Academies Press.
Kreshel, Peggy J. 1990. John B. Watson at J. Walter Thompson: The Legitimation of "Science" in Advertising. Journal of Advertising 19 (2): 49–59.
Krige, John. 2006. American Hegemony and the Postwar Reconstruction of Science in Europe. Cambridge, Mass.: MIT Press.
Krige, John, and Kai-Henrik Barth, eds. 2006. Global Power Knowledge: Science and Technology in International Affairs. Osiris, 2nd series, vol. 21, Historical Perspectives on Science, Technology, and International Affairs. Chicago: University of Chicago Press.
Kuchinskaya, Olga. 2013. Twice Invisible: Formal Representations of Radiation Danger. Social Studies of Science 43 (1): 78–96.
Kuhn, Thomas. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
LaFeber, Walter. 1962. A Note on the "Mercantilistic Imperialism" of Alfred Thayer Mahan. Mississippi Valley Historical Review 48 (4): 674–685.
Landers, Timothy, Bevin Cohen, Thomas E. Wittum, and Elaine L. Larson. 2012. A Review of Antibiotic Use in Food Animals: Perspective, Policy, and Potential. Public Health Reports (1974–) 127 (1): 4–22.
Lang, Serge. 1971. The DoD, Government and Universities. In The Social Responsibility of the Scientist, edited by Martin Brown. London: Collier-MacMillan, 51–79.
Lappé, Marc. 1990. Ethics in Biological Warfare Research. In Preventing a Biological Arms Race, edited by Susan Wright. Cambridge, Mass.: MIT Press, 78–99.
Lazier, Benjamin. 2011. Earthrise; or, the Globalization of the World Picture. American Historical Review 116 (3): 602–630.
Leake, Chauncey D. 1960. Eloge: John Farquhar Fulton, 1899–1960. Isis 51 (4): 486, 560–562.
Lemov, Rebecca. 2011. Brainwashing's Avatar: The Curious Career of Dr. Ewen Cameron. Grey Room, no. 45: 61–87.
Liebenau, Jonathan. 1987. The British Success with Penicillin. Social Studies of Science 17 (1): 69–86.
Lifton, Robert Jay. 1961. Thought Reform and the Psychology of Totalism: A Study of "Brainwashing" in China. New York: W. W. Norton.
———. 1963. Psychological Effects of the Atomic Bomb in Hiroshima: The Theme of Death. Daedalus 92 (3): 462–497.
———. 1968. Death in Life: Survivors of Hiroshima. New York: Random House.
Lindee, M. Susan. 1994. Suffering Made Real: American Science and the Survivors at Hiroshima. Chicago: University of Chicago Press.
Lindee, Susan. 2011. Experimental Wounds: Science and Violence in Mid-Century America. Journal of Law, Medicine & Ethics 39 (1): 8–20.
———. 2016. Survivors and Scientists: Hiroshima, Fukushima, and the Radiation Effects Research Foundation, 1975–2014. Social Studies of Science 46 (2): 184–209.
Lorge, Peter Allan. 2008. The Asian Military Revolution: From Gunpowder to the Bomb. Cambridge: Cambridge University Press.
Loughran, Tracey. 2012. Shell Shock, Trauma, and the First World War: The Making of a Diagnosis and Its Histories. Journal of the History of Medicine and Allied Sciences 67 (1): 94–119.
Macfarlane, Allison. 2003. Underlying Yucca Mountain: The Interplay of Geology and Policy in Nuclear Waste Disposal. Social Studies of Science 33 (5): 783–807.
Mackowski, Maura Phillips. 2006. Testing the Limits: Aviation Medicine and the Origins of Manned Space Flight. College Station: Texas A&M University Press.
Mahan, A. T. 1890. The Influence of Sea Power upon History, 1660–1783. Boston: Little, Brown. Available via Project Gutenberg at https://www.gutenberg.org/ebooks/13529.
Makhijani, Arjun, and Stephen I. Schwartz. 1998. Victims of the Bomb. In Atomic Audit: The Costs and Consequences of U.S. Nuclear Weapons Since 1940, edited by Stephen I. Schwartz. Washington, D.C.: Brookings Institution Press, 395–431.
Malloy, Sean L. 2008. Atomic Tragedy: Henry L. Stimson and the Decision to Use the Bomb against Japan. Ithaca, N.Y.: Cornell University Press.
Malone, Patrick M. 2000. The Skulking Way of War: Technology and Tactics among the New England Indians. 1st paperback edition. Lanham, Md.: Madison Books.
Marshall, S. L. A. 1947. Men against Fire: The Problem of Battle Command in Future War. Washington, D.C.: Infantry Journal.
Masco, Joseph. 2004. Mutant Ecologies: Radioactive Life in Post–Cold War New Mexico. Cultural Anthropology 19 (4): 517–550.
———. 2008. "Survival Is Your Business": Engineering Ruins and Affect in Nuclear America. Cultural Anthropology 23 (2): 361–398.
Mauskopf, Seymour H. 1988. Gunpowder and the Chemical Revolution. Osiris 4, The Chemical Revolution: Essays in Reinterpretation: 93–118.
———. 1995. Lavoisier and the Improvement of Gunpowder Production. Revue d'histoire des sciences 48 (1/2): 95–121.
———. 1999. "From an Instrument of War to an Instrument of the Laboratory: The Affinities Certainly Do Not Change": Chemists and the Development of Munitions, 1785–1885. Bulletin of the History of Chemistry 24: 1–14.
McNeill, John, and Corinna Unger, eds. 2010. Environmental Histories of the Cold War. New York: Cambridge University Press.
McNeill, William H. 1982. The Pursuit of Power: Technology, Armed Force, and Society since A.D. 1000. Chicago: University of Chicago Press.
———. 1995. Keeping Together in Time: Dance and Drill in Human History. Cambridge, Mass.: Harvard University Press.
Merelman, Richard M. 1981. Harold D. Lasswell's Political World: Weak Tea for Hard Times. British Journal of Political Science 11 (4): 471–497.
Milgram, Stanley. 1963. Behavioral Study of Obedience. Journal of Abnormal and Social Psychology 67: 371–378.
Miller, George A. 1967. Professionals in Bureaucracy: Alienation among Industrial Scientists and Engineers. American Sociological Review 32 (5): 755–768.
Millstone, Erik. 2012. Obituary: Dr Brian Easlea. Bulletin, University of Sussex (December 7). Available online at http://www.sussex.ac.uk/internal/bulletin/staff/2012-13/071212/brianeaslea.
Mindell, David A. 1995. "The Clangor of That Blacksmith's Fray": Technology, War, and Experience aboard the USS Monitor. Technology and Culture 36 (2): 242–270.
———. 2000. War, Technology, and Experience aboard the USS Monitor. Baltimore: Johns Hopkins University Press.
Mitchell, M. X. 2016. Test Cases: Reconfiguring American Law, Technoscience, and Democracy in the Nuclear Pacific. (Unpublished doctoral dissertation.) University of Pennsylvania, Philadelphia.
Mitchell, William. 1930. Skyways: A Book on Modern Aeronautics. London: J. B. Lippincott.
Miyamoto, Yuki. 2005. Rebirth in the Pure Land or God's Sacrificial Lambs? Religious Interpretations of the Atomic Bombings in Hiroshima and Nagasaki. Japanese Journal of Religious Studies 32 (1): 131–159.
Møller, A. P., and T. A. Mousseau. 2015. Biological Indicators of Ionizing Radiation in Nature. In Environmental Indicators, edited by R. H. Armon and O. Hanninen. Netherlands: Springer, 871–881.
Moore, Kelly. 2008. Disrupting Science: Social Movements, American Scientists and the Politics of the Military, 1947–1975. Princeton, N.J.: Princeton University Press.
Moreno, Jonathan D. 2006. Mind Wars: Brain Research and National Defense. New York: Dana Press.
Müller, Simone M. 2016. "Cut Holes and Sink 'em": Chemical Weapons Disposal and Cold War History as a History of Risk. Historical Social Research / Historische Sozialforschung 41 (1 (155, Risk as an Analytical Category: Selected Studies in the Social History of the Twentieth Century)): 263–284.
Nader, Laura. 1997. The Phantom Factor: Impact of the Cold War on Anthropology. In The Cold War and the University: Toward an Intellectual History of the Postwar Years, edited by Noam Chomsky. New York: New Press, 107–146.
Navy Research Section. 1950. A Catalog of OSRD Reports. Washington, D.C.: Library of Congress.
Nayar, Sheila J. 2017. Arms or the Man I: Gunpowder Technology and the Early Modern Romance. Studies in Philology 114 (3): 517–560.
Nelkin, Dorothy. 1972. The University and Military Research: Moral Politics at MIT. Ithaca, N.Y.: Cornell University Press.
Nelson, Bryce. 1969. Salvador Luria Excluded by HEW. Science 166 (3904): 487.
Neushul, Peter. 1993. Science, Government, and the Mass Production of Penicillin. Journal of the History of Medicine and Allied Sciences 48 (4): 371–395.
Newman, Robert P. 1998. Hiroshima and the Trashing of Henry Stimson. New England Quarterly 71 (1): 5–32.
Newman, Stanley. 1967. "Morris Swadesh." Language 43 (4): 948–957.
Nicholson, Ian. 2011. "Shocking" Masculinity: Stanley Milgram, "Obedience to Authority," and the "Crisis of Manhood" in Cold War America. Isis 102 (2): 238–268.
Nickles, David Paull. 2003. Under the Wire: How the Telegraph Changed Diplomacy. Cambridge, Mass.: Harvard University Press.
Nielsen, Kristian H., Henry Nielsen, and Janet Martin-Nielsen. 2014. City under the Ice: The Closed World of Camp Century in Cold War Culture. Science as Culture 23 (4): 443–464.
Nixon, Rob. 2011. Slow Violence and the Environmentalism of the Poor. Cambridge, Mass.: Harvard University Press.
Noyes, W. A., Jr., ed. 1948. Chemistry: A History of the Chemical Components of the National Defense Research Committee, 1940–1946. Science in World War II, Office of Scientific Research & Development. Boston: Little, Brown.
O'Connell, Robert L. 1993. Sacred Vessels: The Cult of the Battleship and the Rise of the U.S. Navy. New York: Oxford University Press.
Office of Technical Services, Bibliographic and Reference Division. 1947. Bibliography and Index of Declassified Reports Having OSRD Numbers. Washington D.C.: Library of Congress.
Oreskes, Naomi. 2019. Why Trust Science? Why the Social Nature of Science Makes It Trustworthy. Edited by Stephen Macedo. Princeton: Princeton University Press.
Oreskes, Naomi, and John Krige, eds. 2014. Science and Technology in the Global Cold War. Cambridge, Mass.: MIT Press.
Orth, Charles D., Joseph C. Bailey, and Francis W. Wolek. 1965. Administering Research and Development: The Behaviour of Scientists and Engineers in Organizations. London: Tavistock Publications.
Owens, Larry. 1994. The Counterproductive Management of Science in the Second World War: Vannevar Bush and the Office of Scientific Research and Development. Business History Review 68 (4): 515–576.
———. 2004. The Cat and the Bullet: A Ballistic Fable. Massachusetts Review 45 (1): 178–190.
Paret, Peter. 2004. From Ideal to Ambiguity: Johannes von Müller, Clausewitz, and the People in Arms. Journal of the History of Ideas 65 (1): 101–111.
———. 2007. Clausewitz and the State: The Man, His Theories, and His Times. Princeton, N.J.: Princeton University Press.
———. 2015. Clausewitz in His Time: Essays in the Cultural and Intellectual History of Thinking about War. New York: Berghahn Books.
Parker, Geoffrey. 1996. The Military Revolution: Military Innovation and the Rise of the West, 1500–1800. 2nd edition. New York: Cambridge University Press.
———. 2007. The Limits to Revolutions in Military Affairs: Maurice of Nassau, the Battle of Nieuwpoort (1600), and the Legacy. Journal of Military History 71 (2): 331–372.
Pearson, Chris, Peter A. Coates, and Tim Cole, eds. 2010. Militarized Landscapes: From Gettysburg to Salisbury Plain. London: Continuum UK.
Perkins, John H. 1978. Reshaping Technology in Wartime: The Effect of Military Goals on Entomological Research and Insect-Control Practices. Technology and Culture 19 (2): 169–186.
Perrin, Noel. 1979. Giving Up the Gun: Japan's Reversion to the Sword, 1543–1879. Boston: D. R. Godine.
Poole, Robert K. 2008. Earthrise: How Man First Saw the Earth. New Haven, Conn.: Yale University Press.
Porter, Roy. 1982. Review: Witch-Hunting, Magic, and the New Philosophy, an Introduction to Debates of the Scientific Revolution 1450–1750, by Brian Easlea. Social History 7 (1): 85–87.
Pribilsky, Jason. 2009. Development and the "Indian Problem" in the Cold War Andes: "Indigenismo," Science, and Modernization in the Making of the Cornell-Peru Project at Vicos. Diplomatic History 33 (3): 405–426.
Price, David H. 2004. Threatening Anthropology: McCarthyism and the FBI's Surveillance of Activist Anthropologists. Durham, N.C.: Duke University Press.
———. 2008. Anthropological Intelligence: The Deployment and Neglect of American Anthropology in the Second World War. Durham, N.C.: Duke University Press.
Price, Richard M. 1997. The Chemical Weapons Taboo. Ithaca, N.Y.: Cornell University Press.
Probstein, Ronald F. 1969. Reconversion and Non-Military Research Opportunities. Astronautics and Aeronautics (October): 50–56.
Prokosch, Eric. 1995. The Technology of Killing: A Military and Political History of Antipersonnel Weapons. London: Zed Books.
Quinn, Susan. 1995. Marie Curie: A Life. New York: Simon & Schuster.
Ralston, David B. 1990. Importing the European Army: The Introduction of European Military Techniques and Institutions in the Extra-European World, 1600–1914. Chicago: University of Chicago Press.
Rasmussen, Nicolas. 2001. Plant Hormones in War and Peace: Science, Industry, and Government in the Development of Herbicides in 1940s America. Isis 92 (2): 291–316.
Rees, Mina. 1982. The Computing Program of the Office of Naval Research, 1946–1953. Annals of the History of Computing 4 (2): 102–120.
Relyea, Harold C. 1994. Silencing Science: National Security Controls and Scientific Communication. Norwood, N.J.: Ablex.
Richards, W. A. 1980. The Import of Firearms into West Africa in the Eighteenth Century. Journal of African History 21 (1): 43–59.
Rilke, Rainer Maria. 1947. Letters of Rainer Maria Rilke. Vol. 2, 1910–1926. Translated by Jane Bannard Greene and M. D. Herter Norton from the German. New York: W. W. Norton.
Roberts, Michael. 1956. The Military Revolution: An Inaugural Lecture Delivered before the Queen's University of Belfast. Belfast: Boyd.
Rohde, Joy. 2009. Gray Matters: Social Scientists, Military Patronage, and Democracy in the Cold War. Journal of American History 96 (June): 99–122.
———. 2013. Armed with Expertise: The Militarization of American Social Research during the Cold War. Ithaca, N.Y.: Cornell University Press.
Rossiter, Margaret W. 1995. Women Scientists in America: Before Affirmative Action, 1940–1972. Baltimore: Johns Hopkins University Press.
———. 2012. Women Scientists in America: More Struggles and Strategies since 1972. Baltimore: Johns Hopkins University Press.
Rowland, D., and L. R. Speight. 2007. Surveying the Spectrum of Human Behaviour in Front Line Combat. Military Operations Research 12 (4): 47–60.
Rubinson, Paul. 2016. Redefining Science: Scientists, the National Security State, and Nuclear Weapons in Cold War America. Amherst: University of Massachusetts Press.
Russell, Edmund. 1999. The Strange Career of DDT: Experts, Federal Capacity, and Environmentalism in World War II. Technology and Culture 40 (4): 770–796.
———. 2001. War and Nature: Fighting Humans and Insects with Chemicals from World War I to Silent Spring. Cambridge: Cambridge University Press.
Russell, Edmund P. 1996. Speaking of Annihilation: Mobilizing for War against Human and Insect Enemies, 1914–1945. Journal of American History 82 (4): 1505–1529.
Russell, I. Willis. 1946. Among the New Words. American Speech 21 (4): 295–300.
Sachse, Carola, and Mark Walker, eds. 2005. Politics and Science in Wartime: Comparative International Perspectives on the Kaiser Wilhelm Institute. Osiris, 2nd series, vol. 20. Chicago: University of Chicago Press.
Santos, Ricardo Ventura, Susan Lindee, and Vanderlei Sebastião de Souza. 2014. Varieties of the Primitive: Human Biological Diversity Studies in Cold War Brazil (1962–1970). American Anthropologist 116 (4): 723–735.
Scarry, Elaine. 1985. The Body in Pain: The Making and Unmaking of the World. Oxford: Oxford University Press.
Schevitz, Jeffrey M. 1979. The Weaponsmakers: Personal and Professional Crisis during the Vietnam War. Cambridge, Mass.: Schenkman.
Schmidt, Carl. 1943. Some Physiological Problems of Aviation. Transactions & Studies of the College of Physicians of Philadelphia 11: 57–64.
Schultz, Timothy P. 2018. The Problems with Pilots: How Physicians, Engineers, and Airpower Enthusiasts Redefined Flight. Baltimore: Johns Hopkins University Press.
Schwartz, Carl Leon. 1995, July 19. Interview by Patrick Catt [tape recording]. Oral History Interviews. Niels Bohr Library & Archives, American Institute of Physics, College Park, Maryland. Available online at http://www.aip.org/history-programs/niels-bohr-library/oral-histories/5913.
Schwartz, Charles. 1971. A Physicist on Professional Organization. In The Social Responsibility of the Scientist, edited by Martin Brown. London: Collier-MacMillan, 17–34.
———. 1996. Political Structuring of the Institutions of Science. In Naked Science: Anthropological Inquiry into Boundaries, Power, and Knowledge, edited by Laura Nader. New York: Routledge, 148–159.
Schwartz, Stephen I., ed. 1998. Atomic Audit: The Costs and Consequences of U.S. Nuclear Weapons Since 1940. Washington, D.C.: Brookings Institution Press.
Scott, James. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven, Conn.: Yale University Press.
Scott, Wilbur J. 1988. Competing Paradigms in the Assessment of Latent Disorders: The Case of Agent Orange. Social Problems 35 (2): 145–161.
Selcer, Perrin. 2008. Standardizing Wounds: Alexis Carrel and the Scientific Management of Life in the First World War. British Journal for the History of Science 41 (1): 73–107.
Shell, Hanna Rose. 2012. Hide and Seek: Camouflage, Photography, and the Media of Reconnaissance. New York: Zone Books.
Shephard, Ben. 2000. A War of Nerves: Soldiers and Psychiatrists, 1914–1994. London: Jonathan Cape.
Sherwin, Martin J. 1975. A World Destroyed: The Atomic Bomb and the Grand Alliance. New York: Alfred A. Knopf.
Siegfried, Tom. 2011. Atomic Anatomy: A Century Ago, Ernest Rutherford Inaugurated the Nuclear Age. Science News 179 (10) (May 7): 30–32.
Silverman, David J. 2016. Thundersticks: Firearms and the Violent Transformation of Native America. Cambridge, Mass.: Harvard University Press.
Sime, Ruth Lewin. 1996. Meitner, Frisch Got to Fission Theory First—The Rest Found It in Nature. Physics Today 49 (7): 92.
———. 2012. The Politics of Forgetting: Otto Hahn and the German Nuclear-Fission Project in World War II. Physics in Perspective 14 (1): 59–94.
Simpson, Christopher. 1996. Science of Coercion: Communication Research and Psychological Warfare, 1945–1960. Oxford: Oxford University Press.
Singer, P. W. 2009. Robots at War: The New Battlefield. The Wilson Quarterly 33 (1): 30–48.
———. 2010. Wired for War: The Robotics Revolution and Conflict in the Twenty-first Century. New York: Penguin Books.
Small, Hugh. 1998. Florence Nightingale: Avenging Angel. New York: St. Martin's Press.
Solovey, Mark. 2013. Shaky Foundations: The Politics-Patronage-Social Science Nexus in Cold War America. New Brunswick, N.J.: Rutgers University Press.
Solovey, Mark, and Hamilton Cravens. 2012. Cold War Social Science: Knowledge Production, Liberal Democracy, and Human Nature. New York: Palgrave Macmillan.
Spencer, Brett. 2014. Rise of the Shadow Libraries: America's Quest to Save Its Information and Culture from Nuclear Destruction during the Cold War. Information & Culture 49 (2): 145–176.
Spiller, Roger. 2006. Military History and Its Fictions. Journal of Military History 70 (4): 1081–1097.
Stacy, Ian. 2010. Roads to Ruin on the Atomic Frontier: Environmental Decision Making at the Hanford Nuclear Reservation, 1942–1952. Environmental History 15 (3): 415–448.
Stanley, Matthew. 2003. "An Expedition to Heal the Wounds of War": The 1919 Eclipse and Eddington as Quaker Adventurer. Isis 94 (1): 57–89.
Stark, Laura. 2016. The Unintended Ethics of Henry K Beecher. The Lancet 387 (10036): 2374–2375.
Stellman, Jeanne Mager, Steven D. Stellman, Richard Christian, Tracy Weber, and Carrie Tomasallo. 2003. The Extent and Patterns of Usage of Agent Orange and Other Herbicides in Vietnam. Nature 422 (17): 681–687.
Stewart, Irvin. 1948. Organizing Scientific Research for War: The Administrative History of the Office of Scientific Research and Development. Boston: Little, Brown.
Stichelbaut, Birger, and Piet Chielens. 2013. The Great War Seen from the Air in Flanders Fields, 1914–1918. Brussels: Mercatorfonds.
Stimson, Henry L. 1947. The Decision to Use the Atomic Bomb. Harper's Magazine (February): 97–107.
Strachan, Hew. 2006. Training, Morale, and Modern War. Journal of Contemporary History 41 (2): 211–227.
Sumida, Jon Tetsuro. 1997. Inventing Grand Strategy and Teaching Command: The Classic Works of Alfred Thayer Mahan Reconsidered. Baltimore: Johns Hopkins University Press.
Swann, John Patrick. 1983. The Search for Synthetic Penicillin during World War II. British Journal for the History of Science 16 (2): 154–190.
Swartz, Louis H. 1998. Michael Polanyi and the Sociology of a Free Society. American Sociologist 29 (1): 59–70.
Szabo, Jason. 2002. Seeing Is Believing? The Form and Substance of French Medical Debates over Lourdes. Bulletin of the History of Medicine 76 (2): 199–230.
Teller, Edward, and Judith Shoolery. 2001. Memoirs: A Twentieth-Century Journey in Science and Politics. Cambridge, Mass.: Perseus Publishing.
Travers, Timothy. 1987. The Killing Ground: The British Army, the Western Front, and the Emergence of Modern Warfare, 1900–1918. London: Allen & Unwin.
Tye, Larry. 1998. The Father of Spin: Edward L. Bernays & the Birth of Public Relations. New York: Crown.
US Strategic Bombing Survey. 1946a. The Effects of Atomic Bombs on Hiroshima and Nagasaki. Washington, D.C.: US Government Printing Office.
US Strategic Bombing Survey. 1946b. Japan's Struggle to End the War. Washington, D.C.: US Government Printing Office.
US Strategic Bombing Survey. 1946c. Summary Report (Pacific War). Washington, D.C.: US Government Printing Office.
US Strategic Bombing Survey. 1947. Index to Records of the United States Strategic Bombing Survey. June. http://www.ibiblio.org/hyperwar/NHC/NewPDFs/USAAF/United%20States%20Strategic%20Bombing%20Survey/USSBS%20Index%20to%20Records.pdf.
Van Keuren, David K. 1992. Science, Progressivism, and Military Preparedness: The Case of the Naval Research Laboratory, 1915–1923. Technology and Culture 33 (4): 710–736.
———. 2001. Cold War Science in Black and White: U.S. Intelligence Gathering and its Scientific Cover at the Naval Research Laboratory, 1948–1962. Social Studies of Science 31 (2): 207–229.
Vanderbilt, Tom. 2002. Survival City: Adventures among the Ruins of Atomic America. New York: Princeton Architectural Press.
Veys, Lucy. 2013. Joseph Rotblat: Moral Dilemmas and the Manhattan Project. Physics in Perspective 15 (4): 451–469.
Walker, J. Samuel. 1996. The Decision to Use the Bomb: A Historiographical Update. In Hiroshima in History and Memory, edited by Michael J. Hogan. Cambridge: Cambridge University Press, 11–37.
———. 2009. The Road to Yucca Mountain: The Development of Radioactive Waste Policy in the United States. Berkeley: University of California Press.
Walter Reed Army Institute of Research. 1955. Battle Casualties in Korea: Studies of the Surgical Research Team. Vol. 1, The Systemic Response to Injury. Washington, D.C.: Walter Reed Army Medical Center.
Wang, Jessica. 1992. Science, Security, and the Cold War: The Case of E. U. Condon. Isis 83 (2): 238–269.
———. 2012. Physics, Emotion, and the Scientific Self: Merle Tuve's Cold War. Historical Studies in the Natural Sciences 42 (5): 341–388.
Watson, John B. 1924. Behaviorism. New York: W. W. Norton.
Webster, S. C., M. E. Byrne, S. L. Lance, C. N. Love, T. G. Hinton, D. Shamovich, and J. C. Beasley. 2016. Where the Wild Things Are: Influence of Radiation on the Distribution of Four Mammalian Species within the Chernobyl Exclusion Zone. Frontiers in Ecology and the Environment 14 (4): 185–190.
Wessely, Simon. 2006. Twentieth-Century Theories on Combat Motivation and Breakdown. Journal of Contemporary History 41 (2): 269–286.
West, S. S. 1960. The Ideology of Academic Scientists. IRE Transactions on Engineering Management EM-7 (2): 54–62.
Westwick, P. J. 2003. The National Labs: Science in an American System, 1947–1974. Cambridge, Mass.: Harvard University Press.
Wiener, Jon. 2012. The Graceland of Cold War Tourism: The Greenbrier Bunker. Dissent 59 (3): 66–69.
Williams, Keith. 2008. Reappraising Florence Nightingale. British Medical Journal 337 (7684): 1461–1463.
Willis, Kirk. 1997. “God and the Atom”: British Churchmen and the Challenge of Nuclear Power 1945–1950. Albion: A Quarterly Journal Concerned with British Studies 29 (3): 422–457.
Winkler, Jonathan Reed. 2015. Telecommunications in World War I. Proceedings of the American Philosophical Society 159 (2): 162–168.
Winter, Jay. 2006. Remembering War: The Great War between Memory and History in the 20th Century. New Haven, Conn.: Yale University Press.
Wolfe, Audra. 2013. Competing with the Soviets: Science, Technology, and the State in Cold War America. Baltimore: Johns Hopkins University Press.
———. 2018. Freedom’s Laboratory: The Cold War Struggle for the Soul of Science. Baltimore: Johns Hopkins University Press.
Woodworth, Robert S. 1959. John Broadus Watson: 1878–1958. American Journal of Psychology 72 (2): 301–310.
Wright, Pearce. 2004. Obituary: Norman George Heatley. The Lancet 363 (February 7): 495.
Yavenditti, Michael. 1974. The American People and the Use of Atomic Bombs on Japan: The 1940s. The Historian 36 (2): 224–247.
Zachary, G. Pascal. 1997. Endless Frontier: Vannevar Bush, Engineer of the American Century. New York: Free Press.
Zilboorg, Gregory. 1938. Propaganda from Within. Annals of the American Academy of Political and Social Science 198: 116–123.
Zweiback, Adam J. 1998. The 21 “Turncoat GIs”: Nonrepatriations and the Political Culture of the Korean War. The Historian 60 (2): 345–362.
Zwigenberg, Ran. 2014. Hiroshima: The Origins of Global Memory Culture. Cambridge: Cambridge University Press.
Zworykin, Vladimir K. (1934) 1946. Flying Torpedo with an Electric Eye. In Television, vol. 4, edited by Arthur F. Van Dyck, Edward T. Dickey, and George M. K. Baker. Princeton, N.J.: RCA Review, 293–302.
ACKNOWLEDGMENTS
As every scholar knows, there are many debts in a project like this. I will start with the hundreds of Penn undergraduate students in a course, Science, Technology and War, that I have been teaching for many years. These students asked me provocative and challenging questions and forced me to think critically about knowledge and violence. In my struggle to help them understand some fairly confusing questions, I learned a great deal. They have been willing subjects in my various efforts to make sense of these materials, tolerant of my sometimes-meandering lectures, and interested and engaged with the big issues. Teaching is a remarkable privilege. It is an honorable and powerful way to make one’s way through life. I conceived of this book in terms of my students.

I am also deeply indebted to my wonderful colleagues at Penn, who sustain a community of scholarly engagement that consistently enlightens me. Robert Aronowitz, David Barnes, Etienne Benson, Stephanie Dick, Sebastian Gil-Riano, Harun Küçük, Beth Linker, Ramah McKay, Jonathan Moreno, Projit Mukharji, and Heidi Voskuhl made me laugh, helped me figure things out, shared lunches and dinners (and faculty meetings . . .), and generally supported my work. I feel fortunate that Penn’s outstanding Department of the History and Sociology of Science is my professional home.
My PhD students—many of whom are cited in this book—have similarly been crucial supporters of my work. John Terino read and commented critically on the manuscript; Mary X. Mitchell helped me improve my most important arguments. Kate Dorsch transformed an incoherent list of sources into a beautiful list of references. I have learned a lot working with the amazing scholars who found their way to our program and chose to work with me, including Joanna Radin, Mary X. Mitchell, Rosanna Dent, Kate Dorsch, Britt Shields, Sumiko Hatakeyama, Katya Babintseva, Joy Rohde, Perrin Selcer, Austin Cooper, Andy Hogan, Samantha Muka, Jason Oakes, Kristoffer Whitney, Jessica Martucci, Roger Turner, Andi Johnson, Paul Burnett, and Chloe Silverman. Audra Wolfe, who was my student long ago, provided a helpful critical reading of the manuscript.

I am deeply indebted to the many scholars whose conversations, insights, and ideas informed this study, including Jessica Wang, John Krige, Kelly Moore, Sarah Bridger, Warwick Anderson, Angela Creager, Ian Burney, Charles Rosenberg, Hans Pols, Morris Low, Paul Forman, John Terino, Vassiliki Betty Smocovitis, Miriam Solomon, Steven Feierman, Marty Sherwin, Gisela Mateos, Edna Suarez, Alison Kraft, Elena Aronova, Michael Gordin, Joanna Radin, and Maria Jesus Santesmases. Santiago Peralto Ramos helped me with images and permissions. My son Grant Skakun, who is a professional editor, read the entire manuscript carefully and noted many points that helped me to clarify the text.

I am grateful for feedback from participants at workshops and conferences where these ideas were explored, including Purdue University; Manchester University; Yale University; the Dark Matters conference in Barcelona in 2013; University of Massachusetts; Hiroshima University; Universidad Nacional Autonoma de Mexico (UNAM) in Mexico City; the Graduate University for Advanced Studies (Sokendai), Hayama, Japan; Kobe University; Nanyang Technological University in Singapore; the Japan Society for Science, Technology, and Society; Johns Hopkins University; and the University of Sydney.

Personal debts are profound to Miriam Solomon and Annette Lareau; to my beloved sisters Marguerite and Lauren Lindee; to my brothers Michael, Herbert, and Charles Lindee; and to dear friends George Gerton, Karen-Sue Taussig, Maria Sanchez Smith, Betty Smocovitis, Scott Gilbert, Jean-Marie Kneeley, Amira Solomon, Eve Trout Powell, Jessica Getson, Pat Pellerin, Sally Seiler, and Imogen Warren. Friends and family laughed at my jokes, took me out birding or hiking, and tolerated long conversations about violence. I am grateful to my sons Grant Skakun and Travis Skakun. Grant helped me with editing; Travis came over to make wonderful stir-fries and spaghetti sauces.

The editorial and production staff at Harvard University Press played a supportive and constructive role throughout this process. Janice Audet was an engaged
editor, working with me with kindness and patience throughout the difficult labor of finishing a book. Janice read every page and provided a running commentary that made a huge difference in the final manuscript. Four anonymous reviewers encouraged and corrected me. Emeralde Jensen-Roberts was incredibly helpful and well-informed as I tried to navigate permissions for images.

There is one other person to whom I must express my gratitude. Late in this project I happened upon my beat-up old paperback copy of Dorothy Nelkin’s 1972 study, The University and Military Research: Moral Politics at MIT. The book was, to be honest, misfiled in my basement, with some old travel books. I wasn’t looking for it. But I picked it up and started reading, and I was immediately enthralled. This was the third book in a series on science and technology as they intersected with public needs, controversies, and constraints. It was Nelkin’s fifth book—the first two on migrant laborers, and then Nuclear Power and Its Critics (on the Cayuga Lake controversy) and The Politics of Housing Innovation. Reading her book on MIT, just as I was wrapping up this project, brought home to me her profound and enduring influence on the ways I think. Dot, the name we called her, was my adviser, mentor, co-author, and friend. I met her long after this 1972 book was published, and we never discussed it. Yet in this book, she explored many of the themes that animated my own interests and goals in Rational Fog. I was in some ways following her, without entirely realizing it. Dot died of cancer in the spring of 2003. The empty space she left in our community will never be filled. I dedicate this book to her.
INDEX
Note: Figures are indexed in italic.

Aberdeen Proving Ground, 1–2
Adams, Scott, 197
Agent Orange, 91, 151–154
airplanes, studies on bodily impact of, 134–135
air power: emotions produced by, 228; and speculation on science and future wars, 83–84; and United States Strategic Bombing Survey, 116, 128; in WWI, 83–84, 134–135
Alamogordo bombing range, 184–185
Alaska, nuclear testing in, 190–191
Alder, Ken, 22, 49, 51
altitude sickness, 136–137
American expansion, 56–57
American Society for Microbiology, 219–220
Anderson, Benedict, 61
antibiotics, 95–99, 106, 107–108
antiepistemology, 16
antipersonnel mines, 148–150
Applied Fisheries Laboratory, 182
Arctic, nuclear testing in, 190–194
Arendt, Hannah, 172, 202
Armed Forces Ordnance Museum, 1–2
arms control, in Japan, 33–34
Arnold, Henry “Hap,” 135
artillery, standardization of, 46–50
artillery teams, as sociotechnical system, 5–6
Association of Pasadena Scientists, 221
atomic bomb: development of, 102–106; long-term consequences of, 108–109; Pius XII on, 244n29; and scientists’ critique of militarized science, 217. See also Hiroshima and Nagasaki; nuclear weapons
Atomic Bomb Casualty Commission, 123–124, 127. See also Radiation Effects Research Foundation (RERF)
Atomic Bombings of Hiroshima and Nagasaki, The (MED report), 114, 120–123, 126, 128
Atomic Energy Commission, 185–186
Atwater, William, 1
aviation medicine, 135–140
B-17 Flying Fortress, 135–136
Babbage, Charles, 52–54
Bailey, Joseph C., 214
Bainbridge, Kenneth, 184–185
ballistics experiments, 4–5, 144–150
Barkan, Robert, 231, 232
Barnard, George, 209–210
battlefield care of soldiers, 46–47, 51–55
Battle of Gettysburg, 40
Becquerel, Henri, 102–103
Beecher, Henry, 140–143, 246n24
behaviorism, 170
Benedict, Ruth, 126
Bernays, Edward, 159–160, 163–164
Beyer, James, 148
Biddle, Tami Davis, 83
Bikini, nuclear testing at, 179, 189–190
Biological Weapons Convention (1972), 62
Blackett, P. M. S., 112
Blanc, Honore, 50–51
Blandy, Henry Purnell “Spike,” 188
Blue Marble, 181, 199–202
Board for the Study of the Severely Wounded, 142–143
body: chemical herbicides’ impact on, 151–154; flight’s impact on, 134–136; and friendly fire, 150–151; and Gulf War illness, 154–155; injured, and understanding social life, 133–134; injured, as byproduct of war, 157–158; injured, as evidence, 132, 158; and production of experimental injury, 156–157; studies on flight’s impact on, 134–140; studies on grievously injured, 140–144; as target, 132; and wound ballistics, 4–5, 144–150
Bousquet, Antoine, 44–45
Bradley, David, 189–190
brainwashing, 161, 170–174
Buchanan, Brenda, 34
bunkers, 196–199
Bush, Vannevar, 89–90, 91–94
Bynum, Carolyn, 133
Byrnes, James, 112
Camp Century, 191–192, 193
cannon, 49
Cantril, Hadley, 166
capitalism, 166–167
care of soldiers, 46–47, 51–55
Carson, Rachel, 108
Caspari, Ernst, 210–211
cats, ballistics experiments on, 4–5, 146, 147
Central Intelligence Agency (CIA), 165–166, 174–175
Chain, Ernst, 96–97
Chamayou, Gregoire, 232, 233
Chandler, Kathryn, 231
Chaney, Ralph Works, 205–206
chemical weapons: long-term consequences of, 74–75; and militarization of civilian sciences, 91; used in Vietnam, 151–154; used in WWI, 5, 63, 68–74; used in WWII, 99–102, 106–107
Chemical Weapons Convention (1993), 62
China: brainwashing in, 170–171; gunpowder warfare in, 23–24
chlorine gas, 71–72, 73
cigarettes, propaganda concerning, 163–164
civilian knowledge systems, 90–91
civilians: as consumers of weapons, 6, 114; in discussions on technology in warfare, 45–46, 62
Civil War, 40
Clark, Elmer F., 190
Clark, John W., 230
Clausewitz, Carl von, 14–15, 228
Claymore mines, 148–150
clothing, 36
Cloud, John, 194–195
Cocoanut Grove Fire, 99
code word community, 194–195
Coe, George, 146–148
Cohn, Carol, 224–225
Cold War: and cold-region nuclear testing, 190–194; and desert nuclear testing, 184–188; and images of earth taken from space, 199–202; and impact of Hanford plutonium site, 181–184; and island nuclear testing, 188–190; and knowledge built into geographic expansion, 178–179; and nuclear weapons, 179–181; and satellite projects, 194–196; scientists’ perception of, 221–223; and tensions facing scientists, 203–204; and underground bunker construction, 196–199; and wartime scientific secrecy, 205–210
collateral damage, 11
collateral data, 11–12
Command of the Air, The (Douhet), 84
Committee for the Protection of Cultural and Scientific Resources, 197
Committee on Aviation Medicine, 137
Communism, 10, 170–172
Conant, James, 10–11, 89, 223
Condon, E. U., 208
Conqueror, The, 184
consumers, noncombatants as, 6, 114
contracts, 93–94
Coordinated Investigation of Micronesian Anthropology (CIMA), 212
Cornell-Peru Project, 167–169
corning, 27
CORONA satellite reconnaissance program, 194–196
Cousins, Norman, 189
Crawford, Neta, 227, 228
Creel, George, 161
Cressy, David, 26
Crimean War (1853–1856), 51–54
Cullings, Harry, 130–131
Curie, Marie, 103
Curie, Pierre, 103
cyberwar, 234–235
danger, premonitions of, 175
DDT (dichloro-diphenyl trichloroethane), 95, 99–102, 106–107, 108
decompression sickness, 136–137, 138, 246n14
deep tank fermentation, 98
Defense Advanced Research Projects Agency (DARPA), 176
defense spending: scientists’ critique of, 220–221; scientists’ understanding of, 210–211; of United States, 7–10
deHaven, Hugh, 138–140
Demchak, Chris, 234–235
Demerec, Milislav, 98–99
democracy, 10–11
Dennis, Michael, 9
Department of Defense (DOD), 8
DeVictor, Maude, 153
Diamond, Jared, 41
Differential Analyzer, 92
dioxin, 151, 152–154
diseases, insect-borne, 100–102, 108
dogs, used in WWI, 72, 73
d’Olier, Franklin, 116
Douhet, Giulio, 84, 134
Dreadnought, 57, 60–61
dreadnoughts, 60–61
drills, systematic, 22, 29–30, 41–42
drones, 229–233
drugs, and mind control, 174–175
DuBois, Eugene, 138, 245n10
Dulles, John Foster, 174
dung, and saltpeter acquisition, 26, 27
Earthrise, 199–201
Easlea, Brian, 216–217
Eddington, Arthur S., 81
Edington, Charles W., 211
education: military, 48; and psychological control, 160
Effects of Atomic Bombs on Hiroshima and Nagasaki, 116
Eichmann, Adolph, 172, 173–174
Einstein, Albert, 81, 84–85, 103, 104
emotional trauma, 75–78, 126–128
emotions: and science, 224–227; war as domain of, 228–233
epistemology, 16
excrement, and saltpeter acquisition, 26, 27
factory workers, killing, 46
Farrell, Thomas F., 120–121
fascism, 10
Faust, Drew, 47
fear, and wartime mind control, 176
feelings: and science, 224–227; war as domain of, 228–233
feminism, 13
Fenn, Wallace, 137
Fermi, Enrico, 103
feudalism, 27
Fleck, Ludwik, 226–227
Fleming, Alexander, 94–96
Flexner, Louis B., 137, 246n14
flight, studies on bodily impact of, 134–140
flintlocks, 28, 31
Florey, Howard, 96, 97
Ford, Henry, 43
forest warfare, 30–31, 42, 151–154
Forman, Paul, 6–7
Franck, James, 71
Freud, Sigmund, 84–85
friendly fire, 150–151
Frisch, Otto, 103
Fulton, John, 133, 136, 137
Galbraith, John Kenneth, 119
Galison, Peter, 16, 116
Galston, Arthur, 91
gas masks, 72
Gayer, Richard, 211–212
Geiger, Hans, 71
gender: and brainwashing, 173–174; compared to violence in scientific community, 222; and emotional trauma, 76; and emotions in scientific warfare, 225
Genesis and Development of a Scientific Fact, The (Fleck), 226
genetic effects of radiation, 123–125, 128
Gentile, Gian Peri, 118–120
German scientists, 80–82, 103–104
Gettysburg, Battle of, 40
global warming, 192–193
Goodenough, Ward, 212
Gordin, Michael, 113
Grant, Eric, 130–131
Greenbrier Resort, 196–197, 199
Greenland, 191–192, 193
Grew, Joseph, 112
Gribeauval, Jean Baptiste de, 49–50
Gross, Rachel S., 36
Groves, Leslie, 104–105, 120–121, 182
Grow, Malcolm, 86, 243n1
Gulf War illness, 154–155
gunpowder: challenges concerning, 22; components and manufacturing of, 25–27; history of, 23, 24
gunpowder empires, 34–36
guns: adoption of European-style, 37; challenges concerning, 21–22; and components and manufacturing of gunpowder, 25–27; European reception of, 24; forms of, 28–29; French program to build better musket, 50; historical impact of, 21, 27–28; in India, 35; invention of, 23–24; in Japan, 32–34; and mock firing, 37–41; Native Americans’ use of, 30–32, 42; in Ottoman Empire, 34–35; and promotion of European social and political order, 35–36; reception of, 24–25; in Russia, 35; in Safavid Empire, 35; and slave trade, 36–37; sociotechnical systems surrounding, 22–23; systematic drills with, 22, 29–30; and technological choice, 40–42
Guynemer, Georges, 135
Haber, Fritz, 64, 70–71, 81, 82, 242n11
Haber, Ludwig, 242n11
Hahn, Otto, 71
Hale, George Ellery, 82
Hall, G. Stanley, 161
Hampton Roads, Battle of (1862), 57–60
Hanford plutonium site, 181–184
Haraway, Donna, 12–13
Harvey, E. Newton, 146
Heatley, Norman, 97
herbicides, 91, 151–154
Herget, Carl M., 146
Herz, Gustav, 71
hidden curriculum, 206
Hideyoshi, Lord, 33
High Tech Gays, 212
Hiroshima and Nagasaki: biomedical lessons drawn from, 123–125, 128; forensic explorations of damage caused by bombing of, 114–115, 126; impact of bombing of, 114, 129; and Manhattan Engineer District report, 120–123, 128; and procedures associated with atomic bomb, 111; Radiation Effects Research Foundation’s study of, 130–131; reasons for bombing of, 111–114; as site of scientific research, 130; trauma caused by bombing of, 126–128; understanding nuclear bombing of, 110–111; and United States Strategic Bombing Survey, 116–120, 128
Hollinger, David, 10
Holmberg, Alan, 168–169
Holocaust, 172
homosexuals, security clearance revoked from, 211–212
Humphries, Margaret, 155
Hunter, Edward, 170
Iijima, Soichi, 129
Immerwahr, Daniel, 55
Impact of the A Bomb: Hiroshima and Nagasaki 1945–1985, 129
India, guns in, 35
indigenous people, 30–32, 42, 167–168
industrialization: Blanc and, 50–51; discussions concerning, 45–46, 61–62; Gribeauval and, 49–50; and identification of soldiers, 47–48; impact of, 43–45; Mahan and, 55–57; Nightingale and, 51–55; scholarship on, 44–45; and sea-going machines, 57–61
insect-borne diseases, 100–102, 108
interchangeability, 46–50. See also standardization
internationalistic science, 78–83
International Research Council, 82–83
internet, and cyberwar, 234–235
Iron Mountain, 198
Irwin, Will, 83
Jacobs, Bo, 180
Japan, guns in, 32–34
Jensen, Uffa, 226
Jewett, Frank, 88
Joint Commission, 123
Kahn, Herman, 14
Kaldor, Mary, 16
Keefer, Chester, 99
Keeler, William F., 58, 59
Kennedy, John L., 168
Kevles, Daniel, 82, 89
Kleinschmidt, Harald, 33
Knaust, Herman, 198
knowledge: political structures supporting, 5; production of violence and, 16–17; reasons for knowing, 4–5; technologies as evidence of past, 3–4
Korean War, 143–144, 146–150, 171
Kuhn, Thomas, 140, 226–227
Kurtis, Bill, 153
Lamport, Harold, 137
Lang, Serge, 207, 211
Lasswell, Harold, 161, 164–166, 168–169
Lazier, Benjamin, 202
Lederer, Susan, 141
lewisite, 73–74
Lifton, Robert Jay, 127–129, 171–172
literacy, 160
Lorge, Peter, 23
Lucky Strike cigarettes, 163–164
Luria, Salvador, 220
Luther, Martin, 24
Machiavelli, Niccolò, 24
Mahan, Alfred Thayer, 55–57
Makhijani, Arjun, 187
malaria, 101
Manhattan Engineer District (MED) report, 114, 120–123, 126, 128
Manifesto of the Ninety-Three Intellectuals (1914), 80–81
Marshall, George, 228
Marshall, S. L. A., 37–38, 39, 40
Marshall Islands, nuclear testing in, 188–190
Masco, Joseph, 108–109, 180, 187
masculinity, white, 173–174
mass production. See industrialization
matchlocks, 28, 31
Maurice of Nassau, Prince of Orange, 29
Mauskopf, Seymour, 240n10
McNeill, William, 64
medicine, military, 46, 51–55
Meitner, Lise, 103
Merton, Robert K., 10
microbiology, 219–220
microfilm, 198
Milgram, Stanley, 172–174
military education, 48
military knowledge systems, 90–91
military medicine, 46, 51–55
Miller, George A., 214
mind: and Bernays’s study of propaganda, 159–160, 163–164; and brainwashing, 161, 170–174; drugs and manipulation of, 174–175; exploitation of, in war, 175–176; and Lasswell’s study of propaganda, 164–166; and management of indigenous people, 167–169; plasticity of identity and, 162; political and social scientific studies on, 162–163; and pro-capitalism propaganda, 166–167; scientific construction of, as battlefield, 161–162, 169–170; and twentieth-century communications, 160–161; and understanding propaganda, 176–177; weaponization of, 12
Mindell, David, 57, 58
mines, 148–150
Mitchell, Billy, 84, 134
Mitchell, Mary X., 189
MKUltra Project, 174–175
mobile armies, challenges facing, 45
mock firing, 37–41
Moller, Anders, 179–180
Monitor, 57–60
Monroe, Marilyn, 229, 230
Mott, F. W., 77
Mousseau, Timothy, 179–180
Mueller, Paul H., 100
Mughals, 35
Muller, H. J., 124
mustard gas, 73
Myers, Charles, 76
Nagasaki. See Hiroshima and Nagasaki
narratives of resentment, 95
National Archives and Records Administration (NARA), 198
National Research Council, 89
Native Americans, 30–32, 42, 167–168
Nayar, Sheila, 27–28
Nazi propaganda, 160
Neel, James V., 124
Nevada Proving Ground, 186–188
Next War, The (Irwin), 83
Nicholson, Ian, 173–174
Nightingale, Florence, 51–55
Niwa, Ohtsura, 125
Nobel Prize, 80
no-man’s land, 68, 69
noncombatants: as consumers of weapons, 6, 114; in discussions on technology in warfare, 45–46, 62
Norden, Carl, 84
normal science, 140
nuclear weapons: consumption and understanding of, 6; developed during WWII, 102–106; and emotions in scientific warfare, 225; long-term consequences of, 108–109; procedures associated with, 111; sciences in development and impact of, 130; US preparedness against, 117–118, 120. See also Cold War; Hiroshima and Nagasaki
Obama, Barack, 188
Ocean Dumping Act (1972), 74
Office of Naval Research, 9
Office of Scientific Research and Development (OSRD), 9, 87, 91, 92–94, 109
Olson, Frank, 175
Operation Plumbbob, 198
Operation Ranch Hand, 151–152
Operation Teapot, 197–198
Oppenheimer, J. Robert, 105, 213
Orth, Charles D., 213, 214
Ottoman Empire, guns in, 34–35
Owens, Larry, 91, 94
Paret, Peter, 14, 240n19 (intro.)
Parker, Geoffrey, 29
Pauling, Linus, 220
penicillin, 95–99, 106, 107–108
Perrin, Noel, 34
Perry, Matthew, 34
Pfizer, 98
phosgene, 73
physics, scientists’ critique of, 217–219
Physics Today, 218–219
Picard, Emile, 82
Pius XII, Pope, 244n29
Planck, Max, 81–82
Polanyi, Michael, 10
political knowledge: and scientific and technological knowledge, 5; technologies as evidence of past, 3–4
Pollard, Ernest, 204–205
Porter, Roy, 217
Powers, Gary, 229
Predator drone, 231–232
Pribilsky, Jason, 167, 169
Price, David H., 169
Probstein, Ronald F., 216
Project Ice Worm, 191–192
Prokosch, Eric, 144, 145, 148–149
propaganda: Bernays and, 159–160, 163–164; and capitalism, 166–167; Lasswell and, 164–166; Zilboorg on, 176–177
psychological warfare, 162
PTSD, 77–78. See also shell shock
pyrethrum powder, 100–101
Radiation Effects Research Foundation (RERF), 123, 125, 130–131
radioactivity, 102–104, 122–125, 128, 130–131, 179–180. See also nuclear weapons
ratio of fire, 38
record keeping and preservation, 47, 197–199
relativity, Einstein’s theory of, 81
remotely piloted aircraft (RPAs), 229, 231
resentment, narratives of, 95
Richards, A. N., 97, 245n10
Rickenbacker, Eddie, 135
Ricketts, Henry, 137
Rilke, Rainer Maria, 84
Rivers, W. H. R., 77
Roe, Anne, 205–206
Roentgen, Wilhelm, 102
Rogers, Fred, 208
Rohde, Joy, 162–163
Roosevelt, Franklin, 92, 104, 106
Rosser, J. Barkley, 209
Rotblat, Joseph, 105–106, 220
Russell, Edmund, 101
Russell, I. Willis, 228
Russia, guns in, 35
Rutherford, Ernest, 103
Safavid Empire, guns in, 35
saltpeter, 26–27
Sands Point Research Project, 245n10
Scarry, Elaine, 157–158
Schevitz, Jeffrey, 215–216
Schmidt, Carl, 137
Schull, William J., 124
Schultz, Timothy P., 135
Schuster, Arthur, 82
Schwartz, Charles, 20, 218–219
Schwartz, Stephen I., 187
science and scientific research: civilian versus military, 90–92; during Cold War, 221–223; and development and impact of nuclear weapons, 130; emotion and, 224–227; future mobilization of, 203; internationalistic, 78–83; militarization of, 6–7, 16, 214–221; mobilization of, during WWI, 89–90; mobilization of, during WWII, 86–89, 106–109; normal science, 140; and power relations, 228; speculation on future wars and, 83–85; US federal defense support of, 8–10; and violence, 222, 235–236
Science for the People (SftP), 214–215, 219, 220, 223
scientific knowledge: attack on, 19; and collateral data, 11–12; democracy and, 10–11; epistemological power of, 12–13; funded by United States, 7–10; political structures supporting, 5; repurposing of, in WWII, 94–95, 106–109
scientific method, 19–20
scientists: alienation of expert, 214–215; critique defense spending, 220–221; critique militarization of science, 215–220; exclusion and harassment of, 211–214; perceptions of, of Cold War, 221–223; tensions facing Cold War-era, 203–204; and wartime secrecy, 204–210
Scientists and Engineers for Social and Political Action (SESPA), 219
Scott, James, 239n4
sea power, 55–61
secrecy, scientists and wartime, 204–210
Selcer, Perrin, 96
Sergeant Stubby, 72, 73
sexual orientation, security clearance revoked based on, 211–212
shadow libraries, 197–199
shell shock, 75–77, 78
Shils, Edward, 207
Silverman, David J., 30, 31, 32
Sime, Ruth, 103
Simpson, Christopher, 163
slave trade, 36–37
Smale, Stephen, 211
Smyth, Henry deWolf, 213
social knowledge: and scientific and technological knowledge, 5; technologies as evidence of past, 3–4; and use of technologies, 5–6
sociotechnical systems, 5–6, 22–23, 196
soldiers: changing meanings of, 46–47; identification and numbering of deceased, 47–48
South Carolina, 60
Soviet Union: and Japanese surrender in WWII, 114, 119; Powers shot down over, 229
Spencer, Brett, 197, 199
“Spidey sense,” 175
Sputnik, 202
Squire, Peter, 175
Stadler, Lewis, 124
standardization, 45, 46–50, 79
standpoint epistemology, 3, 14, 110
Stanley, Matthew, 81
Stark, Laura, 141
statistics, Nightingale’s use of, 52–54
Steinberg, Arthur, 213
Stellman, Jeanne Mager, 151–152
Stellman, Steven D., 151–152
Stimson, Henry L., 111, 112
Subcommittee on “Decompression-Sickness,” 136–137, 138
Surgical Research Team report, 144
Swadesh, Morris, 212–213
Sweeney, Walter, 229
systematic drills, 22, 29–30, 41–42
Szilard, Leo, 104
tanks, seductive qualities of, 1–2
tear gas, 71
technological choice, 40–42, 67–68
technological knowledge: funded by United States, 7–10; political structures supporting, 5; relationship between socially sanctioned violence and, 12
technologies: civilian versus military, 90–92; discussions concerning, 45–46; as evidence of workings of past social and political systems, 3–4; and identification of soldiers, 47–48; impact of, 74–75; seductive qualities of, 1–3, 224; social belief and context in use of, 5–6; speculation on future wars and, 83–85; used in WWI, 63, 66–67. See also industrialization
telegrams, 66–67
Teller, Edward, 217, 219, 251n45
Terino, John, 184
terror bombing, 161, 228–233
terrorism, 16
“total” war, 46
trauma, 75–79, 126–128
traveling armies, challenges facing, 45
Trenchard, Hugh, 84, 134
trench warfare, 65, 67–68
Trinity Test, 182–183, 184–185
troop training, 38–39
Truman, Harry S., 106, 112–113
Tukey, John W., 209–210
Tuve, Merle, 222
Tye, Larry, 163
typhus, 100–102
Ulam, Stanley, 217
underground bunkers, 196–199
United States: as military power and scientific and technological powerhouse, 7–10; possible impact of nuclear bomb on, 117–118, 120
United States Strategic Bombing Survey (USSBS), 111, 113, 114, 116–120, 128
university scientific research, US federal defense support of, 9–10
unknown soldiers, 47–48
US Office of Scientific Research and Development (OSRD), 9, 87, 91, 92–94, 109
Van Keuren, David, 204
Veblen, Oswald, 89
Vicos Project, 167–169
Vietnam War, 151–154, 215, 218–219
violence: drone warfare and, 232; emotional responses to, 75–78; gaining insights into, 235–236; production of knowledge and, 16–17; and science, 222, 235–236; and trauma caused by atomic bomb, 126–128
Virginia, 57–58
von Braun, Wernher, 195
Wang, Jessica, 222
Watson, John Broadus, 170
Wayne, John, 184
Wendt, Larry, 214–215
West, Samuel Stewart, 217–218
white masculinity, 173–174
Why War? (Einstein and Freud), 84–85
Wiener, Norbert, 89, 220
Wilkins, Robert, 137
Wilson, Lewis B., 145
Witch-Hunting, Magic and New Philosophy (Easlea), 217
Wolek, Francis W., 214
Woodruff, Charles, 144
world geography: and cold-region nuclear testing, 190–194; Cold War arms race’s impact on, 178–179; and desert nuclear testing, 184–188; and images of earth taken from space, 199–202; and impact of Hanford plutonium site, 181–184; and island nuclear testing, 188–190; nuclear weapons and militarized, 179–181; and satellite projects, 194–196; and underground bunker construction, 196–199
World War I: air power in, 13–15, 83–84; beginnings of, 64–66; chemical weapons used in, 68–74; emotional trauma in, 75–78; innovations used in, 63–64; and internationalistic science, 80–82; mobilization of science during, 89–90; technologies used in, 66–67; trench systems of, 65; trench warfare and technological choice in, 67–68
World War II: American Society for Microbiology’s involvement in, 219–220; chemical herbicide development in, 151; and civilian versus military knowledge systems, 90–92; development of atomic bomb during, 102–106; mobilization of science during, 86–89; and Office of Scientific Research and Development, 92–94; repurposing of DDT during, 99–102, 106–107; repurposing of penicillin during, 95–99; repurposing of scientific knowledge during, 94–95, 106–109. See also Hiroshima and Nagasaki
wound ballistics, 4–5, 144–150
x-rays, 102–103
York, Herbert, 221
Yucca Flats, 186–187
Yucca Mountain Nuclear Waste Repository, 187–188
Zeidler, Othmar, 99–100
Zilboorg, Gregory, 176–177
Zuckerman, Solly, 145
Zworykin, Vladimir, 231