Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.
—William Butler Yeats, 1919
An Intellectual Weapon
Science took an important leap in public consciousness during World War II, when it transformed from an exploration of nature into a means to win the war for democracy and against the tyranny that had overtaken Germany, Italy, and Japan. Radar and the atomic bomb were both Allied inventions that had major impacts on the war’s outcome, as did sonar, synthetic rubber, the proximity fuse, the mass production of antibiotics, and other key wartime innovations, with many of the efforts led by emigrants from an increasingly antiscience Third Reich.
The war didn’t start out that way, though. In fact, during the 1930s, Adolf Hitler was an early adopter of the latest science and technology, which he used to great political advantage. He forbade smoking around him because German scientists had shown a link between smoking and lung cancer. He based his politics of white supremacy on ideas he appropriated from early research into genetics. He barnstormed twenty-one cities by airplane—the first politician to use an airplane to campaign on that level—in his 1932 race for president against Paul von Hindenburg, an effort the campaign called “Hitler über Deutschland.” The Nazi Party mounted gramophones—at the time a relative novelty—on vehicles, using the public’s attraction to them to broadcast a uniform political message. Hitler lost the presidential election, but won enough support to be named chancellor in 1933. That year, the Third Reich introduced another weapon with which to spread mass Nazi ideology: the Volksempfänger, or “people’s receiver,” which was offered to the public at low cost and with great success. It had no international shortwave bands, only domestic, which the Nazis filled with propaganda and patriotic music. The world’s first regular television broadcast was instituted in Germany beginning in March 1935, with similar goals, and the Third Reich pioneered the use of the classroom filmstrip to inculcate uniform Nazi ideas about politics and racial pseudoscience in students. In short, Hitler placed science and technology in service of politics, leveraging its new power in ways no one had before.
As Hitler’s minister for armaments, Albert Speer, recounted at his trial in Nuremberg after the war,
Hitler’s dictatorship differed in one fundamental point from all its predecessors in history. It was the first dictatorship in the present period of modern technical development, a dictatorship which made complete use of all technical means for the domination of its own country. Through technical devices like the radio and the loudspeaker, eighty million people were deprived of independent thought.
Science and technology were employed as tools to spread authoritarian ideology and whip up extreme partisanship and nationalism. The German suspicion of government-conducted science, and the desire of citizens to have greater control over it, are likely a reaction to this misuse of science for ideological ends, and they motivate European—and particularly German—attitudes toward science to this day.
Germany also made great strides in mechanized warfare and developed key technological advances in submarine and ballistic-missile design. But the intolerance of the Nazi regime, and the elevation of authoritarian ideology and propaganda over knowledge and science, began to backfire. Berlin may have become a scientific and cultural capital in the late 1920s and early 1930s, but the Nazis considered the city’s artistic and scientific cross-pollination degenerate. As they elevated rhetoric and ideology over science and tolerance, Germany’s intellectuals began to either conform to Nazi authoritarianism or flee. Within a decade, German scientific and technological progress ground to a halt as the Third Reich lost many of its most creative minds to the United Kingdom and the United States.
Presiding over the American science war effort was Edwin Hubble’s boss, Vannevar Bush, an engineer and the president of the Carnegie Institution of Washington. During World War I there had been a lack of cooperation between the Europe-friendly science enterprise and the US military, as well as administrative barriers to the military’s adoption of new technologies, and Bush was anxious to avoid repeating those failures—particularly with the vast influx of talent into the United States as a result of the expansion of right-wing totalitarianism across Europe. Albert Einstein was the most famous of these immigrants, but there were many others—some of them gay, many of them Jewish, most of them creative intellectuals from both the sciences and the arts. The entire Frankfurt School decamped and reconstituted itself as part of the University in Exile, created at the New School in New York City as a home for German and Italian intellectuals dismissed from their teaching jobs in Europe. Later, the university also took in many leading French intellectuals at its École libre des hautes études, or Free School for Advanced Studies. Other universities similarly benefited, as did the US economy as a whole. Patents in the fields the émigrés studied increased by 31 percent over prior years, and the result was an innovation contagion. The émigrés’ arrival increased US innovation by attracting a new group of US researchers to their fields, rather than by increasing the productivity of incumbent inventors, according to Stanford economist Petra Moser. US inventors who collaborated with émigré professors began to patent at substantially higher levels in the 1940s and continued to be exceptionally productive in the 1950s, her study found.
The same was true of Southern California, where many of the giants of European cinema fled from Prague and other cities, breathing innovation and creativity into America’s fledgling storytelling industry, transforming Hollywood into the world’s leading cultural powerhouse. Writers, dramatists, architects, dancers, musicians, and philosophers—the gays, Jews, artists, gypsies, and intellectuals rejected by the Nazi jackbooters—similarly enriched US and UK culture with a flood of new ideas and innovations that created much of the West’s postwar culture.
Vannevar Bush saw this growing influx of talent and believed that science and technology would lead to military superiority for whichever country best exploited them. After the Germans invaded Poland in September 1939, Bush became convinced of the need to establish a federal agency that would coordinate US research efforts. He scheduled a hasty meeting in June 1940 with President Franklin D. Roosevelt, who approved the agency in less than ten minutes.
The National Defense Research Committee (NDRC), the forerunner to today’s National Science Foundation, was established on June 27. The open society, the wartime esprit de corps, the federal dollars, and the marshaling of talented citizens and émigrés organized the American science enterprise into an intellectual weapon unlike any seen before. Under the auspices of this and a related agency, Bush initiated and oversaw the development of the atomic bomb (until it was taken over by the military), as well as the development of radar, sonar, and numerous other inventions critical to the war effort, in addition to several significant medical advances, including the mass production of penicillin.
The End of Innocence
One of the four top scientists Bush would appoint to lead the NDRC was Harvard president James B. Conant, who was initially in charge of chemistry and explosives. When the NDRC took on the goal of making an atomic bomb before the Germans could, Conant recruited a former Harvard chemistry major, the charismatic and popular University of California, Berkeley, theoretical physics professor J. Robert Oppenheimer, who was recommended by his friend and fellow Berkeley physicist Ernest Lawrence. It was to be physics’ finest hour, and Oppenheimer, the poetic son of German Jewish immigrants, who read the Bhagavad Gita in Sanskrit and studied philosophy under Alfred North Whitehead, threw himself into the problem with abandon, assembling a crack team of the best minds in physics, including some of his own top students and several European immigrants. In September 1942, the project was turned over to the military under the command of engineer and brigadier general Leslie Groves. Groves recognized Oppenheimer’s brilliance and ambition and appointed him scientific director of what was now code-named the Manhattan Engineer District, or, more simply, the Manhattan Project. The work was “without doubt the most concentrated intellectual effort in history,” wrote William Laurence, science reporter for the New York Times. Science was to be America’s greatest defense against tyranny.
But then, in the blink of an eye, everything changed. The project succeeded, and on August 6, 1945, the United States dropped Little Boy, the first of two of its new bombs of light, on the Japanese city of Hiroshima. On August 9, Fat Man fell on Nagasaki. The bombs proved the power of knowledge once and for all, and Oppenheimer, as the director of the project, was the first public spokesman for the awesome power of science in a new era.
After the euphoria of winning the war had ebbed, the idea that the United States had used science to kill an estimated 110,000 Japanese civilians without any warning—with another 230,000 dying from radiation injuries over the next five years (a side effect the United States at first officially denied)—weighed heavily on Oppenheimer’s conscience, and on the American public’s collective conscience as well.
In addition to his moral unease, Oppenheimer, like many other leading scientists, had a mounting strategic concern that the Soviet Union, with its vast uranium deposits, would engage the United States in an arms race.
Up to this point, the Allies had regarded themselves as fighting the good fight—honorable, fair, and true, with one hand tied behind their backs, like Superman. The obliteration of two cities of civilians avoided what would surely have been a bloody invasion against a radicalized nation that was using suicide bombers, but it also exposed the dark side of the power that science could unleash, and the horrific consequences that can arise when ethics lag behind knowledge. Mainstream Americans, who had been largely proscience during the 1920s and 1930s and through World War II, now became deeply ambivalent. Was it right, what America had done? Was it honorable? And could it come back to hurt them?
Science, and with it democracy, was growing up, and with increased power came the dawning of a new age of responsibility. Seven weeks after the Hiroshima and Nagasaki blasts, Laurence, whom the War Department had contracted to be the atomic bomb’s official historian, characterized this visceral feeling:
The Atomic Age began at exactly 5:30 Mountain War Time on the morning of July 16, 1945, on a stretch of semi-desert land about fifty airline miles from Alamogordo, NM, just a few minutes before the dawn of a new day on this earth. . . . And just at that instant there rose from the bowels of the earth a light not of this world, the light of many suns in one.
There was a sense that scientists had unlocked a power whose use crossed an ethical boundary—that this act had soiled science and might even destroy humanity. Oppenheimer, the poet-physicist, who thought of a verse from the Bhagavad Gita upon seeing the first atomic detonation at Trinity test site in New Mexico—“I am become Death, the destroyer of worlds”—spoke of his growing misgivings at the American Philosophical Society in November:
We have made a thing, a most terrible weapon, that has altered abruptly and profoundly the nature of the world. We have made a thing that by all standards of the world we grew up in is an evil thing. And by so doing, by our participation in making it possible to make these things, we have raised again the question of whether science is good for man, of whether it is good to learn about the world, to try to understand it, to try to control it, to help give to the world of men increased insight, increased power.
Albert Einstein, who had played a key role in alerting President Roosevelt to the possibility of making such a bomb, shared Oppenheimer’s misgivings. He sent a telegram to hundreds of prominent Americans in May 1946, asking for $200,000 to fund a national campaign “to let the people know that a new type of thinking is essential if mankind is to survive and move toward higher levels. . . . This appeal is sent to you only after long consideration of the immense crisis we face.” The telegram contained what has become one of the most famous quotes in science:
The unleashed power of the atom has changed everything save our modes of thinking and we thus drift toward unparalleled catastrophe.
Ethical Infants
Today, the idea that everything has changed “save our modes of thinking” might refer not only to the bomb but also to climate change, biodiversity loss and habitat fragmentation, ocean trawling, geoengineering, synthetic biology, genetic modification, mountaintop removal mining, chemical pollution, pollinator collapse, the sixth mass extinction, the deployment of killer robots on the battlefield, and a host of other science-themed challenges we now face. Science and technology have delivered awesome power to governments and industry, but they have not granted us total awareness of the consequences of this power or sustainable mechanisms for its use. We are ever like teenagers being handed the car keys for the first time.
After the war, this feeling—that our scientific ability had outstripped our moral and ethical development as a society, perhaps as a species—was not limited to physicists. The Austrian Jewish biochemist Erwin Chargaff emigrated to the United States to escape the Nazis in 1935. His work would lead to James Watson and Francis Crick’s discovery of the double-helix structure of DNA. Chargaff’s autobiography described his changed feelings about science:
The double horror of two Japanese city names [Hiroshima and Nagasaki] grew for me into another kind of double horror: an estranging awareness of what the United States was capable of, the country that five years before had given me its citizenship; a nauseating terror at the direction the natural sciences were going. Never far from an apocalyptic vision of the world, I saw the end of the essence of mankind—an end brought nearer, or even made possible, by the profession to which I belonged. In my view, all natural sciences were as one; and if one science could no longer plead innocence, none could.
Military leaders shared a similar concern. Omar Bradley, the first chairman of the US Joint Chiefs of Staff and one of the top generals in North Africa and Europe during World War II, gave blunt voice to this cultural angst in a 1948 Armistice Day speech:
Our knowledge of science has clearly outstripped our capacity to control it. We have many men of science, but too few men of God. We have grasped the mystery of the atom and rejected the Sermon on the Mount. Man is stumbling blindly through a spiritual darkness while toying with the precarious secrets of life and death. The world has achieved brilliance without wisdom, power without conscience. Ours is a world of nuclear giants and ethical infants.
The Endless Frontier: From Wonder to Fear
In November of 1944, Roosevelt had asked Vannevar Bush to consider how the wartime science organization might be extended to benefit the country in peacetime—to improve national security, aid research, fight disease, and develop the scientific talent of the nation’s youth. After the war was won, Bush submitted his report to President Harry S. Truman. Science, the Endless Frontier, made the case that the creation of knowledge is boundless in its potential. The report is widely credited with laying the groundwork for the second golden age of Western science, during which governments, rather than wealthy philanthropists, became the principal funders of scientific research in peacetime as they had been in war.
In his report, Bush argued that science was of central importance to freedom, an argument that was powerfully underscored when, in August 1949, the Soviet Union detonated an atomic bomb of its own, as Oppenheimer had feared it would. The sense of impending doom over the power scientists had unleashed by splitting the atom turned into the fear of a clear and present danger: nuclear war.
In less than a year, a bill creating the National Science Foundation (NSF) was signed into law, and science began to undergo a subtle but profound change in its relationship to Western culture. For two centuries, it had been motivated by a sense of wonder on the part of noble idealists and adventurers, wealthy visionaries, civic-minded philanthropists, and scrappy entrepreneurs. But it was now largely driven by government investments that were, in no small part, motivated by the public’s sense of fear.
Atomic Terrorism
This fear would impact the world for the next fifty years. Western troops developed a gallows humor when referring to the nuclear weapons they monitored and strapped to the bellies of airplanes. Canadian troops stationed at Zweibrücken, a NATO air base in West Germany, called the bomb “a bucket full of sunshine.”
Stateside, American defensiveness bordered on hysteria. Americans knew what a nuclear weapon could do—they’d done it. And now the possibility that it could boomerang back on them was very real. They’d sacrificed and died and put off love, children, and careers in order to beat back the authoritarian threat, and now suddenly it was back in a different way, and what was at risk was society’s most precious asset: their children. The baby boom.
The Federal Civil Defense Administration determined that the country that would win a nuclear war was the one best prepared to survive the initial attack. Achieving this required a homeland mobilization on an unprecedented scale, and children needed to know what to do when nuclear war came. The government commissioned a nine-minute film called Duck and Cover that showed Bert the Turtle pulling into his shell to survive a nuclear explosion that burns everything else. The film exhorted millions of schoolchildren to “duck and cover” like Bert by covering the backs of their heads and necks and ducking under their desks if they saw a bright flash. The film didn’t mention that the gamma-ray burst, which carries most of the lethal radiation, arrives with the flash, nor the fact that school desks were hardly sufficient protection against the flying shrapnel of broken glass and building materials. As the film said,
Now, we must be ready for a new danger, the atomic bomb. First, you have to know what happens when an atomic bomb explodes. You will know when it comes. We hope it never comes, but we must get ready. It looks something like this: there is a bright flash! Brighter than the sun! Brighter than anything you have ever seen! If you are not ready and did not know what to do, it could hurt you in different ways. It could knock you down hard, or throw you against a tree or a wall. It is such a big explosion it can smash in buildings and knock signboards over and break windows all over town. But, if you duck and cover, like Bert, you will be much safer. You know how bad sunburn can feel. The atomic bomb flash can burn you worse than a terrible sunburn.
The children were reminded regularly that, because a nuclear attack could happen at any time, they, like soldiers in a combat zone, needed to maintain a high level of alertness, forever vigilant to the possibility of attack without warning, ready to duck and cover. As Hiroshima City University nuclear historian Bo Jacobs put it,
This is the narrative about nuclear war, about the Cold War, and about childhood that millions of American children, the Baby Boomers, received from their government and from their teachers in their schoolrooms: a tale of a dangerous present and a dismal future. Ducking and covering is, after all, a catastrophic pose, one in which the emphasis is on avoiding head injury at the expense of bodily injury: it is the desperate posture of an attempt at bare survival. To duck and cover is to fall to the ground and hope that you live to stand back up. As we watch each setting of childhood succumb to the bright flash of death and destruction in the film, no grown-ups are in sight; it is up to the children to survive the world that their parents have made for them—a world seemingly without a future, where survival is measured day to day, minute to minute.
They did drills in school and participated in citywide mock Soviet atomic-bomb attacks. Many were given metal dog tags so that their bodies could be readily identified by their parents if a nuclear explosion burned them beyond recognition. “While adults perceived a threat to the American way of life—to their health and wellbeing and those of their families—their children learned to fear the loss of a future they could grow into and inhabit. These kids of the Atomic Age wondered if they might be the last children on Earth,” a worry that Jacobs says “had unforeseen and profound effects on the Baby Boomer generation.”
At the same time that the federal government was promoting Duck and Cover, the Office of Civil and Defense Mobilization (OCDM) was broadly distributing public-service pamphlets whose intent was to instill in everyone a sustained alertness to danger, the better to prepare the country to survive the first wave of a nuclear attack. These pamphlets were bundled with vinyl recordings of survival instructions, such as this one from Tops Records:
Our best life insurance may be summed up in four words: be alert; stay alert. This will take some doing on your part. It will take ingenuity; it will take fervor; it will take the desire to survive. . . . We might label our nuclear weapons “instant death.” There is no doubt about it: if you live within a few miles of where one of these bombs strike, you’ll die. Instantly. . . . It may be a slow and lingering death, but it will be equally as final as the death from the bomb blast itself. You’ll die, unless you have shelter. Shelter from the intense heat, and the radiation that is the by-product of a nuclear explosion. . . . Let’s assume bombs fall before you have time to prepare a shelter, or while you wait, in the belief atomic war will never come. We can always hope that man will never use such a weapon, but we should also adopt the Boy Scout slogan: be prepared. . . . It may be safe for you to leave your house after a few hours, or it may be as long as two weeks or more. Two weeks with very little food or water . . . tension . . . unaccustomed closeness. Two weeks with sanitary facilities most likely not operating. No lights. No phone. Just terror.
Fallout shelters were built around the country. New commercial buildings had them. New homes had them, and residents stocked them with water, canned goods, candles, blankets, and tranquilizers. Public buildings had them installed. The possibility of sudden nuclear annihilation at the hands of the Communists became part of everyday life. Many families rehearsed and planned for living for extended periods in dark and rancid basement shelters, and for being separated from one another indefinitely if an attack came when the children were in school.
That the trauma of unending fear and the need for hypervigilance could have psychological or neurological impacts on the developing baby-boom generation didn’t occur to psychologists until the social erosion of the 1960s and 1970s was becoming apparent. Yale psychiatrist Robert Jay Lifton, a founding member of the International Physicians for the Prevention of Nuclear War, which was awarded the Nobel Peace Prize in 1985, conducted a series of interviews in the late 1970s with people who had been children during the Cold War. Writer Michael Carey was his assistant and reported on the project in the January 1981 issue of the Bulletin of the Atomic Scientists:
This generation had America’s only formal and extended bomb threat education in its schools, and that education—along with the lessons about the bomb from government, the media and the family—were well-learned. This generation has a collection of memories, images and words that will not disappear, even for those who profess not to be troubled.
Pulitzer Prize–winning Harvard psychiatrist John Mack was part of a 1977 task force for the American Psychiatric Association that sought to understand the effects of the sustained nuclear threat on children’s psyches:
We may be seeing that growing up in a world dominated by the threat of imminent nuclear destruction is having an impact on the structure of the personality itself. It is difficult, however, to separate the impact of the threat of nuclear war from other factors in contemporary culture, such as the relentless confrontation of adolescents by the mass media with a deluge of social and political problems which their parents’ generation seems helpless to change.
Analyzing the results of studies in Canada, the United States, Sweden, Finland, and the USSR, together with her clinical experiences with families and young patients, Canadian psychiatrist Joanna Santa Barbara found that nuclear-age youth were “profoundly disillusioned,” and that this affected their capacity to plan for the future. She urged adults to help in young people’s efforts to overcome their sense of betrayal.
Betrayal, cynicism, absurdity, and a profound mistrust of authority were overriding themes not only in the books of such popular writers as Ken Kesey and Kurt Vonnegut, but also in results from studies and commentary on the issue at the time. Jacobs writes on the psychological history and effects of nuclear war:
While children were supposedly being trained to physically survive an atomic attack, Duck and Cover also delivered a subtle message about the relationship of children and their world to the world of their parents. “Older people will help us like they always do. But there might not be any grown-ups around when the bomb explodes,” the narrator somberly reminds them. “Then, you’re on your own.” Duck and Cover was designed to teach children that they could survive a surprise nuclear war even in the absence of adult caretakers, conveying a powerfully mixed assurance. The film leaves no doubt that the threat of attack is always imminent and that the key to the survival of these children is their constant mental state of readiness for nuclear war: “No matter where we live, in the city or the country, we must be ready all the time for the atomic bomb. . . . Yes, we must all get ready now so we know how to save ourselves if the atomic bomb ever explodes near us.” But the film also reveals that the world children take for granted, the safe world of their childhood, could dissolve at any moment. And when that debacle happens, the adults will be gone; the youngsters will be on their own.
We now know that, in some people, the constant amygdala stimulation of such hypervigilance in childhood can produce long-term effects in the brain that lead to post-traumatic stress disorder. The classic fight-or-flight reaction to perceived threat is a reflexive nervous-system response that has obvious survival advantages in evolutionary terms, say psychiatrists Jonathan Sherin and Charles Nemeroff, who have published on the neurological changes underlying PTSD. However, the constant stimulation of the systems that organize the constellation of reflexive survival behaviors following exposure to perceived threat can cause chronic dysregulation of these systems. This, in turn, can effectively rewire brain pathways, causing certain individuals to become “psychologically traumatized” and to suffer from post-traumatic stress disorder, which can lead to other mental-health issues, impulsivity, and self-medication with drugs, alcohol, sex, or other forms of addiction.
Later studies showed an increased incidence of mental-health problems among the baby-boomer generation that sharply diverged from prior and subsequent generations, and in many ways fit the PTSD profile. Drug and alcohol use ran at rates far higher than in any other generation, as did rates of divorce. Baby boomers also experienced a sharp increase in the adolescent suicide rate versus previous or subsequent generations, a problem that, like drug use and divorce, continued to plague the generational cohort into its later years. Suicide rates among baby boomers continue to run as much as 30 percent higher than those of other age cohorts, and divorce rates in 2010 among those aged fifty and older ran at twice the rate of the prior generation (measured in 1990), while the overall US divorce rate had fallen 20 percent during the same period.
As the largest age demographic, the baby boomers soon began to rule the cultural conversation across the Western world, and particularly in the United States, with what many commentators called an overweening narcissism. During the sixties, when baby boomers were in their teens, they rebelled and self-medicated with sex, drugs, and rock ’n’ roll. In their twenties, as they were getting jobs and finding their identities, they became the me generation. In the 1980s, as they settled down, had families, took up professions, switched to cocaine, and began gentrifying urban neighborhoods their parents had abandoned, they became yuppies: young urban professionals. In the 1990s, as they began investing for retirement, they created the tech bubble. And in the 2000s, as they realized they were mortal, there was a resurgence of big religion in suburban megachurches, and a sharp increase in libertarian, “no new taxes” politics whose goal was to “shrink government until it’s small enough to drown in the bathtub.” Through it all, the generation has largely maintained its mistrust of—and antipathy toward—both government and science. Over the ensuing decades, the baby boomers felt vindicated in this attitude again and again as government officials from the president on down were discredited, and as environmental science began to expose the damage being wrought by the civic, industrial, and agricultural application of pesticides and other chemicals developed during World War II. The world was a mess, thanks to science.
From Duck and Cover to Run Like Hell
We do not have enough data to say that the threat of imminent nuclear annihilation, and the resulting hypervigilance and learned helplessness, are wholly responsible for any of the social or epidemiological characteristics of boomer culture. But historical records suggest the threat was a significant contributing factor. Whatever the long-range effects, the threat of nuclear war in the 1950s represented terrorism on a new scale. To get a sense of the era’s fear, imagine that ISIS were in charge of a country the size of the Soviet Union, with nuclear weapons trained on the United States. Americans knew what these weapons could do, and knew they could be used again. The only option was to plan for an attack on American soil, which was regarded as inevitable. This knowledge changed American culture and its relationship to science in some surprising ways.
For example, it has long been the prevailing opinion that American suburbs developed as a result of the increased use of the car, GI Bill–funded home construction, and white flight from desegregated schools after the 1954 Supreme Court decision in Brown v. Board of Education of Topeka. But in reality the trend started several years before Brown.
The idea had first been pushed by a utopian short film called To New Horizons. Shown at the 1939 World’s Fair in General Motors’ “Futurama” exhibit, which imagined a city of 1960, the film first introduced the idea of a network of expressways. “On all express city thoroughfares, the rights of way have been so routed as to displace outmoded business sections and undesirable slum areas whenever possible,” the film said. But the notion didn’t really get traction until 1945, when the Bulletin of the Atomic Scientists began advocating for “dispersal,” or “defense through decentralization,” as the only realistic defense against nuclear weapons. Federal civil defense officials realized this was an important strategic move. Most city planners agreed, and the United States adopted a completely new way of life by directing all new construction “away from congested central areas to their outer fringes and suburbs in low-density continuous development,” and “the prevention of the metropolitan core’s further spread by directing new construction into small, widely spaced satellite towns.”
General Motors, a major US defense contractor, heavily supported the idea, as did other auto, tire, glass, concrete, oil, and construction companies that stood to gain. GM’s president, Charles Wilson, became Dwight Eisenhower’s secretary of defense in 1953, and, in his Senate confirmation hearing, made the famous blunder of saying that “what was good for our country was good for General Motors and vice versa.” The fifty largest US corporations accounted for a quarter of the country’s gross national product that year, with GM sales alone exceeding 3 percent. Extending the war footing by redirecting resources toward suburban development was good for business and good for the economy, as well as for national defense.
Nuclear safety measures, supported by business interests, drove the abandonment of US cities. After being told that “there is no doubt about it: if you live within a few miles of where one of these bombs strike, you’ll die” and “We can always hope that man will never use such a weapon but we should also adopt the Boy Scout slogan: be prepared,” moving far away from the “target” city seemed wise. Those who could afford to left. Those who remained were generally less affluent, and minorities made up a disproportionate share of the poor.
A far worse development for American urban minorities came in 1954, when the federal Atomic Energy Commission realized that, with the advent of the vastly more powerful hydrogen bomb, “the present national dispersion policy is inadequate in view of existing thermonuclear weapons effects.” The dispersion strategy was akin to “matching a sleeping tortoise against a racing automobile.” By then, however, it was too late; the suburbs were growing rapidly, but offices were still largely downtown. A new strategy was needed—one that had been laid out by General Motors in To New Horizons. President Eisenhower promoted a program of rapid evacuation to rural regions via expressways. As a civil defense official who served from 1953 to 1957 explained, the focus changed “from ‘Duck and Cover’ to ‘Run Like Hell.’”
Cities across the United States ran nuclear-attack drills, each involving tens of thousands of residents, practicing clearing hundreds of city blocks in the shortest possible time. It became clear that this would require massive new transportation arteries in and out of cities. The resulting National Interstate and Defense Highways Act of 1956 was the largest public-works project in history. It created a system that provided easier access from the suburbs into cities, as well as a way to more rapidly evacuate urban areas in case of nuclear war. The new freeways had to be built in a hurry and were routed through the cheapest real estate, which usually meant plowing through vibrant minority communities, displacing “outmoded business sections and undesirable slum areas whenever possible” and uprooting millions of people. Although poverty had been concentrated in these neighborhoods, so was a rich culture and a finely woven fabric of relationships, and the neighborhoods’ destruction ripped apart the social networks that had supported minority communities for years, leading to a generation of urban refugees.
These defense accommodations—with the encouragement and involvement of what Eisenhower would later regretfully refer to as the “military-industrial complex”—brought about immense changes, altering everything from transportation to land development to race relations to energy use to the extraordinary public sums that are now spent on building and maintaining roads. This created social, economic, psychological, and political challenges that are still with us today—all because of science and the bomb.
The Protection Racket
The fear that was changing the nation kicked up another notch with the Soviet launch of Sputnik 1, the first Earth-orbiting satellite, on October 4, 1957. Its diminutive size—about that of a beach ball—made it perhaps the most influential twenty-three-inch-diameter object in history. Traveling at roughly eighteen thousand miles an hour, the shiny little orb circled the planet about once every hour and a half, emitting radio signals that were picked up and followed by amateur radio buffs the world over—but nowhere more closely than in the United States.
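Those two figures hang together: a rough back-of-the-envelope check, assuming an Earth radius of about 3,960 miles and a mean orbital altitude of roughly 350 miles (an approximation; Sputnik’s elliptical orbit ranged from about 140 to nearly 600 miles up), gives an orbital period of

\[
T \approx \frac{2\pi \,(3{,}960 + 350)\ \text{miles}}{18{,}000\ \text{miles per hour}} \approx 1.5\ \text{hours},
\]

or about one trip around the planet every ninety minutes.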
Sputnik shocked America in ways that even the 1949 Soviet nuclear test had not. For the first time, the Commies were not just catching up—they were ahead. The fear was that North America stood at risk of being overrun by an authoritarian society. The little orb focused this amorphous fear and placed the entire continent in danger, at least psychologically. In the United States, since the 1949 Soviet nuclear test, debates had been swirling about the need to invest more in education, particularly science, technology, engineering, and mathematics (often referred to as STEM), because of their critical importance to national defense. But until Sputnik, these discussions had foundered on the shoals of congressional indifference. Now those debates came into sharp focus. As historian JoAnne Brown put it:
The struggle for federal aid may have been won in the sky, but it was fought in the basements, classrooms and auditoriums, as educators adapted schools to the national security threat of atomic warfare and claimed a proportionate federal reward for their trouble.
Within a year, the National Defense Education Act of 1958 was passed, with the goals of improving education in defense-related subjects at all grade levels and bolstering Americans’ ability to pursue higher education. The NSF’s budget, which had been quite low, jumped dramatically in 1957 and continued to grow. Science would become a major issue on the presidential campaign trail in 1960. If Americans didn’t recommit to science and technology, it was argued, they might lose the Cold War. Their entire way of life, perhaps their very survival, was at stake, and it all hinged on what they could do to protect themselves by reinvesting in science and technology to beat “those damn Russkies.” The American public, whose opinion of science for the twelve years since the bombings of Hiroshima and Nagasaki had been one of great moral ambivalence, began a new relationship with it almost overnight. Scientists might be sons of bitches, but they were American sons of bitches.
When Science Walked Out on Politics
By this time, it was clear that science was the answer to the twin threats of the arms race and Sputnik—and that America was, in fact, in a science race, as Vannevar Bush had essentially argued in Science, the Endless Frontier. Science had become one of the primary weapons in a new kind of war. The nation that invested the most in science and engineering research and development would lead the world—and perhaps find safety.
In the span of two short decades, science had attained sacred-cow status enjoyed by few other federal priorities. Gone were the days of scientists needing to reach out to wealthy benefactors to justify and explain their work in order to get funding. The adoption of science as a national strategic priority changed the relationship between science and the public. Over the course of a single generation, government funding allowed scientists to turn inward, away from the public and toward their lab benches, at the very time that the public had developed a love-hate relationship with science.
This love-hate relationship came with the conflicting emotions of need and resentment. Though their work is by nature antiauthoritarian and somewhat artistic, scientists became figures of authority in white lab coats—bland, dry, value-neutral, and above the fray. This new image of science, implanted in baby boomers by hundreds of classroom filmstrips, couldn’t have been less inspiring—or further from the truth. Scientists are very often passionate and curious, interested in many things. They often are world travelers and lovers of the outdoors and the arts. These are the very qualities that typically motivate their interest in science—the exploration of creation—to begin with. But very little of that characteristic passion and curiosity would be communicated to the general public for the next fifty years. Science came to be regarded as a culture of monks: intellectual, quietly cloistered, sexually and creatively dry.
With tax money pouring in from a vastly expanding economy and the public respect afforded the authority of the white lab coat, two generations of scientists instead had only to impress their own university departments and government agencies to keep research funds coming their way. But they no longer had to impress the public, which was growing increasingly mistrustful of science.
At the same time, science was becoming less accessible even to other scientists. As knowledge mounted and research became increasingly specialized, no one could keep up with all the latest findings. There was simply too much information. With scientists unable to follow each other outside their own fields, reaching out to the public seemed a hopeless exercise. What mattered was not process, but results. University tenure tracks rewarded the scientists who had successful research programs and multiple professional publications that attracted large sustaining grants, which, in turn, attracted and funded the top graduate students. But tenure gave no similar consideration to science communication or public outreach. “Those who can, do,” the attitude of scientists became, “and those who can’t, teach.” It was a horrible mistake.
Locked in a subculture of competitive, smart, and passionate people focused on their own research, scientists forgot that they were responsible to—indeed, a part of—the community of taxpayers that funded much of their work and so deserved a say in what they did. Scientists became notoriously cheap donors of both time and money, and withdrew from civic life in other ways. Giving back and participating in the greater civic dialog just wasn’t part of their culture or value system. As in any cloistered society, attitudes of superiority developed within the science community—attitudes that ran counter to the fundamentally antiauthoritarian nature of scientific inquiry.
Many scientists, for example, came to view politics as something dirty and beneath them. Arguing that they did not want to risk their objectivity, they eschewed voicing opinions on political issues. So while science was entering its most dizzyingly productive and politically relevant period yet, very little of this creativity was being relayed to the public. Only the results were publicized. From the public’s perspective, the science community had largely withdrawn into its ivory tower and gone silent. This proved to be a disaster.
Public Sentiment Is Everything
In democracy, there is a mistaken idea that politics is the lowly part of the business—what we have to put up with in order to enact policy—but, in fact, the opposite is true. This mistake is made especially often by scientists, who view politics as tainted. Abraham Lincoln eloquently illustrated this point when he debated his opponent Stephen Douglas in the 1858 Illinois campaign for the US Senate. Lincoln lost that election, but he forced Douglas to explain his position on slavery in a way that alienated the Southern Democrats. That set Lincoln up to defeat him in the race for president two years later.
“Public sentiment,” Lincoln said, “is everything. With public sentiment, nothing can fail; without it, nothing can succeed. Consequently he who molds public sentiment goes deeper than he who enacts statutes or pronounces decisions. He makes statutes and decisions possible or impossible to be executed.”
Thus politics, which moves the invisible hand of democracy, is more important than policy. It reflects and shapes the will of the people. It is the foundation on which policy is based. Lincoln’s thinking in this regard echoed that of Thomas Jefferson. It was also a view that industry would soon adopt with a vengeance.
Scientists were certainly smart enough to realize this, but the structure put in place under Vannevar Bush’s grand vision worked against it. Who now had to worry about shaping public sentiment? As president of the Carnegie Institution of Washington, Bush was familiar with the time and resources that fund-raising required, and his goal was to lift the onus of obtaining research funding off scientists and universities to propel the nation forward in a more coordinated way. Other nations—both East and West—quickly followed suit. But the need to sell the worth of one’s work to the public and donors, to converse about new discoveries and their meaning, and to inspire and excite laypeople may be the only thing that keeps the public invested and supportive in the long term—support that, in a democracy, is critical to sustained effort. Bush may have done his job too well. The shift to public funding changed the incentive structure in science.
This might not have been a problem if scientists had valued public outreach, but, by and large, they didn’t. As economists are quick to point out, people often adjust their behavior to maximize the benefit to themselves in any given transaction, and the economics of the new structure rewarded research but not public outreach or engagement. As a result, most scientists ignored it. Science coasted off the taxpayers’ fear of the USSR, even as public mistrust was building.
The Two Cultures
The growing divide between science and mainstream culture was famously articulated by British physicist, novelist, and science advisor C. P. Snow, a man who straddled many worlds, like the scientist/artist/statesman of old. In a 1959 lecture titled “The Two Cultures and the Scientific Revolution,” Snow warned that the widening communication gulf between the sciences and the humanities threatened the ability of modern peoples to solve their problems:
A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: “Have you read a work of Shakespeare’s?”
I now believe that if I had asked an even simpler question—such as, “What do you mean by mass, or acceleration,” which is the scientific equivalent of saying, “Can you read?”—not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the Western world have about as much insight into it as their neolithic ancestors would have had.
Scientists didn’t see this as a warning or an invitation to reach out; rather, they viewed it as a criticism of the willful ignorance and snobbishness of those practicing the humanities. To a certain extent, their view was justified: intellectuals weren’t giving their work its due. The fast-growing importance of the sciences was garnering scientists considerable funding and public regard in exchange for the new powers and freedoms they were giving society. Yet that same society’s highbrows, especially in academia, still refused to acknowledge their work’s significance.
But the threat of nuclear war made survival the priority and relegated other important things to the realm of luxuries. Suddenly, citizens didn’t have the luxury of indulging wonder, or the humanities, to the extent they once had. And science had adeptly proven its utility to society, as Snow argued. Although he criticized scientists who could scarcely make their way through a Dickens novel with any understanding of its subtleties, he saved his harshest criticism for British universities—which had underfunded the sciences to the benefit of the humanities, despite the former’s contributions—and for the snobbishness of literary intellectuals. “If the scientists have the future in their bones,” he said, “then the traditional culture responds by wishing the future did not exist.” It was a statement that could just as easily describe the US Congress, or the Canadian parliament, some fifty years later.
The lecture was printed in book form and widely debated in Britain, as well as in Canada and the United States. It has been declared one of the one hundred most influential Western books of the last half of the twentieth century.
For a solution, Snow envisioned the emergence of a third culture of people schooled in both the sciences and the humanities. But that is not what took place. A great change had begun in Western universities, and humanities professors felt themselves slipping from the top spots and being supplanted by scientists, who generally seemed as if they couldn’t have cared less about the humanities. Why bother with all the reading and writing and talking when science was actually doing things? But this was equally shortsighted, and in this shift the West let go of something precious: a grasp on the classics that had informed Western culture. Since scientists couldn’t be bothered with civics, democracy continued to draw its elected leaders primarily from the humanities, creating a culture war that is with us to this day, threatening the ability of Western democracies to solve their problems—just as Snow feared.
Democracy is, as we now know, rooted in science, knowledge, and the biology of natural law. But most of our elected leaders have not had significant training in science, or, more importantly, in how the foundational ideas of modern law and democracy relate to, and grew out of, science. In the middle years of the twentieth century, this was beginning to pose a problem.
The twin threads of fear and resentment created a growing sense that science might be outpacing the ability of a democracy to govern itself. The situation was alarming enough that it compelled President Eisenhower to warn the American people about it. On January 17, 1961, in his farewell address to the nation, he famously warned of the dangers of the emerging “military-industrial complex.” Ike blamed the rise of this behemoth on the federal government’s growing funding of science, and he complained that the solitary inventor was being overshadowed by teams of scientists in cloistered labs, hidden from the watchful eye of the public and awash in taxpayer money. “[I]n holding scientific research and discovery in respect, as we should,” Ike warned, “we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.”
How far the United States had come from the days of the first State of the Union address, when George Washington told Congress that “there is nothing which can better deserve your patronage than the promotion of Science and Literature.” Democracy itself had been created by a scientific-technological elite that had included Thomas Jefferson, Benjamin Franklin, George Washington, Benjamin Rush, and other Founding Fathers. Elitism had been something to aspire to. Now, thanks to its association with a cozy cabal of military officers and allied contractors, science had become something to be feared.