When, in the course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty and the pursuit of happiness.
—US Declaration of Independence
The Biology of Democracy
The implications of empiricism for government were profound. Suddenly, kings and popes logically—empirically—had no greater claim to authority than anyone else. This was a self-evident truth that proceeded from careful observation of nature through our knowledge-building process—the same process that we use in science. And if we really were equal and free in the senses in which these foundational thinkers had described those concepts, then the only form of government that made sense was government of, by, and for the people. The question was how best to implement it.
It can be argued that the method the framers hit upon, democracy, is based to some extent on biology, via its roots in natural law. This argument is supported by recent research in the field of opinion dynamics by Princeton University biologist Simon Levin. His observations of animal herds show that, like human social groups, they follow certain innate rules of organization. A vote is an expression of opinion, and herd animals, quite literally, vote with their feet when determining the overall herd’s grazing patterns.
But the voting is not entirely egalitarian. There are opinion leaders in these herds, just as there are in human social groups. The idea of having equality of opportunity—in practical terms, having the right to vote—exists, but, just as in human society, all individuals do not have equal influence. In America, the framers of the Declaration of Independence had more influence in shaping our democracy than the average Virginia tobacco farmer did, and far more than an indentured servant or a slave. In herds, a very few individual opinion leaders make decisions that influence the entire herd’s grazing patterns. “Individuals in reality are continually gaining new information, and hence becoming informed,” says Levin. “So for sure, any individual could become a leader (just look at elections in the United States). That does not mean that all men are created equal, but in terms of the ability to lead, they all have equal opportunity.”
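To make the herd-voting dynamic concrete, here is a minimal, hypothetical sketch (not Levin’s published model): a few informed “leaders” hold a fixed preferred heading, every other animal drifts toward the group average, and the herd as a whole converges on the leaders’ preference even though each individual still “votes.” All parameters and update rules are illustrative assumptions.

```python
# A minimal, hypothetical sketch of opinion dynamics in a herd (illustrative only).
import random

def simulate_herd(n=50, n_leaders=10, preferred=1.0, steps=300, seed=1):
    random.seed(seed)
    opinions = [random.uniform(-1.0, 1.0) for _ in range(n)]  # random initial headings
    leaders = set(range(n_leaders))  # the first few animals are the informed ones
    for _ in range(steps):
        mean = sum(opinions) / n
        for i in range(n):
            if i in leaders:
                # Leaders balance the group view against their own fixed preference.
                opinions[i] = 0.5 * mean + 0.5 * preferred
            else:
                # Followers simply edge toward the group average.
                opinions[i] = 0.8 * opinions[i] + 0.2 * mean
    return sum(opinions) / n

print(f"mean heading after grazing: {simulate_herd():.2f}")  # ends close to the leaders' preferred 1.0
```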
The idea that natural law and hard-wired biological instincts are somewhat synonymous forces lies at the very foundation of modern law. Remember Calvin’s Case: “The law of nature is that which God at the time of creation of the nature of man infused into his heart, for his preservation and direction.” In today’s language, this is another way of saying “biological nature,” or, more commonly, “human nature”—what drives us naturally, as reasonable creatures. What democracy did was to structure and channel those natural opinion dynamics—what we call natural law—for use in organizing society. Democracy is rooted in our biology.
Science, Art, and Creative Cultures
Science and art are intrinsically related and, in fact, were once one and the same. Both involve the detailed observation and representation of nature in its many aspects; both seek to capture and express some fundamental and perhaps ineffable truth. Both are concerned with the great questions of reality, of life, of an underlying order. Both require a sort of leisured study in a segregated place to maximize creativity, and both are driven forward by an intensely disciplined focus on the craft that can produce astounding bursts of creative insight. Physicists often talk about aesthetic qualities like beauty and symmetry, and indeed there is a long history of art apprehending the forms of nature later uncovered more explicitly by science. Great art and great science both produce a sense of wonderment, and the great artists and scientists are separated from the mediocre ones by the breadth of their minds and the originality of their ideas.
As with mathematicians, who are often also brilliant musicians, scientists frequently seek creative expression in other media such as painting, writing, or sculpture. The exploration of nature, the seeking of insight, the making of things, the importance of technique and finesse—all these drive both art and science.
As natural philosophy, science was considered part of the arts until around 1835, when the term “scientist,” which had been in circulation for a few years, was adopted at the annual meeting of the British Association for the Advancement of Science. Coined “by analogy with artist,” the word better described those who worked in both the philosophical and the technical-analytical senses.
Because of the close relationship between innate individual potential and creative expression, on the one hand, and the opportunity afforded by democracy, on the other, science has made its greatest advances in liberal (as in free and open) democratic societies. Such societies spread freedom and opportunity broadly, like fertilizer, through tolerance, diversity, intellectual and religious freedom, individual privacy, equal individual rights, free public education, freedom of speech, limited government authority, consideration of minority views, and public support for both arts and research, all of which give creative minds the opportunity to dream of new things. It is the cross-pollination of ideas that seems to have led to the greatest advances, and the value derives not from encouraging art or encouraging science alone, but from encouraging and supporting creativity.
These open, democratic societies that are supportive of—and attractive to—creative minds acknowledge the role of opinion leaders, but tie those leaders’ actions closely to the feedback of the herd. In The Science of Liberty, Timothy Ferris shows how powerfully open societies have promoted the wealth and progress of nations, and how fundamentalist, theocratic, and totalitarian governments—in other words, authoritarian governments—have produced comparatively few scientific advances. As a result, those nations have fallen further behind the liberal democracies both economically and technologically.
This has occurred for three main reasons. The first is intellectual flight. Historically, the brightest and most creative minds have migrated to open societies, and, once there, have made discoveries and created works of art that advanced and enriched those societies. A classic example is the intellectual flight from fascist Europe in the years leading up to World War II. In the 1920s and early 1930s, Berlin was the world capital of science, culture, and art, and these aspects fed off one another. Persecution—particularly of Jews, homosexuals, and artists—spurred emigration that turned the United States into an intellectual mecca. The United States offered these intellectuals freedom, tolerance, egalitarianism, opportunity, and support for their work, and it had the military strength to protect those ideals. In return, the new immigrants gave the United States enormous breakthroughs in chemistry, biology, and physics, and helped shape Hollywood culture, which, together with advanced technology, became America’s chief cultural export.
However, scientific leadership proceeds not only from openness but also from the degree of opportunity available to creative citizens. By making education free and accessible to all, by stimulating cross-pollination and creativity with a diversity of views and languages and support for research and the arts, by financially supporting scientific research and artistic exploration, and by leveling the economic playing field to provide equal opportunity and freedom of inquiry, democratic societies have broadcast the intellectual fertilizer that helps talented people develop their creative potential wherever they may be—and that creative potential, in turn, benefits those societies.
Finally, these open societies with vibrant cross-fertilization between the sciences and the arts have historically produced innovations that have created new, previously unimagined economies, as well as profound technological breakthroughs that have led to the ability to project physical power over the natural world and against rivals. Personal computers were functional, but it took marrying science and technology with the art of design to make them into the ubiquitous and transformative tools that they became.
Combined, these three factors have had a stunningly powerful effect: even more than empowering individuals, they empower ideas. It is this mix of freedom, tolerance, creativity, talent, and diversity in science, in art, and in the social and intellectual interplay between the two that has spawned the great breakthrough cultures that produce new ideas and fresh insights.
A Nation of Thinkers—or Tinkerers?
Jefferson was certainly aware of some of this, and he heavily promoted science during the eight years of his presidency from 1801 to 1809, frequently writing and speaking of its value and importance to the nation and sponsoring major scientific expeditions such as that of Meriwether Lewis and William Clark. Befitting the great westward expansion, in the nineteenth century it was America’s pioneer spirit and can-do attitude that produced the world’s great inventors and implementers, the great trial-and-error engineers involved in communication, lighting, and power, including Eli Whitney, Samuel Morse, Alexander Graham Bell, Thomas Edison, George Westinghouse, Nikola Tesla, and many others. But Europe was still the home of real science and the scientists—the curiosity-driven experimentalists and theorists—who made the fundamental basic-science breakthroughs, including Alessandro Volta, Michael Faraday, André-Marie Ampère, Georg Ohm, Charles Darwin, Marie Curie, James Clerk Maxwell, Gregor Mendel, Louis Pasteur, Max Planck, Alfred Nobel, and Lord Kelvin.
This focus on tinkering and engineering versus science and discovery (or, in some ways, applied science versus basic science) was partly because America lacked the well-established academies of Europe, and perhaps partly because of the American obsession with building a new country. But it also seemed to have something to do with the American social character itself. French political scholar Alexis de Tocqueville noted this focus on pragmatism and application when he toured America in 1831 and 1832, some fifty-five years after its birth. His report of what he learned, Democracy in America, contains a chapter titled “Why the Americans are More Addicted to Practical than to Theoretical Science.” Tocqueville observed that free men who are equal want to judge everything for themselves, and so they have a certain “contempt for tradition and for forms.” They are men of action rather than reflection, and hold meditation in low regard. “Nothing,” he argued, “is more necessary to the culture of the higher sciences or of the more elevated departments of science than meditation; and nothing is less suited to meditation than the structure of democratic society. . . . A desire to utilize knowledge is one thing; the pure desire to know is another.” Tocqueville argued that this relative disregard for basic, curiosity-driven science, on the one hand, and the focus on applied, objective-driven science, on the other, might eventually be the country’s downfall. He related a striking cautionary tale that resonates powerfully today:
When Europeans first arrived in China, three hundred years ago, they found that almost all the arts had reached a certain degree of perfection there, and they were surprised that a people which had attained this point should not have gone beyond it. At a later period they discovered traces of some higher branches of science that had been lost. The nation was absorbed in productive industry; the greater part of its scientific processes had been preserved, but science itself no longer existed there. This served to explain the strange immobility in which they found the minds of this people. The Chinese, in following the track of their forefathers, had forgotten the reasons by which the latter had been guided. They still used the formula without asking for its meaning; they retained the instrument, but they no longer possessed the art of altering or renewing it. The Chinese, then, had lost the power of change; for them improvement was impossible. They were compelled at all times and in all points to imitate their predecessors lest they should stray into utter darkness by deviating for an instant from the path already laid down for them. The source of human knowledge was all but dry; and though the stream still ran on, it could neither swell its waters nor alter its course.
In other words, China fell under a conservative, authoritarian intellectual fundamentalism that deeply honored tradition but lacked the substance, freedom, and capacity to create anything new. The craft was there, but the creativity, art, and science were gone.
Tocqueville concluded that the basic research that had the power to change the future, to alter the course of the stream at will, was the product of more liberal European thinking. His tale suggests the dangers posed by embracing the form of science at the expense of the process, of tradition and precedent at the expense of openness and creativity, of applied research at the expense of basic science, of fear at the expense of wonder, of utility at the expense of beauty, and of insisting on financially quantifiable projections before an investment is made—the idea of which runs contrary to the entire process of discovery and creativity. Imagine, for example, an insistence on the promise of financial return prior to Darwin’s trips on the Beagle or Neil Armstrong’s first steps on the moon. They would never have happened. And yet it is hard to quantify the enormous wealth that has spun off from those economically unsupportable adventures into the unknown.
But Tocqueville was being a bit unfair in his assessment, or perhaps just taking too short a view of history. Europe was the home of the greatest imperial collapse in Western history, after all, and the collapse was, in many ways, all about science and culture. Like sixteenth-century China, the Roman Empire was the inheritor of centuries of scientific, artistic, and philosophical culture from the ancient Greeks. The ruling classes of Rome were taught this Greek knowledge and Greek culture.
But Roman intellectual and political culture was much more practical than theoretical—more applied than basic, more form than process. And with this approach Romans became increasingly anti-intellectual. There was an assumption that basic, curiosity-driven science just wasn’t necessary. One can see this in the writings of Pliny the Elder. Gaius Plinius Secundus was the most prominent natural philosopher of his day. He published a famous book called Natural History, which drew on the much earlier observational, basic science of Greek researchers. But Pliny included several other theories in his book. Unlike that earlier Greek work, they weren’t based on observations of nature, and they turned out to be mostly wrong. Yet his book became a foundational textbook of the Roman Empire. As the Romans valued science and observation less and less, the artistic and scientific institutions that fed Roman culture began to weaken and decline.
Those who want to dismiss the arguments for basic research—thinking the private sector is the source of today’s innovation and the public sector is a laggard—should consider the arguments by economist Mariana Mazzucato in her book The Entrepreneurial State. The Internet was a technological creation that came out of basic research. The Apple iPhone, while a creation of Apple as a design package, was based on technology that came out of government-funded basic research. Many universities today receive significant funding from licensing fees paid by private firms commercializing their basic research. Private industry doesn’t have the financial wherewithal to weather the risks that basic research imposes—namely, that a lot of it is wasted looking in the wrong places because of the trial-and-error nature of observational science. But when basic research hits, it hits big, creating entire new economies and transformative breakthroughs. We can’t afford not to do basic research. The only thing we can be sure of is that, if we don’t do it, we won’t get the breakthroughs that solve global problems or make trillions of dollars. The private sector is timid by comparison, Mazzucato argues. It’s the public sector that can be a catalyst for big, bold, problem-solving ideas, which is why the argument for science and democracy is so essential.
Despite his shortsightedness, it’s possible that Tocqueville’s general assessment of America was correct, and that the United States would have coasted off its vast natural resource exploitation until its economy eventually ran out of growth, but would never really have led the world, were it not for three major developments. The first grew out of Jefferson’s insistence on public education, which over time did indeed provide opportunity to undiscovered talent. The second, also heavily encouraged by Jefferson, was the burgeoning American university system, which was being built up to rival those of Europe. And the third was the American values of tolerance and freedom, which drew talented immigrants from elsewhere. By the first decades of the twentieth century, all three developments were beginning to pay major dividends for America—and particularly for Republicans.
When Conservatives Were Pro-Science and Pro-Immigration
In today’s Western culture, particularly throughout the former British colonies, conservatives have come to be aligned with vested economic and ideological interests, and have come to be seen as antiscience. Science itself, some conservatives have argued, has a liberal bias, a sentiment comedian Stephen Colbert echoed when he quipped that “reality has a well-known liberal bias” at the 2006 White House Correspondents’ Dinner.
In fact, by its very nature, science is both progressive and conservative. It is conservative in that it is retentive of knowledge and cautious about making new assertions until they are fully defensible. But it is also progressive in that it is, and must always be, open to wherever observation leads, independent of belief and ideology, and focused on creating new knowledge.
It would thus be a mistake to characterize scientists as mostly Democrats or mostly Republicans, or mostly liberals or conservatives. They are mostly for freedom, exploration, creativity, caution, and knowledge—and not intrinsically of one or another party, unless a party or political orientation becomes authoritarian and begins to turn its back on evidence. In the early twenty-first century, the political orientation that most stands for freedom, openness, tolerance, caution, and science is liberalism. In the United States, this ideology is represented by the Democrats, which may explain why 55 percent of US scientists polled in 2009 said they were Democrats while only 6 percent said they were Republicans, compared to 35 and 23 percent of the general public, respectively. When one thinks about it, it becomes clear why this is currently the case. The conservative movement has largely become associated with—and financed by—old industry and traditional religion, both of which perceive an existential threat from new science. Rather than supporting exploration of wherever the evidence leads, they have invested big money in an authoritarian model of defending their values and business models, and that means denying science that contradicts those things. The rise of authoritarianism among the Republicans running in the 2016 US presidential elections is an example of this.
Early in the twentieth century this situation was almost reversed. It was the Southern Democrats, defending Jim Crow and traditional religion, who opposed science. Republican Abraham Lincoln had created the National Academy of Sciences in 1863. Republican Teddy Roosevelt, who had grown up wanting to be a scientist, became America’s great defender of wildlife and the environment. Republican William McKinley, who would later be admired by Karl Rove, won two presidential elections, in 1896 and 1900, both times over the anti-evolution Democrat William Jennings Bryan, and supported the creation of the Bureau of Standards, which would eventually become today’s National Institute of Standards and Technology. Bryan’s strident anti-evolution campaigns, culminating in the 1925 Scopes Monkey Trial, helped to drive even more scientists toward the Republican Party.
As an exasperated Republican, the physicist and Caltech chair Robert A. Millikan wrote in the leading journal Science in 1923, the year he won the Nobel Prize,
We have many people even here who hasten to condemn evolution without having the remotest conception of what it is that they are condemning, nor the slightest interest in an objective study of the evidence in the case which is all that “the teaching of evolution” means, men whose decisions have been formed, as are all decisions in the jungle, by instinct, by impulse, by inherited loves and hates, instead of by reason. Such people may be amiable and lovable, just as is any house dog, but they are a menace to democracy and to civilization, because ignorance and the designing men who fatten upon it control their votes and their influence.
Other prominent scientists noted the political divide. The great botanist Albert Spear Hitchcock, who would soon become principal botanist at the US Department of Agriculture, wrote the following spring in the same journal that “it is absurd for a scientist to shiver with fear if he sees a black cat cross his path or if he walks under a ladder. It is equally absurd to believe that all Germans or all democrats, or all Roman Catholics . . . are undesirables and a menace to society.”
By the early twentieth century, the Democratic Party, which originally grew out of Thomas Jefferson’s Anti-Federalists, had become dominated on the national level by Southern religious conservatives and was divided over culture-war issues like evolution, the prohibition of alcohol, restricting immigration, Jim Crow laws, the Ku Klux Klan, and the Catholic faith of Al Smith, the Democratic presidential nominee in 1928. Republicans, by contrast, were the party of Abraham Lincoln and Theodore Roosevelt, of progressive optimism and tolerance, of environmentalism and finance—the party of rationalism and national parks. And by the early 1930s, one of the most famous men in the world was a Republican scientist named Edwin Hubble.
Hubble, who was born in 1889, grew up in Marshfield, Missouri, until, at the turn of the century, his father John relocated the family to Wheaton, Illinois, where young Edwin attended public school and became known for his athletic prowess. At the time, science was considered a fanciful pursuit and a less-than-solid career path, much like the arts—something suited more for adventurers and wealthy “gentlemen scientists” than professionals. Hubble’s father wanted him to be a lawyer, and when Hubble earned one of the first Rhodes Scholarships while a star student of Millikan’s at the University of Chicago, he went to the Queen’s College at the University of Oxford to study law, not physics.
Still, it was a time when great discoveries were being made in astronomy, which captivated Hubble’s imagination. America was entering a golden age of science, propelled in no small part by the massive philanthropic investments of two Republican men: steel magnate Andrew Carnegie, who funded public libraries across the nation, helped found what is now Carnegie Mellon University, and funded basic scientific research through the Carnegie Institution of Washington (since renamed the Carnegie Institution for Science); and John D. Rockefeller Sr., who endowed the University of Chicago as well as Rockefeller University and Johns Hopkins University’s School of Public Health. As Hubble began secretly studying astronomy on the side while at Oxford, the imagination of the American public was captured by the growing fame of a former clerk in the Swiss patent office with wild hair, an ever-present violin, a playful face, and some mind-blowing ideas—one Albert Einstein.
The Hoax of Relativity
Published in 1916 during World War I, Einstein’s general theory of relativity had made the striking prediction that gravity could bend space and so disrupt the straight-line flow of light. On May 29, 1919, with the war over, the British astronomers Sir Arthur Eddington and Andrew Claude de la Cherois Crommelin set out to test the theory, Eddington traveling to the island of Príncipe off the west coast of Africa and Crommelin to Sobral, Brazil, to carefully observe the way starlight behaved during a solar eclipse. If Einstein was right, the sun’s gravity would bend the light of stars that were in line with it, making them appear to be slightly offset. The eclipse, which lasted nearly seven minutes, was one of the longest of the twentieth century. It blocked enough sunlight that astronomers could see the stars and measure changes in their apparent locations. If they shifted, Einstein’s theory would be proved.
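For a sense of the scale being measured, the deflection general relativity predicts for starlight grazing the edge of the sun, roughly twice the value a purely Newtonian argument gives, is about 1.75 seconds of arc:

$$ \alpha \;=\; \frac{4\,G M_{\odot}}{c^{2} R_{\odot}} \;\approx\; 1.75'' $$

where $G$ is the gravitational constant, $M_{\odot}$ and $R_{\odot}$ are the sun’s mass and radius, and $c$ is the speed of light.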
The test’s audacity drew the attention of scientists and journalists the world over. If Einstein was wrong, his reputation would be ruined. If he was right, he would be celebrated as a genius whose theory changed everything we thought about the universe. The results were dramatically presented at a November joint meeting of the Royal Society of London and the Royal Astronomical Society, and they confirmed Einstein’s predictions spectacularly.
The popular press loved the drama, and Einstein became a household name—a little tramp of a professor who was also a bold genius, with his funny hair and beloved violin, not unlike Charlie Chaplin and his cane. In contrast to Americans’ image of the snobby European intellectual, Einstein connected emotionally as an underdog, a trait that appealed to the antiauthoritarian aspect of the American spirit and was cited in press accounts of his “hero’s welcome” when he first visited America in 1921, the year he won the Nobel Prize.
America’s embrace of Einstein stood in stark contrast to the treatment he was getting at home in Germany. Even though Berlin was the world capital of culture, art, and science, right-wing relativity deniers were on the rise. Like modern climate-science deniers, relativity deniers mounted ad hominem attacks against Einstein, and loudly branded general relativity a “hoax,” despite—or perhaps because of—its recent, dramatic scientific confirmation. They were led by an engineer named Paul Weyland, who formed a small but mysteriously well-funded group that held antirelativity rallies around Germany, denouncing the theory’s “Jewish nature” and organizing a major event at the Berlin Philharmonic Hall on August 24, 1920. Einstein attended, only to suffer more personal attacks. The political animosity grew so bad that he decided to leave Berlin.
“This world is a strange madhouse,” he wrote to a friend three weeks after the rally. “Currently every coachman and every waiter is debating whether relativity theory is correct. Belief in this matter depends on political party affiliation.” His words would be echoed decades later by mystified climate scientists.
Even prominent German physicists were getting into relativity denialism, largely along political lines having to do with nationalism and rising anti-Semitism, which, paradoxically, was occurring as Germany was awash in a new liberalism. The winner of the 1905 Nobel Prize in Physics, Philipp Lenard, who had previously exchanged flattering letters with Einstein, had since become bitter about Jews and jealous of the popular publicity Einstein’s theory was receiving. He now called relativity “absurd” and lent his name to Weyland’s group’s brochures. As a Nobel laureate, he worked behind the scenes to try to deny Einstein the prize.
The Greatest Triumph of Satanic Intelligence
At the same time, antiscience had been growing in the United States in reaction to the perceived evils of the theory of evolution. Like relativity, evolution was seen by social conservatives as undermining moral absolutes—in this case, biblical authority. The movement was led in part by an attractive, charismatic sex symbol and revivalist named Aimee Semple McPherson, the founder of what may have been the first American evangelical megachurch. The 5,300-seat Angelus Temple in Echo Park, Los Angeles, was equipped with radio towers to broadcast her sermons, and McPherson filled it to capacity three times a day, seven days a week. She stressed the “direct-experience” approach to religion, not unlike the empirical spirituality of the Puritans that had been so central to the creation of Western science. Like the Puritans, she considered the mainstream Protestant churches too orthodox—but unlike the Puritans, her complaint was that the mainstream churches were not authoritarian enough.
This revivalist spirit was propelled by the wave of immigration that followed World War I, which many Americans found disconcerting; the recovery from the traumatic flu pandemic of 1918, which had killed millions; and the return of millions more from the war, many of them still with untreated “shell shock,” the condition we now describe as post-traumatic stress disorder. Fueled by new optimism and cheap labor, the stock market boomed. Moral restrictions were loosening and the country needed to blow off some steam. Materialism soared during what F. Scott Fitzgerald called the Jazz Age, a subject he explored in his 1925 novel The Great Gatsby.
But the line between liberalism and running amok depends upon one’s psychological keel, as Gatsby showed. For many, this powerful mix of materialism, diversity, and newfound tolerance was simply too much, and they began to lose their moral bearings—or to feel that other Americans were losing theirs. McPherson was among the latter group. She set about working to bring order to society, and her moral fierceness offered a bulwark in the storm to many.
By the mid-1920s, McPherson had become a household name. She was made an honorary member of police and fire departments across the country, and, at 10,000 members, she ran the largest Christian congregation in the world. She purchased one of the first three radio stations in Los Angeles, and eventually claimed more than 1,300 affiliated churches that preached her “Foursquare Gospel” of literalist Bible interpretation. From this great platform, she took up a campaign against the two most profound evils threatening America at that time: the drinking of alcohol and the teaching of evolution in public schools.
McPherson was not alone in this holy campaign. William Jennings Bryan, the former secretary of state, had been the Democratic candidate for president for a third time in 1908. He had spoken throughout the United States in favor of Prohibition and against the teaching of evolution, which he believed had led to World War I. If we were in fact “descended from a lower order of animals,” he professed, then there was no God and, as a consequence, nothing underpinning society. Like Thomas Hobbes, he felt that, without an absolute authority, society would fall into decay.
Darwin himself had not seen it this way. He had written to John Fordyce about the issue in 1879, saying, “It seems to me absurd to doubt that a man may be an ardent Theist & an evolutionist,” though Darwin himself had by then given up his own Christianity. In 1880, he wrote to the young lawyer Francis McDermott that “I am sorry to have to inform you that I do not believe in the Bible as a divine revelation & therefore not in Jesus Christ as the son of God,” a view that only became known when the letter was sold at auction in 2015.
Following Bryan’s fiery stump speeches warning of the moral decay that teaching evolution would wreak on society, several states passed laws banning the practice. The most notable of these laws was Tennessee’s Butler Act, signed into law on March 21, 1925. By April, the American Civil Liberties Union had recruited a substitute teacher named John Scopes to break the law in Tennessee, after which the organization would pay for his defense to challenge it. On the other side, Bryan was asked to represent the World Christian Fundamentals Association, defending the law at the resulting trial, and, with it, his personal reputation and political future.
McPherson was a strong supporter of Bryan during the trial. He had been her guest at the Angelus Temple and had watched her preach that social Darwinism had corrupted students’ morality. The teaching of evolution was “the greatest triumph of satanic intelligence in 5,931 years of devilish warfare against the Hosts of Heaven. It is poisoning the minds of the children of the nation,” she had said. During the trial, McPherson sent Bryan a telegram, which read, “Ten thousand members of Angelus Temple with her millions of radio church membership send grateful appreciation of your lion hearted championship of the Bible against evolution and throw our hats in the ring with you.” The confrontation at the trial between Bryan and Scopes’s “sophisticated country lawyer” Clarence Darrow, also a Democrat, was the climax of one of the nation’s earliest major scientific-political-religious controversies. Though Darrow lost the case in the Bible Belt state of Tennessee, the attendant publicity turned American public opinion in support of teaching evolution in public schools.
The Vatican stayed out of this debate, partly because its healthy network of parochial schools meant it had little skin in the game—state laws concerning public-school curricula were of little concern. Even today, in Georgia, the joke is, “If you want your kids to learn about evolution, send them to Catholic school, because they won’t learn it in public school.”
The event also marked a curious milestone: Evangelical Protestants and Roman Catholics had now nearly reversed their respective popular positions with regard to science. Where Protestants had once embraced science while Catholics found themselves at odds with it, now it was Protestants who were rejecting science and Catholics who were beginning to embrace it more fully—a reversal that astronomer Edwin Hubble would soon help to accelerate.
The Largest Scientific Instrument Known to Man
It was into this hothouse climate that the Protestant-raised Hubble, adorned with the cape, cane, and British accent he had acquired while a Rhodes Scholar at Oxford, returned after the war, having traveled Europe and formed friendships with several of its leading astronomers. He arrived at the Carnegie-funded Mount Wilson Observatory outside Pasadena, California, insisting on being called “Major Hubble.” He quickly made enemies among the other scientists with his pompous airs and his self-aggrandizing tall tales. Looking through the great Hooker Telescope—its mirror one hundred inches in diameter and the whole instrument weighing more than one hundred tons, it was by far the largest and most powerful scientific instrument in the world—Hubble was able to view the universe with the light-gathering capacity of more than two hundred thousand human eyes.
Despite his propensity for stretching the truth, Hubble was a very strict Baconian observer when it came to science, limiting his statements only to what he observed and what could be strictly concluded from those observations, as John Locke had prescribed. Despite these conservative precautions, or perhaps because of them, what Hubble saw changed humanity’s view of the universe forever—and would further roil the controversy over science’s role in defining the origins of creation. Hubble photographed a small blinking star in the Andromeda nebula that he identified as a Cepheid variable. Like Galileo’s view of Venus, Hubble’s observation of a Cepheid in Andromeda would become iconic in its power.
The Human Computer That Opened the Heavens
Hubble’s work relied heavily on that of another astronomer, Harvard College Observatory’s Henrietta Leavitt, who had in 1912 shown something remarkable about Cepheid variable stars, which change from dim to bright to dim again over periods ranging from about a day to a few months. Scientists were trying to figure out a way to measure the distance to stars, a difficult task because it was impossible to tell whether a star appeared dim because it was far away or because it simply didn’t emit as much light.
As a woman, Leavitt was not allowed to be part of the scientific staff; she was a “computer”—one of several women hired merely to identify and catalog stars and calculate light curves for the male scientists. Leavitt began to suspect that there might be a relationship between the brightness of a variable star and the length of its period. She reasoned that all stars in the Small Magellanic Cloud were roughly the same distance from Earth and so their apparent brightness could be compared to one another. She then created a graph showing the maximum luminosity of each Cepheid variable compared to the length of its period, and found that there was indeed a relationship. The longer the period, the brighter the star actually was at maximum luminosity.
Danish astronomer Ejnar Hertzsprung seized on Leavitt’s insight. Using inductive reasoning, Hertzsprung determined that if two Cepheid variable stars had similar periods but one was dimmer than the other, it was probably farther away. He then searched for and found a Cepheid variable close enough to Earth to measure the distance to it using parallax. Knowing this distance, he measured its apparent brightness and used Leavitt’s graph to reverse engineer its actual brightness. The resulting formula, called the Distance-Luminosity relation, allowed scientists to measure the distances to all Cepheid variables, and thus also to the stars and other nearby formations. The blinking stars became “standard candles” throughout the heavens. Leavitt was paid a premium rate of thirty cents per hour over the usual two bits because of the high quality of her work.
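As a rough illustration of the standard-candle logic just described, the sketch below turns a Cepheid’s period and apparent brightness into a distance using a period-luminosity calibration and the distance-modulus formula. The numerical coefficients are illustrative assumptions, not Leavitt’s or Hertzsprung’s historical values, and the example star is hypothetical.

```python
# A rough sketch of the standard-candle calculation (illustrative values only).
import math

def absolute_magnitude_from_period(period_days):
    # Leavitt's insight: the longer the period, the intrinsically brighter the
    # Cepheid (brighter = more negative magnitude). Coefficients are assumed.
    return -2.8 * math.log10(period_days) - 1.4

def distance_parsecs(apparent_magnitude, period_days):
    M = absolute_magnitude_from_period(period_days)   # the star's "actual" brightness
    # Distance modulus: m - M = 5*log10(d) - 5, so d = 10**((m - M + 5) / 5).
    return 10 ** ((apparent_magnitude - M + 5) / 5)

# Hypothetical example: a Cepheid with a 30-day period seen at apparent magnitude 18.5.
print(f"distance ≈ {distance_parsecs(18.5, 30.0):,.0f} parsecs")
```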
The standard candle measurement of space was an immense discovery. In 1915, American astronomer Harlow Shapley, a Democrat, used it and Mount Wilson’s sixty-inch telescope to map the Milky Way in three dimensions. Shapley’s measurements expanded the known size of the Milky Way severalfold and showed that the sun was not at the center of the galaxy, as had been thought until that moment, but was in fact located in a distant outer arm. This overturned the concept of the centrality of humans yet again, and Shapley was celebrated as the greatest astronomer since Copernicus—a title he himself helped promote—for having achieved the “overthrow” of the heliocentric universe.
The Great Debate: A Cautionary Tale
In what would become an important cautionary tale for science—and, by extension, democracy—Shapley became blinded by his belief that the Milky Way was the entire universe—or maybe by his hubris, in wanting to believe that he had mapped the whole shebang. He argued that the spiral nebulae seen in the heavens were simply wisps of gas and clouds of dust within the Milky Way, rather than entire “island universes”—that is, galaxies—of their own, as fellow astronomer Heber Curtis posited. Shapley debated this point with Curtis at a meeting of the US National Academy of Sciences in April of 1920, in an event famously called the Great Debate.
The debate ended in a draw because there wasn’t yet enough observational data to draw firm conclusions. But this was partly because Shapley, without realizing it, had stopped using Locke’s and Bacon’s inductive reasoning to build knowledge from observation. Instead, he was trying to prove his point with a rhetorical argument—an a priori, top-down, Cartesian approach of first principles that had him arguing more like an attorney than a scientist. When his assistant, Milton Humason, showed him a photographic plate that seemed to indicate the presence of a Cepheid variable in Andromeda, Shapley shook his head and said it wasn’t possible. Humason had unimpressive formal credentials—he had been elevated to assistant from a mule driver and had only an eighth-grade education—but it was Shapley’s a priori ideas that occluded his vision. He took out his handkerchief and wiped the glass plate clean of Humason’s grease pencil marks before handing it back. No one realized it at the time, but Shapley’s career as a major scientist ended in that moment.
Soon after, Shapley moved on to run the Harvard College Observatory while his tale-telling rival, Edwin Hubble, took over the telescope. Adopting the uncredentialed but brilliant Humason as his assistant and adhering to his strict Baconian observational methodology, Hubble soon identified Cepheids in Andromeda and used them to show that the spiral “nebula,” as he called the galaxy, was not part of the Milky Way at all. In fact, it was nearly a million light-years distant—more than three times farther away than the diameter of Shapley’s entire known universe. The Great Debate was settled, and Hubble became an overnight sensation.
A Republican Expansion
In 1929, Hubble and Humason followed this accomplishment up by showing that there is a direct correlation between how far away a galaxy is and its redshift—the degree to which its light waves are shifted to the red end of the spectrum. Light waves are emitted at known frequencies. Redshift is caused by a star’s light waves stretching out, apparently due to the star’s rushing away from Earth, making them lower in frequency and thus shifted toward the red end of the visible-light spectrum.
Scientists had already established that light waves, like sound waves, are subject to the Doppler effect. We notice the Doppler effect in everyday life when the sound of a train’s whistle or a police siren lowers in pitch as it races away from us. Astronomers believed redshift could be used as a measure of the speed at which a star appears to be moving away from us. Hubble correlated that redshift with distance, and then showed via painstaking observation—performed mostly by Humason, whom Shapley had recommended for promotion to the scientific staff—that the farther away a star is, the greater its redshift. The odd couple of the liar and the mule driver found this to be uniformly true in every direction of the sky. This suggested that the universe itself was probably expanding at an even rate.
To picture this, imagine blowing up a perfectly round balloon until it is no longer flaccid but not yet taut. Now take a marker and mark spots in a grid pattern over the entire surface of the balloon, each spot exactly one inch from the next. Now finish blowing up the balloon and watch what happens. As the balloon expands, every dot moves farther away from every other dot. The space between each pair of dots expands. In addition, the dots that are, say, five dots apart from each other move apart five times faster than the dots that are only one dot apart do because the surface is expanding uniformly. Five times the distance, five times the expansion. This is a close analogy to what Hubble and Humason saw happening in three dimensions in the universe.
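A brief sketch of the velocity-distance relation the passage describes: recession speed estimated from redshift using the low-speed Doppler approximation, and distance from the proportionality Hubble and Humason found. The Hubble-constant value is a modern round number assumed for illustration, not their 1929 figure.

```python
# A brief sketch of Hubble's law (illustrative modern value for H0).
C_KM_S = 299_792.458      # speed of light in km/s
H0 = 70.0                 # assumed Hubble constant, km/s per megaparsec

def recession_velocity(redshift):
    # Low-redshift Doppler approximation: v ~ c * z.
    return C_KM_S * redshift

def distance_mpc(velocity_km_s):
    # Hubble's law: v = H0 * d, so d = v / H0.
    return velocity_km_s / H0

v = recession_velocity(0.01)   # a galaxy whose light waves are stretched by one percent
print(f"v = {v:,.0f} km/s, distance = {distance_mpc(v):,.0f} Mpc")

# The balloon picture in numbers: a galaxy five times as far recedes five times as fast.
print(H0 * 100.0, H0 * 500.0)  # 7000.0 35000.0 (km/s)
```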
This fundamental velocity-distance relationship came to be known as Hubble’s law, and it is recognized as one of the basic laws of nature. Ironically, it was a liar who, using the tools and methods of science, discovered some of the universe’s most fundamental truths. But his work also implied something even more momentous.
A Catholic Priest’s Big Bang
Georges Lemaître was a pudgy, pinkish Belgian abbé—a Catholic priest—and also a skilled astronomer. Like the Puritans, Lemaître was interested in reading the Book of Nature, something the Catholic Church had come to support in the ensuing centuries, funding major astronomical observatories. He was also a reasonably good mathematician, and he had noticed that Einstein’s general theory of relativity would have implied that the universe was expanding but for a troublesome little mathematical term called the cosmological constant, which Einstein had inserted into his equations. Lemaître saw no convincing scientific or mathematical reason why the cosmological constant should be there. In fact, Einstein’s own original calculations had implied that the universe was expanding, but he was a theoretician, not an astronomer. When he turned to astronomers for verification of his theory, he found that almost all of them held the notion that the universe existed in a steady state, and that there was no motion on a grand scale. In deference to their observational experience, Einstein adjusted his general theory calculations with a mathematical “fudge factor”—the cosmological constant—that made the universe seem to be steady.
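For reference, the cosmological constant enters Einstein’s field equations as the Λ term below; chosen appropriately, it counteracts gravitational attraction on cosmic scales and permits a static universe:

$$ G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu} $$

Here $G_{\mu\nu}$ encodes the curvature of spacetime, $T_{\mu\nu}$ its matter and energy content, and $\Lambda$ is the cosmological constant.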
Lemaître worked independently off the same mathematical principles that Einstein had originally laid out. Nine years later, in 1927, he wrote a dissenting paper in which he argued that the universe must be expanding, and that, if it was, the redshifted light from stars was the result. This redshift had been observed by a number of astronomers, but until then there had been no consensus on the cause.
Lemaître saw Hubble’s self-evident observations and clear logic and immediately realized that Hubble’s work confirmed his math—and refuted the static universe that Einstein’s cosmological constant had been inserted to preserve. Furthermore, he deduced, if the universe was expanding equally in all directions, it must have initiated in a massive explosion from a single point. This meant that the universe is not infinitely old but has a certain age, and that the moment of creation—which British astronomer Fred Hoyle would later mockingly call the “big bang”—was analogous to God’s first command: Let there be light.
Hubble’s meticulously reported observations and ironclad, self-evident conclusions convinced Einstein that he might have been wrong to insert the cosmological constant. He made a pilgrimage from Germany to Mount Wilson Observatory outside of Pasadena, where he joined Humason, Lemaître, and others to speak with Hubble and examine the stars through the one-hundred-inch telescope. Then he held a news conference. Standing with Hubble, Humason, and the other scientists, he made a stunning public announcement. Unlike Shapley, Einstein changed his mind based on the evidence, and removed the cosmological constant from his general theory of relativity, later calling it “the greatest blunder of my life.” The universe was indeed expanding.
Science Rock Star
This dramatic mea culpa by Einstein, who was perhaps the most famous man in the world, drew even more attention to Hubble and the striking depictions of the immense universe coming from the astronomers atop the 5,715-foot Mount Wilson. Breathless newspaper headlines screamed about the gargantuan distances and the millions of new worlds Hubble was discovering. He began to lecture on science to standing-room-only crowds of five thousand people, and he, too, became one of the most famous men in the world for redefining our ideas about our origins.
Decades later, the Hubble Space Telescope would be named in his honor, but, in the 1930s, Hubble’s work captured the public interest like that of few scientists before him. He became one of the first great popularizers of science with his traveling and speaking, even delivering, as part of a scientific lecture series, a ten-minute national radio address heard by millions during the intermission of a New York Philharmonic broadcast. He and his wife Grace were the special guests of director Frank Capra at the March 4, 1937, Academy Awards ceremony, where Capra, the academy’s president that year, won best director for Mr. Deeds Goes to Town. Hubble became the toast of Hollywood, and a long line of actors and directors made the journey up Mount Wilson to peer through the lens of his telescope.
Unlike his papers, which drew admiration, this celebrity engendered the wrath of his fellow scientists, who scorned him as an arrogant egotist and shameless self-promoter. His protégé, Allan Sandage, discoverer of quasars and further mapper of the universe, said that Hubble “didn’t talk to other astronomers very much, but he was certainly not arrogant when he was in the company of other people.” Regardless of his temperament, he had the talent to back up his celebrity. And, in part because of the press coverage, the transparency of his work, and his popular speaking, the public felt in on his discoveries, embracing them rather than becoming suspicious.
Among the many celebrities who came to visit Hubble on the mountain was Aimee Semple McPherson. Milt Humason, who was a famous womanizer, told Sandage that, in 1926, during a month-long disappearance in which McPherson claimed to have been kidnapped, tortured, and held for ransom in Mexico, the attractive radio evangelist had actually been up on Mount Wilson, enjoying Humason’s special attentions in the Kapteyn Cottage. If true, this would seem an example of the phenomenon of preachers and politicians who attempt to impose rules on society in areas in which they themselves have weaknesses, perhaps seeking to control their own overpowering and unacceptable urges.
Hubble maintained a tolerant but skeptical relationship toward all religions. But according to Sandage (a Democrat), when it came to politics Hubble was a staunch Republican who colluded with other Republican scientists to schedule known Democrats for telescope time on Election Day to prevent them from voting.
In 1951, Pope Pius XII gave a momentous speech in which he addressed Hubble’s work and the big bang theory, stating that the big bang proved the existence of God by showing there was a moment of creation, which meant there must be a creator. A friend of Hubble’s read the text of the pope’s speech in the Los Angeles Times and wrote to him,
I am used to seeing you earn new and even higher distinctions; but till I read this morning’s paper I had not dreamed that the Pope would have to fall back on you for proof of the existence of God. This ought to qualify you, in due course, for sainthood.
Hubble, heralded by scientists as the greatest astronomer since Galileo, and loved by the public and the press for his indefatigable popularization of astronomy, had managed to bring the relationship between science and the Roman Catholic Church full circle.