The War on Science, by Shawn Lawrence Otto


Chapter 3

RELIGION, MEET SCIENCE

The value of science to a republican people, the security it gives to liberty by enlightening the minds of its citizens, the protection it affords against foreign power, the virtue it inculcates, the just emulation of the distinction it confers on nations foremost in it; in short, its identification with power, morals, order and happiness (which merits to it premiums of encouragement rather than repressive taxes), are topics, which your petitioners do not permit themselves to urge on the wisdom of Congress, before whose minds these considerations are always present, and bearing with their just weight.

—Thomas Jefferson, 1821

In the Beginning

To understand the ironic situation in which modern democracies find themselves—their major challenges largely revolving around science, yet few elected leaders understanding these issues and few political reporters reporting on them—we have to understand what makes the relationship what it is, and the best place to start is to look at how science shaped or didn’t shape the forming of the world’s oldest democracy: the United States of America, on which so many other democracies are patterned.

Was America always a nation of science? Or was it founded as a Christian nation? Has there always been a conflict between religion and science? What, exactly, are the relationships among science, politics, freedom, and religion in America and, by extension, in other democracies? Why did science get so advanced there? Most important, why do we keep having conflicts over it? Can democracy—and Earth’s ecosystem—survive them?

Contrary to what many fundamentalist politicians and televangelists have claimed, America was not founded as a Christian nation. New England was initially settled by Puritans, but the country was founded on the principles of science, something the Puritans valued greatly. In fact, 150 years after the Pilgrims’ arrival, the Founding Fathers took great pains to expunge religious thinking from the writings that laid the legal and philosophical foundations for the country they wished to form, beginning with the Declaration of Independence.

They carefully carved out a new, secular form of government based on limited powers for the authorities and reservation of most freedoms for the people, including freedom of inquiry and expression, and freedom of and from religion. The founding documents guaranteed protection of these freedoms, and of the people’s right to experiment with and modify their government, by using an eighteenth-century version of crowdsourcing they called democracy. This was something entirely new. Its emergence alongside the scientific revolution was no coincidence; it was a natural outgrowth of it. The liberties these founding principles afforded have in turn produced the highest standard of living, the greatest scientific and technological advances, and the greatest power in the history of the planet.

God’s Natural Law Is Reason

Among the first to seize upon the discovery of the New World were the English merchants of Jamestown, Virginia. When their party of 104 men and boys landed in 1607, they were looking for gold, but instead they wound up growing and selling something even more profitable: tobacco.

They were followed some thirteen years later by those seeking not fortune, but freedom of religion. The Puritans began to settle Massachusetts as early as 1620, forming fundamentalist enclaves in niches carved out of the “savage” new continent. They disliked the Catholic-leaning authority of the Church of England and they believed in progress and innovation, that the Bible was God’s true law, and that it provided a plan for living.

Puritanism wasn’t just a theology; it was a whole set of ideas that included taking an antiauthoritarian, experimental, empirical approach to discovering the natural laws by which God’s creation abided. In exercising his will, God did not contradict reason. Rather, he revealed himself to humans through two books: the Book of Revelation, made accessible by faith, and the Book of Nature, made accessible by observation and reason. Science was the “handmaiden” to theology, assisting in the study of “the vast library of creation” as a vehicle to religious understanding.

This thinking can be traced to that of the Islamic Mu’tazilites several hundred years before, at a time when Islam was the keeper of scientific knowledge during Europe’s Dark Ages. The Mu’tazilites’ primary ethos had been to celebrate the power of reason and the human intellect. God spoke not only through the Quran, but through his creations, so we could discern his will by studying nature. It is our intellect, not a literalist reading of an old book, they argued, that guides us toward a true knowledge of God and the basis of morality. The idea resurfaced among Puritans of the late sixteenth century, whose more prominent members had read the old Mu’tazilite books on science.

This idea that God does not contradict reason and that his laws are implicit in nature also lies at the foundation of English common law, as first set forth in Christopher St. Germain’s 1518 treatise The Doctor and Student, which relates a hypothetical conversation between a doctor of divinity and a student of the laws of England and established common law’s moral basis.

St. Germain was a Protestant polemicist during the reign of King Henry VIII, a time when a great battle was raging between the Catholic Church, which was the highest authority in all matters, and the antiauthoritarian Protestants, who promoted do-it-yourself study of the Bible and nature. In fact, The Doctor and Student was published just a year after Martin Luther posted his Ninety-Five Theses on a church door. As the reform movement swept through Europe, monks were thrown out of monasteries and told to marry nuns, as Luther did in 1525 when he married an ex-nun. Adherents to Luther’s philosophy looted Catholic churches of the bones of their saints and other relics and jewels, destroying the objects and condemning them as false idols. The truth was to be found in the Bible and in direct experience, they believed, not in the pronouncements of the pope in Rome.

This thinking required a reexamination of the world and the devising of a new order based not upon the authority of the church, but upon reason. The question at hand for St. Germain amid this upheaval was “what be the very grounds of the law of England.” He offered first that the law of God underlies reason:

The law of God is a certain law given by revelation to a reasonable creature, shewing him the will of God, willing that creatures reasonable be bound to do a thing, or not to do it, for obtaining of the felicity eternal.

He then declared that reason and natural law are synonymous:

As when any thing is grounded upon the law of nature, they say, that reason will that such a thing be done; and if it be prohibited by the law of nature, they say it is against reason, or that reason will not suffer that to be done.

Therefore, nature was knowable, and God’s will could be understood by studying nature to discern its laws. This was a powerful idea that put man into an immediate relationship with God and nature, without an intermediary authority figure. Evidence from the study of nature was to be the basis of the laws of England. To the Puritans, then, there were no conflicts in the ideas of religion, law, reason, and science. All were varying examinations of natural law.

The idea of natural law developed over the next ninety years as Protestantism flourished in England. In 1608, the great English jurist Edward Coke sought to more clearly define it. Coke was a Puritan sympathizer who spent his career working to protect individual liberty and make sure the monarchy’s arbitrary authority was circumscribed by the rule of law, an idea the Puritans very much favored.

His report, Calvin’s Case, became a foundational document in English law. In it, Coke wrote,

The law of nature is that which God at the time of creation of the nature of man infused into his heart, for his preservation and direction.

And in an oft-quoted section of his 1628 Institutes of the Lawes of England, Coke broadened this idea in an important way that for the first time turned to the crowdsourcing model the American founders would eventually adopt. Seeking to limit the caprice of the “royal prerogative” by which the king claimed the authority to do whatever he wanted, Coke argued that, while natural law motivated individual men, “an infinite number of grave and learned men” working over “successions of ages” could refine and perfect the laws derived from this initial natural moral basis. In other words, this was an early form of scientific literature and peer review, but also of the idea of the rule of law. Coke called this aggregation “artificiall [sic] reason,” which he defined as “perfect reason, which commands those things that are proper and necessary and which prohibits contrary things.”

Thus, law had a basis in physical reality through the hard-wired biological instincts of humans that God had infused into their hearts at the time of their creation, but its full force and power came from socially aggregating those insights. No one man’s authority, even the king’s, stood above it.

This science-friendly Protestant perspective—that one could establish law and understand God’s will by studying nature and, over time, aggregating and refining a body of knowledge that bound even the king—stood in stark contrast to the position taken by the Roman Catholic Church when, in 1633, church authorities denied the validity of astronomical science and indicted Galileo for heresy for simply describing what he found by observing nature.

The poet John Milton, author of Paradise Lost, visited Galileo at Arcetri, in the hills south of Florence, where the astronomer was under house arrest. Milton recounted the visit to Parliament in 1644, while protesting a licensing order that required authors to submit their writings to the government for approval, and he warned of the dangers of censorship.

I could recount what I have seen and heard in other countries, where this kind of inquisition tyrannizes; when I have sat among their learned men, for that honor I had, and been counted happy to be born in such a place of philosophic freedom as they supposed England was, while themselves did nothing but bemoan the servile condition into which learning amongst them was brought; that this was it which had damped the glory of Italian wits; that nothing had been there written now these many years but flattery and fustian. There it was that I found and visited the famous Galileo, grown old, a prisoner to the Inquisition, for thinking in astronomy otherwise than the Franciscan and Dominican licensers thought. And though I knew that England then was groaning loudest under the prelatical [church] yoke, nevertheless I took it as a pledge of future happiness that other nations were so persuaded of her liberty.

By the end of the seventeenth century, as Anglican clergy in London were preaching Newton’s science, Italian scientists were standing trial in Naples for stating “that there had been men before Adam composed of atoms equal to those of other animals.”

The Islamic Keepers of Science

This second fall of Italy as a world leader of science (the first being the fall of the Roman Empire) was not unlike what had happened to the Islamic Empire at the very dawn of the Renaissance. Through the long centuries of the Dark Ages, it was not Christianity but Islam that had kept the flame of science alive. Turkish Ottoman muskets and superior military technology had conquered the Balkans, Ukraine, Crimea, Palestine, Lebanon, Syria, Arabia, and much of North Africa, creating a vast Ottoman empire. Scholars in this golden age of Islam laid the foundation for much of modern Western thinking in ways few people realize today, down to the language and the numerical and mathematical systems we use. The word algebra, for instance, comes from al-jabr, Arabic for “completion,” one of two ways of solving quadratic equations developed by “the father of algebra,” Muhammad ibn Musa al-Khwārizmī, whose name, Latinized as Algoritmi, is the root of the word algorithm.

Baghdad’s House of Wisdom madrasa, where al-Khwārizmī taught, was the largest university and the greatest repository of books in the medieval world. Its scholars studied the ancient Greek, Persian, and Sanskrit texts and, based on what they learned there, developed their own science in astronomy, cartography, chemistry, geography, mathematics, medicine, and zoology. In fact, the approach of using the empirical observation of nature to discover the objective truth of things was first used not by Francis Bacon but by an eminent Islamic scientist, Ibn al-Haytham.

It was al-Haytham’s Optics and other Arabic texts, translated into Latin, that informed and inspired the early European thinkers of the Renaissance. The first English astronomer, Walcher of Malvern, noted for using an astrolabe to measure the time of several solar and lunar eclipses, was also the first English scholar of Arabic and one of the first translators of Arabic treatises into Latin (from which he likely learned much of his science), in the late eleventh century. Roger Bacon, the thirteenth-century scientist and Franciscan friar, described a cycle of observation, hypothesis, experimentation, and independent verification, which sounds an awful lot like the modern scientific method, and which he got from studying Optics. Al-Haytham is the first person we know of to fit the term scientist as it is used today: someone guided by empirical observation of nature. Optics, written between 1028 and 1038, was translated into Latin and printed in Europe in 1572, and was read by the most influential scientists of the day, including Kepler, Galileo, and Descartes. The book describes the scientific method Francis Bacon would soon champion—to start with observation and induction, being cautious about conclusions and wary of the swaying power of opinion. As al-Haytham put it,

[We should begin] our investigation with an inspection of the things that exist and a survey of the conditions of visible objects. We should distinguish the properties of particulars, and gather by induction what pertains to the eye when vision occurs and what is found in the manner of sensation to be uniform, unchanging, manifest and not subject to doubt. After which we should ascend in our enquiry and reasoning, gradually and orderly, criticizing premises and exercising caution in regard to conclusions—our aim in all that we make subject to inspection and review being to employ justice, not to follow prejudice, and to take care in all that we judge and criticize that we seek the truth and not be swayed by opinion.

But by the time al-Haytham was read by the great minds of Western science, Muslim freedom of inquiry had long since been sacrificed, and Islamic science was no more. The Iraqi-born British nuclear physicist Jim al-Khalili, who has written extensively about early Islamic science, notes,

There were very few . . . Christian scholars whose achievements could rival their Muslim counterparts until the end of the fifteenth century and the arrival of Renaissance geniuses such as Leonardo da Vinci. By that time, European universities would have contained the Latin translations of the works of all the giants of Islam, such as Ibn Sīna, Ibn al-Haytham, Ibn Rushd, al-Rāzi, al-Khwārizmi and many others. In medicine in particular, translations of Arabic books continued to be studied and printed well into the eighteenth century.

Among the European scholars influenced by their Islamic counterparts before them were Roger Bacon, whose work on lenses relied heavily on his study of Ibn al-Haytham’s Optics, and Leonardo of Pisa (Fibonacci), who introduced algebra and the Arabic numeral characters after being strongly influenced by the work of al-Khwārizmi. Some historians have even argued that the great German astronomer Johannes Kepler may have been inspired to develop his groundbreaking work on elliptical orbits after studying the work of the twelfth-century Andalusian astronomer al-Bitrūji (Alpetragius), who had tried and failed to modify the Ptolemaic model.

But at the very moment Protestantism and the practitioners of the new science were blossoming in Italy, Germany, France, and England, and beginning to draw on the works of their Muslim counterparts, science was shutting down in the Islamic world.

The first reason was politics. A conservative, literalist scientist-theologian named al-Ghazāli, who is influential in Muslim thinking to this day, wrote a critique of Muslim scientists, or Mu’tazilites, called The Incoherence of the Philosophers, in which he attacked their assimilation of the ideas of Aristotle and the concept of a natural causality of things. Writing of fire burning cotton, he said,

The one who enacts the burning by creating blackness in the cotton, [causing] separation in its parts, and making it cinder or ashes is God, either through the mediation of His angels or without mediation. As for fire, which is inanimate, it has no action. For what proof is there that it is the agent? They have no proof other than observing the occurrence of the burning at the [juncture of] only contact with the fire. Observation, however, [only] shows the occurrence [of burning] at [the time of the contact with the fire], but does not show the occurrence [of burning] by [the fire] and that there is no other cause for it.

They should stick closer to the text of the Quran, he argued. Be more “authentic.” The cause of things was not nature, but God.

Those who followed al-Ghazāli adopted the same literalist view as Christian fundamentalists do today: the only cause of anything was God, and the texts of the old books, in this case not the Bible but the Quran and the Hadīth (the recorded conversations of the Prophet Muhammad), gave Muslims everything they would ever need to know about their faith, and so the sort of philosophical debate and reasoning practiced by the Mu’tazilites was not only unnecessary—it was un-Islamic. As for science, what was the point? If the cause of everything was God, God was the only answer.

This fundamentalist interpretation led to many of the antiscience, anti-Western beliefs that have held back progress in more fundamentalist Muslim countries to this day. “The innate religious conservatism of the school of thought that grew around [al-Ghazāli’s] work inflicted lasting damage on the spirit of rationalism and marked a turning point in Islamic philosophy,” argues al-Khalili.

But the second reason was perhaps even more powerful: the Islamic world’s failure to do what the Europeans, and particularly the followers of Martin Luther, were doing: adopt the printing press, a new technology that was making knowledge much more widely available. While devout Muslim scholars were painstakingly hand-copying holy books with artistic fealty, Lutherans were printing Bibles by the thousands and putting knowledge in the hands of the people to judge for themselves.

The DNA of Western Thought

Each arm of the double helix of Western Christianity—Roman Catholicism and the emerging Protestantism—embodied the two distinct worldviews of the authoritarian and the antiauthoritarian: that rules, methods, and laws were either prescribed from on high or built up by individuals in consensus.

These two views had always been present, but they were amplified in 1517, when Martin Luther posted his Ninety-Five Theses challenging church authorities to debate principles that seemed defensible only by virtue of the church’s authority over its subjects. In Luther’s view, the church had become corrupt, telling people they could buy their way into heaven by purchasing “indulgences,” the proceeds of which the church used to finance building St. Peter’s Basilica in Rome. “Why does not the pope, whose wealth is today greater than the wealth of the richest Crassus [a legendarily greedy Roman businessman of the first century BC],” Luther asked, “build this one basilica of St. Peter with his own money rather than the money of poor believers?” Luther’s theses split the church between those who clung to authority and tradition and those who believed in man’s individual connection to God. Protestantism, with its streak of populist antiauthoritarianism, was born.

Luther’s grand movement, and the very idea that knowledge could be accessible by individuals without an intervening authority, had been made possible by the invention of the printing press around 1450. For the first time, books could be mass-produced, permitting knowledge and its attendant power to be spread widely. Luther used this new technology to distribute power among the people with his 1534 translation of the Bible from Greek, Hebrew, and Latin into common German. More than one hundred thousand copies of the Luther Bible were sold within forty years of its publication (an unfathomable number for the time), and millions heard its message. People could suddenly study the Bible and come to their own conclusions without the intercession of a pope or priest. The printing press laid the intellectual foundation for the scientific revolution that was to come.

This marked an important moment in human history, when Western thought was split into twin, competing paths: the authoritarian and the antiauthoritarian. The other three major sources of human power—government, economics, and science—developed similar authoritarian, top-down and antiauthoritarian, bottom-up strains of thought over the ensuing centuries as power was demystified.

As in religion, in government there are authoritarian, totalitarian models such as monarchy, dictatorship, and fascism on the one hand and antiauthoritarian models like democracy and anarchy on the other. In economics, communism and capitalism are the opposing theories, as are (less extremely) the ideas of John Maynard Keynes about the need for government stewardship of the economy on the one hand and Milton Friedman’s laissez-faire, free-market focus on the other. And in Renaissance science, the split fell between the two competing paths of knowledge that were first proposed by the Catholic René Descartes and the Protestant Francis Bacon.

Descartes versus Bacon

Descartes stressed the importance of deduction, a method of reasoning from the top down using “first principles,” beginning with his famous line “I think, therefore I am.” In his 1637 Discourse on the Method of Rightly Conducting One’s Reason and of Seeking Truth in the Sciences, he began by embracing skepticism and approaching the entire world with doubt, free of preconceived notions. There was nothing, he said, he could count on as real if all things were subject to skepticism. Ah, but wait—he was here, thinking these thoughts. Therefore, he must be real.

Beginning with his mind as the only reliable foundation, Descartes regarded the senses as unreliable and the sources of untruth and illusion. He concluded that reliable truth about reality could be determined only by a mind that was separate and distinct from the physical body, thereby originating the concept of the mind-body split, or Cartesian (from “Descartes”) dualism. To Descartes, a conclusion is valid if and only if it follows logically from the premise, as do the syllogisms Aristotle defined in the Organon, his classic collection of writings on logic. For example:

All men are mortal. Socrates is a man. Therefore, Socrates is mortal.

Bacon, in contrast, thought Aristotle had gotten it all wrong. He liked the ideas of Ibn al-Haytham and Roger Bacon before him, and he stressed nearly the opposite approach. Bacon was a lawyer who worked under Edward Coke, the attorney general, a position he would eventually assume himself. Toward the end of his legal career, he turned more of his attention to science and published what would become a foundational volume, Novum Organum Scientiarum, or “New Organon”—a “new instrument” of science. It was a devastating attack on Aristotle’s book and on Greek logic generally, with their emphasis on top-down reasoning and disdain for experimentation. In it he argued instead for using the inductive method of reasoning, which underlies much of the scientific method we use today. Inductive reasoning proceeds from the bottom up by observing with the senses and then building in logical steps to reach a general conclusion about reality. An example would be:

All observed swans are white; therefore, all swans are white.

This method clearly has a limitation: its conclusions are provisional and always subject to disproof. All it takes is the discovery of a single nonwhite swan to invalidate the statement. This is why one hears scientists talking about the “theory” of evolution. It is not an observed fact; rather, it is a conclusion supported by all the facts observed so far. One can never be absolutely sure, because one can never see the whole universe at once, and so, given the provisional nature of inductive reasoning, scientists hold out the possibility, no matter how small, that the conclusion could be invalidated. Science thus demands intellectual honesty, and a scientific conclusion will always contain a provisional statement:

All observed swans are white; therefore, all swans are probably white.

In practice, Bacon’s method doesn’t bother scientists, or most reasonable people, because the chances of being wrong, while present, are usually in a practical sense very small. It is, for example, theoretically possible that chemical processes taking place in your body could cause you to spontaneously combust, but we don’t live our lives worrying about it because the probability is extremely small. That is why math and statistics have become such important parts of science: they quantify the relative probability that a conclusion is true or false.

Puritan Science

Since Protestantism was rooted in a protest against Catholic authority, Puritans did not take kindly to the Catholic Church’s indictment of Galileo, or to the idea that opinions that were supported by observation of nature, and thus were evidence of God’s law, could be decreed contrary to holy scripture. In fact, the growing conflict between the Puritans and the Church of England—established in 1534 because the Roman Catholic Church would not annul the marriage of King Henry VIII to Catherine of Aragon, preventing him from marrying again—arose because the Puritans thought the Church of England was not anti-Catholic enough. Thus their name: Puritans.

In 1604, their frustration led King James to authorize a new translation of the Bible, the King James version, to address their concerns. Nonetheless, many Puritans viewed having a monarch as the spiritual leader of the church (as is the case with the Church of England) as an irreconcilable compromise, a substitution of king for pope that had been made solely for the matrimonial benefit of Henry VIII. The monarchy’s royal prerogative seemed to them yet another hypocritical corruption of authority, akin to the Catholic Church’s “indulgences.”

Puritans became even more upset when James’s successor, King Charles I, hurriedly married the French Catholic princess Henrietta Maria within two months of his accession, before Parliament could meet to forbid it. Soon after, he began appointing Catholic lords to his court. He appointed William Laud the new Archbishop of Canterbury in 1633. Laud replaced the wooden communion tables with stone altars, installed railings around them, and ordered the use of candles, causing Puritans to complain that he was altering the Anglican churches to be more Catholic. Laud responded by closing Puritan churches and firing nonconformist clergy.

Separatist Congregationalist Puritans began emigrating to America again, fearing a return of Catholic absolutism and partisan political reprisals. Other Puritans organized as dissenters to King Charles’s use of arcane laws to levy personal taxes and his aggressive authoritarian power grab. Edward Coke, now in Parliament, sought to limit the king’s powers, for example by serving as chief author of the Petition of Right, passed in 1628, which laid out several basic rights that the United States would later adopt. Among these were that taxes could be levied only by Parliament, not by the king; that martial law could not be imposed in peacetime; that prisoners had to be able to challenge the legitimacy of their detentions through a writ of habeas corpus; and that soldiers could not be billeted in private residences. But in 1629, Charles rebuffed this attempt, dissolved Parliament, and asserted personal rule by extended royal prerogative. This state of affairs lasted until the English Civil Wars, fought intermittently from 1642 to 1651, which pitted Royalists (authoritarians) against the largely Puritan Parliamentarians (antiauthoritarians) and eventually led to both Laud’s and Charles’s beheadings.

After the Civil Wars ended and the Church of England was restored, many Puritans broke away. At first these “nonconformists” were persecuted anew, but they were eventually tolerated. Because of their emphasis on individual liberty over external authority, their adherents included some of the greatest minds of the age, including Isaac Newton.

Newton provides an example of how the idea of “science” had not yet fully emerged as something separate from religion in early Enlightenment thinking. In fact, during the seventeenth century the word “scientist” did not yet exist; experimenters were called “natural philosophers,” an extension of the Puritan idea of the study of the Book of Nature. Nor was science yet a discretely defined set of disciplines; it was sometimes thought of simply as a method or style of study in the arts. This was true even into Thomas Jefferson’s day. Jefferson himself usually used the word to mean what today we call the hard sciences, but sometimes he used it to refer simply to the rigorous study of other fields, such as the “sciences” of language, mathematics, and philosophy.

By 1663, a time when Puritans were a decided minority in England, 62 percent of the natural philosophers of the famed Royal Society of London were Puritans. Among them, in time, was Newton, who had studied Ibn al-Haytham’s work on light and refraction, and who wrote far more on religion and alchemy than he did on science. Newton believed in the inerrancy of scripture and in biblical prophecy, and held that the apocalypse would come in 2060. He was “not the first of the age of reason. He was the last of the magicians,” said economist John Maynard Keynes, who purchased a collection of Newton’s papers in 1936 and was astounded to find more than one million words on alchemy and four million on theology, dwarfing his scientific work. Newton went on to create calculus and to publish Philosophiae Naturalis Principia Mathematica, or Mathematical Principles of Natural Philosophy (today, Mathematical Principles of Science), upon which modern physics was founded.

Eighty-nine years later, Principia was one of the main sources Thomas Jefferson drew upon for inspiration as he sat in the two second-story rooms he had rented from Jacob Graff in Philadelphia, writing the Declaration of Independence.

The Scientist-Politician

Racked by the threat of war and with its political power resting on uncertain ground, in June 1776 the Continental Congress appointed Jefferson, along with Benjamin Franklin, John Adams, Roger Sherman, and Robert Livingston, to secretly draft the document. The committee delegated the writing of the first draft to Jefferson.

Like Bacon, who had died of pneumonia after conducting an experiment on preserving meat with snow, Jefferson was both an accomplished attorney and a passionate scientist. On July 4, 1776, the day the Continental Congress adopted the Declaration of Independence, Jefferson took the time to record the local temperature on four separate occasions as part of a broader research project he was conducting. His measurements typically also included barometric pressure and wind speed. His goal was to improve meteorological science in order to refine farmers’ almanacs and improve weather forecasting throughout the colonies, both of which were of personal importance to Jefferson as a farmer.

Jefferson also had knowledge of physics, mechanics, anatomy, architecture, botany, archeology, paleontology, and civil engineering. He was an avid astronomer. He carried a small telescope with him wherever he went and recorded the eclipse of 1778 with great precision, although he was frustrated by the cloudy conditions. As president, he commissioned the Lewis and Clark expedition. He sold it to Congress as an economic initiative, but he sent his presidential secretary, Meriwether Lewis, for training with the top scientists of the day and instructed him to conduct it as a scientific expedition.

Jefferson’s love of science is well known among students of science policy. “Science is my passion, politics my duty,” he said. In writing to a friend just prior to the end of his term as president of the United States, he said,

Never did a prisoner, released from his chains, feel such relief as I shall on shaking off the shackles of power. Nature intended me for the tranquil pursuits of science, by rendering them my supreme delight, but the enormity of the times in which I have lived, has forced me to take part in resisting them, and to commit myself to the boisterous ocean of political passions.

Jefferson was also very familiar with Coke, whose Institutes he had studied as a law student. This heady mix of science, law, and politics, and the idea of circumscribing the power of the monarch, would lead Jefferson to carve out a founding document for the United States that was based not on religion or God, but on knowledge and reason. Whereas religious authority and proximity to God could be endlessly argued between different faiths or countries, Jefferson reasoned that a country based on the more narrowly defined rule of men—a democracy—was removed from this, freeing both religion and the government. This being the Enlightenment, Jefferson needed to convince the world’s nations that American independence should be respected as rational and correct, and that they should not intercede in the revolution, so he had to build the most inspiring and unassailable Enlightenment argument possible. As his friend and advisor Benjamin Franklin later noted dryly after signing the declaration Jefferson would craft, “We must all hang together or most assuredly we will all hang separately.” Their very lives would depend on the quality of Jefferson’s argument.

How Do We Know Things?

Holed up in his rented rooms, faced with this awesome responsibility, the thirty-three-year-old took up his quill pen. He considered Francis Bacon, Isaac Newton, and John Locke, whom he had studied at the College of William and Mary, to be the three most important thinkers of all time. He called them “my trinity of the three greatest men the world had ever produced.” Writing on a portable “lap desk” of his own design, he labored to create a document that reflected the clear, axiomatic logic of John Locke, who instructed that “in all sorts of reasoning, every single Argument should be managed as a mathematical demonstration; where the connexion of ideas must be followed till the mind is brought to the source on which it bottoms.” Bottoming his argument out on an irrefutable foundation was what Jefferson needed to do to avoid hanging.

Like Newton and Bacon, Locke was an Englishman and a Protestant, and he is credited with creating the philosophy of empiricism, on which much of modern science is based. He divided human thought into two categories: knowledge and belief. Locke was aware of the many divisions within Christianity, with each faith arguing that it was the one true religion. This was true not only of the great divide between Protestantism and Catholicism, but also of lesser divides between German Lutheranism and English Protestantism, as well as between the Church of England and the dissenters: the Puritans, and within them the sects of Presbyterians, separatist Congregationalists (from whose congregations came the American Pilgrims), and Baptists. They could not all be the one true religion, so some method of ascertaining truth or falsehood had to be developed, or the conflicting claims were likely to go on forever. This led him to ask some fundamental questions: How do we know something to be true? What is the basis of knowledge?

Locke’s An Essay Concerning Human Understanding, published in 1689, just two years after Newton’s Principia, strove to answer those questions by laying out what can be known empirically, how it is that we know it, and the inherent limits of knowledge. Building on Bacon and Ibn al-Haytham, whom he too had studied, he began with observation of the natural world. He then divided knowledge into three types: intuitive, demonstrative, and sensitive.

Intuitive knowledge is “self-evident” to anyone considering it, and it carries the least doubt of the three types of knowledge. That three is more than two, that black is not white, and that a thing is either present or absent are examples of intuitive knowledge.

Demonstrative knowledge, the second type of knowledge, is slightly less certain than intuitive knowledge. Agreement or disagreement is not immediately clear, but instead depends on the use of reason to demonstrate “by necessary consequences, as incontestable as those in mathematics,” that something is so. Each step in a reasoning process—which, as St. Germain had described, was the process of discovering the natural law of things—must in and of itself be intuitively evident. For example:

I can show you using these two apples in my left hand and these two apples in my right hand that two plus two equals four.

or:

A feather falls more slowly in air than a penny does. When we remove the air with a vacuum pump, the feather and the penny fall at the same rate. Therefore, it is the resistance of the air, not a difference in gravity, that makes different objects fall at different rates.

These intermediate steps in reasoning are called “proofs.” Each conclusion is reliable because it is ultimately traceable, step by demonstrable step, back to a self-evident foundation in the natural world. It is this mathematical reasoning that would distinguish knowledge as reliable and separate from opinion.

Locke called the third kind of knowledge “sensitive knowledge,” meaning that we get it directly from our senses. For example, we may become aware of a rose by its scent, then look for its presence.

But, in a nod to Descartes, he acknowledged that our senses are often wrong. Sometimes it’s not a rose we smell, but perfume; sometimes we see not a pond, but a mirage. Locke argued that sensitive knowledge is thus much less certain than intuitive or demonstrative knowledge.

Finally, he said,

Whatever comes short of one of these, with what assurance soever embraced, is but faith, or opinion, but not knowledge, at least in all general truths.

Seventeen Days in June

This approach was critical to Jefferson because it laid the foundational argument for democracy, which was implicit in a different form in Coke’s argument for the primacy of English common law: If we can discover the truth by using reason and observation—i.e. by using science—then anyone can discover the truth, and therefore no one is naturally better able or more entitled to discover the truth than anyone else. Because of this, political leaders and others in positions of authority do not have the right to impose their beliefs on other people. By natural law, the people themselves retain this inalienable right. Based on Locke’s ideas of knowledge, and Coke’s ideas of law, the antiauthoritarian equality of all men in their ability to use reason to discern the truth for themselves is logically self-evident. It is intuitive knowledge. And that’s the heart of—and the most powerful argument for—democracy.

Jefferson worked for seventeen days to craft a document that was grand and yet achieved the unassailable quality of a logical proof. The axiomatic beauty of the argument he was reaching for would indelibly tie science, knowledge, law, freedom, and democracy together in a single common cause of human advancement, and it would proclaim the inalienable right of the people to reject authoritarian tyranny as illegitimate.

But despite his best intentions, in his rough draft, Jefferson foundered on the shoals of authoritarian religious assumptions left over from Hobbes’s bleak and brutal era—a fact that illustrates how deeply rooted these assumptions are, even for a scientist like Jefferson, and how slow and careful a process is required to tease out what is knowledge from what, to quote Locke, is “but faith, or opinion.” The misstep occurred in the opening of the second paragraph, when Jefferson wrote:

We hold these truths to be sacred and undeniable; that all men are created equal.

The Edit That Changed the World

When Jefferson showed his draft to Franklin, Franklin made several firm, bold deletions, striking the words “sacred and undeniable.” Drawing on Locke, whom he too admired, he replaced Jefferson’s reference to divine authority with the antiauthoritarian words “self-evident,” which Locke had used in his Essay:

The idea of a supreme being, infinite in power, goodness, and wisdom, whose workmanship we are, and on whom we depend; and the idea of ourselves, as understanding rational beings, being such as are clear in us, would, I suppose, if duly considered and pursued, afford such foundations of our duty and rules of action, as might place morality amongst the sciences capable of demonstration: wherein I doubt not but from self-evident propositions, by necessary consequences, as incontestable as those in mathematics, the measures of right and wrong might be made out to any one that will apply himself with the same indifferency [sic] and attention to the one, as he does to the other of these sciences.

Franklin’s edit, it may be argued, helped to make the United States into the scientific and technological powerhouse it became, and helped to define democracy as a secular form of government instead of a theocratic one. At the time America’s most renowned scientist, Franklin was also an admirer of Newton’s Principia and a friend of the Scottish philosopher David Hume. Hume had written extensively on natural law and liberty, which Jefferson had drawn on in the sentence, and he defined liberty as freedom of choice:

By liberty, then, we can only mean a power of acting or not acting, according to the determinations of the will; that is, if we choose to remain at rest, we may; if we choose to move, we also may.

And even though Newton did not see a conflict between science and religion, neither did he insist upon applying religious thinking to the realm of science, which is the realm of “understanding,” as he put it. “A man may imagine things that are false,” Newton said, “but he can only understand things that are true.”

Newton and Hume both instead rested their arguments on empiricism, “bottoming them out” in the natural world with evidence, and so it had to be with Jefferson’s argument for liberty. Hume argued,

Whatever definition we may give of liberty, we should be careful to observe two requisite circumstances; first, that it be consistent with plain matter of fact; secondly, that it be consistent with itself. If we observe these circumstances, and render our definition intelligible, I am persuaded that all mankind will be found of one opinion with regard to it.

This is the persuasive power that Jefferson was reaching for by tying his arguments back to the plain matter of fact laid bare by his venerated “trinity” of three great men, together with the aggregated authority of grave and learned men in English common law as established by Coke, so that “all mankind [would] be found of one opinion with regard to” the right of the United States to declare its independence.

Franklin understood that Jefferson’s words had inadvertently confused the realms of knowledge and faith, resting the principle being argued—that all men are created equal and are endowed by their creator with certain inalienable rights—on an authoritarian, religious assertion, which, as Locke himself had shown, could be argued indefinitely. It was therefore weak as a political argument, a matter of mere belief, and anyone with a slightly different interpretation of faith could simply disregard it.

Franklin knew Jefferson was reaching for something more powerful, and he knew how to take it there. He instead rested the principle on reason and Locke’s intuitive knowledge, moving the founding argument for the United States firmly out of the realm of religious authority (as in “sacred and undeniable,” i.e., “God is on our side,” or “God save the monarchy,” always arguable assertions) and into the realm of man, reason, and the laws of nature that flowed from empiricism, antiauthoritarianism, and nature itself.

It was self-evident.

In the process they created something entirely new: a nation that respected and tolerated religion in every sense, but did not base its authority on religion. A nation whose authority was instead based on the underlying principles of liberty, reason, and science.
