GAUGUIN’S QUESTIONS

THE FRENCH PAINTER and writer Paul Gauguin — by most accounts mad, bad, and dangerous to know — suffered acutely from cosmological vertigo induced by the work of Darwin and other Victorian scientists.

In the 1890s, Gauguin ran away from Paris, family, and stockbroking career to paint (and bed) native girls in the tropics. Like many a troubled soul, he could not escape so easily from himself, despite great efforts to do so with the help of drink and opium. At the bottom of his disquiet lay a longing to find what he called the “savage” — primordial man (and woman), humanity in the raw, the elusive essence of our kind. This quest eventually drew him to Tahiti and other South Sea islands, where traces of a pre-contact world — an unfallen world, in his eyes — lingered beneath the cross and tricolore.

In 1897, a mail steamer docked at Tahiti bringing terrible news. Gauguin’s favourite child, Aline, had died suddenly from pneumonia. After months of illness, poverty, and suicidal despair, the artist harnessed his grief to produce a vast painting — more a mural in conception than a canvas1 — in which, like the Victorian age itself, he demanded new answers to the riddle of existence. He wrote the title boldly on the image: three childlike questions, simple yet profound. “D’Où Venons Nous? Que Sommes Nous? Où Allons Nous?” Where do we come from? What are we? Where are we going?

The work is a sprawling panorama of enigmatic figures amid scenery that might be the groves of heathen Tahiti or an unruly Garden of Eden: worshippers or gods; cats, birds, a resting goat; a great idol with a serene expression and uplifted hands seeming to point at the beyond; a central figure plucking fruit; an Eve, the mother of mankind, who is not a voluptuous innocent like other women in Gauguin’s work but a withered hag with a piercing eye inspired by a Peruvian mummy. Another figure turns in amazement to a young human pair who, as the artist wrote, “dare to consider their destiny.”2

Gauguin’s third question — Where are we going? — is what I want to address in this book. It may seem unanswerable. Who can foretell the human course through time? But I think we can answer it, in broad strokes, by answering the other two questions first. If we see clearly what we are and what we have done, we can recognize human behaviour that persists through many times and cultures. Knowing this can tell us what we are likely to do, where we are likely to go from here.

Our civilization, which subsumes most of its predecessors, is a great ship steaming at speed into the future. It travels faster, further, and more laden than any before. We may not be able to foresee every reef and hazard, but by reading her compass bearing and headway, by understanding her design, her safety record, and the abilities of her crew, we can, I think, plot a wise course between the narrows and bergs looming ahead.

And I believe we must do this without delay, because there are too many shipwrecks behind us. The vessel we are now aboard is not merely the biggest of all time; it is also the only one left. The future of everything we have accomplished since our intelligence evolved will depend on the wisdom of our actions over the next few years. Like all creatures, humans have made their way in the world so far by trial and error; unlike other creatures, we have a presence so colossal that error is a luxury we can no longer afford. The world has grown too small to forgive us any big mistakes.

Despite certain events of the twentieth century, most people in the Western cultural tradition still believe in the Victorian ideal of progress, a belief succinctly defined by the historian Sidney Pollard in 1968 as “the assumption that a pattern of change exists in the history of mankind… that it consists of irreversible changes in one direction only, and that this direction is towards improvement.”3 The very appearance on earth of creatures who can frame such a thought suggests that progress is a law of nature: the mammal is swifter than the reptile, the ape subtler than the ox, and man the cleverest of all. Our technological culture measures human progress by technology: the club is better than the fist, the arrow better than the club, the bullet better than the arrow. We came to this belief for empirical reasons: because it delivered.

Pollard notes that the idea of material progress is a very recent one — “significant only in the past three hundred years or so”4 — coinciding closely with the rise of science and industry and the corresponding decline of traditional beliefs.5 We no longer give much thought to moral progress — a prime concern of earlier times — except to assume that it goes hand in hand with the material. Civilized people, we tend to think, not only smell better but behave better than barbarians or savages. This notion has trouble standing up in the court of history, and I shall return to it in the next chapter when considering what is meant by “civilization.”

Our practical faith in progress has ramified and hardened into an ideology — a secular religion which, like the religions that progress has challenged, is blind to certain flaws in its credentials. Progress, therefore, has become “myth” in the anthropological sense. By this I do not mean a belief that is flimsy or untrue. Successful myths are powerful and often partly true. As I’ve written elsewhere: “Myth is an arrangement of the past, whether real or imagined, in patterns that reinforce a culture’s deepest values and aspirations…. Myths are so fraught with meaning that we live and die by them. They are the maps by which cultures navigate through time.”6

The myth of progress has sometimes served us well — those of us seated at the best tables, anyway — and may continue to do so. But I shall argue in this book that it has also become dangerous. Progress has an internal logic that can lead beyond reason to catastrophe. A seductive trail of successes may end in a trap.

Take weapons, for example. Ever since the Chinese invented gunpowder, there has been great progress in the making of bangs: from the firecracker to the cannon, from the petard to the high explosive shell. And just when high explosives were reaching a state of perfection, progress found the infinitely bigger bang in the atom. But when the bang we can make can blow up our world, we have made rather too much progress.

Several of the scientists who created the atomic bomb recognized this in the 1940s, telling politicians and others that the new weapons had to be destroyed. “The unleashed power of the atom has changed everything save our modes of thinking,” Albert Einstein wrote, “and we thus drift toward unparalleled catastrophes.” And a few years later, President Kennedy said, “If mankind does not put an end to war, war will put an end to mankind.”

When I was a boy, in the 1950s, the shadow of too much progress in weaponry — of Hiroshima, Nagasaki, and vaporized Pacific islands — had already fallen over the world. It has now darkened our lives for about sixty years, and so much has been said on the subject that I needn’t add more.7 My point here is that weapons technology was merely the first area of human progress to reach an impasse by threatening to destroy the planet on which it developed.

At the time, this progress trap was seen as an aberration. In all other fields, including those of nuclear power and chemical pesticides, the general faith in progress was largely unshaken. Advertisements of the 1950s showed a smiling “Mrs. 1970,” who, having bought the right brand of vacuum cleaner, was enjoying the future in advance. Each year’s motor car looked different from the previous year’s (especially if it wasn’t). “Bigger! Wider! Longer!” sang the girls in a jingle, automakers being keen, then as now, to sell bigger as better. And peasants were freed from vermin with generous dustings of DDT in what became known as the Third World — that unravelling tapestry of non-Western cultures seen as a relic of “backwardness” torn between the superpowers. In both its capitalist and communist versions, the great promise of modernity was progress without limit and without end.

The collapse of the Soviet Union led many to conclude that there was really only one way of progress after all. In 1992 Francis Fukuyama, a former U.S. State Department official, declared that capitalism and democracy were the “end” of history — not only its destination but its goal.8 Doubters pointed out that capitalism and democracy are not necessarily bedfellows, citing Nazi Germany, modern China, and the worldwide archipelago of sweatshop tyrannies. Yet Fukuyama’s naive triumphalism strengthened a belief, mainly on the political right, that those who have not chosen the true way forward should be made to do so for their own good — by force, if necessary. In this respect, and in the self-interest it obscures, the current ideology of progress resembles the missionary projects of past empires, whether seventh-century Islam, sixteenth-century Spain, or nineteenth-century Britain.

Since the Cold War ended, we have held the nuclear genie at bay but have not begun to stuff it back in its bottle. Yet we are busy unleashing other powerful forces — cybernetics, biotechnology, nanotechnology — that we hope will be good tools, but whose consequences we cannot foresee.

The most immediate threat, however, may be nothing more glamorous than our own waste. Like most problems with technology, pollution is a problem of scale. The biosphere might have been able to tolerate our dirty old friends coal and oil if we’d burned them gradually. But how long can it withstand a blaze of consumption so frenzied that the dark side of this planet glows like a fanned ember in the night of space?

Alexander Pope said, rather snobbishly, that a little learning is a dangerous thing; Thomas Huxley later asked, “Where is the man who has so much as to be out of danger?”9 Technology is addictive. Material progress creates problems that are — or seem to be — soluble only by further progress. Again, the devil here is in the scale: a good bang can be useful; a better bang can end the world.

So far I have spoken of such problems as if they were purely modern, arising from industrial technologies. But while progress strong enough to destroy the world is indeed modern, the devil of scale who transforms benefits into traps has plagued us since the Stone Age. This devil lives within us and gets out whenever we steal a march on nature, tipping the balance between cleverness and recklessness, between need and greed.

Palaeolithic hunters who learnt how to kill two mammoths instead of one had made progress. Those who learnt how to kill 200 — by driving a whole herd over a cliff — had made too much. They lived high for a while, then starved.

Many of the great ruins that grace the deserts and jungles of the earth are monuments to progress traps, the headstones of civilizations which fell victim to their own success. In the fates of such societies — once mighty, complex, and brilliant — lie the most instructive lessons for our own. Their ruins are shipwrecks that mark the shoals of progress. Or — to use a more modern analogy — they are fallen airliners whose black boxes can tell us what went wrong. In this book, I want to read some of these boxes in the hope that we can avoid repeating past mistakes, of flight plan, crew selection, and design. Of course, our civilization’s particulars differ from those of previous ones. But not as much as we like to think. All cultures, past and present, are dynamic. Even the most slow-moving were, in the long run, works in progress. While the facts of each case differ, the patterns through time are alarmingly — and encouragingly — similar. We should be alarmed by the predictability of our mistakes but encouraged that this very fact makes them useful for understanding what we face today.

Like Gauguin, we often prefer to think of the deep past as innocent and unspoiled, a time of ease and simple plenty before a fall from paradise. The words “Eden” and “Paradise” feature prominently in the titles of popular books on anthropology and history. For some, Eden was the pre-agricultural world, the age of hunting and gathering; for others, it was the pre-Columbian world, the Americas before the white man; and for many, it was the pre-industrial world, the long stillness before the machine. Certainly there have been good and bad times to be alive. But the truth is that human beings drove themselves out of Eden, and they have done it again and again by fouling their own nests. If we want to live in an earthly paradise, it is up to us to shape it, share it, and look after it.

In pondering his first question — Where do we come from? — Gauguin might have agreed with G. K. Chesterton, who remarked, “Man is an exception, whatever else he is.… If it is not true that a divine being fell, then we can only say that one of the animals went entirely off its head.”10 We now know much more about that 5-million-year process of an ape going off its head, so it is hard, today, to recapture the shock felt around the world when the implications of evolutionary theory first became clear.

Writing in 1600, Shakespeare had Hamlet exclaim, “What a piece of work is a man! How noble in reason! how infinite in faculty!… in action how like an angel! in apprehension how like a god!”11 His audience would have shared Hamlet’s mix of wonder, scorn, and irony at human nature. But very few, if any, would have doubted that they were made as the Bible told: “And God said, Let us make man in our image, after our likeness.”

They were prepared to overlook theological rough spots posed by sex, race, and colour. Was God black or blond? Did he have a navel? And what about the rest of his physical equipment? Such things didn’t bear thinking about too closely. Our kinship with apes, which seems so obvious now, was unsuspected; apes were seen (if seen, which was rarely in Europe in those days) as parodies of man, not cousins or possible forebears.

If they thought about it at all, most people of 1600 believed that what we now call scientific method would simply open and illuminate the great clockwork set in place by Providence, as God saw fit to let humans share in admiration of his handiwork. Galileo’s troubling thoughts about the structure of the heavens were an unexploded bomb, unproven and unassimilated. (Hamlet still subscribes to a pre-Copernican universe, a “brave o’er-hanging firmament.”) The inevitable collision between scriptural faith and empirical evidence was barely guessed at. Most of the really big surprises — the age of the earth, the origin of animals and man, the shape and scale of the heavens — still lay ahead. Most people of 1600 were far more alarmed by priests and witches than by natural philosophers, though the lines between these three were often unclear.

From the biblical definition of man, and the commonsense principle that it takes one to know one, Hamlet thinks he knows what a human being is, and most Westerners continued to think they knew what they were for another 200 years. The rot of rational doubt on the matter of our beginnings did not set in until the nineteenth century, when geologists realized that the chronology in the Bible could not account for the antiquity they read in rocks, fossils, and sediments. Some civilizations, notably the Maya and the Hindu, assumed that time was vast or infinite, but ours always had a petty notion of its scale. “The poor world is almost six thousand years old,” sighs Rosalind in As You Like It,12 a typical estimate derived from the patriarchal lifetimes, “begats,” and other clues in the Old Testament. Half a century after Rosalind’s sigh, Archbishop Ussher of Armagh and his contemporary John Lightfoot took it upon themselves to pinpoint the very moment of Creation. “Man was created by the Trinity,” Lightfoot declared, “on October 23, 4004 B.C., at nine o’clock in the morning.”13

Such precision was new, but the idea of a young earth had always been essential to the Judaeo-Christian view of time as teleological — a short one-way trip from Creation to Judgment, from Adam to Doom. Newton and other thinkers began to voice doubts about this on theoretical grounds, but they had no real evidence or means of testing their ideas. Then, in the 1830s, while the young Charles Darwin was sailing round the world aboard the Beagle, Charles Lyell published his Principles of Geology, arguing that the earth transformed itself gradually, by processes still at work, and might therefore be as old as Newton had proposed — some ten times older than the Bible allowed.14

Under Queen Victoria, the earth aged quickly — by many millions of years in decades — enough to make room for Darwin’s evolutionary mechanism and the growing collection of giant lizards and lowbrowed fossil humans being dug up around the world and put on show in South Kensington and the Crystal Palace.15

In 1863, Lyell brought out a book called Geological Evidences of the Antiquity of Man, and in 1871 (twelve years after his Origin of Species), Darwin published The Descent of Man. Their ideas were spread by enthusiastic popularizers, above all Thomas Huxley, famous for saying, in a debate on evolution with Bishop Wilberforce, that he would rather acknowledge an ape for his grandfather than be a clergyman careless with the truth.16 Hamlet’s exclamation therefore became a question: What exactly is a man? Like children who reach an age when they’re no longer satisfied that a stork brought them into the world, a newly educated public began to doubt the old mythology.

By the time Gauguin was painting his masterpiece at the end of the century, the first two of his questions were getting concrete answers. His compatriot Madame Curie and others working on radioactivity were uncovering nature’s timekeepers: elements in rock that break down at a measurable rate. By 1907, the physicists Boltwood and Rutherford could show that the earth’s age is reckoned not in millions of years but in billions.17 Archaeology showed that the genus Homo was a latecomer, even among mammals, taking shape long after early pigs, cats, and elephants began walking the earth (or, in the case of whales, gave up walking and went swimming). “Man,” wrote H. G. Wells, “is a mere upstart.”18

What was extraordinary about human development — the one big thing that set us apart from other creatures — was that we “leveraged” natural evolution by developing cultures transmissible through speech from one generation to the next. “The human word,” Northrop Frye wrote in another context, “is the power that orders our chaos.”19 The effect of this power was unprecedented, allowing complex tools, weapons, and elaborate planned behaviours. Even very simple technology had enormous consequences. Basic clothing and built shelter, for example, opened up every climate from the tropics to the tundra. We moved beyond the environments that had made us, and began to make ourselves.

Though we became experimental creatures of our own devising, it’s important to bear in mind that we had no inkling of this process, let alone its consequences, until only the last six or seven of our 100,000 generations. We have done it all sleepwalking. Nature let a few apes into the lab of evolution, switched on the lights, and left us there to mess about with an ever-growing supply of ingredients and processes. The effect on us and the world has accumulated ever since. Let’s list a few steps between the earliest times and this: sharp stones, animal skins, useful bits of bone and wood, wild fire, tame fire, seeds for eating, seeds for planting, houses, villages, pottery, cities, metals, wheels, explosives. What strikes one most forcefully is the acceleration, the runaway progression of change — or to put it another way, the collapsing of time. From the first chipped stone to the first smelted iron took nearly 3 million years; from the first iron to the hydrogen bomb took only 3,000.

The Old Stone Age, or Palaeolithic era, lasted from the appearance of toolmaking hominids, nearly 3 million years ago, until the melting of the last ice age, about 12,000 years ago. It spans more than 99.5 per cent of human existence. During most of that time, the pace of change was so slow that entire cultural traditions (revealed mainly by their stone tool kits) replicated themselves, generation after generation, almost identically over staggering periods of time. It might take 100,000 years for a new style or technique to be developed; then, as culture began to ramify and feed on itself, only 10,000; then mere thousands and centuries. Cultural change begat physical change and vice versa in a feedback loop.

Nowadays we have reached such a pass that the skills and mores we learn in childhood are outdated by the time we’re thirty, and few people past fifty can keep up with their culture — whether in idiom, attitudes, taste, or technology — even if they try. But I am getting ahead in the story. Most people living in the Old Stone Age would not have noticed any cultural change at all. The human world that individuals entered at birth was the same as the one they left at death. There was variation in events, of course — feasts, famines, local triumphs and disasters — but the patterns within each society must have seemed immutable. There was just one way to do things, one mythology, one vocabulary, one set of stories; things were just the way they were.

It is possible to imagine exceptions to what I have just said. The generation that saw the first use of fire, for instance, was perhaps aware that its world had changed. But we can’t be sure how quickly even that Promethean discovery took hold. Most likely, fire was used, when available from wildfires and volcanoes, for a long time before it was kept. And then it was kept for a very long time before anyone learnt it could be made. Some may remember the 1981 film Quest for Fire, in which the lithe figure of Rae Dawn Chong scampers about in nothing but a thin layer of mud and ashes. The film was based on a novel published in 1911 by the Belgian writer J. H. Rosny.20 Rosny’s original title was La Guerre du Feu — The War for Fire — and the book, more than the film, explores deadly competition between various human groups to monopolize fire in much the same way that modern nations try to monopolize nuclear weapons. Throughout the hundreds of centuries when our ancestors tended a flame but could not make one, putting out their rivals’ campfire in an Ice Age winter would have been a deed of mass murder.

The first taming of fire is hard to date. All we know is that people were using fire by at least half a million years ago, possibly twice that.21 This was the time of Homo erectus, the “upright man,” who was much like us from the neck down, but whose braincase had only about two-thirds the modern capacity. Anthropologists are still debating when Homo erectus first appeared and when he and she were superseded, which is largely a matter of defining that evolutionary stage. Scholars are even more divided on how well erectus could think and speak.

Modern apes, whose brains are much smaller than those of erectus, use simple tools, have wide knowledge of medicinal plants, and can recognize themselves in a mirror. Studies using non-verbal language (computer symbols, sign language, etc.) show that apes can employ a vocabulary of several hundred “words,” though there is disagreement on what this ability says about ape communication in the wild. It is clear that different groups of the same species — for example, chimps in separate parts of Africa — have different habits and traditions, passed on to the young just as in human groups. In short, apes have the beginnings of culture. So do other intelligent creatures, such as whales, elephants, and certain birds, but no species except humankind has reached the point at which culture becomes the main driver of an evolutionary surge, outrunning environmental and physical constraints.

The bloodlines of man and ape split about 5 million years ago, and as I mentioned, hominids making crude stone tools appeared some 2 million years later. It would therefore be foolish to underestimate the skills of Homo erectus, who, by the time he was toasting his callused feet at a campfire half a million years ago, was nine-tenths of the way along the road from an ancestral ape to us. With the taming of fire came the first spike on the graph of human numbers. Fire would have made life much easier in many environments. Fire kept caves warm and big predators away. Cooking and smoking greatly increased the reliable food supply. Burning of undergrowth extended grazing lands for game. It is now recognized that many supposedly wild landscapes inhabited down to historic times by hunter-gatherers — the North American prairies and the Australian outback, for instance — were shaped by deliberate fire-setting.22 “Man,” wrote the great anthropologist and writer Loren Eiseley, “is himself a flame. He has burned through the animal world and appropriated its vast stores of protein for his own.”23

About the last big thing the experts agree on is that Homo erectus originated in Africa, the home of all early hominids, and by a million years ago was living in several temperate and tropical zones of the “Old World,” the contiguous Eurasian landmass. This is not to say the Upright Man was thick on the ground, even after he tamed fire. Perhaps fewer than 100,000 people, scattered in family bands, were all that stood between evolutionary failure and the 6 billion of us here today.24

After Homo erectus the evolutionary path gets muddy, trodden into a mire by rival tribes of anthropologists. One camp, that of the “multiregional” hypothesis, sees Homo erectus evolving by fits and starts into modern humanity wherever he happened to be through gene diffusion, otherwise known as mating with strangers. This view seems to fit well with many of the fossil finds but less well with some interpretations of DNA. Another camp — the “Out of Africa” school — sees most evolutionary change taking place on that continent, then erupting over the rest of the world.25 In this second view, successive waves of new and improved humans kill off, or, at any rate, outcompete, their forerunners wherever they find them, until all the lowbrows are gone. This theory implies that each new wave of African man was a separate species, unable to breed with other descendants of the previous kind — which may be plausible if different types evolved without contact for long periods but is less likely over shorter spans of time.26

The debate over the path of human progress gets most heated when we reach our controversial cousins, the Neanderthals. These lived mainly in Europe and northwest Asia in quite recent times — well within the last one-twentieth of the human journey. A Neanderthal Gauguin, thawed out from a receding glacier today, might wake up and ask, “Who were we? Where did we come from? Where did we go?” The answers would depend on whom he approached. Experts cannot even agree on his scientific name.

In round figures, Neanderthals appear about 130,000 years ago and disappear about 100,000 years later. Their “arrival” date is less certain than their departure, but it seems they evolved at about the same time as early examples of what is thought to be our modern kind — often called Cro-Magnon, after a rockshelter in the lovely Dordogne region of southern France, where the human fossil record is the richest in the world.

Ever since they were first identified, Neanderthals have been the butt of what I call “palaeo-racism,” lampooned as cartoon cavemen, a subhuman, knuckle-dragging breed. H. G. Wells called them the “Grisly Folk” and made an unflattering guess at how they might have looked: “an extreme hairiness, an ugliness… a repulsive strangeness in his… low forehead, his beetle brows, his ape neck, and his inferior stature.”27 Many have claimed that Neanderthals were cannibals, which could be true, for so are we — later humans have a long record of cannibalism, right down to modern times.28

The first Neanderthal skeleton was unearthed in 1856 from a cave in a valley near Düsseldorf, Germany. The place had been named after the composer Joachim Neumann, who had rather affectedly rendered his surname into Greek as “Neander.” Englished, Neanderthal is simply “Newmandale.” Fitting enough: a new man had indeed come to light in the dale, a new man at least 30,000 years old. Not that Neanderthal Man’s seniority was recognized immediately. The French, noting the skull’s thickness, were inclined to think it had belonged to a German. The Germans said it was most likely from a Slav, a Cossack mercenary who had crawled into the cave and died.29 But just three years later, in 1859, two things happened: Darwin published On the Origin of Species and Charles Lyell, visiting the gravels of the River Somme (to become infamous, not sixty years later, as a human slaughterhouse), recognized chipped flints as weapons from the Ice Age.

Once the scientists of the day had acknowledged that the Neanderthaler wasn’t a Cossack, they cast him in the newly minted role of the “missing link” — that elusive creature loping halfway across the evolutionary page between an ape and us. The New Man became the right man at the right time, the one who, “in his glowering silence and mystery, would show… the unthinkable: that humans were animals.”30 It was assumed that he had little or no power of speech, ran like a baboon, and walked on the outsides of his feet. But as more bones were unearthed and analysed, this view did not stand up. The most “apelike” skeletons were found to be sufferers from osteoarthritis, severely crippled individuals who had evidently been supported for years by their community. Evidence also came to light that the “grisly folk” had not only cared for their sick but also buried their dead with religious rites — with flowers and ochre and animal horns — the first people on earth known to do so. And last but not least, the Neanderthal brain turned out to be bigger than our own. Perhaps Homo neanderthalensis was really not so brutish after all. Perhaps he deserved to be promoted to a subspecies of modern man: Homo sapiens neanderthalensis. And if that were so, the two variants could, by definition, have interbred.31

Before the two began to compete in Europe, the Cro-Magnons lived south of the Mediterranean and the Neanderthals north. Then as now, the Middle East was a crossroads. Dwelling sites in that turbulent region show occupation by both Neanderthals and Cro-Magnons beginning about 100,000 years ago. We can’t tell whether they ever lived there at exactly the same times, let alone whether they shared the Holy Land harmoniously. Most likely their arrangement was a kind of time-share, with Neanderthals moving south out of Europe during especially cold spells in the Ice Age and Cro-Magnons moving north from Africa whenever the climate warmed. What is most interesting is that the material culture of the two groups, as shown by their artefacts, was identical over a span of more than 50,000 years. Archaeologists find it difficult to say whether any given cave was occupied by Neanderthals or Cro-Magnons unless human bone is found with the tools. I take this as strong evidence that the two groups had very similar mental and linguistic capabilities, that neither was more primitive or “less evolved.”

No Neanderthal flesh, skin, or hair has yet come to light, so we can’t say whether these people were brown or blond, hairy as Esau or smooth as Jacob. Nor do we know much about the Cro-Magnons’ superficial appearance, though genetic studies suggest that most modern Europeans may be descended from them.32 We know these populations only by their bones. Both were roughly the same height, between five and six feet tall with the usual variation between sexes. But one was built for strength and the other for speed. The Neanderthal was heavyset and brawny, like a professional weightlifter or wrestler. The Cro-Magnon was slighter and more gracile, a track athlete rather than a bodybuilder. It is hard to know how far these differences were innate, and how much they reflected habitat and lifestyle. In 1939, the anthropologist Carleton Coon drew an amusing reconstruction of a Neanderthal cleaned up, shaved, and dressed in a fedora, jacket, and tie. Such a man, Coon remarked, might pass unnoticed on the New York subway.

As such analogies suggest, the variation between Neanderthal and Cro-Magnon skeletons does not fall far outside the range of modern humans. Put side by side, the bony remains of Arnold Schwarzenegger and Woody Allen might exhibit a similar contrast. The skull, however, is another matter. The so-called classic Neanderthal (which is a rather misleading term because it is self-fulfilling, based on the more pronounced examples) had a long, low skull with strong brow ridges in front and a bony ledge across the nape of the neck, the Neanderthal “bun” or “chignon.” The jaw was robust, with strong teeth and a rounded chin; the nose was broad and presumably squat. At first glance the design looks archaic, much the same architecture as that of Homo erectus. But — as noted — the Neanderthal brain was bigger on average than the Cro-Magnon. Coon’s subway rider had a thick skull but not necessarily a thick head.

What this adds up to, I think, is that the supposedly archaic characteristics of the Neanderthal were in fact an overlay of cold-climate adaptations on an essentially modern human frame.33 The high foreheads of modern people can get so chilled that the brain is damaged, and icy air can freeze the lungs. The Neanderthal brain was sheltered by the massive brows and the low, yet roomy, vault. Air entering Neanderthal lungs was warmed by the broad nose, and the whole face had a better blood supply. Thickset, brawny people do not lose body heat as quickly as slender people. Signs of similar adaptation (in body shape, at least) can be seen among modern Inuit, Andeans, and Himalayans — and this after only a few thousand years of living with intense cold, beside the 100,000 during which Europe’s Neanderthals made their living on the front lines of the Ice Age.

Things seem to have gone well enough for them until Cro-Magnons began moving north and west from the Middle East, about 40,000 years ago. Until then, the cold had been the Neanderthals’ great ally, always turning invaders back sooner or later, like the Russian winter. But this time the Cro-Magnons came to stay. The invasion seems to have coincided with climatic instability linked to sudden reversals of ocean currents that caused freezing and thawing of the North Atlantic in upsets as short as a decade.34 Such sharp changes — severe as the worst predictions we now have for global warming — would have devastated animal and plant communities on which the Neanderthals depended. We know that they ate a lot of big game, which they hunted by ambush — breaks in their bones are similar to those sustained by rodeo cowboys, showing they went in close for the kill. And we know that they were not usually nomadic, occupying the same caves and valleys year-round. Humans in general have been called a “weed species,” thriving in disrupted environments, but of these two groups, the Neanderthals were the more rooted. The Cro-Magnons were the invasive briars. Climate change would have made life difficult for everyone, of course, but unstable conditions could have given the edge to the less physically specialized, weaker at close quarters but quicker on their feet.

I remember seeing a cartoon when I was a schoolboy — I think it may have been in Punch — showing three or four bratty Neanderthal children standing on a cliff, badgering their father: “Daddy, Daddy! Can we go and throw rocks at the Cro-Magnons today?” For about ten millennia, from 40,000 to 30,000 years ago, the late Neanderthals and the early Cro-Magnons probably did throw rocks at each other, not to mention dousing campfires, stealing game, and perhaps seizing women and children. At the end of that unimaginably long struggle, Europe and the whole world belonged to our kind, and the “classic” Neanderthal was gone forever. But what really happened? Did the Neanderthal line die out, or was it to some degree assimilated?

The 10,000-year struggle was so gradual that it may have been scarcely perceptible — a fitful, inconclusive war with land lost and won at the rate of a few miles in a lifetime. Yet, like all wars, it sparked innovation. New tools and weapons appeared, new clothing and rituals, the beginnings of cave painting (an art form that would reach its height during the last great fling of the Ice Age, after the classic Neanderthals had gone). We also know that cultural contact went both ways. Late Neanderthal sites in France show change and adaptation at a pace never seen before.35 By then, near the end, the war’s implications must have become dreadfully clear. It seems that the last Neanderthal bands held out in the mountains of Spain and Yugoslavia, driven like Apaches into rougher and rougher terrain.

If the warfare picture I have sketched has any truth to it, then we face unpalatable conclusions. This is what makes the Neanderthal debate so emotional: it is not only about ancient people but about ourselves. If it turns out that the Neanderthals disappeared because they were an evolutionary dead end, we can merely shrug and blame natural selection for their fate. But if they were in fact a variant or race of modern man, then we must admit to ourselves that their death may have been the first genocide. Or, worse, not the first — merely the first of which evidence survives. It may follow from this that we are descended from a million years of ruthless victories, genetically predisposed by the sins of our fathers to do likewise again and again. As the anthropologist Milford Wolpoff has written on this period: “You can’t imagine one human population replacing another except through violence.”36 No, you can’t — especially on the bloodstained earth of Europe, amid Stone Age forebodings of the final solution and the slaughter of the Somme.

In the aftermath of the Second World War, William Golding explored ancient genocide in his extraordinary novel The Inheritors. With wonderful assurance, Golding takes the reader inside the minds of an unnamed group of early humans. The book’s epigraph, from Wells, invokes Neanderthals, though the anthropological specifics fit better with much earlier stages of mankind. Golding’s folk are gentle, naive, chimp-like woodland dwellers. They eat no meat except the leavings of big predators; they are poor speakers, using telepathy as much as language; they have fire but few weapons, and have never suspected there is anyone else in the world except themselves.

Yet Golding’s anachronisms don’t matter: his people may not fit any particular set of bones from the real past, but they stand for many. In the course of a few spring days, the forest dwellers are invaded for the first time by people like us, who with their boats, bonfires, arrows, raucous voices, wholesale tree-felling, and drunken orgies baffle and fascinate the “forest devils” even as they kill them one by one. At the end, only a mewling baby remains, kept by a woman who has lost her own child to drain the milk from her breasts. The invaders then move on through the new land, their leader plotting further murders — murders now amongst themselves — as he sharpens a weapon, “a point against the darkness of the world.”

Golding had no doubt that the ruthless were the winners of prehistory, but another question he raised is still unsettled: Does any Neanderthal blood flow in modern humans? How likely is it that during 10,000 years of interaction, there was no sex, unconsensual though it may have been? And if there was sex, were there children? DNA studies on Neanderthal remains have been inconclusive so far.37 But the skeleton of a child found recently in Portugal strongly suggests interbreeding, as do bones from Croatia and elsewhere in the Balkans.38

I also have personal evidence that Neanderthal genes may still be with us. A few modern people have telltale ridges on their heads.39 I happen to have one — a bony shelf across the back of the skull that looks and feels like the Neanderthal bun. So until new findings come along to settle the matter, I choose to believe that Neanderthal blood still flows, however faint, in the Cro-Magnon tide.40

Despite the many details of our ancestry still to be worked out, the twentieth century has broadly answered the first two of Gauguin’s questions. There is no room for rational doubt that we are apes, and that, regardless of our exact route through time, we come ultimately from Africa. But unlike other apes, we tamper, and are tampering more than ever, with our destiny. For a long time now, there has been no such thing as that Enlightenment wild goose which Gauguin sought, the Natural Man. Like those arthritic Neanderthals who were cared for by their families, we cannot live without our cultures. We have met the maker of Hamlet’s “piece of work” — and it is us.