INTRODUCTION ‘The State of the World’
The choice of Beijing, capital of China, as the host city for the 2008 Olympic Games has produced an extraordinary, if brief, historic marriage of East and West. The games symbolize the world of classical Greece, whose legacy has played such an exceptional part in the development of the Western world. Greek civilization gave the West professional medicine, geometry, ethical speculation, democracy, an ideal of participatory citizenship, codified law, the first history, a science of politics and an artistic heritage imitated again and again down the ages. Many of our common terms today—from economics to psychiatry—are Greek in origin.
China, on the other hand, is the seat of the most ancient and continuous of civilizations. Always the site of the largest fraction of the world’s population, China for thousands of years, despite waves of invasions, sustained a way of life and a social structure which proved remarkably enduring. Chinese values and intellectual life were not, unlike Greek civilization, diffused widely outside the frontiers of what was loosely defined as ‘China’. Western critics in the 19th century regarded China as a stagnant culture, unmoved for centuries, but the artistic, scientific and intellectual life of China, though very different from that of the West, was rich and diverse. A good case can be made for arguing that China has been a fixed point throughout the period of recorded history, whereas Greek culture has been anything but continuous, relying for much of its survival on the intercession of the Arab cultures of the Middle East that succeeded the Roman Empire, in which aspects of Greek thought were kept alive and then re-exported to late medieval Europe.
The China of the 2008 Olympics is still a central part of the world story, but it has come part way to meet the West. From the late 19th century traditional Chinese society crumbled under Western impact. A nationalist revolution overthrew the emperors and the old way of life after 1911. A second communist revolution transformed China into a more modern industrial state after 1949. Over the past 25 years China has undergone a third revolutionary wave by embracing the fruits of modern global capitalism and becoming one of the world’s major economic players. China has not become an Asian ‘West’, but has adapted what the West has had to offer and turned itself into a world ‘superpower’. The relationship between East and West has come full circle. For centuries the West pushed outwards into the world exporting, usually violently, a version of Western civilization. China was long resistant to this pressure; now China can exert pressure of its own, challenging the monopoly hitherto enjoyed by the remorseless march of Western economics, political models, consumerism and popular culture.
The meeting of Greece and China weaves together two of the central threads of world history. But the Olympics are also a symbolic fusion of ancient and modern. Although the original games are far removed from the glossy, commercialized, technically sophisticated and ruinously expensive modern version, their revival is a reminder that there are easily understood reference points back to the Europe of more than 2,000 years ago. Boxing, wrestling, javelin-throwing and running are simply what they are, the same for a modern audience as they were for the Greeks. Even the marathon, the icon of the current Western obsession with keeping fit, recalls the Greek legend of a soldier-runner who covered 26 miles non-stop under a gruelling sun from the Battle of Marathon to Athens to warn of the approaching Persian fleet, only to drop dead from the effort on his arrival. Distant though the ancient world seems, the span of recorded human history is remarkably short in relation to the long history of prehistoric man and the infinitely longer history of the earth. The span can be covered by just a hundred human lives of 60 years, stretched out one after the other. Only 50 human lives will take you back to those first Olympic Games.
To think about the past as something connected by a continuous thread of human activity runs the danger of imposing a false sense of unity, but for much of the earth’s surface, over long periods of time, fundamental change has been absent. Anthropologists have long been able to describe practices and beliefs that are clearly connected with a world so distant that it has been transmuted into myth. One hundred human lives laid end-to-end is not very many. To put it another way: it is possible to house an artefact from every major civilization of the past 5,000 years in a single cabinet and to recognize that until the last few hundred years those artefacts—whether a pot, a fertility doll, an arrow-head, a shoe, a coin—bear a remarkable underlying similarity. The recorded history of the world can be read at one level as a unitary experience, a brief 4 percent of the time modern hominids have been evolving, a hundred human lives.
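The arithmetic behind this measuring-rod is easily checked. As a back-of-envelope sketch (taking 776 BC, the traditional date of the first Olympic Games, and noting that the 4 per cent figure implies a span of roughly 150,000 years for modern humans):

\[ 100 \ \text{lives} \times 60 \ \text{years} = 6{,}000 \ \text{years of recorded history} \]
\[ \frac{776 + 2008}{60} = \frac{2{,}784}{60} \approx 46 \ \text{lives, or roughly 50, back to the first Olympic Games} \]
\[ \frac{6{,}000}{150{,}000} = 4 \ \text{per cent of the span of modern humans} \]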
Of course these lives were not the same wherever they were lived. Whatever homologies can be detected between peoples and civilizations, the experience of world history over the past 6,000 years is a series of fractured narratives, divided geographically and segmented by differing cultures, religious practices and political orders. The whole course of world history has been a process of cultural exchange and discovery, of imperial expansion and decline; sometimes links once made were then ruptured again; at other times communication enriched both cultures. In the past 500 years that process of discovering, mapping and understanding the world as a whole has accelerated, but for most previous civilizations the ‘known world’ was only what was immediately known. The modern concept of ‘world history’, which this book encapsulates, was meaningless to most human civilizations through most of human history. For large areas of the globe there was no written culture, so that ‘history’ survived as myth or folk memory, dating was arbitrary or non-existent, and the world was circumscribed by the very limited geographical reach of particular peoples. Rome was an exception, but even for Romans the known world was centred on the Mediterranean and the barbarous (meaning alien) outside was scarcely understood or valued. China for centuries regarded itself as the centre of the universe, and the outside world, to the extent that it intruded at all, was supposed to revolve like so many blighted planets around the Chinese sun. The history of the world is a very Western idea and it has become knowable only in the last century or so as Europeans and their descendants overseas produced sophisticated archaeological techniques and scholarly skills to unlock many of the remaining secrets of the past. When the English novelist H. G. Wells wrote his famous Outline of History, published in 1920, he was able to do so only on the foundation of an outpouring of new research in the last decades of the 19th century. Wells was preoccupied, he wrote in his introduction, with ‘history as one whole’, and he was one of the first to attempt it.
The more that came to be known about the many civilizations and cultures that made up human history, the more tempting it was, like Wells, to try to see history as a whole and to explain the process of historical change as a uniform one. This ambition had roots in the 19th century, when it was famously attempted by the German thinkers Georg Hegel and his erstwhile disciple Karl Marx, who both suggested that historical change was dynamic, the result of shifting patterns of thought or the transition from one economic system to another, each stage of human development incorporating the best from the past but each an advance on the one before it, until humankind finally reached an ideal society. The 19th-century view, coloured by the remarkable technical progress of the age, was to try to see a purpose behind historical change—not a mere random set of events, or a set of parables or myths to educate the present, but a triumphant account of the ascent of man. Neither Hegel nor Marx was a historian, and they both regarded China as a backwater that had somehow failed to move like the rest of the world. The 20th century witnessed more historically sophisticated attempts to find a unity in world history. Just after the First World War the German philosopher Oswald Spengler published two volumes of an ambitious study of the pattern of all world history. Each civilization, Spengler argued, had a natural life-cycle, like any organism, of birth, growth, maturity and death, a run of approximately 1,000 years each. He called his volumes The Decline of the West in order to argue against the optimism of the previous century and to demonstrate that Western civilization, for all its belief that it represented the full flowering of human history, was doomed to go the way of the rest. The British historian Arnold Toynbee thought Spengler’s view of history too schematic, but he produced 10 volumes of A Study of History between 1934 and 1954, in which he too detected a common pattern in all previous civilizations which explained their birth, rise to cultural fruition and eventual collapse. Both Spengler and Toynbee rejected the idea that the purpose of history was the triumph of the West, but they both thought that history could be understood as a single, repeated pattern, from ancient Egypt to the modern West.
Few historians now accept that world history works like this. The rise and fall of civilizations evidently has causes, many of which are explored in the pages that follow. But it does not follow from this that history ought to progress, or that it obeys internal laws or patterns of development. History does not move forward entirely blindly, but its progress is more often than not accidental, not patterned, and the circumstances of its development contingent rather than purposive—a product of a particular set of circumstances at a particular time rather than a necessary progress from one stage to the next. The same objection can be raised to the popular idea that there are turning points in history, key battles or events that have determined the course of history. Some events are clearly more important than others. History might now be written differently if the Roman army had not defeated Hannibal at Zama in 202 BC, but this was just one event in a much wider world of human activity, insignificant in India or China of the 3rd century BC. On balance human history moves forward on a broad front, less affected by ‘turning points’ than might be expected. If one set of events had never happened, there would just be a different narrative which would now be accepted as part of the past as readily as any other. History has neither pattern nor purpose. It is simply the record of what has been.
There are nonetheless broad common factors that have shaped the development of human communities wherever they have settled. The most important element has been the continuous and complex relationship between mankind and the natural world. Natural phenomena have defined a great deal of the human story. Until quite recently most natural forces were beyond human capacity to control or mediate or even to understand. Some still remain so. In the spring of 2008 a ferocious cyclone that laid waste large parts of southern Myanmar and a powerful earthquake in China killed at least 150,000 people between them. Natural disasters—earthquakes, tidal waves, volcanic eruptions, soil erosion, rising sea levels, crop failure—have been a constant feature of all history. The shaping of the landscape determined patterns of settlement, forms of husbandry, the possibility of exploration and trade. The seas and rivers have been both barrier and pathway. The siting of cities, artificial additions to the landscape, has been determined by access to river communications, or the existence of a natural harbour, or the natural defensive walls provided by high outcrops of rock or hillside. For the past 5,000 years, since the introduction of widespread agriculture, the relationship between population size and food supply has added a further natural factor restricting or enhancing the prospects of particular societies, or creating violent tensions between communities that lived by hunting and those with settled pastoral traditions. This competition is not confined to the ancient past, when, for example, waves of hunters from the plains of Eurasia descended on Europe in the 5th and 6th centuries; the near extermination by white hunters in the 19th century of the North American buffalo, an animal on which some Native American tribal societies depended, opened the way for the vast grain-growing prairie belt and the emasculation of the Native American population.
The supply of food, or its absence, famine, is a constant through human history. It exercised the ancient Egyptians, who developed complex irrigation systems to compensate for a buoyant population surrounded by desert; 3,000 years later Adolf Hitler argued that Germans needed ‘living space’ in Eastern Europe to provide a proper balance between population and food supply; the contemporary world, trying to support a vastly greater population, witnesses famines in Africa side-by-side with an overabundance of food in the richer West. A new food crisis in 2008 has prompted the bleak conclusion that food output must expand 50 per cent by 2030 to meet demand. For most people through most of recorded history the search for food has been unrelenting. In hunting communities, as long as there existed a wealth of animal life or fish, food was not a problem. In settled, agricultural communities, on the other hand, the supply of food was restricted either by problems of soil or changeable climate or by the maldistribution of food between rich and poor, or both. Tilling the soil was no guarantee of a decent diet; a Roman feast or a groaning Victorian banquet gives no clue as to how inadequate the food supply was for the slaves who grew and garnered it in Roman Italy, or for the Victorian poor, most of them cut off from the land and dependent on a monotonous starch-rich diet. In post-Renaissance Italy there developed one of the most sophisticated cuisines in the world, informed by a wealth of gastronomic master-works, but the later peasant workers of the Po Valley suffered debilitating pellagra from eating a stodgy maize-based diet that inflated their abdomens and eventually killed them. In settled civilizations, an adequate, varied, artistically presented or innovative diet was the preserve of the rich. It was no accident that the Russian Revolution of February 1917 began with a demonstration for bread by hungry women in St Petersburg (Petrograd).
The relationship between mankind and environment has changed a good deal over the past 200 years. Larger and more regular food supplies together with changes in healthcare have provoked a population explosion. Global population was around 800 million in the 18th century; currently it is an astonishing 6.7 billion. A result has been the massive expansion of the agricultural base, partly from utilising virgin lands, partly from raising yields artificially through plant- and stock-breeding or the addition of chemical fertiliser. These changes have provoked deforestation and the transformation of natural habitat. Heavy hunting has brought thousands of land and sea creatures to the edge of extinction. The world’s urban population has grown dramatically since 1900 and now stands at just over 50 per cent of the whole, producing huge sprawling cities and high levels of human pollution. To meet the daily needs of such a population has meant expanding industrial production, depleting the earth’s natural resources, and creating a growing chemical imbalance in the atmosphere that has damaged the ozone layer and threatens through so-called ‘global warming’ to undermine the fragile basis on which 6 billion people can subsist. Demands for a higher living standard from Western populations already rich in resources, and for catch-up living standards in much of the rest of the world, have accelerated the depletion of resources, the transformation of the landscape and unnatural climate change. The rich United States has 5 per cent of the world’s population but generates annually 25 per cent of the ‘greenhouse gases’ that cause climate change. The most alarming scenarios are now painted of the capacity of man to forge new natural disasters to which there will be no answer—enough methane gas perhaps to cause a global explosion in a century’s time, or the release of bacteria from the frozen icecaps millions of years old, from which current populations would have no prospect of immunity. The relationship between man and nature has about it a profound irony. The attempt to master the natural world has simply given nature new and more terrible powers.
Only in one respect has it proved possible to tame nature sufficiently to alter human society for the good. Over the past 150 years, in itself a fraction of the long history of man, it has proved possible to understand and then prevent or cure most medical conditions. For all the rest of human history, disease and disability were an ever-present reality for which there was almost no effective relief. The establishment of cities and animal husbandry combined to create ideal conditions for the emergence of a cluster of endemic and epidemic diseases which periodically killed off wide swathes of the human host. The earliest epidemics in the cities of the first civilizations in China, Egypt or Mesopotamia included smallpox, diphtheria, influenza, chickenpox and mumps. With the opening of trade routes and regular invasions, disease could be spread from populations that had developed some immunity to those biologically vulnerable. Athens was struck by a devastating plague in 430 BC which undermined its political power; the Antonine plague in the late-2nd-century AD Roman Empire killed around one-quarter of the populations it infected, probably with smallpox. Bubonic plague, transmitted by fleas carried on rats, killed around two-thirds of its victims. Plague originating in Egypt in 540 AD spread to the Eastern Mediterranean, where again one-quarter of the population died. The famous Black Death in the 14th century swept from Asia to Europe, killing an estimated 20 million and reducing Europe’s population by one-quarter. Epidemics died out partly because the pathogens had no other victims to kill. Modernity was no safeguard either. Cholera coincided with the industrialization and urbanization of Europe and produced regular pandemics in Asia, the Middle East and Europe between the 1820s and the 1890s. ‘Spanish influenza’ struck Europe at the end of the First World War with populations unnaturally weakened by lack of food; it was the world’s worst pandemic, killing 60 million people in just two years.
The attempt to understand and explain the nature of disease, and if possible cure it, goes back to the very earliest periods of recorded history. Classic Chinese medicine (now usually described as Traditional Chinese Medicine or TCM) is thought to date back almost 5,000 years. The standard text on ‘Basic Questions of Internal Medicine’ (known as Neijing Suwen) was written, according to legend, by the Yellow Emperor around 2600 BC; the earliest surviving version dates from at least 2,000 years ago. Early Chinese medicine was rooted in a broader philosophical system, based on Confucianism on the one hand and on Taoism on the other. Confucianism rejected the idea of anatomical or surgical invasion in the belief that the body was sacred; instead the use of acupuncture or massage was preferred, influencing internal disease by external means. Taoism saw health as related entirely to achieving harmony between the different elements of the world, the Yin and the Yang. Disease was a consequence of lack of harmony. Chinese medicine focused on herbal remedies and acupuncture as means to restore that harmony rather than more violent medical intervention. Close observation of morbid symptoms was regarded as essential to understand what combination of remedies was needed. During the brief Sui dynasty (581–618 AD) a group of doctors composed The General Treatise on the Causes and Symptoms of Disease, which comprised 50 volumes and described some 1,700 conditions. The classic texts retained an enduring influence down to the 20th century, when successive modernising regimes tried to replace them with Western medicine, with only limited success.
The other classic tradition arose in Greece from the 5th century BC based on the teachings of the secular theorist Hippocrates, born around 460 BC, whose famous ‘oath’, that doctors should at the least do no harm, is still sworn by Western doctors today. Like Chinese medicine, Greek medicine relied on explaining disease as an absence of harmony in the body between four elements or ‘humours’ that composed it. The elements were blood, choler (yellow bile), phlegm and black bile. These humours corresponded to the elements identified by Greek science as universal components—air (blood), fire (choler), water (phlegm) and earth (black bile). Cure for any imbalance was based on a range of options—bleeding, diet, exercise and occasional surgery. These views, revived in medieval Europe, exercised a continuing influence down to the time when modern medical science made its first appearance in the European Renaissance, and even beyond it. The problem for Greek as for Chinese medicine was the strong prejudice against direct anatomical research on human cadavers. In all pre-scientific medical systems an absence of proper understanding of the function of the body and the cause of disease meant that cures were largely accidental. Recent tests on 200 Chinese traditional herbal remedies for malaria found that only one, by chance, contained anything that might contribute to a cure.
Only the onset of serious research on how the body worked—perhaps the most famous example was William Harvey’s discovery of the circulation of the blood, published in 1628—made it possible to understand how the body was affected by particular conditions and to suggest prophylaxis. Even then the growing understanding of the body did little to help prevent epidemics until the onset of vaccination (introduced in late-18th-century Britain for smallpox) and the path-breaking research of the French chemist Louis Pasteur and the German doctor Robert Koch, which by the 1880s had confirmed that disease was caused by bacteria, each different micro-organism responsible for a particular disease. The discovery of antibiotic properties in penicillin mould in 1928 completed the therapeutic revolution. From the mid-19th century onwards the older medical traditions, which had limited or no medical efficacy, were superseded by a science-based medicine which has pushed the frontiers of biochemistry, neurology, physiology and pharmacology almost to their limits and has, at least temporarily, conquered almost all known diseases and a large number of internal medical disorders.
Only the identification in the 1980s of HIV, a virus which attacks the body’s immune system, made it clear that even the most scientifically advanced medicine may not in the future be able to stem new and unexpected forms of epidemic. For the fortunate few generations in the West who have been the full beneficiaries of the medical revolution, the transformation has been extraordinary. For all the rest of recorded human history there was no effective cure for most diseases and humans survived only because of a complex struggle between the micro-organisms and the human immune system. Death was ever-present and social attitudes and religious beliefs had to be rooted in the expectation of high levels of mortality. For those who survived there were disfiguring illnesses, crippling medical conditions, poor eyesight, chronic toothache, and so on. For women throughout history there was the debilitating cycle of births and the ever-present risk of maternal death. Pain, like premature death, was a permanent visitor.
To make matters worse, throughout human history both death and pain have been inflicted unnaturally, the product of deliberate violence on the part of human communities. Man, and almost always the male of the species, is a uniquely aggressive and punitive creature. Although attempts have been made over the past century to demonstrate that other animal species indulge in deliberate violence, animal violence is instinctive, not conscious. Mankind, on the other hand, has throughout recorded history, and evidently long before that, been able to premeditate the use of violence directed at other humans. Some anthropologists, following the 18th-century French philosopher Jean-Jacques Rousseau, have tried to argue that early man was most likely peaceable, and that only the tensions generated by more complex forms of social life introduced higher levels of violence. But the range and sophistication of prehistoric weapons, first stone, then bronze and iron, makes the idea of a pacific prehistoric state implausible. It is of course true that with settled communities, centred on cities, violence came to be organized through the use of armies. The evolution of a specialized human function for organizing and legitimating the use of violence is evident in the very earliest recorded history. The soldier, armed with an ever more lethal armoury, runs in an unbroken line from all corners of the ancient world where complex civilizations arose. In tribal communities, without settled urban life, inter-tribal and intra-tribal violence was often ritualized, the young males of the tribe using violence as a rite of passage or a sacred obligation.
There is no single answer to the question of why violence should be such a hallmark of world history, but it can be found on almost every page. The German legal theorist Carl Schmitt, writing in the 1920s, claimed that the human community has always been divided between ‘friend’ and ‘foe’, those who are included in the group and those who are excluded. Simplistic though the distinction might seem, the concept of the alien, the other, the barbarian, the enemy, or the excluded also runs as a thread through all history. Treatment of the ‘other’ has always been harsh, even in the modern age with its vain efforts to impose some kind of restraints or norms on military behaviour and state violence. Yet even this distinction leaves a great deal unexplained. Human beings do not just fight each other in pitched battles using soldiers who know what to expect. They punish human victims in hideously painful and savage ways. Coercive social relationships have been far more common than consensual ones. Victims, even those from among ‘the included’ who are guilty of crime, have been tortured, executed, beaten, imprisoned in ways so ingeniously atrocious and gratuitously cruel that it is difficult not to assume that violence is the normal human condition and the very recent and limited experience of peace and respect for the individual a merciful historical anomaly. Violence is also universal, not some characteristic of ‘savage society’ as self-righteous Victorian imperialists liked to think. Civilizations, however sophisticated, have indulged in violence of every kind. Religions have often led the way in devising grotesque ways to seek out heresies and exorcize devils. At the Museum of Torture Instruments in Guadalest in Spain (by no means the only such museum) are displayed roomfuls of fearful devices from across early-modern Europe designed to extract confessions, including from the unhappy victims of the notorious Spanish Inquisition—iron crowns with spikes which tightened around the victim’s head, sharp stakes that could impale the whole length of a human body without killing the victim immediately. Human beings have devoted a deplorable amount of effort to inflicting suffering, and seem to have done so with few moral qualms.
There have been many attempts to explain why wars happen, or why human history is so soaked in blood. There is no single concept of war (though there is ‘warfare’, the art of fighting) that can embrace all the many forms of war or the thousands of separate historical reasons why particular wars break out, evident from the pages that follow. Early-20th-century anthropologists were inclined to argue that war might have had some important function in primitive societies or in the age of early state formation but they could see no justification for it in the modern age. The idea that war, and other forms of violence, were a throwback to a past age now thinly papered over with ‘civilization’ was urged by the Austrian psychoanalyst Sigmund Freud when he reflected on the reasons for the prolonged and deadly fighting in the First World War: ‘the primitive, savage and evil impulses of mankind have not vanished in any individual, but continue their existence, although in a repressed state’. Freud thought war rapidly exposed the savage persona inside and later argued that the more ‘civilized’ a people became, the more likely it was that the dam of repression would burst and uncontrollable violence result.
Whether or not this really is the mechanism that releases violence, Freud proved all too right in his prediction. In the late 19th century it was still just possible to imagine that the barbarities of earlier history, when cities were sacked, their populations put to the sword, fine buildings burned, were a thing of the past (though this did not prevent European troops on a punitive expedition from destroying the stunning Summer Palace in Beijing in 1860, an act of wanton vandalism that witnesses compared with the sack of Rome by the Goths). But the 20th century has been the bloodiest in all of human history, witness to somewhere between 85 and 100 million violent deaths, and millions more wounded, maimed, tortured, raped and dispossessed. It includes the deliberate murder of the European Jews, which must rank with anything else in scale and horror from the past 6,000 years. It will be difficult for historians in a few hundred years’ time to see what separates the Mongol sack of Samarkand in 1220, which left only a few of the inhabitants alive, from the Allies’ destruction of Hamburg in 1943, which burnt the city to the ground and killed 40,000 people in hideous ways in just two days. The second was, of course, quicker and more efficient, but the moral defence usually mounted, that war is war, is a maxim as comprehensible in the ancient world as it would have been to Genghis Khan or Napoleon. So-called civilization displays precisely Freud’s divided self—capable of self-restraint and social progress, but capable of sudden lapses into barbarism.
The impact of famine, disease and war on human history was famously illustrated by the English 18th-century clergyman Thomas Malthus, who argued in his Essay on the Principle of Population, published in 1798, that throughout history the dangers of overpopulation were always checked by the operation of these three elements. It is tempting to turn this argument on its head and wonder how it is that the human species survived at all under the multiple assault of violence, hunger and epidemic, but it took the English biologist Charles Darwin, with the publication of The Origin of Species in 1859, to explain that species survived through natural selection. The survival of Homo sapiens was thus biologically explicable; the stronger survived, the weaker perished. In a crude sense that was true, and for decades thereafter it was assumed that harsh though the realities of history had been, they had been necessary hardships to produce a biologically and intellectually progressive species. Both writers have in the end been confounded by a further paradox of the modern age: population has risen to levels often predicted as insupportable, but growth has scarcely been dented by the incidence of disease or violence or hunger, while natural selection has been overturned by modern medicine and welfare policies. The most violent and deadly century has at the same time been the century with the highest survival rates.
Grim though the past has often been, history has not been an unmediated story of suffering borne by an uncomprehending and victimized humanity. From the very earliest times human societies needed to make sense of the chaos and dangers around them, to justify the hardships they faced and the reality of unpredictable or premature death, or to find some wider moral universe which sanctioned acceptable forms of behaviour and penalized others. Religion was able to satisfy all these needs, and religious beliefs, like warfare, have been a constant for at least six millennia. Consideration of religion raises awkward questions about the nature of ‘world history’ because for most human societies through most of time, the material world described by modern historians has only been one part of the universe of human experience. Religious communities are connected to other unseen states and unknowable sites which have been, and for many still are, as profound a part of reality as the political structures and economic systems of the visible world. Belief in a world of spirits or an afterlife, or in unseen and divine guardians, or in a sublime universal ‘other’ has made historical experience multi-dimensional, natural as well as supernatural. For medieval Christians the world was one link in a complex chain between heaven and hell, which included the nether world of purgatory where souls were left to await entry to paradise. For ancient Egyptians the other world was so real that kings talked and walked with the gods, and when they died took with them their household, animals, and furnishings. So widespread was the belief that the dead, or at least the kings, nobles and priests, needed to take possessions with them beyond the grave that modern knowledge of past cultures has been enormously expanded by the votive offerings and funerary furnishings found in excavated graves.
Belief in the supernatural, the divine, a world of the spirit, the reality of a soul that could live on beyond the decay of the earthly body, magic, superstition and witchcraft created for the inhabitants of all but the most recent communities a sphere of experience that was always larger than the material world around them. Belief was used to explain the apparently inexplicable, to ward off evil, to promote well-being, induce harmony of being and to prepare the mortal body for the world or worlds to come. The link with a world beyond mere physical observation has proved remarkably enduring, even in the secular, liberal West. In southern Italy images of saints and the Madonna are still carried through villages to offer protection against floods or volcanic eruptions or to encourage rainfall. The concept of ‘the Limbo of the Infants’, introduced as a term by the Catholic Church around 1300 to describe a haven for the souls of babies who died before there was time for baptism, in which they enjoyed a natural happiness, but were denied access to heaven, was all but set aside in 2007 when the Church announced that unbaptized infants should be entrusted to the possible mercy of God. Protests from parents anxious that their dead children should have a sure destination forced the Church to admit that Limbo was still a possibility. All attempts to provide a secular alternative to traditional Islam have foundered on the continuing vitality of the values and practices of the faith which is bound to a world beyond this one. Suicide bombers are recruited on the promise that they will be welcomed at once by the souls of the faithful when they cross the threshold of death.
Religions of every kind have exerted an extraordinary psychological power. That power has been sustained in a number of ways. For thousands of years the finest buildings and monuments have been dedicated to religious purposes; in tribal societies the sacred—totems, ancestral graveyards—has inspired powerful fears and provoked an instinctive reverence. The numerous cathedrals, mosques and temples built in Christian, Islamic and Buddhist communities from medieval times onwards as gateways to the divine are among the richest works of architecture in the world, constructed in societies where for the poor the monumental buildings were awe-inspiring expressions of the spiritual. Religions were also the source of sanctioned behaviour. The rules laid down for social practice, custom, family life, or sexual conduct, are almost all religious in origin. A great many religions have been vehicles for constructing a male-centred society in which women were compelled to accept an ascribed and restricted gender role or risk severe forms of punishment or social discrimination. Many moral codes or legal systems were constructed by lay authorities—for example, Justinian’s Codex, or the Code Napoleon—but they relied on a conception of acceptable behaviour that was derived from the core moral teaching of the Church. In traditional Islam there should ideally be no distinction between religious precept and state law. In early Chinese history the emperors were accorded divine status, making the law, but making it as gods. In Japanese society, where the emperors also enjoyed quasi-divine status, to die willingly for the emperor was a moral obligation that overrode all others.
Religious belief was always difficult to challenge because the threat that unbelief or heresy posed was a threat to an entire way of viewing the world. For a great many communities governed by animist or polytheistic systems of belief there were no reasons, and usually no means, for questioning the ground in which such belief was rooted. There was no question of earning salvation, simply of obeying the customary rites and endorsing the beliefs of a given system. Monotheistic religions, in which respect for the deity and reverence for doctrine earned the right to salvation, were altogether more problematic. Arguments about Christian doctrine brought regular schism, provoking the rift between Orthodox Christianity in Eastern Europe and Western Catholic Christianity in 1054, and further schism between Catholic and Protestant Christianity in the 16th century. Fear of heterodoxy, or of the diabolical, provoked Catholicism into regular heresy hunts and the extraction of confessions through torture. Protestant and Catholic were burnt at the stake for their faith in the struggle over the Reformation. Radical Protestantism was also fearful of idolatry and witchcraft, and witches were famously hanged in Salem, Massachusetts, as late as 1692. Islam was also schismatic. In 680 AD the faith divided between Sunni and Shiite sects over disagreements on doctrine (including the Shia insistence that Allah could take human form), and the two branches are still engaged in violent confrontation throughout the Middle East. Convinced of the rightness of their cause, monotheistic religions enjoy a strong imperative to convert; those outside the pale, regarded as pagans or infidels, are damned. Conversion was seen as an obligation, part of God’s purpose to ensure that among the many competing claims to a divine order only one could be the right one.
To claim no religious allegiance has been a recent and limited option, confined largely to the Western world. Atheism became publicly admissible in the 19th century without fear of punishment, but the public denial of God still attracts outrage. Secularists over the past two centuries have been keen to separate Church and state, but have not necessarily been irreligious. The strident rejection of the supernatural was identified with 19th-century socialism whose world view was materialist. Atheism appealed to a progressive intelligentsia hostile to what they saw as stale Christian convention. When the German poet-philosopher Friedrich Nietzsche famously announced in Thus Spake Zarathustra, published in parts between 1883 and 1885, that ‘God is dead!’, he challenged what he saw as the great lie, dating back 2,000 years, and found a limited intellectual audience more than willing to accept a godless reality. In the early 20th century atheism was formally adopted by the Soviet Union, and by communist China after 1949, but in neither case was it possible to eradicate belief. Atheism is now widely regarded as a declining intellectual force in an age of religious revivalism. The wide popular hostility to Richard Dawkins’s recently published The God Delusion (2006) is testament to how necessary it remains, even in societies where church attendance is moribund, to believe that the material world is not all there is.
For much of recorded history what was known or believed to be knowable was bound up with religion. Religious institutions and the priesthood were the depositories of knowledge passed down, like the famous Jewish Talmud, from generation to generation. The earliest work of ‘wisdom literature’ in ancient Egypt, perhaps in the world, was attributed to Imhotep, high priest of Heliopolis under Djoser, king between 2654 and 2635 BC. Religious buildings housed valuable manuscripts, not only sacred books but treatises on many subjects. During the early Christian era in Europe, in what used to be known as the ‘Dark Ages’, monasteries and churches kept alive traditions of teaching, writing and recording. The Venerable Bede, based at the monastery of Wearmouth-Jarrow in the north-east of England in the early 8th century, helped to collect together an estimated 300–500 volumes, one of the largest libraries in the then Western world. Western education was dominated by the Church until the 18th century. Knowledge of this kind was limited in several ways. First, it was confined to a very small elite who could read and write. A distinct literary or official language was developed which could be fully understood only by the favoured few. Although the earliest writing can be dated back to the Sumerian civilization in present-day Iraq around 5,000 years ago, appearing later in Egypt and China, the overwhelming majority of all humans who lived between then and the last few centuries were illiterate. Knowledge for them was limited to what could be conveyed orally, or crudely illustrated. For most people information was passed on through rumour, superstition, ritual, songs, sagas and folk tales. Second, it was limited by the theological or philosophical priorities of those who held the key to knowledge, reinforcing existing views of the known world, or of man’s relation to the universe, or of social hierarchy. Knowledge was used instrumentally, rather than for its own sake, confirming the existing order rather than encouraging critical or subversive discourse.
Knowledge in this sense did not inhibit technique. From the earliest settled communities onwards rapid strides were made in the practical skills associated with metallurgy, construction, irrigation, sculpture, and the production of artefacts of often stunning originality and beauty. The contrast between the last 6,000 years and the previous tens of thousands of years is remarkable. Early man made painfully slow progress in the development of sophisticated tools of stone or bone; humans in settled communities, with a division of labour and access to trade, could transfer technologies or fashions in a matter of years. By the time of the late Roman Empire, as any visit to a museum of classical archaeology will confirm, the range and sophistication of everything from daily products to major pieces of engineering was as advanced as anything that could be found for another thousand years. Practical skill was nevertheless not the same as knowledge. Understanding of the natural world, like understanding of the supernatural, was conditional. It was possible to build the most technically remarkable and artistically splendid cathedral but still to believe that the earth was flat and hell really existed.
The development of a critical, sceptical, speculative science that did not endorse existing beliefs but deliberately undermined them was a historical development of exceptional importance. The foundations of a speculative intellectual life were to be found in ancient Greece, whose philosophers, poets and playwrights produced work of real originality, work whose central concerns, despite the passage of 2,000 years, engaged the enthusiasm of educated Europeans when the classics were rediscovered in the late medieval period. Nineteenth-century intellectuals could write as if little separated their age from that of Plato or Aristotle or Aeschylus. The decisive breakthrough in understanding the nature of material reality by thinking critically about accepted world-views came, however, during the 16th and 17th centuries, and is associated mainly with the rise of a body of experimental or deductive science based on close observation. The key names are well known. The Polish astronomer Nicolaus Copernicus dared to argue that the earth revolved around the sun in a book published only in the year of his death, 1543; the Italian astronomer Galileo Galilei extended these observations and in many other ways paved the way for much modern physical science, utilising recent developments in the mechanical sciences; the Englishman Thomas Hobbes laid the foundations of modern political science and human psychology in his Leviathan, published in 1651; in 1687 the mathematician Isaac Newton in his Principia Mathematica announced the law of gravity and ushered in a new age of mechanical physics. The scientific and philosophical revolution precipitated in late-17th-century Europe opened the way to developing a modern understanding of nature and natural laws and above all to accepting that such things were intrinsically knowable, not part of a Divine Plan whose purpose was not to be questioned. The new principle, according to the late-18th-century Prussian philosopher Immanuel Kant, was sapere aude—‘dare to know’.
Those who pioneered a critical, scientific view of the world ran great risks. In 1616 the Catholic Church banned Copernican teaching, and Galileo was later placed under house arrest for challenging scripture. Galileo was fortunate: a few years before, in 1600, Giordano Bruno, another Copernican, had been burnt at the stake in Rome. Hobbes was forced into exile, suspected of atheism; John Locke, who wrote the founding text of modern liberal representative government in the 1680s, was also forced to write in exile, and his works circulated in parts of Europe in secret, too subversive for open sale. Writers of the 18th-century ‘Enlightenment’, during which critical thinking began to flourish for the first time, had to steer a careful line between what could or could not be said. Rousseau was also banned for life from his native city of Geneva for his radical democratic views. But it was a tide that could not be held back. By the early 19th century most of the modern Western sciences had been established on a firm footing; political and social theory exploded traditional claims to authority (expressed most clearly in the founding of the American Republic in 1776 and the French Revolution of 1789); organized religion in its Western guise was shown to be unable to defend its major contentions about the nature of the universe and of man’s place in it, and an alternative, naturalistic, rational model of the world was substituted. The triumph of free expression now seems irreversible, but the revolution represented by modern thought was not inevitable and its progress was subject to fits and starts. It is still not entirely clear why the prevailing authorities in Europe came to tolerate the new intellectual wave when a century before it might have been violently suppressed. The publication in 1859 of On Liberty by the English philosopher John Stuart Mill summed up what had been achieved in modern Europe. There was no other freedom, Mill asserted, more fundamental than the right to say what you like without fear that you will be silenced.
The formal acquisition of scientific, material knowledge about all aspects of the natural world and its application to human societies has been responsible for transforming world history more fundamentally than any other development in the past 6,000 years. Whatever case can be made for showing that there are strong lines of continuity throughout world history, the possibilities opened up by transcending the narrow world view of a God-centred and God-given universe have been unprecedented. It is a story intimately bound to the wider history of the rise of Europe (which with European expansion to America came to be regarded as the Western world) over the past 500 years. Historians have often been tempted to see this as a happily progressive narrative while the rest of the world stagnated. From a Western perspective the idea of ‘the triumph of the West’ has an evident plausibility. Yet it raises the larger question of why Europe did evolve in very different ways, not only from the other civilizations existing alongside it, but from all previous civilizations. What has been distinctive about the West, as Karl Marx argued in the mid-19th century, is the fact that it proved capable of expanding world-wide; Marx thought that no other culture or civilization would be capable of withstanding what Europe had to offer or what it forced upon them.
There is no agreed or straightforward answer to the question ‘why Europe?’ Geography was clearly favourable—a temperate climate, generally adequate food supplies, population growth steady but not excessively large, few of the debilitating, parasite-borne diseases that affected large parts of Africa and Asia with elephantiasis, river-blindness, bilharzia or malaria. The long European shoreline, never very far from any human habitation, encouraged seaborne trade and exploration and the development of early sea power. Seafaring technology was one of the earliest and most important of the technical revolutions and Europeans exploited it fully. Europe also succeeded in stemming the tide of regular invasion which had characterized European history for almost a thousand years from the collapse of the Western Roman Empire. The Tatar invasions of the 13th century and the expansion of the Ottoman Turkish Empire into south-eastern Europe during the early modern period were checked sufficiently to allow central and western Europe to consolidate the state system, to build a settled network of cities and a regular trading network. The military organization of Europe was transformed by the application of gunpowder and the development of cannon and musket-fire. Although these innovations were usually used against other Europeans, they gave Europeans a clear advantage whenever they found themselves fighting non-European peoples. It is sometimes argued that post-Reformation Protestantism, with its emphasis on individualism, played an important part in making Europe different, but the earliest explorers and imperialists were Catholic Portuguese and Spanish, while the Americas were discovered by an Italian from Genoa, Cristoforo Colombo. The long history of the Crusades against the Arab Middle East showed that there was nothing passive about Catholic Christianity.
The distinctive characteristic of European societies as they solidified into an early version of the modern state system was their willingness to look outwards towards the wider world. The voyages of discovery were not isolated pieces of lucky exploration, but rapidly embraced the whole globe, making it clear in the process that the earth was round rather than flat. Only Europeans embraced the world in this way: map-making, navigation, inland exploration, elaborate descriptions of native communities and exotic fauna and flora, all contributed to creating a view of the world fundamentally different from the view from Constantinople or Beijing. Not only did Europeans discover large areas of the hitherto unknown (at least to Europeans) but they began a process of aggressive settlement across the Americas, in parts of Africa and India and into the archipelagos of the western Pacific ‘spice islands’. If occasionally briefly reversed, European expansion proved irresistible and European appetites insatiable. The Spanish conquistador Hernán Cortés captured the Aztec capital Tenochtitlán in 1521 with 300 Spanish troops and some local allies, aided by the fact that around half the city’s 300,000 inhabitants had died of imported smallpox. Once the imperial toeholds were established across the oceans, Europeans never abandoned them. They became a source of remarkable wealth, helping eventually to make Europe richer than any rival civilization, and making it possible to defend and extend the imperial frontier.
Wealth itself would not have made Europe distinctive. The rulers of China and India were fabulously rich. What made the difference was how that wealth was used. The application of rational organization and scientific technique made possible a remarkable economic revolution. An important fraction of the wealth generated in Europe was mobile, mobilized to create yet further wealth by the banks and commercial houses which developed across Europe from the late 17th century. This was the engine that made commercial capitalism possible and it was fuelled by an acquisitive urge that was subject to few customary or religious restrictions. From the late 18th century the mobile wealth was used to fund a second revolution of technique. Although inventiveness was by no means exclusively European—Chinese scientists and engineers had anticipated many European discoveries, including gunpowder—the critical difference was the application of invention. The development of steam technology in Britain made possible the mobilization of new and efficient forms of energy quite distinct from the water- or horse-powered technologies of other cultures. The development of gas and, later, electricity as energy sources, the mastery of turbine technology, the perfection of rail locomotion, were all uniquely Western, a blend of European and American innovation. In a mere hundred years the gap between Western technique and the rest of the world was unbridgeable, making possible the rapid expansion of European states as imperial powers. The British American colonies won their independence in 1783, and European settlers, enjoying the same technical advantages and territorial ambitions, occupied the whole area of North America between Mexico and Canada by the middle years of the 19th century.
The economic and technical revolutions relied on a high level of social and spatial mobility. Europeans moved abroad in large numbers, bringing with them Christianity, guns, and trade. In western Europe there were few barriers to social mobility, allowing new classes of successful bankers, merchants and manufacturers to play an influential part in public affairs. The establishment of secure property rights and respect for individual wealth-making removed any legal inhibitions on the right to make money. The publication of Adam Smith’s classic The Wealth of Nations in 1776 provided a sound intellectual basis for the claim that the interests of communities were best served by allowing the free play of market forces and individual pursuit of economic well-being. Economic individualism and belief in the benign concept of the market had no equivalent in other cultures. Internal mobility was also important. The new industries attracted large numbers of rural workers who were no longer tied to the soil, at least in western Europe. Rapid population growth from the late 18th century, which threatened to put a severe strain on food supplies, was absorbed into the new cities; at the same time rising agricultural yields and the application of modern techniques (fertiliser cycles, threshing machines, stock breeding) made it just possible for the mobile urban population to be fed. The new wealth could then be used to fund overseas food production and imported foodstuffs. In 1877 the first refrigerated food was carried on board ship between Argentina and France, making it possible to bring meat and fruit half-way across the world.
The economic revolution was accompanied by other important changes. In Europe and the United States the idea of education for all replaced the traditional distinction between the illiterate mass and the educated few. Education was basic for most people, but opportunities for higher forms of training or for university expanded throughout the 19th century and became general in the 20th. Civil rights and the rule of law were applied in most European states and the settler communities overseas, and limited progress was made towards representative forms of government. One of the most striking aspects of the move to greater emancipation was the gradual recognition in the liberal West that women should have equal rights—social, sexual, political—with men, even if the principle has not always worked as it should. Finally, the idea of the modern nation-state, in which identity was derived from being a citizen of a particular nation, defined by territory, shared culture and language, although far from universal even in Europe in the 19th century (and certainly not applied to Europe’s empires), set the model that has subsequently been established worldwide. The United Nations now counts 195 sovereign states, all but three of them members.
The impact of Western wealth, military advantage, technology and ambition on the rest of the world was catastrophic. India was conquered, the Mughal emperors overthrown and British rule imposed. China succeeded in keeping the West at bay, but at the cost of regular punitive expeditions and the final sapping of China’s traditional political system by Western-educated Chinese who wanted China to adopt modern politics and economics. The Ottoman Empire crumbled under the remorseless pressure of Europe, which took over the whole of North Africa and encroached on the Ottoman Middle East. The empire finally collapsed in the aftermath of the First World War. Everywhere else traditional societies, long isolated from any contact with a wider world, were visited, annexed, fought over and incorporated into the Western orbit. What resulted was usually an unstable mix of tradition and novelty, the old order sufficiently challenged or undermined that it could no longer function effectively, the new order mediated by surviving social traditions, religious practices and native cultures. The one exception was Japan. Contact with the West in the 1850s was perceived as an immediate threat. In 1868 the Tokugawa Shogunate was overthrown, the Meiji emperor restored and a rapid process of modernization undertaken to shield Japan from Western imperialism. Within forty years Japan’s modern armed forces could defeat the much larger Russian army and navy in the war of 1904–5; in the 1930s Japan invaded large parts of China, and in 1941 Japanese forces launched a swift and successful campaign against American and European territories in the Pacific and South-East Asia which was reversed only by the exploitation of yet more advanced Western technologies.
The changes ushered in by the rise of European and American power have developed exponentially. The history of the past 250 years shows a dizzying transformation: global horizons have narrowed with mass communication and the development of a homogenized consumer culture; a level of knowledge and technical achievement unimaginable a century ago makes it possible to explore planets millions of miles distant, to revisit the earliest moments of the universe, to understand the genetic codes that dictate human biology, and to harness lasers and micro-electronic components to produce a technical base not only of exceptional sophistication but also democratic in its reach. Some sense of the sheer speed of change can be illustrated in numerous ways, but few examples are more remarkable than the difference between the colonial wars of the late 19th century, fought with Gatling machine guns, rifles and small artillery pieces, and the Second World War, fought only forty years later with tanks, high-speed aircraft, radar, radio, missiles and, in its late stages, jet aircraft and nuclear weapons.
The Western experience, for all its technical and social achievements, has nonetheless been profoundly ambiguous. There has perhaps been no other civilization so publicly anxious about the prospects for its own survival, so fearful of pride before a fall. The two world wars, both generated in Europe, compromised the claim, relayed throughout the last decades of the 19th century, that Europe was the heartland of modern civilization and a source of social progress and moral authority. Exporting ideas about civil rights and nationhood accelerated the decline and disappearance of the old European empires. The transfer of the British crown colony of Hong Kong to Chinese rule in 1997 marked a symbolic end to a long history of coercive European expansion and acknowledged China’s growing international stature. The export of Western technology and commercial skills resulted in the collapse of many European industries and the transfer of large-scale manufacturing to the rapidly growing economies of eastern and south-eastern Asia. The global reach of Western commerce and the remorseless march of English as the global language have produced a backlash against what are perceived to be new forms of imperialism, and against the crass failure of Western states to understand the complex differences that still mark off communities in Asia, Africa, Latin America or the Middle East from the Western model. Islamic terrorism is only one of the many fruits of hostility to the idea that the Western model is somehow appropriate in any cultural or geographical context.
Where, then, is this history going? Accelerated change can be read several ways: it could mean either speeding downhill to the edge of the precipice, or climbing rapidly to a richer, more secure and more peaceable world. Historians would do well to be humble in the face of the future. The unpredictable and unpredicted can be found throughout the chapters that follow. How few commentators and Sovietologists thought in the late 1980s that the Soviet bloc could collapse in a matter of a few years; how many observers thought, wrongly, that HIV/AIDS would provoke an unstoppable pandemic that would decimate the world’s population. One thing can be said with certainty: for all the talk of a new unipolar world built around the massive military power of the United States and the appeal of the Western model, the foreseeable future will see China, Russia, India and the Middle East, home to the great bulk of the world’s population, developing in ways that are not consistent with an idealized Western model, exerting a growing influence on global economic structures and the distribution of political power, and perhaps restoring at least some of the diversity of historical experience characteristic of all recorded history up to the 19th century.
Taking the longer view, there is little that can be said. A hundred human lives of 60 years will take us forward to around 8000 AD. Perhaps the acceleration of history will provoke a sudden crash long before that. There remain the awful paradoxes that the more ‘progress’ there has been, the more violence, discrimination and crime have been generated, and that the more economic desires are satisfied, the nearer the earth moves to ecological crisis. As Nietzsche remarked more than a century ago, ‘the universe does not need man’. Human history may well be finite. On the other hand, the history of the world hitherto has shown man to be a remarkably adaptable, ambitious, unscrupulous and technically adept creature. This history so far is no simple parable of survival and triumph; the future of the world may have to be just that.
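A back-of-envelope check of that span, taking the book’s 2008 vantage point as the starting year:

\[
100 \text{ lives} \times 60 \text{ years per life} = 6{,}000 \text{ years}, \qquad 2008 + 6{,}000 = 8{,}008 \approx 8000 \text{ AD}.
\]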
Richard Overy, 2008