
ONE Twenty-First-Century Sickness


In September 1978, Janet Parker became the last person on Earth to die of smallpox. Just 70 miles from the place where Edward Jenner had first vaccinated a young boy against the disease with cowpox pus from a milkmaid, 180 years earlier, Parker’s body played host to the virus in its final outing in human flesh. Her job as a medical photographer at the University of Birmingham in the UK would not have put her in direct jeopardy were it not for the proximity of her dark room to the laboratory beneath. As she sat ordering photographic equipment over the telephone one afternoon that August, smallpox viruses travelled up the air ducts from the Medical School’s ‘pox’ room on the floor below, and brought on her fatal infection.

The World Health Organisation (WHO) had spent a decade vaccinating against smallpox around the world, and that summer they were on the brink of announcing its complete eradication. It had been nearly a year since the final naturally occurring case of the disease had been recorded. A young hospital cook had recovered from a mild form of the virus in its final stronghold of Somalia. Such a victory over disease was unprecedented. Vaccination had backed smallpox into a corner, ultimately leaving it with no vulnerable humans to infect, and nowhere to go.

But the virus did have one tiny pocket to retreat to – the Petri dishes filled with human cells that researchers used to grow and study the disease. The Medical School of Birmingham University was one such viral sanctuary, where one Professor Henry Bedson and his team were hoping to develop the means to quickly identify any pox viruses that might emerge from animal populations now that smallpox was gone from humans. It was a noble aim, and they had the blessing of the WHO, despite inspectors’ concerns about the pox room’s safety protocols. With just a few months left before Birmingham’s lab was due to close anyway, the inspectors’ worries did not justify an early closure, or an expensive refit of the facilities.

Janet Parker’s illness, at first dismissed as a mild bug, caught the attention of infectious disease doctors a fortnight after it had begun. By now she was covered in pustules, and the possible diagnosis turned to smallpox. Parker was moved into isolation, and samples of fluid were extracted for analysis. In an irony not lost on Professor Bedson, his team’s expertise in identifying pox viruses was called upon for verification of the diagnosis. Bedson’s fears were confirmed, and Parker was moved to a specialist isolation hospital nearby. Two weeks later on 6 September, with Parker still critically ill in hospital, Professor Bedson was found dead at his home by his wife, having slit his own throat. On 11 September 1978, Janet Parker died of her disease.

Janet Parker’s fate was that of many hundreds of millions before her. She had been infected by a strain of smallpox known as ‘Abid’, named after a three-year-old Pakistani boy who had succumbed to the disease eight years previously, shortly after the WHO’s intensive smallpox eradication campaign had got under way in Pakistan. Smallpox had become a significant killer across most of the world by the sixteenth century, in large part due to the tendency of Europeans to explore and colonise other regions of the world. In the eighteenth century, as human populations grew and became increasingly mobile, smallpox spread to become one of the major causes of death around the world, killing as many as 400,000 Europeans each year, including roughly one in ten infants. With the uptake of variolation – a crude and risky predecessor of vaccination, involving intentional infection of the healthy with the smallpox fluids of sufferers – the death toll was reduced in the latter half of the eighteenth century. Jenner’s discovery of vaccination using cowpox in 1796 brought further relief. By the 1950s, smallpox had been all but eliminated from industrialised countries, but there were still 50 million cases annually worldwide resulting in over 2 million deaths each year.

Though smallpox had released its grip on countries in the industrialised world, the tyrannical reign of many other microbes continued in the opening decade of the twentieth century. Infectious disease was by far the dominant form of illness, its spread aided by our human habits of socialising and exploring. The exponentially rising human population, and with that, ever-greater population densities, only eased the person-to-person leap that microbes needed to make in order to continue their life cycle. In the United States, the top three causes of death in 1900 were not heart disease, cancer and stroke, as they are today, but infectious diseases, caused by microbes passed between people. Between them, pneumonia, tuberculosis and infectious diarrhoea accounted for one-third of all deaths.

Once regarded as ‘the captain of the men of death’, pneumonia begins as a cough. It creeps down into the lungs, stifling breathing and bringing on a fever. More a description of symptoms than a disease with a sole cause, pneumonia owes its existence to the full spectrum of microbes, from tiny viruses, through bacteria and fungi, to protozoan (‘earliest-animal’) parasites. Infectious diarrhoea, too, can be blamed on each variety of microbe. Its incarnations include the ‘blue death’ – cholera – which is caused by a bacterium; the ‘bloody flux’ – dysentery – which is usually thanks to parasitic amoebae; and ‘beaver fever’ – giardiasis, again from a parasite. The third great killer, tuberculosis, affects the lungs like pneumonia, but its source is more specific: an infection by a small selection of bacteria belonging to the genus Mycobacterium.

A whole host of other infectious diseases have also left their mark, both literally and figuratively, on our species: polio, typhoid, measles, syphilis, diphtheria, scarlet fever, whooping cough and various forms of flu, among many others. Polio, caused by a virus that can infect the central nervous system and destroy nerves controlling movements, paralysed hundreds of thousands of children each year in industrialised countries at the beginning of the twentieth century. Syphilis – the sexually transmitted bacterial disease – is said to have affected 15 per cent of the population of Europe at some point in their lifetime. Measles killed around a million people a year. Diphtheria – who remembers this heart-breaker? – used to kill 15,000 children each year in the United States alone. The flu killed between five and ten times as many people in the two years following the First World War as were killed fighting in the war itself.

Not surprisingly these scourges had a major influence on human life expectancy. Back then, in 1900, the average life expectancy across the whole planet was just thirty-one years. Living in a developed country improved the outlook, but only to just shy of fifty years. For most of our evolutionary history, we humans have managed to live to only twenty or thirty years old, though the average life expectancy would have been much lower. In one single century, and in no small part because of developments in one single decade – the antibiotic revolution of the 1940s – our average time on Earth was doubled. In 2005, the average human could expect to live to sixty-six, with those in the richest countries reaching, again on average, the grand old age of eighty.

These figures are highly influenced by the chances of surviving infancy. In 1900, when up to three in ten children died before the age of five, average life expectancy was dramatically lower. If, at the turn of the next century, rates of infant mortality had remained at the level they were in 1900, over half a million children would have died before their first birthday in the United States each year. Instead, around 28,000 did. Getting the vast majority of children through their first five years unscathed allows most of them to go on and live to ‘old age’ and brings the average life expectancy up accordingly.
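To make that arithmetic concrete, here is a minimal sketch in Python of how the same adult lifespan yields very different average life expectancies once infant mortality changes. The rates and lifespans are invented, illustrative figures, not data from this chapter.

```python
# Minimal sketch: how infant mortality drags down average life expectancy.
# All rates and lifespans below are illustrative assumptions, not historical figures.

def average_life_expectancy(infant_mortality, infant_lifespan=0.5, adult_lifespan=60):
    """Average years lived in a cohort where a given fraction die in infancy."""
    return infant_mortality * infant_lifespan + (1 - infant_mortality) * adult_lifespan

print(average_life_expectancy(0.30))   # ~42 years, with three in ten dying young
print(average_life_expectancy(0.005))  # ~60 years, with infant mortality near modern levels
```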

Though the effects are far from fully felt in much of the developing world, we have, as a species, gone a long way towards conquering our oldest and greatest enemy: the pathogen. Pathogens – disease-causing microbes – thrive in the unsanitary conditions created by humans living en masse. The more of us we cram onto our planet, the easier it becomes for pathogens to make a living. By migrating, we give them access to yet more humans, and in turn, more opportunity to breed, mutate and evolve. Many of the infectious diseases we have contended with in the last few centuries originated in the period after early humans had left Africa and set up home across the rest of the world. Pathogens’ world domination mirrored our own; few species have as loyal a pathogenic following as us.

For many of us living in more developed countries, the reign of infectious diseases is confined to the past. Just about all that remain of thousands of years of mortal combat with microbes are memories of the sharp prick of our childhood immunisations followed by the ‘reward’ of a polio-vaccine-infused sugar lump, and perhaps more clearly, the melodramatic queues outside the dinner hall as we waited with our school friends for a teenage booster shot. For many children and teenagers growing up now, the burden of history is even lighter: not only have the diseases themselves disappeared, but once-routine vaccinations, such as the dreaded ‘BCG’ for tuberculosis, are no longer necessary.

Medical innovations and public health measures – largely those of the late nineteenth and early twentieth centuries – have made a profound difference to life as a human. Four developments in particular have taken us from a two-generation society to a four-, or even five-generation society in just one, long, lifetime. The first and earliest of these, courtesy of Edward Jenner and a cow named Blossom, is, of course, vaccination. Jenner knew that milkmaids were protected from developing smallpox by virtue of having been infected by the much milder cowpox. He thought it possible that the pus from a milkmaid’s pustules might, if injected into another person, transfer that protection. His first guinea pig was an eight-year-old boy named James Phipps – the son of Jenner’s gardener. Having inoculated Phipps, Jenner went on to attempt to infect the brave lad, twice injecting pus from a true smallpox infection. The young boy was utterly immune.

Beginning with smallpox in 1796, and progressing to rabies, typhoid, cholera and plague in the nineteenth century, and dozens of other infectious diseases since 1900, vaccination has not only protected millions from suffering and death, but has even led to countrywide elimination or complete global eradication of some pathogens. Thanks to vaccination, we no longer have to rely solely on our immune systems’ experiences of full-blown disease to defend us against pathogens. Instead of acquiring natural defences against diseases, we have circumvented this process using our intellect to provide the immune system with forewarning of what it might encounter.

Without vaccination, the invasion of a new pathogen prompts sickness and possibly death. The immune system, as well as tackling the invading microbe, produces molecules called antibodies. If the person survives, these antibodies form a specialist team of spies that patrol the body looking out specifically for that microbe. They linger long after the disease has been conquered, primed to let the immune system know the moment there is a reinvasion of the same pathogen. The next time it is encountered, the immune system is ready, and the disease can be prevented from taking hold.

Vaccination mimics this natural process, teaching the immune system to recognise a particular pathogen. Instead of suffering the disease to achieve immunity, now we suffer only the injection, or oral administration, of a killed, weakened or partial version of the pathogen. We are spared illness but our immune systems still respond to the introduction of the vaccine, and produce antibodies that help the body to resist disease if the same pathogen invades for real.

Society-wide vaccination programmes are designed to bring about ‘herd immunity’ by vaccinating a large enough proportion of the population that contagious diseases cannot continue their spread. They have meant that many infectious diseases are almost completely eliminated in developed countries, and one, smallpox, has been totally eradicated. Smallpox eradication, as well as dropping the incidence of the disease from 50 million cases a year worldwide to absolutely none in little more than a decade, has saved governments billions in both the direct cost of vaccination and medical care, and the indirect societal costs of illness. The United States, which contributed a disproportionately large amount of money to the global eradication effort, recoups its investment every twenty-six days in unspent costs. Governmental vaccination schemes for a dozen or so other infectious diseases have dramatically reduced the number of cases, reducing suffering and saving lives and money.
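How large a proportion is ‘large enough’ depends on how contagious the pathogen is. A standard epidemiological rule of thumb – not spelled out in the text, so treat it as an added assumption – puts the herd immunity threshold at 1 − 1/R0, where R0 is the average number of people each case infects in a fully susceptible population. A minimal sketch:

```python
# Sketch of the textbook herd immunity threshold, 1 - 1/R0.
# The R0 values are commonly cited approximations, used purely for illustration.

def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune to stop sustained spread."""
    return 1 - 1 / r0

for disease, r0 in {"measles": 15, "smallpox": 6, "polio": 6}.items():
    print(f"{disease}: at least {herd_immunity_threshold(r0):.0%} must be immune")
```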

Today, most countries in the developed world run vaccination programmes against ten or so infectious diseases, and half a dozen are marked for regionwide elimination or global eradication by the World Health Organisation. These programmes have had a dramatic effect on the incidence of these diseases. Before the worldwide eradication programme for polio began in 1988, the virus affected 350,000 people a year. In 2012, the disease was confined to just 223 cases in only three countries. In just twenty-five years, around half a million deaths have been prevented and 10 million children who would have been paralysed are free to walk and run. Likewise for measles and rubella: in a single decade, vaccination against these once-common diseases has prevented 10 million deaths worldwide. In the United States, as in most of the developed world, the incidence of nine major childhood diseases has been reduced by 99 per cent by vaccination. In developed countries, for every 1,000 babies born alive in 1950, around forty would die before their first birthday. By 2005, that figure had been reduced by an order of magnitude, to about four. Vaccination is so successful that only the oldest members of Western society can remember the horrendous fear and pain of these deadly diseases. Now, we are free.

After the development of the earliest vaccines came a second major health innovation: hygienic medical practice. Hospital hygiene is something we are still under pressure to improve today, but in comparison with the standards of the late nineteenth century, modern hospitals are temples of cleanliness. Imagine, instead, wards crammed full with the sick and dying, wounds left open and rotting, and doctors’ coats covered in the blood and gore of years of surgeries. There was little point in cleaning – infections were thought to be the result of ‘bad air’, or miasma, not germs. This toxic mist was thought to rise from decomposing matter or filthy water – an intangible force beyond the control of doctors and nurses. Microbes had been discovered 150 years previously, but the connection had not been made between them and disease. It was believed that miasma could not be transferred by physical contact, so infections were spread by the very people charged with curing them. Hospitals were a new invention, born of a drive towards public health care and a desire to bring ‘modern’ medicine to the masses. Despite the good intentions, they were filthy incubators for disease, and those attending them risked their lives for the treatment they needed.

Women suffered most as a result of the proliferation of hospitals, as the risks of labour and giving birth, rather than falling, actually rose. By the 1840s, up to 32 per cent of women giving birth in hospital would subsequently die. Doctors – all male at that time – blamed their deaths on anything from emotional trauma to uncleanliness of the bowel. The true cause of this horrifyingly high death rate would at last be unravelled by a young Hungarian obstetrician by the name of Ignaz Semmelweis.

At the hospital where Semmelweis worked, the Vienna General, women in labour were admitted on alternate days into two different clinics. One was run by doctors, and the other by midwives. Every second day, as Semmelweis walked to work, he’d see women giving birth on the street outside the hospital doors. On those days, it was the turn of the clinic run by doctors to admit labouring women. But the women knew the odds for their survival would not be good if they could not hold on until the following day. Childbed fever – the cause of most of the deaths – lurked in the doctors’ clinic. So they waited, cold and in pain, in the hope that their baby would delay its entrance to the world until after midnight had struck.

Getting admitted to the midwife-run clinic was, relatively speaking, a far safer proposition. Between 2 and 8 per cent of new mothers would die of childbed fever in the care of midwives – far fewer than succumbed in the doctors’ clinic.

Despite his junior status, Semmelweis began to look for differences between the two clinics that might explain the death rates. He thought overcrowding and the climate of the ward might be to blame, but found no evidence of any difference. Then, in 1847, a close friend and fellow doctor, Jakob Kolletschka, died after being accidentally cut by a student’s scalpel during an autopsy. The cause of death: childbed fever.

After Kolletschka’s death, Semmelweis had a realisation. It was the doctors who were spreading death among the women in their ward. Midwives, on the other hand, were not to blame. And he knew why. Whilst their patients laboured, the doctors would pass the time in the morgue, teaching medical students using human cadavers. Somehow, he thought, they were carrying death from the autopsy room to the maternity ward. The midwives never touched a corpse, and the patients dying on their ward were probably those whose post-natal bleeding meant a visit from the doctor.

Semmelweis had no clear idea of the form that death was taking on its passage from the morgue to the maternity ward, but he had an idea of how to stop it. To rid themselves of the stench of rotting flesh, doctors often washed with a solution of chlorinated lime. Semmelweis reasoned that if it could remove the smell, perhaps it could remove the vector of death as well. He instituted a policy that doctors must wash their hands in chlorinated lime between conducting autopsies and examining their patients. Within a month, the death rate in his clinic had dropped to match that of the midwives’ clinic.

Despite the dramatic results Semmelweis achieved in Vienna and later in two hospitals in Hungary, he was ridiculed and ignored by his contemporaries. The stiffness and stench of a surgeon’s scrubs were said to be a mark of his experience and expertise. ‘Doctors are gentlemen, and gentlemen’s hands are clean,’ said one leading obstetrician at the time, all the while infecting and killing dozens of women each month. The mere notion that doctors could be responsible for bringing death, not life, to their patients caused huge offence, and Semmelweis was cast out of the establishment. Women continued to risk their lives giving birth for decades, as they paid the price of the doctors’ arrogance.

Twenty years later, the great Frenchman Louis Pasteur developed the germ theory of disease, which attributed infection and illness to microbes, not miasma. In 1884, Pasteur’s theory was proved by the elegant experiments of the German Nobel prize-winning doctor Robert Koch. By this time, Semmelweis was long dead. He had become obsessed by childbed fever, and had gone mad with rage and desperation. He railed against the establishment, pushing his theories and accusing his contemporaries of being irresponsible murderers. He was lured by a colleague to an insane asylum, under the pretence of a visit, then forced to drink castor oil and beaten by the guards. Two weeks later, he died of a fever, probably from his infected wounds.

Nonetheless, germ theory was the breakthrough that gave Semmelweis’s observations and policies a truly scientific explanation. Steadily, antiseptic hand-washing was adopted by surgeons across Europe. Hygienic practices became common after the work of the British surgeon Joseph Lister. In the 1860s, Lister read of Pasteur’s work on microbes and food, and decided to experiment with chemical solutions on wounds to reduce the risk of gangrene and septicaemia. He used carbolic acid, which was known to stop wood from rotting, to wash his instruments, soak dressings and even to clean wounds during surgery. Just as Semmelweis had achieved a drop in the death rate, so too did Lister. Where 45 per cent of those he operated on had died before, Lister’s pioneering use of carbolic acid slashed mortality by two-thirds, to around 15 per cent.

Closely following Semmelweis’s and Lister’s work on hygienic medical practice was a third public health innovation – a development that prevented millions from becoming ill in the first place. As in many developing countries today, water-borne diseases were a major health hazard in the West before the twentieth century. The sinister forces of miasma were still at work, polluting rivers, wells and pumps. In August 1854, the residents of London’s Soho district began to fall ill. They developed diarrhoea, but not as you or I might know it. This was white, watery stuff, and there was no end of it. Each person could produce up to 20 litres per day, all of which was dumped in the cesspits beneath Soho’s cramped houses. The disease was cholera, and it killed people in their hundreds.

The British doctor John Snow was sceptical of the miasma theory, and had spent some years looking for an alternative explanation. From previous epidemics, he had begun to suspect that cholera was water-borne. The latest outbreak in Soho gave him the opportunity to test his theory. He interviewed Soho residents and mapped cholera cases and deaths, looking for a common source. Snow realised that the victims had all drunk from the same water pump on Broad Street (now Broadwick Street) at the heart of the outbreak. Even deaths further afield could be traced back to the Broad Street pump, as cholera was carried and passed on by those infected there. There was one anomaly: a group of monks in a Soho monastery who got their water from the same pump were completely unaffected. It was not their faith that had afforded them protection, though, but their habit of drinking the pump’s water only after they had turned it into beer.

Snow had looked for patterns – connections between those who had become ill, reasons why others had escaped, links explaining the appearance of the disease outside its Broad Street epicentre. His rational study used logic and evidence to unravel the outbreak and trace its source, eliminating red herrings and accounting for anomalies. His work led to the disabling of the Broad Street pump and the subsequent discovery that a nearby cesspit had overflowed and was contaminating the water supply. It was the first-ever epidemiological study – that is, it used the distribution and patterns of a disease to understand its source. John Snow went on to use chlorine to disinfect the water supplying the Broad Street pump, and his chlorination methods were quickly put to use elsewhere. As the nineteenth century came to a close, water sanitation had become widespread.
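Stripped to its logic, Snow’s method amounts to tallying illness against each suspected exposure and seeing which one stands out. A minimal sketch of that logic in Python – the records below are invented stand-ins for his house-to-house interviews, not his actual data:

```python
# Snow-style outbreak tracing: compare how often illness accompanies each
# suspected water source. These records are invented for illustration only.
from collections import defaultdict

residents = [
    {"source": "Broad Street pump", "ill": True},
    {"source": "Broad Street pump", "ill": True},
    {"source": "Broad Street pump", "ill": False},
    {"source": "other pump",        "ill": False},
    {"source": "brewery beer only", "ill": False},
    {"source": "brewery beer only", "ill": False},
]

tallies = defaultdict(lambda: {"ill": 0, "total": 0})
for person in residents:
    tallies[person["source"]]["total"] += 1
    tallies[person["source"]]["ill"] += person["ill"]

for source, t in tallies.items():
    print(f"{source}: {t['ill']}/{t['total']} ill")  # the implicated pump stands out
```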

As the twentieth century unfolded, all three public health innovations became more and more sophisticated. By the end of the Second World War, a further five diseases could be prevented through vaccination, taking the total to ten. Medical hygiene techniques were adopted internationally, and chlorination became a standard process in water-treatment plants. The fourth and final innovation to put an end to the reign of microbes in the developed world began with one world war and concluded with the second. It was the result of the hard work, and good fortune, of a handful of men. The first of these, the Scottish biologist Sir Alexander Fleming, is famously credited with ‘accidentally’ discovering penicillin in his laboratory at St Mary’s Hospital in London. In fact, Fleming had been hunting for antibacterial compounds for years.

During the First World War he had treated wounded soldiers on the Western Front in France, only to see many of them die from sepsis. When the war came to an end and Fleming returned to the UK, he made it his mission to improve upon Lister’s antiseptic carbolic acid dressings. He soon discovered a natural antiseptic in nasal mucus, which he called lysozyme. But, as with carbolic acid, it could not penetrate beneath the surface of wounds, so deep infections festered. Some years later, in 1928, Fleming was investigating staphylococci bacteria – responsible for boils and sore throats – when he noticed something odd on one of his Petri dishes. He had been on holiday, and had returned to a messy lab bench full of old bacterial cultures, many of which had been contaminated with moulds. As he sorted through them, he noticed one dish in particular. Surrounding a patch of Penicillium mould was a clear ring, completely free of the staphylococci colonies that covered the remainder of the plate. Fleming spotted its significance: the mould had released a ‘juice’ that had killed the bacteria around it. That juice was penicillin.

Though growing the Penicillium had been unintentional, Fleming’s recognition of its potential importance was anything but accidental. It began a process of experimentation and discovery that would span two continents and twenty years, and revolutionise medicine. In 1939, a team of scientists at Oxford University, led by the Australian pharmacologist Howard Florey, thought they could make more use of penicillin. Fleming had struggled to grow significant quantities of the mould, or to extract the penicillin it produced. Florey’s team managed it, isolating small amounts of liquid antibiotic. By 1944, with the financial support of the War Production Board in the United States, penicillin was produced in sufficient quantities to meet the needs of soldiers returning from the D-Day invasion of Europe. Sir Alexander Fleming’s dream of beating the infections of the war wounded was realised, and the following year he, Florey, and one other member of the Oxford team, Sir Ernst Boris Chain, received the Nobel Prize in Physiology or Medicine.

Over twenty varieties of antibiotics have subsequently been developed, each attacking a different bacterial weakness, and providing our immune systems with backup when they are overwhelmed by infection. Before 1944, even scratches and grazes could mean a frighteningly high chance of death by infection. In 1940, a British policeman in Oxfordshire called Albert Alexander was scratched by a rose thorn. His face became so badly infected that he had to have his eye removed, and he was on the verge of death. Howard Florey’s wife Ethel, who was a doctor, persuaded Florey that Constable Alexander should become the first recipient of penicillin.

Within twenty-four hours of being injected with a tiny quantity of penicillin, the policeman’s fever dropped, and he began to recover. The miracle was not to be, however. A few days into his treatment, penicillin supplies ran out. Florey had attempted to extract any remaining penicillin from the constable’s urine to continue the treatment, but on the fifth day, the policeman died. It is unthinkable now to die from a scratch or an abscess, and we often take antibiotics without heed to their life-saving properties. Surgery, too, would carry enormous risk were it not for the protective shield of intravenous antibiotics given before the first cut is made.

Our twenty-first-century lives are a kind of sterile ceasefire, with infections held at bay through vaccinations, antibiotics, water sanitation and hygienic medical practice. We are no longer threatened by acute and dangerous bouts of infectious disease. Instead, the past sixty years have seen a collection of previously rare conditions rise to prominence. These chronic ‘twenty-first-century illnesses’ have become so common that we accept them as a normal part of being human. But what if they are not ‘normal’?

Looking around among your friends and family, you won’t see smallpox, measles or polio any more. You might think how lucky we are; how healthy we are these days. But look again and you might see things differently. You might see the sneezing and red, itchy eyes of your daughter’s hay fever in the spring. You might think of your sister-in-law, who has to inject herself with insulin several times a day because of her type 1 diabetes. You might be worried your wife will end up in a wheelchair with multiple sclerosis as her aunt did. You might have heard about your dentist’s little boy who screams, and rocks himself, and won’t make eye contact, now that he has autism. You might get impatient with your mother who is too anxious to do the shopping. You might be searching for a washing powder that doesn’t make your son’s eczema worse. Your cousin might be the awkward one at dinner who can’t eat wheat because it gives her diarrhoea. Your neighbour might have slipped unconscious whilst searching for his EpiPen after accidentally eating nuts. And you might have lost the battle to keep your weight where beauty magazines, and your doctor, say it should be. These conditions – allergies, autoimmune diseases, digestive troubles, mental health problems and obesity – are the new normal.

Let’s take allergies. Perhaps there’s nothing alarming about your daughter’s hay fever, as 20 per cent of her friends also snuffle and sneeze their way through summer. You are not surprised by your son’s eczema, because one in five of his classmates have it too. Your neighbour’s anaphylactic attack, terrifying though it was, is common enough that all packaged foods carry warnings if they ‘may contain nuts’. But have you ever asked yourself why one in five of your children’s friends have to take an inhaler to school in case they suffer an asthma attack? Being able to breathe is fundamental to life, yet without medication, millions of children would find themselves gasping for breath. What about why one in fifteen children are allergic to at least one type of food? Can that be normal?

Allergies affect nearly half of us in developed countries. We dutifully take our antihistamines, avoid picking up the cat, and check the ingredients lists of everything we buy. We unthinkingly do what is necessary to stop our immune systems overreacting to the most ubiquitous and innocuous of substances: pollen, dust, pet hair, milk, eggs, nuts, and so on. These substances are being treated by the body as if they are germs that need to be attacked and removed. It hasn’t always been this way. In the 1930s, asthma was rare, affecting perhaps one child in every school. By the 1980s, it had shot up, and one child in every class was affected. In the last decade or so, the rise has levelled off, but it has left a quarter of children with asthma. The same goes for other allergies: peanut allergies, for instance, trebled in just ten years at the end of the last century, and then doubled again in the next five years. Now we have nut-free zones in schools and workplaces. Eczema and hay fever too were once rare and are now a fact of life.

This is not normal.

What about autoimmune diseases? Your sister-in-law’s insulin habit is common enough, with type 1 diabetes affecting, as it does, about 4 in every 1,000 people. Most people have heard of the multiple sclerosis (or MS) that’s destroyed your wife’s aunt’s nerves. And then there’s rheumatoid arthritis wrecking joints, coeliac disease attacking the gut, myositis shredding muscle fibres, lupus pulling apart cells at their core, and about eighty other such conditions. As with allergies, the immune system has gone rogue, attacking not just the germs that bring disease, but the body’s own cells. You might be surprised to learn that among them, autoimmune diseases affect nearly 10 per cent of the population in the developed world.

Type 1 diabetes (T1D) makes for a great example, because it is an unmistakable condition, so records are relatively reliable. ‘Type 1’ is the version of diabetes that usually strikes early, often in the teenage years, attacking the cells of the pancreas, and completely preventing the production of the hormone insulin. (In type 2 diabetes, insulin is produced, but the body has grown less sensitive to it, so it doesn’t work as well.) Without insulin, any glucose in the blood – whether that’s from the simple sugars in sweets and desserts or from the carbohydrates in pasta and bread – cannot be converted and stored. It builds up and quickly becomes toxic, bringing with it a raging thirst and constant need to urinate for the unfortunate teenager. The patient wastes away, and weeks or months later, death follows, often from kidney failure. That is, unless insulin is injected. Pretty serious, then.

Fortunately, compared to most conditions, it’s straightforward to diagnose, and always has been. These days, a quick check of the amount of glucose in the blood after fasting usually gives it away, but even 100 years ago diabetes could be detected by a willing doctor. I say willing, because the test for it involved tasting the patient’s urine. A sweetness within the tang indicated that there was so much glucose in the blood that it had been forced out into the urine by the kidneys. Though undoubtedly more cases were missed in the past than now, and many would have gone unrecorded, our understanding of the prevalence of type 1 diabetes over time is a reliable indicator of the changing status of autoimmune diseases.

About 1 in 250 people in the West are stuck playing the role of their own pancreas, calculating how much insulin they need and then injecting it, to store away the glucose they have consumed. What’s extraordinary is that this high prevalence is new: type 1 diabetes was almost non-existent in the nineteenth century. Hospital records for Massachusetts General Hospital in the US, kept over seventy-five years until 1898, log only twenty-one cases of diabetes diagnosed in childhood, out of nearly 500,000 patients. It’s not a case of missed diagnosis, either – that urine-taste test, the rapid weight loss and the inevitable fatal outcome made the disease easy to recognise even back then.

Once formal records had been set up just before the Second World War, the prevalence of type 1 diabetes could be tracked. Around 1 or 2 children in every 5,000 were affected in the US, UK and Scandinavia. The war itself altered nothing, but not long afterwards, something changed, and cases began to rise. By 1973, diabetes was six or seven times as common as it had been in the Thirties. In the Eighties, the rise levelled off at its current figure of about 1 in 250.

The rise in diabetes is matched by rises in other autoimmune conditions. Multiple sclerosis destroyed the nervous systems of twice as many people at the turn of the millennium as it did two decades previously. Coeliac disease, in which the presence of wheat prompts the body to attack the cells of the intestine, is a startling thirty or forty times as common now as it was in the 1950s. Lupus, inflammatory bowel disease and rheumatoid arthritis too have been on the rise.

This is not normal.

What about our collective battle with excess weight? Odds are I’m right in my flippant assumption that you struggle with your weight, as well over half of us in the Western world are either overweight or obese. It’s astonishing to think that being a healthy weight puts you in the minority now. Being fat is so typical that old shop mannequins have been replaced by larger versions, and television shows turn weight loss into a game. These changes are perhaps to be expected: statistically speaking, being overweight is the reality for most people.

But it didn’t use to be. To us now, looking back at black and white photographs of the skinny young men and women of the Thirties and Forties enjoying a spell of hot weather in shorts and swimwear, these healthy people appear emaciated, with prominent ribs and lean bellies. But they are not – they are simply not carrying our modern baggage. At the start of the twentieth century, human body weights were uniform enough that few thought to keep records. But, prompted by a sudden rise in weight gain in the 1950s at the epicentre of the obesity epidemic – America – the government began keeping track. In the first national survey in the early 1960s, 13 per cent of adults were already obese. That is, they had a Body Mass Index (weight in kilograms divided by height in metres squared) of over 30. A further 30 per cent were overweight (a BMI between 25 and 30).
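The survey’s arithmetic is simple enough to sketch in a few lines of Python; the weight and height used here are invented example figures, not survey data.

```python
# BMI = weight in kilograms divided by height in metres squared,
# using the categories above: overweight for a BMI of 25-30, obese for over 30.

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def category(value):
    if value > 30:
        return "obese"
    if value >= 25:
        return "overweight"
    return "not overweight"

example = bmi(weight_kg=95, height_m=1.75)        # invented example figures
print(f"BMI {example:.1f}: {category(example)}")  # BMI 31.0: obese
```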

By 1999, the proportion of obese American adults had more than doubled to 30 per cent, and many previously healthy adults had piled on the pounds, keeping the overweight category at a plump 34 per cent. That’s a total of 64 per cent overweight or obese. Trends in the UK followed the same pattern, with a bit of a lag: in 1966, just 1.5 per cent of the adult population were obese and 11 per cent were overweight. By 1999, 24 per cent were obese and 43 per cent overweight – that’s 67 per cent of people now heavier than they should be. Obesity is not just about excess weight, either. It can lead to type 2 diabetes, heart disease and even some cancers, all of which are increasingly common.

You don’t need me to tell you, this is not normal.

Tummy troubles too are on the rise. Your cousin may be awkward for trying out a gluten-free diet, but she’s possibly not the only one at the table who suffers from irritable bowel syndrome, which affects up to 15 per cent of people. The name implies a similar level of discomfort to a midge bite, and belies the ruinous impact of the condition on the quality of life of its sufferers. For most sufferers, staying near a toilet takes priority over more meaningful pursuits; for the rest, the near-total absence of any need for one makes the pursuit of anything but colonic relief seem futile. Inflammatory bowel diseases like Crohn’s disease and ulcerative colitis too are on the rise, leaving the worst affected with a bowel so damaged it has to be replaced by a colostomy bag outside the body.

This is definitely not normal.

And finally we come to mental health conditions. Your dentist’s autistic son has more company than ever before, as 1 in 68 children (but 1 in every 42 boys) are on the autistic spectrum. Back in the early 1940s autism was so rare it hadn’t even been given a name. Even by the time records began in 2000, it was less than half as common as it is now. You’d be right in thinking that at least some of these extra cases are due to increasing awareness and perhaps some over-diagnosis, but most experts agree that the rise in autism prevalence is genuine – something has changed. Attention deficit disorders, Tourette’s syndrome and obsessive–compulsive disorder are all also on the rise. Depression and anxiety disorders too.

This increase in mental suffering is not normal.

Except these conditions are now so very ‘normal’, you might not even have realised that they are new illnesses, rarely encountered by our great-grandparents and those before them. Even doctors are often unaware of the histories of the conditions they treat, having received their medical training only in the context of today’s doctors’ experiences. As with the rise in cases of appendicitis, a change forgotten by today’s medics, what matters most to front-line carers is the patients in their charge and the treatments available to them. Understanding the provenance of illness is not their responsibility, and as such, changes in prevalence are incidental to them.

In the twenty-first century, life is different thanks to the four public health innovations of the nineteenth and twentieth centuries, and so disease is different too. But our twenty-first-century illnesses are not simply another layer of ill-health, hidden beneath infectious disease, but an alternative set of conditions, created by the way we live now. At this point you might be wondering how these illnesses can possibly have something in common, such a disparate group do they seem. From the sneezing and itching of allergies, to the self-destruction of autoimmunity, the metabolic misery of obesity, the humiliation of digestive disorders and the stigma of mental health conditions, it’s as if our bodies, in the absence of infectious diseases, have turned on themselves.

We could accept our new fate and be grateful that we will, at least, live long lives free from the tyranny of the pathogen. Or we could ask what has changed. Could there be a link between conditions that seem unrelated, like obesity and allergies, irritable bowel syndrome and autism? Does the shift from infectious diseases to this new set of illnesses indicate that our bodies need infections to stay balanced? Or is the correlation between declining infectious disease and rising chronic illness merely hinting at a deeper cause?

We are left with one big question: Why are these twenty-first-century illnesses happening?

At the moment, it’s fashionable to look to genetics for the source of disease. The Human Genome Project has unearthed a whole heap of genes that, when mutated, can result in illness. Some mutations guarantee disease: a change to the code of the HTT gene on chromosome 4, for example, will always result in Huntington’s disease. Other mutations simply increase the likelihood: misspellings in the genes BRCA1 and BRCA2 raise a woman’s lifetime risk of breast cancer to as much as eight in ten, for instance.

Although this is the era of the genome, we cannot blame our DNA alone for the rise in our modern diseases. While one person might carry a version of a gene that makes them, say, more likely to become obese, that gene variant could not become dramatically more common in the population as a whole inside a single century. Human evolution just does not progress that rapidly. Not only that, but gene variants only grow more common through natural selection if they are beneficial, or their detrimental effects are suppressed. Asthma, diabetes, obesity and autism bring few advantages to their victims.

With genetics excluded as the cause of the rise, our next question must be: Has something changed in our environment? Just as a person’s height is a result of not only their genes, but their environment – nutrition, exercise, lifestyle and so on – so is their disease risk. And this is where it gets complicated, as so very many aspects of our lives have changed in the last century, and pinpointing which are causes and which are mere correlations requires the patient process of scientific evaluation. For obesity and its related illnesses, changes in the way we eat are clear to see, but how this affects other twenty-first-century illnesses is less obvious.

The diseases in question offer up few clues as to their joint origin. Could the same changes in our environments that lead to obesity also generate allergies? Can there really be a common cause of mental health conditions like autism and obsessive–compulsive disorder, and gut disorders like irritable bowel syndrome?

Despite the disparities, two themes emerge. The first, clearly binding allergies and autoimmune diseases, is the immune system. We are looking for a culprit which has interfered with the immune system’s ability to determine our bodily threat level, making overreactions all too common. The second theme, often hidden behind more socially acceptable symptoms, is gut dysfunction. For some modern illnesses, the link is clear: IBS and inflammatory bowel disease have bowel disturbances at the core of their presentation. For others, although it is less overt, the connection is still there. Autistic patients struggle with chronic diarrhoea; depression and IBS go hand in hand; obesity has its origin in what passes through the gut.

These two themes, the gut and the immune system, might also seem unrelated, but a closer look at the anatomy of the gut provides a further clue. Asked about their immune system, most people might think of white blood cells and lymph glands. But that’s not where most of the action is. In fact, the human gut has more immune cells than the rest of the body put together. Around 60 per cent of the immune system’s tissue is located around the intestines, particularly along the final section of the small intestine and into the caecum and the appendix. It’s easy to think of the skin as the barrier between us and the outside world, but for every square centimetre of skin, you have two square metres of gut. Though it’s on the ‘inside’, the gut has just a single layer of cells between what’s essentially the outside world, and the blood. Immune surveillance along the intestines, therefore, is critical – every molecule and cell that passes by must be assessed and quarantined if necessary.

Although the threat of infectious disease has all but gone, our immune systems are still under fire. But why? Let’s turn to the technique pioneered by Dr John Snow during Soho’s cholera outbreak of 1854: epidemiology. Since Snow first applied logic and evidence to unravelling the mystery of the source of cholera, epidemiology has become a mainstay of medical sleuthing. It couldn’t be simpler. We ask three questions: (1) Where are these diseases occurring? (2) Who are they affecting? and (3) When did they become a problem? The answers provide us with clues that can help us to answer the overall question: Why are twenty-first-century illnesses happening?

The map of cholera cases that John Snow produced in answer to Where? gave away cholera’s likely epicentre – the Broad Street pump. Without much detective work, it’s clear to see that obesity, autism, allergies and autoimmunity all began in the Western world. Stig Bengmark, professor of surgery at University College London, puts the epicentre of obesity and its related diseases in the southern states of the US. ‘States like Alabama, Louisiana and Mississippi have the highest incidence of obesity and chronic diseases in the US and the world,’ he says. ‘These diseases spread, with a pattern similar to a tsunami, across the world; to the west to New Zealand and Australia, to the north to Canada, to the east to Western Europe and the Arab world and to the south, particularly Brazil.’

Bengmark’s observation extends to the other twenty-first-century illnesses – allergies, autoimmune diseases, mental health conditions and so on – all of which have their origin in the West. Of course geography alone does not explain the rise; it merely gives clues as to other correlates, and with luck, the cause. The clearest correlate of this particular topography of illness is wealth. A great accumulation of evidence points to the correlation between chronic diseases and affluence, from grand-scale comparisons of the gross national product of entire countries, to contrasts between socio-economic groups living in the same local area.

In 1990, the population of Germany provided an elegant natural experiment into the impact of prosperity on allergies. After four decades apart, East and West Germany were reunifying following the fall of the Berlin Wall the previous year. These two states had much in common; they shared a location, a climate, and populations composed of the same racial groups. But whilst those living in West Germany had prospered, eventually catching up and keeping pace with the economic developments of the Western world, East Germans had existed in a state of suspended animation since the Second World War and were significantly poorer than their West German neighbours. This difference in wealth was somehow related to a difference in health. A study by doctors at Munich University’s Children’s Hospital found that the richer West German children were twice as likely to have allergies, and three times as likely to suffer from hay fever.

This is a pattern that repeats itself for many allergic and autoimmune conditions. American children living in poverty are historically less likely to suffer from food allergies and asthma than their wealthier counterparts. The children of ‘privileged’ families in Germany, as judged by their parents’ educations and professions, are significantly more likely to suffer from eczema than those from less privileged backgrounds. Children from impoverished homes in Northern Ireland are not as prone to developing type 1 diabetes. In Canada, inflammatory bowel disease more often accompanies a high salary than a low one. The studies go on and on, and the trends are far from local. Even a country’s gross national product can be used to predict the extent of twenty-first-century illnesses within its population.

The rise of so-called Western illnesses is no longer limited to Western countries. With wealth comes chronic ill-health. As developing countries play economic catch-up, the diseases of civilisation spread. What began as a Western problem threatens to engulf the rest of the planet. Obesity tends to lead the way, and has affected large swathes of the population already, including those in developing countries. Its collective of associated conditions such as heart disease and type 2 diabetes (an insensitivity to insulin, rather than a lack of it) are trailing not far behind. Allergic disorders, including asthma and eczema, are also at the forefront of the spread, with rises under way across middle-income countries in South America, Eastern Europe and Asia. Autoimmune diseases and behavioural conditions appear to lag the most, but are now particularly common in the upper-middle-income countries, including Brazil and China. Just as many of our modern illnesses reach a plateau in the wealthiest countries, these conditions begin their ascent elsewhere.

When it comes to twenty-first-century illnesses, money is dangerous. The size of your salary, the wealth of your neighbourhood and the status of your country all contribute to your risk. But of course, simply being rich does not make you ill. Money may not buy happiness, but it does buy clean water, freedom from infectious disease, calorie-rich foods, an education, a job in an office, a small family, holidays to far-flung places, and many other luxuries besides. Asking Where? tells us not just the location of our modern plagues, but that it is money that’s bringing us chronic ill-health.

Intriguingly though, this relationship between increasing wealth and poorer health disconnects at the very richest end of the scale. The wealthiest people in the wealthiest countries appear to be better able to lift themselves clear of the chronic disease epidemic. What begins as a preserve of the rich (think tobacco, takeaway food and ready meals) ends as the staple of the poor. Meanwhile, the well-off gain access to the latest health information, the best health care, and the freedom to make choices that keep them well. Now, while the richest cohorts of society in developing countries gain weight and acquire allergies, it is the poorest in developed countries who are more and more likely to be overweight and to suffer from chronic ill-health.

Next, we must ask Who? Do wealth and a Western lifestyle bring ill-health to everyone, or are some groups affected more than others? It’s a pertinent question: in 1918, as many as 100 million people died from the flu pandemic that swept the globe after the First World War. Asking Who? provided an answer that, with today’s medical knowledge, could have considerably reduced the death toll. Whereas flu usually kills vulnerable members of society – the young, the old, and the already sick – the 1918 flu killed mainly healthy young adults. These victims, in the prime of their lives, are likely to have died not from the flu virus itself, but from the ‘cytokine storm’ unleashed by their immune systems in an attempt to clear the virus. The cytokines – immune messenger chemicals which ramp up the immune response – can inadvertently lead to a reaction that’s more dangerous than the infection itself. The younger and fitter the patients, the greater the storm their immune systems created, and the more likely they were to die from the flu. Asking Who? tells us something of what made this particular flu virus so dangerous, and would have enabled us to direct medical care not just at fighting the virus, but also towards calming the storm.

Who? is composed of three elements. What age are those affected by twenty-first-century illnesses? Are there differences in how these conditions affect people of different races? And are the sexes affected equally?

Let’s start with age. It’s easy to assume that diseases associated with developed, wealthy countries, where health care is good, are an inevitable consequence of our ageing population. Of course new diseases are on the rise! you might think. We live so long now! Surely so many of us living well into our seventies and eighties guarantees a whole host of new health challenges? Of course, as we release ourselves from the burden of death-by-pathogen, we will inevitably suffer from death-by-something-else, but many of the illnesses we face now are not simply diseases of old age, released by our longer life expectancy. Unlike cancer, whose rise is at least partly attributable to the cellular replacement process breaking down in older bodies, twenty-first-century illnesses are not all old-age-related. In fact, most of them show a preference for children and young adults, despite being relatively rare among these age groups during the age of infectious disease.

Food allergies, eczema, asthma and skin allergies often begin at birth or in the first few years of a child’s life. Autism typically presents itself in toddlers, and is diagnosed before the age of five. Autoimmune diseases can hit at any time, but many show themselves at a young age. Type 1 diabetes, for example, typically reveals itself in childhood and the early teens, though it can also crop up in adulthood. Multiple sclerosis, the skin condition psoriasis, and inflammatory bowel diseases such as Crohn’s disease and ulcerative colitis, all typically attack in the twenties. And lupus usually affects people between the ages of fifteen and forty-five. Obesity too, is a disease that can start young, with around 7 per cent of American babies considered over the normal weight at birth, rising to 10 per cent by the time they are toddlers, and about 30 per cent becoming overweight later in childhood. Older people are not immune to twenty-first-century illnesses – almost all of them can strike suddenly at any age – but the fact that they so often affect the young suggests it is not the ageing process itself that triggers them.

Even among those diseases that kill people in the West in ‘old’ age – heart attacks, strokes, diabetes, high blood pressure and cancers – most have their roots in weight gain that begins in childhood or early adulthood. We can’t attribute deaths from these conditions to our longer lifespans alone, as even those people in traditional societies who make it to eighty or ninety years old very rarely die of this set of ‘age-related’ illnesses. Twenty-first-century illnesses are not limited by the burgeoning top tier of our demographic ranks, but rather are hitting us, like the 1918 flu, in what should be the prime of our lives.

On to race. The Western world – North America, Europe and Australasia – is a largely white place, so are our new health problems actually a genetic predisposition among white people? In fact, within these continents, whites do not consistently have the highest rates of obesity, allergies, autoimmunity or autism. Blacks, Hispanics and South Asians tend to have higher incidences of obesity than whites, and allergies and asthma disproportionately affect blacks in some areas and whites in others. No clear pattern emerges for autoimmune diseases, with some, such as lupus and scleroderma, affecting blacks more, and others, including childhood diabetes and multiple sclerosis, tending to prefer whites. Autism does not appear to affect races differently, though black children are often diagnosed later.

Could what seem like racial differences actually be largely due to other factors, such as wealth or location, rather than to the genetic tendencies of each race? In an elegantly designed statistical study, the higher rate of asthma in black American children than in other races was found to be due not to race itself, but the greater tendency of black families to live in inner-city urban locations, where asthma is more common in all children. Rates of asthma among black children growing up in Africa are, as in most less developed regions, low.

A neat way to untangle the effects of ethnicity and environment in bringing on twenty-first-century illnesses is to look at the health of migrants. In the 1990s, civil war led to a large exodus of families from Somalia to Europe and North America. Having escaped turmoil in their own country, the Somali diaspora faced a fresh battle. Whereas rates of autism are extremely low in Somalia, the incidence in children born to Somali migrants rapidly jumped to match that of non-migrant children. Among the large Somali community of Toronto, Canada, autism is referred to as ‘the Western disease’, as so many migrant families are affected by it. In Sweden too, children of immigrants from Somalia have three or four times the rate of autism as Swedish children. Race, then, seems far less important than location.

So what about the final aspect of Who?: sex. Do women and men suffer equally? That women have stronger immune systems may not come as a surprise to anyone who has witnessed a bout of ‘man flu’. But unfortunately, in this immune-mediated epidemic of chronic ill-health, women’s immune superiority proves a disadvantage. While men seem to succumb to the most benign of colds, women battle demons that only their immune systems can see.

Autoimmune diseases show the widest divergence, with the vast majority of disorders affecting more women than men, though several affect both sexes equally, and a couple show a preference for males. Allergies, though more common in boys than girls, affect more women than men after puberty. Gut disorders too affect more women than men – just slightly so for inflammatory bowel disease, but twice as many women have irritable bowel syndrome.

Perhaps surprisingly, obesity also seems to affect women more than men, particularly in developing countries. But measurements other than BMI, such as waist circumference, suggest that men and women actually suffer equally from dangerous levels of excess weight. Likewise, although it appears that some mental health conditions, including depression, anxiety and obsessive–compulsive disorder, affect more women than men, part of this difference may be down to the male reluctance to admit to feeling blue. In autism, it is males who carry the burden, with five times as many boys affected as girls. Perhaps in autism, as with allergies, which tend to strike young, and those autoimmune diseases that begin in childhood, the pre-pubertal onset makes all the difference. Without the influence of adult sex hormones, these illnesses are not subject to the same female bias.

Women’s strong immune systems are likely to be behind the female preponderance of several twenty-first-century illnesses. For conditions that involve overreactions of the immune system, such as allergies and autoimmunity, a stronger starting point is likely to lead to a greater response. Sex hormones, genetics and lifestyle differences could all play a role too – the jury’s out on exactly why women are worse affected. Whatever it is, the female bias in these modern plagues emphasises the immune system’s underlying role in their development. Twenty-first-century illnesses are not diseases of old age. They are not diseases of genetic inheritance. They are diseases of the young, the privileged, and those of immune fortitude, especially women.

We have reached the final question of our epidemiological mystery: When? Arguably, this is the most important question of all. I have been calling the modern chronic disease epidemic one of twenty-first-century sickness, though its root lies not in this young century, but in the last. What a century it was, the twentieth, bringing some of the greatest innovations and discoveries of all of human history. But over the course of its one hundred years, following the near-elimination of serious infectious disease in the developed world, came a new set of illnesses which went from being exceptionally rare to remarkably common. Among the many developments that took place in the last century lies the change, or cluster of changes, that caused this rise. Pinpointing the moment that the rise began could provide our greatest clue as to its origin.

You may have got a feel for the timings already. In the US, a sharp upturn in type 1 diabetes cases began around the mid-century. Analysis of conscript data in both Denmark and Switzerland placed it in the early 1950s, in the Netherlands in the late 1950s, and in slightly less developed Sardinia in the 1960s. Rises in asthma and eczema started in the late Forties and early Fifties, and increases in Crohn’s disease and multiple sclerosis took off in the Fifties. Trends in obesity were first recorded on a large scale in the Sixties, making it difficult to determine the start of the epidemic as we see it now, but some experts point to the end of the Second World War in 1945 as a likely turning point. A sharp upturn in cases of obesity took off in the 1980s, but the origin of the rise certainly occurred before then. Similarly, the number of children diagnosed with autism each year was not recorded until the late Nineties, but the condition was first described in the mid-1940s.

Something changed around the middle of the last century. Perhaps more than one thing changed, and perhaps it continued to change in the decades that followed. That change has spread around the world since, enveloping ever more countries as the decades go by. To find the cause of our twenty-first-century illnesses, we must look at the changes centred on one extraordinary decade: the 1940s.

From asking What?, Where?, Who? and When?, we have established four things. First, our twenty-first-century illnesses often arise in the gut, and are associated with the immune system. Second, they strike young, often in children, teenagers and young adults, and many affect more women than men. Third, these illnesses occur in the Western world, but are now on the rise in developing countries as they modernise. Fourth, the rise began in the West in the 1940s, and developing countries followed suit later.

And so we return to the big question: Why have these twenty-first-century illnesses taken over? What is it about our modern, Western, wealthy lives that is making us so chronically ill?

As individuals and as a society, we have gone from frugal to indulgent; from traditional to progressive; from lacking luxuries to being bombarded by them; from poor health care to excellent medical services; from a budding to a blooming pharmaceutical industry; from active to sedentary; from provincial to globalised; from make-do-and-mend to refresh-and-replace; and from prudish to uninhibited.

Amongst these changes, and in answer to our mystery, are 100 trillion tiny clues waiting to be followed.
