
Chapter One

IRONING IT OUT


Aran Gordon is a born competitor. He’s a top financial executive, a competitive swimmer since he was six years old, and a natural long-distance runner. A little more than a dozen years after he ran his first marathon in 1984, he set his sights on the Mount Everest of marathons – the Marathon des Sables, a 150-mile race across the Sahara Desert, all brutal heat and endless sand, that tests endurance runners like nothing else.

As he began to train he experienced something he’d never really had to deal with before – physical difficulty. He was tired all the time. His joints hurt. His heart seemed to skip a funny beat. He told his running partner he wasn’t sure he could go on with training, with running at all. And he went to the doctor.

Actually, he went to doctors. Doctor after doctor – they couldn’t account for his symptoms, or they drew the wrong conclusion. When his illness left him depressed, they told him it was stress and recommended he talk to a therapist. When blood tests revealed a liver problem, they told him he was drinking too much. Finally, after three years, his doctors uncovered the real problem. New tests revealed massive amounts of iron in his blood and liver – off-the-charts amounts of iron.

Aran Gordon was rusting to death.

Hemochromatosis is a hereditary disease that disrupts the way the body metabolizes iron. Normally, when your body detects that it has sufficient iron in the blood, it reduces the amount of iron absorbed by your intestines from the food you eat. So even if you stuffed yourself with iron supplements you wouldn’t load up with excess iron. Once your body is satisfied with the amount of iron it has, the excess will pass through you instead of being absorbed. But in a person who has hemochromatosis, the body always thinks that it doesn’t have enough iron and continues to absorb iron unabated. This iron loading has deadly consequences over time. The excess iron is deposited throughout the body, ultimately damaging the joints, the major organs, and overall body chemistry. Unchecked, hemochromatosis can lead to liver failure, heart failure, diabetes, arthritis, infertility, psychiatric disorders, and even cancer. Unchecked, hemochromatosis will lead to death.

For more than 125 years after Armand Trousseau first described it in 1865, hemochromatosis was thought to be extremely rare. Then, in 1996, the primary gene that causes the condition was isolated for the first time. Since then, we’ve discovered that the gene for hemochromatosis is the most common genetic variant in people of Western European descent. If your ancestors are Western European, the odds are about one in three or one in four that you carry at least one copy of the hemochromatosis gene. Yet only about one in two hundred people of Western European ancestry actually develops hemochromatosis, with all of its assorted symptoms. In genetics parlance, the degree to which a given gene manifests itself in an individual is called penetrance. If a single gene means everyone who carries it will have dimples, that gene has very high, or complete, penetrance. On the other hand, a gene that requires a host of other circumstances to really manifest, like the gene for hemochromatosis, is considered to have low penetrance.
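To get a feel for what low penetrance means in these numbers, here’s a rough back-of-the-envelope calculation. It assumes textbook Hardy–Weinberg proportions and takes the chapter’s approximate figures at face value, so treat it as an illustration rather than a precise population estimate:

```latex
% Illustrative only: assumes Hardy-Weinberg proportions and the
% chapter's approximate figures (one in three carries at least one
% copy; one in two hundred has the full-blown disease).
% Let q be the frequency of the hemochromatosis allele.
1 - (1 - q)^2 \approx \tfrac{1}{3} \;\Rightarrow\; q \approx 0.18
\qquad
q^2 \approx 0.033 \quad \text{(about 1 in 30 inherits two copies)}
\qquad
\frac{1/200}{1/30} \approx 0.15 \quad \text{(only about 15\% of two-copy carriers develop the disease)}
```

In other words, even two copies of the gene are far from a guarantee of illness – which is exactly what low penetrance means.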

Aran Gordon had hemochromatosis. His body had been accumulating iron for more than thirty years. Untreated, doctors told him, it would kill him within another five. Fortunately for Aran, one of the oldest medical therapies known to man would soon enter his life and help him manage his iron-loading problem. But to get there, we have to go back.

Why would a disease so deadly be bred into our genetic code? You see, hemochromatosis isn’t an infectious disease like malaria, a product of bad habits like smoking-induced lung cancer, or a viral invader like smallpox. Hemochromatosis is inherited – and the gene for it is very common in certain populations. In evolutionary terms, that means we asked for it.

Remember how natural selection works. If a given genetic trait makes you stronger – especially if it makes you stronger before you have children – then you’re more likely to survive, reproduce, and pass that trait on. If a given trait makes you weaker, you’re less likely to survive, reproduce, and pass that trait on. Over time, species “select” those traits that make them stronger and eliminate those traits that make them weaker.

So why is a natural-born killer like hemochromatosis swimming in our gene pool? To answer that, we have to examine the relationship between life – not just human life, but pretty much all life – and iron. But before we do, think about this – why would you take a drug that is guaranteed to kill you in forty years? One reason, right? It’s the only thing that will stop you from dying tomorrow.

Just about every form of life has a thing for iron. Humans need iron for nearly every function of our metabolism. Iron carries oxygen from our lungs through the bloodstream and releases it in the body where it’s needed. Iron is built into the enzymes that do most of the chemical heavy lifting in our bodies, where it helps us to detoxify poisons and to convert sugars into energy. Iron-poor diets and other iron deficiencies are the most common cause of anemia, a lack of red blood cells that can cause fatigue, shortness of breath, and even heart failure. (As many as 20 percent of menstruating women may have iron-related anemia because their monthly blood loss produces an iron deficiency. That may be the case in as much as half of all pregnant women as well – they’re not menstruating, but the passenger they’re carrying is hungry for iron too!) Without enough iron our immune system functions poorly, the skin gets pale, and people can feel confused, dizzy, cold, and extremely fatigued.

Iron even explains why some areas of the world’s ocean are crystal clear blue and almost devoid of life, while others are bright green and teeming with it. It turns out that oceans can be seeded with iron when dust from land is blown across them. Oceans, like parts of the Pacific, that aren’t in the path of these iron-bearing winds develop smaller communities of phytoplankton, the single-celled creatures at the bottom of the ocean’s food chain. No phytoplankton, no zooplankton. No zooplankton, no anchovies. No anchovies, no tuna. But an ocean area like the North Atlantic, straight in the path of iron-rich dust from the Sahara Desert, is a green-hued aquatic metropolis. (This has even given rise to an idea to fight global warming that its originator calls the Geritol Solution. The notion is basically this – dumping billions of tons of iron solution into the ocean will stimulate massive plant growth that will suck enough carbon dioxide out of the atmosphere to counter the effects of all the CO2 humans are releasing into the atmosphere by burning fossil fuels. A test of the theory in 1995 transformed a patch of ocean near the Galápagos Islands from sparkling blue to murky green overnight, as the iron triggered the growth of massive amounts of phytoplankton.)

Because iron is so important, most medical research has focused on populations who don’t get enough iron. Some doctors and nutritionists have operated under the assumption that more iron can only be better. The food industry currently supplements everything from flour to breakfast cereal to baby formula with iron.

You know what they say about too much of a good thing?

Our relationship with iron is much more complex than traditionally thought. Iron is essential – but it also provides a proverbial leg up to just about every biological threat to our lives. With very few exceptions – a few bacteria that use other metals in its place – almost all life on earth needs iron to survive. Parasites hunt us for our iron; cancer cells thrive on our iron. Finding, controlling, and using iron is the game of life. For bacteria, fungi, and protozoa, human blood and tissue are an iron gold mine. Add too much iron to the human system and you may just be loading up the buffet table.

In 1952, Eugene D. Weinberg was a gifted microbial researcher with a healthy curiosity and a sick wife. Diagnosed with a mild infection, his wife was prescribed tetracycline, an antibiotic. Professor Weinberg wondered whether anything in her diet could interfere with the effectiveness of the antibiotic. We’ve only scratched the surface of our understanding of bacterial interactions today; in 1952, medical science had only scratched the surface of the scratch. Weinberg knew how little we knew, and he knew how unpredictable bacteria could be, so he wanted to test how the antibiotic would react to the presence or absence of specific chemicals that his wife was adding to her system by eating.

In his lab at Indiana University, he directed his assistant to load up dozens of petri dishes with three compounds: tetracycline, bacteria, and a third organic or elemental nutrient, which varied from dish to dish. A few days later, one dish was so loaded with bacteria that Professor Weinberg’s assistant assumed she had forgotten to add the antibiotic to that dish. She repeated the test for that nutrient and got the same result – massive bacterial growth. The nutrient in this sample was providing so much booster fuel to the bacteria that it effectively neutralized the antibiotic. You guessed it – it was iron.

Weinberg went on to prove that access to iron helps nearly all bacteria multiply almost unimpeded. From that point on, he dedicated his life’s work to understanding the negative effect that the ingestion of excess iron can have on humans and the relationship other life-forms have to it.

Human iron regulation is a complex system that involves virtually every part of the body. A healthy adult usually has between three and four grams of iron in his or her body. Most of this iron is in the bloodstream within hemoglobin, distributing oxygen, but iron can also be found throughout the body. Given that iron is not only crucial to our survival but can be a potentially deadly liability, it shouldn’t be surprising that we have iron-related defense mechanisms as well.

We’re most vulnerable to infection where infection has a gateway to our bodies. In an adult without wounds or broken skin, that means our mouths, eyes, noses, ears, and genitals. And because infectious agents need iron to survive, all those openings have been declared iron no-fly zones by our bodies. On top of that, those openings are patrolled by chelators – proteins that lock up iron molecules and prevent them from being used. Everything from tears to saliva to mucus – all the fluids found in those bodily entry points – is rich with chelators.

There’s more to our iron defense system. When we’re first beset by illness, our immune system kicks into high gear and fights back with what is called the acute phase response. The bloodstream is flooded with illness-fighting proteins, and, at the same time, iron is locked away to prevent biological invaders from using it against us. It’s the biological equivalent of a prison lockdown – flood the halls with guards and secure the guns.

A similar response appears to occur when cells become cancerous and begin to spread without control. Cancer cells require iron to grow, so the body attempts to limit its availability. New pharmaceutical research is exploring ways to mimic this response by developing drugs to treat cancer and infections by limiting their access to iron.

Even some folk cures have regained respect as our understanding of bacteria’s reliance on iron has grown. People used to cover wounds with egg-white-soaked straw to protect them from infection. It turns out that wasn’t such a bad idea – preventing infection is what egg whites are made for. Eggshells are porous so that the chick embryo inside can “breathe.” The problem with a porous shell, of course, is that air isn’t the only thing that can get through it – so can all sorts of nasty microbes. The egg white’s there to stop them. Egg whites are chock-full of chelators (those iron-locking proteins that patrol our bodies’ entry points) like ovotransferrin, there to protect the developing chick embryo from infection.

The relationship between iron and infection also explains one of the ways breast-feeding helps to prevent infections in newborns. Mother’s milk contains lactoferrin – a chelating protein that binds with iron and prevents bacteria from feeding on it.

Before we return to Aran Gordon and hemochromatosis, we need to take a side trip, this time to Europe in the middle of the fourteenth century – not the best time to visit.

From 1347 through the next few years, the bubonic plague swept across Europe, leaving death, death, and more death in its wake. Somewhere between one-third and one-half of the population was killed – more than 25 million people. As a share of the population, no recorded pandemic, before or since, has come close to touching the plague’s record. We hope none ever will.

It was a gruesome disease. In its most common form the bacterium that’s thought to have caused the plague (Yersinia pestis, named after Alexander Yersin, one of the bacteriologists who first isolated it in 1894) finds a home in the body’s lymphatic system, painfully swelling the lymph nodes in the armpits and groin until those swollen lymph nodes literally burst through the skin. Untreated, the survival rate is about one in three. (And that’s just the bubonic form, which infects the lymphatic system; when Y. pestis makes it into the lungs and becomes airborne, it kills nine out of ten – and not only is it more lethal when it’s airborne, it’s more contagious!)

The most likely origin of the European outbreak is thought to be a fleet of Genoese trading ships that docked in Messina, Italy, in the fall of 1347. By the time the ships reached port, most of the crews were already dead or dying. Some of the ships never even made it to port, running aground along the coast after the last of their crew became too sick to steer the ship. Looters preyed on the wrecks and got a lot more than they bargained for – and so did just about everyone they encountered as they carried the plague to land.

In 1348 a notary named Gabriele de’ Mussi told how the disease spread from ships to the coastal populations and then inland across the continent:

Alas! Our ships enter the port, but of a thousand sailors hardly ten are spared. We reach our homes; our kindred … come from all parts to visit us. Woe to us for we cast at them the darts of death! … Going back to their homes, they in turn soon infected their whole families, who in three days succumbed, and were buried in one common grave.

Panic rose as the disease spread from town to town. Prayer vigils were held, bonfires were lighted, churches were filled with throngs. Inevitably, people looked for someone to blame. First it was Jews, and then it was witches. But rounding them up and burning them alive did nothing to stop the plague’s deadly march.

Interestingly, it’s possible that practices related to the observance of Passover helped to protect Jewish neighborhoods from the plague. Passover is a week-long holiday commemorating Jews’ escape from slavery in Egypt. As part of its observance, Jews do not eat leavened bread and remove all traces of it from their homes. In many parts of the world, especially Europe, wheat, grain, and even legumes are also forbidden during Passover. Dr. Martin J. Blaser, a professor of internal medicine at New York University Medical Center, thinks this “spring cleaning” of grain stores may have helped to protect Jews from the plague, by decreasing their exposure to rats hunting for food – rats that carried the plague.

Victims and physicians alike had little idea what was causing the disease. Communities were overwhelmed simply by the volume of bodies that needed burying. And that, of course, contributed to the spread of the disease as rats fed on infected corpses, fleas fed on infected rats, and additional humans caught the disease from infected fleas. In 1348 a Sienese man named Agnolo di Tura wrote:

Father abandoned child, wife husband, one brother another, for this illness seemed to strike through the breath and sight. And so they died. And none could be found to bury the dead for money or friendship. Members of a household brought their dead to a ditch as best they could, without priest, without divine offices … great pits were dug and piled deep with the multitude of dead. And they died by the hundreds both day and night. … And as soon as those ditches were filled more were dug. … And I, Agnolo di Tura, called the Fat, buried my five children with my own hands. And there were also those who were so sparsely covered with earth that the dogs dragged them forth and devoured many bodies throughout the city. There was no one who wept for any death, for all awaited death. And so many died that all believed it was the end of the world.

As it turned out, it wasn’t the end of the world, and it didn’t kill everyone on earth or even in Europe. It didn’t even kill everyone it infected. Why? Why did some people die and others survive?

The emerging answer may be found in the same place Aran Gordon finally found the answer to his health problem – iron. New research indicates that the more iron in a given population, the more vulnerable that population is to the plague. In the past, healthy adult men were at greater risk than anybody else – children and the elderly tended to be malnourished, with corresponding iron deficiencies, and adult women are regularly iron depleted by menstruation, pregnancy, and breast-feeding. It might be that, as Stephen Ell, a professor at the University of Iowa, wrote, “Iron status mirror[ed] mortality. Adult males were at highest risk on this basis, with women [who lose iron through menstruation], children, and the elderly relatively spared.”

There aren’t any highly reliable mortality records from the fourteenth century, but many scholars believe that men in their prime were the most vulnerable. More recent – but still long ago – outbreaks of bubonic plague, for which there are reliable mortality records, demonstrate that the perception of heightened vulnerability in healthy adult men is very real. A study of plague in St. Botolph’s Parish in 1625 indicates that men between fifteen and forty-four killed by the disease outnumbered women of the same age by a factor of two to one.

So let’s get back to hemochromatosis. With all this iron in their systems, people with hemochromatosis should be magnets for infection in general and the plague in particular, right?

Wrong.

Remember the iron-locking response of the body at the onset of illness? It turns out that people who have hemochromatosis have a form of iron locking going on as a permanent condition. The excess iron that the body takes on is distributed throughout the body – but it isn’t distributed everywhere throughout the body. And while most cells end up with too much iron, one particular type of cell ends up with much less iron than normal. The cells that hemochromatosis is stingy with when it comes to iron are a type of white blood cell called macrophages. Macrophages are the police wagons of the immune system. They circle our systems looking for trouble; when they find it, they surround it, try to subdue or kill it, and bring it back to the station in our lymph nodes.

In a nonhemochromatic person, macrophages have plenty of iron. Many infectious agents, like the bacterium that causes tuberculosis, can use that iron within the macrophage to feed and multiply (which is exactly what the body is trying to prevent through the iron-locking response). So when a normal macrophage gathers up certain infectious agents to protect the body, it is inadvertently giving those infectious agents Trojan-horse access to the iron they need to grow stronger. By the time those macrophages get to the lymph node, the invaders in the wagon are armed and dangerous and can use the lymphatic system to travel throughout the body. That’s exactly what happens with bubonic plague: the swollen and bursting lymph nodes that characterize it are the direct result of the bacteria’s subversion of the body’s immune system for its own purposes.

Ultimately, the ability to access iron within our macrophages is what makes some intracellular infections deadly and others benign. The longer our immune system is able to prevent an infection from spreading by containing it, the better it can develop other means, like antibodies, to overwhelm it. If your macrophages lack iron, as they do in people who have hemochromatosis, those macrophages have an additional advantage – not only do they isolate infectious agents and cordon them off from the rest of the body, they also starve those infectious agents to death.

New research has demonstrated that iron-deficient macrophages are indeed the Bruce Lees of the immune system. In one set of experiments, macrophages from people who had hemochromatosis and macrophages from people who did not were matched against bacteria in separate dishes to test their killing ability. The hemochromatic macrophages crushed the bacteria – by starving them of iron, it’s thought, far more effectively than their nonhemochromatic counterparts could.

Which brings us full circle. Why would you take a pill that was guaranteed to kill you in forty years? Because it will save you tomorrow. Why would we select a gene that will kill us through iron loading by the time we reach what is now middle age? Because it will protect us from a disease that is killing everyone else long before that.

Hemochromatosis is caused by a genetic mutation. It predates the plague, of course. Recent research has suggested that it originated with the Vikings and was spread throughout Northern Europe as the Vikings colonized the European coastline. It may have originally evolved as a mechanism to minimize iron deficiencies in poorly nourished populations living in harsh environments. (If this was the case, you’d expect to find hemochromatosis in all populations living in iron-deficient environments, but you don’t.) Some researchers have speculated that women who had hemochromatosis might have benefited from the additional iron absorbed through their diet because it prevented anemia caused by menstruation. This, in turn, led them to have more children, who also carried the hemochromatosis mutation. Even more speculative theories have suggested that Viking men may have offset the negative effects of hemochromatosis because their warrior culture resulted in frequent blood loss.

As the Vikings settled the European coast, the mutation may have grown in frequency through what geneticists call the founder effect. When small populations establish colonies in unpopulated or secluded areas, there is significant inbreeding for generations. This inbreeding virtually guarantees that any mutations that aren’t fatal at a very early age will be maintained in large portions of the population.

Then, in 1347, the plague begins its march across Europe. People who have the hemochromatosis mutation are especially resistant to infection because of their iron-starved macrophages. So, though it will kill them decades later, they are much more likely than people without hemochromatosis to survive the plague, reproduce, and pass the mutation on to their children. In a population where most people don’t survive until middle age, a genetic trait that will kill you when you get there but increases your chance of arriving is – well, something to ask for.
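If you like, you can watch that logic play out in a toy model. The sketch below is not from the book, and its numbers are invented for illustration – the 0.33 echoes the one-in-three untreated survival rate mentioned earlier, while the 0.64 survival advantage for carriers is a pure assumption – but it shows how fast a protective allele can spread when plague returns every generation and reproduction happens before the gene’s late-life cost comes due:

```python
# A toy model, not from the book: how an allele that kills in middle age
# but improves plague survival can spread through a population.
# All parameters are illustrative assumptions, not historical estimates.

def next_allele_freq(q, plague, carrier_survival=0.64, noncarrier_survival=0.33):
    """Advance one generation. Reproduction happens before the allele's
    late-life cost comes due, so only plague survival affects fitness."""
    p = 1 - q
    # Hardy-Weinberg genotype frequencies (a = the protective allele).
    freq = {"AA": p * p, "Aa": 2 * p * q, "aa": q * q}
    # Fitness = chance of living long enough to reproduce.
    fitness = {
        "AA": noncarrier_survival if plague else 1.0,
        "Aa": carrier_survival if plague else 1.0,
        "aa": carrier_survival if plague else 1.0,
    }
    mean_fitness = sum(freq[g] * fitness[g] for g in freq)
    # After selection, aa parents pass on two copies of a, Aa parents one.
    return (freq["aa"] * fitness["aa"] + 0.5 * freq["Aa"] * fitness["Aa"]) / mean_fitness

q = 0.01  # a rare allele before 1347
for outbreak in range(12):  # roughly three centuries of generational outbreaks
    q = next_allele_freq(q, plague=True)
    print(f"after outbreak {outbreak + 1}: allele frequency = {q:.3f}")
```

With these toy numbers the allele roughly doubles in frequency over each of the first few outbreaks – the same compounding described next, where every recurrence of the plague bred the mutation a little deeper into the population.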

The pandemic known as the Black Death is the most famous – and deadly – outbreak of bubonic plague, but historians and scientists believe there were recurring outbreaks in Europe virtually every generation until the eighteenth or nineteenth century. If hemochromatosis helped that first generation of carriers to survive the plague, multiplying its frequency across the population as a result, it’s likely that these successive outbreaks compounded that effect, further breeding the mutation into the Northern and Western European populations every time the disease resurfaced over the ensuing three hundred years. The growing percentage of hemochromatosis carriers – potentially able to fend off the plague – may also explain why no subsequent epidemic was as deadly as the pandemic of 1347 to 1350.

This new understanding of hemochromatosis, infection, and iron has provoked a reevaluation of two long-established medical treatments – one very old and all but discredited, the other more recent and all but dogma. The first, bleeding, is back; the second, iron dosing, especially for anemics, is being reconsidered in many circumstances.

Bloodletting is one of the oldest medical practices in history, and nothing has a longer or more complicated record. First recorded three thousand years ago in Egypt, it reached its peak in the nineteenth century only to be roundly discredited as almost savage over the last hundred years. There are records of Syrian doctors using leeches for bloodletting more than two thousand years ago and accounts of the great Jewish scholar Maimonides’ employing bloodletting as the physician to the royal court of Saladin, sultan of Egypt, in the twelfth century. Doctors and shamans from Asia to Europe to the Americas used instruments as varied as sharpened sticks, shark’s teeth, and miniature bows and arrows to bleed their patients.

In Western medicine, the practice was derived from the thinking of the Greek physician Galen, who taught the theory of the four humours – blood, black bile, yellow bile, and phlegm. According to Galen and his intellectual descendants, all illness resulted from an imbalance of the four humours, and it was the doctor’s job to balance those fluids through fasting, purging, and bloodletting.

Volumes of old medical texts are devoted to how and how much blood should be drawn. An illustration from a 1506 book on medicine points to forty-three different places on the human body that should be used for bleeding – fourteen on the head alone.

For centuries in the West, the place to go for bloodletting was the barber shop. In fact, the barber’s pole originated as a symbol for bloodletting – the brass bowl at the top represented the bowl where leeches were kept; the one at the bottom represented the bowl for collecting blood. And the red and white spirals have their origins in the medieval practice of hanging bandages on a pole to dry them after they were washed. The bandages would twist in the wind and wrap themselves in spirals around the pole. As to why barbers were the surgeons of the day? Well, they were the guys with the razors.

Bloodletting reached its peak in the eighteenth and nineteenth centuries. According to medical texts of the time, if you presented to your doctor with a fever, hypertension, or dropsy, you would be bled. If you had an inflammation, apoplexy, or a nervous disorder, you would be bled. If you suffered from a cough, dizziness, headache, drunkenness, palsy, rheumatism, or shortness of breath, you would be bled. As crazy as it sounds, even if you were hemorrhaging blood you would be bled.

Modern medical science has been skeptical of bloodletting for many reasons – at least some of them deserved. First of all, the eighteenth- and nineteenth-century reliance on bleeding as a treatment for just about everything was rightly suspect.

When George Washington was ill with a throat infection, doctors treating him conducted at least four bleedings in just twenty-four hours. It’s unclear today whether Washington actually died from the infection or from shock caused by blood loss. Doctors in the nineteenth century routinely bled patients until they fainted; they took that as a sign they’d removed just the right amount of blood.

After millennia of practice, bloodletting fell into extreme disfavor at the beginning of the twentieth century. The medical community – even the general public – considered bleeding to be the epitome of everything that was barbaric about prescientific medicine. Now, new research indicates that – like so much else – the broad discrediting of bloodletting may have been a rush to judgment.

First of all, it’s now absolutely clear that bloodletting – or phlebotomy, as it’s known today – is the treatment of choice for hemochromatosis patients. Regular bleeding of hemochromatosis patients reduces the iron in their systems to normal levels and prevents the iron buildup in the body’s organs that is so damaging.

It’s not just for hemochromatosis, either – doctors and researchers are examining phlebotomy as an aid in combating heart disease, high blood pressure, and pulmonary edema. And even our complete dismissal of historic bloodletting practices is getting another look. New evidence suggests that, in moderation, bloodletting may have had a beneficial effect.

A Canadian physiologist named Norman Kasting discovered that bleeding animals induces the release of the hormone vasopressin; this reduces their fevers and spurs their immune systems into higher gear. The connection hasn’t been unequivocally proven in humans, but there is a strong correlation between bloodletting and fever reduction in the historical record. Bleeding also may have helped to fight infection by reducing the amount of iron available to feed an invader, providing an assist to the body’s natural tendency to hide iron when it recognizes an infection.

When you think about it, the notion that humans across the globe continued to practice phlebotomy for thousands of years probably indicates that it produced some positive results. If everyone who was treated with bloodletting died, its practitioners would have been out of business pretty quickly.

One thing is clear – an ancient medical practice that “modern” medical science dismissed out of hand is the only effective treatment for a disease that would otherwise destroy the lives of thousands of people. The lesson for medical science is a simple one – there is much more that the scientific community doesn’t understand than there is that it does understand.

Iron is good. Iron is good. Iron is good.

Well, now you know that, like just about every other good thing under the sun, when it comes to iron, it’s moderation, moderation, moderation. But until recently, medical thinking didn’t recognize that. Iron was thought to be good, so the more iron the better.

A doctor named John Murray was working with his wife in a Somali refugee camp when he noticed that many of the nomads, despite pervasive anemia and repeated exposure to a range of virulent pathogens, including malaria, tuberculosis, and brucellosis, were free of visible infection. Intrigued by this anomaly, he decided at first to treat only part of the population for anemia with iron supplements. Sure enough, among the nomads given the extra iron, infections suddenly gained the upper hand – the rate of infection skyrocketed. The Somali nomads weren’t withstanding these infections despite their anemia: they were withstanding them because of their anemia. It was iron locking in high gear.

Thirty-five years ago, doctors in New Zealand routinely injected Maori babies with iron supplements. They assumed that the Maori (the indigenous people of New Zealand) had a poor diet, lacking iron, and that their babies would be anemic as a result.

The Maori babies injected with iron were seven times as likely to suffer from potentially deadly infections, including septicemias (blood poisoning) and meningitis. Like all of us, babies have isolated strains of potentially harmful bacteria in their systems, but those strains are normally kept under control by their bodies. When the doctors gave these babies iron boosters, they were giving booster fuel to the bacteria, with tragic results.

It’s not just iron dosing through injection that can cause this blossoming of infections; iron-supplemented food can be food for bacteria too. Many infants harbor botulism spores in their intestines (the spores can be found in honey, which is one of the reasons parents are warned not to feed honey to babies, especially before they turn one). If the spores germinate, the results can be fatal. A study of sixty-nine cases of infant botulism in California showed one key difference between fatal and nonfatal cases of botulism in babies. Babies fed iron-supplemented formula instead of breast milk were much younger when they began to get sick, and more vulnerable as a result. All ten of the babies who died had been fed the iron-supplemented formula.

By the way, hemochromatosis and anemia aren’t the only hereditary diseases that have gained pride of place in our gene pool by offering protection from another threat, and they’re not all related to iron. The second most common genetic disease in Europeans, after hemochromatosis, is cystic fibrosis. It’s a terrible, debilitating disease that affects different parts of the body. Most people with cystic fibrosis die young, usually from lung-related illness. Cystic fibrosis is caused by a mutation in a gene called CFTR; it takes two copies of the mutated gene to cause the disease. Somebody with only one copy of the mutated gene is known as a carrier but does not have cystic fibrosis. It’s thought that at least 2 percent of people descended from Europeans are carriers, making the mutation very common indeed from a genetic perspective. New research suggests that, sure enough, carrying a copy of the gene that causes cystic fibrosis seems to offer some protection from tuberculosis. Tuberculosis, which has also been called consumption because of the way it seems to consume its victims from the inside out, caused 20 percent of all the deaths in Europe between 1600 and 1900, making it a very deadly disease. And making anything that helped to protect people from it look pretty attractive while lounging in the gene pool.

Aran Gordon first manifested symptoms of hemochromatosis as he began training for the Marathon des Sables – that grueling 150-mile race across the Sahara Desert. But it would take three years of progressive health problems, frustrating tests, and inaccurate conclusions before he finally learned what was wrong with him. When he did, he was told that untreated he had five years to live.

Today, we know that Aran suffered the effects of the most common genetic disorder in people of European descent – hemochromatosis, a disorder that may very well have helped his ancestors to survive the plague.

Today, Aran’s health has been restored through bloodletting, one of the oldest medical practices on earth.

Today, we understand much more about the complex interrelationship of our bodies, iron, infection, and conditions like hemochromatosis and anemia.

What doesn’t kill us makes us stronger.

Which is probably some version of what Aran Gordon was thinking when he finished the Marathon des Sables for the second time in April 2006 – just a few months after he was supposed to have died.

