Chapter Two

A SPOONFUL OF SUGAR HELPS THE TEMPERATURE GO DOWN
The World Health Organization estimates that 171 million people have diabetes – and that number is expected to double by 2030. You almost certainly know people with diabetes – and you certainly have heard of people with diabetes. Halle Berry, Mikhail Gorbachev, and George Lucas all have diabetes. It’s one of the most common chronic diseases in the world, and it’s getting more common every day.
Diabetes is all about the body’s relationship to sugar, specifically the blood sugar known as glucose. Glucose is produced when the body breaks down carbohydrates in the food we eat. It’s essential to survival – it provides fuel for the brain; it’s required to manufacture proteins; it’s what we use to make energy when we need it. With the help of insulin, a hormone made by the pancreas, glucose is stored in your liver, muscles, and fat cells (think of them as your own internal OPEC) waiting to be converted to fuel as necessary.
The full name of the disease is actually diabetes mellitus – which literally means “passing through honey sweet.” One of the first outward manifestations of diabetes is the need to pass large amounts of sugary urine. And for thousands of years, observers have noticed that diabetics’ urine smells (and tastes) particularly sweet. In the past Chinese physicians actually diagnosed and monitored diabetes by looking to see whether ants were attracted to someone’s urine. In diabetics, the process through which insulin helps the body use glucose is broken, and the sugar in the blood builds up to dangerously high levels. Unmanaged, these abnormal blood sugar levels can lead to rapid dehydration, coma, and death. Even when diabetes is tightly managed, its long-term complications include blindness, heart disease, stroke, and vascular disease that often leads to gangrene and amputation.
There are two major types of diabetes, Type 1 and Type 2, commonly called juvenile diabetes and adult-onset diabetes, respectively, because of the age at which each type is usually diagnosed. (Increasingly, adult-onset diabetes is becoming a misnomer: skyrocketing rates of childhood obesity are leading to increasing numbers of children who have Type 2 diabetes.)
Some researchers believe that Type 1 diabetes is an autoimmune disease – the body’s natural defense system incorrectly identifies certain cells as outside invaders and sets out to destroy them. In the case of Type 1 diabetes, the cells that fall victim to this biological friendly fire are the precise cells in the pancreas responsible for insulin production. No insulin means the body’s blood sugar refinery is effectively shut down. As of today, Type 1 diabetes can only be treated with daily doses of insulin, typically through self-administered injections, although it is also possible to have an insulin pump surgically implanted. On top of daily insulin doses, Type 1 requires vigilant attention to blood sugar levels and a superdisciplined approach to diet and exercise.
In Type 2 diabetes, the pancreas still produces insulin – sometimes even at high levels – but either production eventually falls too low or other tissues in the body become resistant to it, impairing the absorption and conversion of blood sugar. Because the body is still producing insulin, Type 2 diabetes can often be managed without insulin injections, through a combination of other medications, careful diet, exercise, weight loss, and blood sugar monitoring.
There is also a third type of diabetes, called gestational diabetes because it occurs in pregnant women. Gestational diabetes is usually temporary and tends to resolve itself after pregnancy. In the United States, it occurs in as many as 4 percent of pregnancies – some 100,000 expectant mothers a year. It can also lead to a condition in the newborn called macrosomia – a fancy term for “really chubby baby” – as all the extra sugar in the mother’s bloodstream makes its way across the placenta and feeds the fetus. Some researchers think this type of diabetes may be “intentionally” triggered by a hungry fetus looking for Mommy to stock the buffet table with sugary glucose.
So what causes diabetes? The truth is, we don’t fully understand. It’s a complex combination that can involve inheritance, infections, diet, and environmental factors. At the very least, inheritance creates a predisposition to diabetes that can then be triggered by some other factor. In the case of Type 1 diabetes, that trigger may be a virus or some other environmental exposure. In the case of Type 2, scientists think many people pull the trigger themselves through poor eating habits, lack of exercise, and resulting obesity. But one thing is clear – genetics contributes to Type 1 and especially to Type 2 diabetes. And that’s where, for our purposes, things really start to heat up. Or, more precisely, to cool down, as you’ll see shortly.
There’s a big difference in the prevalence of Type 1 and Type 2 diabetes that is largely based on geographic origin. Even though there seems to be a stronger genetic component to Type 2 diabetes, it is also closely related to lifestyle; 85 percent of people who have this type of diabetes are obese. That means it’s currently much more common in the developed world because easy access to high-calorie, low-nutrient junk food means so many more people are obese – but it seems clear that the predisposition to Type 2 diabetes exists across population groups. There are higher levels of incidence in certain populations, of course – but even that tends to occur hand in hand with higher levels of obesity. The Pima Indians of the southwestern United States, for example, have a staggering rate of diabetes – nearly half of all adults. It’s possible that their historic hunter-gatherer lifestyle produced metabolisms more suited for the Atkins diet than the carbohydrate- and sugar-heavy diet that European farmers survived on for centuries. Type 1 diabetes is different – it is much, much more common in people of Northern European descent. Finland has the highest rate of juvenile diabetes in the world. Sweden is second, and the United Kingdom and Norway are tied for third. As you head south, the rate drops lower and lower. It’s downright uncommon in people of purely African, Asian, and Hispanic descent.
When a disease that is caused at least partially by genetics is significantly more likely to occur in a specific population, it’s time to raise the evolutionary eyebrows and start asking questions – because that almost certainly means that some aspect of the trait that causes the disease today helped the forebears of that population group to survive somewhere back up the evolutionary line.
In the case of hemochromatosis, we know that the disease probably provided carriers with protection from the plague by denying the bacterium that causes it the iron it needs to survive. So what could diabetes possibly do for us? To answer that, we’re going to take another trip down memory lane – this time measured not in centuries but in millennia. Put your ski jackets on; we’re looking for an ice age.
Until about fifty years ago, the conventional wisdom among scientists who studied global climate change was that large-scale climate change occurred very slowly. Today, of course, people from Al Gore to Julia Roberts are on a mission to make it clear that humanity has the power to cause cataclysmic change in just a few generations. But before the 1950s, most scientists believed that climate change took thousands, probably hundreds of thousands, of years.
That doesn’t mean they didn’t accept the notion that glaciers and ice sheets had once covered the Northern Hemisphere. They were just happily certain that glaciers moved, well, glacially: eons to descend and epochs to recede. Humanity certainly didn’t have to worry about it – nobody was ever going to be run over by a speeding glacier. If massive climate change was going to lead us into a new ice age, we’d have a few hundred thousand years to do something about it.
Of course, there were some contrary voices singing a different tune, but the larger scientific community paid them very little regard. Andrew Ellicott Douglass was an astronomer working in Arizona in 1895 when he first started cutting down trees to examine them for evidence of any effect from sunspots, a type of solar activity that occurs in cycles. He never found it – but he did ultimately invent dendrochronology, the scientific technique of studying tree rings for clues about the past. One of his first observations was that tree rings were thinner during cold or dry years and thicker during wet or warm years. And by rolling back the years, one ring at a time, he discovered what appeared to be a century-long climate change around the seventeenth century, with a significant drop in temperature. The reaction of the scientific community was a collective “Nah.” As far as the climate change community was concerned, Douglass was cutting down trees in a forest with nobody there to hear it. (According to Dr. Lloyd Burckle of Columbia University, not only was Douglass right: the hundred-year cold spell he discovered was responsible for some beautiful music. Burckle says the superior sound of the instruments made by the great European violin makers, including the famous Stradivari, is the result of the high-density wood from the trees that grew during this century-long freeze – denser because they grew less during the cold and had thinner rings as a result.)
More evidence of the possibility of rapid climate change accumulated. In Sweden, scientists studying layers of mud from lake bottoms found evidence of climate change that occurred much more quickly than anyone at the time thought possible. These scientists discovered large amounts of pollen from an Arctic wildflower called Dryas octopetala in mud cores from only 12,000 years ago. Dryas’s usual home is the Arctic; it only truly flourished across Europe during periods of significant cold. Its widespread prevalence in Sweden around 12,000 years ago seemed to indicate that the warm weather that had followed the last ice age had been interrupted by a rapid shift back to much colder weather. In honor of the telltale wildflower, they named this arctic reprise the Younger Dryas. Of course, given prevailing thinking, even these scientists believed that the “rapid” onset of the Younger Dryas took 1,000 years or so.
It’s hard to overstate the chilling effect conventional wisdom can have on the scientific community. Geologists of the time believed the present was the key to the past – if this is the way the climate behaves today, that’s the way it behaved yesterday. That philosophy is called uniformitarianism and, as the physicist Spencer Weart points out in his 2003 book The Discovery of Global Warming, it was the guiding principle among scientists of the time:
Through most of the 20th century, the uniformitarian principle was cherished by geologists as the very foundation of their science. In human experience, temperatures apparently did not rise or fall radically in less than millennia, so the uniformitarian principle declared that such changes had never happened in the past.
If you’re positive something doesn’t exist, you’re not going to look for it, right? And because everyone was certain that global climate changes took at least a thousand years, nobody even bothered to look at the evidence in a way that could reveal faster change. Those Swedish scientists studying the layers of lake bottom clay who first postulated the “rapid” thousand-year onset of the Younger Dryas? They were looking at chunks of mud spanning centuries; they never looked at samples small enough to demonstrate faster change. The proof that the Younger Dryas descended on the Northern Hemisphere much more rapidly than they thought was right in front of their eyes – but they were blinded by their assumptions.
By the 1950s and 1960s, the uniformitarian vise started to lose its hold, or at least change its grip, as scientists began to understand the potential of catastrophic events to produce rapid change. In the late 1950s, Dave Fultz at the University of Chicago built a mock-up of the earth’s atmosphere using rotating fluids that simulated the behavior of atmospheric gases. Sure enough, the fluids moved in stable, repeating patterns – unless, that is, they were disturbed. Then, even the smallest interference could produce massive changes in the currents. It wasn’t proof by a long shot, but it certainly was a powerful suggestion that the real atmosphere was susceptible to significant change. Other scientists developed mathematical models that indicated similar possibilities for rapid shifts.
As new evidence was discovered and old evidence was re-examined, the scientific consensus evolved. By the 1970s there was general agreement that the temperature shifts and climate changes leading into and out of ice ages could occur over mere hundreds of years. Thousands were out, hundreds were in. Centuries were the new “rapid.”
There was a new consensus around when – but a total lack of agreement about how. Perhaps methane bubbled up from tundra bogs and trapped the heat of the sun. Perhaps ice sheets broke off from the Antarctic and cooled the oceans. Maybe a glacier melted into the North Atlantic, creating a massive freshwater lake that suddenly interrupted the ocean’s delivery of warm tropical water to the north.
It’s fitting that hard, cold proof was eventually found in hard, cold ice.
In the early 1970s, climatologists discovered that some of the best records of historic weather patterns were filed away in the glaciers and ice plateaus of northern Greenland. It was hard, treacherous work – if you’re imagining the stereotypical lab rat in a white coat, think again. This was Extreme Sports: Ph.D. – multinational teams trekking across miles of ice, climbing thousands of feet, hauling tons of machines, and enduring altitude sickness and freakish cold, all so they could drill a core through two miles of ice. But the prize was a pristine and unambiguous record of yearly precipitation and past temperature, unspoiled by millennia and willing to reveal its secrets with just a little chemical analysis. Once you paid it a visit, of course.
By the 1980s, these ice cores definitively confirmed the existence of the Younger Dryas – a severe drop in temperature that began around 13,000 years ago and lasted more than a thousand years. But that was just, well, the tip of the iceberg.
In 1989 the United States mounted an expedition to drill a core all the way to the bottom of the two-mile Greenland ice sheet – representing 110,000 years of climate history. Just twenty miles away, a European team was conducting a similar study. Four years later, both teams got to the bottom – and the meaning of rapid was about to change again.
The ice cores revealed that the Younger Dryas – the final cold snap of the last ice age – ended in just three years. Ice age to no ice age – not in three thousand years, not in three hundred years, but in three plain years. What’s more, the ice cores revealed that the onset of the Younger Dryas took just a decade. The proof was crystal clear this time – rapid climate change was very real. It was so rapid that scientists stopped using the word rapid to describe it, and started using words like abrupt and violent. Dr. Weart summed it up in his 2003 book:
Swings of temperature that scientists in the 1950s believed to take tens of thousands of years, in the 1970s to take thousands of years, and in the 1980s to take hundreds of years, were now found to take only decades.
In fact, there have been around a score of these abrupt climate changes over the last 110,000 years; the only truly stable period has been the last 11,000 years or so. Turns out, the present isn’t the key to the past – it’s the exception.
The most likely suspect for the onset of the Younger Dryas and the sudden return to ice age temperatures across Europe is the breakdown of the ocean “conveyor belt,” or thermohaline circulation, in the Atlantic Ocean. When it’s working normally – or at least the way we’re used to it – the conveyor carries warm tropical water on the ocean surface to the north, where it cools, becomes denser, sinks, and is carried south through the ocean depths back to the Tropics. Under those circumstances, Britain is temperate even though it’s on the same latitude as much of Siberia. But when the conveyor is disrupted – say, by a huge influx of fresh water melting off the Greenland ice sheet – it can have a significant impact on global climate and turn Europe into a very, very cold place.
Just before the Younger Dryas, our European ancestors were doing pretty well. Tracing human migration through DNA, scientists have documented a population explosion in Northern Europe as populations that had once migrated north out of Africa now moved north again into areas of Europe that had been uninhabitable during the last ice age (before the Younger Dryas). The average temperature was nearly as warm as it is today, grasslands flourished where glaciers had once stood, and human beings thrived.
And then the warming trend that had persisted since the end of the last ice age kicked rapidly into reverse. In just a decade or so, average yearly temperatures plunged nearly thirty degrees. Sea levels dropped by hundreds of feet as water froze and stayed in the ice caps. Forests and grasslands went into a steep decline. Coastlines were surrounded by hundreds of miles of ice. Icebergs were common as far south as Spain and Portugal. The great, mountainous glaciers marched south again. The Younger Dryas had arrived, and the world was changed.
Though humanity would survive, the short-term impact, especially for those populations that had moved north, was devastating. In less than a generation, virtually every learned method of survival – from the shelters they built to the hunting they practiced – was inadequate. Many thousands of humans almost certainly froze or starved to death. Radiocarbon dating of archaeological sites provides clear evidence that the human population of Northern Europe went into steep decline, with a sharp drop-off in settlements and other traces of human activity.
But humans clearly survived; the question is, how? Certainly some of our success was due to social adaptation – many scientists think that the Younger Dryas helped to spur the collapse of hunter-gatherer societies and the first development of agriculture. But what about biological adaptation and natural selection? Scientists believe some animals perfected their natural ability to survive cold spells during this period – notably the wood frog, which we’ll return to later. So why not humans? Just as the European population may have “selected” the hemochromatosis gene because it helped its carriers withstand the plague, might some other genetic trait have provided its carriers with superior ability to withstand the cold? To answer that, let’s take a look at the effect of cold on humans.
Immediately upon his death in July 2002, baseball legend Ted Williams was flown to a spa in Scottsdale, Arizona, checked in, and given a haircut, a shave, and a cold plunge. Of course, this wasn’t your typical Arizona spa – this was the Alcor Life Extension cryonics lab, and Williams was checking in for the foreseeable future. According to his son, he hoped that future medical science might be able to restore him to life.
Alcor separated Williams’s head from his body, drilled a couple of dime-size holes in it, and froze it in a bucket of liquid nitrogen at minus 320 degrees Fahrenheit. (His body got its own cold storage container.) Alcor brochures suggest that “mature nanotechnology” might be able to reanimate frozen bodies “perhaps by the mid-21st century,” but they also note that cryonics is a “last-in-first-out process wherein the first-in may have to wait a very long time.”
Make that a very, very long time, like … never. Unfortunately for Williams and the other sixty-six superchilled cadavers at Alcor, human tissue doesn’t react well to freezing. When water is frozen, it expands into sharp little crystals. When humans are frozen, the water in our blood freezes, and the ice shards cut blood cells and cause capillaries to burst. It’s not dissimilar to the way a pipe bursts when the water’s left on in an unheated house – except no repairman can fix it.
Of course, just because we can’t survive a true deep freeze doesn’t mean our bodies haven’t evolved many ways to manage the cold. They have. Not only is your body keenly aware of the danger cold poses, it’s got a whole arsenal of natural defenses. Think back to some time when you were absolutely freezing – standing still for hours on a frigid winter morning watching a parade, riding a ski lift with the wind whipping across the mountain. You start to shiver. That’s your body’s first move. When you shiver, the increased muscle activity burns the sugar stored in your muscles and creates heat. What happens next is less obvious, but you’ve felt the effect. Remember the uncomfortable combination of tingling and numbness in your fingers and toes? That’s your body’s next move.
As soon as the body senses cold, it constricts the thin web of capillaries in your extremities, first your fingers and toes, then farther up your arms and legs. As your capillary walls close in, blood is squeezed out and driven toward your torso, where it essentially provides a warm bath for your vital organs, keeping them at a safe temperature, even if it means the risk of frostbite for your extremities. It’s natural triage – lose the finger, spare the liver.
In people whose ancestors lived in particularly cold climates – like Norwegian fishermen or Inuit hunters – this autonomic response to cold has evolved with a further refinement. After some time in the cold, the constricted capillaries in the hands will dilate briefly, sending a rush of warm blood into numbed fingers and toes before constricting again to drive the blood back into the core. This intermittent cycle of constriction and release is called the Lewis wave or “hunter’s response,” and it can provide enough warmth to protect the extremities from real injury, while still ensuring that the vital organs are safe and warm. Inuit hunters can raise the temperature in the skin of their hands from near freezing to fifty degrees in a matter of minutes; for most people it takes much longer. On the other hand, people descended from warm-weather populations don’t seem to have this natural ability to protect their limbs and their core at the same time. During the frigid cold of the Korean War, African American soldiers were much more prone to frostbite than other soldiers.
Shivering and blood vessel constriction aren’t the only ways the body generates and preserves heat. A portion of the fat in newborns and some adults is specialized heat-generating tissue called brown fat, which is activated when the body is exposed to cold. When blood sugar is delivered to a brown fat cell, instead of being stored for future energy as it is in a regular fat cell, the brown fat cell converts it to heat on the spot. (For someone acclimated to very cold temperatures, brown fat can burn up to 70 percent more fat.) Scientists call the brown fat process nonshivering thermogenesis, because it’s heat creation without muscle movement. Shivering, of course, is only good for a few hours; once you exhaust the blood sugar stores in your muscles and fatigue sets in, it doesn’t work anymore. Brown fat, on the other hand, can go on generating heat for as long as it’s fed, and unlike most other tissues, it doesn’t need insulin to bring sugar into cells.
Nobody’s written the Brown Fat Diet Book yet because it requires more than your usual lifestyle change. Adults who don’t live in extreme cold don’t really have much, if any, brown fat. To accumulate brown fat and get it really working, you need to live in extreme cold for a few weeks. We’re talking North Pole cold. And that’s not all – you’ve got to stay there. Once you stop sleeping in your igloo, your brown fat stops working.
The body has one more response to the cold that’s not completely understood – but you’ve probably experienced it. When most people are exposed to cold for a while, they need to pee. This response has puzzled medical researchers for hundreds of years. It was first noted by one Dr. Sutherland, in 1764, who was trying to document the benefits of submersing patients in the supposedly healing – but cold – waters of Bath and Bristol, England. After immersing a patient who suffered from “dropsy, jaundice, palsy, rheumatism and inveterate pain in his back,” Sutherland noted that the patient was “pissing more than he drank.” Sutherland chalked the reaction up to external water pressure, figuring (quite wrongly) that fluid was simply being squeezed out of his patient, and it wasn’t until 1909 that researchers connected increased urine flow, or diuresis, to cold exposure.
The leading explanation for cold diuresis – the need to pee when it’s cold – is still pressure, but internal pressure rather than external. The theory is that as blood pressure climbs in the body’s core because of constriction in the extremities, the body signals the kidneys to offload some of the extra fluid. But that theory doesn’t fully explain the phenomenon, especially in light of recent studies.
The U.S. Army Research Institute of Environmental Medicine has conducted more than twenty years of study into human response to extreme heat, cold, depth, and altitude. Their research conclusively demonstrates that even highly cold-acclimated individuals still experience cold diuresis when the temperature dips toward freezing. So the question persists: Why do we need to pee when we’re cold? This certainly isn’t the most pressing question facing medical researchers today – but as you’ll soon discover, the possibilities are intriguing. And the answers may shed light on much bigger issues – like a disease that currently affects 171 million people.
Let’s put aside the delicate subject of cold diuresis and turn to one much more suitable for the dinner table – ice wine: delicious, prized, and – supposedly – created by accident. Four hundred years ago, a German vintner was hoping to squeak just a few more growing days out of the late autumn when his fields were hit by a sudden frost, or so the story goes. The grapes were curiously shrunken, but, not wanting to let his entire harvest go to waste, he decided to pick the frozen grapes anyway and see what would come of it. He let the grapes defrost and then pressed the crop as he usually did, but was disappointed when it yielded just one-eighth of the juice he was expecting. Since he had nothing to lose, he put his meager yield through the fermentation process.
And discovered that he had a hit on his hands. The finished wine was insanely sweet. Since its first, semilegendary, certainly accidental harvest, some winemakers have specialized in ice wine, waiting every year for the first frost so they can harvest crops of frozen grapes. Among the many ways wine is rated, graded, and weighted today, it is measured on a “sugar scale.” Typical table wine runs from 0 to 3 on the sugar scale. Ice wine runs from 18 to 28.
The shrunken nature of the grapes is due to water loss. Chemically speaking, it’s not difficult to guess why grapes might have evolved to offload water at the onset of a freeze – the less water in the grape, the fewer ice crystals there are to damage the delicate membranes of the fruit.
How about the sharp increase in sugar concentration? That makes sense too. Ice crystals are made only of pure water – but the temperature at which they start to form depends on what else is suspended in the fluid where the water is found. Anything dissolved in water interferes with its ability to form the hexagonal latticework of solid ice crystals. Average seawater, for example, full of salt, freezes at around 28 degrees Fahrenheit instead of the 32 degrees we think of as water’s freezing point. Think about the bottle of vodka some people keep in their freezer. Usually, alcohol is about 40 percent of the liquid volume in the bottle; it does a great job of interfering with the creation of ice – vodka doesn’t freeze until you cool it down to around minus 20 degrees Fahrenheit. Even most water in nature doesn’t freeze at exactly 32 degrees, because it usually contains trace minerals or other impurities that lower the freezing point.
Like alcohol, sugar is a natural antifreeze. The higher the sugar content in a liquid, the lower the freezing point. (Nobody knows more about sugar and freezing than the food service chemists at 7-Eleven who were in charge of developing a sugar-free Slurpee beverage. In regular Slurpees, the sugar is what helps to keep the frozen treat slurpable – it prevents the liquid from completely freezing. So when they tried to make sugar-free Slurpees, they kept making sugar-free blocks of ice. According to a company press release, it took two decades for researchers to develop a diet Slurpee by combining artificial sweeteners with indigestible sugar alcohols.) So when the grape dumps water at the first sign of frost, it’s actually protecting itself in two ways – first, by reducing water volume; and second, by raising the sugar concentration of the water that remains. And that allows the grape to withstand colder temperatures without freezing.
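For readers who want to see the arithmetic behind those freezing points, the seawater figure above can be checked with the standard freezing-point-depression relation from introductory chemistry. The sketch below is a back-of-envelope illustration only; the salinity, molar mass, and constant are generic textbook values rather than figures given in this chapter.

% Colligative freezing-point depression: Delta T_f = i * K_f * m
% Assumed textbook values (not taken from this chapter):
%   K_f for water      ~ 1.86 °C·kg/mol
%   seawater salt load ~ 35 g NaCl per kg of water
%   NaCl molar mass    ~ 58.4 g/mol, dissociating into two ions, so i = 2
\[
  m \;\approx\; \frac{35\ \text{g/kg}}{58.4\ \text{g/mol}} \;\approx\; 0.60\ \text{mol/kg}
\]
\[
  \Delta T_f \;=\; i\,K_f\,m \;\approx\; 2 \times 1.86\ \tfrac{^\circ\mathrm{C}\cdot\text{kg}}{\text{mol}} \times 0.60\ \tfrac{\text{mol}}{\text{kg}} \;\approx\; 2.2\,^\circ\mathrm{C} \;\approx\; 4\,^\circ\mathrm{F}
\]
% That puts seawater's freezing point near 32 °F - 4 °F = 28 °F, matching the
% figure in the text. Dissolved sugar works the same way: the more solute left
% behind as a grape sheds water, the lower the freezing point of what remains.

The same relation, pushed well past its dilute-solution comfort zone, is also why the heavy dose of alcohol in vodka keeps it liquid at twenty below.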
Eliminating water to deal with the cold? That sounds an awful lot like cold diuresis – peeing when you’re cold. And higher levels of sugar? Well, we know where we’ve heard that; but before we get back to diabetes, let’s make one more stop: the animal kingdom.
Many animals thrive in the cold. Some amphibians, like the bullfrog, spend the winter in the frigid but unfrozen water at the bottom of lakes and rivers. The mammoth Antarctic cod happily swims beneath the Antarctic ice; its blood contains an antifreeze protein that sticks to ice crystals and prevents them from growing. In the Arctic, the woolly bear caterpillar lives through temperatures as low as minus 60 degrees Fahrenheit for fourteen years, until it turns into a moth and flies off into the sunset for a few short weeks.
But of all the adaptations to cold under the sun – or hidden from it – none is as remarkable as the little wood frog’s.
The wood frog, Rana sylvatica, is a cute little critter about two inches long, with a dark mask across its eyes like Zorro’s, that lives across North America, from northern Georgia all the way up to Alaska, including north of the Arctic Circle. On early spring nights you can hear its mating call – a “brack, brack” that sounds something like a baby duck’s. But until winter ends, you won’t hear the wood frog at all. Like some animals, the wood frog spends the entire winter unconscious. But unlike hibernating mammals that go into a deep sleep, kept warm and nourished by a thick layer of insulating fat, the wood frog gives in to the cold entirely. It buries itself under an inch or two of twigs and leaves and then pulls a trick that – despite Ted Williams’s possible hopes and Alcor’s best efforts – seems to come straight out of a science fiction movie.
It freezes solid.
If you were on a winter hike and accidentally kicked one of these frogsicles out into the open, you’d undoubtedly assume it was dead. When completely frozen, it might as well be in suspended animation – it has no heartbeat, no breathing, and no measurable brain activity. Its eyes are open, rigid, and unnervingly white.
But if you pitched a tent and waited for spring, you’d eventually discover that little old Rana sylvatica has a few tricks up its frog sleeves. Just a few minutes after rising temperatures thaw the frog, its heartbeat miraculously sparks into gear and it gulps for air. It will blink a few times as color returns to its eyes, stretch its legs, and pull itself up into a sitting position. Not long after that, it will hop off, none the worse for wear, and join the chorus of defrosted frogs looking for a mate.
Nobody knows the wood frog better than the brilliant and irrepressible Ken Storey, a biochemist from Ottawa, Canada, who, along with his wife, Janet, has been studying them since the early 1980s. Storey had been studying insects with the ability to tolerate freezing when a colleague told him about the wood frog’s remarkable ability. His colleague had been collecting frogs for study and accidentally left them in the trunk of his car. Overnight, there was an unexpected frost and he awoke to discover a bag of frozen frogs. Imagine his surprise later that day when they thawed out on his lab table and started jumping around!
Storey was immediately intrigued. He was interested in cryopreservation – freezing living tissue to preserve it. Despite the bad rap it gets for its association with high-priced attempts to freeze the rich and eccentric for future cures, cryopreservation is a critical area of medical research that has the potential to yield many important advances. It has already revolutionized reproductive medicine by giving people the opportunity to freeze and preserve eggs and sperm.
The next step – the ability to extend the viability of large human organs for transplants – would be a huge breakthrough that could save thousands of lives every year. Today, a human kidney can be preserved for just two days outside the human body, while a heart can last only a few hours. As a result, organ transplants are always a race against the clock, with very little time to find the best match and get the patient, organ, and surgeon into the same operating room. Every day in the United States, a dozen people die because the organ they need hasn’t become available in time. If donated organs could be frozen and “banked” for later revival and transplant, the rates of successful transplants would almost surely climb significantly.
But currently it’s impossible. We know how to use liquid nitrogen to lower the temperature of tissue at the blinding speed of 600 degrees per minute, but it isn’t good enough. We have not figured out how to freeze large human organs and restore them to full viability. And, as was mentioned, we’re nowhere near the ability to freeze and restore a whole person.
So when Storey heard about the freezing frog, he jumped at the opportunity to study it. Frogs have the same major organs as humans, so this new direction for his research could prove amazingly useful. With all our technological prowess, we can’t freeze and restore a single major human organ – and here was an animal that naturally manages the complex chemical wizardry of freezing and restoring all its organs more or less simultaneously. After many years of study (and many muddy nights trudging through the woodlands of southern Canada on wood frog hunts), the Storeys have learned a good deal about the secrets behind Rana sylvatica’s death-defying freezing trick.
Here’s what they’ve uncovered: Just a few minutes after the frog’s skin senses that the temperature is dropping near freezing, it begins to move water out of its blood and organ cells, and, instead of urinating, it pools the water in its abdomen. At the same time, the frog’s liver begins to dump massive (for a frog) amounts of glucose into its bloodstream, supplemented by the release of additional sugar alcohols, pushing its blood sugar level up a hundredfold. All this sugar significantly lowers the freezing point of whatever water remains in the frog’s bloodstream, effectively turning it into a kind of sugary antifreeze.
There’s still water throughout the frog’s body, of course; it’s just been forced into areas where ice crystals will cause the least damage and where the ice itself might even have a beneficial effect. When Storey dissects frozen frogs he finds flat sheets of ice sandwiched between the skin and muscle of the legs. There will also be a big chunk of ice in the abdominal cavity surrounding the frog’s organs; the organs themselves are largely dehydrated and look as wizened as raisins. In effect, the frog has carefully put its own organs on ice, much the way human organs are packed for transport to transplant: doctors remove an organ, place it in a plastic bag, and then set the bag in a cooler full of crushed ice so the organ stays as cool as possible without actually being frozen or damaged.
There’s water in the frog’s blood, too, but the rich concentration of sugar not only lowers the freezing point, it also minimizes damage by forcing the ice crystals that eventually form into smaller, less jagged shapes that won’t puncture or slash the walls of cells or capillaries. Even all of this doesn’t prevent every bit of damage, but the frog has that covered, too. During the winter months of its frozen sleep, the frog produces a large volume of a clotting factor called fibrinogen that helps to repair whatever damage might have occurred during freezing. Eliminating water and driving up sugar levels to deal with the cold: Grapes do it. Now we know that frogs do it. Is it possible that some humans adapted to do it, too?
Is it a coincidence that the people most likely to have a genetic propensity for a disease characterized by exactly that (excessive elimination of water and high levels of blood sugar) are people descended from exactly those places most ravaged by the sudden onset of an ice age about 13,000 years ago?
As a theory, it’s hotly controversial, but diabetes may have helped our European ancestors survive the sudden cold of the Younger Dryas.
As the Younger Dryas set in, any adaptation to manage the cold, no matter how disadvantageous in normal times, might have made the difference between making it to adulthood and dying young. If you had the hunter’s response, for instance, you would have an advantage in gathering food, because you were less likely to develop frostbite.
Now imagine that some small group of people had a different response to the cold. Faced with year-round frigid temperatures, their insulin supply slowed, allowing their blood sugar to rise somewhat. As in the wood frog, this would have lowered the freezing point of their blood. They urinated frequently, to keep internal water levels low. (A recent U.S. Army study shows there is very little harm caused by dehydration in cold weather.) Suppose these people used their brown fat to burn that oversupply of sugar in their blood to create heat. Perhaps they even produced additional clotting factor to repair tissue damage caused by particularly deep cold snaps. It’s not hard to imagine that these people might have had enough of an advantage over other humans to make it more likely that they would survive long enough to reach reproductive age – especially if, like the wood frog, the spike in sugar was only temporary.
There are tantalizing bits of evidence to bolster the theory.
When rats are exposed to freezing temperatures, their bodies become resistant to their own insulin. Essentially, they become what we would call diabetic in response to the cold.
In areas with cold weather, more diabetics are diagnosed in colder months; in the Northern Hemisphere, that means more diabetics are diagnosed between November and February than between June and September.
Children are most often diagnosed with Type 1 diabetes when temperatures start to drop in late fall.
Fibrinogen, the clotting factor that repairs ice-damaged tissue in the wood frog, also mysteriously peaks in humans during winter months. (Researchers are taking note – that may mean that cold weather is an important, but underappreciated, risk factor for stroke.)
A study of 285,705 American veterans with diabetes measured seasonal differences in their blood sugar levels. Sure enough, the veterans’ blood sugar levels climbed dramatically in the colder months and bottomed out during the summer. More telling, the contrast between summer and winter was even more pronounced in those who lived in colder climates, with greater differences in seasonal temperature. Diabetes, it seems, has some deep connection to the cold.
We don’t know enough today to state with certainty that the predisposition to Type 1 or Type 2 diabetes is related to human cold response. But we do know that some genetic traits that are potentially harmful today clearly helped our ancestors to survive and reproduce (hemochromatosis and the plague, for example). So while it’s tempting simply to question how a condition that can cause early death today could ever confer a benefit, that doesn’t look at the whole picture.
Remember, evolution is amazing – but it isn’t perfect. Just about every adaptation is a compromise of sorts, an improvement in some circumstances, a liability in others. A peacock’s brilliant tail feathers make him more attractive to females – and attract more attention from predators. Human skeletal structure allows us to walk upright and gives us large skulls filled with big brains – and the combination means an infant’s head can barely make it through its mother’s birth canal. When natural selection goes to work, it doesn’t favor adaptations that make a given plant or animal “better” – just whatever it takes for it to increase the chances for survival in its current environment. And when there’s a sudden change in circumstances that threatens to wipe out a population – a new infectious disease, a new predator, or a new ice age – natural selection will make a beeline for any trait that improves the chance of survival.
“Are they kidding?” said one doctor when told of the diabetes theory by a reporter. “Type 1 diabetes would result in severe ketoacidosis and early death.”
Sure – today.
But what if a temporary diabetes-like condition occurred in a person who had significant brown fat and lived in an ice age environment? Food would probably be limited, so dietary blood-sugar load would already be low, and brown fat would convert most of that to heat, so the ice age “diabetic’s” blood sugar, even with less insulin, might never reach dangerous levels. Modern-day diabetics, on the other hand, with little or no brown fat, and little or no exposure to constant cold, have no use – and thus no outlet – for the sugar that accumulates in their blood. In fact, without enough insulin the body of a severe diabetic starves no matter how much he or she eats.
The Canadian Diabetes Association has helped to fund Ken Storey’s study of the incredible freezing frog. It understands that just because we haven’t definitively linked diabetes and the Younger Dryas doesn’t mean we shouldn’t explore biological solutions to high blood sugar found elsewhere in nature. Cold-tolerant animals like the wood frog exploit the antifreezing properties of high blood sugar to survive. Perhaps the mechanisms they use to manage the complications of high blood sugar will help lead us to new treatments for diabetes. Plants and microbes adapted to extreme cold might produce molecules that could do the same.
Instead of dismissing connections, we need to have the curiosity to pursue them. And in the case of diabetes, sugar, water, and cold, there are clearly plenty of connections to pursue.