THE DECLINE OF THE TOMATO ISN’T THE ONLY tragic story in the modern supermarket, nor are the USDA’s nutrient tables the only source documenting what’s happening. A tour through the recent literature on food in the U.S., Canada, and Britain, followed by a walk through just about any corporate chain-owned grocery store, produces a horrific picture of losses so steep, and continuing at such a rate, that it is not an exaggeration to speak–literally–of the coming end of food.
Start with some of the most obvious losses, and their potential consequences in terms of human health.
On July 6, 2002, the Toronto Globe and Mail began publishing a series of articles on food, including one by reporter Andre Picard, who wrote:
Fruits and vegetables sold in Canadian supermarkets today contain far fewer nutrients than they did 50 years ago. Vital vitamins and minerals have dramatically declined in some of our most popular foods.
Take the potato, by far the most consumed food in Canada. The average spud has lost 100 percent of its vitamin A, which is important for good eyesight; 57 percent of its vitamin C and iron, a key component of healthy blood; and 28 percent of its calcium, essential for building healthy bones and teeth.
It also lost 50 percent of its riboflavin and 18 percent of its thiamine. Of the seven key nutrients measured, only niacin levels have increased.... The story is similar for 25 fruits and vegetables that were analyzed [in a Globe and Mail, CTV study].1
Picard’s numbers were based on food tables supplied by the Canadian government, but were fairly close to what the USDA tables were showing. In fact, some of the Canadian data was taken originally from the USDA tables. What Picard was saying was generally true for the U.S. as well.
It was also true in Britain. There, researcher Anne-Marie Mayer published a study in the British Food Journal, a respected scholarly source for nutritionists and food specialists. She wanted to answer the question: “Has the nutritional quality (particularly essential mineral content) of fruits and vegetables changed this century during the period of changes in the food system and the modernization of agriculture?”2 To get her answer, she looked at the United Kingdom’s equivalent of the USDA food tables, The Chemical Composition of Foods, for 1936 and 1991, and compared the contents of 20 vegetables and fruits. Her results:
There were significant reductions in the levels of calcium, magnesium, copper and sodium in vegetables, and magnesium, iron, copper, and potassium in fruits. The greatest change was the reduction in copper levels in vegetables to less than one-fifth of the old level. The only mineral that showed no significant differences over the 50-year period was phosphorus.3
SMALL NUMBERS, BIG CONSEQUENCES
Numbers. Most supermarket shoppers are not math majors, nor experts in statistics. And besides, numbers sitting there on the page all by themselves are a little boring. Just how important is it whether the trace amounts of copper, say, in a given food are going up or down by a few milligrams? A milligram is a very small thing, one one-thousandth of a gram. And a gram is only about a twenty-eighth of an ounce. A milligram of some substances would be so small we couldn’t even see it without a microscope–a mere speck. What difference does it make whether we swallow a couple of specks of anything on a given day?
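For the record, here is the arithmetic behind those conversions, a back-of-the-envelope sketch using the standard figure of 28.35 grams to the ounce:

$$1\ \text{mg} = 10^{-3}\ \text{g} \approx \frac{10^{-3}}{28.35}\ \text{oz} \approx 3.5 \times 10^{-5}\ \text{oz}$$

A few milligrams, in other words, come to a few hundred-thousandths of an ounce. A speck indeed.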
The answer is, a big difference. The human body is a mysterious thing, resilient and adaptable, but amazingly complex. The very resilience and adaptability that have made us the most successful species in the history of our planet depend on an intricate web of finely tuned relationships–between our bodies and the environment, between each of the organs inside our bodies, and between each of the cells inside those organs. We walk a thousand minute bioenvironmental tightropes every day, every moment, teetering and tipping, and righting ourselves just in time to maintain equilibrium, staying within the narrow band of conditions, both internal and external, that allow us to survive.
If the outside temperature falls below 32 degrees F and we aren’t wearing protective clothing, we can freeze to death in a matter of hours. If it climbs much above 120 degrees F, and we can’t find shade or an air-conditioned building, we can die of heatstroke.
I’ve personally experienced both temperature extremes. Stranded by the highway late at night, during a January blizzard in Quebec, I huddled inside my snowbound car with only a thermal snowmobile suit and a lit candle between me and the –28 degree F winter wind outside. When the snowplow smacked into my back bumper in the morning, I was grateful that a couple of spots of frostbite on my cheeks were the worst the cold had done.
A decade later, crossing the Arabian desert during the first Gulf War, with temperatures well above 120 degrees F, I noticed that the air actually burned my lungs when I inhaled, while the sweat evaporated almost before it formed on the surface of my skin. It was a tremendous physical relief to pull off the road and walk into an air-conditioned restaurant in that throbbing, shimmering heat. If I’d stayed out in it much longer, after five hours in a car whose air-conditioning system had broken down, I’d have paid a steep price.
A narrow band.
As for copper, we’re all familiar with this practical, beautiful metal. We make pennies out of it, and electrical wiring, and polished kettles to boil our tea water. Anyone who’s traveled to Michigan’s Upper Peninsula has seen copper ore, and probably bought a nugget or two at some souvenir stand. Copper roofs, turned green by contact with the air, adorn many of our best-known architectural monuments.
A normal, healthy person has about 100 milligrams of copper in his or her body, distributed throughout a variety of cells and tissues. Only 100 mg. A few specks. What can happen if that number goes up or down?
An article in a recent issue of Discover magazine ought to give an idea.4 The author, a neurologist from Concord, New Hampshire, recounted his experience with a 22-year-old patient he called Megan, who suffered from a disorder called Wilson’s disease. Wilson’s is a genetic disorder that prevents the body from properly eliminating excess amounts of copper. As Dr. John R. Pettinato explained:
Copper is an essential trace element, and most diets provide about one quarter more than is needed for cellular metabolism. The liver processes this excess copper into bile, which is excreted in the stool. Some people inherit a defect in this processing pathway, and symptoms occur as harmful amounts of copper accumulate in the brain and the liver.5
A few specks too many accumulated in Megan. She became depressed and anxious, and developed anorexia, as well as a bad case of the shakes. Her legs and head shook, and she was rarely tremor-free. Then she began to drool, especially at night; “her extremities had become stiff, and her arms didn’t swing naturally when she walked. She felt dizzy and off balance and seemed to shuffle.”6
If Pettinato hadn’t quickly diagnosed and treated her, Megan might have gotten a lot worse. The full range of symptoms of Wilson’s can include hepatitis, liver damage, tremors, slurred speech, lack of coordination, cramping, emotionality, depression, parkinsonism, psychosis, and “other bizarre behaviors.” Some patients die. All that from a few specks too many of a single element.
Of course, Megan was suffering from too much copper. What about too little? As the authors of Understanding Nutrition note, copper deficiency is relatively rare, but is seen in some malnourished children. “Copper deficiency in animals raises blood cholesterol and damages blood vessels, raising questions about whether low dietary copper might contribute to cardiovascular disease in humans,” say Whitney and Rolfes.7
Copper is only one of the many nutrients our bodies need. Some, like iron, or vitamins A and C, are of major importance and have been studied in great detail over the years. Others, like selenium or molybdenum, or vitamins E or K, have received less attention and their roles in keeping us healthy are only beginning to be understood. As late as 1975, the USDA food tables didn’t even list selenium or vitamins D and E, and they have only recently begun to include the amino acids.
Also only partially understood are the effects each of these nutrients has on the others, or on the body when working in tandem, such as the interconnections between sodium and calcium intake noted in Chapter One. The point is that they are all important; each one affects the others, working with them or against them, in an intricate living symphony of chemical and biochemical reactions. Even the smallest excesses or deficiencies can provoke myriad unexpected results, which we ignore at our peril.
SCURVY KNAVES
In the 1800s, when Herman Melville wrote his classic whaling novel Moby Dick (the movie version, a century later, starred Gregory Peck), sailors would stay at sea for months, even years, and their stores of fresh vegetables would often be exhausted long before they could put into port for more provisions. Forced to subsist on diets of salt pork and biscuit, they developed a whole range of diseases stemming from dietary deficiencies, the best known of which was scurvy (“Ahoy there, you scurvy knave!”).
The first sign of scurvy was fatigue, which kept getting worse. Then the sailor’s gums would start bleeding, followed by his skin. The blood vessels under his skin would appear to turn red and swell. If the man cut himself, the cut wouldn’t heal. His fingers and toes would swell, and his body hair would turn curly and kinky. Horny growths would appear on the skin, particularly on his buttocks. He would experience increasing pain in his joints, and would become pale, lethargic, and unable to sleep. Next his teeth would start falling out, and eventually he would begin to hemorrhage. Finally, thankfully, he would die.8
As many as two-thirds of a ship’s crew would die this way during a long voyage. Experiments by British physician James Lind finally isolated the cause—lack of citrus or other fruits containing what was then called the “antiscorbutic factor.” Isolated nearly 200 years later, the factor was found to be a carbon compound similar to glucose, which was dubbed “ascorbic acid”—today’s vitamin C.9 Eventually, the British navy solved the problem by requiring all of its sailors to drink lime juice during long voyages, thus giving rise to the slang nickname for an Englishman, “limey.”
And what has the potato lost over the past 50 years? In Canada, 57 percent of its vitamin C. The American tomato has lost 16.9 percent of its vitamin C just since 1963. And broccoli, described by reporter Picard as “a food that epitomizes the dictates of healthy eating,”10 has, according to the USDA tables, lost fully 45 percent of this crucial nutrient since John Kennedy died.
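Assuming these declines are reckoned the standard way, as the drop between the old and new table values expressed as a share of the old value, the broccoli figure works out like this:

$$\text{percent loss} = \frac{C_{1963} - C_{\text{now}}}{C_{1963}} \times 100 = 45 \quad\Longrightarrow\quad C_{\text{now}} = 0.55\,C_{1963}$$

A serving of broccoli today, in other words, delivers barely more than half the vitamin C that the same serving delivered when Kennedy was president.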
Are Americans and Canadians likely to break out suddenly with the symptoms of advanced scurvy? Probably not in the short-term future, since other food items—including limes, lemons, and grapefruit— still contain considerable ascorbic acid. But the general trend toward drastic vitamin C loss in so many food items at the same time, a steady move away from what makes for good health and toward nutritional poverty, is hardly reassuring.
It’s even less reassuring if one takes such scourges as heart disease or cancer into account. The causes of cancer, what John Wayne called “the big C,” and which killed him shortly after he made his last classic western, The Shootist, are in many ways still a mystery to researchers. But so-called “free radicals” are not that mysterious.
As most of us were probably told in high school chemistry class (and promptly forgot once the exam was over), free radicals are molecules, or groups of atoms, in which one of the atoms in the group has an “unpaired” electron in its outer shell, making it unstable. Since atoms always seek stability, these molecules only exist very briefly, as intermediate products of earlier chemical reactions. As soon as they encounter another molecule with which they can combine, or from which they can scavenge an electron to pair with their extra one, they do so.
The human body is constantly creating free radicals, most often during the process of oxidizing, or “burning” food for energy.11 This process produces a type of free radical called “reactive oxygen,” which can begin a very destructive chain reaction as it attempts to bond with other atoms and achieve stability. It’s a bit like the biblical raging lion, which “goes about, seeking whom it may devour.” Snatching an electron from another atom, leaving it unstable, the oxygen radical creates “another, uglier than itself,” which will in turn attack yet other atoms, creating more free radicals, and so on, and so on.
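In the generic shorthand of chemistry (a textbook-style sketch of a propagation step, not any specific reaction measured in the body), that chain looks like this, with the dot marking the unpaired electron:

$$\mathrm{X^{\bullet} + RH \longrightarrow XH + R^{\bullet}}$$

The radical X• stabilizes itself at the expense of a stable molecule RH, and the newly minted radical R• moves on to attack the next molecule in line.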
Raging about within our bodies, these sub-microscopic biochemical lions “may irritate or scar artery walls, which invites artery-clogging fatty deposits around the damage,” the so-called hardening of the arteries that leads to heart disease.12 There is also “a growing body of evidence” that “many of the things we associate with getting older—memory loss, hearing impairment—can be traced to the cumulative effects of free radicals damaging DNA … thus diminishing the body’s energy supply.”13 Scientists have also implicated oxidative stress in the development of arthritis and cataracts.
Worst of all, free radicals can have a “mutation-causing or mutagenic effect on human DNA, which can be a factor leading to cancer.”14 Too many free radicals, in fact, may have been what killed “the Duke,” the very symbol of cowboy courage and manly strength.
A normal, healthy human body provided with a balanced diet has a set of natural defenses against free radicals, in the form of “anti-oxidants.” These are substances which can chemically interact with free radicals and “neutralize” them in various ways without themselves turning into radicals. Like so many microscopic Buffys stamping out vampires without becoming vampires, they de-fang the radicals, rendering them harmless.
Foremost among these are various enzymes (proteins that help along chemical reactions without themselves being changed in the process), and the vitamins C and E. The chemical “de-fanging” activity of the enzymes depends heavily on the presence of the minerals selenium, copper, manganese, and zinc.
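Vitamin E illustrates the de-fanging nicely, again in generic textbook shorthand: the vitamin, a tocopherol written here as TocOH, donates a hydrogen atom to a lipid peroxyl radical ROO•, and the resulting tocopherol radical TocO• is stable enough that it does not carry the chain forward:

$$\mathrm{ROO^{\bullet} + TocOH \longrightarrow ROOH + TocO^{\bullet}}$$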
And exactly what is missing or declining in the foods sold in our modern supermarkets? Vitamin C (decreased by 57 percent in Canada’s potatoes, declining fast in America’s tomatoes, broccoli, and a host of other vegetables and fruits), and copper (down across-the-board by four-fifths in vegetables in England, but unfortunately not measured in the USDA tables for 1963 or 1975). The USDA did not analyze for selenium, manganese, or zinc until recently, nor for vitamin E.
What about vitamin A, down by 43.3 percent in red, ripe tomatoes in the U.S. since 1950, by 30.5 percent in tomato juice and 27.4 percent in tomato catsup since 1963? What is it good for?
First, it plays a crucial role in vision, helping to maintain the clarity of the cornea of the human eye, and in the conversion of light energy into nerve impulses in the retina. Without sufficient vitamin A, people can go blind.
In addition, vitamin A is needed to maintain what biologists call “epithelial” tissues in the body. These are the cells that form the internal and external surfaces of our bodies and their organs. They include our skin, which shields us from the outside world, and the walls that separate each of our internal organs from the others, as well as the mucous membranes whose secretions ease the movement of foods through the human digestive tract.
What could happen if a person were to stop eating vitamin A-rich foods? The authors of Understanding Nutrition are blunt: “Deficiency symptoms would not begin to appear until after [the body’s] stores were depleted—one to two years for a healthy adult but much sooner for a growing child. Then the consequences would be profound and severe.”15
In children, this could mean an upsurge in the negative effects of such infectious diseases as measles, which, despite a vaccine that is available mostly in the rich countries, still kills some two million children worldwide every year. As Whitney and Rolfes explain: “The severity of the illness often correlates with the degree of vitamin A deficiency; deaths are usually due to related infections such as pneumonia and severe diarrhea. Providing large doses of vitamin A reduces the risk of dying from these infections.”16
More obvious results would include night blindness, in which a vitamin A-deficient person’s ability to see at night is sharply curtailed. Whitney and Rolfes provide a graphic description:
The person loses the ability to recover promptly from the temporary blinding that follows a flash of bright light at night or to see after the lights go out. In many parts of the world, after the sun goes down, vitamin A-deficient people become night-blind: children cannot find their shoes or toys, and women cannot fetch water or wash dishes. They often cling to others, or sit still, afraid that they may trip and fall or lose their way if they try to walk alone.17
This condition may progress to total blindness.
Another result of vitamin A deficiency is “keratinization,” a condition in which the victim’s epithelial surfaces are adversely affected. Mucus secretion drops, interfering with normal absorption of food along the digestive tract, causing general malnutrition. Problems also develop in the lungs, interfering with oxygen absorption, as well as in the urinary tract, the inner ear, and, for women, the vagina. On the body’s outer surface, “the epithelial cells change shape and begin to secrete the protein keratin—the hard, inflexible protein of hair and nails. The skin becomes dry, rough, and scaly as lumps of keratin accumulate.”18
An attractive picture, eh? Blind, disease-prone children, short of breath, suffering from malnutrition, with problems peeing, and with scaly lumps all over their skins. Will we be seeing this in the near future? Again, probably not. But the tendency is there, and steadily increasing. Who can say where we’ll be in another 20 or another 50 years, if present trends continue unabated? The Canadian potato, remember, has already lost all of its vitamin A.
And what of iron, down by more than half in Canada’s potatoes, by 10 percent in the American tomato, and by various amounts in many other fruits and vegetables?
Statistically, low iron is the world’s most common nutrient deficiency, and is particularly dangerous for menstruating or pregnant women and for growing children. Iron is absolutely necessary for the proper maintenance of hemoglobin in the blood and myoglobin in the muscles. It helps both of these proteins carry and release oxygen, permitting the biochemical reactions that give us energy. As the authors of Understanding Nutrition explain, a series of events can be triggered by insufficient iron in the body, events which can ultimately lead to life-threatening anemia:
Long before the red blood cells are affected and anemia is diagnosed, a developing iron deficiency affects behavior. Even at slightly lowered iron levels, the complete oxidation of pyruvate is impaired, reducing physical work capacity and productivity. With reduced energy available to work, plan, think, play, sing, or learn, people simply do these things less. They have no obvious deficiency symptoms; they just appear unmotivated, apathetic and less physically fit.... A restless child who fails to pay attention in class might be thought contrary. An apathetic homemaker who has let housework pile up may be thought lazy.19
If the iron deficiency continues and worsens, it eventually leads to full-blown iron-deficiency anemia:
In iron-deficiency anemia, red blood cells are pale and small. They can’t carry enough oxygen from the lungs to the tissues, so energy metabolism in the cells falters. The result is fatigue, weakness, headaches, apathy, pallor, and poor resistance to cold temperatures... The skin of a fair person who is anemic may become noticeably pale.20
Such a condition can be particularly damaging if it occurs in growing children.
At the same time that they are losing nutrients, vegetables and fruits are also suffering a drastic decline in the number of varieties available to consumers. Just as the number of tomato varieties is sharply limited in the supermarket, so are those of potatoes and apples. As investigative journalist Brewster Kneen noted in his landmark book, From Land to Mouth: Understanding the Food System:
Even though there are 2,000 species of potato in the genus Solanum, all the potatoes grown in the United States, and most of those grown commercially everywhere else, belong to one species, Solanum tuberosum. Twelve varieties of this one species constitute 85 percent of the U.S. potato harvest, but the one variety favored by most processors, the Russet Burbank, is by far the dominant variety. By 1982, 40 percent of the potatoes planted in the United States were Russet Burbanks.21
To witness the poverty of choice among apples, just walk into the corner supermarket and look on the shelves. Most chain stores have only three varieties on display: Red Delicious, Golden Delicious, and Granny Smith. Sometimes a Canadian store will also feature McIntosh. Looking at such a display, it is useful to keep in mind that at the turn of the last century “there were more than 7,000 apple varieties grown in the United States. By the dawn of the twenty-first century, over 85 percent of these varieties, more than 6,000, had become extinct.”22
By the year 2000, 73 percent of all the lettuce grown in the U.S. was one variety: iceberg.23
ACROSS-THE-BOARD DEGENERATION
Examples of the rapid decline in nutrients in our foods are not limited to vegetables and fruits. A general, across-the-board degeneration affects nearly everything we eat.
For instance, according to the USDA tables, chicken–which many of us eat in an attempt to avoid steroid-rich red meats–is in deep trouble. Skinless, roasted white chicken meat has lost 51.6 percent of its vitamin A since 1963. Dark meat has lost 52 percent. White meat has also lost 39.9 percent of its potassium, while dark meat has lost 25.2 percent.
And what has chicken gained? Light meat: 32.6 percent more fat and 20.3 percent more sodium; dark meat: 54.4 percent more fat and 8.1 percent more sodium. Let’s hear it for fat and salt.
Dairy products are no better. According to the USDA, creamed cottage cheese–eaten by millions of dieting men and women precisely because it is seen as a low-fat source of calcium and phosphorus to maintain strong bones and teeth–has in fact gained 7.3 percent fat since 1963, while losing 36.1 percent of its calcium, 13.1 percent of its phosphorus, and—incidentally—fully 53.3 percent of its iron. And what has it gained, besides fat? Hey, you guessed it: a 76.85 percent jump in sodium.
We are also seeing increases in carbohydrates, which include sugars and starches. Good old healthy broccoli, for example, while losing 45 percent of its vitamin C, has seen its carbohydrate content jump upward by 13.8 percent since 1963.
As for bread, traditional mainstay of the Western diet, the highly processed nature of the typical soft, all-but-crustless, bleached-flour white supermarket loaf makes it hard to evaluate. In the process of manufacture, the nutritionally best parts of the original wheat grain are sifted, milled, or chemically bleached out of the flour to make it as perfectly white as possible. The purpose here is purely cosmetic, designed to accommodate the widespread–and completely irrational–public prejudice that white bread is somehow “better.” Then a small portion of these nutrients is put back in to “enrich” (the pure irony of the industry’s euphemism here is almost comic) what would otherwise be a loaf of nothing. So-called “enriched” white bread thus actually does contain some nutrients. But compared to loaves made by more traditional baking methods, the supermarket product is rather pathetic.
This was demonstrated as long ago as the 1970s, when the consumer-oriented Harrowsmith magazine conducted a comparative analysis of three loaves of bread: a) a mass-market “enriched” white loaf of Weston’s bread, taken from the supermarket shelf; b) a white loaf made by a local, small-town bakery; c) a home-baked loaf that used the “Cornell bread” recipe developed by nutritionist Dr. Clive McCay. The results of an independent laboratory analysis:
The Weston loaf proved highest in fat and chloride (calculated as salt) and lowest in protein and phosphorus, as well as in the B vitamins (niacin, thiamine, riboflavin and B12). The homemade loaf was highest in protein, iron, calcium, phosphorus and the B vitamins niacin, riboflavin and B12. It was lowest in chloride and, surprisingly, in fiber. The bake shop loaf was highest in fiber, in vitamin B6 and folic acid, and lowest in both fat and calcium.24
Even those who represent the manufacturers of the spongy white “tissue bread” sold in supermarkets admit its inferiority. The managing director of the Baking Council of Canada, the lobbying and public relations arm of the baking industry, told Harrowsmith’s reporters he “does not eat Weston or any other mass-produced bread himself... he shops instead at a small specialty bakery ... adding that large industrial bakers could not match its quality.”25
Then there is that traditional favorite, the hot dog, so closely identified with warm summer days and baseball, with Fourth of July and Canada Day picnics, as to be a virtual North American icon. No diehard fan’s day at the diamond could be complete, sitting out in the bleachers, without a cold beer and a couple of hot dogs for lunch.
Except that the dog, in terms of food value, is almost worthless—especially the prized all-beef frankfurter, for which people are ironically willing to pay a premium, thinking they’re getting something more for their money.
The average hot dog is actually 58 percent water, 20 percent fat, 3 percent ash and 6 percent sugar. Less than 13 percent of each sausage is made up of actual protein, and even this is of poor quality, consisting for the most part of scrapings from animals’ bones after the main cuts of meat have already been taken in the packing plants.26
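The arithmetic behind that “less than 13 percent” is simple subtraction:

$$100 - (58 + 20 + 3 + 6) = 100 - 87 = 13\ \text{percent}$$

Water, fat, ash, and sugar claim 87 percent of the sausage, leaving at most 13 percent for everything else, protein included.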
An eight-year study by the nonprofit Protégez-Vous (Protect Yourself) magazine in Canada’s Quebec Province found that most hot dogs–including all-beef, as well as blended chicken and turkey, and blended beef and pork hot dogs–contained “the minimum required by law of protein, too much sodium and too much fat” and that they were of generally “bad quality.”27 Only vegetarian hot dogs contained a reasonable amount of food value.
As for “all-beef” hot dogs, which many buyers seek on the mistaken assumption that they contain more protein and less “fatty” meats like pork, the magazine concluded: “The all-beef hot dogs were the worst of our study: Not only are they among the most costly, but their saturated fat and sodium content is much too high. Avoid them.”28 The study authors advised readers that veggie dogs have “almost double the protein, a third less fat and sodium, and hardly any saturated fats compared to meat hot dogs.”29
PILL POPPING TO THE RESCUE?
The above examples, taken chiefly from the produce, meat, and dairy sections, don’t even touch on the subject of the highly processed foods that make up most of the other products found in the cans, heat-sealed tinfoil or plastic envelopes, and cardboard boxes that line an increasing percentage of supermarket aisles (see the following chapter). But the drop in nutrients in these areas alone has been so drastic, and so constant, that many nutritionists are now saying that in order to be assured of a healthy “diet” all people must routinely take daily dietary supplements—in the form of multivitamin pills.
“Absolutely,” biochemical nutritionist Dr. Aileen Burford-Mason told Globe and Mail reporter Picard. “When I hear people say, ‘you can get all the nutrients you need from food,’ I ask them: where is there a shred of evidence that is true? They are in denial.”30 Dr. Walter Willett, chairman of the department of nutrition at the Harvard University School of Public Health, agreed, calling a daily multivitamin “a good, cheap insurance policy.”31
Unfortunately, Dr. Willett’s cheap insurance policy may not be as good as he believes. Over-the-counter multivitamin and mineral tablets, of the kind we’ve all seen lining drugstore shelves in a bewildering variety of colorful and confusing packages, contain substances in pure, artificially concentrated form, as extracted via large-scale industrial processes. But the human body doesn’t seem to work in a large-scale, industrial way. Whitney and Rolfes elaborate:
In general, the body absorbs nutrients best from foods in which the nutrients are diluted and dispersed among other substances that may facilitate their absorption. Taken in pure, concentrated form, nutrients are likely to interfere with one another’s absorption or with the absorption of nutrients in foods eaten at the same time. Documentation of these effects is particularly extensive for minerals: Zinc hinders copper and calcium absorption, iron hinders zinc absorption, calcium hinders magnesium and iron absorption, and magnesium hinders the absorption of calcium and iron. Similarly, binding agents in supplements limit mineral absorption.
Although minerals provide the most familiar and best-documented examples, interference among vitamins is now being seen as supplement use increases. The vitamin A precursor beta-carotene, long thought to be nontoxic, interferes with vitamin E metabolism when taken over the long term as a dietary supplement. Vitamin E, on the other hand, antagonizes vitamin K activity and so should not be used by people being treated for blood-clotting disorders. Consumers who want the benefits of optimal absorption of nutrients should use ordinary foods, selected for nutrient density and variety.32
The British government, in May 2003, went quite a bit further than the textbook authors, bluntly warning consumers in the United Kingdom that some over-the-counter vitamin and mineral supplements can actually endanger health, especially if taken in high doses. Britain’s Food Standards Agency, following a four-year safety review by independent scientific advisors, concluded that “long-term use of six substances, vitamin B6, beta-carotene, nicotinic acid (niacin), zinc, manganese, and phosphorus, might also cause irreversible health damage.”33
Sir John Krebs, chairman of the food agency, said “While in most cases you can get all the nutrients you need from a balanced diet, many people choose to take supplements. But taking some high-dose supplements over a long period could be harmful.”34
A recent study published in the British medical journal The Lancet looked closely at the effects of regularly taking vitamin E and beta-carotene pills as a supplement to prevent heart disease. “Vitamin E and beta-carotene pills are useless for warding off major heart problems, and beta-carotene, a source of vitamin A, may be harmful,” an Associated Press (AP) summary of the study reported.35
Researchers at the Cleveland Clinic Foundation drew similar conclusions from an analysis of the pooled results of 15 key studies involving nearly 220,000 people–far more than needed to be statistically sound. “The public health viewpoint would have to be that there’s really nothing to support widespread use of these vitamins,” said Dr. Ian Graham, of Trinity College, Ireland.36
According to the AP: “The researchers found that vitamin E did not reduce death from cardiovascular or any other cause and did not lower the incidence of strokes. Beta-carotene was linked with a 0.3 percent increase in the risk of cardiovascular death and a 0.4 percent increase in the risk of death from any cause.”
The fact that the pills didn’t help, however, does not mean that vitamin E or beta-carotene themselves are not helpful in preventing disease. It only means that commercially produced pills that contain these substances in concentrated form may not help. Said the AP:
The idea that antioxidant vitamins might ward off heart trouble was plausible. Test tube studies indicated that antioxidants protect the heart’s arteries by blocking the damaging effects of oxygen. The approach works in animals, and studies show that healthy people who eat vitamin-rich food seem to have less heart disease.
However, experts say that perhaps antioxidants work when they are in food, but not when in pills. 37
Gulping jarfuls of orange, pink, or blue artificially concentrated vitamin tablets in an effort to offset the increasing nutritional poverty of our corporate/commercial food supply may actually end up making things worse, not better.
“Whenever the diet is inadequate, the person should first attempt to improve it so as to obtain the needed nutrients from foods,” say Whitney and Rolfes.38
Great advice, but how can we follow it if the foods available at our supermarkets have few or no nutrients? If the trend lines over the past 50 years continue to hold true, it would seem that our food supply system is heading inexorably toward a diet made up largely of “nonfoods” that contain fewer and fewer measurable nutrients, except for the dangerous ones: fat, salt, and sugar.
Twenty or more years from now, if these trends aren’t halted, will the “food” offered commercially in chain stores be nothing more than an attractively colorful but inert, sweet- or salty-tasting physical solid we swallow to give ourselves the illusion of eating, while we try hopelessly to obtain our real nourishment by juggling a smorgasbord of pills? “Hey, Jack, come on over for Thanksgiving dinner, we’re having roast pill, with non-gravy!”
Whatever the future holds, for the past 50 years the nutrients have been leaching out of nearly everything we eat, leaving a vacuum that commercially produced vitamin pills can’t fill.
And, as the saying goes, “nature abhors a vacuum.”
Something else is already filling it.