Chapter 2 Let Them Eat Grass
I asked the waiter, ‘Is this milk fresh?’ He said, ‘Lady, three hours ago it was grass.’
Phyllis Diller
Grasses are everywhere.
They grow on mountains, along rivers and lakes, in valleys, vast steppes, savannahs, prairies, golf courses and your garden. And they now reign supreme in the human diet.
Grasses are wonderfully successful life forms. They are geographically diverse, inhabiting every continent, including Antarctica. They are a study in how life can adapt to extremes, from the tundra to the tropics. Grasses are prolific and hardy, and they evolve rapidly to survive. Even with the explosive growth of the human population, worldwide expansion of cities and suburbs, and tarmac spanning the US coast-to-coast, grasses still cover 20 per cent of the earth’s surface area. Just as insects are the most successful form of animal life on the planet, grasses are among the most successful of plants. Given their ubiquity, perhaps it’s not unexpected that we would try to eat them. Humans have experimented with feasting on just about every plant and creature that ever inhabited the earth. After all, we are creatures who make food out of tarantulas and poisonous puffer fish.
While grasses have served as food for many creatures (they’ve even been recovered from fossilized dinosaur faeces), they were not a food item on our dietary menu during our millions of years of adaptation to life on this planet. Pre-Homo hominids, the chimpanzee-like australopithecines that date back more than 4 million years, did not consume grasses in any form or variety, nor did any species of Homo prior to sapiens. Grasses were simply not instinctively regarded as food. Much as you’d never spot a herbivorous giraffe eating the carcass of a hyena or a great white shark munching on sea kelp, humans did not consume any part of this group of plants, no matter how evolutionarily successful, until the relatively recent past.
The seeds of grasses are a form of ‘food’ added just a moment ago in archaeological time. For the first 2,390,000 years of our existence on earth, or about 80,000 generations, we consumed things that hungry humans instinctively regarded as food. Then, 10,000 years or just over 300 generations ago, in times of desperation, we turned to those darned seeds of grasses. They were something we hoped could serve as food, since they were growing from every conceivable environmental nook and cranny.
So let us consider what this stuff is, the grasses that have populated our world, as common as ants and earthworms, and been subverted into the service of the human diet. Not all grasses, of course, have come to grace your dinner plate – you don’t save and eat the clippings from cutting your lawn, do you? – so we’ll confine our discussion to the grasses whose seeds humans have chosen to eat. I discuss this issue at some length because it’s important for you to understand that consumption of the seeds of grasses underlies a substantial proportion of the chronic problems of human health. Accordingly, removing them yields unexpected and often astounding relief from these issues and is therefore an absolutely necessary first step towards regaining health, the ultimate goal of this book. We will spend a lot of time talking about how recovering full health as a non-grass-consuming Homo sapiens of the 21st century – that means you – also means having to compensate for all of the destruction that has occurred in your body during your unwitting grain-consuming years. You’ve consumed what amounts to a dietary poison for 20, 30 or 50 years, a habit that your non-grain-accustomed body partially – but never completely – adapts to, endures or succumbs to. You then remove that poison and, much as a chronic alcoholic needs to recover and heal his liver, heart, brain and emotional health after the flow of alcohol ceases, so your body needs a bit of help to readjust and regain health minus the destructive seeds of grasses.
So what makes the grasses of the world a food appropriate for the ruminants of the earth, but not Homo sapiens? There is no single factor within grains responsible for their wide array of bowel-destroying effects – there is an arsenal.
Non-Wheat Grains: You Might As Well Eat Jelly Beans
There is no question that, in this barrel of rotten apples, wheat is the rottenest. But you still may not want to make cider with those other apples.
What I call ‘non-wheat grains’, such as oats, barley, rye, millet, teff, sorghum, corn and rice, are nonetheless seeds of grasses with potential for curious effects in nonruminant creatures not adapted to their consumption. I would classify non-wheat grains as less bad than the worst – modern wheat – but less bad is not necessarily good. (That extraordinarily simple insight – that less bad is not necessarily good – is one that will serve you well over and over as you learn to question conventional nutritional advice. You will realize that much of what we have been told by the dietary community, the food industry and even government agencies violates this basic principle of logic again and again.) Less bad can mean that a variety of undesirable health effects can still occur with that seed’s consumption – those effects will just not be as bad as those provoked by modern wheat.
So what’s the problem with the seeds of non-wheat grasses? While none achieve the nastiness of the seeds of modern wheat, they each have their own unique issues. For starters, they’re all high in carbohydrates. Typically, 60 to 85 per cent of the calories from the seeds of grasses are in the form of carbohydrates. This makes sense, since the carbohydrate stored in the seed was meant to provide nutrition to the sprouting plant as it germinates. But the carbohydrate in seeds, called amylopectin A, is rapidly digested by humans and raises blood sugar, gram for gram, higher than table sugar does.
For instance, a 125 g (4½ oz) serving of cooked organic, stoneground oatmeal has nearly 50 grams of net carbohydrates (total carbohydrates minus fibre, which we subtract because it has no glycaemic potential), or the equivalent of slightly more than 11 teaspoons of sugar, representing 61 per cent of the calories in oatmeal. This gives it a glycaemic index (GI, an index of blood sugar-raising potential) of 55, which is enough to send blood sugar through the roof and provoke all the phenomena of glycation, i.e., glucose modification of proteins that essentially acts as biological debris in various organs. This irreversible process leads to conditions such as cataracts, hypertension, the destruction of joint cartilage that results in arthritis, kidney disease, heart disease and dementia. (Note that a glycaemic index of 55 falls into what dietitians call the ‘low’ glycaemic index range, despite the potential to generate high blood sugars. We discuss this common fallacy in Chapter 5.) All non-wheat grasses, without exception, raise blood sugar and provoke glycation to similar degrees.
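The arithmetic behind such comparisons is simple enough to check yourself. Below is a minimal sketch in Python; the 4.2 g-per-teaspoon and 4 kcal-per-gram figures are standard approximations, and the carbohydrate–fibre split shown is an illustrative assumption chosen to reproduce the numbers quoted above, not a value from a food table.

```python
# Net-carbohydrate arithmetic for the oatmeal example above.
# Assumed approximations: ~4.2 g sucrose per teaspoon, 4 kcal per gram of carbohydrate.

SUCROSE_G_PER_TSP = 4.2
KCAL_PER_G_CARB = 4

def net_carbs(total_carbs_g: float, fibre_g: float) -> float:
    """Net carbs = total carbs minus fibre, which has no glycaemic potential."""
    return total_carbs_g - fibre_g

def sugar_teaspoons(net_carbs_g: float) -> float:
    """Express net carbs as the equivalent teaspoons of table sugar."""
    return net_carbs_g / SUCROSE_G_PER_TSP

def carb_share_of_calories(net_carbs_g: float, total_kcal: float) -> float:
    """Fraction of a serving's calories contributed by net carbohydrate."""
    return net_carbs_g * KCAL_PER_G_CARB / total_kcal

carbs = net_carbs(total_carbs_g=54.0, fibre_g=4.0)               # ~50 g, as quoted
print(f"{sugar_teaspoons(carbs):.1f} teaspoons of sugar")        # ~11.9
print(f"{carb_share_of_calories(carbs, 328):.0%} of calories")   # ~61%
```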
Human manipulation makes it worse. If corn is not consumed as intact kernels but instead is pulverized into fine cornflour, the surface area available for digestion increases enormously, producing some of the highest blood sugars of any food. This is why the glycaemic index of cornflour is 90 to 100, compared with 60 for corn on the cob and 59 to 65 for sucrose, or table sugar.
For years, we’ve been told that ‘complex’ carbohydrates are better for us than ‘simple’ sugars because the lengthy carbohydrate molecules of amylopectin A and amylose in grains don’t raise blood sugar as high as sugars with one or two sugar molecules, such as glucose (one sugar) or sucrose (two sugars: glucose and fructose), do. But this is simply wrong, and this silly distinction is therefore being abandoned: the GI of complex carbohydrates is the same as or higher than that of simple sugars. The GI of whole wheat bread: 72; the GI of millet as a hot cereal: 67. Neither is any better than the GI of sucrose: 59 to 65. (Similar relationships hold for the glycaemic load, a value that factors in typical portion size; see the sketch below.) The World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations have both advised dropping the complex versus simple distinction, and rightly so, as grains, from a blood sugar viewpoint, are the same as or worse than sugar.
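Glycaemic load follows a standard formula: the GI multiplied by the grams of available (net) carbohydrate in a serving, divided by 100. A quick sketch; the per-serving carbohydrate figures are assumptions for illustration, not values from this book.

```python
def glycaemic_load(gi: float, net_carbs_per_serving_g: float) -> float:
    """Glycaemic load = GI x net carbs per serving / 100."""
    return gi * net_carbs_per_serving_g / 100

# Per-serving carbohydrate figures below are illustrative assumptions.
print(glycaemic_load(gi=72, net_carbs_per_serving_g=14))  # whole wheat bread, one slice: ~10
print(glycaemic_load(gi=62, net_carbs_per_serving_g=10))  # sucrose, ~2.5 teaspoons: ~6
```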
And the problems with non-wheat grains don’t end with blood sugar issues.
Lectins: Good Enough for the KGB
The lectin proteins of grains are, by design, toxins. Lectins discourage creatures, such as moulds, fungi and insects, from eating the seeds of a plant by sickening or killing them. After all, the seed is the means by which plants continue their species. When we consume plants, we consume defensive lectins. Lectin proteins’ effects on humans vary widely, from harmless to fatal. Most plant lectins are benign, such as those in spinach and white mushrooms, which cause no adverse effects when consumed as a spinach salad. The castor bean is an entirely different story; its lectin, ricin, is highly toxic and fatal even in small quantities. Ricin has been used by terrorists around the world. Georgi Markov, Bulgarian dissident and critic of his country’s Soviet-backed regime, was murdered in London in 1978 when he was jabbed with the tip of an umbrella laced with ricin, an assassination widely attributed to the Bulgarian secret service working with the KGB.
The lectin of the seed of wheat is wheat germ agglutinin (WGA). It is neither as benign as the lectin of spinach nor as toxic as ricin; it is somewhere in between. WGA exerts ill effects on everyone, regardless of whether you have coeliac disease, gluten sensitivity or no digestive issues at all. The lectins of rye, barley and rice are structurally identical to WGA, share all of its properties and are also called ‘WGA’. (The only substantial difference is that rye, barley and rice express a single form of the lectin, while genetically more complex wheat expresses up to three different forms.) Interestingly, 21 per cent of the amino acid structure of WGA lectins overlaps with ricin, including the active site responsible for shutting down protein synthesis, the site that accounts for ricin’s exceptional toxicity.1
Lectin proteins have the specific ability to recognize glycoproteins (proteins with a sugar side chain). This makes plant lectins effective at recognizing common glycoproteins on, say, the surface of a fungal cell. But that same process can occur in humans. When intestinal tissue is exposed to even a minute quantity of purified WGA – as little as 1 milligram – the lectin binds intestinal glycoproteins and causes severe damage resembling the effects of coeliac disease.2 We also know that WGA compounds the destructive intestinal effects of coeliac disease started by gliadin and other grain prolamin proteins.3 If you have an inflammatory bowel disease, such as ulcerative colitis or Crohn’s disease, grain lectins intensify the inflammation, making cramps, diarrhoea, bleeding and poor nutrient absorption worse.
WGA is oddly indestructible. It is unaffected by cooking, boiling, baking or frying. WGA is also untouched by stomach acid. Though the acid produced in the human stomach is powerfully corrosive (dip your finger in a glass full of stomach acid and you won’t have a finger for very long), WGA is impervious to it, entering the stomach and passing through the entire gastrointestinal tract unscathed, undigested and free to do what it likes to any glycoproteins exposed along the way.
While most WGA remains confined to the intestine, doing its damage along the 30-foot length of this organ, we know that a small quantity gets into your bloodstream. (We know this because people commonly develop antibodies to this protein.) Once WGA enters the bloodstream, odd things happen: red blood cells clump (or ‘agglutinate’, the basis for WGA’s name), which can, under certain circumstances (obesity, smoking, sedentary living, dehydration, etc.), increase the tendency of blood to clot – the process that leads to heart attack and stroke. WGA is often called a mitogen because it activates cell division, or mitosis (a concept familiar to anyone who studies cancer, a disease characterized by unrestrained mitosis). WGA has indeed been demonstrated to cause mitosis in lymphocytes (immune system cells) and cells lining the intestine.4 We know that such phenomena underlie cancer, such as the intestinal lymphoma that afflicts people with coeliac disease.5 WGA also mimics the effects of insulin on fat cells. When WGA encounters a fat cell, it acts just as if it were insulin, inhibiting activation of fat release and blocking weight loss while making the body more reliant on sugar sources for energy.6 WGA also blocks the hormone leptin, which is meant to shut off appetite when the physical need to eat has been satisfied. In the presence of WGA, appetite is not suppressed, even when you’re full.7
All in all, grain lectins are part of a potent collection of inflammatory factors. Indigestible or only partially digestible, they fool receptors and thwart hormonal signals after gaining entry to our bodies through the seeds of grasses.
VIP: Very Important Peptide
The lectin found in wheat, rye, barley and rice (WGA) also blocks the action of another very important hormone called vasoactive intestinal peptide, or VIP.8 While studies have been confined mostly to experimental models, not humans, the blocking of VIP has the potential to explain many of the peculiar phenomena that develop in people who consume grains but do not have coeliac disease or gluten sensitivity.
VIP plays a role in dozens of processes. It is partly responsible for:
• Activating the release of cortisol from the adrenal glands9
• Modulating immune defences against bacteria and parasites in the intestine10
• Protecting against the immune destruction of multiple sclerosis11
• Reducing phenomena that can lead to asthma and pulmonary hypertension (increased pressure in the lungs)12
• Maintaining the healthy immune balance that protects against inflammatory bowel diseases, such as Crohn’s disease and ulcerative colitis13
• Promoting sleep and maintaining circadian rhythms (day–night cycles)14
• Participating in taste perception on the tongue15
• Modulating the immune and inflammatory response in skin that protects us from psoriasis16
In other words, the diseases that are at least partially explained by blocking VIP sure look and sound like the collection of conditions that we witness, day in, day out, in wheat-consuming people: low cortisol levels responsible for low energy, worsening of asthma and pulmonary hypertension, worsening of Crohn’s disease and ulcerative colitis, disruption of sleep, distortions of taste such as reduced sensitivity to sweetness (meaning you need more sugar to taste sweetness) and psoriasis. The VIP pathway may prove to be one of the important means by which grains disrupt numerous aspects of health.
Grains and a Mouthful of Bacteria
Grains affect the microorganisms that inhabit our bodies. These microbiota live on your skin and in your mouth, vagina and gastrointestinal tract.
Over the last few years, there has been a new scientific appreciation for the composition of human microbiota. We know, for instance, that experimental animals raised in an artificially sterile environment – and thereby with gastrointestinal tracts containing no microorganisms – have impaired immunity, are prone to infections, are less efficient at digestion and even develop structural changes of the gastrointestinal tract that differ from those of creatures that harbour plentiful microorganisms. The microorganisms that inhabit our bodies are not only helpful; they are essential for health.
The bacteria that share in this symbiotic relationship with our bodies today are not the same as those carried by our ancestors. Human microorganisms underwent a shift 10,000 years ago when we began to consume the seeds of grasses. DNA analyses of dental plaque from ancient human teeth demonstrate that the oral flora of primitive non-grain-consuming humans was different from that of later grain-consuming humans. Alan Cooper, PhD, of the University of Adelaide Centre for Ancient DNA, and Keith Dobney, PhD, of the University of Aberdeen, analysed bacterial DNA from the teeth of hunter-gatherers who lived before grains entered the human diet. They then compared it with that of early grain-adopting humans and later Neolithic, Bronze Age and medieval populations – periods when agriculture flourished. Pre-grain hunter-gatherers demonstrated a wide diversity of oral bacterial species, dominated by species unassociated with dental decay. Grain-consuming humans, in contrast, demonstrated reduced diversity, with what the researchers called a ‘more disease causing configuration’, a pattern that worsened the longer humans consumed grains.17 Mouth bacteria underwent another substantial shift 150 years ago, during the Industrial Revolution, with the proliferation of even greater disease-causing species, such as Streptococcus mutans, coinciding with the mechanical milling of flours. Disease-causing species of oral flora are now ubiquitous and dominate the mouths of modern humans, sustained by modern consumption of grains and sugars.18 Dr Dobney comments: ‘Over the past few hundred years, our mouths have clearly become a substantially less diverse ecosystem, reducing our resilience to invasions by disease-causing bacteria.’19
This study rounds out what anthropologists have been telling us for years: when humans first incorporated grains into our diets, we experienced an explosion of tooth decay, tooth loss and tooth abscesses.20 We now know that grains, from einkorn and barley to maize and millet, were responsible for this marked shift in dental health, because they caused disturbances in oral microorganisms.
Insights into oral flora do not necessarily tell us what happened to bowel flora, though there is some overlap. Even though we all begin our lives with sterile gastrointestinal tracts ripe to be populated with organisms provided at birth from the vaginal canals of our mothers, many events occur during our development that lead to divergences between the organisms in our mouths and those in our bowels – such as the appearance of teeth, stomach acidification, the hormonal surge of puberty and antibiotics. Nonetheless, we can still take some lessons about human diet and bowel flora by studying . . .
The Science of Scatology
In addition to knowing that the oral flora of humans changed once we chose to consume grains, we also know that primitive humans had different bowel flora than modern humans. The ancient remains of human faeces, or coprolites, have been recovered from caves and other locations where humans congregated, ate, slept, died and, of course, moved their bowels.
Though we have to make allowances for the inevitable degeneration of faecal material over time, we can make observations on the varieties of bacterial species present in coprolites and thereby in primitive human intestinal tracts. We know, for instance, that Treponema, a genus of bacteria important for the digestion of fibrous foods and for anti-inflammatory effects, is widely present in coprolites of pre-grain cultures but nearly absent from modern humans.21
These observations are important because we know that abnormal conditions of the gastrointestinal tract, such as irritable bowel syndrome, peptic ulcers and ulcerative colitis, are associated with changes in bowel flora composition.22 We may uncover a connection between these changes in flora and autoimmune diseases, weight control, cancer and other conditions.
We don’t know how many of these changes are due to diet and how many are due to the diseases themselves, but we do know with certainty that the composition of human oral and bowel flora underwent changes over time. And the facts are clear: when humans began to consume the seeds of grasses, the microorganisms cohabiting our bodies changed, and they changed in ways that affect our health.
Let’s now discuss each non-wheat grain individually and explain why, like wheat, each does your health no favours.
Maybe We’ll Chew a Cud: Adaptations to Consuming the Seeds of Grasses
It would be wrong to argue that no human adaptations have evolved over the several thousand years we’ve consumed the seeds of grasses. There have indeed been several changes in the human genetic code that have developed in grain-consuming societies and that are therefore notably absent in non-agricultural native North and South American, South Pacific and Australian populations.
• Extra copies of the AMY1 gene, which determines expression of the salivary enzyme amylase, allow increased digestion of the amylopectin starches of grains.23
• The gene for haemochromatosis, a condition of excessive iron absorption and storage, is believed to be an adaptation to the iron deficiency that developed in grain-consuming humans. Because it is a relatively recent mutation, genes for enhanced iron absorption are carried by less than 10 per cent of people of northern European descent.
• Variations in genes that determine diabetes susceptibility are believed to have evolved with the consumption of the seeds of grasses, with recent variants providing partial protection from the disease.24 Judging by the worldwide explosion of diabetes, though, these attempts at genetic adaptation are inadequate.
Yes, as a species, we are trying to adapt to a diet dominated by the seeds of grasses and their adverse health effects, but such adaptations are not enough. We haven’t had sufficient time to adapt to the many effects of prolamin proteins, lectins, and changes in oral and bowel flora, or the mental, emotional or autoimmune effects of grain consumption (all of which I discuss in later chapters). These continue at a high level across all populations that enthusiastically consume the seeds of grasses. Perhaps, in another few hundred thousand years, we will fully adapt and thrive without disease while consuming the seeds of grasses. The Homo sapiens of a grain-dominated future may chew a cud, grow some extra stomach compartments and add ‘moo’ to the dictionary.
50 Shades of Grain
This man, whom I once thought of as a romantic hero, a brave shining white knight – or the dark knight, as he said. He’s not a hero; he’s a man with serious, deep emotional flaws, and he’s dragging me into the dark.
E. L. James, Fifty Shades of Grey
All of the grains that fill the modern diet to bursting are grasses. Ground, baked, roasted, toasted and popped, they come in an astounding variety of forms, colours and flavours, as they are among the most popular ingredients in modern processed food. Who would have guessed that popcorn and pretzels are closely related, or that tortillas and Danish pastries are kissing cousins? Beneath those comforting smells and flavours, however, are buried dark secrets, undisclosed confidences and demons ready to engulf you in their embrace, enfolding your mind and body in their effects. As wheat is a grass, its bewitching effects are shared to various degrees by the seeds of other grasses.
The problems posed by the tortured relationship between wheat and humans are largely shared by other wheat-derived grains, including triticale (a cross between wheat and rye), bulgur and traditional strains of wheat such as spelt and kamut. When discussing ‘wheat’, I therefore am referring to all the closely related grains in the wheat family. Let’s consider several of the most popular non-wheat grains in all their lurid glory.
Rye
Rye consumption dates back to the early days of wheat, when humans first experimented with einkorn. Rye, another grass, grew as a weed in fields of wheat, an example of Vavilovian mimicry, or the ability of a weed to mimic a cultivated plant. This weed came to be recognized by humans as yet another seed of a grass that could be consumed, and farmers often harvested both wheat and rye with the same sickle or thresher without bothering to separate them.
Rye has gained some blessings in nutritional circles because compared with wheat, it has less potential to trigger insulin, despite identical potential for raising blood sugar.25 (To be fair, just about anything compared with Triticum aestivum, our favourite grain to bash, comes up smelling like roses.)
Rye and wheat share a high content of gliadin protein, with all its potentially toxic effects. (Rye’s counterpart to gliadin is called secalin, though the two structures are nearly identical.) Secalin has much the same potential for harm as gliadin.26 Likewise, the lectin of rye is nearly identical to wheat’s destructive lectin, wheat germ agglutinin, and therefore shares its potential for causing intestinal toxicity, clumping red blood cells, provoking abnormal growth of immune system lymphocytes and mimicking insulin.27 Rye also shares with wheat a peculiar and only recently recognized phenomenon: the formation of acrylamide, a compound believed to be a carcinogen and neurotoxin.28 Rye and wheat contain a high content of the amino acid asparagine, which, when heated to high temperatures during baking or deep-frying, reacts with the plentiful carbohydrates present to form acrylamide. (Acrylamide also forms in chips.) Modern reliance on nitrogen-rich synthetic fertilizers boosts the asparagine content of rye and wheat, increasing acrylamide formation further.
For all practical purposes, given the crossbreeding that has occurred via natural Vavilovian means as well as through the breeding efforts of humans, the differences between rye and wheat are minor; they are virtually one and the same. Being wheat-free should also mean being rye-free.
Rye and the Work of the Devil
Rye has a peculiar potential to be infected by a parasitic fungus, Claviceps purpurea, which produces toxic ergot alkaloids, including ergotamine. When ingested in, say, a loaf of contaminated rye bread, these alkaloids exert a range of hallucinogenic effects on humans; they are chemical cousins of lysergic acid diethylamide, commonly known as LSD, which was first synthesized from them.
History is filled with fascinating and terrifying stories of humans exposed to rye and ergotamine. Because some victims afflicted by contaminated rye experienced an intense dermatitis (skin inflammation), the condition became known as St Anthony’s Fire, named after the 11th-century sanctuary operated by monks to treat victims of ergot poisoning. During the Middle Ages, writers described hysterical outbursts afflicting previously normal people, including thrashing and writhing while shouting, ‘I’m burning!’ The afflicted would eventually collapse, after which their bodies would blacken. And at least one observer has ascribed the madness of the Salem witch trials to ergot poisoning, after determining that many of the afflicted young women whose accusations condemned 19 people as witches lived near rye fields. A ‘witch cake’ made of rye flour was fed to a dog to confirm a ‘bewitching’ effect.29
The rye itself was, of course, entirely innocent, since the common parasitic infestation of the grass was to blame. But, as with so much else between the seeds of grasses and the hapless humans who try to consume them, it should come as no surprise that the relationship is fraught with danger.
Barley
The origins of barley consumption parallel those of einkorn and emmer wheat in the Fertile Crescent, in what is now Iraq, Iran and Turkey. For many years, barley was the preferred grain among the ancient peoples of Greece and Egypt, spreading to Europe 7,000 years ago. Barley has largely been demoted to animal fodder, with most human exposure nowadays coming in the form of the barley malt used to make beer. As with rye, barley shares many characteristics with its close grass relative, wheat. People with coeliac disease, for instance, who avoid wheat because it’s a source of gluten (and thereby gliadin), must also avoid barley due to gliadin’s similarities with barley’s equivalent protein, hordein. Gliadin and hordein overlap extensively, suggesting that the peculiar human effects of wheat are shared by barley.30 The lectin of barley is also virtually identical to wheat germ agglutinin, and it thereby shares its potential for gastrointestinal toxicity. Barley’s allergic effects also overlap with those of wheat, meaning that the same asthma, sinus drainage and congestion, skin rashes and gastrointestinal distress provoked by a wheat allergy can also be provoked by barley.31
Corn
After modern wheat and its problematic closest brethren, rye, barley, bulgur and triticale, corn is the next problem grass. (For the sake of clarity, I will call maize by its North American colloquial name, ‘corn’. While corn outside the United States and Canada can mean wheat or be a nonspecific term for any grain, here it will be used to refer to maize.)
Like einkorn wheat, corn is among the oldest of cultivated grains, dating back some 10,000 years to pre-Mayan times in Mesoamerica, but corn didn’t make it onto European menus until 1493, when Christopher Columbus brought seeds back to Spain. Corn was rapidly embraced, largely replacing barley and millet due to its spectacular yield per acre. Widespread, habitual consumption of cornbread and polenta resulted in deficiencies of niacin (vitamin B3) and the amino acids lysine and tryptophan, causing widespread epidemics of pellagra, evidenced as what doctors of the age called ‘The Four Ds’: dermatitis, diarrhoea, dementia and death. Even today, pellagra remains a significant public health issue in rural South America, Africa and China. Meanwhile, in coastal Peru, Ecuador and Mexico, and in the Andean highlands, increased corn consumption led to increased tooth decay, tooth loss, anaemia and iron deficiency, as well as loss of height in children and adults.32
Today, farmers fatten livestock by feeding them intact corn kernels. But much of the corn consumed by humans is in the form of cornflour, or derivatives of corn such as high-fructose corn syrup. This concentrated source of fructose is a form of sugar that fails to signal satiety – you don’t know when to stop. Corn and wheat jockey for inclusion in just about every processed food, many of which contain both. Corn in some form is therefore found in obvious sources, such as corn chips, cornbread, breakfast cereals, soft drinks with high-fructose corn syrup, tacos and tortillas, but also in some not-so-obvious foods, including hamburger meat, ketchup, salad dressings, yoghurt, soup mixes, sweets, seasoning mixes, mayonnaise, marinara sauce, fruit drinks and peanut butter.
Corn strains with the highest proportion of rapidly digested amylopectin, rather than the less efficiently digested amylose, are chosen to grind into cornflour. Given the enormous increase in surface area that results when corn is reduced to granules or powder, these products are responsible for extravagant rises in blood sugar. With a glycaemic index of 90 to 100, the highest of any food, they are perfectly crafted to contribute to diabetes.33
Corn allergies are on the rise, probably due to changes in alpha-amylase inhibitor proteins, lipid transfer proteins and others. Because the various grasses that we call ‘grains’ are genetically related, there can be overlapping grain allergies in humans exposed to them.34 Repeated and prolonged exposure to corn proteins, as in people who work in agriculture, food production or the pharmaceutical industry (cornflour is found in pills and capsules), can lead to as many as 90 per cent of workers developing a corn allergy.35 Such extravagant levels of allergy development do not occur in people working with apples, beef, kale or olives – only grains.
The zein protein of corn triggers antibodies reactive to wheat gliadin, which can lead to gastrointestinal distress, diarrhoea, bloating, bowel urgency and acid reflux after corn consumption.36 The immune response responsible for the destruction of the small intestine that occurs in people with coeliac disease can also be triggered, though less severely, by the zein protein of corn. Nevertheless, cornflour is – wrongly – used in gluten-free foods.37
Though they look quite different and the modern processed products that emerge from them look, smell and taste quite different, wheat and corn are too closely related for comfort. Minimal to no exposure is the desired strategy for non-ruminant Homo sapiens.
Genetic Modification: Don’t Look, Don’t Tell
Since gene-splicing technology made it possible to insert or remove specific genes in plants and animals, we have been reassured repeatedly by the FDA, the USDA and agribusiness that the products of this technology are safe for the environment and for human consumption. And they have 90-day animal testing data to prove it.
While wheat was manipulated with methods that pre-date genetic modification and therefore didn’t raise many eyebrows, other genetically modified (GM) grains, especially corn and rice, have somehow escaped public scrutiny, and strains have made it onto supermarket shelves in North America and other parts of the world. Recent studies have raised questions about the safety of GM crops, as well as the herbicides and pesticides that go with them. One French research group, for instance, obtained the internal proprietary research data from Monsanto that were used to justify claims of safety for both glyphosate-resistant corn and Bt toxin corn, the two most prevalent GM crops. (This information was not relinquished voluntarily, but rather was obtained by court order.) When they reanalysed the Monsanto data with more detailed tissue analyses, they failed to reproduce the same benign findings, instead reporting evidence of kidney, liver, heart, spleen and adrenal toxicity with both forms of GM corn.38 The first effort to extend the period of observation beyond 90 days raised more disturbing questions: over two years of observation, increased mortality, breast tumours, liver damage and pituitary disruption from both glyphosate-resistant corn and glyphosate itself were reported, in contrast to Monsanto’s benign 90-day findings.39
Further questions have been raised regarding the safety of Bt toxin corn. This strain of corn has a gene for a protein that’s toxic to insects inserted right into it, so it kills pests who try to eat the plant. While Bt toxin-expressing bacteria have been sprayed on crops by organic farmers for 40 years with apparent safety, critics have pointed out that GM corn now expresses Bt toxin within the seed (the corn kernels) directly ingested by consumers. One study in mice demonstrated toxic effects on blood cell formation,40 while another observed prediabetic patterns.41 Genetically modified rice has also been demonstrated to change the composition of bowel flora in mice, with decreased healthy Lactobacillus and increased unhealthy Escherichia coli species.42
Glyphosate itself, the world’s most widely used herbicide, is applied to glyphosate-resistant corn. Various studies suggest it has oestrogenic activity, promoting the growth of breast cancer cells; disrupts male fertility; and disrupts endocrine function in a number of other ways.43 There is also the issue of the environmental impact of glyphosate on wildlife, including aquatic bacteria and amphibians, such as frogs, which experience toxic effects.44
Interestingly, one strain of rice – Golden Rice, which has been genetically modified to express beta-carotene to alleviate the vitamin A deficiency that plagues rice-consuming societies – has been at the forefront of the biotechnology effort to paint genetic modification as something beautiful to behold and safe for consumption. Agribusiness giant Syngenta has been promoting Golden Rice as an example of what the science of genetic modification can accomplish, despite the vigorous opposition of many farmers who wish to avoid using GM grains. Critics have also accused its promoters of trying to capitalize on a common nutrient deficiency by a more profitable route than, say, just having vitamin A-deficient populations eat an occasional sweet potato, which would match or exceed the benefits provided by Golden Rice. (But you can’t patent a regular, nutritious sweet potato.)
Much of the science purporting to explore the safety of GM crops reads more like marketing than science, with researchers gushing about the safety and nutrition of the crop, herbicide or pesticide in question, rather than impartially reporting the science. This brings us to the fundamental problem when deep-pocketed influences such as agribusiness or the pharmaceutical industry are involved: How much can we believe when much of the positive ‘science’ is generated by those who stand to benefit from it?
Rice
Despite sharing a genetic heritage with other grasses, rice is among the more benign of grains, though it’s far from harmless. Viewed from the perspective of the ancient human experience that reveals the destructive health effects of other grasses, rice is the only grain not associated with effects such as increased tooth decay, facial malformations and iron deficiency.45 The less-harmful nature of rice can be partly explained by its very low content (less than 1 per cent) of prolamin proteins.46
The history of rice as yet another seed of a grass consumed by humans dates back 8,000 years to the foothills of the Himalayas, followed by evidence of human cultivation in southern China 4,000 years ago. Rice is the ideal commodity food, as it can be stored for many years without degrading. Health problems from rice are less common than those from other grains. Nonetheless, overreliance on rice with the husk removed (i.e., white or polished rice) led to widespread problems with beriberi, a condition of partial paralysis and heart failure caused by a lack of the B vitamin thiamin – conditions that, I believe you would agree, are beriberi bad. Beriberi can develop within a few weeks, and it plagued Asian sailors and soldiers given rations consisting largely of rice.
As with the seeds of all other grasses, rice shares the potential for excessive glycaemic effects. Carbs account for 85 per cent of the calories in rice, among the highest of all seeds of grasses. Rice-consuming cultures, for instance, can still experience plenty of diabetes. But the comforting notion that rice is among the most benign of grains is being challenged, as it has been the recipient of extensive genetic modification. This includes efforts to make it glyphosate resistant and able to express the Bt toxin, posing the same safety questions as for glyphosate-resistant and Bt toxin-containing corn.
And there’s another issue looming over this particular seed of a grass: rice is unique among grasses in its natural ability to concentrate inorganic arsenic from soil and water. (We can’t blame agribusiness for this effect.) Rice has a high arsenic content, according to reports confirmed by FDA analyses, though the FDA reassures us that no acute toxicity develops from such exposure.47 Substantial research, however, has associated chronic arsenic exposure with multiple forms of cancer, as well as cardiovascular and neurological diseases.48 In Bangladesh, where arsenic exposure is a major public health problem, increasing chronic arsenic exposure, starting at low levels, is associated with premalignant skin lesions, high blood pressure, neurological dysfunction and increased mortality.49 This analysis suggests that adverse health effects can manifest with chronic exposure provided by as little as one serving (approximately 185 g (6½ oz) cooked) of rice per day. The FDA had previously established an upper limit for arsenic in apple juice of 10 parts per billion; analyses of rice have found many rice products approaching or exceeding this cutoff.
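To put those parts-per-billion figures in perspective, here is a rough back-of-the-envelope sketch. The 100 ppb concentration is purely an assumption for illustration; measured levels vary widely with rice variety and growing region.

```python
# Rough arsenic-intake arithmetic: 1 ppb = 1 microgram per kilogram of food.
# The 100 ppb concentration is an illustrative assumption, not a measured value.

def arsenic_ug_per_serving(concentration_ppb: float, serving_g: float) -> float:
    """Micrograms of inorganic arsenic in a single serving."""
    return concentration_ppb * serving_g / 1000.0

daily = arsenic_ug_per_serving(concentration_ppb=100.0, serving_g=185.0)
print(f"{daily:.1f} µg per 185 g serving of cooked rice")       # 18.5 µg
print(f"{daily * 365:.0f} µg per year at one serving per day")  # ~6,750 µg
```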
The data that already exist linking low-level exposure to arsenic-contaminated water with increases in many chronic diseases are, in my mind, all the information we need. Makes you shudder to think about the old Rice Diet. Although rice sits at the more benign end of the spectrum as far as seeds of grasses go, enthusiastic consumption of rice in any form (white, brown or wild) is clearly not a good idea for health. Occasional consumption of small quantities (around 50 g (2 oz)) is probably all a healthy human can tolerate before triggering such concerns.
Oats
Oats are relative newcomers to the human dietary grass experience, having been first consumed only about 3,000 years ago. Few cultures embraced this grain, often regarding it as fodder for livestock or the food of barbarians, until the Welsh and Scots became avid oat consumers. Oats are yet another close relative of wheat and member of the grass family, though their gliadin-like protein, avenin, overlaps less with gliadin in structure than do its counterparts in rye and barley. For this reason, the role of oats in the diet of people with coeliac disease has been debated for 50 years. The avenin protein is clearly more benign, though some oat varieties can still mimic the immune effects of gliadin.50 (The notion of ‘gluten-free oats’ is therefore a fiction, as oats still carry a protein that can overlap with gliadin in structure and effect.) Oats lack a lectin protein, so they do not contribute to the intestinal damage and inflammation inflicted by wheat germ agglutinin.51 This focus on the relatively benign nature of oats in comparison with the worst grain of all, though, falsely lulls people into thinking that just because oats lack gluten-like properties, they must be good for you. Once again, overly simplistic nutritional thinking can get us into trouble.
There is plenty of talk about oats being ‘heart healthy’ and a rich source of soluble fibre, referring to the beta-glucan in oats that has been shown to reduce total and LDL cholesterol. All of that is true – except for the heart-healthy part. Although the beta-glucan fibre does indeed have some healthy effects on cholesterol values, the plentiful amylopectin starch of oats raises blood sugar to high levels and therefore provokes extravagant glycation – the irreversible process of modifying proteins when blood glucose rises. Oats provide an example of something that contains a mixture of good things and bad. The good effects are transient, such as the beta-glucan allowing healthier bowel movements and lower LDL cholesterol, or the B vitamins providing nutrition. But the bad effects are irreversible, especially those of glycation. Consumption of oats, like rice, is best kept to a minimum.
Sorghum
Sorghum was, until sucrose and high-fructose corn syrup became dominant, a popular source of sugar. Until the early 20th century, sorghum syrup was poured over pancakes and used to make sweets. Like all grains, sorghum is largely carbohydrate, with approximately 75 per cent of its calories coming from starch, and it triggers glycation as enthusiastically as the starchy seed of any other grass. It remains popular as fodder for livestock because it’s as useful for rapid fattening as wheat and corn are.
Sorghum is an especially interesting grass, as it is toxic, even fatal, when consumed before it’s fully mature; its high cyanide content has been known to decimate herds of livestock, causing death by cardiac arrest. This grass grows wild in much of Africa and is believed to have been first domesticated in the savannahs around 4,000 years ago. While it is a ‘true grass’ from the family Poaceae, sorghum is less closely related to the grasses discussed above. The gliadin counterpart in sorghum, kafirin, is only distantly related to gliadin and therefore does not trigger coeliac or other undesirable gliadin responses. Despite the more benign nature of kafirin, however, sorghum is still the seed of a grass and is largely indigestible: about half of its proteins pass right through the human gastrointestinal tract undisturbed.52 This has prompted manipulations to increase digestibility, including mutating the plant’s genetics with gamma radiation and chemicals, genetically modifying it by inserting genes for more digestible proteins, and mechanically or enzymatically processing the flour.
It is not clear what would happen to humans who relied too much on sorghum as a calorie source. But given its problematic indigestible proteins and high starch content, it is worth minimizing exposure, as with rice and oats.
There’s a Snake in the Grass
To complete our discussion of the seeds of grasses, I should mention that bulgur is simply wheat – often of the durum variety, such as that used in pasta – that has been parboiled, dried and cracked. It is still wheat, with virtually all the same problems. Triticale is the result of mating wheat with rye; as you would predict, it shares all of the same issues due to its parentage.
Millet, teff and amaranth (the last not a true grass, though used like one), all added to our diets over the last few thousand years, are among several other less-common grains that humans consume. None causes the range of health difficulties that wheat, rye, barley, corn, bulgur, triticale or sorghum are responsible for, nor have they been the recipients of enthusiastic genetic modification. However, they are still high in carbohydrates, given their amylopectin content. In France, ortolan songbirds made morbidly obese on a diet of millet and oats, then drowned in Armagnac, roasted and consumed whole, were considered a delicacy savoured for their rich, dripping fat. (The practice is now outlawed.) Just like corn and wheat, grains whose only known problem is their amylopectin starch are still quite effective at fattening up pigs, cows, songbirds and humans.
Some people feel that they can consume a small quantity of these glycaemically challenging grains now and then without paying a health price, but bear in mind that each time you consume these starchy seeds you invite greater and greater health compromises, just as you do when you eat a bag of jelly beans.
The Human Diet: A Grass-Free Zone
You may want your beef to be grass-fed, but you shouldn’t be that way.
You may have come to recognize that the deeper we dig into this thing called grains or, more properly, the seeds of grasses, the worse it gets. We uncover more and more reasons why non-ruminant Homo sapiens is just not equipped to handle the components of these plants: the lectins in wheat, rye, barley and rice; the prolamin proteins gliadin, secalin, hordein, zein and kafirin; acrylamides; cyanide; and arsenic – not to mention that we suffer deficiency diseases like pellagra and beriberi when we come to overrely on these seeds. Ironically, the world’s calories are most concentrated in the most destructive grains – wheat and corn – and some serious questions have now been raised about the safety of rice.
Funny how this just doesn’t happen with broccoli, celery, walnuts, olives, eggs or salmon – foods we can consume ad lib and digest easily, without triggering high blood sugar, glycation, autoimmunity, dementia or other disease-related effects. As you might predict from the stories I’ve related so far, eliminating the seeds of grasses that were not on the instinctive menu for Homo sapiens frees us of many of the health conditions that plague modern humans, including rampant tooth decay, hypertension, diabetes, depression, and a wide range of neurological and gastrointestinal disorders – conditions notably absent or rare in humans following traditional diets. So I urge you to release your inner ruminant; recognize grains for the indigestible, often toxic seeds of grasses that they are; and allow your struggling Homo sapiens to fully express itself. I predict that you will rediscover health at a level you may not have known was possible.
In the next chapter, we consider just why – beyond desperation, beyond convenience, beyond appeal – grains have managed to dominate the human diet over a relatively short period of time. Why have grains gone from an occasional food of hungry, desperate humans to the dominant food supply for mankind?