Wheat Belly Total Health: The effortless grain-free health and weight-loss plan – Dr William Davis
Chapter 1 Liberate Your Inner Cow: Life Ungrained
Goldfish do not eat sausages. John Cleese, ‘How to Feed a Goldfish’, Monty Python’s Flying Circus
Since you are reading this book, I take it that you are a member of the species Homo sapiens. You are probably not a giraffe, toad or woodpecker. Nor are you a ruminant, those taciturn creatures that graze on grass.
Ruminants, such as goats and cows, and their ancient, wild counterparts, ibex and aurochs, enjoy evolutionary adaptations that allow them to consume grasses. They have continuously growing teeth to compensate for the wear generated by the coarse, sandlike phytolith particles in grass blades; they produce saliva in prodigious quantities (a cow generates in excess of 100 litres per day); they have four-compartment stomachs that host unique microorganisms to digest grass components, including a compartment that grinds its contents and then regurgitates them as a cud for rechewing; and they have a long, spiral colon that hosts still more microorganisms to further digest grassy remains. In other words, ruminants have a gastrointestinal system uniquely specialized to consume grasses.
You don’t look, smell or act like a ruminant. Then why would you eat like one?
Those of you who have already forgone wheat do not, of course. But if you remain of the ‘healthy whole grain’-consuming persuasion, you have fallen victim to the belief that grasses should be your primary source of calories. Just as the Kentucky bluegrass and ryegrass in your lawn are grasses from the biological family Poaceae, so are wheat, rye, barley, corn, rice, bulgur, sorghum, triticale, millet, teff and oats. You grow teeth twice in your life, then stop, leaving you to make do for a lifetime with a prepubertal set that erupted around the age of 10; you produce a meagre litre of saliva per day; you have a single-compartment stomach, unpopulated by grass-digesting microorganisms and incapable of any grinding action; you don’t chew a cud; and you have a relatively uninteresting, linear, nonspiral colon. These traits allow you to be omnivorous – but not to consume grasses.
Early members of our species found nourishment through scavenging, and then hunting, animals such as gazelles, turtles, birds and fish, and consuming the edible parts of plants, including fruit and roots, as well as mushrooms, nuts and seeds. Hungry humans instinctively regarded all of these as food. About 10,000 years ago, during a period of increasing temperature and dryness in the Fertile Crescent, humans observed the ibex and aurochs grazing on einkorn, the ancient predecessor of modern wheat. Our hungry, omnivorous ancestors asked, ‘Can we eat that, too?’ They did, and surely got sick: vomiting, cramps and diarrhoea. At the very least they simply passed wheat plants out undigested, since humans lack the ruminant digestive apparatus. Grass plants in their intact form are unquestionably unappetizing. We somehow figured out that for humans, the only edible part of the einkorn plant was the seed – not the roots, not the stem, not the leaves, not the entire seed head – just the seed, and even that was only edible after the outer husk was removed and the seed was chewed or crushed with rocks and then heated in crude pottery over fire. Only then could we consume the seeds of this grass as porridge, a practice that served us well in times of desperation when ibex meat, bird eggs and figs were in short supply.
Similar grass-consuming adventures occurred with teosinte and maize (the ancestors of modern corn) in the Americas; rice from the swamps of Asia; and sorghum and millet in sub-Saharan Africa, all requiring similar manipulations to allow the edible part – the seed – to be consumed by humans. Some grasses, such as sorghum, posed other obstacles; its content of poisons (such as hydrocyanic acid, or cyanide) results in sudden death when the plant is consumed before maturity. Natural evolution of grasses led to wheat strains such as emmer, spelt and kamut as wheat exchanged genes from other wild grasses, while humans selected strains of corn with larger seeds and seed heads (cobs).
What happened to those first humans, hungry and desperate, who figured out how to make this one component of grasses – the seed – edible? Incredibly, anthropologists have known this for years. The first humans to consume the grassy food of the ibex and aurochs experienced explosive tooth decay; shrinkage of the maxillary bone and mandible, resulting in tooth crowding; iron deficiency; and scurvy. They also experienced a reduction in bone diameter and length, resulting in a loss of as much as 5 inches in height for men and 3 inches for women.1
The deterioration of dental health is especially interesting, as dental decay was uncommon prior to the consumption of the seeds of grasses, affecting less than 1 per cent of all teeth recovered – despite the absence of toothbrushes, toothpaste, fluoridated water, dental floss and dentists. Even though pre-agricultural humans had no notion of dental hygiene (aside from perhaps using a twig to pick the fibres of wild boar from between their teeth), tooth decay simply did not beset many members of our species before grains. The notion of toothless savages is all wrong; they enjoyed sturdy, intact teeth for their entire lives. It was only after humans began to resort to the seeds of grasses for calories that mouths of rotten and crooked teeth began to appear in children and adults. From that point on, decay was evident in 16 to 49 per cent of all teeth recovered, along with tooth loss and abscesses, making tooth decay as commonplace as bad hair among humans of the agricultural Neolithic Age.2
In short, when we started consuming the seeds of grasses 10,000 years ago, this food source may have allowed us to survive another day, week or month during times when foods we had instinctively consumed during the preceding 2.5 million years fell into short supply. But this expedient represents a dietary pattern that constitutes only 0.4 per cent – less than one-half of 1 per cent – of our time on earth. This change in dietary fortunes was accompanied by a substantial price. From the standpoint of oral health, humans remained in the Dental Dark Ages from their first taste of porridge all the way up until recent times. History is rich with descriptions of toothaches, oral abscesses, and stumbling and painful efforts to extract tainted teeth. Remember George Washington and his mouthful of wooden false teeth? It wasn’t until the 20th century that modern dental hygiene was born and we finally managed to keep most of our teeth through adulthood.
Fast-forward to the 21st century: modern wheat now accounts for 20 per cent of all calories consumed by humans; the seeds of wheat, corn and rice combined make up 50 per cent.3 Yes, the seeds of grasses provide half of all human calories. We have become a grass seed-consuming species, a development enthusiastically applauded by agencies such as the USDA, which advises us that increasing our consumption to 60 per cent of calories or higher is a laudable dietary goal. It’s also a situation celebrated by all of those people who trade grain on an international scale, since the seeds of grasses have a prolonged shelf life (months to years) that allows transoceanic shipment, they’re easy to store, they don’t require refrigeration and they’re in demand worldwide – all the traits desirable in a commoditized version of food. The transformation of a foodstuff into a commodity that’s tradeable on a global scale allows financial manipulations, such as buying and selling futures, hedges and complex derivative instruments – the tools of mega-commerce – to emerge. You can’t do that with organic blueberries or Atlantic salmon.
Examine the anatomy of a member of the species Homo sapiens and you cannot escape the conclusion that you are not a ruminant, have none of the adaptive digestive traits of such creatures and can only consume the seeds of grasses – the food of desperation – by accepting a decline in your health. But the seeds of grasses can be used to feed the masses cheaply, quickly and on a massive scale, all while generating huge profits for those who control the flow of these commoditized foods.
Mutant Ninja Grasses
The seeds of grasses, known to us more familiarly as ‘grains’ or ‘cereals’, have always been a problem for us nonruminant creatures. But then busy geneticists and agribusiness got into the act. That’s when grains went from bad to worse.
Readers of the original Wheat Belly know that modern wheat is no longer the 4½-foot-tall traditional plant we all remember; it is now an 18-inch-tall plant with a short, thick stalk; long seed head; and larger seeds. It has a much greater yield per acre than its traditional predecessors. This high-yield strain of wheat, now the darling of agribusiness, was not created through genetic modification but through repetitive hybridizations, mating wheat with nonwheat grasses to introduce new genes (wheat is a grass, after all), and through mutagenesis, the use of high-dose x-rays, gamma rays and chemicals to induce mutations. Yes: modern wheat is, to a considerable degree, a grass that contains an array of mutations, some of which have been mapped and identified, many of which have not. Such uncertainties never faze agribusiness, however. Unique mutated proteins? No problem. The USDA and US Food and Drug Administration (FDA) say they’re okay, too – perfectly fine for public consumption.
Over the years, there have been many efforts to genetically modify wheat, such as by using gene-splicing technology to insert or delete a gene. However, public resistance has dampened efforts to bring genetically modified (GM) wheat to market, so no wheat currently sold is, in the terminology of genetics, ‘genetically modified’. (There have been recent industry rumblings, however, that make the prospect of true GM wheat a probable reality in the near future.) All of the changes introduced into modern wheat are the results of methods that pre-date the technology to create GM foods. This does not mean that the methods used to change wheat were benign; in fact, the crude and imprecise methods used to change wheat, such as chemical mutagenesis, have the potential to be worse than genetic modification, yielding a greater number of unanticipated changes in genetic code than the handful introduced through gene-splicing.4
Corn and rice, on the other hand, have been genetically modified, in addition to undergoing other changes. For instance, scientists introduced genes to make corn resistant to the herbicide glyphosate and to express Bacillus thuringiensis (Bt) toxin, which kills insects, while rice has been genetically modified to resist the herbicide glufosinate and to express beta-carotene (a variety called Golden Rice). The problem: while, in theory, the notion of inserting just one gene seems simple and straightforward, it is anything but. The methods of gene insertion remain crude. The site of insertion – which chromosome, inside or alongside other genes, within or outside various control elements – not to mention disruption of the epigenetic marks that control gene expression, cannot be controlled with current technology. And it’s misleading to say that only one gene is inserted, as the methods used usually require several genes to be inserted. (We discuss the nature of specific changes in GM grains in Chapter 2.)
The wheat, corn and rice that make up 50 per cent of the human diet in the 21st century are not the wheat, corn and rice of the 20th century. They’re not the wheat, corn and rice of the Middle Ages, nor of the Bible, nor of the Egyptian empire. And they are definitely not the same wheat, corn and rice that were harvested by those early hungry humans. They are what I call ‘Frankengrains’: hybridized, mutated, genetically modified to suit the desires of agribusiness, and now available at a supermarket, convenience store or school near you.
Wheat: What Changed . . . and Why Are the Changes So Bad?
All strains of wheat, including traditional strains like spelt and emmer, are problems for nonruminant humans who consume them. But modern wheat is the worst.
Modern wheat looks different: shorter, with a thicker stalk and larger seeds. The reduction in height is due to mutations in Rht (reduced height) genes, which alter the plant’s response to gibberellin, the growth hormone that controls stalk length. But these mutations do not travel alone; changes in Rht genes are accompanied by other changes in the genetic code of the wheat plant.5 There’s more here than meets the eye.
Gliadin
While gluten is often fingered as the source of wheat’s problems, it’s really gliadin, a protein within gluten, that is the culprit behind many destructive health effects of modern wheat. There are more than 200 forms of gliadin proteins, all incompletely digestible.6 One important change that has emerged over the past 50 years is increased expression of a gene called Glia-9, which yields the gliadin protein that is the most potent trigger for coeliac disease. While the Glia-9 gene was absent from most strains of wheat in the early 20th century, it is now present in nearly all modern varieties,7 probably accounting for the 400 per cent increase in coeliac disease witnessed since 1948.8
New gliadin variants are partially digested into small peptides that enter the bloodstream and then bind to opiate receptors in the human brain – the same receptors activated by heroin and morphine.9 Researchers call these peptides ‘exorphins’, or exogenous morphine-like compounds. Gliadin-derived peptides, however, generate no ‘high’, but they do trigger increased appetite and increased calorie consumption, with studies demonstrating consistent increases of 400 calories per day, mostly from carbohydrates.
Gluten
Gluten (gliadin + glutenins) is the stuff that confers the stretchiness unique to wheat dough. It is also a popular additive in processed foods such as sauces, instant soups and frozen foods, which means the average person ingests between 15 and 20 grams (g) per day.10 Wheat has been manipulated extensively to improve the baking characteristics conferred by its glutenins: geneticists have crossbred wheat strains repeatedly, bred wheat with nonwheat grasses to introduce new genes, and used chemicals and radiation to induce mutations. These breeding methods do not yield predictable changes in gluten quality; hybridizing two different wheat plants can generate as many as 14 unique glutenin proteins never before encountered by humans.11
Wheat Germ Agglutinin
The genetic changes inflicted on wheat have altered the structure of wheat germ agglutinin (WGA), a protein in wheat that provides protection against moulds and insects. The structure of WGA in modern wheat, for instance, differs from that of ancient wheat strains.12 WGA is indigestible and toxic, resistant to any breakdown in the human body, and unchanged by cooking, baking and sourdough fermentation. Unlike gluten and gliadin, which require genetic susceptibility to exert some of their negative effects, WGA does its damage directly. WGA alone is sufficient to generate coeliac disease-like intestinal damage by disrupting microvilli, the absorptive ‘hairs’ of intestinal cells.13
Phytates
Phytic acid (phytates) is a storage form of phosphorus in wheat and other grains. Because phytates also provide resistance to pests, grain-breeding efforts over the past 50 years have selected strains with increased phytate content. Modern wheat, maize and millet, for instance, each contain 800 milligrams (mg) of phytates per 100 g (3½ ounces) of flour. Phytate content increases with fibre content, so advice to increase fibre in your diet by consuming more ‘healthy whole grains’ also increases the phytate content of your diet. As little as 50 mg of phytates can turn off absorption of minerals, especially iron and zinc.14 Children who consume grains ingest 600 to 1,900 mg of phytates per day, while enthusiastic grain-consuming cultures, such as modern Mexicans, ingest 4,000 to 5,000 mg per day. These levels are associated with nutrient deficiencies.15
Alpha-Amylase Inhibitors and Other Allergens
Wheat allergies are becoming more prevalent. Numerous allergens have been identified in modern wheat that are not present in ancient or traditional forms of the plant.16 The most common are alpha-amylase inhibitors, which are responsible for causing hives, asthma, cramps, diarrhoea and eczema. Compared with older strains, the structure of modern alpha-amylase inhibitors differs by 10 per cent, meaning it may have as many as several dozen amino acid differences. As any allergist will tell you, just a few amino acids can spell the difference between no allergic reaction and a severe allergic reaction, or even anaphylactic shock. People in the baking industry frequently develop a condition called baker’s asthma. There is also a peculiar condition called wheat-derived exercise-induced anaphylaxis (WDEIA), a severe and life-threatening allergy induced by exercising after eating wheat. Both conditions are caused by an allergy to gliadin proteins.17 Many other proteins have undergone changes over the last 40 years: lipid transfer proteins, omega-gliadins, gamma-gliadins, trypsin inhibitors, serpins and glutenins. All trigger allergic reactions.
Life Outside the Grain Mooovement
The start of grain consumption by humans coincided with the dawn of livestock domestication. We learned that some herbivorous species, such as aurochs and ibex, could be confined, bred in captivity, and put into the service of the human diet. While we were domesticating these creatures into cows and goats, their diet of grasses suggested something we could try to mimic. They also contributed to human disease, giving us smallpox, measles, tuberculosis, and rhinoviruses that cause the common cold.
While much of the world followed the lead of grazing ruminants and adopted a diet increasingly reliant on the seeds of grasses, not all cultures took this 10,000-year dietary detour. A number of hunter-gatherer societies throughout the world never embraced grains, relying instead on traditional omnivorous menus. The diets followed by such societies therefore largely reflect the diets of pre-Neolithic humans, i.e., diets that pre-date the development of agriculture. The modern world has, over the past few hundred years, encroached on these primitive societies, particularly if their land or other resources were prized. (Think Native Americans and Canadians of the Pacific Northwest or Aboriginal populations of Australia.) Each instance provides a virtual laboratory to observe what happens to health when there is a shift from a traditional grain-free to a modern grain-filled diet.
We have cultural anthropologists and field-working doctors to thank for such insights. Scientists have studied, for instance, the San of southern Africa, Kitavan Islanders of Papua New Guinea and the Xingu peoples of the Brazilian rainforest, all of whom consume foods obtained from their unique habitats. None consume modern processed foods, of course, meaning no grains, no added sugars, no hydrogenated oils, no preservatives and no artificial food colouring. People following their ancestral diets consistently demonstrate low body weight and body mass index (BMI); freedom from obesity; normal blood pressure; normal blood sugar and insulin responses; lower leptin levels (the hormone of satiety); and better bone health.18 Body mass index, reflecting a ratio of weight to height, is typically 22 or less, compared with our growing ranks of people with BMIs of 30 or more, with 30 representing the widely accepted cutoff for obesity. The average blood pressure of a Xingu woman is 102/66 mmHg, compared with our typical blood pressures of 130/80 or higher. The Xingu experience less osteoporosis and fewer fractures.
The Hadza of northern Tanzania are a good example of a hunter-gatherer society that, despite contact with Westerners, has clung to traditional methods of procuring food.19 The women dig for roots and gather edible parts of plants, while the men hunt with bows and poison-tipped arrows and gather honey from bees. The average BMI of this population? Around 20, with vigour maintained into later life, as grandparents help rear grandchildren while mothers gather and prepare food. Despite a lifestyle that appears physically demanding on the surface, the total energy expenditure of the Hadza is no different to that of modern people – not greater or less than, say, an average accountant or schoolteacher.20 Activity is parcelled a bit differently, of course, with hunter-gatherers tending to experience bursts of intense activity, followed by prolonged rest, and modern cultures gradually playing out activity throughout the day, but detailed analyses of energy expenditure among primitive people show virtually no difference. This challenges the notion that modern excess weight gain can be blamed on increasingly sedentary lifestyles.21 (Note that this is not true for all hunter-gatherer cultures; the Luo and Kamba of rural Kenya, for instance, exhibit high levels of energy expenditure. The point is that differences in weight are not solely explained by differences in energy expenditure.)
Humans are adaptable creatures, as the wide variety of diets consumed worldwide attests. Some rely almost exclusively on the flesh, organs and fat of animals, such as the traditional Inuit of the North American Arctic. Some diets are high in starches from roots (such as yams, sweet potatoes, taro and tapioca) and fruit, as with the Kitavans of Papua New Guinea or the Yanomami of the Brazilian rainforest.
The incorporation of foods from the mammary glands of bovines favoured the spread of a lactase-persistence gene that allows some adults to consume milk, cheese and other products containing the sugar lactose beyond the first few years of life – a survival advantage. The seminomadic Maasai people of east Africa are a notable example. Largely herders of goats, sheep and cattle, they traditionally consume plentiful raw meat and the blood of cows mixed with milk, and they’ve done so for thousands of years. This lifestyle allows them to enjoy freedom from cardiovascular disease, hypertension, diabetes and excess weight.22
This is the recurring theme throughout primitive societies: A traditional diet, varied in composition and high in nutrient content but containing no grains or added sugars, allows people to enjoy freedom from all the chronic ‘diseases of affluence’. Even cancer is rare.23 This is not to say that people following traditional lifestyles don’t succumb to disease; of course they do. But the range of ailments is entirely different. They suffer infections such as malaria, dengue fever and nematode infestations of the gastrointestinal tract, as well as traumatic injuries from falls, battles with humans and animals, and lacerations, reflecting the hazards of living without modern tools, conveniences, central governments or modern health care.
What happens when a culture that has avoided the adoption of agriculture and grain consumption is confronted with modern breads, biscuits and crisps? This invasion by modern foods has played out countless times on a worldwide stage, with the same results each and every time: weight gain and obesity to an astounding degree, tooth decay, gingivitis and periodontitis, tooth loss, arthritis, hypertension, diabetes, and depression and other psychiatric conditions – all the modern diseases of affluence. Like a broken record, this same refrain has played over and over again in varied populations, on every continent.
It has been observed in Pima Indians of the American Southwest, where 40 to 50 per cent of adults are obese and diabetic, many toothless.24 It has been observed in native tribes of Arizona, Oklahoma and the Dakotas, resulting in 54 to 67 per cent of the population being overweight or obese.25 Peoples inhabiting circumpolar regions of Canada and Greenland have all experienced dramatic increases in obesity and diabetes.26 In Pacific Islanders, such as the Micronesian Nauru, 40 per cent of adults are obese with diabetes.27 Modernized diets have put Australian Aboriginal populations in especially desperate health straits, with 22 times the risk of complications of diabetes, 8 times higher cardiovascular mortality, and 6 times greater mortality from stroke compared with non-Aboriginal Australians.28
Until recently, the Maasai of east Africa, Samburu of Kenya and Fulani of Nigeria showed virtually no overweight or obesity, no hypertension and low total cholesterol values (125 mg/dl). When they relocate to urban settings, however, hypertension and obesity explode, with 55 per cent becoming overweight or obese.29 Former hunter-gatherers develop iron deficiency anaemia and folate deficiency as they transition away from hunting game and gathering wild vegetation and come to rely on purchased foods, especially corn.30 Dr Roberto Baruzzi, a Brazilian doctor, studied hunter-gatherers of the Xingu region of Brazil in the 1960s and 1970s and found slender people with no discernible excess body fat, no diabetes, no cardiovascular disease, no ulcers and no appendicitis. A repeat survey in 2009, following 30 years of contact with modern food, found 46 per cent of the people overweight or obese, 25 per cent of the men hypertensive, and most with abnormalities of cholesterol panels (such as low HDL cholesterol or high triglycerides) and rampant dental decay.31 Another recent assessment of Aruák natives of the Xingu region documented 66.8 per cent of men and women as overweight or obese, 52.1 per cent of women with abdominal obesity and 37.7 per cent of men with hypertension.32
All of these groups represent humans who never developed the partial tolerance that agricultural societies evolved over 10,000 years of consuming the seeds of grasses. Consequently, they show even more exaggerated responses to grains and sugars than we do.
The diseases of modernization are unfortunately intertwined with the diseases of poverty, given the disrupted and marginalized lives indigenous people often endure at the heavy hands of modern society. Typically, an overreliance on cheap grains and sugars characterizes the diets of these latecomers to the modern world, replacing gathered vegetation, for instance, with flours, convenience foods and sweets. And if Western aid is required due to starvation and maldistribution (which is common when former hunter-gatherers are disconnected from their traditional lifestyles), do we fly in beef, salmon, coconuts or cucumbers? Nope: we send in the grain – wheat, maize, rice – which feeds humans as well as their livestock.
Type 2 diabetes, in particular, is the defining disease acquired when hunter-gatherer populations join the modern world in dietary and health habits – so much so that anthropologists have labelled diabetes ‘the price of civilization’. And, of course, all of us modern humans, being hunter-gatherers at our genetic core, are experiencing diabetes at an unprecedented rate. This modern disease is expected to afflict a third of all adults in coming years, as well as a growing proportion of children and teenagers.33 The world of humans now obtains 50 per cent of its calories from the seeds of grasses and is increasing consumption of sucrose and fructose. Meanwhile, we’re being urged to further increase our reliance on ‘healthy whole grains’ in the developed world while we resort to cheap, accessible grains of any sort in the less-developed world. Under these circumstances, we can expect no relief from this global man-made pandemic – unless we reject the notion of consuming the seeds of grasses outright.
Dr Weston Price: Snapshots of Westernization
Dr Weston Price was a dentist practising in Cleveland, Ohio, during the early 20th century. He was troubled by the amount of tooth decay he witnessed in his patients, particularly children, and intrigued by reports that ‘savages’ (people living in primitive settings) were virtually free of tooth problems. So Dr Price did something extraordinary: he left his home and, along with his wife, Florence, began a 10-year worldwide journey to chronicle the dietary habits of primitive cultures, documenting his findings with careful examinations of teeth, facial structure and more than 15,000 photographs. His efforts provide a remarkable visual record of what primitive cultures looked like and what happens to primitive humans when they begin to consume modern foods.
His travels took him to the Inuits of Alaska, the Native Americans of the Pacific Northwest and central Canada, Melanesians and Polynesians, Aborigines of Australia, the Maori of New Zealand, descendants of the ancient Chimú culture in coastal Peru, and tribes of Africa, including Maasai, Kikuyu, Wakamba, Jalou, Muhima, Pygmies, Baitu and Dinkas. In each locale, he examined and photographed teeth, faces and other features he found interesting. In short, Dr Price produced a fascinating record of people living their traditional lifestyles at a moment in time when it was all about to end.
In every culture of the dozens he studied – without exception – he found tooth decay, tooth loss and dental abscesses or infections to be uncommon, typically affecting no more than 1 to 3 per cent (and sometimes none) of the teeth he examined. He also noted the absence of gingivitis and periodontitis, and few to no crooked or crowded teeth. Ever the keeper of meticulous records, he also observed that facial structure differed, with primitive people enjoying what he called ‘fully formed facial and dental arches’ and no narrowed nasal passages.
Even more remarkably, Dr Price specifically sought out members of these cultures who had recently transitioned to consuming ‘white man’s food’ – people who were bartering for the breads, pastries and sweets of Westerners visiting or bordering their land. In every instance, he observed an astounding increase in tooth decay, affecting 25 to 50 per cent of teeth examined, along with gingivitis, periodontitis, tooth loss, infectious abscesses, crooked and crowded teeth, and reductions in the size of the maxillary (midfacial) bone and mandible (jawbone). Nearly toothless mouths in teenagers and young adults were not uncommon.
The traditional diets of these societies were typically fish, shellfish and kelp among coastal cultures, and animal flesh and organs, raw dairy products, edible plants, nuts, mushrooms and insects among inland cultures. With only two exceptions (the Lötschental Valley Swiss, isolated by the Alps, who consumed a coarse rye bread, and the Gaelic people of the islands of the Outer Hebrides, who consumed crude oats), grains, sugars and processed foods were notably absent. (The Swiss showed an intermediate rate of dental caries, higher than the other cultures studied; the Gaelic population did not.)
What is even more startling about Dr Price’s observations of the rarity of tooth decay and deformity is that none of these cultures practised any sort of dental hygiene: no toothbrushes, no toothpaste, no fluoridated water, no dental floss and no dentists or orthodontists. While Dr Price’s observations cannot be used to precisely pinpoint the nutritional distinctions between modern and traditional cultures, they nonetheless make a powerful point. Anyone wishing to read Dr Price’s account can find it reproduced in a recent reprint.34
This social ‘experiment’ has also occurred in the opposite direction: a return to traditional diet and lifestyle after a period of Westernization. In 1980, Dr Kerin O’Dea, while at the Royal Children’s Hospital in Melbourne, conducted an extraordinary experiment: she asked 10 diabetic, overweight Aboriginal individuals living Western lifestyles, all of whom retained memories of prior lifestyles, to move back to their origins in the wilds of northwestern Australia and follow their previous hunter-gatherer diet of kangaroo, freshwater fish and yams. They began their adventure with high blood glucose levels of (on average) 209 mg/dl, high triglycerides of 357 mg/dl, as well as abnormal insulin levels. After seven weeks of living in the wild, killing animals and eating familiar gathered foods, the 10 lost an average of 17.6 pounds of body weight and dropped their blood glucose to 119 mg/dl and triglycerides to 106 mg/dl.35 Of the original 10, five returned nondiabetic. In a 2005 lecture, Dr O’Dea remarked: ‘I was struck by the change in people when they were back in their own country: they were confident and assertive, and proud of their local knowledge and skills. At the time we were not able to measure markers of psychosocial state, however observation suggested a very positive change.’36
Search the four corners of the earth today and you will find that the only surviving hunter-gatherer population that’s untouched by modern diet is the Sentinelese of the North Sentinel Island in the Indian Ocean. Because their language is strikingly different from all languages in neighbouring lands, it is thought that the Sentinelese have been isolated since anatomically modern humans first migrated to this part of the world 60,000 years ago.37 Attempts to visit their island have been met with volleys of arrows, spears and rocks, so observations are limited. From what has been observed, however, they are lean and healthy, hunting, fishing and gathering foods without the ‘benefit’ of agriculture.
We have to be careful not to regard the life of the hunter-gatherer human as idyllic or problem-free: they had plenty of problems. While it is widely believed that stress is a modern phenomenon, this is absurd. Which is more stressful: struggling to pay your bills or having a marauding, bloodthirsty tribe of humans slaughter your friends, seize the women and enslave the children? We need to observe some of the practices of primitive cultures, such as head shrinking by the Jivaro Indians of the Amazon or cannibalism by the Carib of the Lesser Antilles and Venezuela, to remind ourselves that the world of humans can be an inhospitable place. Violence inflicted by and upon humans has characterized our existence from the start. While violence is certainly still a part of modern life, legal and political constraints that became necessary as human populations developed greater reliance on the practice of agriculture make it far less a part of day-to-day life than it was, say, 50,000 years ago. Yes, there is a bright side to agriculture and civilization.
The development of civilization and the cultivation of the seeds of grasses are two processes that ran in parallel over the past 10,000 years, giving rise to sedentary, non-nomadic life, land ownership, centralized government and many other phenomena we now accept as part of modern life. But when we observe what happens to cultures unexposed to the seeds of grasses who are then compelled to consume them, we see an exaggerated microcosm of what the rest of the world is now experiencing.
Eat Like an Egyptian
Tooth decay, dental infections, crooked teeth, iron and folate deficiencies, diabetes, degenerated joints, weight gain, obesity: I’ve just described the average modern person. Take a member of a primitive culture following their traditional diet and feed them the processed foods of modern man – complete with the enticing products of the seeds of grasses – and within a few years, we’ve given them all the same problems we have, or worse. Yes, without ‘modern civilization’ they might succumb to the greedy ambitions of a violent neighbouring clan, but with grain in their lives they’ll have to engage in battle while sporting a 44-inch waist, two bad knees and a mouth that’s missing half its teeth.
While obesity and the diseases associated with it are virtually absent from hunter-gatherer cultures, they are not entirely new. Diseases of affluence developed even before geneticists introduced changes into grains. Hippocrates, a Greek doctor of the 5th century BC, and Galen, a Roman doctor of the 2nd century AD, both made detailed studies of obese people. William Wadd, an early-19th-century London doctor and a lifelong observer of the ‘corpulent’, made this observation after the autopsy of an obese man:
The heart itself was a mass of fat. The omentum [a component of the intestines] was a thick fat apron. The whole of the intestinal canal was imbedded in fat, as if melted tallow had been poured into the cavity of the abdomen; and the diaphragm and the parietes [walls of organs] of the abdomen must have been strained to their very utmost extent, to have sustained the extreme and constant pressure of such a weighty mass. So great was the mechanical obstruction to the functions of an organ essential to life, that the wonder is, not that he should die, but that he should live.38
What is new is that overweight and obesity have been transformed from curiosity into epidemic. The situation we confront in the 21st century is all the more astounding because modern epidemiologists and health officials declare either that the causes of the epidemic of overweight, obesity and their accompanying diseases are unclear, or that the burden of blame should be placed on the gluttonous and sedentary shoulders of the public. But the answers can be discerned through observations of primitive societies untouched by the problems that plague us.
More than the presence of grains distinguishes primitive from modern life, of course. Hunter-gatherers also drank no soft drinks; consumed no processed foods laced with hydrogenated fats, food preservatives or food colourings; and consumed no high-fructose corn syrup or sucrose. They were not exposed to the endocrine-disruptive chemicals that industry releases into our groundwater and soil and that taint our food. The civilizations of ancient Greece and Rome and of 19th-century Europe also did not consume these components of the modern diet (apart from the increasing consumption of sucrose beginning in the 19th century). No Coca-Cola, hydrogenated fats, brightly coloured sweets lit up by FD&C Red No. 3 (E127) or polychlorinated biphenyl (PCB)-laced water graced their tables. But they did consume the seeds of grasses.
So just how much can we blame on the adoption of the seeds of grasses into the human diet? Let’s consider that question next. Each variety of grass seed poses its own unique set of challenges to the nonruminants who consume it. Before we get under way in our discussion of regaining health in the absence of grains, let’s talk about just how grains ruin the health of every human who allows them to adorn his or her plate.