Chapter 3 - Minerals, the Hidden Heroes
Knowledge of our physiology has increased dramatically in recent decades. The foundations were laid primarily in the 19th century, when technological advances also changed the world of medicine. Chemistry, biochemistry, and physiology were increasingly studied in a systematic, scientific manner. The idea emerged that some illnesses might simply be deficiencies of a vital nutrient, and that replacing the right element would restore health. This is an entirely different approach from the theory that diseases are caused by “too much” of an external influence, for example bacteria or toxins.
In 1755, Gaspar Casal published a report describing a case of “pellagra,” a disease not known in Spain at the time, characterized by skin lesions, diarrhea, and a dementia-like deterioration, and which, interestingly enough, became significantly more frequent in the spring. Those affected were mostly poor and subsisted largely on cornmeal or corn grits, with no access to milk or meat. Similar reports were published in Italy, and there too it was mainly the poor who were affected. At first, too much corn was thought to be the cause, since corn had first been imported into Europe by the Spanish after they colonized South and Central America. However, in the Americas corn did not cause the same symptoms, although the rural population there also had little access to milk and meat. The difference was found in the processing of the corn. The Mexicans soaked the corn in a wood ash mixture, an alkaline treatment that releases the niacin otherwise bound in the grain, while the Europeans made the corn into polenta, which could lead to contamination with fungi or toxins. Further theories circulated until an American named Joseph Goldberger administered milk and eggs to sick people and was thus able to make the symptoms disappear.9 This was reproduced and verified in various countries in Europe and in the U.S., revealing that pellagra was a deficiency of a vital nutrient (nicotinic acid, or vitamin B3).
In 1849, Thomas Addison described a potentially fatal anemia called “pernicious anemia.” It was characterized by a reduced number of abnormally enlarged red blood cells. Vitamin B12 was later identified as the missing element, but its absorption in the intestine is only possible through an “intrinsic factor” produced in the stomach, together with intact gastric acid, both products of healthy gastric cell metabolism.
Beriberi was a disease not uncommon in rice-rich Asian cultures in the 19th century, characterized by fatigue, impaired concentration, tremors, abdominal pain, burning feet, and even paralysis and heart failure. On longer sea voyages, up to 60% of a crew became ill and 25% died. Both deaths and the onset of the disease were prevented entirely by the introduction of barley, milk, beef, and tofu. At first, it was speculated that the increase in protein intake made the difference. Further studies established the link to polished rice: 70% of prison inmates whose staple food was white rice became ill, while only 3% of inmates with brown rice in their diet did. In 1906, the Norwegian bacteriologist Axel Holst found that symptoms in guinea pigs disappeared simply by feeding them cabbage and lemon juice. In the same year, Frederick Gowland Hopkins conducted an experiment on two groups of rats that were fed different diets. One group was fed casein, fat, starch, sugar, and salt (the essential nutrients known up to that time); the other group additionally received milk. Only the group supplemented with milk thrived. As a result, it was assumed that this disease, too, was due to the deficiency of some substance.

In 1912, Casimir Funk researched the substances thought to be causative for the various diseases and called them “vital amines.” The word vitamin came into being in 1920, when it became known that not all vitamins contain amines.10 Funk was able to assign the individual vitamins to the various diseases, so that there was now a scientific explanation for the development of diseases due to a lack of nutrients. Other diseases such as scurvy (vitamin C deficiency) and rickets (vitamin D deficiency) could now be explained. Vitamin D deficiency was especially prevalent in cities during industrialization, when people, especially the poor and children, had to work in coal-fired plants and underground, rarely seeing the sunlight the skin needs to produce vitamin D. Vitamin A was initially identified as the factor that cured the disease, but in 1922 it was discovered that the preparation originally called vitamin A contained a second factor: vitamin D.
In 1922, two scientists from the University of California identified vitamin E as a fertility factor in rats and used green vegetables and wheat germ as a source (Evans et al., 1922).
Scurvy was epidemic among sailors who spent months on the high seas without fresh fruit or vegetables. By 1919, it was widely accepted that an isolated factor from citrus fruits could cure scurvy. In 1932, this factor was isolated as vitamin C, and in 1933 it was renamed ascorbic acid (“anti-scurvy” acid).
In 1971, Linus Pauling (two-time Nobel Prize winner) recommended supplementation with vitamin C in high doses to prevent and treat colds as well as cancer and heart disease. Many benefits of vitamin C have been attributed to its role as a cofactor in the production of connective tissue (collagen), which is found in bone, skin, and blood vessel walls. He set the dose at 10–12 grams per day. To reverse arteriosclerosis (hardening of the arteries), he recommended 3–5 grams of vitamin C and 2 grams of L-lysine per day. On top of that, he also recommended vitamin C for cancer therapy. Pauling considered a dosage of 50 to 100 milligrams per day, which was considered sufficient for adults at that time, to be too low for an optimal effect. However, his views and his vitamin C studies were not taken seriously by the scientific community, as the effects he suspected could not be proven in several clinical studies. A study published in the journal Science in 2015, however, suggested on the basis of new molecular biology findings that vitamin C does indeed have an anti-tumor effect (1–5). Several years earlier, research at Johns Hopkins University had shown that the growth of colon cancer cells (and other types as well) is driven by mutations in two genes (KRAS and BRAF), which promote the formation of unusually large numbers of membrane transport proteins for glucose, or sugar. Glucose and glutamine are the main nutrients for cancer cells, so more transport proteins are needed in these cells for precisely that purpose. Vitamin C inhibits glucose metabolism in the cells, and the cancer cells starve, so to speak, since they are denied this source of energy. The administration of vitamin C for tumor therapy has a controversial history. While some clinical trials have demonstrated benefit, others have failed to confirm it.11 This discrepancy can be attributed, at least in part, to how vitamin C is delivered. Oral vitamin C, unlike parenteral administration (injection into veins or muscles), cannot achieve concentrations lethal to cancer cells. Intravenous therapy, with dosages of 7.5 to 45 grams per infusion, can achieve much higher concentrations in the blood.
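To make the oral-versus-intravenous argument concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes a naive one-compartment model with a plasma volume of about 3 liters and no clearance, so the intravenous figures are rough upper bounds on the initial concentration; the oral plateau of roughly 80 µmol/L is a commonly cited pharmacokinetic value, not a number taken from this text.

```python
# Back-of-the-envelope comparison of oral vs. intravenous vitamin C.
# Naive one-compartment model: assumed plasma volume, no clearance,
# so IV figures are rough upper bounds on the initial concentration.

MOLAR_MASS_ASCORBATE = 176.12   # g/mol, ascorbic acid
PLASMA_VOLUME_L = 3.0           # assumed adult plasma volume, liters

# Oral intake saturates intestinal uptake and renal reabsorption, so the
# plasma level plateaus at roughly 80 micromol/L regardless of dose
# (a widely cited pharmacokinetic finding, taken here as an assumption).
ORAL_PLATEAU_UMOL_L = 80.0

def iv_peak_umol_per_l(dose_g: float) -> float:
    """Initial plasma concentration if dose_g were distributed at once."""
    mol = dose_g / MOLAR_MASS_ASCORBATE
    return mol / PLASMA_VOLUME_L * 1e6  # micromol/L

for dose in (7.5, 45.0):  # infusion doses mentioned in the text, grams
    peak = iv_peak_umol_per_l(dose)
    print(f"{dose:5.1f} g IV -> ~{peak:,.0f} umol/L "
          f"({peak / ORAL_PLATEAU_UMOL_L:,.0f}x the oral plateau)")
```

Even under these crude assumptions, a single 7.5-gram infusion yields plasma levels on the order of a hundred times the oral ceiling, which is the core of the delivery-route argument.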
Further studies suggest that vitamin C may play a beneficial role in cancer patients; to date, however, the data come largely from animal studies and have not been incorporated into human clinical treatment standards.12, 13
Pauling himself is said to have taken 18 grams of vitamin C per day for years and lived to be 93 years old. Nevertheless, he died of cancer.
The realization that a deficiency of a particular nutrient can lead to disease has been known and accepted for over 100 years now. However, vitamins make up only a fraction of the full 90 essential nutrients that humans need on a daily basis:
1 Vitamins: A, B1 (thiamine), B2 (riboflavin), B3 (niacin), B5 (pantothenic acid), B6 (pyridoxine), B12 (cyanocobalamin), C, D, E, K, biotin, choline, flavonoids and bioflavonoids, folic acid, and inositol.
2 Amino acids: valine, lysine, threonine, leucine, isoleucine, tryptophan, phenylalanine, methionine, histidine, arginine, taurine, and tyrosine (the last three are not classified as essential, but deficiencies in them can contribute to certain diseases).
3 Fatty acids/cholesterol: linoleic acid, linolenic acid, arachidonic acid and cholesterol (while cholesterol is also produced by the body, some scientists suggest that a deficiency may result in Alzheimer’s dementia, type II diabetes mellitus, erectile dysfunction, and burnout, among other things).
However, research on nutrition has also yielded further insights into our metabolism in recent decades, which should not be withheld from the reader here. If deficiencies in vitamins, amino acids, or fatty acids can lead to certain diseases or symptom complexes, what about minerals? Here are some important basics:
Minerals are found exclusively in the Earth’s crust. Neither plants nor animals nor humans can produce them. So, as living creatures on Earth, we depend on plants to absorb minerals from the soil and make them available for our metabolism when we eat them. This is the way life works; it cannot be changed. You cannot eat calcium-bearing rock and expect it to strengthen your bones; likewise, inorganic iron will not improve your blood levels. That minerals exist and play a role in our health has also been known for about 100 years. In 1912, a Prof. Wasserman from Berlin observed that tumors in mice disappeared when the animals were injected with selenium. Since mouse and human tumors are highly similar, Prof. Wasserman expected an enormous benefit from selenium in the fight against human cancer.14 In 2002/2003, a court case against the FDA (the U.S. Food and Drug Administration) was won on the claim that selenium reduces the likelihood of developing some cancers. In 2000, a study appeared in the journal “The Lancet” emphasizing the importance of selenium. Selenium is essential for human health (6). It reduces oxidative stress, protects against progression to AIDS in HIV-positive people, promotes sperm motility, and appears to reduce the likelihood of miscarriage. Deficiency has been linked to mood disorders as well as cardiovascular disease.
In general, minerals are necessary and serve, among other things, as cofactors for the approximately 10,000 enzymes in the human body that catalyze biochemical reactions, for tissue stability, for physiological processes such as muscle contraction, and much more. The real question is whether we actually take in the necessary nutrients through our diet. Here, too, one finds contradictory statements. In 1992, the so-called “Earth Summit” of the United Nations took place in Rio de Janeiro. It was stated there that between 1936 and 1992 the soils of the U.S. and Canada had lost about 85% of their mineral content, Asia and South America about 76%, Africa 74%, Europe 72%, and Australia 55%. In March 2006, the UN recognized that multiple nutrient deficiencies were increasing in frequency. According to the Assistant Secretary-General of the United Nations at the time, Catherine Bertini, the overweight were as poorly nourished as the hungry: it is not the quantity that matters, but the quality. These findings were also reflected in the nutrient levels of various vegetables and fruits. Between 1963 and 2000, green leafy vegetables showed a 62% loss of vitamin C, a 41% loss of vitamin A, a 29% loss of calcium, a 52% loss of potassium, and an 84% loss of magnesium. Cauliflower had lost about half of its vitamin C, thiamine, and riboflavin content, and hardly any calcium was found in commercial pineapple.

The reduced mineral content of produce reflects the declining mineral content of the soil, since plants draw these minerals out of the soil. This is precisely what makes them suitable as food for humans and animals; however, the minerals then have to be returned to the soil. Various cultures have taken this into account for thousands of years. Wood ash, for example, was spread on crop fields or in gardens; wood ash is nothing more than the minerals left over from burning wood. The long-lived cultures of this world have never lost this custom. In addition, pesticides and herbicides bind minerals, preventing them from being released in the human body for metabolic use, and they directly harm the soil bacteria that promote mineral uptake by plants. Meanwhile, pesticides are found in breast milk, urine, feces, and various tissues; they accumulate in tissue (especially fat) and are difficult to detoxify. Another problem is microplastics, which are now being found even in drinking water and in the urine of children and young people. Erosion of the topsoil, where the actual minerals are found, occurs through wind and weather, but also through overgrazing of pastures, ever more intensive cropping, and deforestation. The loss of organic material results in a loss of nitrogen, minerals, and rare earths, so the soil can hold less water and plant growth is hindered.
Of course, as with all topics, there are studies that show the opposite. They report that soils are not losing minerals and that in some nations there is even a surplus. Supplementation, they say, is unnecessary and in some cases harmful. Again, it is important to educate yourself and to read independent studies that are freely available on the Internet. Understanding the correlations helps immensely to form a conclusion that makes sense.
The minerals that have so far been found in human tissue and identified for metabolic functions are:
1 Aluminum, arsenic, barium, beryllium, boron, bromine, calcium, carbon, cerium, cesium, chloride, chromium, cobalt, copper, dysprosium, erbium, europium, gadolinium, gallium, germanium, gold, hafnium, holmium, hydrogen, iodine, iron, lanthanum, lithium, lutetium, magnesium, manganese, molybdenum, neodymium, nickel, niobium, nitrogen, oxygen, phosphorus, praseodymium, rhenium, rubidium, samarium, scandium, selenium, silicon, silver, sodium, strontium, sulfate, tantalum, terbium, thulium, tin, titanium, vanadium, ytterbium, yttrium, zinc, and zirconium.
To be absorbed and metabolized, these minerals must be taken in in plant-bound, colloidal form; the absorption rate should then be above 90%. One problem with dietary supplements is that not all of them reach this quality, and so their effect remains limited. Nature’s prescribed route, as mentioned, is through the uptake of minerals by plants. For this to work, bacteria in the soil are necessary; together with other microorganisms they form the so-called “rhizosphere,” a microcosm that lives in close relationship with the plant root and contributes to its growth and immune defenses (7). Various bacterial and fungal species alter, among other things, the chemically bound forms of sulfur, phosphate, and nitrogen, making them bioavailable to plants. Modern mineral fertilizers are supposed to replenish these nutrients; however, excessive fertilization poses dangers to wildlife and to the soil itself. Fertilizer constituents enter the groundwater and can reduce its quality. An overabundance of nutrients in water bodies is called “eutrophication”; it can promote algal blooms and deplete the oxygen in the deep water of lakes, a particular problem where land is used intensively for livestock. Sewage sludge fertilization can lead to increased concentrations of heavy metals in the soil, which can result in infertile soils, as heavy metals are difficult to leach out and plants are unlikely to take them up. Nitrate should be kept out of groundwater as far as possible, as it can be converted to nitrite, which in the bodies of mammals gives rise to carcinogenic nitrosamines. Since 1991, the limit value for nitrate in groundwater has been set at 50 mg/liter; natural groundwater usually contains less than 10 mg/liter. In post-war Germany, nitrate levels rose with denser settlement and intensive soil management and have continued to rise to this day (8).
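As a small illustration of the benchmarks just mentioned, the following Python sketch classifies groundwater nitrate readings against the natural background (about 10 mg/L) and the limit value in force since 1991 (50 mg/L). The thresholds come from the paragraph above; the well names and readings are invented for illustration.

```python
# Classify groundwater nitrate readings against the two benchmarks
# given in the text: natural background (~10 mg/L) and the legal
# limit value in force since 1991 (50 mg/L).

NATURAL_BACKGROUND_MG_L = 10.0  # typical nitrate level in pristine groundwater
LEGAL_LIMIT_MG_L = 50.0         # groundwater limit value since 1991

def classify_nitrate(mg_per_l: float) -> str:
    """Return a rough category for a single nitrate measurement."""
    if mg_per_l > LEGAL_LIMIT_MG_L:
        return "above legal limit"
    if mg_per_l > NATURAL_BACKGROUND_MG_L:
        return "elevated (human influence likely)"
    return "within natural background"

# Hypothetical well measurements in mg/L:
for well, value in {"Well A": 4.2, "Well B": 27.5, "Well C": 63.0}.items():
    print(f"{well}: {value:5.1f} mg/L -> {classify_nitrate(value)}")
```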
Many people strive to eat healthy and feel good about it. But strictly speaking, we don’t know what’s in the food unless we test it. If the pineapple is grown in soil that is low in selenium, there won’t be much selenium in the pineapple. It’s the same for any other food that comes from the soil.
But it’s not just the spine or mineral deficiencies that contribute to disease. A problem that is clearly underestimated in medicine is our everyday life. In some areas, it overlaps, perhaps somewhat unexpectedly, with what finds its way into our mouths, as well as with our axial skeleton. The following chapter will explain this in a little more detail.
9 Joseph Goldberger and Pellagra, M. G. Schultz, The American Journal of Tropical Medicine and Hygiene, Volume 26, Issue 5, Part 2, 1 Sep 1977
10 Movers and Shakers: A Chronology of Words That Shaped Our Age, John Ayto, p. 54
11 Cancer and Vitamin C: a discussion of the nature, causes, prevention, and treatment of cancer with special reference to the value of vitamin C (1979), Ewan Cameron and Linus Carl Pauling, The National Agricultural Library
12 Vitamin C and cancer prevention: the epidemiologic evidence, G. Block, The American Journal of Clinical Nutrition, Volume 53, Issue 1, January 1991, Pages 270S-282S
13 Vitamin C selectively kills KRAS and BRAF mutant colorectal cancer cells by targeting GAPDH, J. Yun et al., Science, 11 Dec 2015, Vol. 350, Issue 6266, pp. 1391–1396
14 Dr. Sigmund Fränkel: Die Arzneimittel-Synthese, Auf Grundlage der Beziehungen zwischen chemischem Aufbau und Wirkung; Für Ärzte, Chemiker und Pharmazeuten, Springer Verlag 2013