Chapter 2
The Other Hundred Years’ War
Thomas emerges from behind the curtain. His young face is anxious but composed. The stage light gleams off the filigreed tuning keys of the classical guitar hanging from a strap across his shoulders. He shields his eyes from the spotlight and scans the crowd until his eyes settle on a familiar face. He heads toward the stage’s lone stool and sets his sheet music — two pages of Bach’s Jesu, Joy of Man’s Desiring — on the metal stand perched nearby. He takes a seat. The audience applauds politely. Thomas smiles, offers them an appreciative wave, and begins to play. The fingers of his left hand jump effortlessly from fret to fret, string to string, moving with the grace and dexterity of practiced gymnasts. Notes flow from Thomas’s guitar in a melodious stream, filling the room from wall to wall with sumptuous harmonies.
Thomas’s mother watches from the front row, beaming. When the performance ends she applauds a little louder than the rest, though she’s careful to keep her exuberance in check; she promised Thomas that she wouldn’t embarrass him. As the boy bows and leaves the stage, a woman beside Thomas’s mother leans in to her.
“Was that your son?” the woman asks.
“Why yes, it was,” Thomas’s mother replies.
“He looks just like you. And he plays beautifully.”
Thomas’s mother blushes. “Thanks. He gets that from his father.”
The woman gives a knowing smile. “Good genes.”
Thomas’s mother nods. “It’s certainly not from my side. None of us can carry a tune.”
“It’s the same with my Amelia. They all have beautiful voices on her father’s side. She’s lucky she got that from him and not me.”
A hush settles over the crowd. From behind the curtain emerges a young girl in a pleated dress. Thomas’s mother notes the reaction of the woman she’d been speaking to — the alert posture, the wide, nervous eyes, the anticipatory lean forward — and figures the girl on stage must be Amelia.
Amelia holds no instrument. Her hands fidget, eager for something to do. She holds them together in front of her and steps toward the mike, her eyes nervously scanning the crowd. Her mother waves but Amelia doesn’t seem to notice.
There is a moment where Thomas’s mother thinks the poor girl is going to faint. Amelia sways slightly on her feet, her eyes wide, her face pale. But then her mouth opens and the audience lets out a collective gasp. From Amelia’s parted lips leap notes of startling operatic grandeur. Her voice is so rich and strong it seems to force the audience back several inches. They stare in awe of the enormous sound issuing from such a tiny, fragile-looking vessel. The melody soars, circling the audience like a majestic bird of prey. As her final note echoes off the back wall, the audience applauds, Amelia’s mother loudest of all. Amelia scurries offstage. Thomas’s mother gives Amelia’s mother an impressed look.
“Wow.”
Amelia’s mother shrugs, slightly embarrassed. “That’s all her father’s doing. It’s in his blood.”
The Fallacy of Nature vs. Nurture
Good genes. It’s in his blood. I get that from my mother. These easy, off-the-cuff statements mask a long and bitter history of feuding, contention, and one-upmanship. The 20th century was the scene of a long and heated debate over what force makes humans develop the way they do. This hundred years’ war has pitted scientist against scientist, causing our keenest minds to fortify themselves inside bunkers of rigid, absolutist doctrine. The combatants have varied over the years, but each has aligned himself with one of two camps: nature or nurture. Nature holds that human traits are the result of our genome: 46 long and elaborate strands of deoxyribonucleic acid, each a double helix wound around histone proteins and condensed into a hyper-compact structure called a chromosome. Nurture, meanwhile, contends that our traits arise from exterior influences: the food we eat, the language we speak, the parenting style under which we are reared, and so forth.
Over the years, both nature and nurture have gained and lost prestige with the scientific community and the public. In the early days of the 20th century, a group of scientists called eugenicists made a convincing case for the supremacy of genes, arguing that a person’s physical stature, mental faculties, and even criminality were the product of his genetic makeup.[2] Humanity, they argued, stemmed from multiple bloodlines, and purifying our gene pools through selective breeding would result in a better, smarter, stronger human race. Criminals and the mentally ill were discouraged from having children, as it was believed their negative traits would muddy the gene pool. Eugenics, steeped as it was in racist ideology, reached a deplorable climax in Nazi Germany as the ideological drive behind Hitler’s Final Solution.
The aftermath of the Second World War dealt a fatal blow to the ugly science, and a grievous — though not quite deadly — wound to the concept of genetic predestination. Eager to distance themselves from the horrors of Nazism, some scientists promptly embraced eugenics’ polar opposite: behaviourism. The discipline, which had roots in centuries of philosophy and was forged into its modern form in the early 20th century by John B. Watson, building on Ivan Pavlov’s conditioning experiments, was later championed by noted psychologist B.F. Skinner, whose “radical behaviourism” proposed that human beings were little more than highly complex machines programmed by a series of external stimuli. As supporters of nature worked desperately to scrub the swastika-shaped stains from their banner, Skinner waved the nurture flag proudly.
Behaviourism was later eclipsed by the cognitive sciences, which adopted the same “mind is machine” metaphor, but drew the opposite conclusion. The mind was indeed a machine, they argued, and human behaviour was preprogrammed into it by genes.[3]
Though nature and nurture each retain a small collection of loyalists, the battle has, in recent years, become more illusion than reality. Modern academic opinion has, for the most part, conceded the importance of both factors. This is not to say that there is unanimous approval of a single theory. However, the developmental sciences have undergone a paradigm shift. The question is no longer whether nature trumps nurture or vice versa; rather, it is how the two variables interact to produce a unique individual.
Family Resemblance
Newspaper headlines regularly trumpet the discovery of the gene for this or that, hinting to the average reader that, with just a little more research, everything from obesity to alcoholism will be miraculously cured by the tweak of a few key nucleotides. Sadly, this is not the case. A closer reading of such articles reveals a more mundane truth. The so-called “gene for drug use” or “gene for aggression” or “gene for the obsessive collection of Elvis memorabilia”[4] does not apply a fatalistic tag to the individual, dooming them to a life of addiction or anger or hoarding ceramic figurines of the King. It can, at best, only predict one’s susceptibility to this kind of behaviour. And even then, there are other factors to consider.
The idea implied by the phrase “gene for X” is that a brief series of nucleotides — the tiny molecules that comprise DNA’s four-letter alphabet — commands an organism to develop a certain trait. By deleting this sequence or changing the order of the letters, one could remove the aberrant trait or replace it with something more desirable. This concept raises suspicions among a number of observers outside of the scientific field. “This may be true for something relatively straightforward, like eye colour or height,” they might say, “but surely complex psychological traits like greed or anger cannot be the result of a single poorly worded genetic phrase.” A well-reasoned argument, but it is only half right. In truth, even those seemingly simple physiological traits arise from both environmental and genetic influences.
Certainly, some traits seem more genetically determined than others. When we see several generations of a single family gathered together, we often notice certain similarities between its members. Perhaps a majority of them have the same freckled skin. Or the same green eyes. Or the same stubby fingers. Maybe we spot a family resemblance in their high cheekbones or aquiline noses. Or we note that none of the adults are shorter than five foot eleven. However, even a brief observation will turn up differences as well. The grandfather sits at the kitchen table and delivers an impassioned argument to his youngest daughter, who responds in kind. Meanwhile, her older sister and her mother sit two chairs down, fidgeting awkwardly with the cutlery and sharing nervous glances, uneasy about the heated tone the conversation has taken. Among the youngest generation, a boy of about six dives off the couch and onto an easy chair while a girl, two years his senior, whines at him to stop before he hurts himself. Another girl, this one only four, stands in the corner and scribbles on the wallpaper with a crayon while her cousin, also four, watches her nervously, wondering whether or not he should tell the adults.
Were an outside observer asked to label which of the family’s traits were genetically determined, they would without hesitation point out the green eyes, the freckles, and the height. More astute individuals would likely also mention the nose or the cheekbones or the stubby fingers. But most would hesitate to attribute a genetic link to the argumentative dispositions of the grandfather and his youngest daughter, or the awkward brooding of the mother-daughter combo two seats down, or the devil-may-care bravado of the couch-leaper and the wall-scribbler. We tend to see these behaviours as less genetically motivated than something like eye colour. After all, one cannot educate a child taller or discipline green eyes brown. However, to divide traits into genetically determined and environmentally determined compartments is to misunderstand how genes work.
Consider hair colour, a trait that, on the surface, seems to be determined solely by a person’s genes. A child’s hair is seldom a colour that has no familial precedent. By contrast, the influence of the environment on one’s hair colour seems nonexistent. Blonde Nordic children adopted by Chinese families do not spontaneously develop black hair. However, this does not mean genes alone are responsible for a person’s hair colour. After all, genes can really only do one thing: instruct cells, by way of an interpreter called RNA, to create chains of amino acids, which then link together to form proteins. Now, this one function is extraordinarily important. Proteins are the body’s proletariat, the workers who carry out the myriad tasks that allow us, the society in which they dwell, to function. But genes cannot, on their own, dictate the colour of a person’s hair. Hair colour is determined by melanin, a pigment produced from the amino acid tyrosine. Genes code for the enzymes that convert tyrosine into melanin, hence the genetic influence. However, in hair, the degree of melanin accumulation is decided in part by the concentration of copper in the cells producing that hair: the more copper a cell contains, the darker the hair it produces. Should the intake of copper fall below a certain threshold, hair generated by the same follicle will be lighter than it was previously, when copper supplies were plentiful.
Similar factors are responsible for every human trait imaginable. The reason height seems to be determined solely by genetics is that, thankfully, just about everyone in the first world receives the base nutritional intake necessary for those genes to take effect. Likewise, most people get enough copper in their diet, as it can be found in a wide number of dietary staples, including fish, whole grains, nuts, potatoes, leafy greens, dried fruit, cocoa, black pepper, and yeast. Because it is almost universally consumed in sufficient quantities, copper’s contribution to hair colour goes largely unnoticed. Somewhat paradoxically, the ubiquity of its influences renders them invisible. Such is the case with thousands of environmental factors we take for granted. It isn’t until a radical change in the environment depletes once-plentiful resources that we realize how much those resources contributed to our development. In the words of Joni Mitchell, you don’t know what you’ve got ’til it’s gone.
A dramatic example of this occurred in Europe in October of 1944. The tides of war had turned on the Germans, who found their once seemingly invincible army forced back on all sides. Allied forces had reclaimed the southern part of the Netherlands, but the Germans maintained control of the rest. In retaliation against the Dutch, who had been emboldened by the partial liberation of their country and had staged a national railway strike in support of the Allied advance, the Germans placed an embargo on all food supplies heading into the country and flooded the surrounding fields, spoiling the season’s harvest. To make matters worse, November brought the start of a very harsh winter. The Dutch canals were frozen solid, thwarting Allied attempts to ship in supplies by barge. Thus began the Dutch Hunger Winter, a devastating famine that lasted into the spring of 1945 and was responsible for some 18,000 deaths by illness and starvation. Though the tragedy of the famine was harsh and immediately felt, the full brunt of its impact did not appear until long after the embargo was lifted and food supplies returned.
Almost immediately after it ended, researchers saw in the Hunger Winter the potential for a large-scale natural experiment. A population of well-fed people with documented medical histories had undergone severe malnutrition for a precisely delineated amount of time, and then quickly reverted to a normal diet. Over the following decades, researchers studied the medical records of individuals who had gestated (were growing in their mothers’ uteruses) during the famine, noting any statistical differences between them and other Dutch nationals who had not been so affected. The results were fascinating. Children whose mothers had been malnourished during the first trimester of their pregnancy were unusually likely to suffer from spina bifida,[5] cerebral palsy, and other conditions of the central nervous system. Additionally, girls from that cohort were twice as likely as the general population to develop schizophrenia. Clearly, malnutrition during those first few months of development impairs the brain’s ability to develop properly.
However, perhaps the most surprising finding involved men born to mothers who had been malnourished during the first two trimesters of their pregnancy. A 1976 study found that these men, now in their thirties, were significantly more likely to be obese than other men of their age and background whose mothers had not experienced the Hunger Winter. Further studies, though performed on rats instead of people, have helped us to understand the mechanism behind this strange phenomenon. Maternal malnutrition during the first two trimesters of pregnancy leads to unusually high insulin levels in male fetuses during the third trimester, which can affect the development of the fetus’s brain. We don’t yet know for sure why this occurs, or why it doesn’t affect females, but it does make sense from an evolutionary standpoint. Mothers experiencing famine could be hormonally conditioning their children’s metabolism to function most effectively in an environment where sources of nutrition are scarce. If a child is born into an environment plagued by famine, the ability to readily store fat would be a significant advantage. However, once the boys were born and the problem of food scarcity was solved, they nevertheless retained their prenatal conditioning, and what would have been an evolutionary advantage became a disadvantage.
The Hunger Winter provides a good example of how environmental influences can remain buried beneath the surface of human development, only to be unearthed generations later by a dramatic change in the landscape. Had the Hunger Winter not occurred, we might never have learned that the prenatal environment can affect an individual’s propensity to store fat, and the men studied would likely have developed a body type similar to those of their relatives (though, as always, diet and exercise would have played an important part).
Admittedly, the Hunger Winter is an extreme example. Under normal circumstances, attributing environmental influence to height or hair colour may seem unnecessarily pedantic. After all, if the environmental factor contributing to a trait’s development is present everywhere on Earth, isn’t it fair to say that said trait is genetically determined? If someone’s genetic makeup dictates they will have green eyes, they’re almost certainly going to get them, whether they live on the streets of inner-city Baltimore or in a mansion in Beverly Hills. Likewise, for certain conditions, a single aberrant gene really is the root cause. One could reasonably argue that calling CFTR[6] “the gene for cystic fibrosis” is accurate shorthand, as cystic fibrosis occurs when an individual inherits two mutated versions of that specific gene. The environment in which the affected child is raised will not alter how the gene behaves.
Nevertheless, relying on terms like “the gene for X” can be dangerously reductive, as it blinkers our thinking and encourages limited, simplistic approaches to complex problems.
Let’s consider the scene that began this chapter. Remember Thomas? Was his facility with the guitar purely the result of his genes? It would be difficult to argue that environmental influence didn’t play some part. No one picks up an instrument and plays Bach on the first try. It takes hours and hours of practice to develop the requisite agility, finger strength, and muscle memory. Despite what anyone with particularly accomplished parents may hope, skills and knowledge do not come prepackaged inside our chromosomes. Bodybuilders do not sire toned, muscular children, nor do the offspring of computer programmers enter the world knowing how to code in C++. Every generation must develop these skills from scratch.
Okay, so Thomas’s talent was tempered by hours of dedicated study. But what about Amelia? An instrument may require extensive experience to be played competently, but some people are blessed with a natural singing voice. Amelia’s father, we are told, is an excellent singer, and so too are most of his immediate family. It stands to reason that their musical aptitude was passed down through the generations. Thomas had to work to develop his skills, but Amelia was simply fortunate enough to inherit a gift. Right?
Genetically speaking, it is possible that Amelia was born with certain traits advantageous to a burgeoning singer. If her father’s family has truly abounded with talented vocalists throughout the generations, then perhaps their genes code for better-than-average lung capacity, a strong diaphragm, or exceptionally dexterous vocal cords, and these predispositions are what drew her ancestors to singing in the first place. This could be the case, but to assume it must be — and to cite Amelia’s proclivities as the only evidence — is tremendously naive. It is equally possible, perhaps even probable, that Amelia possesses no physiological advantage as a singer whatsoever. Half her genes come from her mother, after all, who readily admits that her whole family is tin-eared and musically inept. This is not to say that Amelia’s talent wasn’t inherited, only that we mustn’t limit our idea of inheritance to a transaction involving a few dozen molecules.
If Amelia is from a musical home, she probably grew up with music as an important part of her life. We can assume the record player was running often, and that her father regularly sang around the house. In this case, her musical education began before she was even born.
Children develop an aural connection with the outside world as early as six months after conception. Researchers recruited a group of pregnant women and had them read one of two stories — The Cat in the Hat or The King, the Mice, and the Cheese — aloud twice a day from the time they were seven months pregnant until the day they gave birth. Two days after they were born, the children of these mothers were tested to see which story they preferred, using a fairly ingenious device that measured how often they sucked on a pacifier. Sucking is a reflex ingrained in children from birth, and one of the few motions over which infants have conscious (or close to conscious) control. By adjusting the speed of their sucking, the babies could choose whether they heard a recording of The Cat in the Hat or The King, the Mice, and the Cheese. Infants consistently preferred to hear recordings of whichever story their mothers had read aloud while they were in the womb. More fascinating still, researchers achieved the same result even if the recording was not of the child’s mother reading, but a total stranger. This means the children were not simply responding to the unique vocal register of their mothers, but to the specific cadence of the story itself.
Conceivably, then, if Amelia’s father played a variety of music around the house during her gestation, she would have become familiar with the notions of melody, harmony, and counterpoint before she had even drawn her first breath. This is not to say she would have been composing symphonies in her crib, but repeated listening would have attuned her ear to the pitches and intervals common in Western music. Perhaps just as importantly, she would have developed a positive association with the songs she heard in utero, making it that much more likely she would take to music as a child.
As Amelia grew, her musical upbringing would have continued to influence her behaviour. There would have been musical scores lying around for her to ponder, tapes and CDs for her to listen to, and instruments for her to tinker with. And even if her father was not the type of man to force his child to follow in his footsteps, he would almost certainly have encouraged any interest in music that Amelia displayed. After all, what parent wouldn’t want their child to feel passionate about the same things they do?
We inherit more from our parents than our chromosomes. They are the ones who teach us, feed us, scold us when we misbehave, and console us when we scrape our knees or embarrass ourselves at school. They are responsible for cultivating the environments in which we are raised. We are as much the beneficiaries of their affluence (or lack thereof), their dispositions, and their teachings as we are of their genes.
Of course, none of Amelia’s hypothetical early training can guarantee that she will become a musical prodigy. Innate ability does exist, and there are limits to the extent to which environmental factors can sculpt an individual. They can chisel out a form, but the genetic material from which the subject is carved will inevitably factor into the final outcome of the sculpture. What’s more, unlike a sculpture, humans are not passive subjects prostrate before the whims of their environment. They have the power to change their surroundings, either through the ingenuity of invention or by simply deciding to live in, speak to, and engage with the places, people, and pursuits that most interest them. A child’s natural talents influence his behaviour, which in turn influences the environmental factors he will encounter. Children generally like to do what they’re good at, and the more they do it, the better at it they get. It is in these instances that the nature-versus-nurture divide seems truly absurd. Without an inherent knack for music or chess or skateboarding, a child will be less inclined to dedicate the time necessary to improve, but if they never dedicate the time in the first place, they won’t improve no matter how innately talented they are.