2


The De-Aging of America

Chronological age is not a reliable indicator of the ability of a person to function. We need to find measures that predict aging better than a person’s birth certificate.

—John W. Rowe, Harvard University, 1985

On the morning of June 26, 1983, the Clinique consultants at Rich’s department store in Atlanta were delighted to find the new line of products that had just come in. Clinique, the high-end skin care marketer, was launching a program called “Daily De-Aging at Any Age,” which the store’s wealthier customers were bound to find very appealing. Clinique’s dermatologists had reportedly determined that “older skin needs more aggressive exfoliation,” good news for the company given the number of products that were said to be required to remove the dead skin cells. Consumers likely to subscribe to Clinique’s “Daily De-Aging at Any Age” regimen—most of them middle-aged women—would start with a “scrub cream” to soften wrinkles and rekindle glow, after which would follow a twice-a-day application of various soaps, lotions, and moisturizers. Included in the program was a computer for Clinique’s consultants to analyze customers’ skin type, quite an innovation at the time and supposed evidence that the products had been scientifically formulated.1

Clinique’s new line of products focused on aging was very much in sync with the cultural zeitgeist in America. The largest and most influential segment of the population, baby boomers, were beginning to head into their forties in the 1980s, quite a shock for a generation said to be perpetually youthful. The nation as a whole was getting older as longevity increased, making aging a central issue, both personally and socially. Many Americans, however, were simply not ready to accept that their youth was becoming, or already was, a thing of the past. The 1980s were the beginning of what Prevention magazine called in 1984 “the de-aging of America,” as a good number of baby boomers as well as seniors actively sought ways to stall or reverse the physical process of getting older.

Efforts to delay the natural aging of the human body were largely a reaction to the negative attitudes toward oldness that were still very much in play after the youth-oriented counterculture era. Although some progress was being made as American society generally became more tolerant, discrimination against older people remained pervasive, a means perhaps to publicly display the aversion and antipathy to aging that was a defining element of Western culture. The sheer reality of getting older was, however, reason enough for individuals to arm themselves with whatever resources were available to manage the aging process as well as possible. Not surprisingly, gerontology flourished in this decade, as the field was perfectly aligned with the demographic surge that was taking place. “De-aging” was a nice idea, perhaps, but there was no getting around the fact that Americans and America would get older in the years ahead.

A Phenomenon Peculiar to American Society

No one was more aware of the realities of aging than senior citizens themselves. As their numbers grew, older people were becoming a more powerful political constituency, something the no. 1 senior citizen of the United States, President Ronald Reagan, well knew. Because of the landmark legislation that came out of the second and third White House conferences on aging, much anticipation surrounded the fourth as it approached. (Little had resulted from the first such conference, held by President Truman—so little, in fact, that many people were not even aware that it had taken place. The White House had not officially sponsored that conference, another reason for its being largely forgotten.) “A phenomenon peculiar to American society which occurs every 10 years is now taking shape again,” announced Matthew Tayback, Maryland’s director on aging, a good year and a half before President Reagan’s 1981 conference would commence.2

Despite (or, as it turned out, because of) all the planning that had begun during the Carter administration, this conference turned out to be a chaotic one, making it seem more like a particularly contentious political convention. Unlike the three previous White House conferences, which were largely free of politics, the fourth was clearly partisan in nature. Reagan fired those appointed by Carter, setting the tone for the 1981 conference. Social Security, health care, and other programs had become hot political issues since the third conference; these issues also made Reagan’s conference far more divisive than the one President Nixon had held a decade earlier.3 (Reagan had proposed in May 1981 to reduce Social Security deficits by cutting early retirement, disability, and other benefits for future retirees by almost 23 percent. He withdrew that plan in September but called for a bipartisan task force to study the issue, leaving things rather uncertain.)4 Another contributing factor to the antagonistic tone of the conference was the decline of trust in government that had begun during the Nixon administration. “I fear that we are being and have been manipulated and are limited in what we can do,” said one conference delegate from California, not optimistic that any of the proposals that came out of the meeting would be taken seriously by the budget-conscious president.5

Fearing that liberals would react strongly to the administration’s proposed cuts to Social Security benefits and cause some embarrassment to the president (most delegates did indeed lean left), Republican leaders went to extraordinary lengths before and during the conference to try to prevent that from happening. “From all indications, the White House was in no mood to risk criticism of its policies,” the editor of Newsday later wrote as the dust settled.6 First, the Republican National Committee carefully screened potential members of the conference advisory board prior to the meeting, selecting only those people who were open to the idea of making cuts to Social Security benefits. Second, Richard S. Schweiker, the health and human services secretary whose department was responsible for the conference, oversaw the addition of hundreds of “mystery” delegates drawn from the president’s campaign donor lists in order to stack the deck of important committees.7 Third, key conference administrators were removed and proposals introduced to minimize debate and dissenting opinions, significant enough “irregularities” for the House Select Committee on Aging to launch an investigation even before the event began.8

Even more draconian steps were taken at the conference itself, either to stifle potential dissent or to engineer a positive outcome for the president, a report by the General Accounting Office subsequently found. There was abundant evidence that Reagan’s people employed techniques evocative of old-school political machines or borrowed a page from Nixon’s manipulation of the press. Efforts were made to silence certain elected delegates and speakers, for example, notably eighty-one-year-old Representative Claude Pepper, the Democrat from Florida, who served as honorary cochairman of the conference. The administration also apparently fabricated credentials to crash committees and sessions, and it went so far as to raise the temperature in the auditorium in order to encourage people to leave the room.9 Rumors even circulated that meeting rooms were bugged, not that far-fetched an idea given a recent president’s habit of taping conversations for political purposes.

Happily, the Republicans’ shameful ploys backfired; not one of them was in the least bit effective. The conference thus turned out to be an embarrassing episode for the president after all, reinforcing his image as not especially responsive to the interests of senior citizens despite being one himself. Some were even calling the administration’s failed attempts to gain an unfair edge “a mini-Watergate” or “Graygate,” the last thing Reagan wanted to come out of his own conference. Despite the overt infusion of partisan politics, however, many agreed that the 1981 conference was for the most part a success given the consensus that was reached. Some six hundred resolutions were passed, calling for greater opportunities for older workers, more income support, improvements to Medicare and Medicaid, increased publicly financed housing for seniors, and, most important, the maintenance of Social Security benefits. “Social Security can and will be saved,” Reagan told the anxious delegates at the conference, although it was agreed that general tax revenue would not be used to pay for the benefits.10 While all of these resolutions were just goals at this point, most delegates were happy with what they saw as a solid blueprint for action through the 1980s.11

Maximum Life Span

Some Americans were no doubt surprised that Reagan was not an avid supporter of seniors. The man was, after all, the oldest elected president in the nation’s history, although it was easy to forget that. During his first term, Reagan was a prime role model for “positive” or “productive” aging in both body and mind. Images of the seventy-year-old man riding horses or chopping wood only reinforced the idea that laws forcing retirement at a given age were silly. “The graceful aging of President Reagan’s body has become a matter of record,” wrote Sandy Rovner of the Los Angeles Times a few months after Reagan took office. The fact that the president had recovered quickly from his March 1981 shooting further proved wrong the many skeptics who had said that he was too old for the job. More than any other American, perhaps, Reagan defied stereotypes about aging, something gerontologists were quite pleased about. Having a septuagenarian in the White House was a powerful vote of confidence in and respect for older people—an all-too-rare commodity in our ageist society.12

In addition to the public recognition of a senior citizen showing few if any signs of advanced age, there was good news on the scientific front of aging. Thankfully, many of the more specious attempts to slow or stop the aging process had by the early 1980s disappeared as it became increasingly clear that no fountain of youth was just waiting to be discovered. Experimentation of all kinds was tolerated if not encouraged during the counterculture years, but now there was more scrutiny of scientists’ claims that they could potentially solve the mystery of aging. Work in the field continued, of course, with much of it centered on the not very sensational role of genetics and cellular processes.

Still, there was the occasional report of an alleged breakthrough in an off-the-beaten-track area of aging research. In 1980, for example, an East German scientist, one Baron Manfred von Ardenne, told the media that his “multistep oxygen therapy” was making elderly people quite peppy and dramatically lessening their age-related ailments. Another scientist making such a claim might have been quickly dismissed, but the seventy-three-year-old baron had an impressive background, including pioneering work on television and the Soviet atom bomb. He also reportedly held more than six hundred patents, making him appear to be more than just a mad scientist with a lab in Dresden. “It returns the [oxygen] supply to levels normally found in the young,” he explained, and clinics across Europe adopted the treatment for wealthier patients searching for a way to maintain their youth.13

While pumping high levels of oxygen into the bloodstream was no doubt an effective pick-me-up (the therapy is still done today), most scientists saw no miracle cures for aging on the horizon. “No simple therapy for arresting the aging process has been discovered and few scientists expect one,” Caleb E. Finch of the University of Southern California conceded at a 1981 symposium on aging. The phenomenon of aging remained mysterious, much to scientists’ chagrin, but two principal theories, each having something to do with the cellular workings of the body, had emerged. The first was that aging was the result of a number of “errors” that gradually occurred in an individual’s DNA, while the second was that humans got older due to an erosion of the body’s immune system.14 The idea of there being some kind of genetic clock that programmed a person’s life span was beginning to fall out of favor, but in some scientific circles it was still vigorously debated. (Some, along these lines, likened aging to the turning of leaves, with the human body experiencing changes analogous to the triggered breakdown of chlorophyll in certain trees.)15 The consensus, however, was that aging was a lot more complicated than previously believed, making the acceptance of any single theory (or solution) highly unlikely.16

In place of identifying a single, master theory of aging, which no longer seemed viable, scientists drifted toward finding a cause of or contributor to the decline of a particular part of the human body. More attention was being paid to the effects of smoking marijuana, for example, as researchers linked the active agent in the weed to a loss of brain cells in rats. Tetrahydrocannabinol (THC) appeared to affect these cells in the same way as aging, Philip W. Landfield of Wake Forest University had found, although it was too early to say that the chemical compound caused humans’ brains to age prematurely.17

Likewise, most serious scientists were now largely abandoning the search for a grand plan to add many decades to the average human life span. Instead, they were focusing on specific conditions that could perhaps slow the aging process. One of these was space travel, oddly enough, as research by NASA scientists showed that being weightless required less oxygen and food, both of which the body expends considerable energy to use. Simply fighting the effects of gravity used up about a third of a human’s calorie intake, and so eliminating that factor would theoretically allow an individual to maintain a lower level of metabolism. Reducing the wear and tear on bodily organs via weightlessness would go a long way toward decelerating aging, the NASA researchers believed, while conceding that, as a practical matter, gravity would be with us for some time.18

Paradoxically, while scientists were retreating somewhat in the area of dramatic life extension, authors exploring the topic were moving full steam ahead. Americans appeared to remain quite interested in adding many years to their lives despite recent research indicating that achieving such a thing was a long shot at best. Durk Pearson and Sandy Shaw’s Life Extension: A Practical Scientific Approach, for example, was on the New York Times bestseller list for months in the early 1980s. The book suggested that readers should, like the coauthors, gobble up huge quantities of vitamins and supplements to radically extend their lives. A host of other books about how readers could possibly live significantly longer were being published, including Roy L. Walford’s Maximum Life Span, Saul Kent’s The Life-Extension Revolution, John A. Mann’s Secrets of Life Extension, Kenneth R. Pelletier’s Longevity: Fulfilling Our Biological Potential, and Osborn Segerberg Jr.’s Living to Be 100.19 Despite the popularity of such books, the promises contained within them were definitely more sizzle than steak. With the possible exception of severe calorie reduction, there was precious little evidence at this point that an individual could extend his or her life at all except by the tried-and-true methods of eating a balanced diet, doing regular exercise, and avoiding unhealthy activities like smoking.

Not everyone was even sure that achieving a “maximum life span” for humanity was such a good idea. The science fiction trope of a society whose members lived half of their very long lives as old people lingered over the efforts of scientists still intent on pushing the limits of our biology. Walford, who was also the author of The 120-Year Diet, was a strong advocate for life extension via “undernutrition,” the only path that so far appeared to potentially offer a big leap in longevity. Some gerontologists, however, cautioned against his vision of people living twice as long as they currently did by dramatically restricting calories. Leonard Hayflick of the Center for Gerontological Studies at the University of Florida, for example, imagined such a society as not unlike the one depicted in Jonathan Swift’s Gulliver’s Travels, where citizens (the Struldbrugs) never died but became increasingly disabled. Be careful what you wish for, he and others of a similar bent warned those scientists continuing to maintain that the average human life span could and should be one hundred years or more.20

Of course, a fair number of individuals were reaching the century mark without nearly starving themselves or popping dozens of pills a day. (Reaching one hundred famously earned some of them a mention on NBC’s Today show from the ebullient weatherman Willard Scott.) About twenty-five thousand Americans were aged one hundred or older in 1986, according to the National Institute on Aging (NIA), with more than a hundred thousand centenarians forecast by the year 2000. Roughly half lived in households by themselves or with others, with the other half residing in group-care situations. Interestingly, the percentage of centenarians varied widely in different parts of the United States. The percentage of 100+ers in Hawaii was almost twice that of Washington, D.C., for example, suggesting that lifestyle or climate played a role in longevity. (Simple genetics was likely much more responsible.) Whatever the factors behind living to an extraordinarily long age, it was clear that there would be many more centenarians in the future. “If the trend continues, we need to rethink our definitions of young-old, old, and old-old,” the UCLA/USC Long Term Care Gerontology Center noted—exactly what would indeed take place over the next few decades.21

More than anyone else, perhaps, gerontologists understood that aging did not need an aggressive push by science, at least in a social sense. Simple demography was already dramatically extending the mean age of Americans, in the process triggering a host of challenges that would only intensify in the years ahead. Quality of life was more important than quantity, they believed, making the thought of a few more decades of old age for the average person something to dread rather than wish for. By the mid-1980s, the long quest to discover a scientific fountain of youth that could indefinitely postpone mortality had been largely eclipsed by the rather pedestrian but much more realistic effort to have more Americans live longer and healthier lives. Developing a stronger immune system in older people was key, they agreed, as it was typically the ability to fight off disease and infections that allowed an individual to reach a ripe old age.22

This did not, however, stop marketers from offering consumers a wide range of products said to have some kind of antiaging properties. Many antiaging products had hit the marketplace over the years despite there being no evidence that any of these skin creams, vitamins, nutritional supplements, minerals, and “power” foods did anything to slow the aging process.23 Not surprisingly, the Food and Drug Administration (FDA) was not pleased with how companies were marketing products with “anti-aging” or “skin rejuvenation” promises, as such claims would classify them as drugs rather than cosmetics. “Each Niosome sphere mimics the support structure of younger skin and carries active anti-age agents into the skin’s inter-cellular structure,” consumers were told on one Lancôme product label, scientific double-talk that sounded good but had little or no basis in fact. Through the eighties, more marketers such as Clinique jumped on the antiaging bandwagon, forcing the FDA to take more aggressive action. In 1987, the agency issued twenty-three regulatory letters to firms including Revlon, Estée Lauder, and Avon, warning them that they had better have hard evidence to back the claims they were making in advertising and on labels.24

The fact was, however, that consumers were eager to purchase anything that offered even a remote possibility of antiaging. Simply accepting the physical signs of aging was commonly viewed in the competitive 1980s as a kind of defeat, especially among women of a certain age. “’Growing old gracefully’ is apparently out of fashion,” noted Ellen Goodman in her syndicated column in 1988 after surveying a stack of women’s magazines. A product called Retin-A with alleged wrinkle-smoothing properties was seen as something of a magic potion among women over forty, just one of the various elixirs kept in the medicine cabinets of those averse to aging naturally. Unlike the FDA, women could not have cared less whether Retin-A and similar products were classified as a cosmetic or a drug; all they cared about was whether the stuff could make them look five or, even better, ten years younger.25

A Different Species

One did not have to be a marketing genius to know that demographics were on the side of companies trying to sell antiaging products. In 1950 the average age of Americans was around twenty, while in 1980 it had risen to about thirty. In another fifty years, demographers were predicting, the average age in the United States would be around forty, a bubble that would undoubtedly impact not just the marketplace but also education, health care, and even the way homes and cities were built. Given that the very makeup of the United States would be fundamentally altered, many sensibly argued, it was time for our perception of older people to change accordingly. Americans had to “integrate the aging into the full life of society,” remarked Robert C. Benedict, the U.S. commissioner on aging, in 1980, calling for an end to the widespread stereotyping of and discrimination against older citizens.26 Carroll L. Estes, a professor of sociology and director of UC Berkeley’s Aging Policy Center, felt similarly. “Our perceptions and policies define old people as a major societal problem but, in fact, it is society’s treatment of old people that is the primary problem,” Estes wrote the following year; she saw Americans’ equation of aging with decline and dependence as the root of the issue. Research showed that chronological aging caused no major changes in personality or behavior, meaning there was little or no basis for our common practice of distancing ourselves from older people.27

Other experts chimed in on what they justifiably believed was a distorted view of aging in America. Rose Dobrof, a pioneer in social work and director of Hunter College’s Brookdale Center on Aging, argued that our muddled perspective had a ripple effect that compounded the problem. Young people naturally became fearful about getting older because of the way we characterized aging, she astutely held, adding that our discomfort with the subject also served as an unhealthy divisive force. It “widens the sense of difference and distance between the generations,” she said in 1982, leaving older people in a kind of no-man’s-land within society. Aging was one of the most basic and obvious facts of life, yet it was clear that major misconceptions surrounded it. It was safe to say, for example, that a good number of people considered aging to be a state when it was really a process, a basic misunderstanding that likely led to many others. No one woke up “old” one day, Dobrof explained, although this would come as news to young folks conditioned to regard aging as a peculiar condition limited to a particular segment of the population.28

Because children and teens often had little contact with older people (and because becoming old lay far in their future), aging was generally an alien concept to them. In one’s youth, the elderly “seem like a different species,” Walter Goodman of the Chicago Tribune posited, a good way of describing not just how young people viewed seniors but also the loss of humanity that aging brought on.29 Some authors of children’s books, such as Richard Worth, apparently agreed. His 1986 book You’ll Be Old Someday, Too introduced the subject of human aging to students in grades 6 through 9 in a valiant attempt to make older people seem less foreign to them. In half a century or so, Worth explained, young readers would be old themselves, a nice way of personalizing the rather abstract idea of aging.30
