Old in the Country of the Young
America today faces a great paradox: It is an aging nation which worships the culture, values, and appearance of youth.
—American Catholic Bishops, 1976
In August 1976, just a month or so after the nation celebrated its bicentennial, the American Catholic bishops released a statement that revealed a key insight into one of the country’s most important issues. Rather than view aging as “an achievement and a natural stage of life with its own merits, wisdom, and beauty,” the religious organization’s statement read, Americans preferred to look to the carefree lifestyle and rebellious ways of young people as inspiration. “Our culture appears to be unhappy and uncomfortable with old age,” said Patrick Cardinal O’Boyle, a retired bishop from Washington, D.C., noting that negative attitudes toward anyone who was not young had intensified over the past decade.1
This sentiment, perhaps best captured by the popular counterculture phrase “Don’t trust anyone over thirty,” was reflective of the country’s rather recent aversion to aging. To be “old in the country of the young,” as Time magazine expressed it in 1970, was to feel like an outsider in one’s own home, an ironic state of affairs given that older citizens had of course been Americans longer than younger ones. Born around the turn of the twentieth century, the nation’s current crop of seniors had served on the front lines of the Depression and World War II, all the more reason why their treatment could be considered unfair and even immoral. In a social sense, at least, the so-called generation gap was decidedly in favor of those on the younger side of the fence; those on the older side were typically cast as out of touch with where the country was and where it was heading. People over sixty-five (the mandatory retirement age at the time) were deemed to have little or nothing to contribute, making many Americans loathe the very idea of aging. (The famous line “Hope I die before I get old” from the 1965 song “My Generation” by the Who could be considered an anthem of the times.) Over the course of the late 1960s and 1970s, aging began to be seen as nothing less than an enemy, something that should be kept at bay or, if at all possible, wiped out entirely.
Against this backdrop, it was not surprising that science launched a full-scale attack on aging, with some of the country’s best and brightest dedicating themselves to making old age a thing of the past. Defenders of America’s growing older population pointed out, however, that aging was a social problem rather than a biological one, and they urged seniors to fight for equal rights just as other marginalized groups had recently done with considerable success. Politicians of the time, including President Nixon, recognized the clout older citizens held and appealed to them as an important voting bloc. While legislation to help seniors live better lives would certainly be helpful, it was proving difficult, if not impossible, to change Americans’ feelings toward aging in general. Only the gradual recognition that a large older population would in the future become a major economic problem seemed to capture people’s attention during this era, as aging in America took a historic turn for the worse.
The Curious Property
Consistent with the thinking in the mid-1960s that social ills could be eliminated if Americans set their collective mind to it, many concluded that the best way to solve the problem of aging was simply to make it go away. Scientists enlisted in the cause of aging in considerable numbers, seeing it as a frontier that could first be discovered and then conquered. In 1966, for example, the president of the American Chemical Society challenged fellow scientists to join him in what promised to be one of humankind’s greatest pursuits. A “fountain of youth” was just waiting to be discovered, William J. Sparks announced at the annual meeting of the society, with chemistry the means by which to realize this long-sought dream. More specifically, he explained, it was the kind of chemistry that created plastics and synthetic rubber; Sparks was firmly convinced that human aging was caused by molecules not unlike those manipulated to produce these scientific wonders. While admittedly a controversial idea, Sparks’s molecular theory was representative of the sort of bold approach that needed to be taken if science was to achieve the very real possibility of humans living much longer and much healthier lives. The study of aging as a whole was vastly underfunded and underresearched, he and others at the conference agreed, urging the federal government to devote much more money and effort to solving what was perhaps our most enigmatic puzzle.2
Unless or until the United States Government launched a full-scale war against aging, however, it was up to individual scientists, most of them industry personnel or university professors, to try to crack the code of how and why humans became older. Extending the human life span was tricky enough, but some researchers in the field were interested only in lengthening the period of life that preceded old age, a view the fountain of youth had engrained in the popular imagination. The Human Genome Project was decades in the future, but already a few scientists theorized that more knowledge about the structure of DNA could lead to the ability to manipulate cellular processes, including those having to do with aging. Others were focused on environmental factors that perhaps caused the body to eventually become, in layman’s terms, allergic to itself. What we interpret as the signs of old age were actually the physical wreckage left by antibodies that had attacked their host, researchers such as Roy L. Walford of the UCLA Medical School suggested. Preventing, reducing the number of, or repairing these mutations was the means by which to prolong the prime of life, Walford held, one of a growing number of scientists investing a good deal of time in unraveling this especially complex riddle.3
With Cold War rhetoric lingering in the mid-1960s, it was not surprising to hear antiaging efforts expressed in aggressive, sometimes militaristic language. Also in 1966, Bernard L. Strehler of the National Institutes of Health (NIH) gave a talk at the New York Academy of Sciences called “The New Medical and Scientific Attack Against Aging and Death”—a fair representation of the kind of approach that was seen as needed to achieve victory. “An understanding of the biology of aging is within reach of this generation,” Strehler told the scientists, quite typical of the self-assured, rather audacious thinking of the time, where anything seemed possible. Breaking the current life span barrier of seventy to eighty years could be seen as analogous to other contemporary great scientific and technological feats like landing on the moon or harnessing atomic energy, with two things—good brains and loads of money—required to get it done.4 At the very least, many scientists agreed, the human life span could and should be extended by a few decades. Scientists at the California Medical Association, for instance, believed that we all should be living a hundred to 120 years, barring the onset of a rare disease or an accident that prematurely cut life short.5
Even if it was almost entirely speculation at this point, such theories seemed feasible and promising to journalists covering the scientific beat. “It is not inconceivable that scientists someday may be able to control human life to the extent that old age and senility are all but eliminated,” wrote Harry Nelson, medical editor of the Los Angeles Times in 1966. Nelson envisioned a normal period of childhood followed by “a 60- to 70-year-long plateau of maturity, health, and high performance,” cutting out the two decades or so of old age. (He did not explain how and why death would suddenly occur after so many years of wellness.)6 Other scientists were meanwhile focusing on a single aspect of physical aging, hoping that it would lead to an understanding of the overall process. Arthur Veis of Northwestern University’s medical school was intent on discovering the cause of skin wrinkles, for example, thinking that wrinkles could possibly point the way to why the human body chose to get older in general.7 Scientists were frankly perplexed by the whole notion of aging because it contradicted what was widely recognized as nature’s most powerful instinct: to survive.
While individual scientists at corporations and universities pursued their particular line of research, the NIH, an agency of the federal government, stepped up its efforts to combat aging. In 1968 (nearly three decades after Nathan Shock, a pioneering gerontologist, began his intramural lab in the Baltimore City Hospitals), the NIH swung open the doors of its brand-new Gerontology Research Center in Baltimore, the largest federal facility for aging research in the United States. The goal of the center, which had been founded by Shock and others in 1941, was to uncover “the mysteries and problems of aging that have perplexed our philosophers and scientists over the years,” said Wilbur J. Cohen, secretary of the Department of Health, Education, and Welfare (HEW), who dedicated the new building.8 Millions of dollars of taxpayers’ money were going toward winning the war against aging, locating the initiative within the public arena. Indeed, the dream of solving the aging problem went far beyond the halls of science, crossing over into popular culture in the late 1960s. A 1968 episode of The 21st Century, a television series hosted by Walter Cronkite, for example, explored the phenomenon of aging and the possibilities of prolonging human life. “Can we live to be one hundred?” Cronkite asked, visiting the NIH and the Institute of Gerontology in Kiev in the Soviet Union to try to determine the likelihood of most of us reaching the century mark. (Cronkite himself would make it to ninety-two.) Organ banks would be one way to keep people alive longer, the scientists told Cronkite, as would germ-free “life islands.” By enclosing hospital beds in plastic packages, patients would be protected from infection, a not uncommon cause of death.9
Saving lives through such interventions would of course extend longevity but did not directly address the actual aging process. Slowing or stopping the body from getting older in a biological sense required something truly miraculous, but that did not prevent scientists and physicians from trying to make such an amazing discovery. Then as now, many medications designed to treat a particular condition were considered to possibly have antiaging properties as well, each one (briefly) entertained as perhaps the much-sought-after wonder drug. One such drug was sodium warfarin, which was commonly used to keep blood from clotting in veins and arteries. After prescribing that medication to a number of older people, a physician, Arthur G. Walsh, noticed that his patients’ mental and physical conditions improved, cause enough for him to report his findings in a 1969 issue of the Journal of the American Geriatrics Society. The narrowing and hardening of arteries in the brain played a significant part in the aging process, Walsh proposed, meaning anticlotting drugs like sodium warfarin might be a way to dramatically extend the human life span.10 Although in hindsight this could be seen as much too big a leap to make based on the limited evidence, it was a prime example of the race to find a cure for aging muddying scientists’ normally clear thinking.
With aging now a highly visible issue, notable scientists from around the world made their voices heard on the subject. One of the leading advocates of antiaging in the late 1960s and early 1970s was the British scientist and physician Alex Comfort. Just a year before he published the hugely successful The Joy of Sex, in fact, Comfort was considered in academic circles quite the expert on aging. While he often served as a much-needed voice of reason, Comfort did firmly believe that scientists would soon figure out a way to extend the human life span by another fifteen years. (He had actually been researching and writing about aging since the early 1950s and was the author of a pair of books on the subject, Aging: The Biology of Senescence and The Process of Aging.) Over the past century, he explained in a 1971 issue of the scientific journal Impact of Science on Society, great strides had been made in preventing premature death. Little or no progress had been made, however, in getting already old people to live longer; this was the next logical frontier. The “clock” of aging needed to be first found and then adjusted, Comfort proposed, with scientists’ determination to do just that increasing over the previous few decades. Extending the lives of mice through calorie reduction had already been achieved, he pointed out, making this approach the sensible one to explore with humans. Science would solve the aging problem before it found a cure for cancer, he felt, although preventing aging could very well lead to preventing cancer.11
While as knowledgeable as anyone about aging, Comfort freely admitted how little scientists like him really knew about the process. Aging was “the curious property that makes us more likely to die the older we become,” he wrote in 1972, as vague a definition as one could put forth. Eliminating some causes of premature death was a piece of cake compared to extending the human life span beyond one hundred years, he explained; it was becoming increasingly apparent that science had hit a kind of biological wall that prevented this next leap. But space travel too had until recently been just a farfetched dream, a good way of justifying all the effort that was going into what seemed like the stuff of science fiction. The most challenging piece of the puzzle would be to increase the average human life span without lengthening the period of old age, something that even the most ardent of antiagers admitted would test the limits of science. Modern medicine (and, even more so, public health measures) had already added a few decades of old age to the human life span, and the consensus was that tacking on a few more decades of the same was a very unpalatable prospect indeed.12
A Bad Press
The aggressive effort to discover the cause of aging and then try to delay it as much as possible was directly correlated with the rise of youth culture in the late 1960s. Younger people in America gained social status at the expense of older people during the counterculture years, a fact that did not go unnoticed by leading gerontologists of the time. “How much youth fixation can a culture allow?” asked Bernard Coughlin in 1969, his question perhaps influenced by the media frenzy surrounding the recent Woodstock Festival. Coughlin, dean of the St. Louis University School of Social Services, was attending the International Congress of Gerontology, a group of researchers from forty nations who studied some aspect of aging. (Practicing gerontologists had little interest in the scientific effort to extend the human life span; rather, they focused on improving the lives of people in their later years.) Some kind of “generation gap” could be expected in any society, Coughlin noted, but the current one in the United States was excessive. Youth was being overvalued and age undervalued, he felt, keenly recognizing the historic shift that was taking place. Another attendee at the conference, Walter Walker of the University of Chicago, argued that older people in the country now made up a “minority group,” their social status analogous to that of people of color and other oppressed groups. Unlike African Americans, women, and even farmworkers, however, seniors had no movement to support their civil rights, reason enough for Walker to urge them to organize in order to gain political and economic power.13
Walker’s wishes were answered the following year in Philadelphia with the formation of the Gray Panthers. The mission of the organization (whose name was inspired by the militant Black Panthers) was to dispel stereotypes about older people and to influence legislation affecting them. There were about eight thousand Gray Panthers in the United States by the nation’s Bicentennial, each member committed to fighting prejudice against older people and to bringing attention to their cause. Activism centered on the “three H’s”—health, hunger, and housing—with much of it directed at the Ford administration’s budget cutting as it dismantled a good part of LBJ’s Great Society programs.14 The so-called medi-gap in health insurance was a particular problem, with Medicare covering fewer medical expenses than when the program began a decade earlier.
The marginalization of older Americans could be seen as the result of many different factors. One was the medical approach of treating aging as if it were a disease, popularizing the idea that all older people were somehow chronically ill. Such a view helped to turn age into a social problem, and served as a wellspring of negative attitudes toward anyone who was not categorized as young (again, typically less than thirty years old). Carl Eisdorfer, a psychiatrist at Duke University, which had a major gerontology center as part of its medical school, believed older people were generally seen as an ecological problem, like no-deposit, no-return bottles, accounting for why they were considered disposable. Instead, he proposed in 1971, the twenty million Americans over sixty-five could and should be “recycled” rather than simply be discarded. Eisdorfer went further, arguing that the nation’s universities should reinvent themselves from being degree-granting institutions for young people into resources offering lifelong education for all (still a good idea, I believe). Contrary to popular belief, research showed that older people were capable of learning new things, and did not deserve to be thrown onto a kind of trash heap as soon as they reached retirement age.15 Some textbooks nonetheless included charts indicating how intelligence declined with age—information that was simply wrong given recent studies showing otherwise.16
Indeed, a study published in Psychology Today that same year did suggest that older folks could be recycled. Two collaborating psychologists, Robert Kastenbaum of Wayne State and Bernice Neugarten, now at the University of Southern California, studied two thousand people between seventy and seventy-nine years old and found that many of the widely held beliefs about this age group were untrue. The majority of these septuagenarians were mentally competent, adaptable, and active, Kastenbaum and Neugarten reported, not at all the senile oldsters living lives of quiet desperation that many would have expected. Most surprising, perhaps, many of those in the study were sexually active, even more so than when they were younger. (Retirement communities and nursing homes were and remain ideal settings for residents to hook up. It was not unusual, however, for residents in nursing homes to be scolded for making sexual advances toward someone of a similar age; the younger employees considered it unnatural or, perhaps, potentially harmful.) Kastenbaum and Neugarten’s group also tended to be quite liberal in their views, dispelling the myth about the crotchety, prudish senior firmly set in his or her ways. Besides offering valuable, counterintuitive insights into an underresearched segment of the population, the study helped to illustrate the cultural bias against aging in this country. Americans in general “[have] an irrational fear of aging and, as a result, maintain a psychological distance from older persons,” the coauthors perceptively wrote; their findings showed that seventysomethings were a lot more like the rest of us than we liked to believe.17
Aging as a whole was often viewed in the United States as something that happened to other people when, of course, it was, like birth and death, a universal experience. The aversion to, even hatred of, older persons was all the more peculiar given that everyone would become one of them if he or she lived long enough. (The same was not true of racism or sexism, as people did not change color or, with few exceptions, gender.)18 Other ways in which Americans distanced themselves from aging were to think that individuals turned into different people when they got older, or that the process took place quite suddenly. A person was young and then boom! he or she was old, this notion went, a completely inaccurate reading of how humans actually (that is, gradually) age. (Also, from a biological standpoint, each body part aged at a different rate, depending on the individual, meaning there was no single physical process of aging.) Grouping people into an anonymous mass of “old people” was equally silly but not uncommon; seventy-year-olds were just as individualistic as thirty-year-olds (if not more so, given they had had more time to develop their unique personalities). Finally, older people did not remain in a constant state of “oldness” but continually changed, another fact that anyone younger than middle-aged might find hard to believe or accept.19
No one agreed more with such thinking than Sharon R. Curtin, who was downright angry about the way older people were treated in America. In her 1972 book Nobody Ever Died of Old Age, which was based on an article she had written for the Atlantic, Curtin looked at how seniors were deliberately separated from the rest of the “productive” population. Whether placed in nursing homes, retirement communities, or what were called at the time “geriatric wards,” she argued, those in the third act of their lives were now regarded as outsiders. Things were not much better for those simply left on their own to collect their meager Social Security checks and to fend for themselves as best they could. “We live in a culture that worships youth and denigrates the aged, no matter how honorable their past contributions,” Curtin wrote, like others labeling this a shameful aspect of American society. As she pointed out, however, this was a relatively recent phenomenon, with grandparents much more likely to be welcomed into an extended family setting up to the 1950s. Even spinster aunts could often be found living with their families before the Second World War, a situation that would be viewed as quite odd in the 1970s, when households had shrunk considerably. With no responsibilities assigned to them or expectations of them, older relatives were now often seen as burdens to their families, a sad situation for all parties. Although things were rapidly changing in Europe and Asia, grandparents were still typically welcomed into the homes of younger relatives in foreign countries, a reflection of the greater recognition and respect given to older generations.20
Although the discarding of a large segment of the population was very real, older Americans could be said to have had, in public relations terms, an image problem. Eric Pfeiffer, another Duke psychiatrist, believed aging in the United States had received a bad press, meaning the media had been unfairly critical of older people. Aging was popularly perceived as a “hopeless, unremitting downward drift, an image of despair, deprivation, disease, poverty, and social isolation,” he claimed, not at all reflective of reality. Instead, he suggested, Americans should look to the many “successful agers” for inspiration, people like comedian Jack Benny (aged eighty in 1974), Supreme Court Justice William O. Douglas (aged seventy-five), and Senator Strom Thurmond (aged seventy-one). Thousands of ordinary people were living similarly active and productive lives, he maintained, not at all the impression one would get from reading newspapers and magazines or watching television.21
Arthur Flemming no doubt thought similarly when he told a congressional subcommittee a few years later that forcing people sixty-five years old to retire was a clear act of discrimination. “We should never deny anyone the opportunity for employment simply because of age,” the seventy-one-year-old head of the U.S. Commission on the Aging declared, comparing mandatory retirement to the illegal biases of sexism and racism. (Despite new legal statutes, it should be noted, more lawsuits are filed today under the Age Discrimination in Employment Act of 1967 than over women’s charges of employment inequities.) Making people stop working when they reached a certain age was reflective of America’s attitude that older folks should be placed “on the shelf,” Flemming stated, a view that often resulted in real psychological damage to new retirees. (A full quarter of suicides in the nation were among those sixty-five and older, in fact.) Flemming, who had served as the secretary of HEW from 1958 to 1961 during the Eisenhower administration, linked this high number to the “traumatic experience” of mandatory retirement. Almost one-third of retired people over sixty-five would indeed prefer to be working, according to a 1974 Harris poll, more evidence that current labor policies were out of sync with the wishes of a sizable percentage of older Americans.22
Flemming’s vigorous appeal appeared to work. Through 1978 amendments to the Age Discrimination in Employment Act, Congress lifted the mandatory retirement age to seventy. (Congress’s decision to bump up the age by five years was not completely altruistic, as the move lessened Social Security payouts, at least in the short term.) Moving back the mandatory retirement age made even more sense given that many older people had to continue working beyond age sixty-five whether they wanted to or not. By the late 1970s, galloping inflation had effectively shrunk the value of savings and pensions, forcing those who had planned to retire to keep their jobs for a few more years. Inflation was also making older people rely more on Social Security as their personal nest eggs lost value, increasing the burden on younger people to support them. Experts were forecasting that the share of the nation’s adult population aged sixty-five or older would balloon from 18 percent to 30 percent by 2025, early signs that the baby boomer age wave could perhaps crash the economic system. Older people continuing to work was one of the best things that could happen to the economy, in fact, making federal policies encouraging early retirement contrary to the nation’s long-term interests.23
A Revolution in Our Thinking
Lessening discrimination against older people in the workplace through legislation can be seen as one outcome of a process that had begun in earnest about a decade and a half earlier. History was made on April 1, 1965, when, by a vote of 394 to 1, the House passed a bill to create an administration on aging within HEW. The bill still had to go to the Senate for approval, but with the nearly unanimous vote (the sole nay was from Representative Dave Martin, a Republican from Nebraska) it seemed clear that Washington intended to improve life for those sixty-five years or older (10 percent of the population). An initial $17.5 million over two years would go toward providing retirement income, health care, housing, and job opportunities for this group of Americans; the agency would coordinate existing efforts that were scattered across many departments.24
With the election of Richard Nixon in 1968, however, much of the enthusiasm behind LBJ’s Great Society began to seriously fade. As part of his ambitious tax-cutting measures, Nixon pared back some of the domestic programs created during the Johnson administration, including those earmarked for older citizens. Over the course of his first term, however, he frequently mentioned in speeches his commitment to improving the lives of older Americans, giving seniors hope that the federal government would direct much-needed funding their way. Much excitement thus revolved around the third White House Conference on Aging as it approached in late 1971. The first such conference, initiated by President Truman, was held in 1950; the second was hosted by President Eisenhower in 1961, out of which ultimately came Medicare, Medicaid, and the Older Americans Act of 1965. The Nixon administration had previously held official conferences on hunger (1969), children (1970), and youth (1971), and it realized that much work still needed to be done regarding the nation’s large and growing aging population. This conference was considered important enough for the U.S. Postal Service to issue an eight-cent stamped envelope to commemorate the event.25
Although critics correctly noted at the time that the administration’s three previous conferences were much more talk than action, the expectations for this one were high. The goal of the conference was, in President Nixon’s words, to work out a “comprehensive national policy” for older Americans, something that had not come out of the first summit.26 Some thirty-four hundred delegates would converge on Washington, D.C., in November 1971, each of them committed to pressuring the president to back measures that would help older Americans. More than twenty million citizens were now aged sixty-five or older, obviously a large contingent whom any politician wanted to have on his or her side. Older people also consistently voted in great numbers, even more reason to try to win them over.27 Skeptics noted that it was not a coincidence that the conference was held just as the 1972 presidential election was heating up, however, and warned seniors thinking their ship would soon come in to curb their enthusiasm.
Democrats, especially Frank Church from Idaho, were cautiously optimistic that the president’s call for this conference was about more than just getting reelected. Church, chairman of the Senate Special Committee on Aging (itself an outcome of the second White House Conference on Aging), was a staunch defender of the rights of older Americans, and he made it clear that this third conference was an ideal opportunity to make real progress. “I think there is no country, that has the means that we do, that has done as badly in providing for the elderly as we have here in the United States,” he said on the eve of the conference, calling our performance in this area “one of the greatest travesties of the contemporary American way” and “one of the most conspicuous of our failures.” Church and his colleagues on the Senate Special Committee were fully aware that getting the president to agree to all their demands would be an uphill battle given his record (especially as there was at least as much interest among politicians in capturing the youth vote), reason enough for them to turn up the heat as the meeting approached.28
Another politician friendly to the cause of aging was Representative James A. Burke (D-MA), a member of the House Ways and Means Committee. In one speech leading up to the conference, Burke proclaimed that there was a need for “a revolution in our thinking about the elderly,” just the kind of thing the delegates and seniors themselves wanted to hear. Some kind of national policy on aging was expected to result from the conference, although both the president and Congress would have to agree to whatever was presented to them. Hopes were high that a bill creating a heavily funded federal department of aging with a cabinet-level secretary would ultimately result from the conference—only such a measure, many believed, was likely to bring about the revolution Burke had in mind. If affordable health care had been the biggest outcome of the second conference, financial assistance was clearly the goal of the third. Many retirees lived in poverty or close to it, making money the central issue for older Americans.29 The high inflation rate of the early 1970s was hitting anyone living on a fixed income hard, with those subsisting solely on Social Security especially vulnerable to rising prices.30
The president was well aware that this third White House Conference on Aging was an opportunity to repair his image when it came to federal policies related to aging. Over the course of his first term of office, Nixon had not only cut programs related to aging but also vetoed a host of bills that if passed would undoubtedly have made life better for older Americans. Making comprehensive amendments to the Older Americans Act, giving more power to Flemming within the administration, and awarding big grants to states were just a few proposals that the president had killed, earning him a much deserved reputation as no friend to seniors. This was clearly his greatest chance to remedy his standing with seniors, a smart thing to do, if only for political reasons.31
In words at least, Nixon acted quickly after the White House Conference on Aging, an obvious move to silence his many critics. The biggest news was that he would increase the budget of the Administration on Aging almost fivefold, to $100 million, vastly improving services for the elderly should the Democratic-controlled Congress approve. Making Social Security benefits inflation-proof was another promise he made, something that would be hugely beneficial if it too became a reality. Conference delegates were elated by the president’s intentions, with virtually everything they had wished for included in Nixon’s plan.32 During the 1972 campaign, Nixon understandably continued to appeal to older voters whenever he had the opportunity to do so. In a speech to Congress in March of that year, for example, he offered a five-point strategy to bridge what he called “the generation gap between those who are over 65 and those who are younger.” It was clear from his speech that he had relied heavily on Flemming to form the strategy, which focused on income and living conditions for older Americans. What was not clear was how such measures, while certainly helpful if enacted, would bring older and younger people together, making his message more campaign rhetoric than anything else.33
Nixon continued to mention the “generation gap” in later speeches as well, seeing it as a highly charged term that made him appear to be a uniting rather than a divisive political force. “We cannot afford a generation gap which shuts out the young in this country but neither can we afford a generation gap that shuts out the old,” he said in an October 1972 radio address, calling on all Americans to develop a new attitude toward aging. He was especially proud to announce that Program Find, an initiative his administration had launched, had so far been a success. Some three hundred thousand “isolated” older people had been “found” and were now receiving the help they were entitled to but otherwise would not get, the president claimed, an example of his commitment to those over sixty-five. Riffing on “black power”—a bit ironically given his often contentious relationship with leaders of the African American community—he proclaimed that “senior power” was a valuable resource for the country, a prime example of his efforts to hold together an increasingly fragmented nation.34
The Walking Wounded
While Washington moved slowly toward passing measures designed to better the lives of older Americans after the historic White House conference in 1971, experts in various aspects of aging continued to get together to discuss what else could and should be done. The Annual Conference on Aging, which had been sponsored by the University of Michigan since 1948, was a central gathering place for gerontologists and others interested in the social and cultural dynamics of aging in America. Susan Sontag, the author and critic, made an appearance at the 1974 conference, for example, quite fitting given that this meeting was devoted to issues faced by older women. In her speech, entitled “The Double Standard of Aging,” Sontag discussed the inequalities that existed between older men and older women, an expression of the vast gender bias of the time. While aging men simply met their natural fate as they got older, she argued, women were seen as facing a losing battle and were socially chastised for showing the physical signs of age. Given this view, she asked, was it surprising that many women lied about their age and did everything they could to preserve their youthful looks?35 To Sontag’s point, an increasing number of fiftyish women reaching menopause were seeking estrogen therapy in the hope that it would keep them looking and feeling young. The body’s production of estrogen falls off sharply during menopause, leading many women (and some physicians) to think that continuous, large doses would delay the effects of what was commonly called the “change of life.” The research was unclear, but most doctors hesitantly provided the hormone to their patients when pressured to do so.36
The lengths to which some women and men would go to slow the aging process were sometimes extreme. Visitors returning from Romania (and some other countries) in the late seventies were bringing in a drug called Gerovital H3 that purportedly had antiaging properties. The drug was not approved in the United States, and for good reason: controlled trials showed that it did absolutely nothing to improve the physical or mental health of patients.37 This was not the first time Romania was believed to be the source of some magical potion created for modern-day Ponce de Leóns. A dozen years earlier, a biochemist from that country had cooked up a “youth cocktail” said to reverse some of the effects of aging. (The key ingredient was cysteine, an amino acid commonly used in food, pharmaceutical, and personal-care products.) A decade or so before that, another Romanian scientist was injecting people with some kind of “rejuvenation treatment,” which also turned out to be a complete hoax.38
Of course, one did not have to smuggle in a drug from a foreign country to antiage, at least cosmetically. Plastic surgery boomed in the United States in the 1970s as Americans sought all sorts of ways to look not just more beautiful but specifically younger. Middle-aged workers, many of them men, were feeling pressure from younger employees rising up the corporate ladder, one more dimension of pervasive youth culture. Some men, worried about their jobs in certain hard-hit industries such as aerospace, were especially keen on smoothing out wrinkles. Wives suspecting their husbands were cheating on them with other women (and legitimately so, given the skyrocketing divorce rate) were, meanwhile, getting tummy and bottom tucks. Plastic surgery had long been just for the wealthy, but now ordinary folks, thinking a facelift was just the thing to maintain a youthful appearance, were going under the knife. As with the prescribing of estrogen, some doctors refused to perform surgery when a patient’s only goal was to retain his or her youth, but the genie was already out of the bottle.39
As Sontag suggested, the seeking of medical intervention by ordinary women in order to appear younger revealed the complex relationship between aging and gender. It made perfect sense that some feminists of the 1970s, most of them older women, vigorously addressed issues of “ageism.” (Robert Butler had coined the term in 1969 to describe the discrimination and prejudice faced by people considered old.) Separate from the obvious matter of injustices based on gender, women usually lived longer than men, extending the amount of time they would spend as older people. The odds that a wife would eventually become a widow made getting older a different and often more difficult experience for women. Henrietta Quatrocchi, an anthropologist and gerontologist, considered many older women in America to be “the walking wounded,” with no clear purpose in life after completing their primary role of domesticity. For every Eleanor Roosevelt or Golda Meir, she felt, there were millions of older women who, because of social pressure to do so, assumed an image not unlike that portrayed in the famous painting colloquially known as Whistler’s Mother. It was up to older women themselves to stay active and defy cultural stereotypes, Quatrocchi insisted, suggesting they do things like take karate lessons rather than “do things with egg cartons.”40
Social critics from a variety of backgrounds took a long hard look at aging in America as the subject reached a kind of critical mass in the mid-1970s. Butler’s 1975 Why Survive? was a scathing indictment of ageism in America, detailing how older people were consistently socially excluded and routinely taken advantage of. In the spirit of the famous line from the movie Network, which was released the following year—“I’m mad as hell, and I’m not going to take this anymore”—Butler argued that it was up to older Americans themselves to adopt the kind of militancy that other oppressed groups had. Voter registration drives, marches, and whistle-blowing were all things seniors could do to strike back against the virulent form of discrimination that was ingrained in American society.41 A number of experts on the subject wrote chapters for the 1978 anthology The New Old: Struggling for Decent Aging, which served as a sort of manifesto for older Americans. Seniors were treated shamefully, the book argued; its contributors made a compelling case that ageism was the last form of segregation in the nation. As with Why Survive?, the book also included an agenda for action, urging older people to stand up for their rights.42
Not surprisingly, gerontologists were working the hardest to try to dispel the myths and misconceptions that surrounded aging. “Ageism is an attitude no less destructive than racism or sexism,” said Sue Smolski, a registered nurse in Connecticut who was educating other gerontologists on what was and what was not true regarding older people. First and foremost on her list was that, again, aging was not a disease, something many, including a fair share of physicians, did not understand. In fact, seniors themselves did not so much mind growing older as dread getting sick, illustrating the distinction.43 It was a serious illness or injury that often triggered what was commonly considered “old age” and its various problems, many older people clarified, a point not very well understood by those considerably younger. And contrary to popular belief, most seniors did not spend their days preparing themselves for death; they were as directed toward living as anyone else.44
Nurses like Smolski seemed to have keen insight into the dynamics of aging, no doubt because they spent so much time with older people. Another Connecticut nurse serving as a consultant on aging felt that the scientific efforts to solve its mystery were, more than anything, a diversion from having to think about the fact that we were all getting older. The real problem with aging was our inability to accept its reality, Nancy Gustafson told a group of women in Hartford in 1977, explaining that it was more of a social than a biological issue. As Curtin had described in her Nobody Ever Died of Old Age, older people no longer had a recognizable role in family life, Gustafson added, a byproduct of our more disposable and mobile society. (No need for grandma to mend those pants now that a new pair could be easily and cheaply bought, especially if she lived miles away.) If productivity was the main currency of the times, which it certainly appeared to be, Gustafson observed, things did not look good for older people given the physical changes that typically took place at a certain point. This was unfortunate, since most seniors remained as psychologically fit as ever and had much to offer if given the opportunity. “Aging is life,” Gustafson wisely concluded, “the integration of all our experiences here on Earth and with other humans.”45
As Gustafson suggested, the mythologies surrounding seniors were not limited to physical abilities; the mental aptitude of older people was often called into question. The majority of older Americans were not institutionalized as some thought (just 5 percent, most of them over eighty years old, were in institutions in 1977, with a mere 1 percent in psychiatric hospitals), and they did not spend an inordinate amount of time just lying in bed. Depression was fairly common among the elderly, however, a result perhaps of the losses (of friends, family, jobs, and homes) that many endured as they outlived others.46 Oddly, relatively few psychiatrists took on older people as patients, viewing them as “unfixable” because of their advanced age. Fewer than 2 percent of the psychiatrists in the Washington, D.C., area offered therapy to people more than sixty-five years old, for example, and even these were unlikely to prescribe any of the antidepressants available at the time.47 Older people should be depressed, the thinking went; it was a natural state given their reduced position in life and proximity to death.
While few psychiatrists were interested in treating older people, it made perfect sense for those dabbling in the brand of pop psychology so prevalent at the time to address issues of aging. “The September of your years can be exactly what you make of it,” counseled Tom Greening and Dick Hobson in their 1979 book Instant Relief: The Encyclopedia of Self-Help; the pair suggested that seniors counter their negative portrayal in the media through “positive thinking.” Bombarded with representations of people like themselves as having little value or worth, it was up to older folks to avoid seeing aging in a negative light. “With the right attitude, attention, and preparation,” the two wrote, “aging can be experienced as a full and meaningful maturation process, like the ripening of a fine wine or a musical instrument.” No longer having to raise children, climb one’s way to success, or move up the social ladder offered a kind of freedom younger people did not have, they argued, making “the golden years” the ideal opportunity to develop deep self-awareness and wisdom. Most important, perhaps, older people should take responsibility for their own sense of well-being and not see themselves as victims—classic 1970s self-help talk that actually made perfect sense given the profound stereotyping of the times.48
While obviously not a good thing, the dismissal of older people by psychiatrists was in part a function of their training or, more accurately, lack thereof. Medical schools did not at the time provide their students with an adequate education in aging, which also accounted for some physicians’ equation of old age with disease. Few if any courses in geriatrics were required at the nation’s 114 medical schools, with just thirty-one of them offering elective courses in the subject. Also, older people’s tendency to get sick and die was a frustrating fact of life to future physicians being taught to keep patients alive and well at all costs, a contributing factor to the glaring insensitivity many physicians displayed toward seniors after receiving their degrees. Practicing doctors were known to say to a no longer young patient with a particular health complaint things like “Well, what did you expect, you’re old,” not exactly what one would consider good medicine or a good bedside manner. In fact, often the only acquaintance medical students had with an old person was as a corpse, hardly a good model by which to offer good health care to a very much alive septuagenarian or octogenarian.49
One of Nature’s Best-Kept Secrets
While medical schools did not do an adequate job teaching students about the bodies and minds of older patients, some in medicine and the sciences continued to target aging as an exciting area of inquiry and research. Corporate and academic funding committees clearly recognized that unlocking “one of nature’s best-kept secrets,” as the Atlanta Constitution described it, would be quite a feather in their organization’s cap. (The financial payoff for achieving such a feat would also likely be enormous.) In 1972, for example, Morton Rothstein, a biologist at SUNY Buffalo, won a million-dollar, five-year grant from the National Institute of Child Health and Human Development (NICHD) to try to learn the causes of aging and, it was hoped, how to control the process. Rothstein was eager to find a cure for aging should there be one, admitting that his getting close to the age of fifty played a role in his ambitious project. The biologist was very much in the cellular theoretical camp, and he planned to study how nematodes (small worms) aged over their brief (twenty- to forty-day) lives. Unlike humans, nematodes retain the same cells throughout life, offering the chance for a comparative analysis of aging at the molecular level that could perhaps reveal the precise reason why all organisms get older. Rothstein’s initial thinking was that “ineffective enzymes” were the culprit, these faulty molecules triggering a chemical reaction throughout the body, which led to the aging process.50
The startling array of theories regarding aging was in part a result of the limitations placed on scientists. Besides the ethics involved, experiments on aging in humans were less than practical because of our long life spans. An eighty-year study was obviously not feasible, as subjects could very well outlive the researchers. (Alex Comfort quipped that mice were preferable to Ph.D. students as subjects because they died much more quickly.) With studies limited to laboratory animals, little persuasive data had yet been collected on precisely why organisms aged, making researchers look in unusual places. One promising area was progeria, the rare and tragic disease that prematurely aged children and usually caused death in the early teenage years. Scientists investigating the process of aging believed they could learn much from children with progeria, although thankfully there were precious few cases reported. (Fewer than sixty could be found in the medical literature up to the early 1970s.) A bit more common was Werner’s syndrome, which was similar in some ways to progeria, but it too did not offer many cases to study.51
There was a busy parade of supposed causes of aging through the 1970s, each one briefly taken quite seriously. Almost as a rule, scientists were over-eager to conclude that their particular line of research was the pot of gold everyone was searching for, perhaps in order to receive additional funding to pursue their work. Glands were often suspected of being at the heart, so to speak, of aging. In 1973, for example, a team at the University of Texas claimed that thymosin, a hormone produced by the thymus gland, was a major factor in the aging process. Blood levels of the hormone decreased dramatically as a person got older, the biochemists reported, hindering the body’s immune system’s ability to combat disease.52 At the very same time, other scientists down the road in San Antonio pinpointed the prostate gland as the thing that explained the mechanics of the aging of the human body. It was the eventual inability of that gland to maintain its normal growth and development that served as the first evidence toward understanding the molecular basis of aging, a pair of researchers from the Southwest Foundation for Research and Education crowed, not addressing how this applied to women, who do not have a prostate.53 Meanwhile, Alexander Leaf, a Harvard internist, claimed that it was the protein amyloid that accounted for some people living so long. As some other researchers had done, Leaf traveled to remote parts of Ecuador, the Soviet Union, and Pakistan to study the relatively numerous centenarians that lived in these areas. High concentrations of amyloid were found in the century-plus locals, making him think that the mysterious substance was the main ingredient of longevity.54
Virtually every part of the human body was held responsible for aging by one scientist or another in the 1970s. Some said that the hypothalamus at the base of the brain contained a central “pacemaker” that instructed the body to get older, one version of the popular “clock” theory of aging.55 That timer in the brain told the pituitary gland to release “death hormones,” an extension of this idea went, killing the body when it was automatically programmed to die.56 Linking specific body parts to aging was not only silly but reminiscent of ancient medical theories such as Hippocrates’s belief that certain bodily fluids affected human personality. That any of them were judged to be real possibilities demonstrates how little was actually known about physical aging in the late twentieth century. In his 1976 book Prolongevity, Albert Rosenfeld compiled many of the theories about aging and extending life that had surfaced over the preceding decade. (The book’s subtitle—A Report on the Scientific Discoveries Now Being Made About Aging and Dying, and Their Promise of an Extended Human Life Span—Without Old Age—hardly needed extending.) Theories ranged from the sublime to the ridiculous, making anyone with anything close to an objective viewpoint conclude that this field of research was not one of science’s better moments.57
With concern over air pollution (often called “smog” at the time) and other ecological threats a significant part of the national conversation, it was not surprising that scientists included them on their list of potential causes of aging. It was, more specifically, heavy oxidant stresses on cells that led to premature aging and disease, scientists theorized, justification for identifying agents that could protect against such damage. One of them was vitamin E, which was discovered in 1923 but whose precise function remained uncertain. Most experts agreed, however, that the nutrient did serve as a cellular defense against oxidants such as nitrogen dioxide and ozone, atmospheric gases known to be elevated in areas with high air pollution. Studies done at Berkeley in 1974 confirmed as much, bringing more attention to vitamin E as a potential antiaging agent. (Vitamin C was also believed to help delay the aging process.) Scientists stopped short of recommending that individuals gobble up large quantities of vitamin E to combat oxidants and thus slow down the aging process, but there was a consensus that the substance might very well play a major role in halting the decay of human cells. If nothing else, this line of research proved that cells did not self-destruct because of some kind of predetermined biological mechanism, an important revelation.58
Although a minority of scientists was interested only in lessening the effects of old age during a normal lifetime, most equated the battle against aging with extending life expectancy. Over the course of the first three-quarters of the twentieth century, life expectancy in the United States jumped by about twenty-five years, from about forty-seven (46.3 for men and 48.3 for women) to roughly seventy-two (68.8 for men and 76.6 for women), an amazing phenomenon by any measure.59 By the 1970s, however, this dramatic extension of Americans’ average life span was slowing down considerably, a source of frustration to scientists who saw no reason it should not continue at a rapid rate. (The biggest factor contributing to the increased life expectancy was the much higher survival rate of infants and children.) Only a greater understanding of the basic processes of aging could lead to another exponential leap in life expectancy, making what was seen as the “deterioration” of the human body one of the central questions of science. The best genes, nutrition, and environment could not prevent aging, and the finest health care in the world would not prevent death—each a biological annoyance that kept many a scientist up at night. The problem of aging was a main topic of conversation at the 1974 annual meeting of the American Association for the Advancement of Science, with both physicians and academics offering their respective takes on what to do about the disturbing situation. Even a cure for cancer or heart disease would yield just a few more years of life expectancy, one of the attendees pointed out, not nearly enough for what was believed to be very much achievable: a society full of centenarians.60
Like many scientists working in the area at the time, Alan Richardson, a professor of chemistry and biology at Illinois State University, was fairly confident that the average human life span could be extended by a decade or two, or possibly much longer. The key question, he told colleagues at the 1976 meeting of the American Aging Association, was whether aging was genetically programmed, something his own research with rats suggested. Should our genes be responsible for making us get older (specifically by a decline in protein synthesis, Richardson theorized), it would be “fairly straightforward” to slow aging down. “If aging turns out to be a regulatory process at the cellular level, it should be possible to extend the human life span 200 or 300 years or indefinitely,” he claimed, more concerned about the moral implications of a nation of Methuselahs than the obstacles to achieving such a thing.61 Scientists at the National Institute on Aging in Baltimore also pegged two hundred years as a reachable goal, possibly via a vaccination much like the vaccinations used to prevent diseases. “Doubling the lifespan potential is a reasonable objective from what we know now,” said Richard G. Cutler of the institute in 1978, envisioning injecting people with an “aging cocktail” to fool the body into repairing damage to cells’ genetic material.62
For Richardson and others, the symptoms of getting older had been addressed through medicine, but science had yet to confront aging itself; this was the much bigger opportunity and challenge.63 Although no breakthrough had yet come to pass, journalists consistently presented scientists as being hot on the trail of aging, with something big expected soon. “Aging Reversal Is Called Near,” declared a headline in the Atlanta Constitution in 1976, based on findings reported by Johan Bjorksten. Bjorksten was positively brash in describing his research, calling a news conference at the American Chemical Society to announce what he had found so far. “I’m not interested in gaining five years here and five years there,” he puffed, proud to say that he was “shooting for the whole pot.” For Bjorksten, the “whole pot” was nothing less than an average life expectancy of eight hundred years, with a yet-to-be-discovered enzyme as the ace up his sleeve. Other chemists were confident the enzyme would soon be identified. With intensive research, “I think we could have the thing done in 5 to 10 years,” agreed Rolf Martin of City University of New York, quite assured that science was on the brink of realizing what many would consider its greatest achievement.64
A Quietly Ticking Social Time Bomb
Lost in the scientific pursuit to add many years to human life was the fact that those intimately involved in the lives of older people already had their hands quite full. Gerontology, a field that had been around in some form since the turn of the century, was booming in the late 1970s as America’s population became noticeably older due to both demographics and life extension. It was the future, however, that presented abundant opportunities for those considering careers in gerontology. There were about 32.7 million Americans older than sixty in 1977 (15 percent of the population), while demographers were forecasting that 41.9 million citizens would fall into that age category in the year 2000 (16 percent of the population). (Their forecast turned out to be spot on, although that number represented about 17 percent of the population.) Many social workers were already switching their emphasis from children to seniors, taking courses in gerontology to take advantage of the expanding field. Hospital administrators, registered nurses, and government employees were also equipping themselves with the educational tools to work with older people, fully aware that both social and economic factors were on their side. At the New School in New York City, for example, about one hundred students were pursuing a master’s degree in gerontological services, a program that covered virtually every aspect of aging. Duke University, the University of Michigan, and Columbia University also offered programs in gerontology, with many other colleges planning to do so soon.65
The leading school in gerontology, however, was arguably the University of Southern California. Since 1973, the university’s Ethel Percy Andrus Gerontology Center had focused on the problems of the elderly and was serving as the prototype for other institutions interested in starting up their own research facilities. (Andrus, a retired school principal, founded AARP in Los Angeles.) For James E. Birren, founding director of the center, the work completed to date in aging was just the tip of the proverbial iceberg, with much more research to be done as additional problems were identified. Given how older Americans were viewed and treated, it made perfect sense that the nation’s top institution in gerontology subscribed to the idea that aging represented a “problem.” All kinds of professionals—sociologists, biologists, psychologists, urban planners, social workers, nurses, hospital and nursing home administrators, and even architects—were required to take care of older members of society, Birren rather condescendingly pointed out, with many more needed in the next few decades. “Gerontology is an embryonic field,” agreed Ruth Weg, associate director of training at the center, calling on the federal government to support institutions like hers as the nation’s population aged.66
One need not have been an expert to know that the nation would likely have some kind of crisis in half a century or so when baby boomers entered their senior years en masse. Carl Eisdorfer, the Duke psychiatrist who had become president of the Gerontological Society of America by the nation’s Bicentennial, called aging “a time bomb” as he looked ahead to the not-too-distant future. Just the next decade and a half was cause for considerable concern given the expected numbers: the sixty-five-plus demographic was forecast to almost double over the next fourteen years, from twenty million in 1976 to thirty-eight million in 1990. With this many older people in America, Eisdorfer saw two major problems—mandatory retirement at age sixty-five and stereotyping of seniors—that would make this bomb explode if it was not defused. Allowing seniors to remain productive by working would be a big help, his research showed, and not separating them from the rest of the population would be of considerable psychological benefit. More generally, treating older people like “old people” actually accelerated the aging process, good reason for him to encourage the rest of the population to resist categorizing them into some special group.67
The metaphor of a looming demographic explosion gained traction in the late seventies. “A quietly ticking social time bomb—America’s rapidly aging population—is due to explode in 20 years or so with potentially revolutionary impact on the nation’s economy,” warned Philip Shabecoff in the New York Times in 1978. Little had been done to prepare the country for this “future shock,” Shabecoff reported, although the Senate Special Committee on Aging was beginning to consider what, if anything, the federal government could do about it. The slower economic growth of the past decade, along with other factors such as inflation, a jump in early retirement, and a greater demand for health care, only added to the foreseeable problem. Science had yet to lengthen the human life span to a century plus, despite all the efforts to do so, but biomedical advances were dramatically increasing the number of Americans living to be eighty years old or more. Some fifty-five million people, almost one-fifth of the nation’s population, would be sixty-five years old or older by 2030, Joseph A. Califano Jr., secretary of HEW, told the Senate committee, the baby boom gradually turning into the “senior boom.” Where would the money to take care of all these older people come from? Califano asked the senators, the fact that all this doom and gloom was still a half century away offering little consolation.68 As America entered the 1980s, attention to aging intensified, as did the recognition that, whether we liked it or not, we were all constantly getting older.