Introduction
Aging in America: A Cultural History is, as the title makes clear, a cultural history of aging in the United States. No such book has been recently published, something surprising given the centrality of the subject in contemporary life. Much interest currently revolves around aging in America, as the tens of millions of baby boomers head into their sixties and seventies. While I spend considerable time following the trajectory of the generation born between 1946 and 1964, all Americans are somehow and necessarily involved in the story. Everyone is aging, after all, making the subject something with which we all can identify. Aging goes to the heart of individual identity, a good reason why its cultural history is a worthy venture. On a deeper level, aging goes to the essence of humanity; it is one of our very few common or even universal experiences. I am interested in stories that bring us closer together rather than push us apart, and from this perspective it is difficult to imagine a more pertinent topic, as each of us gets older every day regardless of our race, gender, or other socially defined division.
Focusing on the past half century of American history makes sense for a number of reasons when thinking about aging, the most obvious being the demographics. Never before has there been a generation so large and so influential, making its evolution over time an attractive topic for any historian. Ten thousand Americans will turn sixty-five years old every day for the next twenty years, an astounding figure. More remarkably, perhaps, eighty-five- to ninety-four-year-olds represent the fastest-growing age group in America, according to the most recent census, with that segment of the population increasing almost 30 percent between 2000 and 2010. Although much has been understandably made of American youth culture, the nation’s past fifty years have in many ways been heavily defined by the idea of aging, with key moments ranging from politics (the passage of Medicare) to science (the genetic revolution) to medicine (the rise of “antiaging” medicine) to education (the creation of gerontology as a field of study).
Given its significance, tracing the story of aging in the United States over the course of this era reveals insights that add to our understanding of American culture. One of the key points here is that the idea and reality of aging have contradicted prevailing social values, attitudes, and beliefs, a phenomenon that has largely disenfranchised older people and marginalized them from the rest of the population. One might reasonably have expected the aging of the largest generation in U.S. history to significantly alter American values over the past half century, but this simply hasn’t happened. Our ageist society has deep roots, going back decades to produce what is perhaps the most youth-oriented culture in history. On the cusp of old age, baby boomers like me are now increasingly the target of ageism (negative thinking or beliefs about the process of becoming old or about old people), a likely byproduct of a culture in which getting older has little or no positive value.1
More than that, I suggest, there is currently no useful narrative of aging in American culture, leaving a large social vacuum as our population becomes older. Older people “find themselves with no mythology to support their presence, no place—figurative or otherwise—for themselves in the culture,” the spiritual teacher Ram Dass wrote in his 2000 book Still Here.2 The absence of a clear definition of aging is reflected in our difficulty in arriving at an acceptable label for those whom I call older (versus old) people. The once popular term “elderly” is no longer considered appropriate for many in their seventies and eighties today, and even “seniors” and “senior citizens” (the latter coined during LBJ’s Great Society programs that promoted the first Older Americans Act) carry connotations of dependence and cantankerousness. Both “older adults” and “mature adults” have recently increased in usage, especially among academics, but to me each sounds more like a movie rating than a group of people. “Geezers” (a term coined in the 1880s) has also gained some currency in recent years, but some are pushing for more politically correct terms, such as “seasoned citizens,” “wellderly,” and “superadults.”
Many of the problems associated with aging can be seen as rooted in a lack of knowledge about the experience. It is fair to say that we simply do not know how to age, as we are never provided with the informational tools to gain any kind of fluency in the process. There are now many books devoted to aging “gracefully,” but that body of work is overwhelmed by the plethora of resources advising individuals on how to stop or slow the process. Without any real “story” to aging, except as something to be delayed as long as possible, all kinds of antiaging therapies have flourished, further denaturalizing the perfectly natural act of getting older. Much of this is the boomer generation’s own fault, as this cohort has largely failed to turn the idea of aging into a relevant and meaningful part of life. Rather, boomers have clung desperately to their youth, an ultimately futile pursuit that does not bode well as they rush headlong into their senior years.
Marketers and the media have each encouraged the idea that aging does not and should not have to happen, further entrenching the peculiar notion that getting older should be avoided at all costs. My local PBS station, for example, frequently airs antiaging shows such as Aging Backwards, whose producers promise viewers they can “reverse the aging process and look nineteen years younger in thirty minutes a day.” Olay, the maker of skin care products, urges consumers to look “ageless,” an appeal that reflects our general antipathy toward getting older. Despite their popular appeal, “aging backwards” and “agelessness” are, of course, absurd concepts that have absolutely no foundation in how the human body or any other living organism works. Those who yearn to reverse the aging process are attempting to negate a fundamental part of life that every human in history has experienced. “Antiaging” is, quite simply, antihuman, making any and all efforts to achieve such a thing contrary to the basic mechanism of life as we know it. More people should embrace the idea that aging is a natural part of life rather than try to turn back the clock. Even if it were possible, reversing aging would result in myriad truly horrific scenarios, one more reason we should not just accept but welcome the fact that our bodies get older.
Women especially have been urged to try to evade aging, a reflection of our youth-obsessed society that places so much emphasis on appearance. “Such evasion comes at considerable social, material, and existential cost,” argues bioethicist Martha Holstein in her 2015 book Women in Late Life as she considers the all-too-common sight of old (and usually wealthy) women in young-looking bodies. A leading force in the field of aging for decades, Holstein can understand why many women pursue such strategies, however. “Women face particular difficulties that derive from the intersection of age and gender across their life spans,” she writes, with a host of economic, physical, caregiving, and health-related inequalities in play.3
For both women and men, the problematic subject of aging can be seen as a natural result of our devaluation of older people and our inability or refusal to confront the reality of disappearing youth. The first two acts of our lives are sharply defined (growing up and getting educated in the first act, working and raising a family in the second), but what should we do in the third act? Continue to work as long as we can? Play as much as possible as a reward for our hard work? Spend time with grandkids or travel? Give back to society in some way or leave some kind of legacy? The answer is not at all clear, and, what is worse, few people are even asking the question. Some have argued that boomers will pursue a path of “unretirement” in their senior years, but there is little evidence to suggest such an option is realistic.
The need or desire for baby boomers to work into their senior years is further complicated by blatant, though rarely acknowledged, ageism in many, if not most, American companies. If anything, one would expect people of different ages to be eagerly welcomed into organizations as an expression of diversity—a prime initiative of human resource departments. Discrimination against older people in the workplace is commonplace (and illegal), however, a product of our deeply embedded aversion to people considered past their prime. “It would be awkward and embarrassing to have an older person work for me,” younger friends of mine have explained when I have tried to understand the underlying reasons for ageism in the workplace. True, perhaps, but white people got used to working alongside black people and men alongside women, making age the only remaining demographic criterion on which it is still considered acceptable to discriminate (often in the name of something like “overqualification”). Imagine the legal and social consequences if millions of American employees casually mentioned their discomfort in having to supervise or work for an African American, woman, Latino, or gay or disabled person! A huge class-action suit would result if the same kind of bias now being shown by corporate America toward older people were based on a job applicant’s gender, race, or other biological attribute, something a clever lawyer might want to think about.
All expressions of ageism are the natural result of aging being seen in American culture as a state of decline, the downward side of the curve of life. Despite laudable attempts by AARP and some “pro-aging” advocates, the years following the age of fifty or perhaps sixty are commonly considered a kind of existential purgatory between the end of one’s active life and death. Older people are generally deemed weaker, less attractive versions of their younger selves, a terrible and simply untrue characterization. It is easy to see how seniors are often viewed as little more than slow-walking, bad-driving, hard-of-hearing, Matlock-watching citizens. (Studies show that ageism and negative attitudes toward older people are present in young children, and these feelings are difficult to change by the time the children become tweens.)4 Hollywood has been especially unfriendly toward older people, either portraying them as comic foils or ignoring them completely. This attitude has reinforced cultural stereotypes related to aging and has lowered older people’s own sense of self-worth. Without a clear appreciation for what aging is and how it could be something to embrace rather than deny or ridicule, we may be headed toward a social crisis during the next few decades. Aging should be viewed in positive terms, and older people should be considered valuable parts of society. I urge fellow baby boomers to start telling that story while we still have the chance.
There is a rich and abundant literature devoted to aging in the United States, of which space here allows only a cursory review. While leading figures in the field of gerontology may have tended to ignore its cultural dimensions, they were instrumental in forging a body of work that helps to theoretically frame this study. The legacy of Matilda White Riley is an especially important one, as she perhaps more than anyone else understood the value of bringing an interdisciplinary perspective to the field. From the late 1960s until her death in 2004, Riley, often in conjunction with her colleague and husband, Jack Riley, “presented a compelling vision of the need for other disciplines to consider the role of social forces in shaping both aging as an individual, lifelong process and age as a feature of culture and social systems,” as Dale Dannefer, Peter Uhlenberg, Anne Foner, and Ronald P. Abeles expressed it soon after her death. Gerontologists from many disciplinary backgrounds were influenced by her work, something all the more remarkable given the fact that she did not begin to study aging until her mid-fifties.5
It is also difficult to overestimate the contribution of physician and gerontologist Robert Butler to our understanding of aging in America. Butler, who began his career in the 1950s and died in 2010, was the founding director of the National Institute on Aging (NIA), where he made Alzheimer’s disease a primary area of research. He was also the first chair of a geriatrics department at an American teaching hospital (Mount Sinai in New York City). He coined the term “ageism” after observing the lack of respect shown in medical schools to elderly patients and their conditions, a theme that heavily informed his 1975 book Why Survive? Being Old in America. In 1963, he published a paper entitled “The Life Review: An Interpretation of Reminiscence in the Aged,” which, without exaggeration, redirected the trajectory of gerontology in this country. Via a “life review,” elderly people were offered the rare opportunity to look back on their personal past and see their “life course” in a larger context. Gerontologists found that such “memory work” proved to be a beneficial psychological process offering significant therapeutic value, so much so that mental health experts and social workers adopted the approach in their own practices. It is easy to see how conceiving of one’s life in narrative terms with a beginning, middle, and end rather than as a more or less random series of events can help an older person make sense of his or her time on the planet, something that Butler keenly recognized. Butler was a “visionary of healthy aging,” wrote historian W. Andrew Achenbaum in his fine biography of the man, one who devoted his own life to improving the lives of older adults.6
Peter Laslett, an English historian, also contributed greatly to the field after he retired from teaching in the early 1980s. Although he devoted much of his career to British political history, Laslett turned his attention to aging later in his own life. His concept of the Third Age and the Fourth Age, as outlined in his 1989 book A Fresh Map of Life, is as relevant and useful as ever. As his model makes clear, it is important to draw distinctions among older people, as there is great variation within the population based on individuals’ respective mental and physical health. Laslett posited that the Third Age is typically one of activity and fulfillment, while the Fourth Age is one of dependency and frailty, a distinction that matters greatly for how we view the concept of aging.7
Laslett’s supposition speaks directly to the cultural dynamics of aging in the United States today. Americans are inclined to lump all older people together into one group, just one example of how aging is often overgeneralized, misunderstood, and misinterpreted in contemporary society. As Laslett’s theory implies, it is important to distinguish Fourth Agers from baby boomers, as the latter have substantially different health care and economic needs than the former. Likewise, psychologist Bernice Neugarten (a “giant” in the field in her own right) believed there was a pyramid of aging composed of the “young-old, old-old, and oldest old” that also offers a constructive way to segment the population.8 Finally, in their Rethinking Old Age, British sociologists Paul Higgs and Chris Gilleard echoed Laslett’s idea of a Fourth Age while emphasizing its serious social consequences, something the present book also attempts to achieve.9
While this work is a cultural history of aging in the United States, it is impossible to ignore the universality and timelessness of the subject. Many of the great minds of their day offered key insights regarding aging that are still relevant, ranging from Cicero’s view of getting older as a natural part of life to Francis Bacon’s dream of eliminating disease and perhaps even death. In his Aging in World History, David G. Troyansky offered a whirlwind tour of aging through time and space, beginning with how hunters and gatherers understood getting older, then moving to the concept of old age in classical civilizations and to the role of later life in the Middle Ages and during the Renaissance. Troyansky traces how the modern concept of aging emerged in Europe and North America, leading to its construction as a social problem in the nineteenth and early twentieth centuries. Learning how a cross-section of civilizations over thousands of years has interpreted getting older, not to mention the wisdom of the likes of Aristotle, Socrates, Plato, and Shakespeare, is not only fascinating but has the unexpected effect of making those encroaching physical signs of aging a bit less vexing for a middle-aged reader.10
A brief overview of aging in the United States from the beginnings of the country through the post–World War II era does much to show how we got to what today is arguably a troubling situation. We now take our youth-oriented culture as a given, but this was not always the case. From the seventeenth through the early nineteenth centuries in America, people who lived a long life were venerated, their advanced age seen as divinely ordained. “Old age was highly respected in early America, perhaps in part because it was comparatively rare,” wrote David Hackett Fischer in his Growing Old in America, with just 20 percent of the population living to be seventy years old in 1790. This began to change soon after the American Revolution, however, as the first Americans to be born in the new country distinguished themselves from those who had immigrated to the colonies. By the turn of the nineteenth century Americans no longer exalted old people as their parents and grandparents had, a major shift in the social dynamics of age. “The authority of age began to be undermined, and at the same time the direction of age bias began to be reversed,” Fischer continued. The nation’s core values of democracy and independence had much to do with this leveling of a social hierarchy based on age, with economic class now the primary means of differentiation among Americans.11
Through the nineteenth century, older Americans continued to lose social status as the cult of youth gained traction. “After the Civil War, new scientific and economic data and theories gradually undermined the comparatively favorable assessments of the aged’s worth,” wrote Achenbaum in Old Age in the New Land. Oldness in all forms was condemned in an increasingly modern society, and old people were considered a drag on the noble pursuit of progress. More people were living longer, making old age less special than it had been a century earlier. The heroes of the nineteenth century also tended to be young men filled with determination and energy, whether they were conquering the frontier, herding cattle, discovering gold, or fighting wars. Against this backdrop, it is not surprising that old age homes proliferated, a way to segregate people believed to be no longer capable of contributing to society. Forced retirement also became common in many occupations in the latter part of the century, a reflection of Americans’ negative attitudes toward older people.12
The “demotion” of older Americans became institutionalized in the twentieth century through a number of powerful forces that, over the past hundred years or so, have led to older people being viewed primarily as a social problem. Gerontology emerged as a professional field about the time of World War I, with those involved in it dedicated to helping solve the perceived challenges of the elderly.13 With older people generally no longer welcome in the workplace, the idea of providing pensions for them gained acceptance. At the same time, the average life span continued to increase, meaning there would be more years of retirement and more money required to pay old folks’ living expenses. The notion of saving for one’s later years had not yet caught on, and pension plans for both company employees and government workers remained rare. This changed between the world wars, however, as labor leaders and politicians led the way to provide support for older Americans, culminating in the Social Security Act of 1935.14
In his 1991 book The New Aging, Fernando Torres-Gil of UCLA identified the 1930s as the decade in which attitudes toward older people began to turn more negative than positive. Wisdom, and the simple act of exceeding longevity expectations, lost considerable worth during this decade, he argued, as a new appreciation emerged for values associated with youth. The 1920s had been a golden era for youth culture, of course, and the economic pressures of the Depression may have served to reward those exhibiting vitality and forward thinking. At the same time, older people were becoming increasingly defined as a segment of the population that was economically dependent on the government, a natural result of FDR’s New Deal policies, particularly the Social Security Act. Federally subsidized benefits were wonderful for many retirees, but they no doubt played a key role in creating the image of older Americans as unproductive people and a costly drain on society.15
Aging became increasingly defined within the context of science and medicine after World War II, deepening the perception that getting old was not unlike contracting a disease, or at least a condition warranting a well-deserved rest after one’s vital and productive years. The founding of AARP in 1958 was an important milestone in the history of aging in America, a result of the emergence of what was then considered a new life stage. “Retirement” was a reaction to what was commonly seen as the major social problem of older people in mid-century America, transforming what A. Barry Rand, current CEO of the organization, describes as a “life in purgatory” into a much desired destination.16 While the creation of AARP was certainly a positive development in many respects, it could be argued that it also helped to brand older people as less than fully contributing members of society and more of a liability than an asset to the country.
During the postwar era, Americans aged sixty-five or older increasingly came to be seen as a kind of special interest group separate from the rest of the population because of their “idleness” and dependence on government support. In his address to the White House Conference on Aging in 1961 (the first of a handful of such events held once a decade), Rabbi Abraham Joshua Heschel astutely captured the dilemma facing older Americans. In a society singing the praises of capitalism, he argued, the lives of those who were no longer working were considered meaningless, the reason why older people were often trivialized and belittled. “The moment the machine [for the making and spending of money] is out of order and beyond repair, one begins to feel like a ghost without a sense of reality,” he told the gathered audience, with retirement encouraging people to live “a pickled existence, preserved in brine with spices.” Despite the pleasures to be had in retirement, a life of leisure was seen as a wasted one in postwar America, contrary to the principles of free enterprise and upward mobility that so defined the times.17 That same year, Elaine Cumming and William E. Henry published Growing Old, which reinforced the idea that older people were no longer relevant. In fact, separating the elderly from the rest of society was mutually beneficial, according to their “disengagement theory,” making retirement a convenient mechanism for keeping them out of sight and out of mind.18 Some scholars, notably Robert Havighurst and Bernice Neugarten at the University of Chicago, strongly argued otherwise, however, maintaining that personality was a significantly more important variable in relative life satisfaction than level of engagement.19
Following the 1961 conference, organizations dedicated to the interests of older people lobbied for more federal funding, leading to the Older Americans Act of 1965 under LBJ. Medicare and Medicaid programs began that same year, expanding the field of geriatric medicine and formalizing the aging “industry” that we have today. While the rise of gerontology and the various social policies put into place over the past one hundred years have unquestionably benefited older people financially and in terms of longevity, the deal that was made can be seen as a devil’s bargain. Both geriatric medicine (the treatment and prevention of disease in older persons) and gerontology (the study of the process of aging) helped to instill a negative view of older Americans by focusing on the problems of aging, transforming older people into a group seen as requiring special care.20 “In our [twentieth] century, vastly improved medical and economic conditions for older people have been accompanied by cultural disenfranchisement,” wrote Thomas R. Cole in The Journey of Life, defining this marginalization as “a loss of meaning and vital social roles.” Since World War I, the later years of many if not most Americans have been “impoverished,” Cole believed, with an inverse relationship between the quantity and quality of the final third of life. Over time, old age became detached from the earlier, purposeful parts of life, an epilogue to the main story.21
Today, society generally views aging as a looming, sinister presence, one that threatens to bankrupt the national economy with the anticipated health care costs of elderly baby boomers and to spawn a generational war over how tax money should be spent. Their money, power, and influence notwithstanding, boomers are beginning to be considered unwanted guests as younger generations squeeze them out of the workplace and view them as socially over the hill. On an individual level, aging is typically ignored, with the prospect of physical and cognitive decline too painful to consider in our youth-obsessed culture. This book attempts to show how and why we got to this place, picking up the story in the mid-1960s, when our current narrative of aging can be said to have begun.