Coming of Age in Times of Uncertainty - Harry Blatterer
REPRESENTATIONS OF ADULTHOOD
“What is called the common-sense view is actually the grown-up view taken for granted.”
Peter Berger, An Invitation to Sociology (1963)
Bent on proving the value and validity of sociology as a scientific discipline, and critiquing the psychology of his day to make his point, Emile Durkheim used the term “collective representations” to describe the social a priori of ideas. Relatively fixed, even time-honored, myths, legends, religious beliefs, and moral sentiments have in Durkheim's conception a strongly constraining and integrating function. Taking his lead from the venerable pioneer, contemporary French social psychologist Serge Moscovici has coined the term “social representations.” In La Psychoanalyse, son image et son public he offers this explanation:
Social representations are almost tangible entities. They circulate, intersect and crystallize continuously, through a word, a gesture, or a meeting in our daily world. They impregnate most of our established social relations, the objects we produce or consume, and the communications we exchange. We know that they correspond, on one hand, to the symbolic substance which enters into their elaboration, and on the other to the practice which produces this substance, much as science or myth corresponds to a scientific or mythical practice. (quoted in Duveen 2000: 3)
Unlike Durkheim's collective representations, which appear like an impenetrable layer of sundry sentiments, emotions, and beliefs, Moscovici emphasizes the mutability and plasticity of commonly held ideas, not least because social representations are intersubjectively constituted through verbal and nonverbal communication. In turn, as embodiments of our collectively held ideas, they orient our practices. Not unlike Durkheim's (1966) axiom that social facts are things sui generis, Moscovici proposes, “to consider as a phenomenon what was previously seen as a concept” (2000: 30, original emphasis). It is in Moscovici's sense that adulthood can be usefully considered a social representation.
Adulthood is circumscribed by historically and culturally specific practices and expectations, achievements, and competencies. It is fixed in our minds as childhood's other, and as adolescence's not-yet-attained destination. More than a concept, and testimony to the power of ideas, this social representation enacts differences: the child and adolescent are cast as dependent on adults. To better grasp adulthood as a social representation, let us imagine that it was struck from the imagination. Beyond the nonexistence of a mere sound, a range of associated concepts and ideas would be divested of their present meaning. What are childhood and adolescence without their counterpart and goal? How would we understand maturity and autonomy? The evaluative potency of adulthood (its taken-for-granted centrality in the apportioning of power) would be missing, replaced perhaps by other concepts conjured up by collective practices and ideas. This illustrates “the curious position” of social representations “somewhere between concepts, which have as their goal abstracting meaning from the world and introducing order into it, and percepts, which reproduce the world in a meaningful way” (Moscovici 2000: 31).
As part of the social constitution of adulthood, everyday communication and social scientific discourse feed off each other and reproduce the cluster of meanings and representations with which the word adulthood has become associated over time. In fact, it is of fairly recent provenance as a word, and of even more recent pedigree as a commonsense concept and social phenomenon. Still, the meaning of ideas, concepts, and social phenomena changes along with large-scale social transformations. Today, demographic and cultural transformations that originate in the post-Second World War decades are forcing apart ideas about adulthood and the practices that produce its substance. The emerging cleavage is manifest in a normative lag between commonsense and social scientific discourse, and the practical redefinition of adulthood on the ground. This is a central argument of the book. Thus it is worth exploring adulthood and its emergence as a concept, its transmutation into and consolidation as a social representation, as well as the present dilemmas these changes pose for social scientists in general and sociologists in particular. On the whole this task has a twofold aim: to elucidate the interaction between commonsense and social scientific knowledge in the formation of our cultural vision of adulthood, and to address how this conception is used in approaches to new adults' practices and orientations.
A Brief History of Adulthood and Maturity
The word adulthood, denoting a stage of life, is a relatively recent addition to the English lexicon. According to the Oxford English Dictionary, usage of the noun was preceded by the adjective “adult,” which entered the vocabulary via the adoption of the French adulte, itself a sixteenth-century adaptation of the Latin adolescere, to grow up. “Adultness” is said to have come into usage in the mid-eighteenth century, and was superseded around 1870 by “adulthood.” The Shakespearean scholar Charles Cowden Clarke (1787–1877) is credited with using the term for the first time in a literary work. Writing about Shakespeare's Twelfth Night, he noted that the play “was written in the full vigour and adulthood of his [Shakespeare's] conformation” (OED 1989: 178–180).
Some time passed, however, before the social meaning of adulthood was to gain normative efficacy. Preindustrial Western cultures did not know adulthood as a defined social category: “You were a man or a woman if you weren't a child” (Merser 1987: 52). In the United States the term came into circulation after the Civil War and did not reach prominence until the early twentieth century. Winthrop Jordan (1978) stresses that this was linked to the increasing fashionableness of the notion of psychological maturity, which at that time began to develop into a metaphor for adult status. Jordan identifies as crucial to the emergence of the mature individual qua adult the transformation of Calvinist predestinarianism into a theology that emphasized individual effort as the means to salvation: “Only when the individual's own struggles were given far greater weight in the process of conversion would there be room for a process of reaching psychological maturity” (1978: 190). So, the emergence of adulthood is inextricably linked to processes of individualization, that is, individuals' gradual liberation from the determinants of birth and religious conformity, and their simultaneous charging with an ever-increasing self-responsibility for all aspects of their lives.
Toward the end of the nineteenth and the beginning of the twentieth century adulthood became the default position: a life stage situated between adolescence and old age. G. Stanley Hall's (1904) work on adolescence was pivotal in this regard. Hall's thought was influenced by post-Darwinian evolutionary biology. His work was an important precursor to developmental psychology, which, particularly in its early to mid-twentieth-century form, set about segmenting the life course into discrete and well-defined units. It followed that adolescence, which was ever more perceived and treated as a period of inner turmoil, came to denote a preparatory life stage to adulthood, now understood as its developmental goal.
Earlier, in preindustrial Europe for example, children took on adult responsibilities at a young age by today's standards. For some sectors of society at least, participation in productive work tended to extend across almost all of the life span. Furthermore, the combination of early family formation, short life expectancy, and high fertility rates meant that parenthood too was a lifelong endeavor for most. As Hareven (1978: 205) puts it, “demographic, social and cultural factors combined to produce only minimal differentiation in the stages of life.” Moreover, the separation of young people from the world of production through universal education, while exclusion from work was also replicated at the other end of the life course, played an important role in the emergence of adulthood as a separately conceived life phase (Pilcher 1995).
Adulthood emerged in public consciousness and entered the cultural vocabulary of everyday life as the achievable (and indeed desirable) end to adolescent immaturity during the Second World War. In the U.S. a fascination with being grown up emerged in popular culture. Reader's Digest, McCall's, and Vital Speeches of the Day were some of the publications with a wide readership that concerned themselves with what it meant to be adult. A 1952 issue of Reader's Digest, for example, invited young readers to complete a quiz in order to find out whether or not they were indeed grown up (Jordan 1978: 197). So, since its entry into the vernacular during the Civil War, adulthood had come to signify something solid to aim for, a life stage that held the promise of fulfilled wishes and achieved aspirations. Accordingly, a number of words, phrases, and practices associated with adulthood as social status began to settle and eventually became taken for granted and commonplace. Directives like “Don't be childish!” and “Grow up!” and turns of phrase such as being “set in your ways” or having “settled down,” are linguistic devices associated with adulthood. They are also figures of speech that enact social asymmetries and put adult “human beings” in a more powerful position vis-à-vis those who, like children, are perceived and treated as “human becomings” (Qvortrup 1994).
“Maturity” acts as a central metaphor encompassing normative achievements and attributes of adulthood. Although the term is most closely associated with biological development, maturity tends to be used to describe individuals' social and psychological competencies and dispositions. While being mature does not necessarily make a person an adult in the eyes of others (a child may be “mature for her age,” just as an adult may be deemed immature), when linked to adulthood, maturity denotes an end state to biological, psychological, and social development. The notion of social maturity adds an extra dimension. It takes as its starting point the premise that adulthood is constituted not so much by the significance individuals attribute to their own attitudes and actions, but by the kinds of social validation these attract. Just as the interpretation of biological and psychological maturity is culturally specific, as Margaret Mead's classic work Coming of Age in Samoa (1928) has shown, maturity is subject to socially constructed and acknowledged forms of meaning. Its plural meanings (biological, psychological, and social) are, for example, institutionalized in law. To appropriate the thinking behind James and Prout's (1997) social constructivist stance on childhood: the maturity of adults is a biological fact of life, but the ways in which this maturity is understood and made meaningful are a fact of culture.1
Notions of maturity hold an important place in the self-understanding of entire societies that share the liberal European tradition. The obvious example here is Immanuel Kant's (1724–1804) statement, “Enlightenment is humankind's emergence from self-incurred immaturity” (1975 [1784]). In his critical analysis of this text, Michel Foucault spells out the synonymy between history and individual development. He maintains that Kant defines the historical process “as humanity's passage to its adult status,” to “maturity” (1994: 308–9, 318). Similarly, historian Norman Davies comments, “Europeans reached the ‘age of discretion’…with medieval Christendom seen as the parent and Europe's secular culture as a growing child conceived in the Renaissance” (Davies 1996: 596). Common perspectives of human development from a state of childlike dependence to adult independence parallel our understanding of modernization as a process of emancipation from dogma, tradition, and authority. This direct link between historical process and individual maturation has consequences for the social-scientific appraisal and treatment of young people to this day. The clearest case, again, is Hall's early interpretation of adolescence, where the individual's development was said to recapitulate the historical maturation of the human species as a whole. Along with a new emphasis on personal and social development, certain practices emerged as symbolic and constitutive of adulthood.
Adult Practices
Picture this: a man and a woman in their mid-twenties. The woman holds a baby in her arms; a small child clings to the man's hand. The woman wears an apron, the man his work overalls. A “Sold” sign perches on the fence that surrounds the freshly painted house. A generously sized car sits in the driveway. No one could ever mistake the man and woman in this romanticized picture for adolescents, and few would be tempted to suggest that they were not adults. Many would, as if by reflex, assume the man to be husband to the woman and father to the child. But something about this image jars against the present. Just like the choice of frame for a painting or a photo, so the right time frame too helps integrate representation and reception. With this in mind, I suggest that no period in the history of Western societies has been more conducive to the institutionalization of a particular model of adulthood (of which the above, romanticized image is one possible representation) than the era historian Eric Hobsbawm (1995) has called the “Golden Age,” namely the time between the end of the Second World War and the oil crises of the early 1970s. No period has provided more favorable conditions for this model to become lived experience for a majority; no period has shown a more faultless synthesis of ideal and reality. Following Lee (2001), I call this synthetic construct “standard adulthood.”
After the Second World War the industrialized economies experienced unprecedented affluence and stability. The period from about 1945 to the early 1970s saw a concerted effort by business, government, and unions to prevent a recurrence of the Depression, the harrowing experience of which still haunted decision makers. Although more wealthy nations had their own macroeconomic agenda, public spending, full employment, and universal social security provisions were given priority to ensure internal demand and hence economic expansion. The then-prevailing mode of management and organization, what came to be known as Fordism, has since come to denote more than that: it signifies a once-prevalent “total way of life” that congealed around goals of long-term stability and economic growth (Harvey 1989: 135). Typically, businesses valued employee loyalty, which was generally rewarded with promotions in hierarchically constituted organizations. For employees and families this meant that there were plannable career paths with predictable milestones on the way, and a known destination: retirement on guaranteed government pensions. In the world of work the accumulation of experience with age was viewed as a valuable asset and was seen to increase, rather than inhibit, job security (Lee 2001: ii-13). According to one sociologist's interpretation of the time—characteristically exaggerated for illustrative purposes—these economic and work-related aspects alone, “created a society in which people's lives were as highly standardized as the sheet steel from which the cars were welded together” (Beck 2000: 68).
These social conditions corresponded to a value system that remained unchallenged in its normative validity until the rising discontent of the 1960s. Open same-sex relationships were extremely risqué and hence rare, and same-sex parenthood (as opposed to guardianship) was unimaginable. The heterosexual nuclear family prevailed as the ideal. It is during this time that early marriage and family formation came to be lived experience for many adults.2 Add to this the opportunities provided by the labor market, and a picture emerges that one commentator draws with clarity:
[O]nce ‘adult’ and employed, one could expect to stay ‘the same’ for the rest of one's life in a range of ways; one's identity was stabilized by sharing the work environment with more or less the same people throughout one's working life; the geographical area one lived in would remain the same since the organization one belonged to had set down firm roots in that area; and, even if one were dissatisfied with one's job, one would not have to seek a position with another organization (in another place with different people) because time and effort would bring the reward of career progression. (Lee 2001: 12–13)
Flexibility—first a buzzword in the New Capitalism (Sennett 1998, 2006) and now a taken-for-granted imperative in all social relations—was as yet a far-off reality. Becoming adult was a matter of following a life course that resembled a veritable march through the institutions of marriage, parenthood, and work. By today's standards these objective markers of adulthood were relatively fixed, achievable, and supported by an overarching value consensus. There was a high degree of fit between norms and social practice. Sharply delineated structures of opportunity rested on culturally and socially reproduced normative foundations that were, for a time, rarely questioned. With full-time long-term work within reach for a majority, and with early marriage and family formation so common, what being grown up meant was clear. The fulfillment of the classic markers of adulthood (family, stable relationships, work, and independent living) brought in its wake the social recognition necessary for adult status to become a meaningful achievement. The experience of affluence and stability after the Second World War thus added its share of securities to the vision of standard adulthood, a now crystallized social representation.
Not all was well in the Golden Age, however. For one, growing up as a member of the postwar generation in the West was to live a contradiction. The Cold War meant that the new reality of increased chances for social mobility and relative affluence, and the belief in continuing economic and technological advance, was checked by the knowledge that the possibility of total annihilation was just as real. For example, the Cuban Missile Crisis of 1962 served as a stark reminder of tragic possibilities.3 The lived contradiction of threat and opportunity underpinned one of the so-called “baby boomer” generation's defining mottos: “We're not here for a long time, we're here for a good time” (Mackay 1997: 62). As we shall see in chapter 4, this attitude marks an ideological transformation in the meaning of youth that was to reverberate decades into the future and that significantly altered the meaning of adulthood.
My schematic equation—economic stability plus an explicitly sanctioned normative consensus equals a stable adult identity—is not intended to be positive nostalgia.4 After all, standard adulthood was highly gendered in an era when the labor market overwhelmingly favored men as breadwinners. It would also be a gross historical misrepresentation were this image to be generalized to include marginalized groups. The kind of stability and predictability of life suggested by this model of adulthood is based primarily upon the experiences of white, heterosexual, middle-class males; on experiences, that is, that were lived in mainstream families and reproduced in mainstream culture, whatever the extant inequalities. The crux of the matter is this: the real differences did not diminish standard adulthood's normative status as the ultimate benchmark for adult maturity. Our contemporary associations of adulthood with stability arose from that generation's experiences and expectations.
Today standard adulthood as a norm remains robust, though it may be increasingly counterfactual for many. It is still associated with the ideals of stable relationships, stable work and income, a family of one's own, and independent living (Furstenberg et al. 2003; 2004). Framed in the language of a specific kind of maturity, standard adulthood promises greater self-understanding and the self-confidence that comes with the accumulation of social competencies. In these terms, settling down is not to be shunned. When the experience of opportunity, possibility, and stability is passed from one generation to the next and is focused in a notion such as adulthood, it stands to reason that this cultural idea should become a powerful ideal.
Classic Markers of Adulthood
The achievement of adult status has to do with “sets of practical accomplishments, and repertoires of behaviour” (Pilcher 1995: 86). This is particularly necessary in modern societies due to the absence of all-encompassing, firmly instituted rites of passage to adulthood. Thus there are various signposts that serve to identify and acknowledge individuals as adults, such as age, independent living, stable relationships, parenthood, stable employment, and the right to vote, to name a few. The descriptors of adulthood discussed below are limited to those objective markers of adulthood that have long-standing salience as achievements that are deeply embedded in dynamics of social recognition. As ideal-typical yardsticks for commonsense and social scientific judgments regarding individuals' status, these classic markers are the tangibles of standard adulthood.
Marriage with its ritualistic inauguration is one such instance. It is ingrained in the social imaginary and as such most closely approximates a transition ritual from adolescence to adulthood. Through marriage people enter into a union underwritten by a tacit understanding that responsibility and commitment, central notions in the cultural vocabulary of adulthood, are vital ingredients for its success. The institution of marriage and adult status are linked through the symbolism of the wedding ring. This badge of membership in the world of adults can be a sign of integrity; it can signify a shared fate; it can spell “off limits” as well as “discretion assured.” Above all, it symbolizes an act of commitment, its diminishing chances of survival notwithstanding.5 Marriage evinces the overcoming of reputed youthful self-absorption and hedonism. In everyday life it connotes the achievement of adulthood anchored in commitment and responsibility to someone (spouse) and something (a stable relationship).
As Eisenstadt (1971: 30) maintains, adult status “coincides with the transition period from the family of orientation to that of procreation, as it is through this transition that the definite change of age roles, from receiver to transmitter of cultural tradition, from child to parent, is effected.” The social validation attained through parenthood is palpable in everyday interaction. Outings with children often involve conversations with strangers, previously perhaps a rarity. In the supermarket, at the bus stop, in the park; there always appears to be someone willing to share their experiences, wanting a peek at your baby, encouraging or reprimanding your particular style of child rearing, commenting on the difficulties of work/life balance. To paraphrase a respondent (with sociological training) in my sample who had recently been “catapulted” into twin fatherhood, this is “social integration at its most intimate.” Particularly in the post-Second World War era, adulthood and family life were inextricably bound together in the social imagination. As Furstenberg et al. (2004: 35) put it with reference to the United States: “By the 1950s and 1960s, most Americans viewed family roles and adult responsibilities as nearly synonymous. In that era, most women married before they were 21 and had at least one child before they were 23. For men, having the means to marry and support a family was the defining characteristic of adulthood, while for women, merely getting married and becoming a mother conferred adult status.”
Many of us remember the question “what do you want to be when you grow up?” when “be” really meant “do.” Work—showing that you are capable of “paying your own way,” “pulling your weight,” contributing to the family and to society independently of state or parents—is another commonsense marker of adulthood. Stock phrases such as “wait 'til you're out in the real world” and “welcome to the real world” are historically framed. Since the completion of the process of differentiation that assigned children their place in school and adults their place in work (Gillis 1981; Mitterauer 1992; Perrot 1997), adulthood is also partly defined by independence gained through participation in the “work society” (Offe 1984).
Independence from parents is most explicitly achieved with the realization of independent living. To this end, leaving home has long been an integral part of one's identification as an autonomous individual, because “for much of the twentieth century, home-leaving was the starting point for a range of processes that signaled the transition from youth to adulthood. Most young people left home to marry, complete their education, serve in the military, or to work. With those changes came parenthood and economic independence” (Pullum et al. 2002: 555). In Australia home ownership is part of the national imaginary; it is a dream to be made reality through hard work and frugality. A stable relationship, parenthood, and work can thus be seen as finding their culmination in the family home as the meaning-giving reference point. The key to the home—just like the key to the parental home some receive on their twenty-first birthday in Anglophone societies—also opens the door to the world of adults.
These classic markers of adulthood provide the social frame for standard adulthood, a model that not only approximates many contemporary adults' lives, but that is the normative model par excellence. Although these are by no means all the characteristics that are attributable to adulthood, they are central cultural typifications and as such impact on public opinion just as much as on social research.
Adulthood and Age
The social constitution of adulthood can be further clarified by considering the rift between social practice and biological factors. In the course of Western history the onset of puberty has been slowly but steadily occurring at ever-younger ages (Mitterauer 1992). This does not mean, however, that Western societies, in contrast to some other cultures, accept the attained physical ability to procreate as marking the transition to adulthood. Rather, it is the social actualization of a physiological capacity through parenthood that has, historically speaking, marked this transition. To make matters more intricate still, the timing of such an event, measured against culturally specific and more or less institutionalized age norms, plays a pivotal role. Thus, in the absence of explicit rites of passage, one immediate problem in contemporary society is the lack of empirical determinacy as to when adulthood begins. This is paradoxical insofar as the life course is to a significant degree rationalized along age lines (Buchmann 1989; Settersten 2003).6 The age-structured pathway through primary and secondary education is an obvious example. Yet, the search for a definitive point at which adulthood is formally marked as beginning is futile.
Age legislation is a case in point. The law adheres to a pluralist conception of maturity denoting various competencies that are distributed among a range of ages. Thus, entry into adulthood is conceived in extraordinarily fragmented terms. For instance, depending on the age difference between partners, sexual preferences, and state legislation, the age of consent ranges from 14 to 18 in the United States and from 10 to 18 in Australia. The age of criminal responsibility in the U.S. is as low as 7 depending on state legislation (not all states specify minimum ages) (CNIJ 1997). In Australia it is 10, while in other countries it is 14 (Austria, Germany), 16 (Japan, Spain) or 18 years of age (Belgium, Luxembourg) (Urbas 2000; AIC 2003).7 A variety of other acts are deemed legal at different ages. In Australia, movies rated “M” for mature may be viewed from age 15, the same age at which individuals are free to leave school of their own accord. Cinemagoers have to purchase full-price tickets from age 16, the age at which people can opt to get tattooed and are allowed to purchase cigarettes. At age 18 individuals are permitted to vote, carry firearms, get married, make medical decisions, and so forth (Sunday Age 1992; Urbas 2000; DHA 2003). This illustrates that entry to adulthood, as far as legislation is concerned, occurs on a continuum along which rights and obligations are incrementally attributed. Even if we were to speak of a legal adulthood that comes with the attainment of full legal rights and obligations at age 18, 19, or 21, this does not pertain to noncitizens regardless of age (e.g., permanent residents, prisoners, refugees, and asylum seekers). And although individuals are equal before the law in principle, in fact they are neither guaranteed equal access to the law, nor protected from structural exclusion.
These official ambiguities are in tension with “informal age norms” (Settersten 2003) that function as cultural reference points to behavior and status. The relationship between age and the classic markers of adulthood is an example, for the latter depend for normative effectiveness on when in a person's life they are achieved. Stable relationships or parenthood, work or independent living attained too soon may be interpreted as signs of precociousness or deviance rather than adult competence. Likewise, late attainment may entail physical risks (childbirth); it may be increasingly difficult (work), or deemed behind schedule (marriage, independent living). Clearly, what is regarded as late or early is historically and culturally contingent. We may also think about the significance of the twenty-first birthday in some societies, or the confidence with which individuals remark on how young or old somebody looks “for their age” without the slightest need for expert opinion. This is not to privilege informal over formal age norms on points of accuracy, however. Ageism can be institutionalized both in everyday interaction as well as in the official sense. Whether this pertains to discrimination against elderly individuals, or to disrespect of those who simply appear to be too young to be given credence for full adult status, age is a powerful ascriptive force in contemporary society. In fact, age norms are an exception to Parsons' (1951) rule that with modernity premodern ascription gives way to achievement, that status is now exclusively a matter of individual action rather than predetermined parameters. To the contrary, informal as well as formal age norms are social facts that are culturally reproduced and shored up. They facilitate and inhibit social participation. They are markers of inclusion just as much as they are points of discrimination.
We can see, then, that the practical, everyday taken-for-grantedness of adulthood is at odds with its conceptual indeterminacy. Neither official age grading nor the attribution of rights and obligations; neither biological characteristics nor psychological traits; neither formal nor informal age norms; neither fixed roles nor rites of passage can be drawn upon to delineate and therefore define adulthood. All these aspects are in silent tension with one another, contradictory, imperfectly integrated. Yet, in the social imagination adulthood remains the central stage of the biography—that which childhood moves toward and which old age has left behind.
Prolonged Adolescence, Postponed Adulthood?
Media reports consistently focus on the practices of young people who according to cultural age norms ought to be grown up, but are described as at best deferring and at worst rejecting adulthood. These reports by and large attempt to come to grips with the social trends that underpin the alleged problems of contemporary young adulthood: prolonged stay in or episodic returns to the parental home; delayed or altogether forfeited marriage and family formation; drifting from temporary job to temporary job; or the repudiation of long-term aspirations in favor of short-term goals and experimental living. Some selected headlines from newspapers and magazines exemplify this discourse. Here, those who ask “Why Today's Teenagers are Growing up Early” (Sydney Morning Herald 2001a) are countered by a majority typically expressing sentiments along these lines: “Now Wait til 35 for Coming of Age” (Australian 2001), “‘Adults’ Fail the Age Test” (Herald Sun 2004), “Kids Who Refuse to Grow Up” (Herald Sun 2003), and “Forever Young Adultescents Won't Grow Up” (MX Australia 2004). Articles of this kind are mostly based on the claims of market researchers who believe that the “fundamental philosophy [of people in their twenties and beyond] is a deferment of any sort of commitment…Underpinning that is a sense of the now, and little sense of the future” (Australian 2004). In keeping with their arguments, and to enable them to target new consumer demographics, marketing professionals offer a number of labels describing individuals who are said to be averse to growing up. These descriptors are readily taken up in the media where there is talk about the rise of “adultescents,” “kidults,” and “twixters” in the U.S. and Australia, “boomerang kids” in Canada, Nesthocker in Germany, mammone in Italy, and KIPPERS (Kids In Parents' Pockets Eroding Retirement Savings) in the U.K. (Time 2005a; Time 2005b).8
Social scientists have developed their own concepts to accommodate essentially the same view. Indeed, media attention to young people's deferral or rejection of adulthood can draw on expert advice with a long history. Although the various approaches differ in nuance, for analytic clarity I subsume them under the delayed adulthood thesis. As intimated above, twentieth-century North American notions of adolescence as a period of “structured irresponsibility” (Parsons 1942a) and as “identity crisis” (Erikson 1950) can be considered paradigmatic of Western culture's perceptions and treatment of adolescents up to the present. Taking the “storm and stress” view as a given, psychoanalyst Peter Blos (1941) coined the term “post-adolescence.” It designates a stage of life inhabited by individuals who have outgrown adolescence but have not yet reached adulthood. These are individuals who, according to Keniston (1970: 634), “far from seeking the adult prerogative of their parents…vehemently demand a virtually indefinite prolongation of their nonadult state.” Erikson's (1968) “prolonged adolescence” neatly encapsulates this idea—a vision that continues to have currency today. What for marketers and journalists are twixters, adultescents, and kidults, for social scientists are lives led in a manner analogous to a particular image of adolescence: a time of irresponsibility where few decisions have to be made, and the capacity to reconcile “work and love” has not yet been completely attained. Consequently, contemporary trends are equated with a prolonged transition to and delayed entry into full adulthood (e.g., Côté 2000; Furstenberg et al. 2003, 2004; Arnett 2004; Settersten et al. 2005; Schwartz et al. 2005).
Sociologist Frank Furedi (2003: 5) is unequivocal in his condemnation of what he perceives to be an “infantilisation of contemporary society.” Taking umbrage at adults' alleged “present-day obsession with childish things”—gadgetry of all kinds, including soft toys and children's books—he asserts: “Hesitations about embracing adulthood reflect a diminished aspiration for independence, commitment and experimentation.” His agreement with social psychologist Stephen Richardson, who holds that “we do not reach maturity until the age of 35” (quoted in Furedi 2003: 5), also speaks volumes about social scientists' perspective on adolescence as a time of immaturity, and adulthood as the culmination of a kind of maturity that Peter Berger (1966: 69) described as that “state of mind that has settled down, come to terms with the status quo, given up the wilder dreams of adventure and fulfillment.” This kind of view, of which Furedi's essay is but one articulation, illustrates how late-nineteenth-century ideas about young people (and prevailing normative notions of masculinity and adulthood) are still deployed in social-scientific analyses of present trends. And it does so with particular eloquence because its advocates, in all their earnestness, appear entirely oblivious of this very fact. Hans Peter Duerr's (1985: 126) assertion that one of the tasks of the scientist is to “mount a defence against that which is strange” has some resonance here, if only as a possible unconscious motivation rather than full intention.
In fact, with this (unacknowledged but implied) model of adolescence in mind, proponents of the delayed adulthood thesis at times assert with some certainty when adolescence now ends and adulthood begins. Thus the U.S. National Academy of Sciences pegs the end of adolescence at 30 years of age (Danesi 2003: 104–5). The issue becomes positively confusing when, in a programmatic statement on professional confidentiality, members of the U.S. Society for Adolescent Medicine state, “[a]dolescents who are age 18 or older are adults” (Ford et al. 2004: 164). There is, in other words, no social-scientific consensus concerning the end of one period of life and the beginning of the next. There is agreement, however, that today young people take longer to reach full adulthood than was previously the case. Furstenberg (2000: 898) sums up the prevailing accord: “[T]he transition to adulthood extend[s] well into the third decade of life and is not completed by a substantial fraction of young people until their 30s.”
To accommodate the demographic changes that lie at the root of the allegedly protracted and delayed entry into adulthood in affluent societies, two conceptions of North American provenance have gained particular currency: Jeffrey J. Arnett's “emerging adulthood,” and “early adulthood,” a concept marshaled by a MacArthur Foundation research group on transitions to adulthood headed by Frank F. Furstenberg. Both perspectives are based on the belief that a new life stage separates adolescence from adulthood. Emerging adulthood pertains to individuals between 18 and approximately 25 years of age who, “[h]aving left the dependency of childhood and adolescence, and having not yet entered the enduring responsibilities that are normative in adulthood,” inhabit an in-between stage (Arnett 2000a: 469). Early adulthood describes a phase from the late teens to the late twenties or early thirties when “young people have not yet become fully adult because they are not ready or able to perform the full range of adult roles” (Furstenberg et al. 2003: i). According to Arnett (2000a: 471), terms such as “late adolescence,” “young adulthood,” and by implication “early adulthood,” should be avoided because emerging adults “do not see themselves as adolescents, but many of them also do not see themselves entirely as adults.” The main difference between these approaches, then, is one of nomenclature. In fact, the research agendas are eminently compatible in terms of both their subject areas and their conclusions. They are erudite elaborations of the notion that the transition to adulthood is increasingly extended, and that entry into full adulthood thus occurs later than was previously the case—that the twixters, the kidults, and adultescents are on the rise.
Taken in sum, Arnett and Furstenberg et al.'s research output makes important contributions to the study of young people's experiences and the shifting normative frame in which they unfold. Be it the chronicling of social transformations in the United States since the Second World War; be it the subjective perceptions of the transition to adulthood (Arnett 1997) and the perceptions and attitudes of young people concerning their futures (Arnett 2000b); be it North Americans' views about the timing of life events that for them connote the transition to adulthood (Furstenberg et al. 2003; 2004): the combined findings are significant contributions to our understanding of subjective views and manifestations of social changes. But the validity and utility of data hinge ultimately on how they contribute to concept building, and how new concepts are put to use. This is particularly important when we attempt to describe and understand the experiences of young people, not least because policies informed by this kind of research have a very real and direct impact on young people. For this reason alone the prevailing view needs to be expanded. A first step is to point out some inherent misconceptions, not least because they underpin much of the work done in the area of “youth transitions.”
Epistemological Fallacy I: The Subjectivization of Everything
The aforementioned researchers have made invaluable contributions to our understanding of young people's perceptions of adolescence and the transition to adulthood, and perhaps none more so than Arnett. However, something is amiss in his interpretation of the data. Arnett connects what is considered a highly individualized Western culture directly to the alleged personalization of life stages. To this end the following statement may be considered programmatic not only for Arnett's approach, but for much of the oeuvre: “The more individualistic a culture becomes, the more the transition to adulthood is individually rather than socially defined. It takes place subjectively, individually, internally, in an individual's sense of having reached a state of self-sufficiency, emotional self-reliance, and behavioral self-control” (Arnett and Taber 1994: 533, original emphasis). This assertion fits hand in glove with Côté's (2000: 31) claim that “adulthood is now more a psychological state than a social status.” In fact, Côté's approach to the changing nature of adulthood is instructive here, and it is worth addressing, not least because of his recent collaboration with Arnett (Schwartz et al. 2005)—a quasi-natural affiliation considering their respective approaches. Central to Côté's view on identity formation in emerging adulthood is what he calls “two developmental routes in the individualization process” (Schwartz et al. 2005). As he previously elaborated in Arrested Adulthood (2000), Côté distinguishes between passive “default individualization” and active “developmental individualization” in his analysis of people's orientations concerning their life trajectories.9 Consumer-corporate interests are said to perpetuate and benefit from the default option; pop culture thrives on the illusory notion that individuality is a function of “selecting the right wardrobe or developing slight affectations in speech, behavior or appearance” (2000: 34).
An increasing number of adults are seen as taking “paths of least resistance” rather than acquiring “self-discipline, in order to develop advanced skills, aptitudes, and attitudes” (2000: 34). Drawing on his notion of “identity capital,” Côté suggests that individuals need to become successful investors in the identity market in order to reach their potentials despite the machinations of a media-driven pop culture.
While both Arnett and Côté have succeeded in highlighting young people's potential for agency, objections can be raised. Their focus on individual perception and individual agency/passivity psychologizes the meaning of adulthood. This renders the young people under scrutiny agents of their own fate to such an overdrawn extent that the systemic factors that influence their practices all but disappear from view, except to provide obligatory variables. Practitioners in the sociology of youth have shown that today young people are under great pressure to succeed at a time when structural adjustments have softened up the foundations on which they are to build their lives, and that they often do so believing themselves to be solely responsible for their successes and failures (Furlong and Cartmel 1997; McDonald 1999; Wyn and White 2000; Dwyer and Wyn 2001). This blindness of individuals to systemic conditions has been called “the epistemological fallacy of late modernity” (Furlong and Cartmel 1997). Precisely this fallacy is discursively reproduced and social-scientifically legitimated by the orthodox, highly individualistic approach to adulthood.
Epistemological Fallacy II: The Normative Lag
The prevailing pronouncements about young people's practices have one thing in common: they implicitly use the model of standard adulthood as their benchmark. Wagner and Hayes's (2005: 4) considerations are instructive in this regard: “Our present-day thinking is based on a succession of historically evolved mentalities; on mental edifices which previous generations have constructed, pulled down, renovated and extended. Past events are compressed in images and metaphors which determine our present thinking even if we are not always aware of them.” Standard adulthood, a commonsense life stage in the “thickly viscous form of the past” (Wagner and Hayes 2005: 4), remains conceptually fixed, unproblematic, and thus escapes articulation, let alone analysis. What is seen as worthy of analysis, however, is the failure to reach a taken-for-granted standard at a time of life when this is conventionally deemed most appropriate. In fact, research into the timing of the transition to adulthood has shown that most people still expect the classic markers of adulthood to be realized in a person's twenties (Arnett 1997; Du Bois-Reymond 1998; Furstenberg et al. 2003; 2004). The nonattainment of these markers by many people in their twenties and beyond is thus taken as a sign that their adolescent state is prolonged, that they in fact defer or reject adulthood for a time, only to emerge into the standard model of adulthood later. What is most significant for now is the fact that the current benchmark of adult behavior is anachronistic. Ideal types are useful instruments of sociological method; as is well known, Max Weber (1922) advocated their use.
But proponents of the delayed adulthood thesis not only refrain from using standard adulthood as an ideal type in order to gauge present or past deviations from it, but actually confuse a historically contingent model with contemporary social realities and continue to posit the ideal type as the normative telos to individual development. What is more, the outdated model is often held up as something to be striven for at a time when the realization of standard adulthood is for many not only impossible, but also hardly desirable (e.g., Du Bois-Reymond 1998; see also chapters 5 and 6 in this volume). Social scientists, journalists, and marketers, members of the previous generation as well as young adults themselves are thus frequently subject to a normative lag between the idea of standard adulthood and contemporary realities. In the long view of history such delays are commonly recognized: “Mentalities are at any one time the most sluggish components of historical change. They lag behind…and establish contradictions and rogue complications in historical development…In this way, they become the driving force behind new change” (Wagner and Hayes 2005: 3). In our specific context, analysis of this normative lag is perhaps nowhere of greater urgency than in those policy domains that deal with young people's transition from education to work. As Peter Dwyer and Johanna Wyn (2001: 78) assert:
Relying on our own past…establishes a predetermined expectation about what happens in the lives of the next generation. It takes for granted a linear model of development which assumes that young people progress through a pre-set series of separate stages in their lives which involve innate processes of maturation and normative forms of socialization within stable families and an age-based education system, leading at the proper time to a movement from dependence to independence, from school to work, from young people's status as adolescents to their eventual achievement of a stable and secure adulthood.
This illustrates the point that the normative lag also translates into a policy gap between the ideology of increased educational participation and the persistent uncertainties of outcomes for the post-1970 generation (Dwyer and Wyn 2001: 74)—a gap, that is, that takes the linearity of a previous generation's transition to adulthood as the evaluative and policy-forming benchmark by which young people's successes and failures are judged. Thus there is good reason to rethink our notions of adulthood. Maguire et al. (2001: 198) make an uncommon (and therefore all the more pertinent) point: “The idea of a ‘refusal of adulthood’ potentially carries within it the notion that there is a ‘normal’ version of adulthood which (some) young people are rejecting. There are significant dangers in this interpretation. First, that those who are ‘refusers’ are in some ways deviant or ‘other’ and secondly, that there is a fixity in adult status.” These critical remarks are exceptions to the rule. Indeed, the largely uncritical manner in which the prevalent perspective is employed by social scientists means that often it simply mirrors the sentiments expressed in the media by marketing and advertising specialists. It is a relationship worth some consideration.
Adulthood, Common Sense, and Sociology
When Theodor W. Adorno remarked on the interdependence of what he called “prescientific thinking” and sociology, he insisted that the former is to be taken seriously: “Unless prescientific interests or extra-scientific concepts are imported into every scientific sociological investigation, then scientific interests and concepts are entirely lacking as well” (2000: 126). Indeed, as Bauman (1990: 10) notes, there is a “special relationship between sociology and common sense.” The discipline relies on common sense knowledge as the starting point to analysis. With the help of disciplinary epistemology, or perhaps simply with trained intuition, sociologists defamiliarize the familiar, taken-for-granted assumptions of everyday life and build “second order typifications” (Schütz 1954), theoretical concepts that stand apart from common sense. They build concepts and theories out of the material they find, observe, and study. However, it is not methodologically desirable to privilege common sense above abstraction: “Some historians and sociologists still believe that they can do without conceptual tools altogether and rely exclusively on what they think is plain common sense. But ‘common sense’ consists of nothing more or less than the abstract concepts and models wrapped up in conventional ways of saying things; as a result, commonsense commentators simply deprive themselves of any possibility of a critical understanding of their own conceptual tools” (Todorov 2003: 7).
Giddens (1984) attempts to reconcile these perspectives. His “double hermeneutic” aims to address the fact that social scientific interpretations of everyday constructs are reinterpreted and reassimilated into lay knowledge. In so doing, Giddens more than intimates that the quest for second-order typifications never ends. Applying these understandings to our analysis, the sociological notion of an allegedly prevalent delayed adulthood is the second-order typification of lay knowledge about “young people who refuse to grow up.” As a second-order typification it has its conceptual origins in common sense, “that rich yet disorganized, non-systematic, often inarticulate and ineffable knowledge we use to conduct our daily business of life” (Bauman 1990: 8).
While sociology rarely concerns itself with an analysis of adulthood as a social category, adult behavior and adulthood as a life stage are implicit in all sociological analyses. From the minutiae of everyday life to the macro processes of globalization; from ethnomethodology to systems theory; from the sociology of knowledge to critical theory to the cultural turn; the actor—whether conceived as individual, as decentered subject, or as system—is an embodiment of adultness. Even when sociologists are explicitly concerned with childhood, adolescence, youth, or old age, adulthood is always present as a point of reference. As the object of the majority of sociological investigations, the adult represents the actor par excellence, as if, in Norbert Elias's (1978: 248) words, he or she “was never a child and seemingly came into the world as an adult.” Thus, adulthood is both undertheorized as a phase of life and taken for granted as a default category and heuristic concept that grounds all manner of analysis. It is as neglected by sociologists as it is ever-present and central to what they do.
The term “practical consciousness,” which Giddens (1984)—borrowing from Marx and modifying Schütz's concept of “the natural attitude”—has used in his theoretical work, is useful in this respect. Practical consciousness refers to that prereflexive, intersubjectively constructed stance toward the world that enables individuals to pursue daily life. It is the unarticulated background knowledge that reduces the complexity of everyday interactions, largely through their routinization. Elsewhere Giddens further emphasizes the existential centrality of practical consciousness as “the cognitive and emotive anchor of the feelings of ontological security characteristic of large segments of human activity in all cultures” (1991: 36). Practical consciousness thus refers to a shared repertoire of meanings that confers a measure of predictability on social life and furnishes actors with a stability of reference. Note that sociologists inhabit a practical consciousness beyond the one they share with others in their professional field: they are also lay participants in the everyday interactions of the lifeworld. “However hard they might try, sociologists are bound to remain on both sides of the experience they strive to interpret, inside and outside at the same time” (Bauman 1990: 10). They are both subject to and progenitors of commonsense assumptions and second-order typifications. Because of their vocation and the relationship between sociology and common sense, sociologists cannot strictly separate their professional from their everyday practical consciousness. There is therefore considerable overlap between the natural attitudes required in both terrains.
Sociologists are also adults. They have their own memories of childhood, which, like others, they may nostalgically reconstruct; they have their ideas as to what growing up means to them; and it is their own adulthood they embody in the present as the place from where coming of age—their own as well as others'—can be viewed from some distance. As such, adulthood is perhaps for most of us an unproblematic notion. It is not only central and marginal to the discipline, but it is also central and marginal to all adults: central to who they are, and marginal because it is an unobserved part of their identities. The overlap in sociologists' stance toward adulthood as an experiential fact of life and as a disciplinary given contributes to the paradoxical status of adulthood in sociology. Ultimately, it is this paradoxical quality of adulthood as a social representation, its simultaneous centrality and marginality in everyday life, media discourse, and social scientific perspectives that makes it such a rich area of social inquiry.