Changing Contours of Work - Stephen Sweet

Culture and Work


One of the core questions guiding the sociology of work concerns how much work should be performed and the amount of work that should be expected of individuals throughout their life course. Most classic theories of work embrace cultural perspectives that view labor, in and of itself, as a noble endeavor. Karl Marx (1964 [1844]), for example, argued that work is what distinguishes humans from other species, and he highlighted how it enables people to transform their environments to suit human interests. Sigmund Freud (1961 [1929]) argued that work is a socially accepted means by which humans can direct their sublimated sexual energies. As such, he saw work as a means of achieving satisfaction when fulfillment in other parts of life is lacking or is prohibited. Émile Durkheim (1964 [1895]) offered a different thesis, that work and the complex division of labor in society offered a means to create social cohesion. All these perspectives have in common the assumption that work has the potential to cement social bonds and advance the development of civilization. These perspectives suggest that the more work that people are able to perform, the more liberated their lives will be; they would evaluate the new economy in terms of its ability to enable people to work more. This proposition should not be accepted too quickly, however.

Is it a cultural universal that work should be praised and that commitment to work is an indicator of social health? Anthropological and historical studies suggest otherwise. In many cultures, work is defined as the means for day-to-day survival. Subsistence economies operate on the basis of cultural assumptions that work is primarily a means to an end, so that once individuals have enough food and shelter, labor is expected to cease. Such an orientation to work in today’s American culture would indicate a moral weakness and be perceived as a threat to social order. But from the point of view of many other cultures, our embrace of work could be considered pathological. If one can obtain enough to eat and gain sufficient shelter by working a few hours a day, so be it. Why should a hunter set out in search of game if the supply of food is adequate (Brody 2002; Sahlins 1972)?2 And even within geographic regions such as modern-day Western Europe, expectations regarding the age at which individuals are expected to embark on careers, or to retire, vary remarkably. These varied life course scripts result in some cultures expecting ten years (or more) of additional attachment to the labor force than others, with policies that match these expectations (Sweet 2009). When viewed through this lens, Mike’s loose attachment to work becomes more understandable and seems a lot less pathological. And note that Meg’s decision to exit the labor force at midlife to care for her children carries with it no stigma.

One important cultural question concerns why work plays such a central role in some societies but not in others. Part of the answer, according to Max Weber (1998 [1905]), is that the societies in the forefront of the Industrial Revolution had been swayed by changing religious doctrines. These religious beliefs, particularly those that underpinned the Protestant Reformation, created anxieties about one’s fate in the afterlife. In response, Western European and American culture advanced the value of the work ethic, a belief that work is not something people simply do—it is a God-given purpose. Devoting oneself to work and doing a good job were considered to be ways of demonstrating that a life of virtue reflects grace. And as members of these societies embraced the idea that work is “a calling,” they applied themselves to their jobs with greater vigor, creating wealth and affirming to themselves and others that God was looking favorably on their actions.

Although many now question Weber’s thesis that the Protestant Reformation was responsible for the emergence of capitalism, the centrality of the work ethic to the development of Western society is widely accepted. So deeply is it ingrained in contemporary American culture that nearly three-quarters of Americans report that they would continue to work, even if they had enough money to live as comfortably as they would like for the rest of their lives.3 Americans work to affirm to themselves and others that they are virtuous, moral individuals and deserving of respect (Shih 2004). Conversely, those who choose not to work, or workers like Mike who are unsuccessful in securing a job, are looked down upon and stigmatized. In American society, to be without work is to be socially suspect and unworthy of trust (Katz 1996; Liebow 1967).

The work ethic defines labor as a virtue, but it also has pathological dimensions. The cultural embrace of work may be akin to the flame that attracts the moth. It is telling that many who can afford to work less, and who have the opportunities to do so, choose not to (Hochschild 1997). Psychologists call these individuals “workaholics” (Machlowitz 1980), but as we discuss later in this book, many of those who work long hours do so because organizational cultures bestow rewards on those who live and breathe their jobs. The suspicion cast on those who do not hold jobs has created pressures to force work on those who get little benefit from it. Consider that welfare reform legislation, passed in the mid-1990s, requires even very poor mothers of young children to work in order to receive welfare assistance. This requirement defines mothering as “not work” (a concept we return to later) and ignores the reality that when these women do work, it is usually in low-wage, dead-end jobs.

Why would people work above and beyond their own economic needs? For some, work is a means to provide for their families, which in turn drives them to work incredibly hard, even in jobs that offer few intrinsic sources of satisfaction (Menges et al. 2017). But need is not the sole driving force. Thorstein Veblen (1994 [1899]) in The Theory of the Leisure Class observed that attitudes toward work are bound up with materialistic values held in American culture. Markers of status include luxury autos, large homes, and expensive clothing. All of these commodities are conspicuously consumed, put on display to be seen and admired, and set standards for others to follow. By the mid-twentieth century, the drive to purchase social status permeated American society, compelling workers to labor hard “to keep up with the Joneses” and their neighbors’ latest purchases (Riesman, Glazer, and Denney 2001 [1961]). Contemporary American workers engage in the same status game that emerged in the late nineteenth century, but with new commodities (e.g., iPhones, BMWs, and home theaters). Their competition now extends beyond their neighborhoods, as they are saturated with media images of success and have developed numerous ways to accumulate debt (home equity loans, student loans, and credit cards).4 The result, some have argued, is “affluenza,” the compulsion to purchase and spend beyond one’s means (de Graaf, Wann, and Naylor 2001). For some members of the new economy, work has become the means to manage spiraling debts incurred while striving to keep up with others who are spending beyond their means as well (Schor 1998).

Because culture also shapes the attitudes workers and employers have toward each other, it is a social force that can create (but also dismantle) opportunity divides. One means by which culture contributes to social inequality is through the construction of social divisions and group boundaries. Racial and gendered divisions, for example, are based on assumptions that different social groups possess different capabilities. Whether these differences were originally real is immaterial; as the early twentieth-century American sociologist W. I. Thomas noted, what people believe is real often becomes real in its consequences (Thomas and Thomas 1928). In turn, these beliefs contribute to the formation of self-fulfilling prophecies. As we discuss later in this book, cultural assumptions about gender and race shape social networks, influence access to resources, and funnel people into different lines of work.

Beyond setting up boundaries, culture extends into job management practices and the design of technologies. Consider, for example, the enduring legacy of scientific management, one of the bedrock managerial approaches of the old economy. Frederick Winslow Taylor introduced this managerial philosophy (also known as Taylorism) at the beginning of the twentieth century to increase the productivity of workers laboring in factories. He advocated redesigning work to wrest control from workers and place it in the hands of management. His Principles of Scientific Management (Taylor 1964 [1911]) argued for the separation of “thought” from “execution” to establish clear divisions between managers (whose job was to think and design) and workers (whose job was to carry out managers’ instructions). He used time and motion studies to decompose production jobs into the simplest component tasks to increase worker speed and accuracy. And managers’ jobs were redefined to absorb worker skills into the machines and organization and to keep the flow of knowledge going in one direction—from the shop floor into managers’ hands. The result was the creation of legions of deskilled jobs, the dissolution of many craft skills, and a decline in the individual worker’s ability to control the conditions and rewards of work (Braverman 1974; Noble 1979; Pietrykowski 1999). It also fostered distrust and hostility between workers and their bosses (Montgomery 1979).


Exhibit 1.7 The Film Modern Times Offered a Poignant Illustration of the Alienating Nature of Work in Factory Jobs in the Old Economy

Source: Max Munn Autrey/Margaret Chute/Getty Images.

Why did Taylor advocate this way of organizing work, given its obvious negative consequences for the quality of work life and its negative effects on labor–management relations? In part, it was a response to something real—the fact that workers often did not work as hard as they could. His experiences had taught him that they did not show up to work consistently, took long breaks, and worked at a more leisurely pace than owners desired. These behaviors reflected workers’ cultural values and their definition of what constituted a reasonable amount of labor. Likewise, Taylor’s interpretation of this behavior was culture bound. He interpreted workers’ behavior not as a rational, class-based resistance to employers, but as an irrational unwillingness to work to one’s full potential. Taylor, like many Americans of his time, was embracing a cultural denial that class divisions within the workplace existed. His solutions also reflected the culture in which he was living. He advocated a reorganization of the workplace based on scientific methods, something that resonated tremendously in a society where science had come to be seen as the solution to many human problems. And he depicted the worker as essentially unintelligent and easily manipulated; Taylor was fond of using an example involving a worker named Schmidt (whom he described as “oxlike”), whom he persuaded to adopt his new system through a combination of simpleminded arguments and limited incentives. This, too, was typical of the dominant American culture at that time; many Americans believed that members of the lower classes, immigrants, and others at the bottom of society were inferior in various ways (including intelligence) to the more successful members of society. Taylor’s ideas also reflected an abiding cultural belief in the correctness of capitalism, particularly the proposition that it is natural that some should be owners and others laborers, that the efforts of those at the top were more important and valuable, and that an extremely unequal distribution of the fruits of labor was not just defensible but actually desirable (Callahan 1962, Nelson 1980).

The legacy of managerial philosophies—in this case, scientific management—highlights how culture and social structure intersect. Managerial perspectives that embraced the proposition that workers are indolent and should not be trusted are directly responsible for the creation of many of the alienating, low-wage “McJobs” present in America today. These philosophies initiated the development and application of assembly lines, promoted the acceptance of the idea that some people should be paid to think and others to labor, and fostered divisions between “white-collar” and “blue-collar” jobs. This cultural orientation to work can explain the types of jobs that Tammy held at General Motors, as well as the rationale for eliminating those jobs and moving them elsewhere in the global economy.

These examples of how culture shaped workplaces in the past suggest interesting questions about culture’s role in carving out the contours of the new economy. Have cultural attitudes about the role of work changed, and if so, have workplaces changed along with them? How long are people working, and why do they work so much? Have Americans begun to abandon long-standing Taylorist cultural assumptions about the proper way to organize work, or do we continue to construct workplaces on the assumption that workers are lazy, ignorant, and not to be trusted? To what extent do perceived divisions between the members of society continue to deprive some people of access to opportunity? We address these concerns in the chapters that follow.
