Feeding the Crisis - Maggie Dickinson

CHAPTER TWO

Care and Abandonment in the Food Safety Net

The expansion of food assistance in the twenty-first century—what I call today’s growing food safety net—differs radically from the expansion of food assistance that took place in the 1970s, when the basic architecture of the modern food stamp program was put in place. The expansion of food assistance in the 1970s was one of the last major achievements of the War on Poverty. Spurred by the civil rights movement, the goal of the War on Poverty was to extend economic citizenship rights to all Americans, including African Americans who had been excluded from many of the New Deal welfare state programs established in the 1930s (Katznelson 2005, Quadagno 1996). The modern food stamp program emerged out of a deep-seated belief that hunger in a wealthy nation like the United States was intolerable and that the federal government had an obligation to make sure that no one starved.

In the late 1960s, Robert Kennedy embarked on a poverty tour to bring attention to pockets of severe poverty across the United States. In the Deep South and Appalachia, he and his team met listless children with clear signs of clinical malnutrition, including swollen bellies, wounds that would not heal, and stunted growth. Following in their footsteps, a team of doctors and nurses funded by the Field Foundation visited these same areas, documenting the prevalence of hunger and its terrible physical effects on children and others. They reported seeing children “suffering from hunger and disease, and directly or indirectly, . . . dying from them—which is exactly what ‘starvation’ means” (Robertson 1967). A separate investigation by the labor-backed Citizens Crusade Against Poverty identified 256 “hunger counties” across the United States. Attention to “the hunger issue” exploded when a 1968 CBS documentary titled Hunger in America brought these findings to the American people (Levenstein 1993).

Images of visibly malnourished children shocked many Americans. There was a sense of disbelief that such severe hunger could exist in what, at the time, was broadly considered an affluent society. Just a few years earlier Michael Harrington, in his unflinching exposé of poverty in a land of plenty, had written, “To be sure, the other America is not impoverished in the same sense as those poor nations where millions cling to hunger as a defense against starvation. This country has escaped such extremes” (Harrington 1962). And yet the images emerging from Appalachia and the Deep South in the late 1960s provided stark proof that this was not the case. Not everyone living in the United States had, in fact, escaped such extremes.

The political response to these findings and the public concern they generated was swift and dramatic, largely because they emerged in a moment of high social movement activity. Poverty politics in the late 1960s was dominated by social movements like the National Welfare Rights Organization (NWRO) that were engaged in a national campaign to expand access to welfare and to institute a guaranteed income for all Americans (Nadasen 2004). By the late 1960s, leaders in the civil rights movement had shifted focus to winning economic rights through multiracial organizing efforts like the Poor People’s Campaign. In this context, groups across the political spectrum were galvanized to push for political action on the hunger issue, which seemed like an intolerable manifestation of poverty. Civil rights activists who had long been involved in antipoverty struggles and welfare rights campaigns demanded changes to the food stamp program so that families with little or no cash could more easily access food assistance (Robertson 1967, Kornbluh 2015). Grassroots activists like the Black Panther Party took matters into their own hands by organizing programs of free breakfast for children and free groceries, building community support for their revolutionary political agenda and embarrassing the federal government into taking action (Nelson 2012, Patel 2012, 3).1 Journalistic exposés on hunger proliferated and Congress quickly initiated its own investigations into the extent of the problem in the United States. Liberal advocacy groups formed and pushed Congress to take action. Hunger quickly emerged as an unavoidable political issue. Both Richard Nixon and Hubert Humphrey campaigned on promises to end hunger in the United States in the 1968 presidential election (Levenstein 1993, 150).

Under tremendous public pressure to address hunger and poverty, the Nixon administration and Congress moved swiftly to expand access to food assistance. By 1974, the food stamp program was available in all fifty states and the school lunch program had also been expanded (Levenstein 1993). The Food Stamp Act of 1977 finally eliminated the food stamp purchase requirement, which meant poor families no longer needed to have cash up front to purchase food stamps (Poppendieck 1998). The effect of these policy changes on hunger in the United States was dramatic. Enrollment in the food stamp program jumped from three million in 1969 to eighteen million in 1976 (United States Department of Agriculture 2018). Doctors who had documented the effects of severe hunger in the late 1960s revisited those same communities a decade later and found that clinical malnutrition had been virtually wiped out in the United States. The swollen bellies that were so prevalent a decade before were nowhere to be found. Researchers reported, “Where visitors ten years ago could quickly see large numbers of stunted, apathetic children with swollen stomachs and the dull eyes and poorly healing wounds characteristic of malnutrition—such children are not to be seen in such numbers. . . . Many poor people now have food” (Kotz 1979). These researchers were clear in their assessment that it was public policy—the expansion of food stamps, school lunch, and school breakfast—that made the difference.

By the mid-1970s food stamps were a universal entitlement available to all citizens as long as they met the income requirements, regardless of whether they worked or not. As such, food stamps were the closest thing we have ever had to a universal floor under wages in the United States. The degree to which the public and policy makers across the political spectrum were galvanized to take action in the face of visible malnutrition speaks to the success of social movements in pushing the state to guarantee a basic level of economic security. The food safety net of the 1970s was explicitly intended to protect citizens against the worst ravages and hardships associated with poverty. However, expanding food assistance succeeded precisely because it remained a partial, supplemental solution. Even with food assistance, poor people still need to work to pay for shelter, clothing, and other necessities. More transformative demands like the NWRO’s campaign for a guaranteed income failed precisely because they challenged the idea that poor people must go to work to meet their basic needs.

The expansion of the food stamp program in the 1970s was rooted in demands for care that were enacted through the welfare state. Social movements, advocates, and citizens offered a clear response to the question, “how should care happen in an inclusive democracy?” (Tronto 2013, 10). To be a citizen in a land of plenty meant to be able to have enough food to sustain oneself, and a broad range of Americans insisted that it was the role of the state to ensure that everyone had access to sufficient food. The modern food stamp program was a tremendous policy and public health achievement, virtually eliminating clinical malnutrition in the United States. But the program’s success in eliminating severe hunger was quickly overshadowed by a political backlash against the War on Poverty and the gains of the civil rights movement more generally (Quadagno 1996, Neubeck and Cazenave 2001). The paradoxes of the growing food safety net in the twenty-first century are rooted in this backlash.

ORGANIZED ABANDONMENT

The tone of the debate around food assistance changed dramatically by the early 1980s. In the 1970s, social movements pushing to expand social and economic rights for poor people began to wane for a number of reasons, including direct infiltration and disruption of movements by the police and the FBI (Blackstock 1975). At the same time, economic elites in the US began to organize politically through new institutions like the Business Roundtable. Their goal was to “restructure state agencies that had been designed under the enormous emergency of the Great Depression (the New Deal) and its aftermath (loosely, the Great Society) to promote the general welfare” (Gilmore 2009), including welfare state programs like cash assistance to poor families, food stamps, Medicaid, and social security. By 1980, the ideas promoted by these business elites had gained national prominence in the figure of Ronald Reagan.

Reagan perfected the use of coded racial language meant to conjure an image of welfare users as Black—despite the fact that the majority of both welfare and food stamp recipients have always been white (Haney-Lopez 2014). He used thinly veiled racist terms like “the welfare queen” to build support for a political agenda aimed at dismantling and undermining state agencies that business leaders saw as problematic, including social protections for poor and working-class people. He defined care as the problem, asserting that the state was providing too much care to the poor at the expense of aggrieved taxpayers (Edsall and Edsall 1992). Drawing on thinkers like Charles Murray, Reagan reframed welfare programs as harmful to poor people, arguing that assistance to the poor encouraged a culture of dependency (Murray 1984). He valorized work as the primary path to independence and voluntarism as the solution to poverty. By repeatedly portraying welfare dependents as living in luxury and contrasting them with economically struggling taxpayers, Reagan successfully vilified welfare state programs among many white voters. He successfully pushed through cuts to welfare state spending in the early 1980s, including cuts to the food stamp program, in the midst of a deep recession (Poppendieck 2014, 263).

Alongside his attacks on welfare state programs like food stamps, Reagan argued that the private sector and voluntary organizations were better equipped to address many of the nation’s ills (Germani 1981). He popularized the idea that care should come from the local community, not the state. Efforts to reinvigorate the American tradition of voluntarism emerged alongside growing food insecurity due to welfare program cuts and a deep recession. People in struggling communities, hit by factory shutdowns and cuts to social services, scrambled to respond to the rising needs of newly unemployed and insecure workers (Walley 2010, Pappas 1989). The efforts of faith-based organizations and community groups to respond to the early effects of deindustrialization gave rise to the modern food bank movement in the early 1980s and established the two-pronged approach to hunger we see today in the United States (Poppendieck 1998). Emergency food providers proliferated to meet the needs of hungry families, filling in the gaps left by cuts to public programs like food stamps.

One of the key welfare policy innovations Reagan championed was adding work requirements for public benefits like food stamps and cash assistance. Though some modest pilot welfare-to-work programs were put in place in the 1980s, these efforts were limited. The first meaningful work requirements for food stamps were instituted in 1996 as part of the federal welfare reforms passed under Bill Clinton’s administration.

Clinton’s campaign pledge to “end welfare as we know it” set him apart from other Democratic leaders. It signaled a willingness to abandon a Keynesian approach to the welfare state that would protect programs that redistribute resources to people when the market fails them. Instead, Clinton’s ascendancy marked the consolidation of a brand of market triumphalism, often referred to as neoliberalism, that saw free, unregulated markets as the solution to a whole range of social issues—including poverty and hunger (Maskovsky and Goode 2001). From a policy perspective, Clinton defined the solution to poverty as participating in the market as a worker, particularly for poor women with children who were the primary beneficiaries of cash and food assistance. Citizenship itself was redefined as both the right and the obligation to participate in markets.

Welfare reforms in the Clinton era directly undermined the gendered exemption from work for poor mothers caring for young children. Millions of poor women left the welfare rolls after the passage of welfare reform in the late 1990s, and when they did, they often lost their food stamp benefits. The diversionary tactics local welfare offices employed to discourage families from applying for cash assistance after the passage of welfare reform were also employed to discourage them from applying for food stamps (Davis 2002, Independent Budget Office 2008). Food stamp rolls plummeted after 1996, falling from twenty-five million in 1995 to just below seventeen million in 2000 (Wolkwitz 2007). The steep drop in the food stamp rolls set the stage for the revival of food assistance as a strategic policy intervention designed to meet the needs of the working poor.

SELECTIVE CARE

If the face of a starving child motivated the expansion of food stamps in the 1970s, it was the face of the working mom who could not afford healthy food that inspired the expansion of food assistance at the turn of the twenty-first century. In the wake of welfare reform, women entered the workforce but found they often could not move out of poverty, despite working full time (Lein 2007, Newman 2001). The working poor emerged as a visible and sympathetic population, fulfilling their obligation to work but still struggling with incomes that kept them below the poverty line. The expansion of food assistance that began in 2001 under the George Bush administration was largely a response to the needs of poor mothers who could not make ends meet even though they were working. Changes to the food stamp program that made it easier for working families to access the program were part of a broader push to increase work supports, such as the earned income tax credit (EITC) for low-wage workers. Business leaders were largely supportive of increased wage subsidies because these programs made it easier for employers to keep wages low. Unlike the expansions of food assistance in the 1970s, which were in many ways a response to social movements and their demands for wealth redistribution, the expansion of food assistance in the twenty-first century was essentially a giveaway to low-wage employers. While many poor working families receive a boost to their income from SNAP and the EITC, those who do not fit the model of the working poor can be excluded from help.

During the George Bush administration, the expansion of federal food programs took place quietly. Reimagining food stamps as a work support was a technocratic response to long-term economic trends in the labor market that began in the 1980s. Throughout this period, manufacturing jobs, which provided stable work and middle-class wages for large numbers of working-class families, disappeared. These jobs were largely replaced by service-sector jobs that are lower paid and less secure. The kinds of jobs available to working-class households include care work of various kinds—cooking; looking after children, the ill, and the elderly; and housework.

Broad changes in the labor market also had an enormous impact on the ways that Americans eat. Changes in the American diet over the past several decades have spurred concern over public health, obesity, and diet-related disease, which became central issues shaping food policy in the Obama era. The commodification of household labor has created a huge pool of cheap “help” that even low-income families can access. These cheap forms of help range from fast-food restaurants and processed, prepared foods in the grocery store to the proliferation of daycare centers and home health aides. These growing industries are the mainstay of employment growth in the United States, producing thousands of new, poorly paid jobs for the American working class. As Susan Thistle writes, “The conversion of women’s domestic tasks into work done for pay has been the area of greatest job growth over the past thirty years” (Thistle 2006, 102). She argues that almost two-fifths of the increase in jobs since 1970 was due to market takeover of household and caring tasks. Some of the most remarkable growth has come in the food service industry, as Americans of all income levels increasingly eat food prepared outside the home. The factory-like production of fast food has been the key to lowering the costs of prepared food and putting it in reach of even poor households (Thistle 2006, Fantasia 1995, Schlosser 2002, Levenstein 1993). Even more rapid growth has taken place in the realm of routine domestic care for the sick, elderly, and young, and “the commercialization of women’s domestic realm will continue to provide the bulk of new employment over the first decades of the twenty-first century” (Thistle 2006, 106).

As women moved into the labor force in large numbers, the commodification of domestic labor produced a strained, stratified patchwork of market, familial, and state systems of care. But as Mona Harrington has argued, “We have come nowhere near replacing the hours or quality of care that the at-home women of previous generations provided for the country” (Harrington 2000, 17). As caring labor is increasingly commodified, we are confronted with the question, how much can we reorganize forms of care without imposing significant costs? As Silvia Federici points out, “The degree to which the marketization of food production has contributed to the deterioration of our health (leading, for example, to the rise of obesity even among children) is instructive” (Federici 2012). Low-income households’ reliance on commodified forms of care—fast food and prepackaged foods in particular—is increasingly understood as contributing to ill health and producing new forms of social instability.

Led by Michelle Obama’s efforts, the Obama administration emphasized the role that programs like SNAP could play as both a work support and a public health intervention. However, in the face of the entwined crises of care, economic insecurity, and public health, the Obama administration’s efforts to transform food programs were largely symbolic. For example, the food stamp program was renamed the Supplemental Nutrition Assistance Program, or SNAP, as part of the 2008 farm bill, to emphasize the nutritional impact of the program. The change dovetailed with a broader push by Michelle Obama’s Let’s Move campaign to tackle rising rates of childhood obesity and diet-related disease. The emphasis on health and nutrition defined the Obama administration’s approach to food policy and was reflected in the passage of improved nutrition standards for school lunches and the program for Women, Infants and Children (WIC). But these narrow, targeted interventions failed to confront the power of a loosely regulated food industry to flood consumer markets with unhealthy foods. Nor did they do much to improve low-income households’ purchasing power in ways that might enable them to afford healthier but more expensive food.

Obama-era concerns with health and well-being were grafted onto a food safety net that was structured first and foremost to encourage and support low-wage work, not to ensure a universal right to adequate, healthy food. President Obama’s commitment to work-first welfare was in line with previous administrations, reflecting the staunch political consensus that welfare reform had been a success. His approach to poverty and food insecurity was largely compatible with the organized interests of the business community that have dominated national politics since the 1980s. He attempted to give states greater flexibility in how they implemented work requirements for safety net programs in 2012. These efforts met a swift and furious backlash, with presidential candidate Mitt Romney running ads claiming that Obama was trying to “gut welfare reform.” The Obama administration immediately backed off these minor changes, and Obama defended his record of helping to implement work requirements in Illinois as a state senator, signaling his strong commitment to work-first welfare despite his administration’s vocal concerns with public health (Ball 2012).

THE POLITICS OF EXCLUSION

In the 1970s, profits began to fall for industrial manufacturing in the United States. Firms responded by pursuing a spatial fix, lowering labor costs by moving production to areas of the globe where labor was cheaper. Today, the growing sectors of the US economy have no spatial fix. Care of the elderly and sick, housekeeping, and the retailing of food and other consumer goods cannot be moved offshore. Since there is no spatial fix for these sectors, elites have pursued what Collins and Mayer have called a “relational fix,” creating new categories of people within the working class who are more vulnerable to exploitation and who can be excluded from basic economic rights. Collins and Mayer argue that workfare and welfare reform were part of the creation of a race to the bottom in service jobs that tracked with the global race to the bottom in manufacturing (Collins and Mayer 2010). The institution of workfare, along with an undermining of immigrant labor rights and the creation of a large population of formerly incarcerated people who can be legally discriminated against in the labor market (Alexander 2010), exerts downward pressure on the wages and rights of all wage workers. This domestic race to the bottom has taken the form of stagnating wages and salaries as well as increasingly informal labor arrangements—including many occupations in what is now called the “gig economy”—extremely short-term jobs, such as driving for Uber, with no formal relationship between employers and employees. The lives of the families in this book are intimately shaped by this race to the bottom. As I assisted people in applying for food stamps and accompanied them to welfare offices, packed bags with them in the food pantry, and shared meals with them in the soup kitchen, they shared their sense of slipping further behind. For many, the stable lives they yearned for felt increasingly out of reach.

Across the political spectrum, welfare reform continues to be heralded as a success precisely because TANF, the cash assistance program for poor families, continues to enroll far fewer people today than it did when reforms were passed. Success is never measured by how many people actually moved into work or escaped poverty, but by how many were moved off the rolls. The assumption is that if people are no longer receiving assistance, then they must be working. But the reality is that there are a growing number of Americans who are disconnected from both the labor market and state assistance. Six million people in the United States had no access to any income other than SNAP in 2010 (DeParle and Gebeloff 2010). The number of single mothers who are disconnected from both work and welfare has grown steadily since 1996 (Blank and Kovak 2008). Further, growing numbers of both men and women find themselves outside of or on the edge of the paid labor force and are struggling to find a way back in.

After a huge spike beginning in 2008, the unemployment rate slowly returned to pre-recession levels. In September 2018, unemployment stood at 3.7 percent, or 6.1 million people who wanted to work but were not employed. However, the official unemployment rate only captures part of the story. The labor force participation rate—the percentage of adults who are either working or actively looking for work—suggests a less rosy economic picture for American workers. The share of adults in the labor force has remained far below pre-recession levels: only 62 percent in 2017, down from 66 percent in 2007. Labor force participation rates began to rise in the early 1970s as middle-class incomes stagnated and women, including mothers of young children, moved into the labor force in higher numbers. Labor force participation peaked at 67.3 percent of the adult population in 2000 and has been declining ever since. In the wake of the recession, labor force participation remains at a thirty-eight-year low. Men’s participation in the labor market has steadily declined since 1950 (Bureau of Labor Statistics 2016). Low unemployment rates don’t tell the full story because they don’t count all the people who have given up on a labor market that has little demand for more workers.
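The gap between the two measures comes down to what each rate divides by. A back-of-the-envelope sketch makes the arithmetic concrete; the unemployed count and rate approximate the September 2018 figures cited above, while the adult population figure is an assumption for illustration, not a number from the text.

```python
# Why a low unemployment rate can coexist with low labor force participation:
# the two rates use different denominators.

unemployed = 6.1e6          # people who wanted work but were not employed (Sept 2018)
unemployment_rate = 0.037   # 3.7 percent

# The unemployment rate divides by the labor force only (people working
# or actively looking for work), so the labor force size follows:
labor_force = unemployed / unemployment_rate

# Assumed civilian adult population, for illustration only.
adult_population = 258e6

# The participation rate divides by the whole adult population, so it
# also reflects everyone who has stopped looking for work entirely.
participation_rate = labor_force / adult_population

print(f"labor force: ~{labor_force / 1e6:.0f} million")
print(f"participation rate: ~{participation_rate:.1%}")
```

People who give up searching leave the labor force, which shrinks the unemployment rate's denominator and can push that rate down even as the share of adults actually working stagnates.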

Further, unemployment numbers tell us very little about the quality of the jobs people encounter when they enter the labor market. Employment numbers don’t just count standard employees with regular full-time jobs. They include anyone working, even people who work only a few hours a week and earn next to nothing as underemployed freelance or contract laborers. The share of freelance and contract workers grew from 10.1 percent of all workers in 2005 to 15.8 percent in 2015, and nearly 40 percent of people in these jobs have a bachelor’s degree or higher. Within the part-time labor force, 6.4 million would have preferred full-time work (Katz and Krueger 2016). Even as the labor market has grown tighter, with record low unemployment rates, wages have stayed flat (O’Brien 2018).

There is a disconnect between an official economic recovery and the overwhelming sense of economic insecurity that is palpable among people at welfare offices and food pantries across the United States. This economic insecurity has become a driving force in US politics. Downward mobility runs the risk of creating a dangerous political instability. Just as the safety net has been drawn more tightly around the idea of work, workers like Nigel and the other families we will meet in this book are getting their legs kicked out from under them. Employers are abandoning their commitments to even basic tenets of the employer-employee relationship.

Work, as a system of distributing necessary resources to the bulk of the US population, has begun to fail in a range of ways. The commonsense belief that work is a way out of poverty is at odds with the reality that work has never been a particularly effective mechanism for distributing wealth, particularly for racialized populations living in the United States. For generations, African Americans and exploited immigrant laborers have been relegated to low-paid or unpaid, degrading forms of work that did not provide enough to sustain families with dignity and security. These insecure and exploitative labor conditions have become far more widespread in the twenty-first-century US economy. Many full-time workers no longer earn enough to afford basic necessities like food and shelter. The food safety net has been reconfigured to subsidize poor workers and exclude those who do not work, just as work itself has become more nebulous and less secure. The next chapter tells the stories of two families as they navigate the new terrain of a growing food safety net targeted to the working poor in an era where work itself is being redefined.