1


GROWTH AND LIMITS: 1933–2016

“Every day of continued exponential growth brings the world system closer to the ultimate limits to that growth. A decision to do nothing is a decision to increase the risk of collapse.”

—Donella Meadows et al., The Limits to Growth (1972)13

No nation in history has done what the climate emergency now requires the United States and other nations to do. We must decide collectively that we will refrain, forever, from tapping known, rich reserves of easily available energy. It remains to be seen whether we can manage that. We and other nations have faced resource limits many times before, but they were not self-imposed. Now they must be. Can we, collectively, of our own free will, put permanent boundaries around extraction of potent mineral energy from the Earth? That clearly will be the most difficult step in taking on the climate challenge. Should we take that step, we can prepare for the consequences by learning from past episodes in which Americans found themselves hemmed in by forces beyond their control and were forced to deal with limits.

Proponents of ambitious climate initiatives have long been fond of historical allusions. The Manhattan Project, the Apollo Program, the Interstate Highway System, the New Deal, and World War II all have been cited as precedents. For our purposes, we can safely set aside purely technical feats such as the bomb, the moon shot, and the interstates. But the New Deal, the wartime mobilization of the 1940s, and other crucial junctures in the decades that followed, offer insights that can be useful to us in responsibly addressing the climate crisis.

Various Green New Deal visions have been explicit in emulating the 1930s New Deal example of using public policy to put society to work and solve big problems. Those and other strategies for a green makeover of the nation’s energy systems and infrastructure often hark back to the lightning-speed buildup of productive forces in the 1940s. But that wartime industrial surge was only half the story. The other half was that for a brief four years, the U.S. civilian economy went into emergency mode, becoming almost the opposite of itself, with carefully planned production and strictly limited but equitable civilian consumption. The postwar economic boom of the 1950s and ’60s, surfing on a wave of cheap oil and military spending, created the false impression that limits of all kinds had been suspended. But when the energy crisis landed hard in the 1970s, Americans were shocked back into reality, and the decade came to be defined by limits. Restoring and increasing the flows of both fossil energy and wealth became a central mission of the federal government in the 1980s. Finally, through the climate-aware 1990s and 2000s, the need to reduce and eventually eliminate the use of fossil fuels was dismissed time after time on the grounds that economic growth always takes priority.

The 1930s and ’40s saw a desperate need to burst through limits imposed by the economic system. Now, we need desperately to pull the economy back within limits set by the Earth’s ecosystems. Whether or not our society—or human civilization—can survive the current emergency intact will depend in large measure on whether we take ecological limits seriously.

“COOPERATIVE CAPITALISM”

Crisis was far too mild a word; emergency came closer to capturing most Americans’ predicament in the early 1930s. The U.S. unemployment rate had vaulted from 3 percent at the time of the 1929 financial crash to 24 percent during the 1932 presidential campaign. Given those numbers, prospects appeared excellent for Franklin D. Roosevelt, the Democratic Party nominee, as he set out to unseat incumbent Republican Herbert Hoover, whose weak free-market tonics had only worsened the downward spiral.

In a campaign speech at the Commonwealth Club in San Francisco, Roosevelt called for a sharp break from long-standing economic orthodoxy. He had come to the conclusion that in America, growth had not just faltered; it had come to an end. The free-market policies of the nineteenth century, he said, were inadequate to address the human catastrophe they had created. Sounding more like a twenty-first-century steady-state economist than a wealthy politician of the 1930s, he declared, “Our task now is not discovery or exploitation of natural resources, or necessarily producing more goods. It is the soberer, less dramatic business of administering resources and plants already in hand, . . . of distributing wealth and products more equitably, of adapting existing economic organization to the service of the people.”14

In reality, Roosevelt had no intention of knocking out the pillars of capitalism in such a fashion, and once he took office, his actions were not as radical as he had implied in the San Francisco address. But those actions did include a flood of economic legislation that served as inspiration for today’s vision of a Green New Deal. In the Roosevelt administration’s first hundred days, Congress passed a breathtaking stack of stimulus initiatives that, among other things, provided $3.3 billion for public works—more than the entire federal budget of three years earlier. This came four years before the publication of John Maynard Keynes’s epic The General Theory of Employment, Interest and Money, the book that showed the world why ending a depression or severe recession requires deep-pocketed government intervention in the economy. In a 1999 article, Patrick Renshaw, then of Sheffield University, discussed how the New Deal was built not on a theoretical foundation like the one laid out by Keynes but rather on the “chaos of improvisation.” He wrote, “As it struggled to end mass unemployment, the federal government stumbled on this policy, whereby it was forced to act as compensating agent during an economic downturn, spending public money to fill troughs in the trade cycle in order to stimulate revival.”15

One of the headline initiatives of those first hundred days was the National Industrial Recovery Act (NIRA) of 1933. Declaring a national emergency, the NIRA created the National Recovery Administration (NRA) and gave it the mission of steering private industry toward prosperity. The Recovery Administration was not a mere dispenser of stimulus funds. Rather, its goal was no less than the planning of the entire industrial economy. It aimed its biggest guns at the cutthroat competition that New Dealers saw as driving down wages and prices and deepening the Depression. The Recovery Administration relaxed antitrust enforcement and worked with private industry, through hundreds of business and trade associations, to develop voluntary “codes of fair practices” that would limit production and set wages and prices. The Recovery Administration also guaranteed the right of workers to unionize, even giving union members a voice in the development of the fair-practice codes. Summing up the Recovery Administration’s goals, Ira Katznelson, the author of Fear Itself: The New Deal and the Origins of Our Time, wrote that it “sought to refloat capitalism and sustain a balanced private economy.” This was to be accomplished through economic planning and “corporatism” aimed at eliminating class conflict and curbing what Roosevelt lieutenant Rexford Tugwell called “the anarchy of the competitive system.”16

As a grand experiment in industrial planning, the Recovery Administration flopped badly. The Southern Democratic members of Congress who had voted for the underlying legislation turned against the Recovery Administration when they saw that Black workers might have to be paid as much as whites. And the voluntary codes ended up being written and edited largely by the trade associations and powerful corporations, with labor having little say in the matter. It was Katznelson’s nicely understated conclusion that “uneven class power made planning for cooperative capitalism difficult.”17 In 1935, the Supreme Court delivered the death blow, declaring the Recovery Act unconstitutional. But 1935 also saw the creation of one of the New Deal’s most highly visible recovery programs, the Works Progress Administration (WPA). The WPA pumped enormous stimulus into the economy by hiring more than 8 million unemployed Americans to construct countless public works. Meanwhile, the Recovery Act’s failed attempt to foster voluntary reform of private industry was eventually succeeded by toothier regulation under the Fair Labor Standards Act of 1938, which mandated the national minimum wage, the eight-hour workday, overtime pay, and the end of child labor.

KIND OF GREEN

The Green New Deal is sharply focused on solving the headline issue of our time: the climate emergency. The New Deal had its own, less conspicuous green side, one that sought to resolve the headline environmental problem of its day: the Dust Bowl.

Every state but Vermont and Maine experienced at least one period of severe drought between 1930 and 1936. At times, fine brown dust fell like snow across the eastern half of the country; it had blown all the way from the plowed-up wheat lands of the High Plains two thousand miles to the west. Exposed, desiccated soil was being eroded by the region’s characteristic high winds, filling the sky and drifting in roadside ditches. In 1932, there were fourteen major dust storms, each covering vast portions of the region; that annual total rose to sixty-eight in 1936 and seventy-two in 1938. According to the historian Donald Worster, it took a monster record-breaking dust storm in May 1934 to finally “make the plains visible to Washington.” He wrote, “As dust sifted down on the Mall and the White House, Roosevelt was in a press conference promising that the Cabinet was at work on a new Great Plains relief program.”18

In general, the Dust Bowl and the economic devastation of Depression-era rural America sprang from the same roots: The drive for maximum production resulted in maximum exploitation of the soil, at the same time creating a massive glut of grain that an impoverished populace couldn’t afford to buy. Worster put it this way:

Linking the two disasters was a shared cause—a common economic culture, in factories and on farms, based on unregulated private capital seeking its own unlimited increase. In the 1920s that culture had created a high-producing, high-consuming life for Americans. Few people at that time questioned its premises; business was the national faith. But it could also be, as both the bread lines and the dust storms of the following decade revealed, a self-destructive culture, cutting away the ground from under people’s feet.19

The immediate economic problem was addressed by the Agricultural Adjustment Act (AAA), passed in 1933. Under the act, the Department of Agriculture worked out agreements to reduce production and raise prices to farmers for the major crops and animal products as it sought to get rid of surpluses and expand markets. Later, the Food Stamp Plan was created not only to address widespread hunger, but also to pump up demand for agricultural commodities.20 The most prominent initiative aiming directly at the Dust Bowl was the Soil Conservation Service (SCS). The agency was created in 1935 by Public Law 74-46, which declared that “the wastage of soil and moisture resources on farm, grazing, and forest lands . . . is a menace to the national welfare.”21 The Soil Conservation Service undertook numerous projects, both in the arid High Plains and farther east, where rainstorms caused severe “gully erosion.” The efforts included acquiring unoccupied land and running public demonstrations of soil-conservation practices such as terrace-building; planting wind-blocking rows of trees with the U.S. Forest Service; subsidizing farmers’ soil-saving methods; organizing watersheds into Soil Conservation Districts in which farmers were officially designated “cooperators”; and launching many valuable research and extension programs.22

Another greenish initiative of the New Deal era was the Civilian Conservation Corps (CCC). With environmentalism as we now know it still decades in the future, the Conservation Corps grew out of the Progressive-era conservation tradition, focusing on preserving and managing the nation’s “natural” lands rather than preventing ecological damage by industry and agriculture. Between 1933 and 1942, some 3 million young men—and only men—took jobs with the Conservation Corps and headed out to train, live, and work in the nation’s forests and grasslands. They planted 2 billion trees, built eight hundred state parks, addressed erosion on 40 million acres, built 13,000 hiking trails, and stocked rivers with more than a million fish. They also took up emergency assignments such as firefighting and flood control. (Other accomplishments, such as building 46,000 vehicle bridges, 10,000 small reservoirs, and a million miles of fence, as well as eradicating almost a half-million “predatory animals,” were, by current standards, not so green.)23

There are obvious echoes of these laws and agencies in the economic justice and stimulus goals of the Green New Deal. Other agencies were also apt precedents, among them the Rural Electrification Administration, the Tennessee Valley Authority, and the Puerto Rico Reconstruction Administration.

THE UPRISING OF 1934

In their study of the two blockbuster recovery bills passed during Roosevelt’s first hundred days in office—the Agricultural Adjustment Act for rural America and the National Industrial Recovery Act for industry—Theda Skocpol and Kenneth Finegold see both measures as efforts to create a new system under which “economic functions formerly shaped by market competition would be planned and regulated in the public interest.” Had that goal been achieved, they write, the United States would have ended up with business and labor working together congenially under a “centralized system of politically managed corporatist capitalism.” But that plan didn’t work out. While the Agricultural Act did lead to long-term federal management of the farm economy, things veered off course in industry, which was dominated by corporations much more powerful than the upstart government agencies that were trying to herd them into collective recovery. Furthermore, the National Recovery Administration’s leading officials had been drawn from the business world and were more sympathetic to its desire to maximize private profit than to the noncommercial goal of collectively advancing the public interest. In the end, write Skocpol and Finegold, “the dream of harmony between corporate management and industrial labor dissolved into even more bitter conflict.”24

The “bitter conflict” was an astonishing labor uprising in 1934 that cross-pollinated with the growing social movements of the unemployed, students, African American communities, and farmers, along with local political movements such as the Minnesota Farmer-Labor Party and the Progressive Party in Wisconsin. A general strike in San Francisco spurred long-term militancy up and down the West Coast. Tens of thousands of unemployed people helped striking workers in Toledo fight off National Guardsmen and scabs. A labor struggle in Minneapolis drew help from the Farmer-Labor Party and a radical group called the Farmers Holiday Association. There was a surge in radical organizing in Detroit’s auto industry. The labor uprising and the violent response to it by police, corporate security forces, and the National Guard struck terror in the hearts of national politicians, some of whom started talking publicly about the prospect of open industrial warfare, revolution, and even the imminent opening of the “gates of Hell.”25

The radical labor upsurge of 1934 was essential to passage of one of the most important pieces of 1930s legislation: the National Labor Relations Act (NLRA), which would guarantee the right of workers to form independent trade unions. The bill had been fiercely opposed by employers, business owners, many members of Congress, and even some on the Left, who saw it as too business-friendly. But with a labor revolt under way nationwide, mainstream leaders came to see the Labor Relations Act as the necessary alternative to permanent industrial turmoil or even Communism. The bill’s chief sponsor, Senator Robert Wagner of New York, cited the uprising as a compelling reason to pass the act. The unrest posed a dire threat not only to capitalists but also to the American Federation of Labor (AFL), the mild-mannered organization that had officially been representing organized labor for decades. Its leader, desperate to see the passage of the Labor Relations Act in order to help quell the uprising, went a little radical himself, announcing to a massive rally in Madison Square Garden that if Congress did not pass the act, the AFL would lead a national general strike.26

The Labor Relations Act ended up passing easily and was signed into law by Roosevelt. It benefited the working class for decades, but, not surprisingly, the 1934 uprising’s aim of reversing the imbalance of power between capital and labor was never achieved. Indeed, in recent decades, the labor movement has lost enough power to take it all the way back to 1932.

Like the Labor Relations Act, ambitious legislation aimed at resolving the climate crisis is likely to pass only if there is a broad-based, grassroots uprising that leaves Congress no option but to pass it. And the Labor Relations Act example suggests that getting laws passed is only the beginning; they have to be backed up by long-term public support demanding that they be enforced in both letter and spirit.

“RAISE PLENTY OF HELL”

The Green New Deal breaks most sharply with its 1930s namesake on one issue in particular: race. Recognizing that the New Deal had the effect of cementing rather than dissolving institutional racism, the drafters of the Green New Deal have kept marginalized communities at the forefront in every document they have turned out so far.

Steve Valocchi of Trinity College in Connecticut was one of many scholars who argued that the New Deal didn’t just ignore racial discrimination; in several ways, it directly harmed Black communities. The Works Progress Administration, for example, allowed payment of locally prevailing wages, which hurt people living in predominantly Black areas. Earlier, the National Recovery Administration had similarly allowed lower wages in the South and in occupations that were dominated by Black workers. The underwriting manual of the Federal Housing Administration required banks to, in effect, perpetuate residential segregation:

Areas surrounding a location are investigated to determine whether incompatible racial and social groups are present, for the purpose of making a prediction regarding the probability of the location being invaded by such groups. If a neighborhood is to retain stability, it is necessary that properties shall continue to be occupied by the same social and racial classes. A change in social or racial occupancy generally contributes to instability and a decline in values.27

Finally, there were no significant New Deal initiatives to guarantee civil rights or voting rights or to fight racial discrimination; those would have to wait another thirty years.

Several policies unjustly shortchanged the 40 percent of Black Americans who were working in agriculture at that time. For example, many Black sharecroppers never received payments for which they were eligible, either because the local office of the Adjustment Agency failed to disburse the funds to them or because landowners held the money back on the pretext of covering bills. Some plantation owners who were paid to take their cotton crop out of production would evict the Black tenant farmers who were cultivating that land. Meanwhile, another New Deal headliner, the Social Security Act, did not cover farm laborers or domestic workers, and two-thirds of all Black people employed at the time were working in those occupations.28

One of the more courageous and tenacious campaigns of resistance to the New Deal’s built-in racial discrimination was launched in 1934 by the Southern Tenant Farmers Union (STFU). Black and white sharecroppers and farmworkers across northeast Arkansas, angry at being cheated by planters and the Agricultural Adjustment Agency, began organizing door-to-door and field-to-field, with encouragement from the Socialist Party and its leader, Norman Thomas. By late 1935, they had formed two hundred local chapters with a total of 25,000 members. In September of that year, five thousand members of the Southern Tenant Farmers Union staged a successful strike for higher wages. That brought a surge of enthusiasm and a flood of new members.29 In his May 1936 account of the Arkansas rebellion, Jerold Auerbach wrote:

Union members marched hundreds abreast across the cotton fields to gather additional recruits. Instead, they incensed planters and politicians. Memphis police broke picket lines at the Harahan bridge; striking croppers were arrested and leased to planters to work off their fines and court costs; and a Crittenden County landlord built and filled a small concentration camp. On the fourth day of the strike Governor Futrell sent in National Guardsmen and State Rangers and the union quietly surrendered.30

In another incident, cops assaulted and jailed the organizer of a new chapter of the union, a Black minister. A delegation of fifty white sharecroppers with a lawyer in tow managed to get the minister released. But Arkansas landowners, local cops, and the state government were proving to be crueler and more recalcitrant than the local officials from the Adjustment Agency and the New Dealers in Washington.31

The union’s strategy was to combine “relentless pressure on the New Deal with trade union tactics,” but, wrote Auerbach,

Viewed from the perspective of traditional trade unionism, the organizing drive of the Southern Tenant Farmers Union seemed an anomaly. Its most effective weapons were agitation and publicity, not strikes or collective bargaining. During these early years, the union’s organizing drive always had twin objectives: recruitment of new members and propagation of radical alternatives to New Deal agricultural policy. The Southern Tenant Farmers Union sought to organize a protest movement no less than to organize the sharecroppers.32

The union newspaper’s call to action was, “Raise plenty of Hell and you will get somewhere.”33 The hell-raisers of the Southern Tenant Farmers Union deserve much credit for bringing to America’s attention the racial injustices that were built into the New Deal.

THE WAR CURE

Roosevelt worried that if his efforts failed and the Depression dragged on, it would send a message to the world that democracies are ill-equipped to deal with severe economic crises, and this at a time when fascism was on a winning streak in Europe and Japan. Early on, the head of the National Recovery Administration, Hugh Johnson, and other New Dealers had even openly admired the way the dictator Benito Mussolini was handling Italy’s economy, but that sort of talk ended in 1935 with the Italian invasion of Ethiopia.34 Then, in a quick twist of history, it was the 1940s fight against fascism that finally brought full employment and prosperity back to the United States. (Almost eight decades later, the Green New Deal is being envisioned at a time when Americans are once again confronting a slide toward authoritarianism, if not full-blown fascism—this time not only in Europe and Asia but in Washington, DC, as well.35)

In 1936, the Roosevelt administration, concluding that the recovery it had jump-started could sustain itself, decided to start easing off the stimulus spigot. Federal spending dropped by 25 percent over two years, and unemployment promptly leaped by a quarter, to 19 percent by 1938. This economic decline was even steeper than that of 1929–33, and unemployment remained above 14 percent until 1941.36

Supplying allied nations already at war in Europe while at the same time building up the U.S. arsenal and inducting millions into the military finally accomplished what the New Deal could not. By 1940, Congress had spent $62 billion over eight years trying to dig the country out of the Depression. In the next five years, it would spend $321 billion on World War II—according to Patrick Renshaw, more than the U.S. government had spent in total from 1790 to 1940. Heavy spending on the war buildup and consequent massive hiring by both the military and the private sector worked like magic. Over the next four years, the gross national product doubled and the unemployment rate fell to a mere 1.2 percent.

The fact that U.S. industries could ramp up production to a historically unprecedented output within months in support of urgent national goals has inspired present-day visions of a similar industrial mobilization to combat greenhouse warming. There has been less discussion of the deep adaptations that were required of the wartime economy. Overnight, a government that had struggled for a decade with an excess of production and a deficit of consumer buying power had to figure out how to start serving a population that had plenty of money to spend but now faced shortages of goods on which to spend it. The economy had gone from cash-limited to resource-limited. If nothing were done, Depression-era price deflation would flip into just-as-destructive hyperinflation. The federal government responded, tiptoeing into the murky waters of price controls. Under the Office of Price Administration, the process began in 1940 with voluntary campaigns, one commodity at a time, and by 1943 had escalated into a mandatory clampdown on prices throughout the economy.38

Enforcing ceilings on prices is a sure, direct way to stop inflation, but it doesn’t guarantee fair access. Suppressed prices boost the demand for goods but not the supply. And in the 1940s, supplies of goods were further limited by the diversion of workers and resources into the effort to win the war. The U.S. government was eventually forced into a second level of intervention to make sure that the entire population had access to an adequate supply of food, shelter, clothing, and other basic necessities.

SHRINK, STANDARDIZE, SIMPLIFY

Washington had thrown in the towel in its effort to institute economic planning during the Depression. Now, with a resource-intensive effort to win a war in Asia and Europe, and with the threat of critical shortages at home, planning of production and consumption would become the rule in both the military and civilian economies. A month after the United States entered World War II, a War Production Board (WPB) was created to allocate resources between the military and civilian sectors, ensure an adequate flow of resources to industries supplying crucial civilian goods, and regulate or ban production of other goods.

The degree to which the War Production Board was able to successfully restructure the economy was astonishing. It ordered a halt to all car manufacturing in Detroit and converted the factories for production of tanks and other military vehicles. Non-military sales of mechanical refrigerators were barred in February 1942, and all production was halted in April, saving a quarter-million tons of critical metals over the next year. WPB blocked production of any new air conditioners used “solely for personal comfort” and ordered that air-conditioning systems in some big-city retail stores be removed and installed in far-flung armament factories. Other regulated goods included lumber, bolts, industrial chemicals, bedsprings, farm equipment, cooking stoves, coal- and oil-fired heating stoves, pressure cookers, and even used washing machines.39

Authorities sought ways of deeply reducing civilian rail travel to conserve coal. Brewers were restricted in the number of railcars they could use per month for shipping beer, and they were prohibited from hiring trucks to haul additional product. Shipping of retail packages measuring less than sixty inches in length plus girth or weighing under five pounds was prohibited. With ammonia manufacturers feeding the production of explosives for the military, farmers’ supplies of nitrogen fertilizer were cut off, and availability of organic fertilizers was limited.

While restricting the production and sale of some products, the War Production Board issued standardized and simplified manufacturing specifications for a whole host of others, explaining that “[s]implification, as it is applied in the war program, is a procedure for eliminating unessentials from an item or a line of items. It reduces the number of items in a line and the variety of style, size, color, or ornamentation not actually necessary to the efficiency or usefulness of the product.”40 The quantity of metal allowed in light fixtures was cut by 60 to 80 percent. A limited range of sizes was specified for glass jars used for preserving vegetables and fruits, and to save metal, glass-top seals and thinner rings were prescribed. The standardization and simplification program covered a wide range of other products, including women’s dresses, work uniforms, shoes, socks, stockings, blankets, wooden furniture, and farm machinery parts.41

The mandatory channeling of a large share of the nation’s resources into the war effort, along with the extensive regulation of civilian production, constricted the supply of some consumer goods. At the same time, price controls kept demand high, raising the specter of shortages, rushes on essential goods, hoarding, and under-the-table selling at high prices to more affluent customers. The nation had gotten its fill of bread lines during the Depression and would tolerate no more of that. But simply lifting price controls would have left lower-income households without adequate access to basic necessities. There remained open only one efficient and just course of action: fair-shares rationing.

In the simpler of the two types of rationing systems, households were issued a monthly set of stamps, each of which specified the physical quantity of a rationed product (e.g., pounds of sugar) that could be purchased. Some classes of goods—most famously, meats, cheese, and butter—were covered by a different system called “points rationing,” under which each product in a class was assigned a point “price.” For example, for meats, it might mean three points per pound for hamburger and twelve for steak. Ration stamps for these products were denominated in units of points rather than physical quantity.
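
To make the points arithmetic concrete, here is a minimal sketch, in Python, of how a household's point spending might be tallied under such a system. The three- and twelve-point values for hamburger and steak come from the example above; the monthly allowance and the other point "prices" are hypothetical placeholders, not historical Office of Price Administration figures.

```python
# A toy illustration of points rationing, not a reconstruction of the actual
# 1940s system. Point values for hamburger and steak follow the text's example;
# everything else here is a made-up placeholder.

POINT_PRICES = {          # ration points per pound
    "hamburger": 3,       # from the text's example
    "steak": 12,          # from the text's example
    "cheese": 8,          # hypothetical
    "butter": 16,         # hypothetical
}

def points_spent(purchases):
    """Total ration points surrendered for a list of (item, pounds) purchases."""
    return sum(POINT_PRICES[item] * pounds for item, pounds in purchases)

def within_allowance(purchases, monthly_allowance):
    """True if the purchases fit within the household's monthly point budget."""
    return points_spent(purchases) <= monthly_allowance

shopping = [("hamburger", 2), ("steak", 1), ("butter", 0.5)]
print(points_spent(shopping))            # 3*2 + 12*1 + 16*0.5 = 26 points
print(within_allowance(shopping, 48))    # True, under a hypothetical 48-point budget
```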

During 1942–43, a broad range of goods were brought into the rationing system: fuel oil, kerosene, gasoline, tires, cars, bicycles, stoves, typewriters, shoes, coffee, sugar, meats, canned fish, canned milk, cheese, fats, and processed foods. People seemed to complain most about the systems for rationing gasoline and rubber. Those living in Eastern states were even hit with a ban on “pleasure driving.” Drivers were told they could obtain additional gasoline for commuting to work, but only after they formed a “car club” with at least three other passengers. For food products, price controls and rationing had some salutary effects, not only prompting families across the country to plant 22 million “victory gardens,” but also improving nutrition in all economic classes. Civilian consumption of protein rose 11 percent during the rationing period. Increases for calcium and vitamin A were 12 percent, for vitamin C, 8 percent, and for vitamin B1, 41 percent.42

Washington moved very cautiously early in the war. War planners imposed price controls and rationing only when they were unavoidable, and made ration limits as generous as possible. The government was perhaps too cautious. In August 1942, when there were only a few products under ration, 70 percent of consumers told pollsters they felt that more extensive rationing was needed in order to eliminate shortages and other problems. Six months later, with controls starting to broaden and tighten, 60 percent of people polled by Gallup still believed that the government should have acted more quickly in rationing scarce goods. Later, when rationing was at its zenith, approval outweighed disapproval by two to one.43 The wartime experience of the 1940s suggests that rationing is well tolerated or even popular when it is a response to a clearly perceived national crisis.

THE AGE OF LIMITS

In the 1930s and ’40s, the U.S. and world economies were far smaller than they are today, and greenhouse emissions were far lower. Earthlings, all but a tiny handful, were blissfully unaware that continuing fossil-fuel-enabled growth would one day become a mortal threat to civilization. The original New Deal was free to aim strictly at restoration of financial stability and prosperity. There were plenty of fuels and raw materials sitting there waiting to be put to work, and the biggest environmental problem, the Dust Bowl, could be fixed in the course of restoring the economy of the Plains.

During the war mobilization that followed, the government spent funds at eight times the rate it had spent fighting the Depression. As far as I know, no one complained at the time about the 65 percent increase in fossil energy consumption that occurred between 1935 and 1945, thanks to the growing economy.44 Even if there had been prophetic scientists within the growing federal bureaucracy of the 1930s sounding the alarm on future global warming, few, if any, planners would have considered holding back on carbon release before the fight against fascism could be won.

The New Deal legacy, World War II, and a free-flowing bonanza of fossil fuels propelled the postwar U.S. economy into a long, high-glide trajectory. But they also masked an underlying drag on economic growth, according to a landmark book by Paul Baran and Paul Sweezy, Monopoly Capital,45 published in 1966. The two Marxian economists saw the United States becoming increasingly dominated by shrinking numbers of giant corporations, thereby developing a long-term tendency toward stagnation. A decade in the writing, the book applied and extended Marx’s analysis of capitalism to this mid-twentieth-century phenomenon, one that Baran and Sweezy dubbed “monopoly capital.”

The New Dealers had weakened antitrust regulation, and the concentration of economic power had continued to deepen during and after World War II. Monopoly Capital’s central idea was that in the postwar period, companies and conglomerates had become so large that they were able to transcend the meat grinder of competition. But in doing so, they were undermining the very engine of economic growth. In such oligopolies, Baran and Sweezy argue, big corporations can set prices with little concern for what their competitors charge, thereby avoiding destructive “price wars.” The big firms can also afford the kinds of technologies that allow economic output per worker—productivity—to rise faster than wages. Largely immune to competitive forces and able to churn out more of their products with a smaller payroll while charging higher prices, corporations accumulate vast surpluses of wealth that far exceed the amount that can be absorbed through investment in new capital. The force behind the growth of capitalist economies—the cycle of production, sales, wealth accumulation, reinvestment, and expanded production—gets bogged down, and stagnation results.46

In a world dominated by monopoly capital, therefore, economists no longer need inquire into the causes of stagnations or depressions; the question, rather, is how do mature capitalist economies manage to grow at all? After all, Monopoly Capital’s argument that there exists a long-term tendency toward stagnation was published twenty years into an unprecedented economic boom. How to explain that? Noting that opportunities for absorbing excess surplus still existed, but that powerful ones were exceedingly rare, Baran and Sweezy point to a small handful of phenomena that had created major capital-investment incentives. Most obviously, there was the war economy, which had ended the stagnation of the 1930s and, thanks to the Cold War and U.S. militarism around the globe, was still pumping adrenaline into the economy of the 1960s. Then there were the massive economies surrounding the private automobile: its manufacture and care, and the related industries that were directly enriched by it, including gasoline, insurance, and tourism. In addition to these were what Baran and Sweezy call the “sales effort,” an entire meta-economy led by the advertising and public relations industries. They warned that although these financial engines were capable of driving the U.S. economy further, they all had their own limits. Furthermore, the business cycles inherent in all capitalist economies ensured that growth would never be constant and linear. (Later, from the 1980s onward, militarism, the vehicle industry, hyper-commercialism, and mass advertising would prove insufficient to boost mass consumption enough to keep up with ballooning surplus production, so a fourth economic adjustment mechanism, the seemingly limitless growth of financial markets, emerged. The “financialization” wave, along with those older investment mechanisms, has remained in play, and, crucially, together they remain among the dominant contributors to greenhouse warming. Now the nascent renewable-energy industry is being billed as yet another means of keeping economic stagnation at bay.)

By 1972, gross national product was still growing nicely, and the United States was looking forward to the end of yet another foreign war. At that year’s Republican Convention, President Richard Nixon crowed that Americans had “more prosperity than any people in the world, [and] we have the highest rate of growth of any industrial nation.”47 A month later, the New York Times reported on forecasts of “continued strong general economic growth” through 1973.48

The last thing politicians, economists, or investors wanted to hear about in 1972 was any talk of hindrance, caution, or restraint. So the publication that year of a book titled The Limits to Growth49 was about as welcome as a bowl of prune soup at a potluck. Through the remainder of the twentieth century, the book was widely loathed and dismissed in economic, political, and scientific circles; in this century, with climate chaos making its arguments appear more and more on target, it has gained wide respect.50

The Limits to Growth had its roots in a 1968 meeting of what came to be known as the Club of Rome, an international collection of, in their own characterization, “scientists, educators, economists, humanists, industrialists, and national and international civil servants”; the group took its name from the city where that meeting took place. Their broad concern was, as they saw it, the global complex of economic, political, social, cultural, and environmental problems.

In the gender-hindered language of the times, Club members later decided to pursue what they called a “Project on the Predicament of Mankind.” Their analysis employed then-state-of-the-art computer models developed at the Massachusetts Institute of Technology. The models predicted the trajectories of humanity’s and the Earth’s vital signs all the way through the twenty-first century, based on several alternative strategies aimed at relieving or circumventing ecological limits and thereby avoiding economic decline or even the collapse of civilization. But every model they tested led to a dangerous “overshoot” of environmental limits. Sometimes it was because of pollution, sometimes resource scarcity, sometimes decline of food production. Any of these, the model predicted, would cut off economic growth sometime before 2100, triggering an irreversible decline. Regarding technology, for example, the book’s authors came to a grim conclusion: “When we introduce technological developments that successfully lift some restraint to growth or avoid some collapse, the system simply grows to another limit, temporarily surpasses it, and falls back.”

Even in a model that assumed “unlimited” resources, strong pollution controls, increased agricultural productivity, and “perfect” availability and application of family planning, while using every tool at hand “to circumvent in some way the various limits to growth,” the result was still an end of growth, caused by three “simultaneous crises”: soil degradation, resource depletion “by a prosperous world population,” and dramatic increases in contaminants—among which the modelers presciently included excess atmospheric CO2 (carbon dioxide). They concluded that the “application of technological solutions alone has prolonged the period of population and industrial growth, but it has not removed the ultimate limits to that growth.”51

Finally, they radically altered the model’s parameters to portray a society that chooses restraint. For example, they assumed universal access to fully effective birth control and a desired reproduction rate of two children; limits on production; dramatically increased efficiency of resource use and pollution prevention; emphasis on soil protection and universal food security; and better durability of goods and capital stock. That scenario produced not collapse but a world that ended up in an economically modest equilibrium that was adequate to satisfy human needs.

The Limits to Growth was full of graphs depicting future rises and declines of population, resource use, agricultural and industrial production, and pollution, and included a stark warning: “We . . . believe that if a profound correction is not made soon, a crash of some sort is certain. And it will occur within the lifetimes of many who are alive today.” For years, the book’s critics enjoyed pointing out that the collapses forecasted in the business-as-usual curves had not occurred. Finally, in the early 2000s, a few analysts decided to compare The Limits to Growth’s predictions with what had actually happened so far.52 They found that the world’s vital signs had followed the old model’s predictions quite closely up to that time. The bad news for us, they pointed out, is that if you follow those ascending business-as-usual curves, to which the world is still adhering, out to the year 2030, they show industrial and food production peaking and then collapsing.

THAT SEVENTIES CLAUSTROPHOBIA

Almost exactly a year after publication of The Limits to Growth, people in the United States found themselves in their first encounter with general scarcity since World War II. It wasn’t the beginning of the collapse portrayed in the book’s graphs, but it was a pretty good preview.

In October 1973, an already alarming blast of inflation triggered largely by the prolonged U.S. war in Indochina was exacerbated when Arab nations belonging to the Organization of Petroleum Exporting Countries (OPEC) imposed an oil embargo on Western countries. World oil prices leaped suddenly and dramatically, and gasoline prices in the United States spiraled to unheard-of heights. As inflation impacted the economy, it was perversely accompanied by recession, bringing a new word into the American lexicon: stagflation. President Nixon ordered that a World War II-style national allocation plan be put in place to ensure that every region had access to adequate fuel supplies. Once again, pleasure driving was discouraged. Nixon also announced cuts in deliveries of heating oil: 15 percent for homes, 25 percent for businesses, and 10 percent for manufacturers. Aviation fuel was cut 15 percent, and on the nation’s highways, the maximum speed limit was lowered to 55 miles per hour.53

Long lines at gas stations became an enduring symbol of the 1970s. A more alarming development, one illustrating the severe backlash that governments can face when they attempt to deal with resource limits, was the violent strike by independent truckers that first broke out in December 1973. To back their demands for reduced diesel fuel prices, abolition of the 55 mph speed limit, and general deregulation, the truckers not only stopped hauling but also tried to keep all trucks off the road. Many parked their semis in the middle of highways, while others resorted to throwing bricks, puncturing tires, and brandishing firearms. After the first two rounds of protests, one trucker told the press, “When the next shutdown comes around . . . I’m gonna take my goddamn truck and burn it on the goddamn White House lawn.”54

The end of the OPEC embargo in 1974 brought just enough relief from gasoline shortages to calm nerves and shorten gas lines, but what would become known as the “energy crisis” was far from over. Upon taking office following Nixon’s resignation, President Gerald Ford laid out a plan to reduce the country’s dependence on imported oil through taxes and tariffs, but it turned out to be a political disaster for him. And stagflation raged on. President Ford was among those who saw inflation becoming an even greater national threat than unemployment. At one point, he noted that unemployment was “the biggest concern of the 8.2 percent of American workers temporarily out of work,” but inflation was “the universal enemy of 100 percent of the people.”55 Any attempt at New Deal–style stimulus to address the stagnation problem would have triggered even worse inflation. Nor was military spending a solution; after all, it was a big part of what had gotten America into its inflationary predicament in the first place.

The oil crisis calmed for a while. But in 1979, the Iranian Revolution sent petroleum prices skyward again. In Levittown, Pennsylvania, a crowd of 1,500 gas rioters reportedly burned cars, destroyed gas pumps, and threw rocks and bottles at police. One officer responded to a question from a motorist by smashing his windshield, whacking the driver’s son with his club, and putting the man’s wife in a choke hold. In all, eighty-two people were injured, and almost two hundred were arrested. The U.S. media solemnly discussed the possibility of “civilizational breakdown.”56 That summer, inflated diesel prices triggered a revival of the independent truckers’ strike. The historian Shane Hamilton described chaos that exceeded that of the 1973–74 strike:

On June 5, 1979, a convoy of truckers arrived in Washington and circled the Capitol. [Strike leader] Mike Parkhurst seized the moment and called for a nationwide shutdown, not simply to demand lower fuel prices, but to abolish the Interstate Commerce Commission and open up regulated freight trucking to untrammeled competition. By the end of June, approximately seventy-five thousand truckers heeded Parkhurst’s call and stopped driving. Once again the protests were violent, as roving bands of truckers set fire to empty trucks and shot at the windshields of drivers who refused to stop. Nine states called out the National Guard. By the time the shutdown ended in early July, one driver had been shot and killed, dozens more injured.57

As enraged truckers were laying siege to Washington, President Carter, by then in his third year in office, prepared to deliver what was being billed as his most important speech yet. It would be his fifth address to the nation on energy policy, a sign of how that issue had so far dominated his presidency. (Four months later, the Iran hostage crisis would displace the energy crisis as his administration’s most pressing concern.)

On July 15, 1979, Carter delivered his speech live on prime-time TV.58 Right at the top, he declared that the threat everyone was calling an energy crisis was actually a crisis of confidence, of self-indulgence, of consumption. “Human identity,” he said, “is no longer defined by what one does but what one owns.” In an echo of today, he decried the “fragmentation and self-interest” that was roiling the nation. ABC’s Frank Reynolds would later characterize that portion of the speech as “almost a sermon.” To today’s ears, the proposals Carter rolled out in his speech sounded something like a Green New Deal, 1970s-style. He envisioned investing massively in alternative energy sources, restricting oil imports, creating an “Energy Mobilization Board,” establishing a “bold conservation program,” and making energy affordable to low-income Americans. He proposed spending an additional $10 billion on public transportation and asked Americans “to take no unnecessary trips, to use carpools or public transportation whenever you can, to park your car one extra day per week, to obey the speed limit, and to set your thermostats to save fuel.” He added, “Every act of energy conservation like this is more than just common sense, I tell you it is an act of patriotism.”59

In 1979, the terms Carter used in naming his proposals didn’t mean quite what they might during today’s climate emergency. The Energy Mobilization Board’s (EMB) mission, for example, was to expedite the regulatory process and put high-priority fossil-energy projects on the “fast track.” The top tier of projects for the EMB included oil and gas drilling on federal land, extracting oil from shale, coal gasification, coal liquefaction, and building new oil pipelines. Renewable energy and energy conservation projects would be treated as lower-priority initiatives. Less than two weeks after Carter pitched the EMB in his big energy speech, the House voted it down. Concerns, according to one observer, included perceived encroachment upon states’ rights, expansion of bureaucracy, and “reluctance by members of the Republican party to support a key element in the President’s energy program in an election year.”60

In his July 15 speech, Carter had also called on Congress to authorize one major energy conservation initiative: a standby plan for rationing gasoline.61 Although it took a while, Congress passed such a measure in 1980. Five billion ration coupons had been printed up during the Nixon years and were ready to go.62 Rationing would be triggered in the event of a 20 percent shortfall in the national gasoline supply, and household gasoline allowances would be fixed at 20 percent below normal consumption.63

In the end, rationing would not be needed. Oil supplies stabilized, economic stagnation suppressed fuel demand, and the resulting glut pushed world prices down. No future need for rationing was anticipated. Furthermore, the Army reportedly was concerned that if the coupons, which featured an image of George Washington, got out into circulation, change machines might mistake them for dollar bills. In June 1984, Nixon’s old coupons were pulled out of storage at the Pueblo Army Depot in Colorado, shredded, and buried.64

The energy claustrophobia of the 1970s was summed up by historian Jefferson Cowie:

[T]he nation running out of energy was both a reality and a metaphor, and the problem of limits shaped the entire discussion. It haunted Richard Nixon, stymied Gerald Ford, all but destroyed the Carter presidency, and opened up the space for the Reagan restoration of the new Gilded Age. . . . By the time Ford filled in after Nixon’s resignation, the litany of a restricted future had become less abstract and more particular until Carter was forced to concede that “dealing with limits” was the “subliminal theme” of his presidency.65

Carter had declared in his 1979 speech that “beginning this moment, this nation will never use more foreign oil than we did in 1977—never.” But the president did not reaffirm that pledge in his 1980 State of the Union address. Instead, in response to the Soviet invasion of Afghanistan two months before, he announced what came to be known as the Carter Doctrine. The United States would put the world on notice that it would use military force to protect its interests in Southwest Asia, the Arabian Peninsula, and other oil-rich regions.

After almost four subsequent decades of burning fossil fuels—and having seen the beginnings of the climatic impact of all those emissions—we can ask whether the energy conservation proposals of the 1970s could have evolved into a transition to independence from fossil fuels. To take one small example, if Congress’s standby gas-rationing plan had been triggered, and had per-capita consumption remained at the rationed amount until the present day (taking population increase into account), we could have saved 920 billion gallons, more than six years of today’s U.S. gasoline consumption.66 That would have kept 9 billion tons of CO2 out of the atmosphere. It wouldn’t have been enough to prevent the climate emergency, but perhaps forty years of living under such a limit could have set off a chain reaction of progressive moves throughout the economy aimed at dealing with energetic and ecological limits. Left unconstrained, however, the fossil fuel business, and the myriad commercial goods and services that extend from it, continued to grow like unchecked tumors.
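
As a rough check on the arithmetic in that claim, the sketch below converts the 920 billion gallons cited above into tons of CO2 and years of consumption. The emissions factor of roughly 8.9 kilograms of CO2 per gallon of gasoline burned and the estimate of about 140 billion gallons for recent annual U.S. gasoline consumption are assumptions supplied here for illustration; they are not taken from the book or its notes.

```python
# Back-of-the-envelope check of the savings claim in the paragraph above.
# The 920-billion-gallon figure comes from the text; the emissions factor and
# the annual-consumption estimate are illustrative assumptions.

GALLONS_SAVED = 920e9             # gallons, from the text
CO2_PER_GALLON_KG = 8.9           # assumed: ~8.9 kg CO2 per gallon of gasoline
ANNUAL_US_GASOLINE_GAL = 140e9    # assumed: recent annual U.S. consumption, gallons

co2_metric_tons = GALLONS_SAVED * CO2_PER_GALLON_KG / 1000   # kg -> metric tons
co2_short_tons = co2_metric_tons * 1.102                     # metric -> short tons
years_of_consumption = GALLONS_SAVED / ANNUAL_US_GASOLINE_GAL

print(f"~{co2_metric_tons / 1e9:.1f} billion metric tons of CO2")   # ~8.2
print(f"~{co2_short_tons / 1e9:.1f} billion short tons of CO2")     # ~9.0
print(f"~{years_of_consumption:.1f} years of U.S. gasoline use")    # ~6.6
```

Under these assumptions the totals land close to the text's figures of 9 billion tons and more than six years of consumption.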

One legacy of the 1970s did persist: the Carter Doctrine. Ensuring the flow of fossil energy became a top priority for the U.S. military, which, not coincidentally, has become one of the world’s largest petroleum consumers. Several wars later, much of our armed presence around the world remains dedicated to securing U.S. access to foreign oil, natural gas, and other mineral resources.

MAKING FOSSIL AMERICA GREAT AGAIN—AGAIN

Federal energy and environmental policy would suffer a severe case of whiplash following the inauguration of President Ronald Reagan in 1981. Toward the end of Reagan’s first term, James Everett Katz of the University of Texas wrote that the president had “returned the USA to an era when the energy industry and the government cooperated amiably with each other. Gone are the equity concerns that were an integral part of the Nixon, Ford and Carter Administrations’ energy policies. Instead, a minimum of governmental involvement in energy planning is advocated, and Reagan’s energy philosophy has backed free market forces instead.”67

Six months into that first term, Reagan’s team put out a National Energy Plan, which turned out to be a search-and-destroy mission against any of the even faintly green or humanitarian provisions that had been in previous energy plans under Carter.68 All conservation and renewable energy policies were targeted for elimination. Consideration of social impacts? Gone. Price controls to keep fuel and utility bills within reach of low-income households? Out. Consumer protection? Sorry, no more of that. Katz noted, “The plan also offers little hope that the government will take an active role in handling energy shortages or emergencies.”
