American Nightmare - Randal O'Toole


3. The Urban Dream

Though a majority of Americans continued to live in rural areas until around World War I, American cities began rapidly growing after 1840. From 1790 to 1840, the urban share of the American population had barely doubled, from slightly less than 5 percent to slightly more than 10 percent. In 1840, New York, Philadelphia, and Boston—still the nation’s three largest urban areas when “suburbs” such as Northern Liberties and Charlestown are included—housed just 4 percent of the population, and only 20 cities had more than 20,000 people. Still, the 1840 census was the first to find more than 100 communities of more than 2,500 people each—there were just 90 in 1830.

After 1840, the urban share of the nation’s population began growing by about 5 percentage points per decade, so that it nearly doubled to 20 percent by 1860, doubled again to 40 percent by 1900, and doubled again to nearly 80 percent by 2000 (see Figure 3.1). This growth increasingly put urban homeownership rates at the heart of our story.


New transportation technologies—the steamboat after 1810, the canal after 1820, and most importantly the railroad after 1830—stimulated the explosive growth of the cities. By greatly reducing the costs of moving raw materials to factories and finished goods to consumers, these new forms of transportation increased industrial production and the demand for workers in urban areas.

If transport and industry provided the jobs, immigration provided many of the employees and their families who made up the growing populations of the nation’s larger cities. Immigration grew from under 10,000 people per year before 1825 to more than 100,000 people per year in 1842. From 1840 through 1900, immigrants made up 30 percent of the population growth of the United States, and many of those immigrants settled in industrial cities, such as Chicago, Detroit, and Pittsburgh.

The Middle-Class/Working-Class Split

The advent of large corporations, such as railroads, textile manufacturers, and steelmakers, created a demand for two types of employees: managers and laborers. Management required special skills and, often, personality traits, so good managers were paid a premium. Laborers were viewed as interchangeable and were generally paid much less. Management jobs tended to go to native-born Americans with better educations, while immigrants mainly took the laborer jobs. The managers became the middle class, while laborers became the working class.

These distinctions were largely unknown before the 19th century. The first recorded use of the term “working class” was in an 1813 book titled A New View of Society, which divided people into “poor,” “working class without property,” “working class with property,” and higher classes that employed the working classes.1 The term “middle class” is older, its earlier meaning referring to traders and merchants who were between the aristocracy and peasants. Its modern usage, referring to professionals and managers, has been traced to an English statistician named T. H. C. Stevenson, who worked for the Registrar-General—roughly the British equivalent of the Census Bureau—in his analysis of Britain’s 1911 census.2

Cultural differences between the middle and working classes go well beyond differences in income, which greatly narrowed in the second half of the 20th century. Qualifying for a management job generally requires more education, and that advanced education helps shape people’s tastes in food, recreation, entertainment, and lifestyles in general.

These cultural differences go back many decades. An analysis of 19th-century literature, for example, found that working-class fiction, reflecting the realities of working-class lives, tended to be more violent and more overtly sexual than middle-class fiction.3 Cultural differences in the 19th century were particularly exacerbated by the high percentage of immigrants—often immigrants from countries culturally much different from the Anglo-Americans who made up most of the native-born population. Irish immigration peaked in the 1840s and 1850s; German immigration came in waves that successively peaked in the early 1850s, early 1870s, and early 1880s; Italian and eastern European immigration became significant in the 1890s and peaked in the 1900s–1910s.4 These waves of immigration brought in people who were increasingly alien to the Anglo culture that dominated the United States.

Imagine a Sunday summer afternoon in late 19th-century Chicago or Pittsburgh where a working-class family lives next door to a middle-class family. On the front porch, the working-class father plays a banjo and the family sings songs from their native land in their native language, accompanied by clucking chickens and barking dogs. Meanwhile, the middle-class family gathers around the piano indoors. The working-class mother is cooking a meal liberally laced with garlic, strong cheeses, and other odoriferous ingredients. The middle-class family eats food even blander than the cooking the English are noted for today. Boarders come and go in the working-class house along with customers of the in-home sewing business; the only visitors to the middle-class house are as prim as its residents. The potential for conflicts is obvious.

The Immigrant Dream

One of the cultural differences between 19th-century working- and middle-class families was in how they viewed their homes. While middle-class families considered homes to be simply a place to live, working-class families also used their homes as income-producing opportunities. They could raise chickens and vegetables in their yards; take in boarders; and run small businesses, such as sewing, out of their homes.5 These supplemental sources of income, together with income earned by children and other family members, typically came close to equaling the income earned by the family’s principal wage earner.6

In addition, working-class families rarely had bank accounts (partly because 19th-century banking hours were so limited—typically from 10:00 a.m. to 3:00 p.m.—that laborers could rarely visit a bank). Instead, they regarded their home as a bank, using their homes as collateral for small loans (from relatives or “real-estate entrepreneurs,” not from banks) to start up businesses or tide them over if the leading income producer in the family was temporarily out of work.7 Realtors of the day said that houses were “better than a bank for a poor man,” and “with hindsight,” says one historian, “this appears to have been true.”8

As such, 19th-century working-class families had powerful incentives to own their own homes. “The lure of land ownership served as the single most important magnet for English and European immigrants from the sixteenth century until well into the nineteenth,” observes historian William Worley. “In the last third of the nineteenth century, this hunger was transformed into desire for ownership of city and town lots.”9 “The ambition of the immigrant to own property in America is one of his most striking characteristics. For it, he will make almost unbelievable sacrifices both of his own comfort and that of his wife and children,” wrote an observer of Chicago working-class living conditions in 1913. “The possession of a house from which one may draw an income is the highest mark of prosperity.”10

Fortunately for the immigrants, housing was inexpensive. Land in most cities was abundant. Although land within walking distance of factories eventually became scarce, in Chicago, for example, subdividers had created enough lots by 1873 to accommodate a million people, or 2.5 times the city’s actual population.11 Rather than buy a lot, people could save a little money by leasing a lot and building a small, wood-frame home that could be moved when the lease expired.12

The cost of erecting a small, wood-frame structure was low and could be made lower if the homeowners did much of the work themselves. Someone could buy a lot and build a small home—suitable for eventual expansion—for $800 to $1,000.13 Median incomes for late 19th-century or early 20th-century wage earners were $350 to $600 a year, or roughly one-third to one-half the cost of a house.14

Early Credit Tools

The financial tools available to 19th-century homebuyers were far more primitive than those available today. Until 1886, no one had conceived of a contract that would convey a home’s title to the buyer only after the buyer had paid off most or all of the house, and sellers were reluctant to give title to someone without a substantial deposit. So most lots and homes were sold with a 50 percent down payment. Working-class homebuyers would have to save this money or, more likely, borrow it from relatives.

The remaining 50 percent would be paid through a nonamortizing loan—which today we would call an interest-only loan—typically at 7 to 9 percent interest, with a balloon payment of the principal after five or six years. A $500 loan would have a monthly payment of less than $4; at each five- or six-year interval, the family would typically refinance the loan. To make monthly payments, families might take children out of school in their mid-teens so they could earn money and contribute to the household budget.15
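The loan arithmetic here can be checked in a few lines of code. This is my own illustration, not from the book; the $500 principal, the 7 to 9 percent rates, and the balloon structure are the figures cited above:

```python
# Sketch of the 19th-century nonamortizing ("interest-only") loan described
# above. Illustrative only; figures are those cited in the text: a $500 loan
# at 7 to 9 percent, with the full principal due as a balloon payment.

def interest_only_payment(principal: float, annual_rate: float) -> float:
    """Monthly payment on a nonamortizing loan: interest only, no principal."""
    return principal * annual_rate / 12

for rate in (0.07, 0.09):
    payment = interest_only_payment(500, rate)
    print(f"$500 at {rate:.0%}: ${payment:.2f}/month, $500 balloon due at term")
```

At 7 percent the payment works out to about $2.92 a month and at 9 percent to $3.75, consistent with the “less than $4” figure in the text.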

Housing affordability was a major political issue in late 19th-century cities. After the Chicago fire, the city’s Common Council considered an ordinance forbidding wood-frame homes and requiring brick or stone instead. That constraint would have easily tripled the cost of home construction. When more than 2,000 people besieged city hall in protest, the council agreed to exclude working-class neighborhoods from the requirement.16 Even among the middle class, passage of the ordinance slowed home construction. According to a local writer in 1878, the ordinance resulted in “a brisk demand for building lots just outside of the fire limits, and a chronic dullness in the market for moderately choice lots within those limits.”17

Middle-Class Renters

While working-class families had strong reasons to buy homes, the incentives for middle-class families were quite different. Before zoning, no one could predict what would happen to neighboring properties. Although it was unlikely that anyone would build a factory or dig a gravel pit next to middle-class homes, no one could stop a working-class family from buying a vacant lot and building a small home complete with boarders, in-home businesses, smelly foods, noisy children, and backyard livestock. Unlike today, urban neighborhoods in the 19th century tended not to be divided by income levels.

This lack of security from working-class invaders encouraged the vast majority of middle-class urban families to rent or lease their homes in the late 19th century. Landlords in some cities offered leases for as long as 21 years, blurring the distinction between leasing and renting a home.18 Since middle-class workers had easier access to banks and other means of saving and investing their money, they did not feel the need to buy a home to use as a savings bank. As a result, working-class homeownership rates, especially among immigrants, were higher than for middle-class families.

The 1890 census found that about 37 percent of nonfarm dwellings were owned by their occupants. However, about 40 percent of “nonfarm” homes were in rural areas; homeownership rates in urban areas, though not specifically recorded by the Census Bureau, appear to have been much lower. An 1890 survey of urbanites by the U.S. commissioner of labor found that just 17.6 percent were homeowners, which would put rural nonfarm homeownership at 62 percent.19 That figure sounds reasonably accurate since in other decades for which data are available, rural nonfarm and rural farm homeownership rates tend to be similar, and rural farm ownership was 66 percent in 1890.

Homeownership rates for many ethnic groups, who were mainly working class, were much higher. As early as 1870, 27 percent of German families and 20 percent of other immigrant families owned their own homes in Chicago, and homeownership rates among these groups were probably even higher by 1890.20 Since upper-class homeownership rates were close to 100 percent, middle-class homeownership rates must have been below 10 percent.

Housing Innovations

Several innovations during the late 19th century made it even easier for working-class families to own their own homes. First was the development of new techniques that sped and simplified home construction. Balloon-framed houses held together with nails that almost anyone could pound replaced traditional timber-framing methods that required skilled workers to make the mortise-and-tenon joints that held the house together. Sometimes called “Chicago construction” because it was widely used after the 1871 Chicago fire, balloon framing was made possible by the development of machine-made nails and standardized lumber sizes.

The simplification of home construction stimulated another innovation, which was the growth of the home construction industry and the early application of mass production techniques to home building. Traditionally, subdividers would sell lots, and buyers would build or hire someone to build a home. But the 1880s saw the emergence of housing developers who would sell lots and build homes on those lots, both to order and on speculation.

One of the largest developers in the country was Samuel E. Gross, a lawyer who began subdividing land and building homes in the Chicago area in the early 1880s. In little more than 10 years, he sold more than 40,000 lots and built more than 7,500 homes—more than any other Chicago homebuilder before or since—in 150 different subdivisions.21

Chicago families could buy an S. E. Gross home for as little as $800, and $1,000 to $1,500 would buy a four-room house, the difference in cost depending on whether or not the house had indoor plumbing. Many of Gross’s early homes can still be found in Chicago with relatively few modifications other than the addition of indoor plumbing if the home was not originally so built.22

In 1886, a Cincinnati subdivider named W. E. Harmon conceived of the idea of a “contract for deed,” in which a buyer would not receive the deed to the property until it was completely paid for. Initially, Harmon sold lots with less than a 10 percent down payment; eventually, he reduced that down payment to as little as 1 percent.23

Such contracts were quickly adopted by other subdividers and homebuilders. By 1889, Samuel Gross was offering to sell a $950 home with as little as $50 down followed by payments of $8 a month.24 Though the loans were nonamortizing—meaning buyers faced a balloon payment every five or six years—Gross bragged that he never foreclosed on a loan, instead renegotiating new loans when needed.25 Just as General Motors encouraged auto buyers to step up from Chevrolet to Buick and Cadillac, Gross helped workers trade up to larger and better homes as their fortunes improved.

Gross has been lauded for having “altruistic motivations” in selling homes to working-class families on such generous terms.26 Yet he was no altruist, building a fortune estimated in 1895 to be $4 to $5 million.27 He had merely combined several ideas into a successful business model that focused on a customer base of people who preferred to own, rather than rent, their homes.

Credit Innovations

A third innovation was the growth of building and loan associations (B&Ls, later known as savings and loan associations). Unlike commercial banks, which had owners and customers, the original B&Ls were cooperatives: people who opened accounts and saved money were members. Instead of investing for maximum profits, the associations worked primarily for their members, loaning them money for real estate and other purposes. The first American B&L opened in the Philadelphia area in 1831. By 1893, more than 5,500 such associations across the country had helped more than 300,000 families acquire homes.28

Building and loan associations pioneered the use of amortizing loans as early as the 1880s. Amortizing mortgages were less risky for both the buyer and the seller, and they made it possible for many families to become true homeowners, rather than mortgagees, more rapidly. Before 1913, national banks were not legally allowed to make real-estate loans, and from 1913 to 1934 they could lend only half the appraised value of a property for just five-year terms. Although state banks could make such loans, they relied on nonamortizing loans into the early 20th century. A few developers, such as Boston’s Robert Treat Paine, offered amortizing mortgages to homebuyers as early as the 1890s.29 However, amortizing mortgages from B&Ls were “commonplace by the late nineteenth century,” and the vast majority of such mortgages before 1930 were provided by these associations.30 B&Ls also loaned as much as 70 percent of the appraised value of a home, offered terms as long as 12 years, and generally charged lower interest rates than banks.
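A short comparison shows why amortization mattered. This is a hedged sketch, not from the book: the $500 loan and 8 percent rate are illustrative figures, and the 12-year term is the longest B&L term the text mentions:

```python
# Compare a nonamortizing (interest-only) loan with a fully amortizing
# mortgage of the kind pioneered by building and loan associations.
# Illustrative assumptions: $500 principal, 8 percent annual interest,
# hypothetical 12-year amortization.

def amortizing_payment(principal: float, annual_rate: float, years: int) -> float:
    """Level monthly payment that fully retires the loan over the term."""
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

interest_only = 500 * 0.08 / 12                 # interest forever, $500 balloon
amortizing = amortizing_payment(500, 0.08, 12)  # loan fully paid off
print(f"Interest-only: ${interest_only:.2f}/month plus a $500 balloon payment")
print(f"Amortizing:    ${amortizing:.2f}/month, nothing owed after 12 years")
```

For roughly two dollars more a month, the amortizing borrower owns the house free and clear at the end of the term instead of refinancing every five or six years.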

Public Health

Even as these innovations made housing more affordable, another innovation made it less affordable: sanitary sewers and water supplies. Clean water delivered to a kitchen or bathroom sink was arguably a private good, and private water supply companies sprang up in many American cities soon after the Revolution. By 1800, Americans had built waterworks in 17 cities, 16 of which were private; by 1830, there were 45 waterworks, 36 of which were private.31

If clean water was a private good, polluted water was a public bad. Economically, a public good is one that benefits everyone even if only some pay the cost; national defense is the classic example. Conversely, a public bad is one that potentially harms everyone even if only a few are responsible for the problem.

Poor sanitation had been the bane of cities ever since the first cities were built. Before the 19th century, many European cities had higher death rates than birth rates and were able to grow only because of people emigrating from rural areas. These problems were transmitted to the New World as soon as cities grew to a significant size and particularly when world trade expanded the range of microorganisms that were once limited to one or two countries.

One such microorganism causes cholera, which one historian called “the classic epidemic disease of the nineteenth century.”32 Before 1800, cholera was largely confined to India, but in 1817 an epidemic affected much of the Old World. An 1832 epidemic reached the New World, killing thousands of people in Chicago, Cincinnati, New York, and many other cities along the Mississippi and Ohio rivers, the Erie Canal, and the Great Lakes. At the time, most people suspected the disease was transmitted through the air; many years, and several more epidemics, were required before public health officials realized that the real problem was contaminated drinking water.

Cholera is caused by a bacterium that infects the human intestines, leading to severe diarrhea. Since many people obtained their water from easily contaminated rivers or wells, the disease could spread rapidly. Cholera was a particularly frightening disease because dehydration killed 50 to 60 percent of infected patients within a few hours; it was not until the 20th century that medical doctors realized that massive rehydration could reduce mortality to 1 percent.

As early as 1842, an English social reformer named Edwin Chadwick called for replacing cesspools and privy vaults with a citywide sewage system consisting of pipes that would use household water to transmit fecal matter and other waste to a single location where it could be composted and sold as fertilizer.33 Chadwick himself failed to understand the dangers of drinking contaminated water; instead, he believed that cholera, typhoid fever, dysentery, and other waterborne diseases were spread through “foul air,” so his goal was to move sources of contagion downwind of the cities.34

The first proof that cholera was spread by contaminated drinking water had to wait until 1854, when an English physician named John Snow found that nearly every victim of a cholera epidemic in London’s Soho district lived near and drank water from a single well. The well was later found to be only three feet from a cesspool that had been contaminated by cholera.35 Although Snow’s report led the city to remove the pump handle from that particular well, Snow’s theory of waterborne contamination remained controversial for many years.

Although most American waterworks were eventually taken over by city governments, they could potentially be private, and their costs, even when government owned, have largely been paid out of user fees. From a public health view, however, voluntary user fees might be inadequate to pay for sewers; as long as anyone could refuse to pay the fee and continue to dump their wastes in a cesspool, water supplies could potentially become contaminated. Chadwick’s hope that fertilizer sales would cover the costs of a citywide sewage system was unrealistic, and the high cost of sewers combined with debates over the actual causes of diseases kept most American cities from installing universal sewage systems for several decades.

Chicago, for example, built one of the first American municipal sewer systems dealing with human wastes in 1856.36 Yet 37 years later, a survey found that nearly three out of four Chicago residents still relied on privies rather than indoor plumbing.37 The reason was that Chicago paid for the sewers out of user fees, and the only homes connected to the sewage system were those whose owners could afford to pay for indoor plumbing and hookup fees.38 It was not until 1902 that Chicago mandated that all new homes be hooked up to sewer systems. This law exempted existing homes, and by increasing the cost of new housing, it made it more difficult for working-class families to buy a home.39

The cost of indoor plumbing fixtures and connections to city water and sewer systems could nearly double the price of a small, single-family home.40 Mandating such hookups would price many working-class families out of the homeownership market.

Boston took a different approach from Chicago’s. Though it did not begin to construct an integrated sewer system until after the Civil War, when it did so it paid for the capital costs out of general funds, meaning, mainly, property taxes.41 This payment method meant that owners of expensive homes effectively subsidized owners of smaller homes. Low-income families who sought to build a home still had to pay for indoor plumbing fixtures and to pay the city for water meters and operating costs.

In the long run, the public sewer model had an even more profound effect on housing costs. The perceived need for publicly owned, centralized sewer systems led to a significant growth in city government. Cities hired sanitary engineers who attempted to forecast future needs and write long-range plans to meet those needs.42 These long-range plans set a precedent for later city plans and increasingly specific regulations written to deal with such things as transportation, land uses, parks, historic buildings, watersheds, and trees. Although no one can dispute the public health benefits of integrated sewage systems, later policies that, for example, imposed “impact fees” on homebuilders to pay for transportation or created time-consuming permitting processes for the cutting of individual trees significantly increased the costs of homeownership while providing dubious benefits.

Housing for Factory Workers

In the short run, another late 19th-century trend had an even larger effect on housing affordability: the growth of the factory system. The nation’s first factory, the Slater Mill in Pawtucket, Rhode Island, employed just nine workers. By 1880, factories remained small enough that close to 90 percent of the families of, say, Detroit could live in individual homes and workers would still be within walking distance of their places of employment.43

However, the average number of employees per factory more than doubled between 1869 and 1899 and continued to grow after that.44 By 1904, 60 percent of all manufacturing employees worked in factories with more than 100 employees, and 12 percent worked in factories with more than 1,000 employees.45 Moreover, the tendency of many industries, such as Chicago’s stockyards, to concentrate in one part of a city created transportation problems for workers.

Although some cities saw the installation of horsecars as early as 1850 and rapid growth of electric streetcar networks after 1890, factory workers earning between $3 and $6 a week could not afford to devote 10 to 20 percent of their incomes to transit fares. The limited amount of land within walking distance of factories, and the resulting high cost of such land, forced many to live in high-density tenements instead of single-family homes.

The word “tenements” brings to mind extremely high-density mid-rise buildings housing several families per apartment, or even per room, in New York City’s Lower East Side. Reformer Jacob Riis photographed residents of these buildings in the late 1880s and early 1890s. His 1890 book How the Other Half Lives: Studies among the Tenements of New York included several of these photos, drew public attention to poor housing conditions, and influenced the passage of several tenement laws.46

Riis’s tenements were five to six stories high, built on 25-by-100-foot lots, meaning about 16 could fit on a single acre. The front and back of the buildings occupied the full width of the lot with a small airshaft between the buildings in the center. Because they were narrow in the middle to make room for the airshaft, these buildings were often called “dumbbells.”47 Designed to house four families to a floor, or up to 24 per building, they were sometimes packed with far more. The narrow airshafts meant that most rooms had little light, and the odors from the garbage that people inevitably threw to the inaccessible bottoms of the shafts must have been stifling. Many of these tenements had no indoor plumbing; those that did might have only one toilet per floor. Perhaps most scandalous to the middle-class readers of Riis’s book was the lack of privacy: children of all ages and both sexes often slept in the same rooms as their parents, other relatives, and unrelated boarders.

New York City tenement conditions were a direct function of the density of inner-city jobs. The Triangle Shirtwaist factory, site of the infamous fire that killed nearly 150 workers, occupied 3 floors of a 10-story building in lower Manhattan and employed 500 workers. The building covered about one-quarter acre out of the nearly 7,000 developed acres in Manhattan. Although pre-1890 factories would have fewer floors, even six-story factories could contain 4,000 workers per acre, most of whom had to live within walking distance of the buildings.

In 1910, the year before the Triangle Shirtwaist fire, Manhattan had about 2.3 million residents at an average population density of about 100,000 people per square mile. (For comparison, the median density of the nation’s 50 largest urban areas in 2000 was about 2,800 people per square mile.) Manhattan today has about 2.3 million jobs, and certainly nearly all the pre-1900 residents who had jobs worked in Manhattan. The combination of offices, factories, and residences competing for space made lower Manhattan some of the most valuable real estate in the world. Factory workers who couldn’t afford to commute off the island were forced to live in high-density environments, which is why the Lower East Side housed some 334,000 people per square mile in the 1880s.
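The density figures in this paragraph hang together arithmetically. A quick back-of-the-envelope check (my own arithmetic, using only the numbers cited above):

```python
# Check the internal consistency of the Manhattan density figures cited above.
residents = 2_300_000        # Manhattan population, 1910
density = 100_000            # people per square mile, Manhattan, 1910
modern_median = 2_800        # median density of the 50 largest urban areas, 2000
lower_east_side = 334_000    # people per square mile, Lower East Side, 1880s

area = residents / density   # implied land area in square miles
print(f"Implied Manhattan land area: {area:.0f} square miles")
print(f"1910 Manhattan was {density / modern_median:.0f} times the 2000 median")
print(f"Lower East Side was {lower_east_side / modern_median:.0f} times the 2000 median")
```

The implied 23 square miles matches Manhattan’s actual land area, and the Lower East Side comes out at well over 100 times the density of a typical modern urban area.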

The problem was compounded by 19th-century construction technology, which made it difficult to build structures taller than about six stories. (The nation’s first steel-framed skyscraper, St. Louis’s 10-story Wainwright Building, was built in 1891; the tallest commercial masonry building ever built, Chicago’s Monadnock Building, is also 10 stories and was also built in 1891.) Residents typically need far more space in their homes than workers occupy in a factory; given Manhattan’s high land prices, the tenement was the way of packing as many workers and their families as possible into as little land as possible.

Although the conditions in the Lower East Side were truly awful, they were far from “the other half.” As historian Robert Barrows observes, “New York City’s Lower East Side, the case study most frequently cited because of its immaterial power, was virtually unique, an aberration that, in scale at least, was replicated nowhere else in the country.”48 “New York has over 100,000 separate tenement houses, whereas in most American cities that tenement house is the exception rather than the rule,” admitted Lawrence Veiller in a 1912 speech. Veiller had worked harder than anyone to promote legislation outlawing the housing conditions pictured in Riis’s photographs.49

Public concern about urban housing for the poor actually dates back to well before the Civil War. In 1847, a group known as the Association for Improving the Condition of the Poor published a survey showing that New York City tenement housing was “defective in size, arrangement, water supply, warmth, and ventilation, and that rents were disproportionately high.” As a result, the poor “suffer from sickness and premature mortality; their ability for self-maintenance is thereby destroyed; social habits and morals are debased, and a vast amount of wretchedness, pauperism, and crime is produced.”50

This belief in architectural determinism—the notion that the built environment shapes human behavior and a poorly built environment leads to debased morals, pauperism, and crime—guided the association’s solution, which was government regulation mandating minimum housing standards. Better housing, the association believed, would lead the poor to engage in less crime and more productive work, which would guide them out of poverty. Failing to persuade the New York City Council, the association went to the state legislature, which in 1867 passed a tenement house law requiring builders to provide a 10-foot backyard and a water supply and forbidding the renting of apartments that were totally underground. An 1879 law required a window in every room in a tenement.51

Ironically, the result of these laws was the dumbbell tenement that so horrified Jacob Riis: the narrow airshafts that created the dumbbell shape ensured that every room had a window even if the lower-story windows let in almost no light. This experience would be repeated over and over as low-income housing advocates, sometimes called “housers,” would propose government programs that, as finally implemented, did little for poor people other than make housing less affordable.

The New York State Tenement House Act of 1901, for example, required builders of new tenements to provide sanitary facilities, outward-facing windows in every room, and a courtyard for garbage removal in place of the airshafts where people tossed garbage. This act was considered a model law, but as historian Gwendolyn Wright observes, “The stipulations proved so strict that few speculative builders would divert their money into this kind of construction any more, and the housing shortage for the poor grew worse.”52

Wright points out that the housers “brought moralistic middle-class biases to their crusade” and “considered their own taste to be a universal standard of beauty, hygiene, and human sentiment.” When Chicago succeeded in condemning a tenement and was evicting its Italian residents, one reformer noted, “It was strange to find people so attached to homes that were lacking in all the attributes of comfort and decency.” One housing proposal urged that tenements be given more of those attributes by equipping them with such amenities as doorbells and bay windows.53 Such requirements, of course, would do nothing to fix the fundamental problem of urban poverty.

Rather than regulate urban housing, some housing reformers dreamed of moving working-class families to the suburbs, where land was cheap and they could live in uncrowded conditions. In the 1870s, Boston Unitarian minister Edward Everett Hale encouraged working-class men to form and join building and loan associations so they could buy suburban homes. Recognizing the transportation problem, Hale urged the railroads to provide “cheap trains for laboring men.”54

In New York City, Edward Bassett, a one-term member of Congress who later helped write New York City’s zoning code, argued that people should move “from crowded centres to the open spaces” where they could have “sunny homes and plenty of air.” Overcoming the transportation problem required “low fares, so that the expense might not deter people from moving where life would be pleasanter.” The main obstacle, he believed, was “the real estate forces of New York [who] believe in congestion” and who “prevent the opening up of new areas by five-cent rapid transit” for fear it will reduce property values in the urban core.55

Unfortunately, even the traditional nickel streetcar fares were beyond the reach of many unskilled workers. As planning historian Peter Hall notes, the development of the mass-produced automobile, not the five-cent streetcar, managed to “dissolve the worst evils of the slum city through the process of mass suburbanization.”56

Fortunately for working-class families outside New York City, most low-income housing in Chicago and other industrial cities would be considered luxury housing relative to the Lower East Side. Typical was the “two flat,” a two-story building with an apartment on each floor.57 Less common were three flats (or, as they were known in Boston, three-deckers) for three families. Many flats were owned by absentee landlords, but often the owners of a two flat or three flat would live on the top story (which was generally quieter and had slightly more square footage than lower stories) and rent out the other flats.

Like New York City tenements, two flats were built on 25-foot-wide lots, but most of the houses were only about 20 feet wide, so there was room for a walking path between each one. They generally had small backyards, some of which have been filled with garages since they were first built in the late 19th century. Two flats and three flats on narrow lots, with occasional four- to eight-plexes, provided enough density for working-class employees to live within walking distance of work in Chicago and other industrial cities. But the multifamily nature of the buildings meant that most workers could not achieve the immigrants’ dream of owning their own home.

Although middle-class families were less attached to the idea of homeownership than those from the working class, they enjoyed relatively high housing standards whether they rented or owned. A late 19th-century working-class home, such as the kind built by Samuel Gross in 1890, might have a parlor, a kitchen that doubled as a dining room, and two seven-by-eight-foot bedrooms, usually with a privy in back instead of indoor plumbing.58 Even more basic was a two-room working-class cottage, with a kitchen that also served as parlor, dining room, and bedroom for the children, and a bedroom that was sometimes shared with boarders.59

“A generation later,” says historian Joseph Bigott, housers such as Edith Abbott “denied that cottages ever provided decent accommodation.” Abbott argued “that the unskilled are a dangerous class; inadequately fed, clothed, and housed, they threaten the health of the community.” She wanted the government to build public housing for low-income families, but such government programs would be politically possible only if Abbott could persuade people that the homes workers provided for themselves were unsafe or otherwise inadequate.60 In attempting to do so, she was imposing her middle-class biases on working-class families.

Middle-class homes had several features not found in a typical working-class house of the 1890s: water, sewer, and (later) electrical hookups; a three-fixture bathroom; a kitchen sink and other new technologies, such as an icebox, washing machine, and, eventually, electrical appliances; a formal dining room; enough bedrooms so that parents and children could have their own rooms; a front porch; and storage closets (since working-class families had “little to store,” their basic homes “made almost no provision for built-in, enclosed storage”).61 Although working-class families aspired to add these features to their homes in the 20th century, the fact that their homes did not have them in the late 19th century did not mean they were ignorant or (except in the case of sanitation) dangerous to the community.

As the 19th century came to a close, Riis’s revelations about the abominable housing conditions of many low-income families led middle-class intellectuals to ask two important questions. First, how can the poor be assured of safe and decent housing? And second, how can we make sure they don’t move next door to us? Not surprisingly, the second question was answered first by government policies such as zoning and public housing, while the answer to the first question would wait for the ingenuity of entrepreneurs such as Henry Ford and William Levitt.
