From Personal Tragedy to Public Health Crisis
All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time.
BRADFORD HILL, 1965
By the mid-1950s the cat was out of the bag. Any doubt that lead exposure could permanently damage children was put to rest as researchers at Harvard documented continuing mental and neurological disorders among those ostensibly “cured” of acute lead poisoning, which was most often diagnosed after children showed a variety of symptoms, such as convulsions, muscle paralysis, “mental lethargy,” vomiting on eating solid food, and dizziness. For generations it was well recognized that workers in lead-based industries suffered severe neurological damage from lead poisoning, and by early in the twentieth century women and children were often barred from working in the areas of pigment and paint factories where lead was used. Beginning in the early twentieth century recognition grew that children outside the factory were also at risk because of contact with lead paint in their homes. As the nation’s cities grew exponentially following the Civil War, so too did the danger from lead paint that was used in and on the new houses.
By the 1920s physicians were remarking on the fact that children “lived in a lead world,” and by the 1940s a huge literature had emerged that detailed the horrifying effects of this metal on children. But for both children and adults prior to the 1940s, the assumption had been that if the overt symptoms of lead poisoning passed, there would be no residual effects. During World War II, the two Harvard researchers—Randolph Byers and Elizabeth Lord, a pediatrician and psychologist, respectively, at Boston’s Children’s Hospital—documented the long-term effects of acute lead poisoning even after a child had ostensibly “recovered.” From a group of 128 patients ranging in age from about ten months to four years who had been admitted with acute symptoms of lead poisoning over the span of a decade, the researchers followed twenty children who still lived in the Boston area. All but one of the children who had returned home with no clinical symptoms of cerebral damage still suffered in “both the intellectual and emotional spheres” in school over the course of the study. These children’s motor coordination was abnormal and their general intelligence appeared to have been permanently affected. A few of the children suffered “recurrent convulsions.” One child at the end of first grade had “not learned to write or print his name or recognize any figure.” Another six-year-old was described as “cruel, unreliable [with] impulsive behavior; runaway; unable to get on with other children or adults; excluded from school because of behavior.”1 In the decades since, researchers and clinicians have documented the huge numbers of children at risk, now with the understanding that lead causes permanent damage.
As the seriousness of this epidemic became increasingly apparent in the 1950s, public health officials in Baltimore, New York, Chicago, Cincinnati, Boston, and other large cities began to follow the scientific and medical literature on the effects of lead paint poisoning. Many cities passed ordinances that required warnings on containers and restrictions on the sale of lead paints for use on walls, woodwork, and other surfaces accessible to children. But their actions were piecemeal and uncoordinated.
Historically, health departments in the United States were local operations whose administrators rarely harmonized responses with each other, even in the face of the most dire public health threats. In the case of childhood lead poisoning, very few city administrators as late as the 1950s were even aware of the national scope of the problem, much less how colleagues in other communities were coping with it. While some of the larger cities began to establish registers to document the extent of the problem within their jurisdiction, there was no central source for information outside of the Lead Industries Association (LIA), the trade association of the lead industry. Nor did public health officials generally remember the controversy that arose about the potential hazards from lead when it was introduced into gasoline in the 1920s.
Since its creation in 1928, the LIA had downplayed health concerns for fear that they might undermine business, but that had not stopped the organization from tracking cases of death and disease among children exposed to lead paint as they were reported in the medical literature. In the 1950s the LIA bragged that it possessed the most extensive archive of newspaper articles, reports, and general information on this toxic metal.2 Though the U.S. Public Health Service (PHS) was nominally responsible for addressing the health effects of toxic metals, at the time this agency was largely focused on the problems of infectious epidemic diseases and their threat to the nation as a whole. The modern federal institutions that might potentially coordinate a national effort to inform local agencies of toxic threats and to direct remedial action were just being born. The U.S. Department of Health, Education, and Welfare, the predecessor to the current U.S. Department of Health and Human Services, which today oversees the PHS and the National Institutes of Health, was only established in 1953.
In the absence of federal and local knowledge and coordination on lead issues, from the 1930s through the 1950s the LIA assumed a central role in funding research on lead-related illness and framing national policy regarding childhood lead paint poisoning. The trade group resisted efforts by cities and states to regulate lead pigments in paint. Instead, in the 1950s it called for the establishment of limited, voluntary agreements among paint manufacturers to cap the amount of lead used in paints intended for indoor use. These recommendations the LIA misleadingly called “standards,” and both the lead and paint industries hoped they would thereby inoculate pigment and paint manufacturers against state and local regulatory action. The lead industry, through the LIA, in effect set the agenda that public health officials and lead researchers would live by for the foreseeable future; the LIA, of course, did not advocate the removal of lead paint from the walls of homes. Rather, from the 1950s onward it promoted the view that lead poisoning was a virtually insoluble problem, largely limited to black and Puerto Rican children living in slum dwellings, and that the elimination of childhood lead poisoning was a utopian dream.
Before the mid-1950s, the one exception to general ignorance about the extent of lead poisoning was in Baltimore, where in the 1930s the Department of Health had begun to track and even treat lead-poisoned children who appeared in its clinics. Baltimore was the first and only American municipality before the 1950s to develop, according to the pioneering research of historian Elizabeth Fee, “an extensive public health program on childhood lead paint poisoning.”3 The City organized health education campaigns, housing inspections, and lead-abatement programs, and it passed some of the nation’s first paint-labeling laws. Baltimore’s visionary commissioner of health, Huntington Williams, appointed in 1931, was instrumental in bringing the city’s lead problem to the forefront of public health knowledge. Baltimore’s early recognition of the issue’s seriousness may also be traced to the identification of fifty-nine cases of lead poisoning among poor African Americans who had burned battery casings to keep warm in the early years of the Depression.4 According to Fee, “Several patients developed acute encephalitis while others experienced headaches, vomiting, and dizziness.”5 The Baltimore American, too, described cases of lead poisoning and its dangers and communicated Health Department warnings about ingesting lead paint. “Parents should be on the lookout and remember that paints often contain large quantities of lead compounds and that the eating of considerable amount from paint materials may result in lead poisoning,” read one such alert from the 1930s.6
By 1935, the Health Department had begun to offer free laboratory diagnostic tests to doctors who suspected that their patients were suffering from lead poisoning.7 Department inspectors visited homes, took samples of loose paint, and tested them for lead. When lead was found, the agency ordered the paint removed.8 During the first three years of the program, fifty-seven cases of acute lead paint poisoning in children were confirmed. Throughout the 1930s, the department documented paint as a prime source of childhood lead poisoning and used the new medium of radio to warn local residents of the often dire, even fatal effects of lead poisoning.9
The dedication of Williams and the Baltimore Department of Health to uncovering lead-poisoned children was quite remarkable, given the enormous effort such an undertaking required. It was nearly impossible to get children tested for suspected lead poisoning for several reasons: legal restrictions limited testing to occupational, not environmental, exposures; the tests themselves were difficult to carry out; and only a limited number of laboratories were capable of performing the extraordinarily time-consuming analysis needed.10 As late as the 1950s, one technician could typically analyze only eight tests per day.11
By the early 1940s, it was abundantly clear to Baltimore’s health officials that children were the prime victims of lead poisoning: according to Fee, in 1942 “86 per cent of the recorded deaths [from lead] were those of children, with an average age of death of two and one half years.”12 Recognizing that the problem was related to lead paint in the dilapidated slum housing of the city, Williams convinced the mayor to promote a city ordinance that would enable Baltimore to take action when harm seemed imminent. The Hygiene of Housing Ordinance was signed into law in 1941, authorizing the commissioner of health to order the removal or abatement of anything in a building or structure found to be “dangerous or detrimental to life or health.”13
Baltimore’s efforts were only successful in removing lead from a small number of buildings, but those efforts demonstrated that if you looked for lead poisoning among America’s urban children, you generally found it.14 Because of Huntington Williams’s efforts, Baltimore provided the nation’s most startling evidence on childhood lead poisoning. This in turn prompted Maryland to pass a Toxic Finishes Law in 1949, which, one LIA spokesman noted, “made it unlawful to sell toys and playthings, including children’s furniture, finished with any material containing ‘lead or other substance of a poisonous nature from contact with which children may be injuriously affected’ unless such articles are so labeled as to show that the finish contains lead or other poisonous substance.”15 The LIA subsequently lobbied state officials to repeal the law and soon claimed success in 1950 when the governor signed the repeal: “The campaign to remove this 1949 enactment from the statute books of the state was brought to a successful conclusion,” the association trumpeted to its members.16 The law imposed a burden on its affiliates, the organization said,17 while its health and safety director, Manfred Bowditch, complained privately that “these young Baltimore paint eaters were a real headache.”18 Not surprisingly, the lead industry favored placing the burden for preventing lead poisoning directly on the family. “The only seemingly feasible means of coping with the childhood plumbism problem is that of parental education,” the LIA argued.19 This ran counter to some of the oldest observations about childhood lead poisoning, dating at least to the first decade of the twentieth century when A.J. Turner, one of the first researchers to document childhood lead poisoning due to paint, argued that public health could not rely on parental education; legislation was needed to stem the epidemic.20
By the 1950s Baltimore institutions were dramatically affected by the ongoing lead crisis. Indicative is the experience at one hospital in Baltimore, as summarized by Mark Farfel: “Ninety per cent of the children between the age of seven months and five years seen at the hospital’s outpatient clinics in a one year period in the early 1950s had blood lead levels greater than 30 µg/dl [micrograms per deciliter].”21 In an attempt to curb the further spread of lead paint, Baltimore’s health commissioner issued a regulation in 1951 that it would take other communities at least a decade to replicate: “No paint shall be used for interior painting of any dwelling or dwelling unit or any part thereof unless the paint is free from any lead pigment.”22
Huntington Williams meanwhile had begun looking beyond the seizures and deaths of children to speculate that “unrecognized plumbism, lead poisoning, in children may explain many obscure nervous conditions and convulsions of undetermined etiology.” His (correct) conclusion was that “lead poisoning is cumulative.”23 Even the LIA, in 1950, recognized that new problems were on the horizon: “As our hygiene activities have expanded, the magnitude of our industry’s health problems become more and more evident.”24 In 1953, the LIA reported that during the previous year it had collected “nearly 500 newspaper clippings featuring lead poisoning, often in sizable headlines,” indicative of the greater role the press was playing in bringing the severity of lead poisoning to the attention of the general public. Internally, the LIA admitted that “childhood lead poisoning continued to be a major problem and source of much adverse publicity,”25 yet it still opposed warning consumers of the danger its product posed to children.
The continual refrain from the lead industry—that childhood plumbism could only be addressed through the voluntary action of parents—quickly grew stale for anyone who routinely saw the effects of acute lead poisoning. J. Julian Chisolm, then a young physician associated with Johns Hopkins Hospital, had much firsthand experience with the children who by the mid-1950s had, unfortunately, been labeled “lead heads” by the young residents at the hospital. Chisolm took issue with the industry’s casual attitude toward what was obviously a serious medical problem affecting Baltimore’s children. In a study of children at the Harriet Lane Home, he and his coauthor, Harold Harrison, had inspected sources of lead contamination in homes and found, like Henry Thomas and Kenneth Blackfan more than forty years before, that the prime “sources of lead were windowsills and frames, interior walls, including painted paper and painted plaster, door frames, furniture and cribs.” Throughout the “dilapidated dwellings” where young children lived, Chisolm observed that “flaking leaded paint is readily accessible.” He took umbrage that the industry blamed parents for the tragedy: “While the responsibility of parents to protect their children from environmental hazards is not denied, no mother can reasonably be expected to prevent the repetitive ingestion of a few paint chips when these are readily accessible.”26
THE GROWING EPIDEMIC: FROM BALTIMORE TO THE NATION
As early as 1951, the American Journal of Public Health acknowledged both the centrality of Baltimore and Johns Hopkins in the unfolding story of lead-poisoned children and the reality of lead poisoning as a nationwide problem. It chastised the public health profession for not recognizing the extent of lead poisoning, sarcastically asking “whether babies brought up in the shadow of ‘the Hopkins’ develop peculiar alimentary tastes not common elsewhere” and arguing that “if such is not the case, perhaps other health officers have been missing something.”27 This was indeed the case in Chicago, where Robert Mellins, a young Public Health Service officer, uncovered an epidemic of childhood lead poisoning in 1953. Mellins had been assigned to Chicago in response to the continuing polio epidemic that terrified the nation in the post–World War II era. His first day in Chicago, he learned from local health personnel about what they feared was an outbreak among the city’s children of St. Louis encephalitis, a serious mosquito-borne neurological disease. Having been a medical student at Johns Hopkins in the late 1940s and early 1950s, Mellins was aware that lead poisoning was often mistaken for encephalitis, which led him to question the diagnosis and suggest the children be reevaluated. What he had come upon was, in fact, the first epidemic of lead poisoning in Chicago that would be recognized as such.28
In an internal summary of his LIA activities in 1952, Manfred Bowditch once again used the image of a “major headache” in what was emerging as a major national tragedy. Calling childhood lead poisoning “a source of much adverse publicity,” he counted 197 reports of lead poisoning in nine cities, of which 40 were fatal, but acknowledged that this was an “incomplete” estimate, especially for New York City.29 Others also began to notice the scale of the epidemic. Between 1951 and 1953, according to George M. Wheatley of the American Academy of Pediatrics, as reported in the New York Times, “there were 94 deaths and 165 cases of childhood lead poisoning . . . in New York, Chicago, Cincinnati, St. Louis, and Baltimore.”30 By the standards of the time, these were of course only the most acute cases, often life-threatening; lead poisoning that caused lesser damage was neither the focus nor in many instances even attributed to lead.
The LIA was caught in a bind. On the one hand, it had in its possession numerous reports from health departments demonstrating the widespread nature of the lead paint hazard. On the other hand, the association was fighting a rearguard action hoping to convince officials and the public that the number of lead-poisoning cases was exaggerated. To continue in this fight, Bowditch confided to an industry colleague, would be “prohibitively expensive and time-consuming.”31 Bowditch did not dispute that childhood lead poisoning could come from ingesting lead-based paint. But rather than concentrate on how to prevent lead poisoning—toward which a first step would be the elimination of lead from interior paint—Bowditch believed the LIA should focus on “securing more accurate diagnoses of lead poisoning or face the likelihood of widespread governmental prohibition of the use of lead paints on dwellings.”32 Robert Kehoe, the longtime head of the Kettering Laboratory at the University of Cincinnati, a research center established and funded by the Ethyl and General Motors Corporations in the 1920s, admitted in 1953 in a personal letter that the problem was not diagnostics but the paint itself. If the elimination of lead paint “for all inside decoration in the household and in the environment of young children . . . is not done voluntarily by a wise industry concerned to handle its own business properly, it will be accomplished ineffectually and with irrelevant difficulties and disadvantages through legislation.”33
By the mid-1950s, newspapers and public health departments in other cities had begun to report more systematically on cases of lead poisoning. The LIA responded by trying to divert attention from the lead industry’s role in distributing a known poison, sometimes in the process even mocking the children who were poisoned. In a private letter to the editor of the American Journal of Public Health, Bowditch suggested that the high rates of lead poisoning in Baltimore indicated that there was “all too much ‘gnaw-ledge’ among Baltimore babies.”34 When he was being serious he was even more dismissive of the victims: the problem was not lead in the paint, it was the housing and the parents. In 1956 Bowditch wrote to a former head of the LIA, Felix Wormser, then assistant secretary of the interior—the federal agency responsible for regulating lead and other mining and metal industries—criticizing an article on childhood lead poisoning that had appeared in Parade, the nationally distributed Sunday newspaper supplement. “Aside from the kids that are poisoned,” Bowditch complained, “it’s a serious problem from the viewpoint of adverse publicity.” The basic problem was “slums,” he argued, and to deal with that issue it was necessary “to educate the parents.” “But most of the cases are in Negro and Puerto Rican families, and how,” Bowditch wondered, “does one tackle that job?”35
Bowditch was a bit more discreet in his statements to the LIA’s general membership. At the association’s 1957 annual meeting, he argued that “the major source of trouble is the flaking of lead paint in the ancient slum dwellings of our older cities”—though in saying this he obscured the fact that lead had been the main component of interior paint as recently as the early 1950s (and still constituted 1 percent of many wall paints for the next twenty years). “The problem of lead poisoning in children will be with us for as long as there are slums,” he said. But then he absolved the LIA of responsibility, again arguing that the real problem lay with the ignorant children and parents. “Because of the high death rate, the frequency of permanent brain damage in the survivors and the intelligence level of the slum parents, it [the issue of lead-poisoned children] seems destined to remain as important and as difficult [a problem] as any with which we have to deal.”36
But how could the problem be addressed? Bowditch was not optimistic: “until we can find means to (a) get rid of our slums and (b) educate the relatively ineducable parent, the problem will continue to plague us.”37 This argument, that it was inevitable that black and Puerto Rican children would be damaged by lead for the foreseeable future, set the stage for the next half century of lead-poisoning policy. With the lead industry unwilling to accept its responsibility for this epidemic or to remove all lead from paint, and with only sporadic moves to restrict use of lead products and enforce the housing codes that did exist, doctors were forced to treat more and more children suffering from the acute symptoms of lead-induced brain damage with powerful drugs, the “chelating agents” that, when introduced into the bloodstream, could bind with lead, allowing it to be passed from the body through urination. The more sophisticated and progressive public health departments would sometimes visit children’s homes and remove the lead from the walls. At best, this helped to prevent further injury, but such remedial actions did little to forestall the housing, pediatric, and public health crises that were emerging. The industry’s proposition that lead poisoning was largely a problem of “flaking of lead paint in the ancient slum dwellings of our older cities” rendered it a disease of poverty and the socioeconomically deprived. As lead poisoning became increasingly defined as a problem of poor African American and Latino children in urban slums, in this pre–civil rights era there was no active political constituency capable of making it a pressing concern.38
From the very first, then, lead poisoning and housing were inextricably linked. For housing officials, removing lead paint was (and still is) an expensive procedure that landlords were often unwilling to undertake. And housing officials in the few cities that passed regulations to control lead often ignored these housing codes, fearing that the expense of abatement would prompt landlords to abandon their properties.39 Further, effective enforcement required a huge army of inspectors, personnel that were unavailable to local departments of health with limited budgets. Finally, identifying dilapidated interiors was itself difficult because most poor tenants were unaware of their rights to a safe home even under the existing housing codes, or they were afraid they might be evicted if they filed a complaint. Even when buildings with peeling lead paint were identified, it might take months, even years for a landlord to be hauled into court, and even then the fines were generally minimal, leading landlords to forego expensive repairs and pay the eventual fine instead.40
Children suffered enormously as a result of this inaction. Until the 1950s, when BAL (British anti-Lewisite, or dimercaprol) and CaEDTA (calcium disodium ethylenediaminetetraacetate) were introduced as chelating agents,41 two-thirds of children who suffered convulsions and swelling of the brain due to lead ingestion died. With the use of chelating agents, the death rate was cut in half, but it was still almost one in three,42 and those who did survive were often deeply damaged.
IT’S IN THE AIR: LEADED GASOLINE AND OTHER SOURCES OF LEAD DANGER
Increased attention to paint as a source of lead in the environment was complemented in the early and mid-1960s by a growing body of evidence suggesting that significant amounts of lead were also entering the human environment through other means: contamination of the soil and air from insecticides; fallout of lead-bearing compounds from smelting, mining, and fabricating processes; and automobile exhaust. Pots and pans, water pipes made of lead or joined by lead solders, and cans sealed with lead solder—once hailed by the industry as symbols of lead’s role in creating the modern environment—were now suspected as contaminants of the human food chain, as was beef, from the lead that cattle absorbed while grazing. To explore these issues, the U.S. Public Health Service sponsored a conference in 1965 on environmental lead contamination, where it soon became clear that lead from gasoline was the most pressing concern because of its magnitude and dispersal throughout the country.
As early as the 1920s, public health leaders had worried that the introduction of lead into gasoline would, as the auto industry expanded, ultimately prove to be a serious source of environmental pollution.43 A 1966 study of lead in the soot of New York City streets, for example, revealed the startling fact that its lead concentration was 2,650 parts per million (ppm).44 Of particular worry at the time was the rapid expansion of the interstate highway system through the heart of most American cities: studies had found that much more lead was deposited from exhaust pipes when cars were moving at high speeds, thereby increasing the threat to urban populations.45
Yet since the 1920s the lead industry had sponsored research by Robert Kehoe that claimed that introducing more lead, even much more lead, into the environment presented no danger to people because, he argued, lead was a natural part of the human environment and people had developed mechanisms over the millennia to excrete lead as rapidly as they inhaled or ingested it. This rationale, that lead was a “natural” constituent of the human environment, became a mainstay of the industry argument from the 1920s forward. At the 1965 conference, Kehoe laid out the industry view of lead’s dangers: the intake of lead “is balanced for all practical purposes by an equivalent output,” so there was “an equilibrium with the environment.” Did the lead that people absorbed in the course of their daily lives constitute a risk? “The answer,” said Kehoe, “is in the negative.”46
This fanciful model of lead’s ecology was dismantled piece by piece as speaker after speaker at the conference, for the most part in a businesslike and respectful manner, questioned Kehoe’s underlying assumptions. While the world of lead toxicology was still relatively small and dominated by a few recognized experts, new voices, influenced by the emerging environmental movement following publication of Rachel Carson’s Silent Spring in 1962, were beginning to be heard. Criticisms of virtually every element of Kehoe’s model were made throughout the conference, but only on the last day did they coalesce as a full-blown rejection of the industry’s paradigm. Harry Heimann of Harvard’s School of Public Health, who had had experience working in the PHS Division of Air Pollution, told the conference that he wanted to make some “comments based on my listening for the last two days, having some discussions with some people in and outside the room, and my experience as a physician who has spent most of my life in public health work.” While he did not “mean to get into any acrimonious debate” and was “not intending to impugn anybody’s work,” Heimann confronted Kehoe directly. He announced that he felt compelled to “point out that there has been no evidence that has ever come to my attention . . . that a little lead is good for you.” It was, he went on, “extremely unusual in medical research that there is only one small group and one place in a country in which research in a specific area of knowledge is exclusively done.” Kehoe’s experiments that were said to provide evidence that lead from gasoline and other airborne sources presented little danger to people would need “to be repeated in many other places, and be extended,” before the scientific community lent them legitimacy. He also questioned Kehoe’s assertion that no danger existed below a blood lead level of 80 micrograms per deciliter, a reading that often corresponded with convulsions in adults working in lead-paint and other factories.47
In addition to presenting a clear challenge to the paradigm that Kehoe and the lead industry had carefully propagated for more than thirty years, participants at the 1965 conference challenged the very basis of industrial toxicology as it then existed. In the words of one attendee, lead toxicology put “the whole field of environmental health . . . on trial.”48 Scientists had for too long accepted the industry argument that if workers who were exposed to various toxins, including lead, did not show symptoms of disease, the public had little to worry about, since consumers were exposed to much lower levels of these materials. A broad public debate was needed on what was, and was not, an acceptable risk; industry assurances of safety were not sufficient. The “public at large [needed to] be given a rational basis on which to decide . . . that lead should or shouldn’t be taken out of gasoline, that pesticides should or shouldn’t be used in various situations, that asbestos should be curbed.”49 Indeed, in the coming years, the field of lead toxicology would be transformed to address just such concerns.
In the mid-1960s, Kehoe was just one of several industry supporters repeating the mantra that the critical measure of lead’s toxicity was the worker in the plant. Studies had shown that lead workers on average were absorbing less lead than earlier in the century, and industry touted this as proof that the public was protected as well. When Senator Edmund Muskie (D-ME) held hearings on air pollution in mid-1966, the LIA campaigned to undercut any criticism of the lead industry that might emerge. In addition to preparing articles and press releases to encourage “positive stories regarding lead and its uses,” the LIA developed testimony for the hearings.50 Felix Wormser, retired but still on retainer for St. Joseph Lead Company, testified on behalf of the LIA, asserting that “vast clinical evidence” showed that “the general public is not now, nor in the immediate future, facing a lead hazard.” Leaded gas posed no harm at all, he claimed, a view he said was confirmed by an extensive literature and a large body of research.51 Kehoe went on to testify that “the evidence at the present time is better than it has been at any time and that [lead] is not a present hazard.”52 His commitment to an 80 µg/dl blood-lead-level threshold blinded him to the possibility that, whatever this standard’s adequacy for protecting adults, children, because of lead’s effect on their developing neurological systems, might be at much greater risk at lower levels.
Though Kehoe’s position aroused skepticism among some in scientific and political arenas, it still found considerable acceptance among the general public. Kehoe himself had told Muskie’s committee that his laboratory was “the only source of new information” about lead in the factory and the environment and had “a wide influence in this country and abroad in shaping the point of view and activities . . . of those who are responsible for industrial and public hygiene.”53
Storm clouds were appearing on the horizon, however. In 1967, the LIA commissioned the Opinion Research Corporation to conduct a survey of “public knowledge and attitudes on lead.” The survey revealed that 42 percent of the public identified “lead among ten substances as being harmful to health.” In fact, lead ranked second only to carbon monoxide in Americans’ perceptions of risk. The only solace the LIA could garner from the survey was that the public relations damage seemed, for the moment, to be contained: in the public mind, lead’s danger “seems to be associated primarily with paints.” Only 1 percent of those surveyed identified leaded gasoline as being “harmful to health.” Still, few people polled could identify any positive uses for lead, the LIA learned, a point that did not augur well for the future. That so many people believed that lead posed a health problem meant, in the words of Hill & Knowlton, the lead industry’s public relations firm, “they could be expected to be receptive to—or are, in effect, preconditioned for—suggestions that lead emissions into the atmosphere may constitute a health hazard.” Hill & Knowlton warned that with increasing attention to air pollution the public could soon view leaded gasoline as a threat to their health.54
As with early concerns about lead paint, the industry made it its business to promote the metal as good for society and to challenge assertions that lead in the atmosphere was dangerous. In a letter to its members in 1968, the LIA extolled the importance of its new publication, Facts about Lead in the Atmosphere, which it described as “one phase of the LIA’s efforts to refute the many claims made in the technical journals and the lay press that lead in the ambient air is reaching dangerous levels.” Such claims were “entirely without foundation,” the association asserted.55 Just as the National Lead Company, producers of the Dutch Boy brand of lead pigment and paint, had sponsored ads in the century’s opening decades, bragging that “Lead Helps to Guard Your Health,” among other supposed benefits, the LIA called lead “an essential metal that is too commonly taken for granted by the public.”56 The uses for lead were now of a decidedly more modern and technological nature, though. It was used as “the basic ingredient in the solder that binds together our electronic miracles and is the sheath that protects our intercontinental communications system. It is the barrier that confines dangerous x-rays and atomic radiation. It is sound-proofing for buildings and ships and jet planes.” And, it was, of course, the major component of batteries and an ingredient of the gasoline that ran the nation’s automobiles.57
Perhaps more than any other figure of the middle decades of the twentieth century, Clair C. Patterson, a geochemist at the California Institute of Technology who had trained at the University of Chicago and had worked on the Manhattan Project during World War II, challenged the dominant paradigm of industry-sponsored lead researchers and the control that the LIA exercised over how lead was perceived. Among the many articles Patterson wrote, one that he submitted to the Archives of Environmental Health in 1965 particularly outraged Robert Kehoe.58 Patterson’s research challenged the belief, shared by Kehoe and other industry-sponsored researchers, that lead was present in only trivial amounts and had always been present at about the same level in the environment. Although both Robert Kehoe and fellow researcher Joseph Aub were asked to review Patterson’s paper before its publication, only Kehoe was willing to critique it directly.
In the article, Patterson documented the extensive pollution caused by the growing use of lead in the wake of the Industrial Revolution.59 He had taken core samples of ice from the polar ice cap and measured them for metal content. The increase of lead over time in the core samples from Greenland paralleled the increase in lead smelting and, what was more telling, the consumption of leaded gasoline. The lead concentration of the ice had risen 400 percent in the two hundred years from the mid-eighteenth to the early twentieth centuries; but in just the ensuing twenty-five years, the period when leaded gasoline became the standard fuel for the exploding automobile industry in Europe and America, it rose another 300 percent.60 Patterson estimated that the average level of lead in the blood of Americans was about 20 µg/dl, well below what in the early 1960s was considered the “danger point,” 80 µg/dl, but still startling.61 (Today, as we have seen, the Centers for Disease Control defines 5 µg/dl as “elevated.”) In this, Patterson was directly contradicting Kehoe’s long-standing argument that humans had been adapted to roughly current levels of lead for centuries.
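The percentages compound, a point easy to miss on first reading: a 400 percent rise means five times the baseline, and a further 300 percent rise means four times that already elevated level. Below is a minimal sketch, in Python, of the arithmetic implied by the figures quoted above; the factors are taken from the text's percentages, not from Patterson's raw measurements.

```python
# Arithmetic implied by the ice-core percentages quoted above; the factors
# come from the text's figures, not from Patterson's raw data.

preindustrial = 1.0  # normalized lead concentration of mid-18th-century ice

# A 400 percent increase means five times the baseline.
early_twentieth_century = preindustrial * (1 + 4.00)

# A further 300 percent increase over the ensuing ~25 years means four times
# that already elevated level.
mid_1960s = early_twentieth_century * (1 + 3.00)

print(f"early twentieth century: {early_twentieth_century:.0f}x preindustrial")  # 5x
print(f"mid-1960s: {mid_1960s:.0f}x preindustrial")  # 20x
```

Taken together, the two figures imply that ice laid down in the mid-1960s carried roughly twenty times the lead of preindustrial ice, which is why the trend line tracked the rise of leaded gasoline so alarmingly.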
Far from it being normal for Americans to have such elevated levels, Patterson claimed that most Americans bore an unnatural, and potentially unhealthy, amount of lead in their bodies.62 Unlike earlier lead researchers, he was coming at the issue of lead poisoning from outside the small world of lead toxicologists who had largely depended on industry to support their research. It was as important, from industry’s point of view, to tarnish the credibility of this “outsider” as it was to rebut the specifics of his argument. With an attack on Patterson’s work, the industry began a campaign—which continues to this day—to undercut the findings of researchers who have dared suggest that low-level lead pollution has subtle impacts on the general population’s health and specifically on children’s mental development.
Kehoe worried that so many draft copies of Patterson’s paper had already circulated that without a formal channel for rebuttal, Patterson’s position might gain greater and greater credibility through word of mouth alone. In the end, Kehoe supported the Archives of Environmental Health’s decision to publish the piece, a move that historian Christian Warren ascribes to Kehoe’s recognition that its publication was inevitable, and to his hope to thus obligate the journal to make room for a subsequent detailed critique.63
In addition to questioning Patterson’s credentials, methodology, and interpretation of the data, his critics were most concerned about his conclusion that “the average resident of the United States is being subjected to severe chronic lead insult.”64 Through his argument, Patterson was undermining the industry view that relatively low levels of exposure were harmless and that the only Americans at risk were workers exposed to high levels. He was questioning the industry view that one was either acutely lead poisoned or one was essentially unaffected by the substance.65 Right after the publication of Patterson’s 1965 article, Donald G. Fowler, the LIA’s director of health and safety at the time, took issue with Patterson’s “assertion that lead pollution in the air has reached ‘alarming’ proportions.” Fowler dismissed the findings as “based on his [Patterson’s] own geological studies . . . and his own interpretive extensions upon these studies into non-geological fields.” Patterson’s work, he claimed, ignored “the recognized body of clinical and biological evidence” and was “unsupported by any medical evidence.” Fowler went on to declare that “lead is not a significant factor in air pollution” and “the public can rest assured that lead constitutes no public health problem.”66
FIGURE 2. Lead (Pb) deposited from leaded gasoline in U.S. cities, 1950-1982. Gasoline was a main source of lead that damaged many, particularly urban, children until the 1980s. The amount of lead in gasoline was gradually reduced from 4 grams per gallon to 0.1 gram per gallon in 1986. It was finally completely phased out of gasoline for automobiles in 1996. Source: Howard W. Mielke, Mark A. S. Laidlaw, and Chris R. Gonzales, “Estimation of Leaded (Pb) Gasoline’s Continuing Material and Health Impacts on 90 US Urbanized Areas,” Environment International 37 (January 2011): 248-57, available at www.sciencedirect.com/science/article/pii/S016041201000156X.
Patterson’s critique of lead’s ubiquity and its potential danger to the public came at a critical time for the industry. Historically, lead pigment had been the most economically significant market for lead producers, but it had begun declining in importance as latex and titanium pigments increasingly captured market share. As automobile sales mushroomed with the economic boom following World War II, the auto industry and the producers of batteries and leaded gasoline supplanted users of pigments as the major buyers of lead. Between 1940 and 1960, despite less frequent use of lead in interior paints, lead consumption increased from about 600 thousand short tons to approximately 1,000 thousand short tons per year. During this period, lead for use in gasoline increased eightfold, from about 25 thousand short tons to just under 200 thousand; and lead used in batteries doubled, from 200 thousand short tons to just under 400 thousand. Despite the increasing evidence of lead’s destructive environmental effects, during the 1960s and the 1970s lead production increased substantially. In 1964, the United States consumed 1,202 thousand short tons of lead. By 1974, this had grown by about 25 percent, to 1,550 thousand short tons.67
Patterson was not alone in taking on Kehoe’s paradigm. In 1966 Harriet Hardy, one of the nation’s preeminent occupational health physicians, condemned the lead industry’s threshold idea of harm based on adult lead workers; she argued it was inadequate as a means of protecting high-risk populations outside of the workplace. As coauthor, with Alice Hamilton, of the main textbook in occupational medicine, she argued that certain vulnerable populations—particularly children and pregnant women—might suffer the effects of lead at much lower levels of exposure than male workers did. Hardy also delineated the inadequacies in earlier definitions of lead poisoning, arguing that lead poisoning produced a host of subtle, difficult-to-define, but no less damaging symptoms (such as fevers, lethargy, and joint pain) that could easily escape the notice of physicians. “It is necessary to emphasize,” she wrote, “that no harmful effect of lead is unique [to that poison] except perhaps the motor palsy of the most-used muscle group, as in the wrist drop.”68
Hardy believed that the developing child was most at risk. Randolph Byers and Elizabeth Lord’s research in Boston on long-term effects of acute lead poisoning, along with clinical observations by doctors such as L. Emmett Holt, John Ruddock, Charles McKhann, and Edward Vogt, supported Hardy’s opinion that lead was more toxic to the young than to the adult population.69 In contrast to Kehoe, who used adult males in his studies and in his model of classic lead poisoning, Hardy recognized that a much wider net had to be cast to understand the full range of lead’s effects: “Prevention of diagnosable Pb poisoning in healthy male workers is important but not enough in our society.” Lead was a known toxin, and there was “no available evidence that lead is useful to the body,” particularly for women and children.70
In the coming years, the policy model that Patterson first proposed—that lead, a known toxin, should not be widely introduced into the human environment, and that Hardy expanded to specifically include women and children—would be embraced by those who pushed for the removal of lead from gasoline, and, hence, from the atmosphere.71 This was an early statement of what, generalized, would become known as the “precautionary principle”—the basic idea behind public health, that, when considering the use of new or suspect chemicals, it is prudent to prove them safe rather than waiting to see if they are harmful to people or the environment. Hardy quoted Bradford Hill, the eminent English epidemiologist who, with Sir Richard Doll, demonstrated the relationship between cigarette smoking and lung cancer: “All scientific work is incomplete. . . . All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time.”72
SOCIAL ACTION AND LEAD POISONING
Until the mid-1960s, lead poisoning—whether from lead paint, lead in gasoline, or lead in the factory—had remained, beyond the victims themselves, largely an issue for clinicians and researchers in a few major medical centers around the country, a small group of public health professionals, and an industry intent on protecting its market. But this changed as the civil rights movement galvanized the African American community and forced middle-class white Americans to acknowledge the extent of endemic poverty and racism. Michael Harrington’s 1962 book, The Other America, and the civil rights–era sit-ins, Freedom Rides, voter-registration campaigns, and school-desegregation drives all made poverty and racial discrimination headlines in daily newspapers across the country.
As the War on Poverty took shape in the years following John F. Kennedy’s assassination, the links between poverty, housing, and racism in the nation’s cities became increasingly apparent to many Americans. Lead poisoning—particularly from peeling paint in slum housing—became a signature disease of poverty. In New York City, the number of children identified as lead poisoned (now defined as having 60 or more micrograms of lead per deciliter of blood) shot up from 20 in 1952 to 509 by 1965; in Philadelphia, from 2 in 1952 to 163 in 1965; in Chicago, from 33 in 1953 to 304 by 1966. This was not because more children were affected but because more public health authorities and doctors were now conscious of lead poisoning’s existence and its array of symptoms.73
FIGURE 3A. Children at risk, 1960s. During the War on Poverty, peeling and chipping paint became a symbol of urban blight and social inequality. Community activists and housing reformers were critical in pressing for improved conditions. Source: Chicago Tribune, February 3, 1966, reprinted with permission.
FIGURE 3B. Children at risk, 1960s (continued). Source: New York City Housing Authority, Wagner Archives, reprinted with permission.
Community groups such as the Young Lords in New York (a largely Latino organization), the Citizen’s Committee to End Lead Poisoning in Chicago, the Black Panthers in Boston and Oakland, and the Harlem Park Neighborhood Council in West Baltimore, as well as others around the country, seized on this devastating disease, seeing it as a representation of the ills of a culture rather than as a product of nature. These groups began agitating for more testing of children, better enforcement of existing housing laws, poisoning surveillance and prevention programs by departments of health, and new laws to hold landlords accountable for lead hazards.74 Sometimes, the lead tragedy actually led to civil disobedience, as it did in New York City in 1970 when the Young Lords seized unused mobile testing vans and began door-to-door screening for lead poisoning while others staged sit-ins at the Department of Health.75 In 1969, Jack Newfield, an influential writer for the Village Voice, picked up the story and wrote a series of articles about lead-poisoned children who had suffered irreversible brain damage, thereby putting enormous pressure on the city to strengthen housing codes against flaking and peeling paint.76
Community activism played an important role in bringing attention to lead poisoning and reducing its impact on poor communities, as Mark Farfel wrote in 1985, a few years before he would codirect the Kennedy Krieger Institute study: the “Great Society programs, including Medicaid, urban renewal projects, and food stamp and food supplement programs,” led to the identification and amelioration of lead poisoning in subtle ways that were “difficult to quantify.” Building on the public health model of an earlier time, in which social reform was viewed as essential to effective public health efforts, Farfel noted that “improved nutrition, access to medical care and new housing” were critically important in reducing risk to children. “Even the civil rights movement may have reduced risk for toxicity among blacks by opening some doors to better housing.”77
Indeed, public health activists embraced numerous social causes from the mid-1960s through the mid-1970s, building on the older tradition of allying with community organizations and consumer groups to effect changes in the delivery of services and health care. In New York City, the Health Policy Advisory Committee (Health PAC) gave young professionals both in and out of government a means of linking movements to combat poverty, poor housing, lead poisoning, racism, and other social ills to health and their professional identities. For at least a decade, Health PAC, other health professional organizations, and groups in the American Public Health Association—such as the Medical Committee for Human Rights, Physicians Forum, and Physicians for Social Responsibility—helped to build community health centers in poor neighborhoods in northern cities and southern rural communities, achieved a partial atomic test ban to reduce strontium 90 and other radioactive exposures, developed programs to improve housing conditions in poor communities around the country, and pressured governments to organize services for the poor on Indian reservations and in urban neighborhoods.78
Throughout the country, large city health departments were pressed by community groups and concerned professionals to expand surveillance efforts and screening programs, which brought greater awareness of the extent of lead exposures and concern that Clair Patterson and Harriet Hardy were accurate in arguing that lead poisoning was a much more serious problem than previously assumed. According to one government expert, by the late 1960s several large cities, including Chicago and New York, “reported that 25 to 45 percent of one- to six-year-old children from high-risk areas had blood lead levels exceeding 40 µg per 100 ml.”79
As doctors became more alert to the possibility of lead poisoning, the numbers of those acknowledged to be affected naturally increased. But the fatality rate didn’t. In Chicago, in 1966, for example, a study of more than 60,000 children showed “a marked rise in cases reported [compared to the 1950s] and a sharp decrease in fatality rate.”80 In New York, as in Chicago, the fatality rate among those diagnosed with lead poisoning declined from 27 percent in the 1950s to 1.4 percent in 1964.81 Such drops in the fatality percentage were in part a function of increased surveillance and a lower threshold used to trigger a diagnosis of lead poisoning, which increased the pool from which the percentage was calculated but not the number of fatalities. But beyond that, it was widespread use of chelating agents that was responsible for this remarkable decline. An early champion of chelation therapy was J. Julian Chisolm, who would years later be the co-principal investigator of the KKI study so excoriated by the Maryland Court of Appeals.
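This denominator effect is easy to state and easy to misread, so a small illustration may help. The Python sketch below uses invented counts, chosen only so that the two rates match the 27 percent and 1.4 percent quoted above; the actual registry figures were different, and, as the text notes, chelation therapy rather than bookkeeping accounted for much of the real decline in deaths.

```python
# Illustrative only: invented counts showing how a broader case definition
# can shrink a case-fatality percentage even when deaths stay constant.

deaths = 27

# Narrow, 1950s-style surveillance: only severe, often life-threatening
# cases enter the registry.
severe_cases = 100
print(f"narrow definition: {deaths / severe_cases:.1%} case fatality")  # 27.0%

# Broader, 1960s-style surveillance: screening adds many milder,
# nonfatal cases to the same registry, with deaths unchanged.
all_cases = severe_cases + 1800
print(f"broad definition: {deaths / all_cases:.1%} case fatality")  # 1.4%
```

The same number of deaths yields a dramatically smaller percentage once the case definition widens, which is why the falling fatality rate alone could not show how many children chelation was actually saving.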
As a young physician in the 1950s, trained at Johns Hopkins and at Princeton before that, Chisolm was in his generation almost unique in his ongoing professional focus on lead poisoning, particularly among African American children. In the early 1950s he received a fellowship to study the breadth of lead poisoning among Baltimore’s children. He visited homes, collecting stool samples of young children to analyze their lead content, and found that the City had been grossly underestimating the extent of the problem. He recalled that his “first findings . . . were that children who ingested paint were getting more lead than even heavily exposed industrial workers.”82
FIGURE 4. J. Julian Chisolm examining a child, ca. 1972. Chisolm was one of the early pioneers who called attention to the lead-poisoning epidemic, and throughout his life he treated thousands of lead-poisoned children in Baltimore. Source: Baltimore Sun, March 14, 1972, reprinted with permission.
Chisolm’s own background, perhaps, stimulated his commitment to and concern for African American children. Ironically, he came from a long line of southerners whose roots were in the South Carolina planter class. His great-great-great uncle, also named J. Julian Chisolm, was the leading surgeon for the Confederacy during the Civil War and author of the primary text for Confederate army surgeons.83 That Chisolm moved to Baltimore after the Civil War, established the Presbyterian Eye, Ear and Throat Charity Hospital, and became professor of ophthalmology and dean at the University of Maryland School of Medicine in Baltimore.84 J. Julian Chisolm Sr. (our J. Julian Chisolm’s father), himself the son of a Presbyterian minister who presided over a Natchez, Mississippi, segregated congregation of African American and white parishioners, received his medical degree from Johns Hopkins early in the twentieth century and taught at the medical school there for many years.85
J. Julian Chisolm Jr., who died in 2001, was a tall, large man with “a round face and sort of wispy hair that wasn’t very well combed,” according to Ellen Silbergeld, his student and protégé. He was mild mannered “in a kind of old Maryland gentleman way,” she remembers, and he “always wore a bow tie as the pediatricians in his day did,” so that young children couldn’t grab his tie. He could also be “very acerbic” to those who denied the importance of issues he cared deeply about. That the poisoning of African American children was one of these, Silbergeld said, “probably inhibited his promotion at Hopkins . . . to full professor until he was almost dead.” His commitment to the children he treated from the neighborhood around Hopkins was unquestioned by his students and colleagues. He once told Silbergeld that he saw racism inherent in the society’s lack of response to lead poisoning. “If this was a disease of white children,” he told her, “we would have done something about this a long time ago.” From these experiences came a life-long passion to address the effects of lead paint as the primary source of danger to children.86
Chisolm was working against the ingrained segregationist culture of Baltimore and Johns Hopkins at this time. Like other medical schools, Hopkins was an institution dominated by relatively wealthy, white, overwhelmingly male doctors and trustees. It was a “white enclave on a hill surrounded by” a largely poor African American community, recalls Connie Nathanson, now a professor of sociomedical sciences at Columbia, who worked in pediatrics at the Hopkins medical school from the late 1950s until 2002.87 In the postwar period, plans were developed for 178 garden apartments for African American families who were being displaced by the urban renewal project close to the Hopkins medical school campus. Those same plans included housing for residents and staff at the university. As social scientist Stephanie Farquhar documents, the “planned 178 garden apartments for blacks were never built . . . though the garden apartments for Hopkins married staff and the residence hall for Hopkins’ unmarried staff and students were.”88 Nathanson recalls that the “housing for residents and interns [was] surrounded by wire fencing that segregated it off” from the surrounding community.89
By 1969 the combined use of the two chelating agents, BAL and EDTA, according to Chisolm, “apparently reduce[d] the mortality from acute lead encephalopathy to 5%.” It was a Pyrrhic victory, though, for “the incidence of severe, permanent brain damage among survivors of encephalopathy continues to be 25% or more,” he said. And “if survivors of an initial attack of acute lead encephalopathy are re-exposed to abnormal lead exposure the incidence of severe permanent brain damage is increased to virtually 100%.”90 Chisolm clearly understood the critical connection between treatment of children and the need to rehabilitate the housing where they lived. “The cornerstone of our current therapeutic program,” he argued, “is prompt termination of environmental exposure to lead: no child with an increased body burden of lead is ever returned to a leaded home.” Although it is not clear how often this was accomplished or who paid for this service, Chisolm wrote that after chelation therapy, children at Johns Hopkins clinics were either placed in suitable new housing or their homes were abated of lead. Chisolm understood that waiting for children to be poisoned before taking remedial action was both destructive for the child and expensive for society. “How much more intelligent it would be,” he commented, “to spend our effort and substance on the systematic elimination of environmental lead exposure associated with old dwellings. Were this to be done, childhood lead poisoning could be largely eradicated in the United States.”91 Although the solution was obvious, the means of attaining it were not. In the late 1960s there were, for example, thirty million housing units nationwide still in use that had been built before 1950; of these, at least 90 percent were polluted by lead.92