
Ethical and Safety Challenges of Scientific Growth

Authors and thinkers, including Ray Kurzweil and Bruno Giussani, suggest that science and technology are growing exponentially, while the structures of our society (government, education, economy, etc.) are designed for predictable linear increases and become dysfunctional in an era of exponential growth.[11] This is why our nation-state system can’t deal with the challenges of modern science. Those challenges (climate change, CRISPR, gene drives, and artificial intelligence) are much larger than the ones brought about by the Industrial Revolution (steam engines and electricity). Even if we find ways of globally regulating science, there will always be a country marketing itself as a place to do research that is banned elsewhere. And it takes just one country pursuing a high-risk, high-profit path for all the other countries to follow. In fact, today’s nation-state/growth economy requires that countries follow such paths to avoid falling behind.

Many governments, including that of the United States, control research by intentionally not funding areas that are dangerous, unethical, or difficult to regulate. This technique doesn’t work when foundations or start-up companies fund the work. It also fails when the techniques and materials are inexpensive, as is the case with CRISPR (chapter 9), and government funding isn’t needed (The Amateur Scientists; chapter 3). In the absence of clearer guidelines or regulations, scientists have to rely on themselves and on their own scientific norms. This doesn’t work well in modern science because of the intensely competitive nature of academia, in which “the drivers are about getting grants and publications, and not necessarily about being responsible citizens,” notes Filippa Lentzos of King’s College London, who specializes in biological threats.[12] High-profile results matter. In addition, to keep competitors from learning what they are doing and to avoid being scooped, scientists keep their experiments under wraps until they are ready to publish. By then the cat is out of the bag, and it is too late to weigh the ethical impact of the work or to try to stop the research.

Dual-use research, which can be used for either good or ill, presents its own challenges to the safe and ethical regulation of global scientific research. Occasionally scientists work their way up to an invisible dual-use line and cross it. In response, surprised, shocked, and scandalized scientists hold urgent meetings to discuss the moral and safety implications. Scientists often proudly point to the 1975 Asilomar conference on recombinant DNA as a model response to science that has reached new and challenging ethical and safety boundaries. They perceive the conference as a successful exercise in self-regulation in the public’s interest. Senator Ted Kennedy and other politicians of the time saw it differently; they considered the scientists a group of unelected experts making public policy without public input. Scientists need broader input from the general public and from ethicists, but they are hamstrung by the goals and modus operandi of the expert collaborators they need. Philosophers and ethicists take a contemplative, long-term perspective; engineers are eager to take results from the laboratory to the market; and investors are always in a hurry, looking for short-term financial gains. Consequently, we have been very good at commercializing scientific discoveries but less proficient at predicting their consequences and proposing appropriate guidelines (e.g., DDT, fracking, nuclear chemistry). The increasing speed at which scientific breakthroughs are made will only make them harder to predict and regulate in the future.

Scientists, despite their desire to have input into policy related to their community’s discoveries, are not trained to anticipate the consequences of their research, and their solutions are often ineffective, as evidenced by the frequency of such “transgressions” and mini “Asilomars.” For example, in 2002 scientists from the State University of New York at Stony Brook synthesized a polio virus from scratch; in 2005 researchers from the Centers for Disease Control and Prevention (CDC) reconstructed a particularly virulent form of the 1918 flu virus; in 2012 two teams mutated the bird flu virus to make it more transmissible in mammals; in 2017 a group at the University of Alberta resurrected a horsepox virus, a close cousin of the smallpox virus; and in 2018 CRISPR was used for the first time to create genetically modified human babies. Each of these experiments crossed a line, with potentially unforeseen consequences, and each led to an emergency conference. Each case brings us closer to the point at which one small accident or one well-placed malicious scientist could affect a large portion of the human population or even accidentally wipe out an entire species. In an interview with the Atlantic’s Ed Yong, Kevin Esvelt, a CRISPR/gene drive expert at MIT, succinctly summarizes the problem: “Science is built to ascend the tree of knowledge and taste its fruit, and the mentality of most scientists is that knowledge is always good. I just don’t believe that that’s true. There are some things that we are better off not knowing.”[13] On the other hand, we must remember that some research, such as in vitro fertilization, was once seen as a transgression of scientific norms but is now scientifically and socially acceptable.
