Risks and Mistakes

[…] One of the objections [to genetic engineering] is that serious risks may be involved.

Some of the risks are already part of the public debate because of current work on recombinant DNA. The danger is of producing harmful organisms that would escape from our control. The work obviously should take place, if at all, only with adequate safeguards against such a disaster. The problem is deciding what we should count as adequate safeguards. I have nothing to contribute to this problem here. If it can be dealt with satisfactorily, we will perhaps move on to genetic engineering of people. And this introduces another dimension of risk. We may produce unintended results, either because our techniques turn out to be less finely tuned than we thought, or because different characteristics are found to be genetically linked in unexpected ways.

If we produce a group of people who turn out worse than expected, we will have to live with them. Perhaps we would aim for producing people who were especially imaginative and creative, and only too late find we had produced people who were also very violent and aggressive. This kind of mistake might not only be disastrous, but also very hard to ‘correct’ in subsequent generations. For when we suggested sterilization to the people we had produced, or else corrective genetic engineering for their offspring, we might find them hard to persuade. They might like the way they were, and reject, in characteristically violent fashion, our explanation that they were a mistake.

The possibility of an irreversible disaster is a strong deterrent. It is enough to make some people think we should rule out genetic engineering altogether, and to make others think that, while negative engineering is perhaps acceptable, we should rule out positive engineering. The thought behind this second position is that the benefits from negative engineering are clearer, and that, because its aims are more modest, disastrous mistakes are less likely.

The risk of disasters provides at least a reason for saying that, if we do adopt a policy of human genetic engineering, we ought to do so with extreme caution. We should alter genes only where we have strong reasons for thinking the risk of disaster is very small, and where the benefit is great enough to justify the risk. (The problems of deciding when this is so are familiar from the nuclear power debate.) This ‘principle of caution’ is less strong than one ruling out all positive engineering, and allows room for the possibility that the dangers may turn out to be very remote, or that greater risks of a different kind are involved in not using positive engineering. These possibilities correspond to one view of the facts in the nuclear power debate. Unless with genetic engineering we think we can already rule out such possibilities, the argument from risk provides more justification for the principle of caution than for the stronger ban on all positive engineering. […]
