We Humans and the Intelligent Machines - Jörg Dräger - Page 10

In the service of justice

“Smaller, safer, fairer.”9 Under this motto, Mayor de Blasio presented his plan to close New York’s largest prison in June 2017.10 In the 1990s, most of the city’s then 20,000 prisoners were incarcerated on Rikers Island, once known as the new Alcatraz. Today, fewer than 10,000 New Yorkers are imprisoned, and Rikers Island, which costs $800 million a year to run, stands partly empty. Moreover, the prison has recently been shaken by a scandal over the mistreatment of a juvenile detainee. De Blasio therefore has several reasons for wanting to close the facility. He would also like to further reduce the number of prisoners: to 7,000 within five years and to 5,000 in the long term.

His biggest lever: algorithms. They are supposed to help New York’s judges assess risks more accurately, for example, whether pre-trial detention is necessary or whether an early release is appropriate. The probabilities to be assessed here are, in the first case, the danger that the accused will flee before the trial and, in the second case, the risk of recidivism. These probabilities depend on so many factors that a judge can hardly be expected to weigh all of them adequately in the time allotted for each case.

COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is the software that calculates the risk of flight and recidivism. The company that developed the program refuses to publish the underlying algorithm, but research by ProPublica, a non-profit organization for investigative journalism, has shown that such systems collect and analyze a large amount of data, such as age, gender, residential address, and the type and severity of previous convictions. They even gather information on the family environment and on whether the defendant has telephone service at home. In all, COMPAS collects answers to 137 such questions.
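Risk-assessment tools of this kind typically reduce a defendant's answers to a single probability. Since the COMPAS model itself is unpublished, the following is only a minimal sketch of one common approach, a logistic model; every feature name, weight, and number here is invented for illustration:

```python
import math

# Hypothetical, simplified illustration -- NOT the proprietary COMPAS model.
# A common design: each answer becomes a numeric feature, the features are
# combined as a weighted sum, and a logistic function maps that sum to a
# probability between 0 and 1.

def recidivism_risk(features, weights, bias):
    """Return an estimated probability of recidivism between 0 and 1."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) function

# Invented example inputs: normalized age, number of prior convictions,
# severity of the last offense on a 0-1 scale. Weights are made up.
features = [0.3, 2.0, 0.5]
weights = [-0.8, 0.6, 1.1]
risk = recidivism_risk(features, weights, bias=-1.5)
print(f"Estimated risk: {risk:.2f}")
```

In a real system the weights would be fitted to historical case data, which is precisely where the discrimination concerns discussed later arise: the model can only reproduce the patterns, and biases, present in that data.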

The potential for providing algorithmic support to judges is huge. In a study in New York City, researchers calculated that if prisoners with a low probability of recidivism were released, the total number of detainees could be reduced by 42 percent without increasing the crime rate.11 In Virginia, several courts tested the use of such algorithms and ordered detention in only half as many cases as judges ruling without the software. Despite this, the rate of people who failed to appear for trial or committed a crime in the interim did not increase.

Algorithmically supported decisions improve forecasts even if they do not offer 100-percent accuracy. In addition, they could also reduce variations in the sentences handed down. In New York City, for example, the toughest judge requires bail more than twice as often as the most lenient of his colleagues. The fluctuations may be due to the attitude of the judges but also to their workload, since they only have a few minutes to decide what bail to set.

What promises advantages for society can, however, result in tangible disadvantages for the individual. Hardly anyone knows this better than Eric Loomis, a resident of the state of Wisconsin. In 2013, he was sentenced to six years in prison for a crime that usually draws a suspended sentence. The COMPAS algorithm had predicted a high probability of recidivism, contributing to the judge’s decision in favor of a long prison sentence. The discrimination that can result from the use of algorithms will be discussed in more detail in Chapter 4.
