We Humans and the Intelligent Machines, Jörg Dräger
Setting the course
New York City is not the only American city where algorithms are omnipresent. Chicago and Los Angeles also provide their judges with software support and use predictive policing. Algorithmic systems are used outside the US as well, for example in Australia, where they decide on social benefits and even automatically send reminders and warnings when potential fraud is detected (see Chapter 9). Germany is not there yet, but initial applications do exist: In Berlin, places at primary schools are allocated using software (see Chapter 10), and algorithms check tax returns for plausibility. Six of the country’s states use different forms of predictive policing (see Chapter 11). Especially in large cities, public administration has become so complex that municipal services, from police patrols to waste collection, can hardly be managed without technological support, including the use of algorithms. They are part of the daily life of every citizen. Yet most citizens do not know these algorithms exist, let alone understand how they function. People do not need to understand, you might say. They should simply be happy if the garbage is picked up on time and no unnecessary costs arise for them as taxpayers.
Yet with decisions about imprisonment, access to the best educational path or governmental support, algorithms intervene deeply in the fundamental rights of individuals. This makes the software and its design highly political. Such seemingly intelligent systems should be debated not only behind closed doors or among academics but also in a broad social and political discourse, especially since even well-designed algorithms can discriminate. In the fight against crime, they can be self-reinforcing: The police find the most crime in the areas they investigate the most. Minor drug offenses, for example, common in most parts of a city, are identified disproportionately often in certain neighborhoods, leading to even more police checks there. Or take the courts: When an algorithm sends people to prison for a longer period of time, they are more likely to remain unemployed after their release. They will also have less contact with family and friends and will therefore be more likely to become repeat offenders, which in turn confirms the algorithm’s predictions. Critics argue that all this reinforces the discrimination against and stigmatization of certain social groups.
As New York City shows, algorithms can solve tasks that are too complex for humans. They can be useful helpers for us and our societies. But whether or not they are successful depends on the goals we set for them. They are neither inherently good nor bad. Ideally, they result in more safety, justice and efficiency. At the same time, however, they can reinforce existing social inequalities or even create new forms of discrimination. It is up to us to set the course so that things develop in the right direction.
James Vacca now teaches at Queens College, City University of New York. His years on the City Council are over, since its members can serve a maximum of two consecutive terms. He proudly looks back on December 11, 2017, and his greatest legacy, the algorithmic accountability law: “We were the first to politically concern ourselves with algorithms. Algorithms are helpful, it would be wrong to ban them. But we have to regulate how to deal with them. It is the political task of our time.”14