3.2.1.2. Automation of public services, aid and welfare programmes and the perpetuation of inequality

One of the risks generated by the growing implementation of algorithmic systems in public service and public aid provision is the possibility that short-term economic efficiency may be prioritised over other public interest goals.

For example, since 2018 the Austrian Employment Service has been using an algorithm on a trial basis to classify unemployed individuals according to their probability of finding a new job. The definitive introduction of this automated system, called PAMAS, was approved for 2020, and its objective is to allocate public resources in the most efficient way possible.145

The algorithm analyses different characteristics of unemployed individuals and assigns each person a score. Once the scoring process has taken place, the algorithm classifies unemployed people into three categories according to their probability of finding a new job: high, medium or low.146
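To make this two-step logic concrete, the following minimal Python sketch scores an individual and maps the score to one of the three bands. The feature names, weights and cut-off values are illustrative assumptions only; they are not the parameters of the published model.

```python
# Minimal sketch of the score-then-classify logic described above.
# All feature names, weights and thresholds are hypothetical.

def score(person: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of an individual's recorded characteristics."""
    return sum(weights.get(feature, 0.0) * value for feature, value in person.items())

def band(s: float, low_cut: float = 0.25, high_cut: float = 0.66) -> str:
    """Map a score to one of the three probability categories."""
    if s < low_cut:
        return "low"
    if s < high_cut:
        return "medium"
    return "high"

# Example with made-up inputs:
weights = {"base": 0.5, "years_experience": 0.02}
person = {"base": 1.0, "years_experience": 4.0}
print(band(score(person, weights)))  # -> "medium" (score 0.58)
```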

The first controversy surrounding this automated system arose when the firm hired to create the model published the source code and underlying logic. That document showed that points were deducted when certain characteristics, such as being a woman or a national of a non-EU member state, were present in an individual. It thus became obvious that, in general, individuals’ scores would be lowered the more groups at risk of discrimination they belonged to. Hence, vulnerable individuals at greater risk of social exclusion would be classified into the “low probability of finding a job” category.
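The pattern criticised can be sketched in a few lines: negative weights attached to membership in groups at risk of discrimination, whose penalties accumulate when an individual belongs to several such groups at once. The coefficients below are invented for illustration; they are not the model’s actual values.

```python
# Hypothetical coefficients illustrating compounding score penalties.
# The sign pattern follows the controversy described above; magnitudes are invented.
WEIGHTS = {
    "intercept": 0.70,
    "is_woman": -0.10,
    "non_eu_national": -0.12,
}

def prospect_score(person: dict[str, int]) -> float:
    """Intercept plus one penalty for each group the person belongs to."""
    return WEIGHTS["intercept"] + sum(
        coef * person.get(attr, 0)
        for attr, coef in WEIGHTS.items()
        if attr != "intercept"
    )

# Belonging to two at-risk groups lowers the score twice over:
print(round(prospect_score({"is_woman": 1}), 2))                        # 0.6
print(round(prospect_score({"is_woman": 1, "non_eu_national": 1}), 2))  # 0.48
```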

While this system is objective in the sense that it simply reflects the discriminatory practices that exist in the labour market, it has been heavily criticised by certain Austrian social organisations and sectors. These criticisms are not without reason: merely by classifying members of vulnerable groups in the lower category, the system contributes to the stigmatisation of these groups and their members.147 The Austrian Employment Service defended the need for the system to operate in this manner in order to ensure that public resources are distributed in the best way possible, providing those individuals who might find most difficulty in accessing the job market with more appropriate help.148

However, while the Austrian Employment Service justified this form of classification as a means of offering better help to individuals who had more difficulty in finding new employment, it chose to prioritise efficiency in the allocation of public resources over any other objective. After concluding that the most efficient allocation would be to direct more resources towards individuals with a medium probability of finding a new job, the Austrian Employment Service decided to considerably reduce the resources and other aid provided to unemployed individuals whose chances of re-entering the job market are lowest.149 This decision perpetuates the situations of social exclusion suffered by certain groups and individuals and helps to reinforce social structures and institutions built on narratives of subordination of historically oppressed groups.
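The allocation rule described in the preceding paragraph can be reduced to a simple sketch in which the budget is concentrated on the middle band. The shares and figures below are assumptions made for illustration, not data from the Austrian programme.

```python
# Hypothetical efficiency-driven allocation: most support goes to the
# "medium" band, while the "low" band, which contains the most vulnerable
# individuals, receives less. All shares and amounts are invented.
BUDGET_SHARES = {"high": 0.15, "medium": 0.65, "low": 0.20}

def allocate(total_budget: float, counts: dict[str, int]) -> dict[str, float]:
    """Per-person support in each band under the medium-first rule."""
    return {
        b: (total_budget * share) / counts[b] if counts[b] else 0.0
        for b, share in BUDGET_SHARES.items()
    }

print(allocate(1_000_000, {"high": 300, "medium": 500, "low": 200}))
# {'high': 500.0, 'medium': 1300.0, 'low': 1000.0}  (made-up figures)
```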

Moreover, automating the processes involved in these types of public services can also perpetuate inequality. The population groups that resort to public aid and services aimed at mitigating inequality are in situations of greater economic vulnerability; consequently, an erroneous outcome in welfare allocation can cause very significant harm to those who depend on these programmes to get by. Additionally, these individuals generally have fewer resources with which to defend their rights against errors made in the process of granting and providing public aid and services. Hence, even if these systems do not discriminate against the members of disadvantaged groups, they can still reinforce historical structures of discrimination, because the individuals negatively affected by them may not always be able to challenge the results yielded by the automated programmes used in public service and aid provision aimed at reducing inequality.
