Data protection for the prevention of algorithmic discrimination - Alba Soriano Arnanz

3.1.1.1. The banking sector and the expansion of credit scores


Algorithms have been used by credit card companies92 and the banking sector in the US for quite some time. The main purpose of these algorithms is to establish individuals’ credit capacity and make decisions on their eligibility for loans. These decisions have been based on creditworthiness scores for around the past six decades.93 Before credit scores entered the picture, bank employees and, later on, experts assessed an applicant’s trustworthiness and decided whether the requested loan should be granted or denied.94 Eventually, specialised companies began creating models to estimate the probability that a loan applicant would default on repayment, thereby producing credit scores.95

With the development of new big data technologies, the sources of information used to build individuals’ credit scores have increased exponentially. Moreover, new versions of these scores are now adapted to the specific needs of a wide range of fields,96 from human resources to car insurance.97

It is important to highlight that the United States is not the only country in which these reputational systems for determining creditworthiness are used. Credit scoring is, for instance, also used in the United Kingdom98 and Canada99 as well as in several Asian countries such as Malaysia, Singapore and Hong Kong100 and is rapidly expanding throughout other parts of the world.

In European countries other than the UK, loans have traditionally been granted on the basis of a reduced number of elements, such as salary, job security or family financial situation, which determine the probability that a loan applicant would default on his or her payments.101 More complex scoring systems have, however, gradually entered the banking sector and are now also widely used,102 although mostly limited to loan granting.

In theory, credit scoring eliminates the prejudiced attitudes that the individual bankers who previously decided on loans may have had.103 Moreover, since it is supposed to provide an accurate measure of a loan applicant’s financial capability, the system benefits both the individual and the creditor: loans that would put too much strain on the applicant’s finances, possibly even leading to a payment default, will not be granted.104

However, a number of the variables used in determining these scores can link loan decisions to specially protected attributes such as race or gender. For example, postal codes are sometimes used as a variable to decide whether a person should be granted a loan. The system compares an individual’s postal code with data on the percentage of loans granted and denied to people living in that same area in the past, and on how many applicants who were granted some kind of credit in that area defaulted on repayment.105 If the percentage of credit denials and/or defaults in that postal area is high, individuals applying from it are less likely to be granted a loan.
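The postal-code mechanism described above can be sketched in a few lines of code. This is a hypothetical illustration only: the area statistics, the weighting of denial and default rates, and the function names are all assumptions made for the example, not a description of any real scoring model.

```python
# Hypothetical sketch of the postal-code mechanism described above.
# AREA_HISTORY holds illustrative (invented) per-area statistics:
# the fraction of past applications denied, and the fraction of
# granted loans that later defaulted, keyed by postal code.
AREA_HISTORY = {
    "10001": {"denial_rate": 0.15, "default_rate": 0.03},
    "10456": {"denial_rate": 0.55, "default_rate": 0.18},
}

def area_risk_penalty(postal_code, history=AREA_HISTORY):
    """Return a risk penalty in [0, 1] based on the applicant's area.

    A higher penalty lowers the chance of approval regardless of the
    applicant's individual finances -- this is how the postal code
    operates as a proxy variable.
    """
    stats = history.get(postal_code)
    if stats is None:
        return 0.0  # no area history: no penalty in this sketch
    # Assumed weighting of past defaults and past denials in the area.
    return 0.7 * stats["default_rate"] + 0.3 * stats["denial_rate"]

def approve(base_score, postal_code, threshold=0.5):
    """Approve the loan if the area-adjusted score clears the threshold."""
    return base_score - area_risk_penalty(postal_code) >= threshold

# Two applicants with identical individual scores receive different
# decisions purely because of where they live.
print(approve(0.6, "10001"))  # True
print(approve(0.6, "10456"))  # False
```

As the final two calls show, the same individual score leads to opposite outcomes depending on the applicant’s area, which is precisely the disparate effect the text attributes to proxy variables.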

The problem with using applicants’ postal codes is that racial minorities at risk of discrimination, which generally comprise a significant percentage of the immigrant population, tend to live in the same neighbourhoods, where individuals of lower socioeconomic status also tend to live. Consequently, whether because those who made loan-granting decisions in the past were prejudiced against racial minorities, so that the percentage of denied loans is higher in minority neighbourhoods, or because racial minorities tend to live in poorer neighbourhoods, the use of a proxy variable such as the applicant’s postal code makes them more likely to be denied loans or to face harsher conditions.106 In this sense, Avery et al.107 found that credit scores predicting a higher probability of repayment default tend to correlate with areas with a high presence of minority populations; harsher loan conditions therefore correlate with minority populations.108

Women are also discriminated against in access to credit, partly as a consequence of the gender pay gap, which results in women generally having lower scores than men. Even though women’s overall debt is lower than men’s, women generally use a higher percentage of their available credit because they have a lower average credit limit, which lowers their scores.109 Furthermore, Henderson et al.110 found that, when risk scores were used to determine loan eligibility for business start-ups, controlling for other factors, the differences in loan conditions due to race and gender were amplified rather than reduced.
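The credit-utilisation effect described above can be made concrete with a small numerical sketch. The penalty formula and the figures are invented for illustration; real scoring models weight utilisation differently, and the point here is only the direction of the effect.

```python
# Minimal sketch of the credit-utilisation effect described above
# (illustrative formula and figures, not any real scoring model).
def utilisation(balance, credit_limit):
    """Fraction of available credit currently in use."""
    return balance / credit_limit

def utilisation_penalty(balance, credit_limit, weight=100):
    """Points deducted from a score as utilisation rises (assumed linear)."""
    return weight * balance / credit_limit

# An applicant with LOWER absolute debt but a lower credit limit ends
# up with higher utilisation, and therefore a larger score penalty,
# than an applicant with more debt but a higher limit.
print(utilisation_penalty(2000, 10000))  # 20.0  (higher debt, higher limit)
print(utilisation_penalty(1800, 4000))   # 45.0  (lower debt, lower limit)
```

This mirrors the pattern in the text: lower overall debt does not protect against a worse score when the available credit limit is also lower.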

