4.3. Profiling and automated decision-making: descriptive, predictive, classification and recommendation purposes

It is important to highlight the role profiling currently plays in algorithmic decision-making for two main reasons. On the one hand, article 22 of the GDPR specifically mentions profiling as a form of automated processing when it indicates that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”, which highlights the relevance of this form of data processing. On the other hand, and more importantly, profiling is the key tool through which algorithms are used in decision-making processes that affect humans. Profiling is generally the first step in the decision-making process. Profiles are used to classify individuals, and those classifications are then used to make predictions regarding future individual or collective behaviour. Profiles are therefore the basis upon which the algorithm will produce recommendations or make automated decisions.176 Moreover, the creation of profiles is, in itself, a form of automated decision-making, since the profiling algorithm decides the categories into which the individuals whose data are processed will be classified and the parameters that will be measured and evaluated.

Within profiling, algorithms can be used for (1) descriptive and (2) classification or predictive purposes, whereas algorithms used in automated decision-making in the strict sense are used for (3) recommendation purposes.177 These three objectives tend to work together.

Algorithms used for descriptive purposes establish patterns and relationships between different pieces of data, which is especially relevant for creating profiles. For example, an algorithm used by taxation authorities might determine that there is a correlation between making large deductible donations and tax evasion. Classification or predictive systems are the next step in the process. They work by establishing a series of categories or classes such that, if a given combination of data points concurs in an individual, she will be classified into a certain category.178 Certain types of behaviour are then predicted for all the individuals included in a class or category. Hence, the fact that an individual has made a large deductible donation will, in combination with other pieces of data, lead to that person being placed in the category of individuals predicted to be at high risk of committing tax fraud.
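
To make these two steps more concrete, the following is a minimal sketch in Python of how a profiling system might implement the descriptive and classification/predictive stages, built around the tax example above. The feature names, training data, risk labels and the choice of a decision-tree learner are illustrative assumptions, not a description of any real tax authority system.

```python
# Purely illustrative sketch (not from the book): the descriptive and
# classification/predictive steps of profiling, using the hypothetical
# tax-fraud example from the text. Features, data and labels are invented.
from sklearn.tree import DecisionTreeClassifier

# Each row is one fictional taxpayer:
# [deductible_donations_eur, declared_income_eur, amended_returns]
X_train = [
    [12000, 30000, 2],
    [15000, 28000, 3],
    [200,   30000, 0],
    [150,   55000, 0],
    [18000, 32000, 1],
    [300,   60000, 0],
]
# Outcome observed in past audits: 1 = tax fraud detected, 0 = none
y_train = [1, 1, 0, 0, 1, 0]

# Descriptive step: the learner extracts patterns and correlations
# between the data points (e.g. large donations relative to income).
model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X_train, y_train)

# Classification/predictive step: a new individual whose data points
# match the learned pattern is placed in a risk category.
new_taxpayer = [[14000, 29000, 1]]
category = "high-risk" if model.predict(new_taxpayer)[0] == 1 else "low-risk"
print(f"Predicted category: {category}")
```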

Finally, once the descriptive and classification/predictive objectives have been achieved, algorithms can also be used as “systems of recommendation”,179 since, once an individual has been predicted to behave in a certain way, the machine can recommend the best action to address said conduct. This classification by objectives is not as structured or systematic in the actual operation of automated systems, but it helps provide an overview of how they work and are used. Algorithms can be used as systems of recommendation in order to inform final decisions made by humans (semi-automated decision-making) or can be directly responsible for making the final decision (automated decision-making).
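
The recommendation step, and the distinction between semi-automated and fully automated decision-making, can be sketched in the same illustrative vein. The categories, recommended actions and the human_review flag below are hypothetical assumptions introduced only for this example, not features of any system described in the text.

```python
# Purely illustrative continuation of the sketch above: the recommendation
# step and the difference between semi-automated and fully automated
# decision-making. Categories and actions are invented for illustration.

RECOMMENDED_ACTIONS = {
    "high-risk": "open a tax audit",
    "low-risk": "no further action",
}

def recommend(category: str) -> str:
    """System of recommendation: map the predicted class to an action."""
    return RECOMMENDED_ACTIONS[category]

def decide(category: str, human_review: bool) -> str:
    """Return the outcome under semi-automated or automated decision-making."""
    action = recommend(category)
    if human_review:
        # Semi-automated: the recommendation merely informs a human decision.
        return f"recommendation forwarded to a case officer: {action}"
    # Fully automated: the system itself takes the final decision
    # (the scenario addressed by article 22 GDPR).
    return f"decision taken automatically: {action}"

print(decide("high-risk", human_review=True))
print(decide("high-risk", human_review=False))
```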

16. De Mauro, A. et al., “What is big data? A consensual definition and a review of key research topics”, paper presented at the 4th International Conference on Integrated Information, Madrid, 5-8th September 2014, p. 97.

17. Ward, J., & Barker, A., “Undefined by data: a survey of big data definitions”, 2013, p. 1. Available on 20th September 2018 at: https://arxiv.org/

18. Franks, B., Taming the Big Data Tidal Wave, Hoboken, New Jersey, John Wiley & Sons, 2012, p. 4.

19. Laney, D., “3D data management: controlling data volume, velocity and variety”, 6th February 2001. Available on 20th September 2018 at: https://blogs.gartner.com/

20. Article 29 Working Party, “Opinion 03/2013 on purpose limitation”, 00569/13/EN, WP 203, 2nd April 2013, p. 35.

21. Cukier, K. & Mayer-Schoenberger, V., “The rise of big data: how it’s changing the way we think about the world”, Foreign Affairs, vol. 92, No. 93, 2013, p. 29.

22. Taylor, C., “Structured vs. Unstructured data”, Datamation, 28th March 2018. Available on 13th June 2019 at: https://www.datamation.com/.

23. Connolly, T. & Begg, C., Database Systems: A Practical Approach to Design, Implementation, and Management, Essex, Pearson, 6th ed., 2014, p. 1130.

24. Taylor, C., “Structured vs. Unstructured data”, cit., 2018.

25. De Mauro, A. et al., “What is big data?…”, cit., 2014, pp. 6-7.

26. Gil González, E., Big data, Privacidad y Protección de Datos, Madrid, Agencia Española de Protección de Datos, 2016, p. 18.

27. Franks, B., Taming the big data tidal wave, cit., 2012, p. 4.

28. Custers, B., “Data dilemmas in the information society: introduction and overview”, in Custers, B., et al., (eds.), Discrimination and Privacy in the Information Society: Data Mining and Profiling in large Databases, Berlin, Springer, 2013, p. 8.

29. De Mauro, A. et al., “What is big data?…”, cit., 2014, pp. 2-3.

30. Franks, B., Taming the big data tidal wave, cit., 2012, p. 6.

31. Ibidem.

32. Cukier, K. & Mayer-Schoenberger, V., “The rise of big data…”, cit., 2013, p. 29; Lerman, J., “Big data and its exclusions”, Stanford Law Review Online, No. 66, 2013, p. 57.

33. Gil González, E., Big data, Privacidad y Protección de Datos, cit., 2016, p. 18.

34. Cukier, K. & Mayer-Schoenberger, V., “The rise of big data…”, cit., 2013, p. 29.

35. Custers, B., “Data dilemmas in the information society: introduction and overview”, cit., 2013, p. 9.

36. Ward, J. & Barker, A., “Undefined by data…”, cit., 2013, p. 1.

37. US Executive Office of the President, “Big data: seizing opportunities, preserving values”, 2014, pp. 43-44.

38. Article 4.7 GDPR.

39. Article 4.8 GDPR.

40. Fayyad, U. M., Piatetsky-Shapiro, G. & Smyth, P., “From data mining to knowledge discovery in databases”, AI Magazine, vol. 17, No. 3, 1996, p. 39.

41. Hildebrandt, M. & Koops, B. J., “The challenges of ambient law and legal protection in the profiling era”, The Modern Law Review, vol. 73, No. 3, 2010, pp. 431-432.

42. Hand, D. J., “Data mining: statistics and more?”, The American Statistician, vol. 52, No. 2, 1998, p. 112.

43. Sahu, H., Shrma, S. & Gondhalakar, S., “A brief overview on data mining survey”, International Journal of Computer Technology and Electronics Engineering (IJCTEE), vol. 1, No. 3, 2013, p. 114.

44. Murphy, K. P., Machine Learning: A Probabilistic Perspective, Cambridge (Massachusetts), The MIT Press, 2012, p. 1.

45. Tegmark, M., Life 3.0. Being Human in the Age of Artificial Intelligence, London, Penguin Books, 2017, pp. 97-107.

46. Surden, H., “Machine learning and law”, Washington Law Review, vol. 89, 2014, pp. 89-90.

47. Witten, I. H. et al., Data Mining: Practical Machine Learning Tools and Techniques, 4th ed., Cambridge (Massachusetts), Morgan Kaufman, 2017, p. 28.

48. Murphy, K. P., Machine Learning…, cit., 2012, p. 1: “Most closely related to data mining is without doubt machine learning. There is a big overlap between the two communities, and over time the difference became less relevant and boundaries are beginning to blur. Traditionally, machine learning is about learning to perform a task, whereas data mining is more about “finding knowledge from the data”. Both are tightly connected; on the one hand, in general, useful knowledge extracted from given examples of a task will allow for performing the task better, whereas on the other hand, during the learning process of a task, knowledge about the task will have to be accumulated in one form or another, from the examples, and be stored in the system. Given its task-oriented nature, historically one can see the ML community having a strong focus on supervised tasks, whereas data mining is more concerned with unsupervised tasks.”

49. Oquendo, M. A. et al., “Machine learning and data mining: strategies for hypothesis generation”, Molecular Psychiatry, vol. 17, No. 10, 2012, p. 957.

50. Murphy, K. P., Machine Learning…, cit., 2012, p. 1.

51. Idem, p. 16.

52. Tegmark, M., Life 3.0. Being human in the age of Artificial Intelligence, cit., 2017, p. 97.

53. Grossfeld, B., “A simple way to understand machine learning vs deep learning”, Zendesk, 18th July 2017. Available on 31st January 2019 at: https://www.zendesk.com/.

54. Liu, J. & Wu, C., “Deep learning based recommendation: a survey” in Kim K. & Joukov N. (eds), Information Science and Applications 2017. ICISA 2017, Lecture Notes in Electrical Engineering, vol. 424, Singapore, Springer, p. 452; Coglianese, C. & Lehr, D., “Regulating by robot: administrative decision making in the machine-learning era”, The Georgetown Law Journal, vol. 105, No. 5, 2017, p. 1160.

55. Lee, J. G. et al., “Deep learning in medical imaging: general overview”, Korean Journal of Radiology, vol. 18, No. 4, 2017, pp. 570-584.

56. Wang, J. et al., “Deep learning for smart manufacturing: methods and applications”, Journal of Manufacturing Systems, vol. 48, 2018, pp. 144-156.

57. Bostrom, N., Superintelligence. Paths, Dangers, Strategies, Oxford, Oxford University Press, 2014, pp. 179-180.

58. Grossfeld, B., “A simple way to understand machine learning vs deep learning”, cit., 2017.

59. Ibidem.

60. Hildebrandt, M. & Koops, B. J., “The challenges of ambient law and legal protection in the profiling era”, cit., 2010, p. 432.

61. Ibidem.

62. Custers, B., “Data dilemmas in the information society: introduction and overview”, cit., 2013, p. 7.

63. O’Neil, C., Weapons of Math Destruction…, cit., 2017, p. 6.

64. Hildebrandt, M. & Koops, B. J., “The challenges of ambient law and legal protection in the profiling era”, cit., 2010, p. 432.

65. Hastie, T., Tibshirani, R. & Friedman, J., The Elements of Statistical Learning: Data Mining, Inference and Prediction, Berlin, Springer, 2009, p. xi.

66. Lehr, D. & Ohm, P., “Playing with the data: what legal scholars should learn about machine learning”, UC Davis Law Review, vol. 51, No. 2, 2017, p. 676.

67. Ibidem.

68. O’Neil, C., Weapons of Math Destruction…, cit., 2017, p. 18.

69. Monasterio Astobiza, A., “Ética algorítmica: Implicaciones éticas de una sociedad cada vez más gobernada por algoritmos”, Dilemata, No. 24, 2017, p. 185.

70. Suthaharan, S., Machine Learning Models and Algorithms for Big Data Classification: Thinking with Examples for Effective Learning, New York, Springer, 2015, p. 123.

71. Coglianese, C. & Lehr, D., “Regulating by robot…”, cit., 2017, p. 1158.

72. Lehr, D. & Ohm, P., “Playing with the data…” cit., 2017, pp. 671-672.

73. Coglianese, C. & Lehr, D., “Regulating by robot…”, cit., 2017, p. 1158.

74. Ibidem.

75. The technologies analysed with regard to the data protection legal framework, and for which a new regulatory framework shall be proposed, are mainly referred to as algorithms and automated systems and, in some cases, data processing technologies, models and software programmes.

76. Parloff, R., “Why deep learning is suddenly changing your life”, cit., 2019.

77. Monasterio Astobiza, A., “Ética algorítmica…”, cit., 2017, p. 188.

78. Cath, C. et al., “Artificial intelligence and the ‘good society’: the US, EU and UK approach”, Science and Engineering Ethics, vol. 24, No. 2, 2018, p. 506.

79. Ranchordás, S., “Nudging citizens through technology in smart cities”, International Review of Law, Computers & Technology, vol. 33, 2019, pp. 1-23.

80. Coglianese, C. & Lehr, D., “Regulating by robot…”, cit., 2017, pp. 1152-1153.

81. Idem, p. 1180.

82. Monasterio Astobiza, A., “Ética algorítmica…”, cit., 2017, p. 188.

83. European Parliament, “European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics”, 2015/2103(INL), 2017; US Executive Office of the President, “Artificial intelligence, automation and the economy”, 2016; UK Government Office for Science, “Artificial intelligence: an overview for policymakers”, 2016.

84. The substitution of human workers will mostly affect low- and middle-income workers. The changes caused by automation entail the need for a highly specialised workforce trained in highly technical skills, meaning governments will have to design employment and education policy accordingly in order to avoid a massive increase in economic inequality resulting from a surplus of unspecialised workers whose skills are no longer required in the digital and automated economy. US Executive Office of the President, “Artificial intelligence, automation and the economy”, cit., 2016, pp. 13-21 and 26; Coglianese, C. & Lehr, D., “Regulating by robot…”, cit., 2017, p. 1150; Cath, C. et al., “Artificial intelligence and the ‘good society’…”, cit., 2018, p. 510.

85. Guzella, T. S. & Caminhas, W. M., “A review of machine learning approaches to spam filtering”, Expert Systems with Applications, vol. 36, No. 7, 2009, pp. 10206-10222.

86. Davies, D., “How search engine algorithms work: everything you need to know”, Search Engine Journal, 10th May 2018.

87. Autonomous cars have become a very prominent topic in discussions regarding automation due to the ethical dilemmas that arise over the instructions the car should follow in cases in which it has to choose between putting either the passenger’s or a pedestrian’s life at risk. See, for example, Renda, A., “Ethics, algorithms and self-driving cars – a CSI of the ‘trolley problem’”, CEPS Policy Insights, No. 2018/02, 2018, pp. 1-15.

88. US Executive Office of the President, “Big data…”, cit., 2014, p. 44.

89. Kroll, J. et al., “Accountable algorithms”, University of Pennsylvania Law Review, vol. 165, No. 3, 2017, p. 658.

90. Furlow, B., “IBM Watson collaboration aims to improve oncology decision support tools”, The Journal of Oncology, 16th March 2016; US Executive Office of the President, “Big data…”, cit., 2014, p. 45.

91. Pasquale, F., The Black Box Society: The Secret Algorithms that Control Money and Information, Cambridge (Massachusetts), Harvard University Press, 2015, p. 25.

92. US District Court for the Northern District of Georgia, Atlanta division, “Complaint for permanent injunction and other equitable relief at 35 FTC v. Compucredit Corp”, No. 1:08-CV-1976-BBM, 2008.

93. Abdou, H. A., & Pointon, J. “Credit scoring, statistical techniques and evaluation criteria: a review of the literature”, Intelligent Systems in Accounting, Finance & Management, vol. 18, No. 2-3, 2011, pp. 59-88.

94. Citron, D. K. & Pasquale, F., “The scored society: due process for automated predictions”, Washington Law Review Online, vol. 89, 2014, p. 8.

95. Idem, p. 9.

96. In fact, there is even a dating website that uses credit scores as the main criterion for matching the customers who hire its services. See Credit Score Dating. Available on 27th March 2019 at: www.creditscoredating.com

97. Consumer Reports, “The secret score behind your auto insurance”, 10th August 2006.

98. Ball, K., “Blacklists and black holes: credit scoring in Europe”, in Webster, W., & Ball, K., (eds.), Surveillance and Democracy in Europe: Courting Controversy, Oxon, Routledge, 2019, p. 69.

99. Payne, A., “Credit score systems across the world”, Graydon, 9th February 2015. Available on 25th February 2019 at: https://www.graydon.co.uk/

100. Ibidem.

101. Kayne, C., “Do credit scores matter outside the US?”, CNBC, 9th February 2011. Available on 25th February 2019 at: https://www.cnbc.com/

102. Ball, K., “Blacklists and black holes…”, cit., 2019, p. 71.

103. O’Neil, Weapons of Math Destruction…, cit., 2017, p. 145.

104. Ball, K., “Blacklists and black holes…”, cit., 2019, p. 70.

105. O’Neil, Weapons of Math Destruction…, cit., 2017, p. 146.

106. Jones Havard, C., “‘On the take’: the black box of credit scoring and mortgage discrimination”, Public Interest Law Journal, vol. 20, 2011, p. 283.

107. Avery, R. B. et al., “Credit scoring: statistical issues and evidence from credit-bureau files”, Real Estate Economics, vol. 28, 2000, p. 537.

108. Pasquale, F., The Black Box Society…, cit., 2015, p. 41.

109. Zarya, V., “Why being a woman hurts your credit score”, Fortune, 10th February 2016. Available on 19th April 2019 at: http://fortune.com/

110. Henderson, L. et al., “Credit where credit is due?: race, gender, and discrimination in the credit scores of business startups”, The Review of Black Political Economy, vol. 42, 2015, p. 477: “…not only do credit scores fail to explain racial and gender differences in credit lines, they appear to mask the size and significance of such differences.”

111. Burt, A. & Volchenboum, S., “How healthcare changes when algorithms start making diagnoses”, Harvard Business Review, 8th March 2018. Available on 25th March 2019 at: https://hbr.org/

112. See, in general, Sánchez-Martínez, F. I., Abellán-Perpiñán, J. M. & Oliva-Moreno, J., “Privatization in healthcare management: an adverse effect of the economic crisis and a symptom of bad governance. SESPAS report 2014”, Gaceta Sanitaria, vol. 28, No. 1, 2014, pp. 75-80.

113. Chen, M. et al., “Disease prediction by machine learning over big data from healthcare communities”, IEEE Access, vol. 5, 2017, pp. 8869-8879.

114. Malgieri, G. & Comandé, G., “Sensitive-by-distance: quasi-health data in the algorithmic era”, Information & Communications Technology Law, vol. 26, No. 3, 2017a, p. 231.

115. CareerBuilder, “More than half of HR managers say artificial intelligence will become a regular part of HR in next 5 years”, 18th May 2017. Available on 12th February 2019 at: https://www.prnewswire.com/.

116. Faliagka, E. et al., “On-line consistent ranking on e-recruitment: seeking the truth behind a well-formed CV”, Artificial Intelligence Review, No. 42, 2014, p. 516; Faliagka, E., Ramantas, K. & Tsakalidis, A., “Application of machine learning algorithms to an online recruitment system”, paper presented at the 7th International Conference on Internet and Web Applications and Services, 2017.

117. Bogen, M. & Rieke, A., “Help wanted: an examination of hiring algorithms, equity and bias”, Upturn, 2018, p. 17.

118. Kirimi, J. M. & Moturi, C. A., “Application of data mining classification in employee performance prediction”, International Journal of Computer Applications, vol. 146, No. 7, 2016.

119. Dastin, J., “Amazon scraps secret AI recruiting tool that showed bias against women”, Reuters, 10th October 2018.

120. Ibidem.

121. Ibid.

122. Bar-Gill, O., “Algorithmic price discrimination when demand is a function of both preferences and (mis)perceptions”, Chicago Law Review, vol. 86, No. 2, 2019, p. 219.

123. Valletti, T. & Wu, J., “Consumer profiling with data requirements”, Production and Operations Management, vol. 29, No. 2, 2020, pp. 309-329; US Executive Office of the President, “Big data…”, cit., 2014, p. 44.

124. Bing, J., “Code, access and control”, cit., 2005, pp. 203-204.

125. Coglianese, C. & Lehr, D., “Regulating by robot…”, cit., 2017, p. 1161.

126. Ibidem.

127. Eubanks, V., Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, New York, St Martin’s Press, 2017.

128. See, in general, Sunstein, C. R. & Thaler, R. H., Nudge: Improving Decisions about Health, Wealth and Happiness, New Haven, Yale University Press, 2008.

129. Ranchordás, S., “Nudging citizens through technology in smart cities”, cit., 2019, p. 18.

130. Cobbe, J., “Administrative law and the machines of government: judicial review of automated public-sector decision-making”, Legal Studies, vol. 39, No. 4, 2019, p. 654.

131. Cosculluela Montaner, L., Manual de Derecho Administrativo, Cizur Menor (Aranzadi), 27th ed., 2017, p. 604; Muñoz Machado, S., Tratado de Derecho Administrativo y de Derecho Público General. Tomo XIV. La Actividad Regulatoria de la Administración, Madrid, Boletín Oficial del Estado, 2015, p. 14.

132. Garrido Falla, F., “El concepto de servicio público en el derecho español”, Revista de Administración Pública, No. 135, 1994, p. 20.

133. US Executive Office of the President, “Big data…”, cit., 2014, p. 23.

134. Redacción Médica, “ ‘Big data’ e IA mejoran un 40% la detección precoz de la sepsis grave”, 10th March 2019.

135. Ledien, J. et al., “An algorithm applied to national surveillance data for the early detection of major dengue outbreaks in Cambodia”, PLOS One, vol. 14, No. 2, 2019, pp. 1-11.

136. Alimadadi, A. et al., “Artificial intelligence and machine learning to fight COVID-19”, Physiological Genomics, vol. 52, 2020, pp. 200-202.

137. O’Neil, C., Weapons of Math Destruction…, cit., 2017, pp. 3-11.

138. Ibidem.

139. US Department of Education, “Enhancing teaching and learning through educational data mining and learning analytics”, October 2012.

140. Alston, P., “Digital welfare states and human rights”, UN Special Rapporteur on extreme poverty and human rights, report A/74/493, 11th October 2019, pp. 10-11.

141. Empresa Municipal de la Vivienda y el Suelo, “Procedimiento de adjudicación”, Ayuntamiento de Madrid. Available on 27th November 2019 at: https://www.emvs.es/.

142. Regulation of 20th December 2018 on the adjudication of housing managed by the Municipal Housing and Land Company of Madrid (article 13).

143. Belmonte, E., “La aplicación del bono social del Gobierno niega la ayuda a personas que tienen derecho a ella”, CIVIO, 16th May 2019.

144. CIVIO, “Que se nos regule mediante código fuente o algoritmos secretos es algo que jamás debe permitirse en un Estado social, democrático y de Derecho”, CIVIO, 2nd July 2019.

145. Szigetvari, A., “Arbeitsmarktservice gibt grünes Licht für Algorithmus”, Der Standard, 17th September 2019. Available on 23rd January 2020 at: https://www.derstandard.at/

146. OECD, “Profiling tools for early identification of jobseekers who need extra support”, December 2018, p. 3.

147. Zarsky, T., “Understanding discrimination in the scored society”, Washington Law Review, vol. 89, No. 4, 2014, pp. 1375-1412.

148. Planet Labor, “Austria: an algorithm that evaluates the unemployed (briefly)”, 24th October 2018. Available on 23rd January 2020 at: https://www.planetlabor.com/.

149. OECD, “Profiling tools for early identification of jobseekers who need extra support”, December 2018; Szigetvari, A., “Arbeitsmarktservice gibt grünes Licht für Algorithmus”, cit., 2019.

150. Bellovin, S. M. et al., “When enough is enough: location tracking, mosaic theory, and machine learning”, NYU Journal of Law & Liberty, vol. 8, 2014, p. 612.

151. Oswald, M. & Grace, J., “Intelligence, policing and the use of algorithmic analysis: a freedom of information-based study”, Journal of Information Rights, Policy and Practice, vol. 1, No. 1, 2016, pp. 3-5.

152. O’Neil, C., Weapons of Math Destruction…, cit., 2017, p. 85.

153. Oswald, M. & Grace, J., “Intelligence, policing and the use of algorithmic analysis…”, cit., 2016, p. 4.

154. Ibidem.

155. Ibid.

156. Miró-Llinares, F., “Predictive policing: utopia or dystopia? On attitudes towards the use of big data algorithms for law enforcement”, Revista de Internet, Derecho y Política, No. 30, 2020, pp. 3-5.

157. Ferguson, A. G., “Big data and predictive reasonable suspicion”, University of Pennsylvania Law Review, vol. 163, No. 2, 2015, p. 335.

158. O’Neil, C., Weapons of Math Destruction…, cit., 2017, pp. 24-27; Ritter, N., “Predicting recidivism risk: new tool in Philadelphia shows great promise”, National Institute of Justice Journal, No. 271, 2013, pp. 4-13; Dressel, J. & Farid, H., “The accuracy, fairness, and limits of predicting recidivism”, Science Advances, vol. 4, No. 1, 2018.

159. O’Neil, C., Weapons of Math Destruction…, cit., 2017, pp. 24-27.

160. Ibidem.

161. Ibid.

162. Idem, p. 25.

163. Ibidem.

164. New York Civil Liberties Union, “Stop-and-frisk 2011”, 2012. Available on 19th February 2019 at: https://www.nyclu.org/.

165. Angwin, J. et al., “Machine bias: there’s software used across the country to predict future criminals. And it’s biased against blacks”, Propublica, 23rd May 2016. Available on 18th February 2019 at: https://www.propublica.org/.

166. Northpointe, “Risk assessment”. Available on 27th March 2019 at: https://www.documentcloud.org/.

167. Quijano-Sánchez, L. et al., “Applying automatic text-based detection of deceptive language to police reports: extracting behavioral patterns from a multi-step classification model to understand how we lie to the police”, Knowledge-Based Systems, vol. 149, 2018, pp. 155-168; Kolotúshkina, N., “VERIPOL: la herramienta de la Policía para detectar denuncias falsas”, RTVE, 2nd November 2018. Available on 19th February 2019 at: http://www.rtve.es/.

168. Caballé-Pérez, M. et al., “El quebrantamiento de las órdenes de protección en violencia de género: análisis de los indicadores de riesgo mediante el formulario vpr4.0”, Anuario de Psicología Jurídica, No. 30, 2020, pp. 63-72; Del Castillo, C., “Contra la violencia machista, el odio y las denuncias falsas: los algoritmos que usa la Policía”, eldiario.es, 1st January 2019; González Álvarez, J. L., “Sistema de seguimiento integral en los casos de violencia de género (sistema viogén)”, Cuadernos de la Guardia Civil: Revista de Seguridad Pública, No. 56, 2018, pp. 83-102.

169. Viñas Coll, J., “Así son los superordenadores de Montoro contra el fraude fiscal”, Cinco Días, 24th July 2015; López Zafra, J. M., “Patrones de comportamiento y voracidad fiscal”, El Confidencial, 14th July 2018.

170. Act 22/2018 of the Valencian government on the general inspection of services and on the system of alerts for the prevention of bad practices in the Valencian public administration and its instrumental public sector.

171. Capdeferro Villagrasa, O., “El análisis de riesgos como mecanismo central de un sistema efectivo de prevención de la corrupción. En particular, el sistema de alertas para la prevención de la corrupción basado en inteligencia artificial”, Revista Internacional de Transparencia e Integridad, No. 6, 2018, pp. 1-7.

172. Ibidem.

173. Yeung, K., “Why worry about decision-making by machine?”, in Yeung, K. & Lodge, M., (Eds.), Algorithmic regulation, Oxford, Oxford University Press, 2019, p. 22.

174. For example, article 22 of the GDPR prohibits decisions based solely on automated processing.

175. Citron, D. K., “Technological due process”, Washington University Law Review, vol. 85, No. 6, 2008, pp. 1271-1272; Parasuraman, R. & Miller, C. A., “Trust and etiquette in high-criticality automated systems”, Communications of the ACM, vol. 47, No. 4, 2004, p. 52.

176. Wiedemann, K., “Automated processing of personal data for the evaluation of personality traits: legal and ethical issues”, Max Planck Institute for Innovation and Competition Research Paper No. 18-04, 2018, p. 15. Available on 29th July 2019 at: https://ssrn.com/.

177. Schermer, B. W., “The limits of privacy in automated profiling and data mining”, Computer Law & Security Review, vol. 27, No. 1, 2011, p. 46; Yeung, K. & Lodge, M., “Algorithmic regulation: an introduction”, in Yeung, K. & Lodge, M., (Eds.), Algorithmic regulation, Oxford, Oxford University Press, 2019, p. 10.

178. Schermer, B. W., “The limits of privacy in automated profiling and data mining”, cit., 2011, p. 46; Yeung, K. & Lodge, M., “Algorithmic regulation…”, cit., 2019, p. 10.

179. Yeung, K. & Lodge, M., “Algorithmic regulation…”, cit., 2019, p. 10.
