Focus on Technology 2.1 Can Technology Be Prejudiced?
In 2015, Google Photos algorithmically labeled Black people as gorillas. In 2016, Snapchat offered a selfie-altering filter that depicted users as an offensive Asian caricature. Software that coded gorillas as black in color may have produced machine algorithms that applied that label to people with black skin. One study demonstrated that a Google search for a name more commonly given to African Americans (e.g., DeShawn, Darnell, or Jermaine) was more likely to display ads for companies that locate criminal records than a search for a name more commonly given to Whites (e.g., Geoffrey, Jill, or Emma; Sweeney, 2013). Amazon's facial detection technology labeled darker-skinned women as men 31% of the time. Law professor Frank Pasquale (2015) contends that machine algorithms are learning our stereotypes.