Understand and Mitigate for Bias

When AIs are trained on data that captures examples of human behavior, they learn the unwanted human biases reflected in that data.

Amazon receives a small mountain of job applications each week. To triage resumes, the company built an experimental AI to surface candidates worthy of an interview. There was only one problem. According to a report by Reuters, it quickly became clear that the AI was biased against women. The fault lay not in how the AI was coded but in what it learned: historical bias in Amazon's hiring process. The unconscious bias of hiring managers had been codified in the AI's recommendations, which favored male candidates. Amazon canceled the project before it was ever deployed and reviewed its hiring practices. Sometimes AI holds a mirror up to our own humanity.
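To make that failure mode concrete, the simplest audit is to compare a model's recommendation rates across groups on held-out data. Below is a minimal sketch of such a check; the DataFrame, the column names (gender, recommended), and the numbers are hypothetical illustrations, not Amazon's actual data or pipeline.

```python
import pandas as pd

# Hypothetical audit set: one row per scored resume.
# "recommended" is 1 if the model surfaced the candidate for interview.
results = pd.DataFrame({
    "gender":      ["F", "M", "F", "M", "M", "F", "M", "M"],
    "recommended": [0,   1,   0,   1,   1,   1,   0,   1],
})

# Selection rate per group: the fraction of candidates surfaced.
rates = results.groupby("gender")["recommended"].mean()
print(rates)

# Disparate impact ratio. The "four-fifths rule" used in U.S.
# employment law flags ratios below roughly 0.8 as potential bias.
ratio = rates.min() / rates.max()
print(f"Disparate impact ratio: {ratio:.2f}")
```

An audit like this would have flagged Amazon's model quickly; the harder work is tracing the disparity back to the biased historical labels the model learned from.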

We should hold our AIs to even higher standards than we hold ourselves and strive to create AIs that don't reflect negative human bias. The state of New Jersey built an AI as part of its plan to do away with the bail bonds system. Cash bail has been part of the U.S. court system for centuries but is criticized for penalizing poorer defendants. According to a 2013 study by the Drug Policy Alliance of New Jersey, 75% of the state's jail population were people awaiting trial, with an average wait of 314 days. Almost 40% of those awaiting trial were there because they couldn't afford bail of $2,500 or less. Under New Jersey law, everyone had to be offered bail, no matter the crime, so the rich always avoided jail and bail bondsmen made good money.

In January 2017, New Jersey replaced its bail system with an AI that creates a public safety assessment (PSA) for each defendant. The PSA, used as a guide by judges, predicts the chance a defendant will commit a crime while awaiting trial and whether they will appear for their court date. The new system reduced the number of people in New Jersey jails awaiting trial by 30%.

The AI was trained on records of 1.5 million previous defendants across 300 jurisdictions. Race and gender information was deliberately removed from the training data. Developers also removed each defendant's name, education level, employment, income, and home address, because any of these can serve as a proxy for race or gender and thus disadvantage some demographic groups. The way we train our AIs matters. We should expect our AIs to operate ethically, fairly, and without bias, and leaders must ensure that their teams work hard to strip bias from every AI they build.
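The proxy-removal step is worth seeing in code. The sketch below is built on loud assumptions: a pandas DataFrame of defendant records with hypothetical column names, and a stand-in logistic regression rather than the PSA's actual risk-scoring method. It shows protected attributes and their likely proxies being dropped before the model ever sees the data.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical defendant records; the file name and column names are
# illustrative, not the actual PSA training data.
df = pd.read_csv("defendants.csv")

# Protected attributes are removed outright, along with fields that
# can act as proxies for race or gender.
PROTECTED = ["race", "gender"]
PROXIES = ["name", "education_level", "employment", "income", "home_address"]

# Assumes the remaining columns are numeric risk factors, e.g. prior
# failures to appear, pending charges, age at current arrest.
X = df.drop(columns=PROTECTED + PROXIES + ["failed_to_appear"])
y = df["failed_to_appear"]  # one outcome a PSA-style model predicts

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A simple stand-in model, not the PSA's real scoring method.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))
```

Dropping proxies is a necessary first step, not a guarantee: bias can still leak through correlations among the remaining features, which is why a model's outputs should also be audited for group-level disparities after training.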
