Rule 4: AI Needs Parenting

All kids need parenting throughout childhood, from start to finish. Yet unlike children, whom we can reasonably expect to grow into self-sufficient adults, AI systems remain needy until the end of their software life cycle.

In the beginning you will have to do everything. Even if the software is readily available, the data has to be prepared, integrated, and validated. The system has to be trained thoroughly and tested over and over again, and then there is the great work of socialization: all coworkers who come into contact with the system will have to understand it, accept it, and work well with it. Once your system is firmly established, however, you will just need to check in and test it periodically to maintain your governance. Stay vigilant even then, because if the input data changes in some unexpected way, the whole system could go haywire. If it takes a village to raise a child, it takes an organization for AI to succeed!

Once fully functional, an AI system can be generally relied upon to complete its carefully scripted tasks, but beware when it comes to decision-making! At what age do you entrust children with authority? Autonomous algorithms left to make decisions can be like little kids with meds and guns.

Cringe-worthy examples abound. An AI algorithm in Idaho cut Medicaid assistance payments to four thousand disabled people, which prompted a major lawsuit. The database it relied on was riddled with gaps and errors. What did the people in charge expect?

Armed with AI, businesses can make themselves truly destructive if their people give in to natural recklessness. Want a global financial crisis worse than 2008–2009? Just set up AI-powered self-executing credit-default swaps. Want a criminal justice system devoid of reason and humanity? Turn it all over to computers. Want World War III? Turn the US president’s nuclear football into a fully autonomous algorithm.

Want to see a real, live, AI-powered social media platform make money at any cost, despite all the suicides, extrajudicial killings, child pornography, child brides, and illegal drugs? Take a close look at Facebook, which owns Instagram and WhatsApp.

Facebook’s AI did not catch the live-streamed massacre in New Zealand on March 15, 2019. It couldn’t: either it had not been trained to recognize such content, or its trained model failed. A Facebook user flagged the gruesome post within minutes, but the company did not react. Only after the police contacted the company, about an hour after the event, did Facebook remove the original video. Facebook, YouTube, Twitter, and Reddit then struggled to take down the 1.5 million re-postings of the slaughter.

People using AI need minds unbent by malice and gross negligence.
