Enterprise AI For Dummies - Zachary Jarvinen

Demystifying Artificial Intelligence


IN THIS CHAPTER

Discovering how you can use AI

Recognizing the key technologies that make AI possible

Looking under the hood to see how it works

While some have traced the history of artificial intelligence back to Greek mythology and philosophers, fast-forward with me to the twentieth century when serious work on AI was directed to practical applications.

The term artificial intelligence was first used in a 1955 proposal for the Dartmouth Summer Research Project on Artificial Intelligence, in which American computer scientist John McCarthy and others wrote:

“We propose that a 2-month, 10-man study of artificial intelligence be carried out during the summer of 1956 at Dartmouth College in Hanover, New Hampshire. The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.”

Over the following decades, AI progress waxed and waned as development overcame one obstacle only to encounter another.

In this chapter, you get an idea of what, why, and how:

 What the fuss is all about, what AI can do for you, and what it can’t.

 Why now and not 20 years ago, and why AI is suddenly all the rage, with news everywhere you look about everything from self-driving cars to AI-powered showerheads.

 How it works, and how all the moving parts fit together to solve interesting and challenging problems.

Before I go any further, let me get a few definitions out of the way right up front so you’ll know what I mean when I use a term.

Algorithm: A set of rules and calculations used for problem-solving. Some compare an algorithm to the process you follow when you make dinner. The problem to be solved is getting a fully prepared meal on the table, and the algorithm consists of the recipes you use to turn ingredients into the dishes you will serve. An algorithm is not a magic formula; it’s just a regular kind of formula, or rather a set of formulas and instructions.
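The recipe analogy can be sketched in a few lines of code. This is a hypothetical, minimal example (not from the book): an algorithm is just a fixed set of steps that reliably turns inputs into a result.

```python
def average(values):
    """An algorithm: a fixed sequence of rules and calculations."""
    total = 0
    for v in values:                # step 1: add up every "ingredient"
        total += v
    return total / len(values)     # step 2: divide by the count

print(average([4, 8, 6]))          # prints 6.0
```

No magic involved: the same inputs always follow the same steps to the same output, which is all "algorithm" means here.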

Machine learning: A collection of algorithms that discover relationships in data with an associated level of confidence based on the likelihood, or probability, that it is a true relationship. Note that I didn’t say ML teaches the machine to think or make decisions the same way humans do. It’s just math. Some pretty fancy math, but still math.
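To make "discovering relationships in data with an associated level of confidence" concrete, here is a minimal sketch (an assumed illustration, not the book's example): fit a straight line to made-up data by least squares, and use the correlation coefficient as a rough confidence measure that the relationship is real. It's just math.

```python
def fit_line(xs, ys):
    """Discover a linear relationship y ~ slope*x + intercept in data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5   # correlation: strength of the relationship
    return slope, intercept, r

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]    # data that roughly follows y = 2x
slope, intercept, r = fit_line(xs, ys)
print(round(slope, 2), round(r, 3))
```

A correlation near 1.0 says the discovered relationship is very likely genuine; a value near 0 says the "pattern" is probably noise. Real machine-learning systems do far fancier versions of this, but the core idea is the same.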

Artificial intelligence: A collection of machine-learning and related technologies used to accomplish tasks that normally require human intelligence, such as to recognize and categorize speech, identify an object or person in a photo or video, or summarize the content of a social media post.

It comes down to pattern recognition. You can think of the human brain as a massively parallel pattern-recognition engine. AI enlists the processing power of computers to automate pattern recognition.
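One way to picture automated pattern recognition is the simplest possible classifier (a toy sketch with made-up data, not the book's example): label a new item by finding the known example it most resembles.

```python
def nearest_label(known, point):
    """Classify a point by the label of its nearest known example."""
    def dist2(p, q):  # squared distance: how dissimilar two points are
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(known, key=lambda kl: dist2(kl[0], point))[1]

examples = [((1, 1), "cat"), ((1, 2), "cat"),
            ((8, 8), "dog"), ((9, 8), "dog")]
print(nearest_label(examples, (2, 1)))   # prints cat
```

The computer isn't "thinking" about cats and dogs; it's matching a new pattern against stored ones, very fast, which is exactly the kind of work AI automates at scale.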

In some ways, AI is more powerful than the human brain, especially in how fast it can match certain patterns. JP Morgan Chase developed a machine-learning system that reviewed loan documents; work that had taken lawyers and loan officers a total of 360,000 hours was completed in less than a minute, and with fewer mistakes.

In other ways, the human brain is more powerful than current AI implementations. Humans can draw on every pattern they have learned before to put a new one in context, which makes them far more adaptable than AI, for now. For example, a photo of a chihuahua taken from a certain angle can look surprisingly like a blueberry muffin. A human can quickly tell which photos are chihuahuas and which are muffins. AI, not so much.

