Enterprise AI For Dummies - Zachary Jarvinen - Page 29

Storage


AI requires massive amounts of data, so massive that storing it calls for a repository technology known as a data lake. A data lake can hold all the data for an enterprise, including raw copies of source-system data and transformed, analysis-ready versions of that data.
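The raw-versus-transformed split described above can be sketched in a few lines. This is a minimal illustration, not a real data-lake implementation; the zone names (`raw`, `transformed`) and the helper functions are assumptions chosen for clarity, and the "transformation" here is a trivial stand-in for real cleaning logic.

```python
from pathlib import Path

def ingest(lake: Path, source: str, filename: str, payload: bytes) -> Path:
    """Land an untouched copy of source-system data in the lake's raw zone."""
    dest = lake / "raw" / source / filename
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_bytes(payload)
    return dest

def transform(lake: Path, source: str, filename: str) -> Path:
    """Store a cleaned version alongside the raw copy.

    Uppercasing is a stand-in for a real transformation step; the raw
    copy is never modified, so it can always be reprocessed later.
    """
    raw = (lake / "raw" / source / filename).read_bytes()
    dest = lake / "transformed" / source / filename
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_bytes(raw.upper())
    return dest

# Example: one source system ("crm") landing one file in both zones.
lake = Path("example_lake")
ingest(lake, "crm", "customers.csv", b"id,name\n1,ada\n")
transform(lake, "crm", "customers.csv")
```

The key design point the sketch reflects is that the raw zone is append-only: transformations write to a separate zone, so the original source data survives for future reprocessing.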

In the decade from 2010 to 2020, data storage changed more in price and availability than it had during the previous quarter century, and thanks to Moore's Law, that trend will continue. Laptop-peripheral solid-state drives priced at hundreds of dollars today have the same capacity as million-dollar hard-drive storage arrays from 20 years ago. Large-scale storage capacity now ranges up to hundreds of petabytes (hundreds of millions of gigabytes) and runs on low-cost commodity servers.

Combined with the advent of more powerful processors, smarter algorithms, and readily available data, the arrival of large-scale, low-cost storage set the stage for the AI explosion.

