The Implications of Big Data for Large Companies

Long saddled with disparate sources of legacy data, corporations could for the first time successfully integrate these sources due to the cost and speed advantages resulting from Big Data technologies. Corporations could access new sources of information, such as unstructured data sources including documents, text, and pictures, and behavioral data captured through social media channels. The result was a growing sophistication in the data and analytics capabilities of mainstream companies.

Jonathan Bennett is a long-time top executive, EVP, CFO, and business head at The Hartford, a $26 billion insurance and investment firm founded in 1810 with a long history of actuarial analysis, where data has always mattered. Mr. Bennett has maintained a clear-eyed view of both the opportunity and the challenge represented by Big Data. In 2013, he noted that keeping a focus on cost and the benefits of better managing data “is just as important as breaking into new Big Data opportunities. If we can figure it out, cost reductions from the former will help fund expansion in the latter.”6

Big Data did not make traditional data management methods disappear overnight. Change seldom comes quickly. The apostles of traditional data management approaches battled back and demonstrated that data transformation is not always as simple as just “load and go.” Any transformation implies change. It required changes to business processes, technology tools, employee skill sets, and, most especially, human mindsets. Although some data engineering has been eliminated or reduced, and Big Data approaches have dramatically reduced the costs of data management, data still needs to be standardized, data quality maintained, and access provided to analyst communities. Data management will continue to be an evolutionary process for companies.

One year after their Stanford University conference, Accel would launch a second $100 million fund, with Accel partner Jake Flomenberg proclaiming, “Over the past few years, we've focused a tremendous amount of attention on what people like to call the ‘three Vs’ of big data: variety, volume, and velocity. We now believe there is a fourth V, which is end user value, and that hasn't been addressed to the same extent.”7 Accel's Ping Li added, “We are seeing an accelerated rate of innovation in big data, with the newest generation of entrepreneurs re-imagining ways to extract the most value out of big data and fundamentally change the way we work and process information.”8

Michael Stonebraker, a pioneer in the field of data management and the 2014 recipient of the Turing Award, the “Nobel Prize of Computing,” and a member of the faculty at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), has also been a champion of Big Data for well over a decade. He believes that for most large companies, Big Data is less about managing the “volume” of data they have, and much more about integrating the wide “variety” of data sources that are available to them – which can include data from legacy transaction systems, behavioral data sources, structured and unstructured data, and all sizes of data sets.9 Stonebraker has estimated that corporations manage to capture only a small fraction of their data for analysis. His focus is on expanding the sources and varieties of data that companies can bring under management.
