2 Silicon Valley Before Facebook

I think technology really increased human ability. But technology cannot produce compassion. —DALAI LAMA

The technology industry that gave birth to Facebook in 2004 bore little resemblance to the one that had existed only half a dozen years earlier. Before Facebook, startups populated by people just out of college were uncommon, and few succeeded. For the fifty years before 2000, Silicon Valley operated in a world of tight engineering constraints. Engineers never had enough processing power, memory, storage, or bandwidth to do what customers wanted, so they had to make trade-offs. Engineering and software programming in that era rewarded skill and experience. The best engineers and programmers were artists. Just as Facebook came along, however, processing power, memory, storage, and bandwidth went from being engineering limits to turbochargers of growth. The technology industry changed dramatically in less than a decade, but in ways few people recognized. What happened with Facebook and the other internet platforms could not have happened in prior generations of technology. The path the tech industry took from its founding to that change helps to explain both Facebook’s success and how it could do so much damage before the world woke up.

The history of Silicon Valley can be summed up in two “laws.” Moore’s Law, coined by a cofounder of Intel, stated that the number of transistors on an integrated circuit doubles every year. It was later revised to a more useful formulation: the performance of an integrated circuit doubles every eighteen to twenty-four months. Metcalfe’s Law, named for a founder of 3Com, said that the value of any network would increase as the square of the number of nodes. Bigger networks are geometrically more valuable than small ones. Moore’s Law and Metcalfe’s Law reinforced each other. As the price of computers fell, the benefits of connecting them rose. It took fifty years, but we eventually connected every computer. The result was the internet we know today, a global network that connects billions of devices and made Facebook and all other internet platforms possible.

Beginning in the fifties, the technology industry went through several eras. During the Cold War, the most important customer was the government. Mainframe computers, giant machines that were housed in special air-conditioned rooms, supervised by a priesthood of technicians in white lab coats, enabled unprecedented automation of computation. The technicians communicated with mainframes via punch cards connected by the most primitive of networks. In comparison to today’s technology, mainframes could not do much, but they automated large-scale data processing, replacing human calculators and bookkeepers with machines. Any customer who wanted to use a computer in that era had to accept a product designed to meet the needs of government, which invested billions to solve complex problems like moon trajectories for NASA and missile targeting for the Department of Defense. IBM was the dominant player in the mainframe era and made all the components for the machines it sold, as well as most of the software. That business model was called vertical integration. The era of government lasted about thirty years. Data networks as we think of them today did not yet exist. Even so, brilliant people imagined a world where small computers optimized for productivity would be connected on powerful networks. In the sixties, J. C. R. Licklider conceived the network that would become the internet, and he persuaded the government to finance its development. At the same time, Douglas Engelbart invented the field of human-computer interaction, which led him to create the first computer mouse and to conceive the first graphical interface. It would take nearly two decades before Moore’s Law and Metcalfe’s Law could deliver enough performance to enable their vision of personal computing and an additional decade before the internet took off.

Beginning in the seventies, the focus of the tech industry began to shift toward the needs of business. The era began with a concept called time sharing, which enabled many users to share the use of a single computer, reducing the cost to everyone. Time sharing gave rise to minicomputers, which were smaller than mainframes but still staggeringly expensive by today’s standards. Data networking began but was very slow and generally revolved around a single minicomputer. Punch cards gave way to terminals, keyboards attached to the primitive network, eliminating the need for a priesthood of technicians in white lab coats. Digital Equipment, Data General, Prime, and Wang led in minicomputers, which were useful for accounting and business applications but were far too complicated and costly for personal use. Although they were a big step forward relative to mainframes, even minicomputers barely scratched the surface of customer needs. Like IBM, the minicomputer vendors were vertically integrated, making most of the components for their products. Some minicomputers—Wang word processors, for example—addressed productivity applications that would be replaced by PCs. Other applications survived longer, but in the end, the minicomputer business would be subsumed by personal computer technology, if not by PCs themselves. Mainframes have survived to the present day, thanks in large part to giant, custom applications like accounting systems, which were created for the government and corporations and are cheaper to maintain on old systems than to re-create on new ones. (Massive server farms based on PC technology now attract any new application that needs mainframe-class processing; it is a much cheaper solution because you can use commodity hardware instead of proprietary mainframes.)

ARPANET, the predecessor to today’s internet, began as a Department of Defense research project in 1969 under the leadership of Bob Taylor, a computer scientist who continued to influence the design of systems and networks until the late nineties. Douglas Engelbart’s lab was one of the first nodes on ARPANET. The goal was to create a nationwide network to protect the country’s command and control infrastructure in the event of a nuclear attack.

The first application of computer technology to the consumer market came in 1972, when Al Alcorn created the game Pong as a training exercise for his boss at Atari, Nolan Bushnell. Bushnell’s impact on Silicon Valley went far beyond the games produced by Atari. He introduced the hippie culture to tech. White shirts with pocket protectors gave way to jeans and T-shirts. Nine to five went away in favor of the crazy but flexible hours that prevail even today.

In the late seventies, microprocessors made by Motorola, Intel, and others were relatively cheap and had enough performance to allow Altair, Apple, and others to make the first personal computers. PCs like the Apple II took advantage of the growing supply of inexpensive components, produced by a wide range of independent vendors, to deliver products that captured the imagination first of hobbyists, then of consumers and some businesses. In 1979, Dan Bricklin and Bob Frankston introduced VisiCalc, the first spreadsheet for personal computers. It is hard to overstate the significance of VisiCalc. It was an engineering marvel. A work of art. Spreadsheets on Apple IIs transformed the productivity of bankers, accountants, and financial analysts.

Unlike the vertical integration of mainframes and minicomputers, which limited product improvement to the rate of change of the slowest evolving part in the system, the horizontal integration of PCs allowed innovation at the pace of the most rapidly improving parts in the system. Because there were multiple, competing vendors for each component, systems could evolve far more rapidly than equivalent products subject to vertical integration. The downside was that PCs assembled this way lacked the tight integration of mainframes and minicomputers. This created a downstream cost in terms of training and maintenance, but that was not reflected in the purchase price and did not trouble customers. Even IBM took notice.

When IBM decided to enter the PC market, it abandoned vertical integration and partnered with a range of third-party vendors, including Microsoft for the operating system and Intel for the microprocessor. The first IBM PC shipped in 1981, signaling a fundamental change in the tech industry that only became obvious a couple of years later, when Microsoft’s and Intel’s other customers started to compete with IBM. Eventually, Compaq, Hewlett-Packard, Dell, and others left IBM in the dust. In the long run, though, most of the profits in the PC industry went to Microsoft and Intel, whose control of the brains and heart of the device and willingness to cooperate forced the rest of the industry into a commodity business.

ARPANET had evolved to become a backbone for regional networks of universities and the military. PCs continued the trend of smaller, cheaper computers, but it took nearly a decade after the introduction of the Apple II before technology emerged to leverage the potential of clusters of PCs. Local area networks (LANs) got their start in the late eighties as a way to share expensive laser printers. Once installed, LANs attracted developers, leading to new applications, such as electronic mail. Business productivity and engineering applications created incentives to interconnect LANs within buildings and then tie them all together over proprietary wide area networks (WANs) and then the internet. The benefits of connectivity overwhelmed the frustration of incredibly slow networks, setting the stage for steady improvement. It also created a virtuous cycle, as PC technology could be used to design and build better components, increasing the performance of new PCs that could be used to design and build even better components.

Consumers who wanted a PC in the eighties and early nineties had to buy one created to meet the needs of business. For consumers, PCs were relatively expensive and hard to use, but millions bought and learned to operate them. They put up with character-mode interfaces until Macintosh and then Windows finally delivered graphical interfaces that did not, well, totally suck. In the early nineties, consumer-centric PCs optimized for video games came to market.

The virtuous cycle of Moore’s Law for computers and Metcalfe’s Law for networks reached a new level in the late eighties, but the open internet did not take off right away. It required enhancements. The English researcher Tim Berners-Lee delivered the goods when he invented the World Wide Web in 1989 and the first web browser in 1991, but even those innovations were not enough to push the internet into the mainstream. That happened when a computer science student by the name of Marc Andreessen created the Mosaic browser in 1993. Within a year, startups like Yahoo and Amazon had come along, followed in 1995 by eBay, and the web that we now know had come to life.

By the mid-nineties, wireless networks had evolved to a point that enabled widespread adoption of cell phones and alphanumeric pagers. The big applications were phone calls and email, then text messaging. The consumer era had begun. The business era had lasted nearly twenty years—from 1975 to 1995—but no business complained when it ended. Technology aimed at consumers was cheaper and somewhat easier to use, exactly what businesses preferred. It also rewarded a dimension that had not mattered to business: style. It took a few years for any vendor to get the formula right.

The World Wide Web in the mid-nineties was a beautiful thing. Idealism and utopian dreams pervaded the industry. The prevailing view was that the internet and World Wide Web would make the world more democratic, more fair, and more free. One of the web’s best features was an architecture that inherently delivered net neutrality: every site was equal. In that first generation, everything on the web revolved around pages, every one of which had the same privileges and opportunities. Unfortunately, the pioneers of the internet made omissions that would later haunt us all. The one that mattered most was the choice not to require real identity. They never imagined that anonymity would lead to problems as the web grew.

Time would expose the naïveté of the utopian view of the internet, but at the time, most participants bought into that dream. Journalist Jenna Wortham described it this way: “The web’s earliest architects and pioneers fought for their vision of freedom on the Internet at a time when it was still small forums for conversation and text-based gaming. They thought the web could be adequately governed by its users without their need to empower anyone to police it.” They ignored early signs of trouble, such as toxic interchanges on message boards and in comments sections, which they interpreted as growing pains, because the potential for good appeared to be unlimited. No company had to pay the cost of creating the internet, which in theory enabled anyone to have a website. But most people needed tools for building websites, application servers, and the like. Into the breach stepped the “open source” community, a distributed network of programmers who collaborated on projects that created the infrastructure of the internet. Andreessen came out of that community. Open source had great advantages, most notably that its products delivered excellent functionality, evolved rapidly, and were free. Unfortunately, there was one serious problem with the web and open source products: the tools were not convenient or easy to use. The volunteers of the open source community had one motivation: to build the open web. Their focus was on performance and functionality, not convenience or ease of use. That worked well for the infrastructure at the heart of the internet, but not so much for consumer-facing applications.

The World Wide Web took off in 1994, driven by the Mosaic/Netscape browser and sites like Amazon, Yahoo, and eBay. Businesses embraced the web, recognizing its potential as a better way to communicate with other businesses and consumers. This change made the World Wide Web geometrically more valuable, just as Metcalfe’s Law predicted. The web dominated culture in the late nineties, enabling a stock market bubble and ensuring near-universal adoption. The dot-com crash that began in early 2000 left deep scars, but the web continued to grow. In this second phase of the web, Google emerged as the most important player, organizing and displaying what appeared to be all the world’s information. Apple broke the code on tech style—their products were a personal statement—and rode the consumer wave to a second life. Products like the iMac and iPod, and later the iPhone and iPad, restored Apple to its former glory and then some. At this writing, Apple is one of the most valuable companies in the world. (Fortunately, Apple is also the industry leader in protecting user privacy, but I will get to that later.)

In the early years of the new millennium, a game-changing model challenged the page-centric architecture of the World Wide Web. Called Web 2.0, the new architecture revolved around people. The pioneers of Web 2.0 included people like Mark Pincus, who later founded Zynga; Reid Hoffman, the founder of LinkedIn; and Sean Parker, who had cofounded the music file-sharing company Napster. After Napster, Parker launched a startup called Plaxo, which put address books in the cloud. It grew by spamming every name in every address book to generate new users, an idea that would be copied widely by social media platforms that launched thereafter. In the same period, Google had a brilliant insight: it saw a way to take control of a huge slice of the open internet. No one owned open source tools, so there was no financial incentive to make them attractive for consumers. They were designed by engineers, for engineers, which could be frustrating to non-engineers.

Google saw an opportunity to exploit the frustration of consumers and some business users. Google made a list of the most important things people did on the web, including searches, browsing, and email. In those days, most users were forced to employ a mix of open source and proprietary tools from a range of vendors. Most of the products did not work together particularly well, creating a friction Google could exploit. Beginning with Gmail in 2004, Google created or acquired compelling products in maps, photos, videos, and productivity applications. Everything was free, so there were no barriers to customer adoption. Everything worked together. Every app gathered data that Google could exploit. Customers loved the Google apps. Collectively, the Google family of apps replaced a huge portion of the open World Wide Web. It was as though Google had unilaterally put a fence around half of a public park and then started commercializing it.

The steady march of technology in the half century prior to 2000 produced so much value—and so many delightful surprises—that the industry and customers began to take positive outcomes for granted. Technology optimism was not equivalent to the law of gravity, but engineers, entrepreneurs, and investors believed that everything they did made the world a better place. Most participants bought into some form of the internet utopia. What we did not realize at the time was that the limits imposed by not having enough processing power, memory, storage, and network bandwidth had acted as a governor, limiting the damage from mistakes to a relatively small number of customers. Because the industry had done so much good in the past, we all believed that everything it would create in the future would also be good. It was not a crazy assumption, but it was a lazy one that would breed hubris.

When Zuck launched Facebook in early 2004, the tech industry had begun to emerge from the downturn caused by the dot-com meltdown. Web 2.0 was in its early stages, with no clear winners. For Silicon Valley, it was a time of transformation, with major change taking place in four arenas: startups, philosophy, economics, and culture. Collectively, these changes triggered unprecedented growth and wealth creation. Once the gravy train started, no one wanted to get off. When fortunes can be made overnight, few people pause to ask questions or consider side effects.

The first big Silicon Valley change related to the economics of startups. Hurdles that had long plagued new companies evaporated. Engineers could build world-class products quickly, thanks to the trove of complementary software components, like the Apache server and the Mozilla browser, from the open source community. With open source stacks as a foundation, engineers could focus all their effort on the valuable functionality of their app, rather than building infrastructure from the ground up. This saved time and money. In parallel, a new concept emerged—the cloud—and the industry embraced the notion of centralization of shared resources. The cloud is like Uber for data—customers don’t need to own their own data center or storage if a service provides it seamlessly from the cloud. Today’s leader in cloud services, Amazon Web Services (AWS), leveraged Amazon.com’s retail business to create a massive cloud infrastructure that it offered on a turnkey basis to startups and corporate customers. By enabling companies to outsource their hardware and network infrastructure, paying a monthly fee instead of the purchase price of an entire system, services like AWS lowered the cost of creating new businesses and shortened the time to market. Startups could mix and match free open source applications to create their software infrastructure. Updates were made once, in the cloud, and then downloaded by users, eliminating what had previously been a very costly and time-consuming process of upgrading individual PCs and servers. This freed startups to focus on their real value added, the application that sat on top of the stack. Netflix, Box, Dropbox, Slack, and many other businesses were built on this model.

Thus began the “lean startup” model. Without the huge expense and operational burden of creating a full tech infrastructure, new companies did not have to aim for perfection when they launched a new product, which had been Silicon Valley’s primary model to that point. For a fraction of the cost, they could create a minimum viable product (MVP), launch it, and see what happened. The lean startup model could work anywhere, but it worked best with cloud software, which could be updated as often as necessary. The first major industry created with the new model was social media, the Web 2.0 startups that were building networks of people rather than pages. Every day after launch, founders would study the data and tweak the product in response to customer feedback. In the lean startup philosophy, the product is never finished. It can always be improved. No matter how rapidly a startup grew, AWS could handle the load, as it demonstrated in supporting the phenomenal growth of Netflix. What in earlier generations would have required an army of experienced engineers could now be accomplished by relatively inexperienced engineers with an email to AWS. Infrastructure that used to require a huge capital investment could now be leased on a monthly basis. If the product did not take off, the cost of failure was negligible, particularly in comparison to the years before 2000. If the product found a market, the founders had alternatives. They could raise venture capital on favorable terms, hire a bigger team, improve the product, and spend to acquire more users. Or they could do what the founders of Instagram and WhatsApp would eventually do: sell out for billions with only a handful of employees.

Facebook’s motto—“Move fast and break things”—embodies the lean startup philosophy. Forget strategy. Pull together a few friends, make a product you like, and try it in the market. Make mistakes, fix them, repeat. For venture investors, the lean startup model was a godsend. It allowed venture capitalists to identify losers and kill them before they burned through much cash. Winners were so valuable that a fund needed only one to provide a great return.

When hardware and networks act as limiters, software must be elegant. Engineers sacrifice frills to maximize performance. The no-frills design of Google’s search bar made a huge difference in the early days, providing a competitive advantage relative to Excite, AltaVista, and Yahoo. A decade earlier, Microsoft’s early versions of Windows failed in part because hardware in that era could not handle the processing demands imposed by the design. By 2004, every PC had processing power to spare. Wired networks could handle video. Facebook’s design outperformed MySpace in almost every dimension, providing a relative advantage, but the company did not face the fundamental challenges that had prevailed even a decade earlier. Engineers had enough processing power, storage, and network bandwidth to change the world, at least on PCs. Programming still rewarded genius and creativity, but an entrepreneur like Zuck did not need a team of experienced engineers with systems expertise to execute a business plan. For a founder in his early twenties, this was a lucky break. Zuck could build a team of people his own age and mold them. Unlike Google, Facebook was reluctant to hire people with experience. Inexperience went from being a barrier to being an advantage, as it kept labor costs low and made it possible for a young man in his twenties to be an effective CEO. The people in Zuck’s inner circle bought into his vision without reservation, and they conveyed that vision to the rank-and-file engineers. On its own terms, Facebook’s human resources strategy worked exceptionally well. The company exceeded its goals year after year, creating massive wealth for its shareholders, but especially for Zuck. The success of Facebook’s strategy had a profound impact on the human resources culture of Silicon Valley startups.

In the early days of Silicon Valley, software engineers generally came from the computer science and electrical engineering programs at MIT, Caltech, and Carnegie Mellon. By the late seventies, Berkeley and Stanford had joined the top tier. They were followed in the mid-nineties by the University of Illinois at Urbana-Champaign, the alma mater of Marc Andreessen, and other universities with strong computer science programs. After 2000, programmers were coming from just about every university in America, including Harvard.

When faced with a surplus for the first time, engineers had new and exciting options. The wave of startups launched after 2003 could have applied surplus processing, memory, storage, and bandwidth to improve users’ well-being and happiness, for example. A few people tried, which is what led to the creation of the Siri personal assistant, among other things. The most successful entrepreneurs took a different path. They recognized that the penetration of broadband might enable them to build global consumer technology brands very quickly, so they opted for maximum scale. To grow as fast as possible, they did everything they could to eliminate friction like purchase prices, criticism, and regulation. Products were free, criticism and privacy norms ignored. Faced with the choice between asking permission and begging forgiveness, entrepreneurs embraced the latter. For some startups, challenging authority was central to their culture. To maximize both engagement and revenues, Web 2.0 startups focused their technology on the weakest elements of human psychology. They set out to create habits, evolved habits into addictions, and laid the groundwork for giant fortunes.

The second important change was philosophical. American business philosophy was becoming more and more proudly libertarian, nowhere more so than in Silicon Valley. The United States had beaten the Depression and won World War II through collective action. As a country, we subordinated the individual to the collective good, and it worked really well. When the Second World War ended, the US economy prospered by rebuilding the rest of the world. Among the many peacetime benefits was the emergence of a prosperous middle class. Tax rates were high, but few people complained. Collective action enabled the country to build the best public education system in the world, as well as the interstate highway system, and to send men to the moon. The average American enjoyed an exceptionally high standard of living.

Then came the 1973 oil crisis, when the Organization of Petroleum Exporting Countries initiated a boycott of countries that supported Israel in the Yom Kippur War. The oil embargo exposed a flaw in the US economy: it was built on cheap oil. The country had lived beyond its means for most of the sixties, borrowing aggressively to pay for the war in Vietnam and the Great Society social programs, which made it vulnerable. When rising oil prices triggered inflation and economic stagnation, the country transitioned into a new philosophical regime.

The winner was libertarianism, which prioritized the individual over the collective good. It might be framed as “you are responsible only for yourself.” As the opposite of collectivism, libertarianism is a philosophy that can trace its roots to the frontier years of the American West. In the modern context, it is closely tied to the belief that markets are always the best way to allocate resources. Under libertarianism, no one needs to feel guilty about ambition or greed. Disruption can be a strategy, not just a consequence. You can imagine how attractive a philosophy that absolves practitioners of responsibility for the impact of their actions on others would be to entrepreneurs and investors in Silicon Valley. They embraced it. You could be a hacker, a rebel against authority, and people would reward you for it. Unstated was the leverage the philosophy conferred on those who started with advantages. The well-born and lucky could attribute their success to hard work and talent, while blaming the less advantaged for not working hard enough or being untalented. Many libertarian entrepreneurs brag about the “meritocracy” inside their companies. Meritocracy sounds like a great thing, but in practice there are serious issues with Silicon Valley’s version of it. If contributions to corporate success define merit when a company is small and has a homogeneous employee base, then meritocracy will encourage the hiring of people with similar backgrounds and experience. If the company is not careful, this will lead to a homogeneous workforce as the company grows. For internet platforms, this means an employee base consisting overwhelmingly of white and Asian males in their twenties and thirties. This can have an impact on product design. For example, Google’s facial-recognition software had problems recognizing people of color, possibly reflecting a lack of diversity in the development team. Homogeneity narrows the range of acceptable ideas and, in the case of Facebook, may have contributed to a work environment that emphasizes conformity. The extraordinary lack of diversity in Silicon Valley may reflect the pervasive embrace of libertarian philosophy. Zuck’s early investor and mentor Peter Thiel is an outspoken advocate for libertarian values.

The third big change was economic, and it was a natural extension of libertarian philosophy. Neoliberalism stipulated that markets should replace government as the rule setter for economic activity. President Ronald Reagan framed neoliberalism with his assertion that “government is not the solution to our problem; government is the problem.” Beginning in 1981, the Reagan administration began removing regulations on business. Reagan restored confidence, which unleashed a big increase in investment and economic activity. By 1982, Wall Street bought into the idea, and stocks began to rise. Reagan called it Morning in America. The problems—stagnant wages, income inequality, and a decline in startup activity outside of tech—did not emerge until the late nineties.

Deregulation generally favored incumbents at the expense of startups. New company formation, which had peaked in 1977, has been in decline ever since. The exception was Silicon Valley, where large companies struggled to keep up with rapidly evolving technologies, creating opportunities for startups. The startup economy in the early eighties was tiny but vibrant. It grew with the PC industry, exploded in the nineties, and peaked in 2000 at $120 billion, before declining by 87 percent over two years. The lean startup model collapsed the cost of startups, such that the number of new companies rebounded very quickly. According to the National Venture Capital Association, venture funding recovered to seventy-nine billion dollars in 2015 on 10,463 deals, more than twice the number funded in 2008. The market power of Facebook, Google, Amazon, and Apple has altered the behavior of investors and entrepreneurs, forcing startups to sell out early to one of the giants or crowd into smaller and less attractive opportunities.

Under Reagan, the country also revised its view of corporate power. The Founding Fathers associated monopoly with monarchy and took steps to ensure that economic power would be widely distributed. There were ebbs and flows as the country adjusted to the industrial revolution, mechanization, technology, world wars, and globalization, but until 1981, the prevailing view was that there should be limits to the concentration of economic power and wealth. The Reagan Revolution embraced the notion that the concentration of economic power was not a problem so long as it did not lead to higher prices for consumers. Again, Silicon Valley profited from laissez-faire economics.

Technology markets are not monopolies by nature. That said, every generation has had dominant players: IBM in mainframes, Digital Equipment in minicomputers, Microsoft and Intel in PCs, Cisco in data networking, Oracle in enterprise software, and Google on the internet. The argument against monopolies in technology is that major innovations almost always come from new players. If you stifle the rise of new companies, innovation may suffer.

Before the internet, the dominant tech companies sold foundational technologies for the architecture of their period. With the exception of Digital Equipment, all of the tech market leaders of the past still exist today, though none could prevent their markets from maturing, peaking, and losing ground to subsequent generations. In two cases, IBM and Microsoft, the business practices that led to success eventually caught the eye of antitrust regulators, resulting in regulatory actions that restored competitive balance. Without the IBM antitrust case, there likely would have been no Microsoft. Without the Microsoft case, it is hard to imagine Google succeeding as it did. Beginning with Google, the most successful technology companies sat on top of stacks created by others, which allowed them to move faster than any market leaders before them. Google, Facebook, and others also broke the mold by adopting advertising business models, which meant their products were free to use, eliminating another form of friction and protecting them from antitrust regulation. They rode the wave of wired broadband adoption and then 4G mobile to achieve global scale in what seemed like the blink of an eye. Their products enjoyed network effects, which occur when the value of a product increases as you add users to the network. Network effects were supposed to benefit users. In the cases of Facebook and Google, that was true for a time, but eventually the value increase shifted decisively to the benefit of owners of the network, creating insurmountable barriers to entry. Facebook and Google, as well as Amazon, quickly amassed economic power on a scale not seen since the days of Standard Oil one hundred years earlier. In an essay on Medium, the venture capitalist James Currier pointed out that the key to success in the internet platform business is network effects, and that Facebook enjoyed more of them than any other company in history. He said, “To date, we’ve actually identified that Facebook has built no less than six of the thirteen known network effects to create defensibility and value, like a castle with six concentric layers of walls. Facebook’s walls grow higher all the time, and on top of them Facebook has fortified itself with all three of the other known defensibilities in the internet age: brand, scale, and embedding.”

By 2004, the United States was more than a generation into an era dominated by a hands-off, laissez-faire approach to regulation, a time period long enough that hardly anyone in Silicon Valley knew there had once been a different way of doing things. This is one reason why few people in tech today are calling for regulation of Facebook, Google, and Amazon, antitrust or otherwise.

One other factor made the environment of 2004 different from earlier times in Silicon Valley: angel investors. Venture capitalists had served as the primary gatekeepers of the startup economy since the late seventies, but they spent a few years retrenching after the dot-com bubble burst. Into the void stepped angel investors—individuals, mostly former entrepreneurs and executives—who guided startups during their earliest stages. Angel investors were perfectly matched to the lean startup model, gaining leverage from relatively small investments. One angel, Ron Conway, built a huge brand, but the team that had started PayPal proved to have much greater impact. Peter Thiel, Elon Musk, Reid Hoffman, Max Levchin, Jeremy Stoppelman, and their colleagues were collectively known as the PayPal Mafia, and their impact transformed Silicon Valley. Not only did they launch Tesla, SpaceX, LinkedIn, and Yelp, they provided early funding to Facebook and many other successful players. More important than the money, though, were the vision, value system, and connections of the PayPal Mafia, which came to dominate the social media generation. Validation by the PayPal Mafia was decisive for many startups during the early days of social media. Their management techniques enabled startups to grow at rates never before experienced in Silicon Valley. The value system of the PayPal Mafia helped their investments create massive wealth, but may have contributed to the blindness of internet platforms to harms that resulted from their success. In short, we can trace both the good and the bad of social media to the influence of the PayPal Mafia.

THANKS TO LUCKY TIMING, Facebook benefitted not only from lower barriers for startups and changes in philosophy and economics but also from a new social environment. Silicon Valley had prospered in the suburbs south of San Francisco, mostly between Palo Alto and San Jose. Engineering nerds did not have a problem with life in the sleepy suburbs because many had families with children, and the ones who did not have kids did not expect to have the option of living in the city. Beginning with the dot-com bubble of the late nineties, however, the startup culture began to attract kids fresh out of school, who were not so happy with suburban life as their predecessors. In a world where experience had declining economic value, the new generation favored San Francisco as a place to live. The transition was bumpy, as most of the San Francisco–based dot-coms went up in flames in 2000, but after the start of the new millennium, the tech population in San Francisco grew steadily. While Facebook originally based itself in Palo Alto—the heart of Silicon Valley, not far from Google, Hewlett-Packard, and Apple—a meaningful percentage of its employees chose to live in the big city. Had Facebook come along during the era of scarcity, when experienced engineers ruled the Valley, it would have had a profoundly different culture. Faced with the engineering constraints of earlier eras, however, the Facebook platform would not have worked well enough to succeed. Facebook came along at the perfect time.

San Francisco is hip, with diverse neighborhoods, decent public transportation, access to recreation, and lots of nightlife. It attracted a different kind of person than Sunnyvale or Mountain View, including two related types previously unseen in Silicon Valley: hipsters and bros. Hipsters had burst onto the public consciousness as if from a base in Brooklyn, New York, heavy on guys with beards, plaid shirts, and earrings. They seemed to be descendants of San Francisco’s bohemian past, a modern take on the Beats. The bros were different, though perhaps more in terms of style than substance. Ambitious, aggressive, and exceptionally self-confident, they embodied libertarian values. Symptoms included a lack of empathy or concern for consequences to others. The hipster and bro cultures were decidedly male. There were women in tech, too, more than in past generations of Silicon Valley, but the culture continued to be dominated by men who failed to appreciate the obvious benefits of treating women as peers. Too many in Silicon Valley missed the lesson that treating others as equals is what good people do. For them, I make a simple economic case: women are 51 percent of the US population; they account for 85 percent of consumer purchases; they control 60 percent of all personal wealth. They know what they want better than men do, yet in Silicon Valley, which invests billions in consumer-facing startups, men hold most of the leadership positions. Women who succeed often do so by beating the boys at their own game, something that Silicon Valley women do with ever greater frequency. Bloomberg journalist Emily Chang described this culture brilliantly in her book, Brotopia.

With the biggest influx of young people since the Summer of Love, the tech migration after 2000 had a visible impact on the city, precipitating a backlash that began quietly but grew steadily. The new kids boosted the economy with tea shops and co-working spaces that sprang up like mushrooms after a summer rain in the forest. But they seemed not to appreciate that their lifestyle might disturb the quiet equilibrium that had preceded their arrival. With a range of new services catering to their needs, delivered by startups of their peers, the hipsters and bros eventually provoked a reaction. Tangible manifestations of their presence, like the luxury buses that took them to jobs at Google, Facebook, Apple, and other companies down in Silicon Valley, drew protests from peeved locals. An explosion of Uber and Lyft vehicles jammed the city’s streets, dramatically increasing commute times. Insensitive blog posts, inappropriate business behavior, and higher housing costs ensured that locals would neither forgive nor forget.

ZUCK ENJOYED THE KIND OF privileged childhood one would expect for a white male whose parents were medical professionals living in a beautiful suburb. As a student at Harvard, he launched Facebook. Thanks to great focus and enthusiasm, Zuck would almost certainly have found success in Silicon Valley in any era, but he was particularly suited to his times. Plus, as previously noted, he had an advantage not available to earlier generations of entrepreneurs: he could build a team of people his age—many of whom had never before had a full-time job—and mold them. This allowed Facebook to accomplish things that had never been done before.

For Zuck and the senior management of Facebook, the goal of connecting the world was self-evidently admirable. The philosophy of “move fast and break things” allowed for lots of mistakes, and Facebook embraced the process, made adjustments, and continued forward. The company maintained a laser focus on Zuck’s priorities, never considering the possibility that there might be flaws in this approach, even when the evidence of such flaws became overwhelming. From all appearances, Zuck and his executive team did not anticipate that people would use Facebook differently than Zuck had envisioned, that putting more than two billion people on the same network would lead to tribalism, that Facebook Groups would amplify that tribalism, that bad actors would take advantage to harm innocent people. They failed to imagine unintended consequences from an advertising business based on behavior modification. They ignored critics. They missed the opportunity to take responsibility when the reputational cost would have been low. When called to task, they protected their business model and prerogatives, making only small changes to their business practices. This trajectory is worth understanding in greater depth.
