
5

Terms and Conditions

FEBRUARY–JULY 2015

It didn’t take long for full-time employment at SCL to afford me entry into the higher echelons of the company. In an email copying only a handful of people—Pere, Kieran, Sabhita, Alex Tayler, and me—employees whom Nix considered important and “good fun,” he said—he invited us to lunch one weekend at his home in Central London.

Located in Holland Park, it was a city house—he had a country one as well—a four-story stone mansion, the interior of which resembled a private members-only club or rooms you would find in Buckingham Palace, except that the art that hung from ceiling to floor wasn’t Old Masters but, instead, provocatively modern.

We started at midday with vintage champagne in the drawing room and continued at the dining table for hours, all through which the champagne flowed. Alexander and the others told war stories about their time together in Africa. In 2012, for example, Alexander had moved a team from SCL and his entire family to Kenya, so that he could run the country’s 2013 election himself. He hadn’t had many staff members at the time, and it had been difficult. Research was limited to door-to-door surveys, and messaging was done through roadshows on those convertible stage-trucks I’ve mentioned before.

“That’s why what we’re doing in America is so incredibly exciting,” Alexander said. “Knocking on doors isn’t the only way to get data now. Data is everywhere. And every decision is now data-driven.”

We stayed at his house until dinnertime, and then all of us, woozy and lightheaded, made our way to a bar somewhere for cocktails, and then for a meal somewhere else, and then to another bar, where we capped off the evening.

It was the kind of memorable event that is difficult to recollect entirely the next day, although in the office, I began to see that Alexander’s excitement about America was more than just idle chat over drinks.

Indeed, while I continued to pursue global projects, my colleagues at SCL were increasingly focused on the United States, and their work was no longer confined to the Sweat Box. They were absorbed in daily conversations I could overhear, about their client Ted Cruz. He had signed on with us back in late 2014, for a small contract, but was now upgrading to nearly $5 million in services. Kieran and the other creatives were now producing tons of content for the Texas senator. They huddled at a desktop computer putting together ads and videos, which they would sometimes show off and at which they would sometimes stare, grimacing.

Meanwhile, Alexander was focused entirely on the United States, too. The Cruz campaign had agreed to sign a contract without a noncompete clause, so Alexander was free to pursue other GOP candidates as well. Soon, he had signed Dr. Ben Carson. Next up, he began systematically pitching the rest of the seventeen Republican contenders. For a while, Jeb Bush considered hiring the firm; Alexander said Jeb even flew to London to meet with him. In the end, though, he wanted nothing to do with a company that would even consider working at the same time for his competition. The Bushes were the kind of family who demanded single-minded loyalty from those with whom they worked.

The Cambridge Analytica data team busied themselves preparing for the 2016 U.S. presidential election by interpreting the results of the 2014 midterms. In their glass box, they wrote up case studies from John Bolton’s successful super PAC operation, from Thom Tillis’s senatorial campaign, and from all the North Carolina races. To show how Cambridge had succeeded, they put together a packet explaining how they’d broken the target audience into “Core Republicans,” “Reliable Republicans,” “Turnout Targets,” “Priority Persuasions,” and “Wildcards,” and how they’d messaged them differently on issues ranging from national security to the economy and immigration.

Also, in the data analytics lab, Dr. Jack Gillett produced midterm data visualizations—multicolor charts, maps, and graphics to be added to new slide shows and pitches. And Dr. Alexander Tayler was always on the phone in search of new data from brokers all over the United States.

I was still pursuing SCL projects abroad, but as Cambridge Analytica ramped up for 2016, I was becoming privy, if accidentally, to confidential information, such as the case studies, the videos, the ads, and the chatter around me. I was never copied on CA emails at that time, but there were stories in the air and images on nearby computer screens.

This presented an ethical dilemma. The previous summer, when Allida Black, founder of the Ready for Hillary super PAC, was in town, I’d been fully briefed on Democratic Party plans for the election. Now I was receiving a regular paycheck from a company working for the GOP. It didn’t sit well with me, and I knew that it wouldn’t sit well with others.

No one asked me to, but I began to cut my ties with the Democrats, although I was too embarrassed to tell anyone on the Democratic side why. I didn’t want to put the SCL Group on my LinkedIn or Facebook page. I didn’t want any Democratic operatives I knew to have to worry that I’d ever use information I had from them against them. Eventually, I stopped replying to incoming emails from the Ready for Hillary super PAC and from Democrats Abroad, and I made sure that in writing personally to friends who were Democrats, I never included the SCL name on any of my communications. To the Clinton teams, it must have seemed as though I had simply dropped off the map. It wasn’t easy for me to do. I was tempted to read everything that came in, news about exciting meetings and plans. So, after a while, I just let these messages sit in my in-box, unopened, relics of my past.

I also didn’t want my Cambridge Analytica colleagues or the company’s GOP clients to worry about the same. After all, I was a Democrat working in a company that exclusively served Republicans in the United States. I removed the Obama campaign and the DNC from my LinkedIn profile (my public résumé) and erased all other public references I had made to the Democratic Party or my involvement in it. This was painful, to say the least. I also begrudgingly stopped using my Twitter account, @EqualWrights, a catalog of years of my left-leaning activist proclamations. As much as it hurt to close those doors and hide some of the most important parts of myself, it was necessary, I knew, in order for me to grow into the professional political technology consultant I was to become. And one day, perhaps, I could reopen both those accounts and that part of myself.

My change of identity wasn’t just online. In London, I opened up a big box my mother had sent me via FedEx; because she worked for the airlines, she had virtually free international shipping privileges. She had sent me business suit after business suit from her old closet: beautiful Chanel pieces, items by St. John, and specialty outfits from Bergdorf Goodman—what she’d worn years before, when she worked for Enron. I pictured what she looked like back then, when she left for work in the mornings back in Houston. She was always impeccably put together, dashing out the door in the highest of heels and those expensive suits, her makeup perfect. Now the suits were my hand-me-downs. I hung them in the closet of the new flat I’d rented for myself in Mayfair.

The flat was tiny, just one room with a kitchen counter and an electric burner and a bathroom far down a hall, but I’d chosen the place strategically. It was close to work and, more important, in the right neighborhood and on Upper Berkeley Street. If a client asked, in that presumptuous way Brits had, “Where are you staying these days?”—meaning where did I live, meaning of what social class and means I was—I could say without hesitation that I lived in Mayfair. If they filled in the blanks in their imagination with an expansive flat with a view, all the better. In point of fact, my flat was so small that I was already nearly halfway through it when I walked in the door; and when I stood in the middle of it, I could reach my arms out and touch either wall.

I kept those details secret, though, and every morning I strolled out of my Mayfair address wearing a fancy old suit of my mother’s, knowing that no one would notice much of a difference between me and any trust-fund baby who owned half of the neighborhood.

“I want you to learn how to pitch,” Alexander said to me one day. I’d been talking to clients about the company for months, but in the end, Alexander or Alex Tayler always had to come in to close the deal, so he meant he wanted me to learn to pitch properly, as expertly and as confidently as he did.

Although he was the CEO, Alexander was still the only real salesperson in the company, and his time was ever more in demand. He needed me in the field, he said. I had never stood up in front of a client to make a PowerPoint presentation myself. It was an art, Alexander said, and he would mentor me.

What was most important, he said, was that I learn to sell myself, and that I wow him. I could choose whichever pitch I’d seen him give: the SCL pitch or the Cambridge Analytica one.

At the time, given that I was having little luck closing SCL contracts after the Nigerian deal, it occurred to me I might need to rethink things. I was also becoming increasingly uncomfortable with aspects of SCL’s work in Africa. Many of the African men I met with didn’t respect me or listen to me because I was young and a woman. Also, I was having ethical qualms, as potential deals sometimes lacked transparency or even verged on illegality, I thought. For example, no one ever wanted a paper trail, which meant that most often there were no written contracts. In the rare cases that there were, the contracts weren’t to include real names or the names of recognizable companies. There were always obfuscations, masks, and nebulous third parties. Those arrangements bothered me for ethical reasons as well as selfish ones: every time a deal was less than clean and straightforward, it narrowed my chances of making an argument for what I was owed in commission.

I was learning every day at SCL about other so-called common practices in international politics. Nothing was straightforward. While in discussions regarding freelance election work with contractors for an Israeli defense and intelligence firm, I heard the contractors boast about their firm doing everything from giving advance warning of attacks on their clients’ campaigns to digging up material that would be useful for counter-operations and opposition messaging. At first it seemed pretty benign to me, even clever and useful. The contractors’ firm was pitching clients similar to the SCL Group’s, even with some overlap, and the firm had worked in nearly as many elections as Alexander had. While SCL did not have internal counter-ops capacity, its work still had the feel of guerrilla warfare. The more I learned about each firm’s strategy, the more both appeared willing to do whatever was needed to win, and that gray area started to bother me. I had suggested SCL work with this firm, as I assumed that two companies working together could produce greater impact for clients, but I was quickly taken out of copy, per usual in Alexander’s practice, and not kept abreast of what was actually happening to achieve said results.

While trying to show value and close my first deal, I had introduced this Israeli firm to the Nigerians. I’m not sure what I expected to come of that, besides my looking more experienced than I was, but the results were not what I had imagined they would be. The Nigerian clients ended up hiring the Israeli operatives to work separately from SCL, and as I was later told, they sought to infiltrate the Muhammadu Buhari campaign and obtain insider information. They were successful in this and then passed information to SCL for use. The messaging that resulted discredited Buhari and incited fear, something I wasn’t privy to at the time, while Sam Patten was running the show on the ground. Ultimately, the contractors and SCL itself were not effective enough to turn the tide of the election in Goodluck Jonathan’s favor. To be fair, the campaign hadn’t even lasted a month, but, regardless, he lost spectacularly to Buhari—by 2.5 million votes. The election would become notorious because it was the first time a Nigerian incumbent president had been unseated and also because it was the most expensive campaign in the history of the African continent.

But what was of most concern to me at the time, when it came to ethics, was where the Nigerian money ended up. As I was to learn from Ceris, of the $1.8 million the Nigerian oil billionaire had paid SCL, the team had, in the short time it worked for the man, spent only $800,000, which meant the profit margin for SCL had been outrageous.

The rest of the money I had brought into the company, a cool $1 million, ended up being sheer profit for Alexander Nix. Given that normal markup for projects was 15–20 percent, this was a spectacularly high figure, in my opinion well outside of normal industry standards. It made me wary about pricing for clients in parts of the world where candidates were desperate to win at any cost. While taking high profits is of course legal, it was deeply unethical that Alexander had told the clients we had run out of money and would need more to keep the team on the ground until the delayed election date. I was sure we had more resources, but still, I was afraid to reveal to Alexander that I knew the markup, and the fact that I didn’t confront him on this haunted me.

Frankly, even some of SCL’s European contracts seemed less than aboveboard when I finally paid attention to the details. On a contract SCL had for the mayoral elections in Vilnius, Lithuania, someone in our company forged Alexander’s signature in order to expedite the closing of the deal. I later found out that the deal itself may even have been granted to us in contravention of a national law requiring that election work be publicly tendered and that we had already received notification that we’d “won” the tender before the end of the window of time during which public firms ought to have been able to apply for the contract.

When Alexander discovered that his signature had been forged and that the contract wasn’t entirely kosher, he asked me to fire the person responsible, even though she was the wife of one of his friends from Eton. I did what he asked. Later, it would become clear that though he seemed to be punishing the employee for her behavior, what he was angriest about wasn’t the backroom dealing but the fact that she hadn’t collected SCL’s final payment from the political party in question. He made me chase the money and told me to forget about Sam in Nigeria: concentrate on our next paycheck.

All this had started to overwhelm me, and I was nervous that I was in over my head at SCL’s global helm. I began to look elsewhere in the firm for social projects for which I could use my expertise. I had so much to give and so much to learn about data, and I wasn’t going to let some rogue clients get the better of my strong will and put me off from finishing my PhD research.

On the positive side, I was learning that the most exciting innovations were happening in the United States, and that there were dozens of opportunities in America, most of which, thankfully, had nothing to do with the GOP. In Europe, Africa, and many nations around the globe, SCL was limited in its ability to use data because most countries’ data infrastructures were underdeveloped. At SCL, I’d been unable to work on contracts that both made use of our most innovative and exciting tools and that, I believed, involved our best practices.

Alexander had recently boasted of nearly closing a deal with the biggest charity in the United States, so I hopped onto that to help him close it. The work involved helping the nonprofit identify new donors, something that appealed to me greatly, as I had spent so many years in charity fund-raising that I couldn’t wait to learn a data-driven approach to helping new causes. On the political side, SCL was pitching ballot initiatives in favor of building water reservoirs and high-speed trains, public works projects that could really make a difference in people’s lives. The company was even moving into commercial advertising, selling everything from newspapers to cutting-edge health care products, an area I could dip into if my heart desired, Alexander told me.

I wanted to learn how analytics worked, and I wanted to do it where we could see, and measure, our achievements, and where people worked with transparency and honesty. I remembered my work with men like Barack Obama. He had been honorable and impeccably moral, and so had the people around him. The way they campaigned was ethical, involving no big-dollar donors, and Barack had insisted on absolutely no negative campaigning, too. He would neither attack his Democratic rivals in the primaries nor go low on Republicans. I was nostalgic for a time when I’d experienced elections that ran according to not only rules and laws, but ethics and moral principles.

It seemed to me that my future at the company, if I were to have one, would be in the United States.

I told Alexander I wanted to learn the Cambridge Analytica pitch. And in choosing to do so, I was choosing to join that company, with all the bells and whistles attached.

I couldn’t wow Alexander with my own pitch without first meeting with Dr. Alex Tayler to learn about the data analytics behind Cambridge Analytica’s success. Tayler’s pitch was much more technical and much more involved in the nitty-gritty of the analytics process, but he showed me how Cambridge Analytica’s so-called secret sauce wasn’t one particular secret thing but really many things that set CA apart from our peers. As Alexander Nix often said, the secret sauce was more like a recipe of several ingredients. The ingredients were really baked into a kind of “cake,” he said.

Perhaps the first and most important thing that made CA different from any other communications firm was the size of our database. The database, Tayler explained, was prodigious and unprecedented in depth and breadth, and was growing ever bigger by the day. We had come about it by buying and licensing all the personal information held on every American citizen. We bought that data from every vendor we could afford to pay—from Experian to Acxiom to Infogroup. We bought data about Americans’ finances, where they bought things, how much they paid for them, where they went on vacation, what they read.

We matched this data to their political information (their voting habits, which were accessible publicly) and then matched all that again to their Facebook data (what topics they had “liked”). From Facebook alone, we had some 570 individual data points on users, and so, combining all this gave us some 5,000 data points on every single American over the age of eighteen—some 240 million people.
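To make that matching concrete, here is a minimal sketch of the kind of record linkage Tayler described, written in Python with the pandas library. The shared person identifier and every column name are illustrative assumptions, not CA's actual schema.

```python
# A minimal sketch of joining commercial, voter-file, and Facebook-derived
# data into one wide per-person profile; all names are hypothetical.
import pandas as pd

# Commercial data bought from brokers (finances, purchases, reading habits)
commercial = pd.DataFrame({
    "person_id": [1, 2],
    "est_income": [54000, 120000],
    "magazine_subs": [["Field & Stream"], ["The Economist"]],
})

# Publicly accessible voter file (registration, turnout history)
voter_file = pd.DataFrame({
    "person_id": [1, 2],
    "party_reg": ["REP", "DEM"],
    "voted_2012": [True, False],
})

# Facebook-derived data (topics "liked")
facebook = pd.DataFrame({
    "person_id": [1, 2],
    "likes": [["hunting", "NASCAR"], ["yoga", "NPR"]],
})

# Join the three sources on a shared identifier to build one profile
# per individual: the "thousands of data points" per person.
profile = commercial.merge(voter_file, on="person_id").merge(facebook, on="person_id")
print(profile)
```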

The special edge of the database, though, Tayler said, was our access to Facebook for messaging. We used the Facebook platform to reach the same people on whom we had compiled so much data.

What Alex told me helped bring into focus two events I’d experienced while at the SCL Group, the first when I’d just arrived. One day in December 2014, one of our senior data scientists, Suraj Gosai, had called me over to his computer, where he was sitting with one of our research PhDs and one of our in-house psychologists.

The three of them had developed, they explained, a personality quiz called “the Sex Compass”—a funny name, I thought. It was ostensibly aimed at determining a person’s “sexual personality” by asking probing questions about sexual preferences such as favorite position in bed. The survey wasn’t just a joyride for the user. It was, I came to understand, a means to harvest data points from the answers people gave about themselves, the same answers that determined their “sexual personality,” and a new masked way for SCL to gather the users’ data and that of all their “friends,” topping it all up with useful data points on personality and behavior.

The same was true for another survey that had crossed my desk. It was called “the Musical Walrus.” A tiny cartoon walrus asked a user a series of seemingly benign questions in order to determine that person’s “true musical identity.” It, too, was gathering data points and personality information.

And then there were other online activities that, as Tayler explained, were a means to get at both the 570 data points Facebook already possessed about a user and the same 570 data points about each of that user’s Facebook friends. When people signed on to play games such as Candy Crush on Facebook, and clicked “yes” to the terms of service for that third-party app, they were opting in to give their data and the data of all their friends, for free, to the app developers and then, inadvertently, to everyone with whom that app developer had decided to share the information. Facebook allowed this access through what has become known as the “Friends API,” a now-notorious data portal that contravened data laws everywhere, as under no legislative framework in the United States or elsewhere is it legal for anyone to consent on behalf of other able-minded adults. As one can imagine, the use of the Friends API became prolific, amounting to a great payday for Facebook. And it allowed more than forty thousand developers, including Cambridge Analytica, to take advantage of this loophole and harvest data on unsuspecting Facebook users.
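For readers who want to see the mechanics, this is roughly what harvesting through the pre-2015 Graph API v1.0 "Friends API" looked like from a third-party app's side. It is a hedged sketch: the access token and the exact permissions granted are assumptions, and the endpoint has long since been closed.

```python
# A sketch of Friends API harvesting: one consenting quiz-taker exposes
# their friends list and, with friends_* permissions, their friends' likes.
import requests

ACCESS_TOKEN = "USER_TOKEN_GRANTED_TO_THE_QUIZ_APP"  # obtained when one user opted in
BASE = "https://graph.facebook.com/v1.0"

# The quiz-taker's entire friends list...
friends = requests.get(f"{BASE}/me/friends",
                       params={"access_token": ACCESS_TOKEN}).json()

# ...and each friend's likes, even though those friends never saw the
# quiz or its terms of service.
for friend in friends.get("data", []):
    likes = requests.get(f"{BASE}/{friend['id']}/likes",
                         params={"access_token": ACCESS_TOKEN}).json()
    print(friend["name"], [like["name"] for like in likes.get("data", [])])
```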

Cambridge was always collecting and refreshing its data, staying completely up to date on what people cared about at any given time. It supplemented data sets by purchasing more and more every day on the American public, data that Americans gave away every time they clicked on “yes” and accepted electronic “cookies” or clicked “agree” to “terms of service” on any site, not just Facebook or third-party apps.

Cambridge Analytica bought this fresh data from companies such as Experian, which has followed people throughout their digital lives, through every move and every purchase, collecting as much as possible in order, ostensibly, to provide credit scores but also to make a profit by selling that information. Other data brokers, such as Acxiom, Magellan, and Labels and Lists (aka L2), did the same. Users never meaningfully opt in: the process by which they agree to the data collection usually runs through extensive terms and conditions meant to put them off reading them, reduced to an attractively easy, small tick box that makes collecting data an even simpler process for these companies. Users are forced to click it anyhow, or they cannot go on using whichever game, platform, or service they are trying to activate.

The most shocking thing about data that I learned from Alexander Tayler was where it all came from. I hate to break it to you, but by buying this book (perhaps even by reading it, if you have downloaded the e-book or Audible version), you have produced significant data sets about yourself that have already been bought and sold around the world in order for advertisers to control your digital life.

If you bought this book online, your search data, transaction history, and the time spent browsing each Web page during your purchase were recorded by the platforms you used and by the tracking cookies you allowed to be dropped on your computer to collect your online data.

Speaking of cookies, have you ever wondered what Web pages are asking when they request that you “accept cookies”? It’s supposed to be a socially acceptable version of spyware, and you consent to it on a daily basis. It comes to you wrapped in a friendly-sounding word, but it is an elaborate ruse used on unsuspecting citizens and consumers.

Cookies literally track everything you do on your computer or phone. Go ahead and check any browser add-on such as Mozilla’s Lightbeam (formerly Collusion), Cliqz International’s Ghostery, or the Electronic Frontier Foundation’s Privacy Badger to see how many companies are tracking your online activity. You could find more than fifty. When I first used Lightbeam to see just how many companies were tracking me, I found that by having visited merely two news Web pages within one minute, I had allowed my data to be connected to 174 third-party sites. These sites sell data to even larger “Big Data aggregators” such as Rocket Fuel and Lotame, where your data is the gas that keeps their ad machines running. Everyone who touches your data along the way makes a profit.

If you are reading this book on your Amazon Kindle, on your iPad, in Google Books, or on your Barnes and Noble Nook, you are producing precise data sets that record how long you took to read each page, at which points you stopped reading and took a break, and which passages you bookmarked or highlighted. Combined with the actual search terms you used to find this book in the first place, this information gives the companies that own the device the data they need to sell you new products. These retailers want you to engage, and even the slightest hint of what you might be interested in is enough to give them an edge. And all this goes on without your being properly informed or consenting to the process in any traditional sense of the term consent.

Now, if you bought this book in a brick-and-mortar store, and assuming you have a smartphone with GPS tracking switched on—when you use Google Maps, it creates valuable location data that is sold to companies such as NinthDecimal—your phone recorded your entire journey to the bookshop and, upon your arrival, tracked how long you spent there, how long you looked at each item, and even perhaps what the items were, before you chose this book over others. Upon buying the book, if you used a credit or debit card, your purchase was recorded in your transaction history. From there, your bank or credit card company sold that information to Big Data aggregators and vendors, who went on to sell it as soon as they could.

Now, if you’re back home reading this, your robot vacuum cleaner, if you have one, is recording the location of the chair or couch on which you’re sitting. If you have an Alexa, Siri, Cortana, or other voice-activated “assistant” nearby, it records when you laugh out loud or cry while reading the revelations on these pages. You may even have a smart fridge or coffeemaker that records how much coffee and milk you go through while reading.

All these data sets are known as “behavioral data,” and with this data, it is possible for data aggregators to build a picture of you that is incredibly precise and endlessly useful. Companies can then tailor their products to align with your daily activities. Politicians use your behavioral data to show you information so that their message will ring true to you, and at the right time: Think of those ads about education that just happen to play on the radio at the precise moment you’re dropping your kids off at school. You’re not paranoid. It’s all orchestrated.

And what’s also important to understand is that when companies buy your data, the cost to them pales in comparison to how much the data is worth when they sell advertisers access to you. Your data allows anyone, anywhere, to purchase digital advertising that targets you for whatever purpose—commercial, political, honest, nefarious, or benign—on the right platform, with the right message, at the right time.

But how could you resist? You do everything electronically because it’s convenient. Meanwhile, the cost of your convenience is vast: you are giving one of your most precious assets away for free while others profit from it. Others make trillions of dollars out of what you’re not even aware you are giving away each moment. Your data is incredibly valuable, and CA knew that better than you or most of our clients.

When Alexander Tayler taught me what Cambridge Analytica could do, I learned that in addition to purchasing data from Big Data vendors, we had access to our clients’ proprietary data, aka data they produced themselves that was not purchasable on the open market. Depending on our arrangements with them, that data could remain theirs or it could become part of our intellectual property, meaning that we could retain their proprietary data to use, sell, or model as our own.

It was a uniquely American opportunity. Data laws in countries such as the United Kingdom, Germany, and France don’t allow such freedoms. That’s why America was such fertile ground for Cambridge Analytica, and why Alexander had called the U.S. data market a veritable “Wild West.”

When Cambridge Analytica refreshed data, meaning updating the locally held database with new data points, we struck a range of agreements with clients and vendors. Depending on those agreements, the data sets could cost anywhere from millions of dollars to nothing at all, as Cambridge sometimes struck data-sharing agreements by which we shared our proprietary data with other companies in exchange for theirs. No money had to change hands. An example of this comes from the company Infogroup, which has a data-sharing “co-op” that nonprofits use to identify donors. When one nonprofit shares with Infogroup its list of donors, and how much each gave, it receives in return the same data on other donors, their habits, fiscal donation brackets, and core philanthropic preferences.

From the massive database that Cambridge had compiled from all these different sources, it then went on to do something else that differentiated it from its competitors. It began to mix the batter of the figurative “cake” Alexander had talked about. While the data sets we possessed were the critical foundation, it was what we did with them, our use of what we called “psychographics,” that made Cambridge’s work precise and effective.

The term psychographics was created to describe the process by which we took in-house personality scoring and applied it to our massive database. Using analytic tools to understand individuals’ complex personalities, the psychologists then determined what motivated those individuals to act. Then the creative team tailored specific messages to those personality types in a process called “behavioral microtargeting.”

With behavioral microtargeting, a term Cambridge trademarked, they could zoom in on individuals who shared common personality traits and concerns and message them again and again, fine-tuning and tweaking those messages until we got precisely the results we wanted. In the case of elections, we wanted people to donate money; learn about our candidate and the issues involved in the race; actually get out to the polling booths; and vote for our candidate. Likewise, and most disturbingly, some campaigns also aimed to “deter” some people from going to the polls at all.

As Tayler detailed the process, Cambridge took the Facebook user data he had gathered from entertaining personality surveys such as the Sex Compass and the Musical Walrus, which he had created through third-party app developers, and matched it with data from outside vendors such as Experian. We then gave millions of individuals “OCEAN” scores, determined from the thousands of data points about them.

OCEAN scoring grew out of academic behavioral and social psychology. Cambridge used OCEAN scoring to determine the construction of people’s personalities. By testing personalities and matching data points, CA found it was possible to determine the degree to which an individual was “open” (O), “conscientious” (C), “extroverted” (E), “agreeable” (A), or “neurotic” (N). Once CA had models of these various personality types, they could go ahead and match an individual in question to individuals whose data was already in the proprietary database, and thus group people accordingly. That was how, among the millions upon millions of people whose data points it held, CA could determine who was O, C, E, A, or N, or even a combination of several of those traits.
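A toy sketch of how such trait scoring can work in principle: fit a model on people whose OCEAN scores are known (say, from a personality quiz) and extrapolate to everyone else from their Facebook "likes" alone. The features, the scores, and the use of ridge regression here are illustrative assumptions, not CA's actual method.

```python
# Toy trait scoring: learn "openness" from quiz-takers' likes, then
# predict it for people who never took any quiz.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import Ridge

# Training set: quiz-takers' likes and their measured openness (0-100)
likes = ["hunting nascar country_music",
         "yoga npr modern_art travel",
         "travel modern_art indie_film"]
openness = np.array([22.0, 71.0, 88.0])

vec = CountVectorizer()
X = vec.fit_transform(likes)          # bag-of-likes feature matrix
model = Ridge().fit(X, openness)

# Score someone who never took the quiz from their likes alone
new_person = vec.transform(["npr travel country_music"])
print(f"predicted openness: {model.predict(new_person)[0]:.0f}")
```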

It was OCEAN that allowed for Cambridge’s five-step approach.

First, CA could segment all the people whose info they had into even more sophisticated and nuanced groups than any other communications firm. (Yes, other companies were also able to segment groups of people beyond their basic demographics such as gender and race, but those companies, when determining advanced characteristics such as party affinity or issue preference, often used crude polling to determine where people generally stood on issues.) OCEAN scoring was nuanced and complex, allowing Cambridge to understand people on a continuum in each category. Some people were predominantly “open” and “agreeable.” Others were “neurotic” and “extroverted.” Still others were “conscientious” and “open.” There were thirty-two main groupings in all: mark each of the five traits as predominantly high or low and you get two to the fifth power, or thirty-two, possible combinations. A person’s “openness” score indicated whether he or she enjoyed new experiences or was more inclined to rely on and appreciate tradition. The “conscientiousness” score indicated whether a person preferred planning over spontaneity. The “extroversion” score revealed the degree to which one liked to engage with others and be part of a community. “Agreeableness” indicated whether the person put others’ needs before their own. And “neuroticism” indicated how likely the person was to be driven by fear when making decisions.
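Where the figure of thirty-two comes from, in a few lines of Python; the trait scores and the fifty-point cutoff are illustrative.

```python
# Marking each of the five OCEAN traits as high or low yields
# 2**5 = 32 possible segment labels.
TRAITS = "OCEAN"

def segment(scores, cutoff=50):
    """Map five 0-100 trait scores to a label like 'O+C-E+A+N-'."""
    return "".join(t + ("+" if s >= cutoff else "-")
                   for t, s in zip(TRAITS, scores))

print(segment([88, 35, 60, 72, 20]))   # -> 'O+C-E+A+N-'
print(2 ** len(TRAITS))                # -> 32 groupings in all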

Depending on the varied subcategories in which people were sorted, CA then added in the issues about which they had already shown an interest (say, from their Facebook “likes”) and segmented each group with even more refinement. For example, it was too simplistic to see two women who were thirty-four years old and white and who shopped at Macy’s as the same person. Rather, by doing the psychographic profiling and then adding to it everything ranging from the women’s lifestyle data to their voting records to their Facebook “likes” and credit scores, CA’s data scientists could begin to see each woman as profoundly different from the other. People who looked alike weren’t necessarily alike at all. They therefore shouldn’t be messaged together. While this seems obvious—it was a concept supposedly already permeating the advertising industry at the time Cambridge Analytica came along—most political consultants had no idea how to do this or that it was even possible. It would be for them a revelation and a means to victory.

Second, CA provided clients, political and commercial, with a benefit that set the company apart: the accuracy of its predictive algorithms. Dr. Alex Tayler, Dr. Jack Gillett, and CA’s other data scientists constantly ran new algorithms, producing much more than mere psychographic scores. They produced scores for every person in America, predicting on a scale of 0 to 100 percent how likely, for example, each was to vote; how likely each was to belong to a particular political party; or what toothpaste each was likely to prefer. CA knew whether you were more likely to want to donate to a cause when clicking a red button or a blue, and how likely you were to wish to hear about environmental policy versus gun rights. After breaking people up into groups using their predictive scores, CA’s digital strategists and data scientists spent much of their time testing and retesting these “models,” or user groupings called “audiences,” and refining them to a high degree of accuracy, with up to 95 percent confidence in those scores.
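In principle, a propensity score of that kind can be produced by any classifier trained on known outcomes and applied to the rest of the database. A toy sketch of turnout scoring follows; the features are invented placeholders, not CA's actual model.

```python
# Toy propensity scoring: train on known outcomes (did this person vote?),
# then emit a 0-100% likelihood for every record in the database.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Training data: [age, past_elections_voted, est_income_in_thousands]
X = np.array([[62, 4, 48], [23, 0, 31], [45, 2, 90], [71, 5, 55], [30, 1, 42]])
y = np.array([1, 0, 1, 1, 0])          # 1 = voted in the last election

model = LogisticRegression().fit(X, y)

# Probability of turning out, expressed as a 0-100% score
scores = model.predict_proba(X)[:, 1] * 100
for row, s in zip(X, scores):
    print(f"age={row[0]:>2}  turnout score: {s:.0f}%")
```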

Third, CA then took what they had learned from these algorithms and turned around and used platforms such as Twitter, Facebook, Pandora (music streaming), and YouTube to find out where the people they wished to target spent the most interactive time. Where was the best place to reach each person? It might be through something as physical and basic as direct paper “snail” mail sent to an actual mailbox. It might be in the form of a television ad or in whatever popped up at the top of that person’s Google search engine. By purchasing lists of keywords from Google, CA was able to reach users when they typed those words into their browsers or search engines. Each time they did, they would be met with materials (ads, articles, etc.) that CA had designed especially for them.

The fourth step in the process was another ingredient in the “cake recipe,” and the one that put CA head and shoulders above the competition, above every political consulting firm in the world: finding ways to reach targeted audiences, and to test the effectiveness of that reach, through client-facing tools such as the one CA designed especially for its own use. Called Ripon, this canvassing software program for door-to-door campaigners and phone bankers allowed its users direct access to your data as they approached your house or called you on the phone. Data-visualization tools also helped them determine their strategy before you’d even opened your door or picked up your phone.

Then campaigns would be designed based on content our in-house team had composed, and the final, fifth step, the microtargeting strategy, allowed everything from video to audio to print ads to reach the identified targets. Using an automated system that refined that content again and again, we were able to understand what made individual users finally engage with that content in a meaningful way. We might learn that it took as many as twenty or thirty variations of the same ad, sent to the same person thirty different times and placed on different parts of their social media feed, before they clicked on it to act. And knowing that, our creatives, who were producing new content all the time, knew how to reach those same people the next time CA sent something out.
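One plausible way to automate such a refine-and-resend loop is a simple epsilon-greedy bandit over ad variants, sketched below. The variant names and hidden click rates are invented, and CA's actual optimizer is not public; this only illustrates the idea of serving, measuring, and converging on what works.

```python
# Epsilon-greedy testing of ad variants: mostly serve the best performer
# so far, occasionally explore the others.
import random

variants = ["ad_v1", "ad_v2", "ad_v3"]
shows = {v: 0 for v in variants}
clicks = {v: 0 for v in variants}

def simulated_click(variant):
    # Stand-in for a real user: each variant has a hidden click rate.
    rates = {"ad_v1": 0.02, "ad_v2": 0.05, "ad_v3": 0.01}
    return random.random() < rates[variant]

for _ in range(10_000):
    if random.random() < 0.1:          # explore: try a random variant
        v = random.choice(variants)
    else:                              # exploit: best click rate so far
        v = max(variants, key=lambda u: clicks[u] / shows[u] if shows[u] else 0)
    shows[v] += 1
    clicks[v] += simulated_click(v)

for v in variants:
    print(v, f"{clicks[v] / shows[v]:.3%} click rate over {shows[v]} shows")
```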

The even more sophisticated data dashboards that CA set up in campaign “war rooms” provided project and campaign managers with metrics in real time, giving them up-to-the-minute reads on how a particular piece of content was working and how many impressions and clicks that content was getting per dollar spent. Right in front of their eyes, they could see what was working and what was not, whether they were getting the return on investment they wanted, and how to adjust their strategy to do better. With these tools, those watching the data dashboards were able to monitor up to ten thousand different “campaigns within campaigns” we were running for them at any given time.

What CA did was evidence-based. CA could provide clients with a clear picture of what they had done, whom they’d reached, and, by scientifically surveying a representative sample, what percentage of the people they had targeted were taking action as a result of the targeted messaging.

It was revolutionary.

When I learned these things from Alex Tayler, I was dumbfounded but also fascinated. I had had no idea of the reach of data collection in America, and although it made me think back to Edward Snowden’s warnings about mass surveillance, Tayler explained everything to me in such a matter-of-fact way that I saw it as just the “way things were done.”

It was all so no-nonsense; nothing was dark or troubling. This was just how the data economy flowed, I imagined. Soon, I came to understand that I had been naïve to think I could achieve my goals with anything less than a big database. Didn’t I want to be heard? Didn’t I want to be effective? Yes, I did. At the time, I couldn’t think of anything I wanted more.

As successful as this five-step approach had been, I learned in 2015 that it was about to change, when Facebook announced that as of April 30, it would, after so many years of openness, be closing its user data to “third-party app” developers, companies like CA. At that point, according to Dr. Tayler, a critical piece of CA’s data gathering would be jeopardized. No longer could Tayler freely gather data from Facebook through the Friends API.

No longer could he use the Sex Compass or the Musical Walrus.

He had just a short time to grab whatever data he could before that window closed, Dr. Tayler told me.

And CA wasn’t alone. Around the world, everyone else was rushing. Facebook was becoming a walled garden. After April 30, Tayler told me, it would allow data-gathering companies to use the data they had already harvested from it, and to advertise on its platform and use its analytics, but the companies wouldn’t be able to harvest any new data.

Tayler showed me lists of thousands of categories of user data still up for grabs, if not from Facebook itself then from one of its developers. Somehow, other app developers were selling data they had gathered from Facebook, so even if CA couldn’t collect it directly, Tayler could buy it easily from any number of sources. So easily, he said, that I didn’t question it.

And there was so much to choose from. There were groupings of people according to their attitudes about everything from the food brands they preferred to their fashion choices to what they believed or didn’t believe about climate change. All this information was there for the taking. I looked at the list and marked the groups I thought would be most interesting, based on clients I imagined we might have in the future. Tayler gave the same lists to other CA employees and asked them to choose groups, too.

The more the better, he said.

I now know this was against Facebook’s policies, but one of Tayler’s final purchases of Facebook data would occur on May 6, 2015, a whole week after Facebook said this was no longer possible. Strange, I thought. How did we get the data if the API was already closed?

After an extensive time with Dr. Tayler, I sat down and put together my Cambridge Analytica pitch, borrowing from Tayler and Alexander freely, using some of their slides but also adapting them and adding my own so that I would feel more comfortable with the way I personally explained the company to clients.

In the Sweat Box one afternoon, I finally pitched Alexander. When I had finished, he told me I had done a very good job, but that I needed to work on some of the details in order to demonstrate more clarity and more confidence.

“The most important thing is to sell yourself,” he reminded me. The data sell will come naturally once the clients love you, he said, and he sent me out to pitch to every single person in the office. It was in that way that I gained greater knowledge about the company but also got to know my colleagues better.

Krystyna Zawal, a Polish associate project manager new to the company who accepted chocolates as currency, helped me fine-tune the part of my presentation using the case studies that had come from the John Bolton super PAC and the North Carolina midterms.

Bianca Independente, a fun-loving Italian in-house psychologist, helped me understand the larger context of OCEAN modeling, explaining that CA’s expertise in it had come from the nonprofit out of which SCL had grown: the academic research center at Cambridge University called the Behavioural Dynamics Institute, or BDI. As Bianca explained, BDI had been affiliated with more than sixty academic institutions, and that’s what had given the SCL Group its academic bona fides. She was working diligently to add to the body of knowledge through experiments.

From Harris McCloud and Sebastian Richards, who were a messaging expert and a creative, respectively, I learned better ways to frame complex technical concepts for laypeople. And Jordan, who worked in research, provided me with visuals that could help me better explain those concepts in a slide show. Kieran literally helped me mock up new slides.

My colleagues provided me with their expertise, which was an embarrassment of riches. They clarified so much for me, and when I approached Alexander again to pitch him in the Sweat Box, I felt ready.

I made sure that I was perfectly dressed, as though for a real client. I wore bright red lipstick. I lowered the lights. Then I began.

“Good afternoon.”

On the wall was the Cambridge Analytica logo, an angular abstract depiction of the human brain and the cerebral cortex, composed not of gray matter but of simple, short mathematical segments printed in white on a crimson background.

“Cambridge Analytica is the newest and most cutting-edge company in the political space in America,” I said. “We specialize in what we call the science of behavioral change communication. What that means is that we’ve”—I pulled up another slide, one showing two equal-size puzzle pieces that fitted together perfectly—“taken behavioral and clinical and experimental psychology and combined that with world-class data analytics.”

I pulled up another slide.

“We have some of the best data scientists and PhDs in this space, working with psychologists to put together data-driven strategies—that means that all your communications strategies are no longer guesswork. All your communications are based on science,” I said.

Next, I discussed how blanket and informational advertising was useless and how the SCL Group had moved on from the old Mad Men
