A Hurricane Flattens Facebook

As the Cambridge Analytica story broke over the weekend, Facebook struggled to formulate a response.

Two weeks ago, Facebook learned that The New York Times, The Guardian, and The Observer were working on blockbuster stories based on interviews with a man named Christopher Wylie. The core of the tale was familiar but the details were new, and now the scandal was attached to a charismatic face with a shock of pink hair. Four years ago, a cache of Facebook data on 50 million Americans was harvested by a UK academic named Aleksandr Kogan and improperly sold to Cambridge Analytica. Wylie, who had worked at the firm and had never spoken publicly before, showed the newspapers a trove of emails and invoices to back up his allegations. Worse, Cambridge Analytica appears to have lied to Facebook about having deleted the data entirely.

To Facebook, before the stories went live, the scandal appeared bad but manageable. The worst deeds had been done outside of Facebook and long ago. Plus, like weather forecasters in the Caribbean, Facebook has been busy lately. Just in the past month, they’ve had to deal with scandals created by vacuous Friday tweets from an ad executive, porn, the darn Russian bots, angry politicians in Sri Lanka, and even the United Nations. All of those crises have passed with limited damage. And perhaps that’s why the company appears to have underestimated the power of the storm clouds moving in.

On Friday night, the company made its first move, jumping out in front of the news reports to publish its own blog post announcing that it was suspending Cambridge Analytica’s use of the platform. It also made one last stern appeal to The Guardian not to use the word “breach” in its story. The word, the company argued, was inaccurate: data had been misused, but no moats or walls had been breached. The Guardian apparently was not persuaded. On Saturday its story appeared: “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach.”

The crisis was familiar in a way: Facebook has burned its fingers on data privacy repeatedly in its 14-year history. But this time it was different. The data leakage hadn’t helped Unilever sell mayonnaise. It appeared to have helped Donald Trump sell a political vision of division and antipathy. The news made it look as if Facebook’s data controls were lax and its executives indifferent. Around the world, lawmakers, regulators, and Facebook users began asking very publicly how they could support a platform that didn’t do more to protect them. Soon, powerful politicians were chiming in and demanding to hear from Zuckerberg.

As the storm built over the weekend, Facebook’s executives, including Mark Zuckerberg and Sheryl Sandberg, strategized and argued late into the night. They knew that the public was hammering them, but they also believed that the fault lay far more with Cambridge Analytica than with Facebook. Still, four main questions consumed them: How could they tighten up the system to make sure this didn’t happen again? What should they do about all the calls for Zuckerberg to testify? Should they sue Cambridge Analytica? And what could they do about psychologist Joseph Chancellor, who had helped found Kogan’s firm and who now worked, of all places, at Facebook?

By Monday, Facebook remained frozen, and Zuckerberg and Sandberg stayed silent. Then, late in the afternoon in Menlo Park, more bad news came. The New York Times reported that Alex Stamos, the company’s well-respected chief security officer, had grown dissatisfied with senior management and was planning to leave within a few months. Some people had known this for a while, but it was still a very bad look. You don’t want news that your security chief is bailing to break while you’re in the middle of a crisis over how you secure your data. And then came word that Facebook’s efforts to gain access to Cambridge Analytica’s servers had been blocked. The United Kingdom’s Information Commissioner’s Office, which had started its own investigation, would handle that.

A company-wide Q&A was called for Tuesday, but for some reason it was led by Facebook’s legal counsel rather than its leaders, both of whom had remained deafeningly silent and both of whom reportedly skipped the session. Meanwhile, the stock had collapsed, chopping $36 billion off the company’s market value on Monday. By mid-Tuesday morning, it had fallen 10 percent since the scandal broke. What the company had expected to be a tough summer storm had turned into a Category 5 hurricane.

Walking in the Front Door

The story of how Kogan ended up with data on 50 million American Facebook users sounds like it should involve secret handshakes and black hats. But Kogan actually got his Facebook data by walking in through Facebook’s front door and asking for it. Like all technology platforms, Facebook encourages outside software developers to build applications that run inside it, just as Google does with its Android operating system and Apple does with iOS. And so in November 2013 Kogan, a psychology professor at the University of Cambridge, created an application developer account on Facebook and explained that he wanted access to Facebook’s data for a research project. He started work soon thereafter.

Kogan had created the most anodyne of tools for electoral manipulation: an app based on personality quizzes. Users signed up and answered a series of questions. Then the app would take those answers, mush them together with that person’s Facebook likes and declared interests, and spit out a profile that was supposed to know the test-taker better than he knew himself.

About 270,000 Americans participated. What they didn’t know was that by agreeing to take the quiz and giving the app access to their Facebook data, they also granted it access to many of their friends’ likes and interests. Users could turn off this setting, but it’s hard to turn off something you don’t know exists and would struggle to find if you did. Kogan quickly ended up with data on roughly 50 million people.
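To make the mechanism concrete, here is a minimal Python sketch of how a pre-2015 quiz app could have turned one consenting user’s permission into data on that user’s friends. The Graph API version, endpoint paths, and token handling are assumptions modeled on the long-retired friend-data permissions described above, not a reconstruction of Kogan’s actual code.

```python
import requests

# Illustrative only: the version string and endpoints below are assumptions
# modeled on Facebook's retired pre-2015 permission model, not the current API.
BASE = "https://graph.facebook.com/v1.0"
QUIZ_TAKER_TOKEN = "token-granted-when-one-user-installed-the-quiz-app"

# 1. Data the quiz taker explicitly agreed to share: their own profile and likes.
me = requests.get(f"{BASE}/me", params={"access_token": QUIZ_TAKER_TOKEN}).json()
my_likes = requests.get(f"{BASE}/me/likes",
                        params={"access_token": QUIZ_TAKER_TOKEN}).json()

# 2. The quiz taker's friend list, which the old permission model also exposed
#    to the app unless each friend had found the setting and switched it off.
friends = requests.get(f"{BASE}/me/friends",
                       params={"access_token": QUIZ_TAKER_TOKEN}).json().get("data", [])

# 3. Likes for every friend, none of whom ever installed the app themselves.
friend_likes = {}
for friend in friends:
    friend_likes[friend["id"]] = requests.get(
        f"{BASE}/{friend['id']}/likes",
        params={"access_token": QUIZ_TAKER_TOKEN},
    ).json()

# Roughly 270,000 installs, each fanning out to a few hundred friends,
# is how a quiz app this small reached data on some 50 million people.
```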

About five months after Kogan began his research, Facebook announced that it was tightening its app review policies. For one: Developers couldn’t mine data from your friends anymore. The barn door was shut, but Facebook told all the horses already in the pasture that they had another year to run around. Kogan, then, got a year and a half to do his business. And when the stricter policies went into effect, Facebook promptly rejected version two of his app.

By then Kogan had already mined the data and sold it to Cambridge Analytica, violating his agreement with Facebook and revealing one of the strange asymmetries of this story. Facebook knows everything about its users—but in some ways it knows nothing about its developers. And so Facebook didn’t start to suspect that Kogan had misused its data until it read a blaring headline in The Guardian in December 2015: “Ted Cruz using firm that harvested data on millions of unwitting Facebook users.”

That story passed out of the news cycle quickly, though, swept away by coverage of the Iowa caucuses. And so while Facebook’s legal team might have been sweating at the end of 2015, outwardly Zuckerberg projected an air of total calm. His first public statement after the Guardian story broke was a Christmas note about all the books he’d read: “Reading has given me more perspective on a number of topics -- from science to religion, from poverty to prosperity, from health to energy to social justice, from political philosophy to foreign policy, and from history to futuristic fiction.”

An Incomplete Response

When the 2015 Guardian story broke, Facebook immediately secured written assertions from Cambridge Analytica, Kogan, and Christopher Wylie that the data had been deleted. Lawyers on all sides started talking, and by the early summer of 2016 Facebook had more substantial legal agreements with Kogan and Wylie certifying that the data had been deleted. Cambridge Analytica signed similar documents, but its paperwork wasn’t submitted until 2017. Facebook’s lawyers describe it as a tortured and intense legal process. Wylie describes it as a pinkie promise. “All they asked me to do was tick a box on a form and post it back,” he told the Guardian.

Facebook’s stronger option would have been to insist on an audit of all of Cambridge Analytica’s machines. Did the data still exist, and had it been used at all? And in fact, according to the standard rules that developers agree to, Facebook reserves that right. “We can audit your app to ensure it is safe and does not violate our Terms. If requested, you must provide us with proof that your app complies with our terms,” the policy currently states, as it did then.

Kogan, too, may have merited closer scrutiny, especially in the context of the 2016 presidential campaign. In addition to his University of Cambridge appointment, Kogan was an associate professor at St. Petersburg State University and had accepted research grants from the Russian government.

Why didn’t Facebook conduct an audit, a failure that may go down as the company’s most crucial mistake? Perhaps because no audit can ever be completely persuasive: even if no trace of the data exists on a server, it could still have been copied onto a hard drive and shoved in a closet. Facebook’s legal team also insists that an audit would have been time-consuming and would have required a court order, even though the developer contract allows for one. A third possible explanation is fear of accusations of political bias. Most of the senior employees at Facebook are Democrats who blanch at allegations that they would let politics seep into the platform.

Whatever the reason, Facebook trusted the signed documents from Cambridge Analytica. In June 2016, Facebook staff even went down to San Antonio to sit with Trump campaign officials, Cambridge Analytica consultants at their side.

As far as Facebook was concerned, the story seemed to go away. In the year following Trump’s victory, public-interest advocates hammered Cambridge Analytica over its data practices, and other publications, particularly The Intercept, dug into the firm. But Facebook, according to executives at the company, never thought to double-check whether the data was gone until reporters began to call this winter. And it was only after the story broke that Facebook considered serious action, including suing Cambridge Analytica. A lawyer for the company, Paul Grewal, told WIRED on Monday evening that “all options are on the table.”

What Comes Next

Of Facebook’s many problems, one of the most confounding appears to be figuring out what to do with Chancellor, who currently works with the VR team. He may know about the fate of the user data, but this weekend the company was debating how forcefully it could ask him, since pressing too hard could violate rules that protect employees from being forced to give up trade secrets from previous jobs.

A harder question is when, and how exactly, Zuckerberg and Sandberg should emerge from their bunkers. Sandberg, in particular, has passed through the crucible of the past two years relatively unscathed. Zuckerberg’s name now trends on Twitter when crises hit, and this magazine put his bruised face on the cover. Even Stamos has taken heat during the outcry over the Russia investigation. And a small bevy of brave employees have waded out into the rushing rivers of Twitter, where they have generally been sucked below the surface or swept over waterfalls.

The last, and most vexing, question is what to do to make Facebook data safer. For much of the past year, Facebook has been besieged by critics saying that it should make its data more open. It should let outsiders audit its data and peer around inside with a flashlight. But it was an excess of openness with developers—and opaque privacy practices—that got the company in trouble here. Facebook tightened up third-party access in 2015, meaning an exact replay of the Cambridge Analytica fiasco couldn’t happen today. But if the company decides to close down even further, what happens to the researchers doing genuinely important work on the platform? How well can you vet intentions? A possible solution would be for Facebook to change its data retention policies. But doing so could undermine how the service fundamentally works, and make it far more difficult to catch malevolent actors—like Russian propaganda teams—after the fact.

User data is now the foundation of the internet. Every time you download an app, you give the developer access to bits of your personal information. Every time you engage with any technology company—Facebook, Google, Amazon, and so on—you help build their giant database of information. In exchange, you trust that they won’t do bad things with that data, because you want the services they offer.

Responding to a thread about how to fix the problem, Stamos tweeted, “I don’t think a digital utopia where everybody has privacy, anonymity and choice, but the bad guys are magically kept out, can exist.”

At its core, according to a former Facebook executive, the problem is really an existential one. The company is very good at dealing with problems that happen frequently and carry low stakes; when mistakes happen, it moves on. A rare, high-stakes crisis like this one is another matter. According to the executive, the philosophy of the company has long been “We’re trying to do good things. We’ll make mistakes. But people are good and the world is forgiving.”

If Facebook doesn’t find a satisfactory solution, it faces the unsavory prospect of heavy regulation. Already in Europe, the General Data Protection Regulation will give people much more insight into, and control over, the data that companies like Facebook collect and how it’s used. In the US, senators such as Ron Wyden, Mark Warner, and Amy Klobuchar may have the appetite for similar legislation if Facebook’s privacy woes continue.

Facebook has now held its all-hands, and it will hope for that inevitable moment when something horrible happens elsewhere and everyone’s attention turns. But it also knows that things might get worse, much worse. The nightmare scenario will come if the Cambridge Analytica story fully converges with the story of Russian meddling in American democracy: if it turns out that the Facebook data harvested by Cambridge Analytica ended up in the hands of Putin’s trolls.

At that point, Facebook will have to deal with yet another devastating asymmetry: data from a silly quiz app, created under obsolete rules, fueling a national security crisis. But those asymmetries are just part of the nature of Facebook today. The company has immense power, and it’s only begun to grapple with its immense responsibility. And the world isn’t as forgiving of Silicon Valley as it used to be.


This story has been updated to include further details about Tuesday's company-wide meeting.