
“Yes,” It’s Bad: Facebook Stock Plunges as Data Scandal Takes a Dark Turn

For Mark Zuckerberg, the story just keeps getting worse.
Mark Zuckerberg. Photograph by Drew Angerer/Getty Images.

Already reeling from reports that Cambridge Analytica had quietly harvested private information from more than 50 million users, Facebook saw its stock plunge on Monday, erasing about $40 billion from the social-media company’s market cap. In the court of public opinion, the company had already been buffeted for months by a steady stream of damaging stories about its content-moderation practices and its role in the fake-news pandemic. Now, with the curtain ripped brutally away from its invasive, highly targeted advertising model, longtime critics of Facebook are taking the opportunity to call for new regulations on a tech giant that many believe has grown too powerful for its own good. “Welp,” tweeted the C.E.O. of another major Silicon Valley company. “Tech is definitely about to get regulated. And probably for the best.”

Facebook’s unprecedented dive on Wall Street came in response to a pair of stories over the weekend from The New York Times and the Observer (the Sunday edition of the Guardian), which reported that Cambridge Analytica—the mysterious, Mercer-funded political data firm that built so-called “psychographic profiles” of voters for the Trump campaign—had bamboozled Facebook (or broken its rules, depending on one’s interpretation) by hoovering up personal data on unsuspecting people. Whistleblower Chris Wylie, who worked with academic Aleksandr Kogan, called their creation “Steve Bannon’s psychological-warfare mindfuck tool.” Within hours of the story breaking, #DeleteFacebook had become a trending hashtag.

The details of the Wylie-Kogan operation show how easy it was to collect data far beyond what users believed they were sharing. According to a copy of a contract Wylie showed the Observer, on June 4, 2014, SCL Group—Cambridge Analytica’s parent company—entered into an agreement with Global Science Research, the firm run by Kogan, that centered specifically on gathering and processing Facebook data, with the aim of matching it to personality traits and voter rolls. Users who downloaded the G.S.R. app thought they were agreeing to share their data as part of a personality quiz; in reality, Kogan was collecting information about their friends, too—all of which was permitted under Facebook’s terms of service at the time, up until the point that Kogan handed that data over to Cambridge Analytica. “Facebook could see it was happening,” Wylie told the Observer. “Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use. So they were like, ‘Fine.’ ” (In a statement to the Times, Cambridge Analytica blamed Kogan for violating Facebook’s rules and said that it had deleted the data two years ago.)

The story continues to get worse for both Facebook and Cambridge Analytica. Though Cambridge Analytica C.E.O. Alexander Nix told British Parliament last month that the company had “never worked with a Russian organization in Russia or any other country, and we don’t have any relationship with Russia or Russian individuals,” two former SCL insiders told the Times that they held “at least three meetings” with top brass at Lukoil, the Russian oil giant. Both SCL and Lukoil denied that the meetings were political in nature, but Wylie contested this claim. “I remember being super confused,” he said. “I kept asking Alexander, ‘Can you explain to me what they want?’ I don’t understand why Lukoil wants to know about political targeting in America.”

While the fact that one of Cambridge Analytica’s contractors violated Facebook’s rules for political ends is concerning, the far more troubling possibility is that Facebook failed to respond adequately or to inform users. Perhaps it didn’t want to draw too much scrutiny to the structures upon which it had made so much money: Facebook’s appeal to advertisers, after all, is that it is built to micro-target people along an endless series of cultural, demographic, and psychological axes. That advertising is worthless if it cannot be claimed to sway consumers. And Facebook, which generated $40 billion in revenue in 2017, is one of the most ruthlessly effective advertising platforms on Earth.

Regulation of that profit machine may soon be on the way. On Monday the company faced the most vehement public backlash since it was revealed to have been abused by Russian operatives, with Senators Amy Klobuchar and John Kennedy writing in a letter to Senate Judiciary Committee chairman Chuck Grassley that “Facebook, Google, and Twitter have amassed unprecedented amounts of personal data and use this data when selling advertising, including political advertisements. The lack of oversight on how data is stored and how political advertisements are sold raises concerns about the integrity of American elections as well as privacy rights.”

The response across the Atlantic, where the European Union is poised to crack down on Silicon Valley with a plan that would strip tech companies of their tax havens, was equally severe. In the U.K., a spokesperson for Prime Minister Theresa May called the allegations “clearly very concerning,” and supported an investigation into all parties. British M.P. Damian Collins, who chairs the U.K. Parliament’s Digital, Culture, Media, and Sport Committee, said that it seemed Nix “has deliberately misled the Committee and Parliament by giving false statements,” and called for Facebook C.E.O. Mark Zuckerberg to testify before lawmakers. “It is not acceptable that they have previously sent witnesses who seek to avoid answering difficult questions by claiming not to know the answers,” he said. “This also creates a false reassurance that Facebook’s stated policies are always robust and effectively policed.”

From a P.R. perspective, Facebook’s response has been a shambles. On Friday, it suspended Cambridge Analytica’s and Wylie’s Facebook accounts. “If these reports are true, it’s a serious abuse of our rules. All parties involved—including the SCL Group/Cambridge Analytica, Christopher Wylie, and Aleksandr Kogan—certified to us that they destroyed the data in question,” Facebook V.P. and deputy general counsel Paul Grewal said in a statement. Yet according to the Times, which reported that it had viewed a set of the raw harvested data, Facebook “downplayed the scope of the leak and questioned whether any of the data still remained out of its control.” Nearly two years after the existence of the data was first reported, Wylie said he got a letter asking him to delete it, and to check a box confirming he had done so. “That to me was the most astonishing thing,” he told the Observer. “They waited two years and did absolutely nothing to check that the data was deleted. All they asked me to do was tick a box on a form and post it back.”

Before its story hit the presses, Facebook’s lawyer reportedly warned the Observer that it was making “false and defamatory” allegations. And when the article was published anyway, a clutch of the company’s executives took a similar tack on Twitter. “This was unequivocally not a data breach,” wrote Andrew Bosworth. “No systems were infiltrated, no passwords or information were stolen or hacked.” Facebook chief security officer Alex Stamos, meanwhile, wrote that “the recent Cambridge Analytica stories by The New York Times and the Guardian are important and powerful, but it is incorrect to call this a ‘breach’ under any reasonable definition of the term.” Critics were quick to point out that this clarification was, if anything, more damning.

Stamos’s tweets were subsequently deleted, and later on Monday it was reported that he would exit the company in August, allegedly over internal disagreements about Facebook’s role in spreading disinformation. According to The New York Times, Stamos had clashed with top executives, including Sheryl Sandberg, over how much of Russia’s activity Facebook should disclose.

Bosworth continued his efforts at damage control on Monday, with a lengthy Facebook post to “share some of the answers” he’d gathered after a weekend of blowback. “Isn’t this bad?” he asked, in a final rhetorical question. “Yes,” he replied. “This issue can no longer happen the way it did given what we fixed in the product three years ago, but that doesn’t change what happened. It’s a breach of trust.”

This article has been updated.