A whistleblower’s power: Key takeaways from the Facebook Papers

Former Facebook employee Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing on Oct. 5. (Drew Angerer / Abaca Press)

A personal decision by Facebook CEO Mark Zuckerberg leads to a crackdown on dissent in Vietnam. Measures to suppress hateful, deceptive content are lifted after the American presidential election in 2020, as pro-Trump groups disputing the legitimacy of the election experience “meteoric” growth. A dummy test account on Facebook in India is flooded with violent anti-Muslim propaganda, which remains visible for weeks on the real account of a frightened Muslim college student in northern India.

A trove of internal Facebook documents reveals that the social media giant has privately and meticulously tracked real-world harms exacerbated by its platforms, ignored warnings from its employees about the risks of its design decisions, and exposed vulnerable communities around the world to a cocktail of dangerous content.

The Facebook Papers
The Facebook Papers project is a collaboration among 17 American news organizations. Journalists from a variety of newsrooms worked together to gain access to thousands of pages of internal company documents obtained by Frances Haugen, the former Facebook product manager turned whistleblower. A separate consortium of European news outlets had access to the same set of documents. Each member of the consortium pursued its own independent reporting on the document contents and their significance.

Disclosed to the U.S. Securities and Exchange Commission by whistleblower Frances Haugen, the Facebook Papers were provided to Congress in redacted form by Haugen's legal counsel. The redacted versions were reviewed by a consortium of news organizations, including The Washington Post, which obtained additional internal documents and conducted interviews with dozens of current and former Facebook employees.

A mix of presentations, research studies, discussion threads and strategy memos, the Facebook Papers provide an unprecedented view into how executives at the social media giant weigh trade-offs between public safety and their own bottom line. Some of the documents were first reported by the Wall Street Journal.

Here are key takeaways from The Post's investigation:

Zuckerberg’s public claims often conflict with internal research

Haugen references Zuckerberg's public statements at least 20 times in her SEC complaints, asserting that the CEO's unique degree of control over Facebook means he bears ultimate responsibility for a litany of societal harms caused by the company's relentless pursuit of growth.

The documents also show that Zuckerberg's public statements are often at odds with internal company findings.

For example, Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds. But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook.

Facebook spokeswoman Dani Lever denied that Zuckerberg "makes decisions that cause harm" and dismissed the findings, saying they are "based on selected documents that are mischaracterized and devoid of any context."

Facebook dropped its guard before the Jan. 6 insurrection

During the run-up to the 2020 U.S. presidential election, the social media giant dialed up efforts to police content that promoted violence, misinformation and hate speech. But after Nov. 6, Facebook rolled back many of the dozens of measures aimed at safeguarding U.S. users. A ban on the main Stop the Steal group didn't apply to the dozens of look-alike groups that popped up in what the company later concluded was a "coordinated" campaign, documents show.

By the time Facebook tried to reimpose its "break the glass" measures, it was too late: A pro-Trump mob was storming the U.S. Capitol.

Facebook officials said they planned exhaustively for the election and its aftermath, anticipated the potential for post-election violence, and always expected the challenges to last through the inauguration of President Biden on Jan. 20.

Facebook fails to effectively police content in much of the world

For all Facebook's troubles in North America, its problems with hate speech and misinformation are dramatically worse in the developing world. Documents show that Facebook has meticulously studied its approach abroad, and is well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes.

According to one 2020 summary in the documents, the vast majority of Facebook's efforts against misinformation, 84 percent, went toward the United States, with just 16 percent going to the "Rest of World," including India, France and Italy.

Though Facebook considers India a top priority, activating large teams to engage with civil society groups and protect elections, the documents show that Indian users experience Facebook without critical guardrails common in English-speaking countries.

Facebook's Lever said the company has made "progress," with "global teams with native speakers reviewing content in over 70 languages along with experts in humanitarian and human rights issues."

"We've hired more people with language, country and topic expertise," Lever said, adding that Facebook has "also increased the number of team members with work experience in Myanmar and Ethiopia to include former humanitarian aid workers, crisis responders and policy specialists."

Facebook chooses maximum engagement over user safety

Zuckerberg has said the company does not design its products to persuade people to spend more time on them. But dozens of documents suggest the opposite.

The company exhaustively studies potential policy changes for their effects on user engagement and other factors key to corporate profits. Amid this push for user attention, Facebook abandoned or delayed initiatives to reduce misinformation and radicalization.

One 2019 report tracking a dummy account set up to represent a conservative mother in North Carolina found that Facebook's recommendation algorithms led her to QAnon, an extremist ideology that the FBI has deemed a domestic terrorism threat, in just five days. Still, Facebook allowed QAnon to operate on its site largely unchecked for another 13 months.

"We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible," Facebook's Lever said, adding that the company is "constantly making difficult decisions."

The Washington Post’s Elizabeth Dwoskin, Shibani Mahtani, Cat Zakrzewski, Craig Timberg, Will Oremus and Jeremy Merrill contributed to this report.