Facebook shares stay positive despite release of new whistleblower documents

Key Points
  • The Facebook Papers include stories from 17 U.S. news outlets with access to internal documents provided by former employee Frances Haugen.
  • The documents shed light on Facebook's handling of Jan. 6 and hate speech in languages outside of English.
  • A Facebook spokesperson said the company does not put profits over people's well-being.

The Facebook Papers, a series of articles published by a consortium of 17 U.S. news outlets beginning Friday, shed new light on the company's thinking behind its actions leading up to the Capitol insurrection on Jan. 6 and its ability to fend off hate speech in languages outside of English.

Facebook shares ended the trading day Monday up 1.3% after the news outlets published their stories based on the leaked documents. The company is also scheduled to report quarterly earnings after markets close Monday.

The documents were provided to the news outlets by Frances Haugen, a former Facebook employee who took tens of thousands of pages of internal research with her before she left. She's since provided those documents to Congress and the Securities and Exchange Commission, seeking whistleblower status.

"At the heart of these stories is a premise which is false," a Facebook spokesperson said in a statement in response to the flood of reporting. "Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie. The truth is we've invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook."

Here are some of the major themes the Facebook Papers have explored so far:

Jan. 6

The documents revealed frustration among Facebook's ranks over the company's inability to control the spread of content that potentially incites violence.

"Haven't we had enough time to figure out how to manage discourse without enabling violence?" an employee wrote on an internal message board during the riot outside the U.S. Capitol on Jan. 6, according to The Associated Press. "We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."

Facebook had put additional emergency measures in place ahead of the 2020 election to stem the spread of violent or dangerous content if needed. But as many as 22 of those measures were set aside after the election and before Jan. 6, internal documents reviewed by AP showed.

A Facebook spokesperson told the outlet its use of those measures followed signals from its own platform and law enforcement.

Language barriers

Some of the reports showed how Facebook's content moderation systems can fall flat when faced with languages besides English.

AP reported that Arabic poses a particularly difficult challenge for content moderators. Arabic-speaking users have learned to insert symbols or extra spaces into words thought to trigger Facebook's automated flags, such as the names of militant groups.

While some users adopt these methods to evade an overzealous content moderation system, AP reported that the same tactics have allowed hate speech to slip past Facebook's censors.

"We were incorrectly enforcing counterterrorism content in Arabic," an internal Facebook document said, according to AP. Meanwhile, it said, the system "limits users from participating in political speech, impeding their right to freedom of expression."

Facebook told AP it's put more resources into recruiting local dialect and topic experts, and has researched ways to improve its systems.

India

Other reports show that some Facebook employees were dismayed by the company's handling of misinformation in India, believing leadership made decisions to avoid angering the Indian government.

Hate speech concerns in the region were amplified by language barrier issues similar to those in the Middle East. According to AP, Facebook added hate speech classifiers in Hindi and Bengali in 2018 and 2020, respectively.

One researcher who set up an account as a user in India in 2019 found that by following Facebook's algorithm recommendations, they saw "more images of dead people in the past three weeks than I've seen in my entire life total," in the News Feed, according to The New York Times.

A Facebook spokesperson told the Times that hate speech against marginalized groups in India and elsewhere has been growing, and it's "committed to updating our policies as hate speech evolves online."

Retaining users

Other reports showed the existential issues the company would face if it failed to hold onto enough young users.

The platform is already experiencing a dip in engagement among teens, The Verge reported based on the internal documents.

"Most young adults perceive Facebook as a place for people in their 40s and 50s," a March presentation from a team of data scientists said, according to The Verge. "Young adults perceive content as boring, misleading, and negative. They often have to get past irrelevant content to get to what matters."

The documents showed that Facebook plans to test several ideas to increase teen engagement, like asking young users to update their connections and tweaking the News Feed algorithm to show users posts from outside their own network.

A Facebook spokesperson told The Verge that the platform is "no different" from any social media site that wants teens to use its services.

Antitrust concerns

Facebook has spent the past few years fighting the label of a monopoly, which many lawmakers and academics say is appropriate for a platform of its scale.

But among its ranks, Facebook employees acknowledge the vast power of the platform with details that could fuel ongoing and future antitrust lawsuits. The FTC recently filed an amended complaint alleging Facebook illegally maintained monopoly power in personal social networking services after a judge threw out its initial claims.

According to a report from Politico, 78% of American adults and nearly all teens in the U.S. use Facebook's services. Even though competitors like TikTok and Snap have made progress with teen users, Facebook and Instagram continue to maintain a strong hold on activities like connecting with others over common interests and sharing photos and videos, according to a survey of users last year.

And once they sign up, few actually leave the platforms, Facebook's own research reportedly shows.

In a 2018 presentation reviewed by Politico, employees wrote that despite "Facebook-the-company" doing only "okay" with teens around the world, "we do have one of the top social products — with growing market share — almost everywhere."

Facebook spokesperson Christopher Sgro told Politico that "Far from supporting the government's case, the documents presented to Facebook firmly reinforce what Facebook has always said: We compete with a broad range of services for people's time and attention, including apps that offer social, community, video, news and messaging features."
