Tech Drivers

Facebook has been talking up its third-party fact-checkers, but at least one says it's checking just one post per day

Key Points
  • FactCheck.org, one of several third-party reviewers contracted by Facebook, told The Wall Street Journal it reviews fewer than one Facebook post per day, on average.
  • Facebook has been ramping up detection efforts in light of foreign interference in the 2016 presidential election and ahead of the midterm elections in November.
  • The company has promised 20,000 employees will be working on content moderation and integrity concerns by the end of the year but says there are still limitations.

Facebook has been talking up its third-party fact-checking partners as the company continues its war against fake news and misinformation campaigns.

But at least one partner isn't actually reviewing much content.

FactCheck.org, one of several third-party reviewers contracted by Facebook, told The Wall Street Journal it reviews fewer than one Facebook post per day, on average.

Facebook has been ramping up detection efforts in light of foreign interference in the 2016 presidential election and ahead of the midterm elections in November. CEO Mark Zuckerberg testified before Congress that the company's artificial intelligence algorithms were getting better but that human fact-checkers were essential to the "arms race" of content moderation.

This week, Facebook showed off its recently formed "War Room" dedicated to election security.

FactCheck.org assigned two full-time staffers from its eight-person team to work specifically with Facebook, the organization told the Journal. But those staffers are seeing little work, and other third-party fact-checking organizations reported similarly light workloads, the Journal reported.

"There's no silver bullet to fighting misinformation, which is why we take a multi-prong approach," a Facebook spokesperson said in a statement to CNBC. "Fact-checking is an important part of our effort, but it's [by] no means the only strategy we're using."

Facebook also targets fake accounts and behavior commonly associated with false posts, such as clickbait, the spokesperson said.

Facebook first announced the fact-checking partnerships in 2016 and at the time identified FactCheck.org, Snopes, PolitiFact and ABC News. The company now includes the Associated Press and The Weekly Standard Fact Check among its U.S.-based reviewers.

Facebook has promised 20,000 employees will be working on content moderation and integrity concerns by the end of the year but says there are still limitations.

"Even where fact-checking organizations do exist, there aren't enough to review all potentially false claims online. It can take hours or even days to review a single claim," Tessa Lyons, a product manager in charge of News Feed, said in a June blog post.

Representatives for FactCheck.org were not immediately available to comment.

[Video: Inside Facebook's effort to fight election manipulation]