Inside Facebook's 'war room,' where the company is fighting to stop election manipulation ahead of the midterms

Key Points
  • Facebook has assembled teams from across the company in a 900-square-foot "war room" to help identify and stop attempts to manipulate elections, including the Nov. 6 U.S. midterms.
  • WhatsApp, Instagram, operations, software engineering, data science, research operations, legal, policy, communications — they're all represented in the room.
  • Ahead of this month's votes in the Brazilian presidential election, the company identified an effort to suppress voter turnout with fake posts saying the election was delayed due to protests. Facebook was quickly able to shut it down.
It's a windowless room packed with about two dozen desks, a half-dozen screens showing TV news and Twitter feeds, and even more monitors lining the walls that track trends in Facebook user behavior.

This is Facebook's first-ever "war room," designed to prevent election manipulation by improving data-sharing across the company and enabling quick decision-making. The roughly 900-square-foot room, which Facebook recently showed to journalists, is a visual representation of the company's commitment to dramatically improving communication and security ahead of the Nov. 6 U.S. midterms.

This demonstration of Facebook's internal efforts comes after a long string of security breaches and privacy lapses, going back to Russian manipulation of the 2016 presidential election. Since the revelation of the Cambridge Analytica privacy scandal in March, Facebook shares have fallen 14 percent. Now, the social-media giant is pulling out all the stops to prevent another debacle and more negative headlines.

With less than three weeks before the U.S. election, and even less time ahead of the Oct. 28 runoff for the Brazilian presidential election, this room is the hub for Facebook's work to identify the spread of fake news and quickly shut it down. The company says its current combination of technology and 20,000 employees focused on safety and security would have blocked the Russian manipulation of the 2016 election.

"We've essentially done much scenario planning and 'war games' internally within the war room to plan out different types of problems that we may see," said Samidh Chakrabarti, who oversees Facebook's elections and civic engagement team. "We've practiced and we've done drills to see how we can detect that, how we can come to quick decisions, and how we can take quick action."

The war room is staffed from 4 a.m. until midnight and, as of next week, will be buzzing 24/7 with people from teams across every corner of the company. WhatsApp, Instagram, operations, software engineering, data science, research operations, legal, policy, communications — they're all represented in the room. Charts of user behavior on Facebook and its other apps fill monitors around the walls. Facebook uses machine learning and artificial intelligence to monitor for spikes that could point to hate speech, fake news going viral or efforts at voter suppression.
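Facebook has not published details of that monitoring pipeline, but the basic idea of spike detection (flagging when a metric suddenly jumps far above its recent baseline so humans can investigate) can be sketched in a few lines. The Python below is a hypothetical illustration, not Facebook's code; the window size, threshold and sample data are invented for the sake of the example.

# Illustrative only: a rolling z-score "spike" detector over an hourly metric,
# e.g. user reports containing a suspicious phrase. All parameters are made up.
from statistics import mean, stdev

def detect_spikes(counts, window=24, z_threshold=3.0):
    """Return indices where a count jumps well above its recent baseline."""
    spikes = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]          # the previous `window` hours
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue                             # flat baseline; avoid division by zero
        z = (counts[i] - mu) / sigma
        if z >= z_threshold:
            spikes.append(i)                     # flag this hour for human review
    return spikes

# Example: a steady baseline with one sudden surge in the final hour.
hourly_reports = [12, 9, 11, 10, 13, 8, 12, 11, 10, 9, 12, 11,
                  10, 13, 9, 11, 12, 10, 11, 9, 13, 10, 12, 11, 95]
print(detect_spikes(hourly_reports))             # -> [24]

In practice, a flagged spike would only be a starting point; as the Brazil example below shows, human reviewers still decide what the anomaly means and how to respond.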

Nathaniel Gleicher, Facebook's head of cybersecurity, said the company's goal is that the election be fair, and that "debate around the election be authentic. ... The biggest concern is any type of effort to manipulate that."

Ahead of the Brazilian vote, the company identified an effort to suppress turnout and was able to shut it down quickly, thanks in part to the proximity of so many teams in a single room.

"Content that was telling people that due to protests, that the election would be delayed a day," said Chakrabarti. "This was not true, completely false. So we were able to detect that using AI and machine learning. The war room was alerted to it. Our data scientists looked into what was behind it and then they passed it to our engineers and operations specialists to be able to remove this at scale from our platform before it could go viral."

Facebook is combining its teams focused on the U.S. and Brazilian elections because fighting what the company calls "bad actors" is a global problem that never ends. The idea is that these teams can share information about the latest tactics they're seeing and share best practices for blocking them.

Gleicher warns that Facebook is seeing growing efforts to manipulate the public debate as we get closer to the midterms.

"Part of the reason we have this war room up and running, is so that as these threats develop, not only do we respond to them quickly, but we continue to speed up our response, and make our response more effective and efficient." And Gleicher says it's not just foreign interference but also domestic "bad actors" who are concealing their identity, using fake accounts and manipulating content on the site as they aim to spread fake news.

Facebook's decision to showcase its war room comes on the heels of its announcement that it will take down posts aimed at voter suppression, rather than simply minimizing their spread.

"This is always going to be an arms race, so the adversaries that we're facing who seek to meddle in elections, they are sophisticated and well-funded," said Chakrabarti. "That is the reason we've made huge investments both in people and technology to stay ahead and secure our platforms."