- Facebook executives took the stage at the Code Conference on Tuesday to answer for the company's failure to anticipate how bad actors would use the platform.
- COO Sheryl Sandberg emphasized that threats are always evolving, and Facebook is trying to anticipate future threats, not just react.
How come nobody from Facebook got fired over the Cambridge Analytica data leak scandal?
That was the first question moderator Kara Swisher asked Facebook COO Sheryl Sandberg on stage at the Code Conference in Rancho Palos Verdes, California, on Tuesday.
"People do get fired," Sandberg said, but Facebook doesn't trot them out as examples.
But in the end, she said, responsibility belongs at the top. Mark Zuckerberg built the platform, and Sandberg and other members of the senior leadership team didn't anticipate well enough what would happen when all of humanity was using it.
Both Sandberg and Facebook CTO Mike Schroepfer tried to explain the difficulty of striking the right balance between free speech and safety on the platform. Schroepfer said there's a tension between "giving people tools for free expression, and really locking things down" by having human moderators read and vet every post on the site.
"We always had ways for people to control your data," he said, but Facebook has now placed those controls at the top of everyone's news feed and made data easy to delete. "All of those controls existed, they were just harder to find for people, we just made them easier."
Addressing the reports of Russians using fake accounts to spread misinformation to sway the 2016 election, Sandberg once again said that they just didn't see it coming.
"Threats change," she said. In 2016, people were largely worried about spamming and phishing, following things like the Sony email hacks.
"We didn't see coming a different kind of more insidious threat, but once we saw it, we did publish a white paper" and made a series of changes, she said. Facebook is taking strong precautions ahead of the 2018 midterm elections, she added, and is prepared for the challenge of bad actors again trying to use the platform to influence the results.
Sandberg also said that Facebook was thinking about how to disrupt the economic incentives that existed for generating outrageous stories -- for example by taking clickbait farms out of Facebook's ad networks.
Sandberg argued that Facebook should not be broken up under antitrust laws, noting that there are safety benefits to having multiple products under the same company's control.
"If you are doing child exploitative content, WhatsApp is encrypted," she said, meaning the app could let criminals exchange material without getting caught. But when those users post publicly on Facebook, the company will know who they are and will be able to suspend their WhatsApp accounts as well.
Schroepfer also said that the company is working on allowing users to delete all information that Facebook has on them, similar to clearing information like cookies and sites visited from a web browser.
- Finally, Sandberg noted that the company is making "huge investments" that will affect profitability in order to make the platform safer and prevent this kind of abuse.
"It's the biggest cultural shift I've seen in the whole history of the company," said Schroepfer.