Facebook leaks reportedly reveal why it allows some videos of suicide and deaths to stay online

Key Points
  • Facebook's internal guidelines on dealing with sex and violence were leaked and published by The Guardian.
  • The documents outline what is acceptable and what is inappropriate on the social networking site.
Facebook's internal guidelines on how it deals with a swathe of posted content, from violence to revenge porn, have been leaked and published by The Guardian.

The documents reveal the often strenuous conditions moderators work under, which include having to decide in as little as 10 seconds whether a post should be removed, according to the report.

The Guardian said it had seen more than 100 internal training manuals, spreadsheets and flowcharts on how Facebook moderators handle issues such as violence, hate speech, terrorism, pornography, racism and self-harm.

According to the documents, moderators are told:

  • Comments such as "someone shoot Trump" should be taken down because he is a head of state. But a comment such as "To snap a b----'s neck, make sure to apply all your pressure to the middle of her throat" is acceptable because it is not seen as a credible threat.
  • Videos of violent deaths may not always need to be taken down because they can help "create awareness".
  • Videos of abortion violate the rules only if they contain nudity.
  • Livestreams of attempts to self-harm should be left up, with the footage removed only "once there's no longer an opportunity to help the person," The Guardian reports.

"Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously," Monika Bickert, head of global policy management at Facebook, said in a statement on Monday.

"Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly."

Facebook has come under heavy criticism in recent months over the way it has tackled inappropriate content on its platform.

Read the full Guardian report here.