KEY POINTS
  • Facebook rolled out a new machine learning tool last year to identify images with both children and nudity.
  • The tool helped moderators remove 8.7 million images containing child nudity over the last quarter.
  • The COO of the National Center for Missing and Exploited Children said the group expects 6 million more tips from Facebook and other tech companies than it received last year.
Facebook founder and CEO Mark Zuckerberg.

Facebook said on Wednesday that its moderators removed 8.7 million user images of child nudity during the last quarter, aided by previously undisclosed software that automatically flags such photos.

The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualized context.
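
Facebook has not published the tool's design. Purely as an illustration of the "both signals" idea described above, the sketch below combines the outputs of two hypothetical classifiers, one scoring nudity and one scoring the likely presence of a minor, and flags an image for human review only when both scores are high. All names, scores, and thresholds here are assumptions for illustration, not Facebook's actual system.

```python
from dataclasses import dataclass

# Illustrative thresholds -- not values disclosed by Facebook.
NUDITY_THRESHOLD = 0.8
MINOR_THRESHOLD = 0.8


@dataclass
class ImageScores:
    """Outputs of two hypothetical classifiers for a single image."""
    nudity_score: float  # estimated probability the image contains nudity
    minor_score: float   # estimated probability the image contains a child


def should_flag(scores: ImageScores) -> bool:
    """Flag an image for human review only when BOTH signals are high,
    mirroring the article's description of a tool that looks for images
    containing both nudity and a child."""
    return (scores.nudity_score >= NUDITY_THRESHOLD
            and scores.minor_score >= MINOR_THRESHOLD)


if __name__ == "__main__":
    # High nudity score but low minor score: not flagged by this rule.
    print(should_flag(ImageScores(nudity_score=0.95, minor_score=0.10)))  # False
    # Both signals high: flagged for review.
    print(should_flag(ImageScores(nudity_score=0.92, minor_score=0.88)))  # True
```

In practice such a flag would route the image to human moderators rather than trigger automatic removal; the two-signal design reflects the reported goal of catching sexualized images of minors while limiting false positives on ordinary photos.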