Facebook won’t be drawn on whether it’s changing its policies on images of child violence after leaks

[Image: Facebook CEO Mark Zuckerberg. Getty Images]

Facebook has been under scrutiny this week after its policies dealing with sexual and violent imagery and videos were leaked and published by the Guardian, including details of guidelines on violence against children.

But it won't say whether changes to specific policies will be made as a result of the leak. The documents suggest that some images of non-sexual physical abuse and bullying of children do not have to be deleted, and Facebook says that publishing some of these can help children to be rescued. But U.K. children's charity the NSPCC told the Guardian that it wants extremist content to be taken down automatically, using algorithms.

Facebook has a team of experts who review its policies relating to all types of imagery, which is understood to include adapting them to changing societal attitudes. Context and culture are important in deciding whether to remove something, Monika Bickert, Facebook's head of global policy management, wrote in an op-ed for the Guardian, citing the example of criticizing the monarchy, which may be acceptable in the U.K. but is illegal in other countries.

But Nicola Mendelsohn, Facebook's EMEA vice president, when asked directly, did not say whether or how the company will adapt its policies relating to violence against children.


"We've said it a lot, but it is definitely worth saying it again that keeping people safe is the most important thing that we do, and we are all working hard to make sure that Facebook is as safe as possible," she told CNBC by phone.

Mendelsohn reiterated Zuckerberg's announcement earlier this month that a further 3,000 content reviewers would be hired, in addition to the 4,500 it currently has, to look at the "millions of reports" it receives weekly.

"So that doesn't just mean bringing more people in, it also means we need to keep refining and building better tools in order to keep the community safe as well, as well as continuing to make it easier and simpler for people to actually report to us," she added.

"We are a tech company, we are always learning, we are always adapting and we are always building, that's at the heart of what we are, but at the heart of it the most important thing is that we keep people safe, that's what matters more than anything."

Facebook was heavily criticized in September 2016 for removing the "Terror of War" photo, which shows a naked girl running away from a napalm attack during the Vietnam War, and Mendelsohn said policies were changed as a result.

"The 'Terror of War' picture, we did learn and we did change our policy. It's a really interesting example where most people would not want a naked picture of a young child in any of their feeds, but give it a historical context and it changes the nature of the conversation," she told CNBC.

Asked whether imagery reaches her for a final decision, Mendelsohn said: "It's the work of the community operations team who are properly trained to look at different things, so that is their area to look after."

Other tech companies such as Google have also been under scrutiny for violent and extreme content that appears online, and in March Google apologized to brands whose ads had appeared next to such videos on YouTube.
