Marketing.Media.Money

Facebook, YouTube and Twitter may get only one hour to remove extreme content under new European rules

Key Points
  • Social media sites are likely to have only one hour to remove terrorist content in rules being drafted by the European Union, according to the Financial Times.
  • Removal of such content in an hour is currently voluntary, but draft rules by the European Commission that would force companies to remove it within that time frame are set to be published next month, the newspaper reported.

Social media sites are likely to have only one hour to remove terrorist content in rules being drafted by the European Union, according to the Financial Times.

Removal of such content in an hour is currently voluntary, but draft rules by the European Commission that would force companies to remove it within that time frame are set to be published next month, the newspaper reported.

Companies such as Facebook, YouTube and Twitter would have to remove the content within an hour of it being flagged as illegal by law enforcement bodies. The draft regulation would apply to websites of all sizes, according to the EU Commissioner for Security Julian King. He said that policies for the removal of videos and other posts were not always clear, telling the FT: "All this leads to such content continuing to proliferate across the internet, reappearing once deleted and spreading from platform to platform."

[Video: Facebook taking down more content on terrorist groups]

The EU has previously said that illegal content is particularly damaging when it is first published. "Terrorist content is most harmful in the first hours of its appearance because of its fast spreading and entails grave risks to citizens and society at large," it stated on its website in March, as it tightened its voluntary guidelines.

Facebook said in an April blog post that it removed 1.9 million pieces of ISIS and Al-Qaeda content during the first quarter of 2018, with 99 percent taken down before being reported. Google-owned YouTube uses machine learning to identify and remove videos, with 90 percent identified by computers. Google said in April that it expected to have 10,000 people working to address content that violates its guidelines by the end of 2018.

YouTube and Facebook declined to comment on the draft regulation, while Twitter had not responded to CNBC's request for comment.

Read the full report by the Financial Times here.