- Google outlines four new steps it's taking to fight online extremist content on YouTube.
- These include boosting resources to develop technology to identify terrorist-related content and making it harder to discover videos that contain inflammatory religious or supremacist content.
- The move comes after internet firms received criticism from politicians for not dealing with extremist content quickly enough.
Google has outlined four steps it's taking to fight the spread of extremist material on its YouTube video service.
Kent Walker, general counsel at Google, said Sunday that the U.S. technology giant is "committed to being part of the solution" in tackling online extremist content.
"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all," Walker wrote in a blog post.
"There should be no place for terrorist content on our services."
The four new steps are:
- Putting more engineering resources into further developing artificial intelligence software that can be trained to identify and remove extremist content.
- Expanding the number of independent experts in YouTube's Trusted Flagger program. Google will add 50 expert non-government organizations to the 63 organizations that are already part of the program, and support them with additional grants. Google said Trusted Flagger reports are accurate over 90 percent of the time.
- Taking a tougher stance on videos that do not clearly violate YouTube's rules. For example, a video containing inflammatory religious or supremacist content will appear behind a warning and will not be monetized, recommended or open to user comments. The aim is to reduce engagement with these videos so they are harder to find.
- Working with Jigsaw, the company behind "The Redirect Method," which uses ad targeting to steer potential ISIS recruits toward anti-terrorist videos that could change their minds about joining extremist organizations. Google said that in previous trials of this system, potential recruits clicked through on the ads at an "unusually high rate" and watched over half a million minutes of video content that "debunks terrorist recruiting messages."
The latest measures build on Google's previous efforts to fight extremist content on its platform, amid broader criticism of internet companies from politicians.
U.K. Prime Minister Theresa May urged technology companies to do more to tackle online extremism following the Manchester Arena bombing last month. May, alongside French President Emmanuel Macron, even said they would look at proposals to fine internet companies that fail to take down such content.
Google admitted that the problem of fighting terrorist content online is tough, but said it is committed to doing more.
"Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free. We must not let them," Walker said.
"Together, we can build lasting solutions that address the threats to our security and our freedoms. It is a sweeping and complex challenge. We are committed to playing our part."