Tech

YouTube says it will add more people to vet extremist content on its platform

Key Points
  • Alphabet-owned YouTube said it will add more people to review and remove violent or extremist content from the video platform
  • YouTube CEO Susan Wojcicki said in a blog post that the company is also launching new comment moderation tools and in some instances shutting down comments entirely
  • The aim, according to the blog post, is to increase the total number of people across Google who are working to remove content that violates the company's policies

Alphabet's YouTube said Monday it plans to add more people next year to review and remove violent or extremist content on the video platform.

YouTube is tightening its policies and expanding its enforcement teams to protect users from inappropriate content, YouTube CEO Susan Wojcicki said in a blog post.

"We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether," Wojcicki said.

The goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018, she said.

YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.

YouTube has faced mounting criticism from advertisers, regulators and advocacy groups for failing to police content and to account for the way its services shape public opinion.