
YouTube says it will recommend fewer videos pushing conspiracy theories and other misinformation

Key Points
  • The company has come under fire in recent months for highlighting conspiracy theories and politically motivated attack ads alongside otherwise unassuming content.
  • Any improvements to the recommendation system will likely be well received, though they may prove difficult to maintain amid the general polarization of online content.
  • Ultimately, YouTube said, the change will only affect a "very small set of videos." 
Google CEO Sundar Pichai speaks on stage during the annual Google I/O developers conference in Mountain View, California, May 8, 2018.
Stephen Lam | Reuters

Google-owned YouTube said Friday that it will suggest fewer controversial videos as a means to improve its recommendation system.

The company has come under fire in recent months for highlighting conspiracy theories and politically motivated attack ads alongside otherwise unassuming content. YouTube has drawn particular criticism for suggesting extremist videos alongside children's content and for featuring misinformation around national tragedies.

Any improvements to the recommendation system will likely be well received, though they may prove difficult to maintain amid the general polarization of online content.

YouTube will begin "taking a closer look at how we can reduce the spread of content that comes close to — but doesn't quite cross the line of — violating our Community Guidelines," the company said in a blog post.

"To that end, we'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."

This isn't the first time the company has addressed the problem of misleading videos. Last year, YouTube CEO Susan Wojcicki announced that YouTube would start featuring links to Wikipedia and other reliable sources on videos that promoted conspiracy theories. That tactic has not stopped the spread of such videos, however: a recent test from BuzzFeed showed that YouTube's recommendation algorithm still sends people to a wide variety of bizarre and misleading content.

In limiting which videos it recommends, YouTube could face backlash similar to what Facebook and Twitter have faced from conservative posters and political pundits who claim those companies intentionally restrict their reach.

"To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube," the company said. "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users."

Ultimately, YouTube said, the change will only affect a "very small set of videos."

WATCH: YouTube's top earners made $180.5 million in 2018
