Tech

YouTube removed 1.6 million channels last quarter, mostly for being spam or scams

Key Points
  • Google's video platform YouTube removed 7.85 million videos and 1.67 million channels between July and September.
  • This is the first time YouTube has disclosed channel deletions, most of which it said were due to spam or scam content, and the first time it has included information about deleted comments.
  • In the past year, the company has faced increased criticism for pushing users towards conspiracy theories and extreme content.
YouTube CEO Susan Wojcicki speaks onstage during the YouTube Brandcast 2018 presentation at Radio City Music Hall on May 3, 2018 in New York City.
Source: YouTube

Google's battle against inappropriate content on its video platform rages on.

YouTube removed 7.85 million videos and 1.67 million channels between July and September, according to its latest YouTube Community Guidelines enforcement report. This is the fourth such report YouTube has published, but the first to include information about channel removals, not just individual videos. The suspension of those 1.67 million channels also took down an additional 50.2 million videos.

YouTube will delete a channel entirely if it receives three strikes within three months or commits a single egregious violation, like child sexual exploitation. The most high-profile removal of the year came in August, when YouTube deleted the channel of right-wing conspiracy theorist and InfoWars radio host Alex Jones.

YouTube says that most of the videos it removed — 79.6 percent — violated its policies on spam, misleading content or scams, while 12.6 percent were removed for nudity or sexual content. Only about 1 percent of channels were removed for promotion of violence, violent extremism, harassment or hateful or abusive content, although videos of that nature have attracted the most scrutiny in the past year.

The site has recently been the subject of several investigations showing how it highlights extreme content, like conspiracy theories or hyperpartisan points of view, over more measured videos. Google CEO Sundar Pichai was grilled during his congressional testimony earlier this week about a specific conspiracy theory claiming that Hillary Clinton and other politicians and celebrities drink children's blood.

Pichai said that YouTube is "constantly undertaking efforts to deal with misinformation," but that there was "more work to be done."

The crux of the issue is that while YouTube's "Community Guidelines" call for removing videos that "incite harm or violence," the platform does not remove videos simply for containing falsehoods. Although conspiracies, like the infamous #Pizzagate theory that led to a shooting at a Washington, D.C., pizza shop, may ultimately inspire acts of violence, the videos themselves don't explicitly do so, which means YouTube generally won't remove them. In the past year, YouTube has made efforts to surface more authoritative content and has started linking videos that promote conspiracy theories to "fact-based" sources like Wikipedia pages. Late last year, Google vowed to have 10,000 people focused on content violations by the end of 2018, and a spokesperson told CNBC that it's on target to hit that goal.

Beyond the societal concerns, this content has hurt YouTube financially, too. Last March, top brands pulled their ads off the video site after some were found running alongside extremist content.

YouTube said that 80 percent of the videos it removed in the third quarter were first detected by machines and that of those, 74.5 percent never received a single view.

For the first time, YouTube also broke out the number of violative comments it removed: 224 million in the third quarter.

You can view the full report here.

WATCH: Google's Larry Page has backed two flying-car start-ups — here's a look inside one of them
