YouTube issued a strike against controversial conspiracy theorist Alex Jones' channel and removed four of his videos from the site.
Jones claimed the videos concerned were "critical of liberalism," and linked to a statement on his website, InfoWars, which displays the videos that were removed.
YouTube did not comment directly on the warning against Jones, or the removal of his videos, but highlighted its content guidelines concerning the endangerment of children and hate speech.
"We have long-standing policies against child endangerment and hate speech," a spokesperson for YouTube said in an emailed statement. "We apply our policies consistently according to the content in the videos, regardless of the speaker or the channel. We also have a clear three strikes policy and we terminate channels when they receive three strikes in three months."
In one of the removed clips, titled "How to Prevent Liberalism — A Public Service Announcement," a man can be seen briefly choking and shoving a child to the ground. Two others criticize Islam and the migration of Muslims to Europe.
As a result of the strike, Jones' YouTube channel will be unable to live stream for 90 days.
YouTube operates on a "three strikes and you're out" policy. If an account is given a second strike within three months of receiving the first, it is unable to post videos for two weeks. Video makers who receive a third strike within a three-month period are banned.
Earlier this week, the controversial online personality, a supporter of President Donald Trump, appeared to threaten Special Counsel Robert Mueller, claiming that the former FBI chief covered up sex crimes. Jones called Mueller a "monster" and "a demon I will take down." In response to backlash over the comments, Jones called the controversy a "mainstream media hoax."
The strike against Jones is the latest example of online content platforms' ongoing struggle to deal with misinformation shared by their users.
Multiple tech giants have been criticized over how they handle politically charged content. On one hand, right-wing commentators say platforms like Facebook and YouTube suppress and censor conservative viewpoints; on the other, mostly left-leaning critics have lambasted tech firms for not doing enough to tackle the spread of fake news and other misinformation.
The use of personal data and algorithms to target consumers with tailored content has also proved a sticking point for many, especially when it comes to politics. Facebook, for example, has been embroiled in scandal over how it let a developer share the data of 87 million people with controversial analytics firm Cambridge Analytica — a scandal that appears to have had a palpable effect on the firm's revenues.
It is feared that the prevalence of misinformation online, and the way it is fed to consumers, played a role in the outcomes of both the 2016 U.S. presidential election and the U.K.'s Brexit referendum.
Meanwhile, YouTube owner Google was hit with a $5 billion antitrust fine from the European Union last week over the alleged abuse of the dominance of its Android mobile operating system. Alphabet, Google's parent company, reported better-than-expected earnings earlier this week, despite the fine. But concerns linger over how the EU's order that Google unbundle its apps from Android might affect the company's business in the long term.