YouTube updated its medical misinformation policy Wednesday to cover Covid-19 vaccine content that contradicts expert consensus.
"In the coming weeks, we will also have more to announce on the work we're doing to raise authoritative sources on our site related to Covid-19 vaccine content," the Alphabet-owned company said in a statement.
The policy expansion comes as Google braces for a wave of misinformation it expects, and is already seeing, on its video platform. The company cited examples of false claims, such as that a vaccine will kill people or cause infertility, or that microchips will be implanted in people who receive the vaccine.
Google's YouTube, like other social media platforms, is under pressure to contain misinformation that has proliferated since the coronavirus pandemic began in early 2020. Drug companies are testing vaccines, but most don't expect the public to have access until mid-2021.
Google has long prohibited anti-vaccine content in its ads and has barred publishers from monetizing such content under its "inappropriate content" policies.
The company said that when it comes to a potential Covid-19 vaccine, it will take action against false claims about vaccines, or ads that discourage vaccination, under an existing policy that prohibits content contradicting authoritative scientific consensus on a current major health crisis. YouTube said it has removed more than 200,000 videos containing dangerous or misleading Covid-19 information since early February.