- TikTok says it has deleted roughly 29,000 coronavirus-related videos in Europe for breaking its rules.
- Approximately 3,000 of those contained medical misinformation.
- TikTok and other social media platforms like Facebook and YouTube are attempting to fight the spread of misinformation.
TikTok said late Thursday that it had deleted roughly 29,000 coronavirus-related videos in Europe for breaking its rules.
The Chinese-owned app, which allows users to post short video clips, says it does not allow misinformation that could harm people's health to be shared on its platform.
Approximately 3,000 of the clips contained medical misinformation, TikTok said.
"Based on our analysis, our overall conclusion is that we have seen low levels of content that violated our Community Guidelines related to Covid-19 in Europe," wrote Theo Bertram, TikTok's European director of government relations and public policy, in a blog post.
Bertram added: "The total number of videos related to Covid-19 and the total number that violated our policies grew to a peak in March 2020, declining in April 2020, and falling significantly during May and June 2020."
TikTok shows a banner with the words "Learn the facts about Covid-19" on videos that contain words, hashtags or music related to coronavirus. The banner redirects users to verifiable, trusted sources of information. The company said the banner had been displayed on over 7 million videos in Europe.
Chris Stokel-Walker, internet culture writer and author of the book "YouTubers," told CNBC that the figures aren't all that surprising.
"The app is a huge video platform," he said. "We know that pre-pandemic, at least 27 million videos were uploaded a day, and third-party data indicates that it's only become more popular during lockdown."
He added that TikTok had a "very strong" A.I. and human moderation department, which often removed videos before they were seen by anyone.
"[TikTok], like many other platforms at present, recognizes the need to provide factual information about something that's literally a life and death situation, and conversely, not to spread disinformation about it," he said.
In the second half of 2019, TikTok removed a total of 49 million videos, according to its second transparency report, which it published last week. Globally, the main reason for removal was "adult nudity and sexual activities," with one in four of the deleted videos removed for this reason in December.
Fewer than 1% of all videos published on the platform are removed for content violations, TikTok said.