- Facebook will start warning users if they liked, reacted to, or commented on Covid-19 misinformation that the company has since removed.
- The feature will roll out in the coming weeks.
- The company said it's removed hundreds of thousands of pieces of Covid-19 misinformation that could lead to physical harm.
The feature will roll out in the coming weeks, Facebook said in a blog post.
"These messages will connect people to COVID-19 myths debunked by the World Health Organization including ones we've removed from our platform for leading to imminent physical harm," said Guy Rosen, Facebook's vice president of integrity, in the post.
After the WHO declared Covid-19 a global health emergency in January, Facebook started removing misinformation about the outbreak from its platforms. The company said Thursday that it has removed hundreds of thousands of pieces of misinformation that could lead to physical harm, such as claims that physical distancing is ineffective or that drinking bleach cures the virus.
"We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook," Rosen said.
Facebook, which has been criticized for its handling of health issues, has made several coronavirus-related adjustments to its platform over the past few months.
For example, it has increased the number of partners working on fact-checking to limit the spread of false claims. It also started showing pop-ups on Facebook and Instagram that link to official health resources; those pop-ups have directed more than 2 billion people to information from health authorities.