Covid-19 has taken over our lives, but it hasn't been around for very long. Scientists have been studying it in real-time as the world collectively goes through a traumatic, world-changing event. It's killed more than half a million people—and we don't even know exactly where it came from.
Which, on the face of it, isn't very reassuring. As a result, conspiracy theories have spread alongside the virus. And for companies like Google, Facebook, and Twitter, longstanding questions about how they should handle misinformation on their platforms have never been more relevant.
Especially because social media seems to be actively making things worse. "The more you rely on social media for your news, the more likely you are to be prone to this dynamic where you're not only failing to identify fake news as fake, but factual information as true," explained Cornell government professor Sarah Kreps.
How to best fight misinformation is complicated, messy, and deeply intertwined with politics and cultural values.
In April, 47% of Republicans believed that the coronavirus had been made a bigger deal than it really is. By June, that figure had grown to 63%. Among Democrats, it went from 14% to 18%.
In a recent Cornell study co-authored by Sarah Kreps and now in preprint, Democrats were consistently more likely than independents or Republicans to correctly identify a Covid-related headline as true or false. Yet on average, Republicans and Democrats hold similar levels of science knowledge.
So why is this happening? Watch the video to find out more—and what companies are doing about it.