
Facebook, YouTube usage linked to belief in coronavirus conspiracy theories, study finds

Key Points
  • The research, carried out by Ipsos Mori for King's College London, provides an insight into how some of the misconceptions about Covid-19 have gained traction as well as where they are sourced.
  • The study said 60% of those who believe the virus is linked to 5G radiation get their information from YouTube, compared with 14% of those who think this belief is false.
  • It also found that people who use social media to find information on the virus are more likely to have broken lockdown rules.


In this photo illustration a smartphone screen reading "Covid 19" and "coronavirus disease" is displayed as logos of social media applications are seen behind, in Ankara, Turkey on April 2, 2020.
Hakan Nural | Anadolu Agency via Getty Images

People using social media platforms like Facebook and YouTube to find information about the coronavirus are more likely to believe in conspiracy theories about the disease, according to new research out of the U.K.

The study, carried out by Ipsos Mori for King's College London and published Thursday, provides an insight into how some of the misconceptions around Covid-19 have gained traction as well as where they are sourced.

For example, 30% of Britons surveyed in late May thought the coronavirus was likely created in a lab, up from 25% at the start of April, while 8% believed the symptoms most people attribute to Covid-19 are connected to 5G radiation. A smaller minority (7%) believed there is no hard evidence that the coronavirus exists. Each of these claims has been dismissed by scientists.

The study said 60% of those who believe the virus is linked to 5G radiation get their information from YouTube, compared with 14% of those who think this belief is false. Meanwhile, 56% of people who believe there's no hard evidence Covid-19 exists use Facebook to source their information, nearly three times the 20% share among those who believe otherwise.

Bogus beliefs surrounding 5G in particular have led to real-world consequences. Dozens of telephone masts in Europe have been set alight while telecom engineers have been harassed on the streets by people claiming the technology is in some way linked with the disease. This has led to calls from authorities for social media firms to do more to counter misinformation about the pandemic.

One of the most commonly held views is that 5G, the fifth generation of mobile internet, weakens people's immune systems, making them prone to contagion. However, health fears about wireless networks are not new, and scientists have slapped down the suggestion that 5G poses risks to human health.

The research, published in the peer-reviewed medical journal Psychological Medicine, found a strong link between social media usage and false beliefs about Covid-19. The findings were based on three separate surveys conducted online from May 20 to May 22 and involved 2,254 interviews with U.K. residents aged 16-75.

The study also found that people who use social media to find information on the virus are more likely to have broken lockdown rules that have been enforced in an effort to contain it. Researchers said 58% of those who had gone outside despite having Covid-19 symptoms use YouTube as their main information source, much higher than the 16% of those who hadn't. And 37% of people who had friends or family visit them in their home cite Facebook as a key source, compared with 23% of those who hadn't.

"This is not surprising, given that so much of the information on social media is misleading or downright wrong," said Daniel Allington, senior lecturer in social and cultural artificial intelligence at King's College London.


"Now that some of the lockdown rules are being relaxed, people will have to make more and more of their own decisions about what is safe or unsafe — which means that access to good-quality information about Covid-19 will be more important than ever. It's time for us to think about what action we can take to address this very real problem."

Facebook and YouTube both say they remove certain types of misinformation about the coronavirus, such as fake treatments and suggestions that it's linked to 5G technology. Both platforms also work with health authorities like the World Health Organization and Britain's National Health Service to display accurate information about the virus.

A Facebook spokesperson said: "We have removed hundreds of thousands of pieces of Covid-19-related misinformation that could lead to imminent harm, including posts about false cures, claims that social distancing measures do not work, and that 5G causes coronavirus."

A YouTube spokesperson said: "We're committed to providing timely and helpful information about Covid-19 during this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using WHO data and the NHS resources, to help combat misinformation."