COVID-19 Misinformation Newsletter 26 October 2021
Vaccine hesitancy becomes vaccine resistance
The COVID-19 Misinformation Newsletter is prepared by the staff of the Programme on Democracy and Technology (DemTech) at Oxford University. We summarise the latest independent research and high-quality news reporting on the production and consumption of computational propaganda and campaigns to manipulate public understanding of the health crisis. The newsletter is edited by Dr Aliaksandr Herasimenka. It is a two-minute read.
YouTube is expanding its medical misinformation policies to include “misinformation about vaccines in general.” The platform will now prohibit content which “falsely alleges that approved vaccines are dangerous.”
TikTok has been criticised for the abundance of ‘conspiracy theory’ content relating to COVID-19 available to younger audiences. Researchers at NewsGuard published examples of misinformation content on the platform in June. However, as of October, the content was still available, the Guardian notes. “The more anti-vaccine content kids interact with, the more anti-vaccine content they’ll be shown,” a representative of NewsGuard observed.
Farmers, ranchers, and veterinarians in the US are facing shortages of the anti-parasite drug ivermectin, driven by misinformation promoting the drug as a cure for COVID-19. Demand persists despite warnings from health officials. Farmers warn the shortage leaves them ill-equipped to protect livestock from parasitic disease. According to the New York Times, groups promoting ivermectin “continue to flourish” on Facebook.
Vaccine hesitancy is heavily affected by demographics, political opinions and media use patterns. According to a survey by Indiana University’s Observatory on Social Media, US participants who were vaccine hesitant were more likely to be male, white, Republican and younger. They also tended to favour YouTube over Twitter. “Vaccine hesitancy has become vaccine resistance,” noted James Shanahan, one of the authors of the study.
On Facebook, misinformation spreaders are far more likely than fact-checkers to occupy central positions in the misinformation URL co-sharing network. According to a study published in Harvard Kennedy School (HKS) Misinformation Review, this demonstrates the remarkable ability of spreaders of misinformation to coordinate their communication strategies.
Certain platform features make some social media platforms more fertile ground for conspiracy beliefs related to the COVID-19 pandemic than others. According to a study based on cross-national survey data published in New Media & Society, the features of Twitter are associated with lower levels of conspiracy beliefs, while the features of YouTube and Facebook are associated with higher levels of these beliefs.
Sign up for the DemTech Newsletters on COVID-19 Misinformation and China Information Operations