Covid-related misinformation on YouTube: The spread of misinformation videos on social media and the effectiveness of platform policies
Summary
For this memo, we identified all Covid-related videos that circulated on social media but that YouTube eventually removed because they contained false information. Between October 2019 and June 2020, there were 8,105 such videos, less than 1% of all YouTube videos about the coronavirus. We find that:
- On average, it took YouTube 41 days to remove videos containing false information, based on the subset of videos for which this data was available.
- Surprisingly, Covid-related misinformation videos do not find their audience through YouTube itself, but largely by being shared on Facebook.
- Facebook placed warning labels about false information only on 55 videos, less than 1% of the misinformation videos shared on the platform.
- Misinformation videos were shared almost 20 million times on social media, more than the combined shares gathered by the five largest English-language news sources on YouTube (CNN, ABC News, BBC, Fox News and Al Jazeera).
Aleksi Knuutila, Aliaksandr Herasimenka, Hubert Au, Jonathan Bright & Philip N. Howard. “COVID-related misinformation on YouTube: The spread of misinformation videos on social media and the effectiveness of platform policies.” COMPROP Data Memo 2020.6, 21.09.2020. Oxford, UK: Project on Computational Propaganda. demtech.oii.ox.ac.uk