The team’s 2019 Global Inventory of Organised Social Media Manipulation has been widely covered, including in the following:
The researchers compiled information from news organizations, civil society groups and governments to create one of the most comprehensive inventories of disinformation practices by governments around the world. They found that the number of countries with political disinformation campaigns more than doubled to 70 in the last two years, with evidence of at least one political party or government entity in each of those countries engaging in social media manipulation.
In addition, Facebook remains the No. 1 social network for disinformation, the report said. Organized propaganda campaigns were found on the platform in 56 countries.
The Oxford Internet Institute researchers found that bot accounts are widely used to spread political propaganda (in 80% of the countries studied). However, the use of human agents was even more prevalent (87% of countries).
Bot-human blended accounts, which combine automation with human curation in an attempt to fly under the radar, were much rarer, identified in 11% of countries. Hacked or stolen accounts were found in use in just 7% of countries.
In another key finding from the report, the researchers identified 25 countries working with private companies or strategic communications firms offering computational propaganda as a service, noting that: "In some cases, like in Azerbaijan, Israel, Russia, Tajikistan, Uzbekistan, student or youth groups are hired by government agencies to use computational propaganda."
New York Times Magazine: So the internet didn’t turn out the way we hoped.
State-sponsored disinformation is on the rise. According to the Oxford Internet Institute, the number of countries with political disinformation campaigns nearly doubled to 70 in the last two years or so. Facebook remains the preferred platform for pushing propaganda; organized information operations were found on the social network in 56 countries. Perhaps most terrifying, it has been reported that disinformation tactics are spreading around the world as countries learn from one another.
One of the researchers’ main findings is that Facebook “remains the dominant platform for cyber troop activity,” though “since 2018, we have collected evidence of more cyber troop activity on image- and video-sharing platforms such as Instagram and YouTube. We have also collected evidence of cyber troops running campaigns on WhatsApp.”
In an annual report on disinformation trends, the Oxford Internet Institute’s Computational Propaganda Research Project said Facebook remained the most popular platform for social media manipulation due to the company’s size and global reach.
However, the growth of visual content shared online means that users of Google’s YouTube video platform and Facebook’s Instagram photo-sharing site are increasingly being targeted with false or misleading messages, said Samantha Bradshaw, one of the report’s authors.
“On Instagram and YouTube it’s about the evolving nature of fake news — now there are fewer text-based websites sharing articles and it’s more about video with quick, consumable content,” she said. “Memes and videos are so easy to consume in an attention-short environment.”
“It’s easier to automatically analyse words than it is an image,” Bradshaw said. “And images are often more powerful than words with more potential to go viral.”
India figures in a small group of seven countries — along with China, Iran, Pakistan, Russia, Saudi Arabia, and Venezuela — where state actors use computational propaganda on Facebook and Twitter to influence global audiences, according to a comprehensive report on disinformation campaigns released by the Computational Propaganda project at Oxford on Thursday.
The report found at least seven instances of “cyber troops” in India, with private contractors emerging as the most active “cyber troops” in the country.
By examining Malaysia’s cybertrooper activity, the Oxford Internet Institute said it found evidence of a “medium-capacity” cybertrooper team with formal training and staff estimates of 50 to 2,000 people.
Malaysia’s cybertrooper activity is mainly from fake bot accounts – highly automated accounts designed to mimic human behaviour online, it said in its report.
These include accounts on Facebook, WhatsApp, YouTube and Twitter.
These accounts are used to spread pro-government or pro-party propaganda, attack the opposition in smear campaigns, and suppress participation through personal attacks and harassment.
The study found that this strategy has been employed by government agencies, politicians and political parties, private contractors, civil society organisations, citizens and influencers.