Funding Partners


    Strengthening Digital Democracy Luminate Foundation, Inc. (2018)

    The DemTech team was established in January 2016 as the Computational Propaganda Project, and the gift from Luminate (then the Omidyar Network) commenced in June 2018. This support is for a three-part programme of activity that greatly enhances our ability to track trends with regularity, respond rapidly when new opportunities arise, and provide independent research and advice to policy makers and civil society groups.


    • Professor Philip Howard, University of Oxford
    • Funder: Luminate Foundation Inc. (previously the Omidyar Network) 
    • Award: £666,084.28
    • Proposal Number: Luminate 2018
    • Dates: 2018-2021

    Restoring Trust in Social Media Civic Engagement 2017

    Increasingly, social media platforms have become tools for manipulating public opinion during elections. Political actors make use of technological proxies in the form of proprietary algorithms and semi-automated social actors—political bots—in subtle attempts to manipulate public opinion. Through the ERC COMPROP Consolidator award, researchers have demonstrated that even simple bots (i) effectively keep negative messages and fake news in circulation longer, (ii) target journalists and civil society groups, and (iii) operate with little oversight from social media firms. Such bots have negative consequences both for public trust in technology innovation and for the quality of public deliberation in Europe’s democracies. ERC researchers have been able to identify highly automated, politically manipulative social media accounts post hoc, and this Proof of Concept project will allow researchers to take what we have learned and produce an online tool that allows the public to evaluate suspicious social media accounts. Most social media platforms are slow to address troll and bot activity, so this innovative tool will put ERC research into public service in Europe and around the world.


    • Professor Philip Howard, University of Oxford
    • Funder: European Research Council
    • Award: €149,132
    • Proposal Number: 767454
    • Dates: 2017-2019
    • Proposal Text

    Misinformation, Science and Media 2017

    In this new, three-year programme, researchers from the Oxford Internet Institute and the Reuters Institute for the Study of Journalism will examine the interplay between systematic misinformation campaigns, news coverage, and increasingly important social media platforms for public understanding of science and technological innovation. In some key domains of public life there appear to be coordinated efforts to ruin the reputation of science and innovation. Scientists now protest in the streets just to get policymakers to embrace evidence-based policy making. Long-held consensus on the causes and consequences of climate change, tobacco-induced cancers, and the value of public health strategies increasingly seems open for debate. We have political leaders who claim to be unable to discern what expert consensus is—even when experts organize to make explicit statements about levels of confidence and certainty around particular areas of research. Social media platforms have become a powerful venue for those aiming to deflate public support for action based on reliable research, leaving previously trusted technological innovations with negligible impact. We aim to increase our understanding of the role of “junk science” and fake news in influencing—or even undermining—public understanding of scientific issues, and to develop evidence-based recommendations for scientists, journalists, and policymakers interested in effective science communication in the 21st century.


    • Funder: Oxford Martin School

    Computational Propaganda 2015

    Social media can have an impressive impact on civic engagement and political discourse. Yet increasingly we find political actors using digital media and automated scripts for social control. Computational propaganda—through bots, botnets, and algorithms—has become one of the most concerning impacts of technology innovation. Unfortunately, bot identification and impact analysis are among the most difficult research challenges facing the social and computer sciences. DemTech objectives are to advance a) rigorous social and computer science on bot use, b) critical theory on digital manipulation and political outcomes, and c) our understanding of how social media propaganda impacts social movement organization and vitality. This project will innovate through i) “real-time” social and information science actively disseminated to journalists, researchers, policy experts and the interested public, ii) the first detailed data set of political bot activity, and iii) a deepened regional expert network able to detect bots and their impact in Europe. DemTech will achieve this through multi-method and reflexive work packages: 1) international qualitative fieldwork with teams of bot makers and computer scientists working to detect bots; 2a) construction of an original event data set of incidents of political bot use and 2b) treatment of the data set with fuzzy set and traditional statistics; 3) computational theory for detecting political bots; and 4) a sustained dissemination strategy. This project will employ state-of-the-art “network ethnography” techniques, use the latest fuzzy set / qualitative comparative statistics, and advance computational theory on bot detection via cutting-edge algorithmic work enhanced by new crowd-sourcing techniques. Political bots are already being deployed over social networks in Europe. DemTech will put the best methods in social and computer science to work on assessing the size of the problem and identifying possible solutions.


    • Funder: European Research Council

    The Production and Detection of Bots 2014

    Political bots are manipulating public opinion over major social networking applications. This project enables a new team of social and information scientists to investigate the impact of automated scripts, commonly called bots, on social media. The PIs will study both the bot scripts and the people making such bots, and then work with computer scientists to improve the way we catch and stop such bots. Experience suggests that political bots are most likely to appear during an international crisis, and are usually designed to promote the interests of a government in trouble. Political actors have used bots to manipulate conversations, demobilize opposition, and generate false support on popular U.S.-based sites like Twitter and Facebook, as well as on China’s Sina Weibo.


    • Professor Philip Howard, University of Oxford
    • Funder: National Science Foundation
    • Award: $218,825.00
    • Proposal Number: 8060
    • Dates: 2014-2016
    • Proposal Text