
Impact case study database

The impact case study database allows you to browse and search for impact case studies submitted to the REF 2021.

Combatting ‘Viral’ Misinformation: Enabling Journalists, Advocacy Organisations and Policy Makers to Understand and Counter Harmful and Hateful Online Content

1. Summary of the impact

The public profile and impact of King’s Digital Humanities research on the online circulation of ‘viral’ misinformation has been heightened in the context of what the World Health Organization (WHO) has called a global coronavirus ‘infodemic’. Via three strands of research, King’s has enabled journalists, advocacy organisations and policymakers to understand and counter the impact of harmful and hateful online content. Digital methods developed at King’s have empowered major international newsrooms to investigate the spread of online ‘fake news’. King’s research on conspiracy theories and hate speech has underpinned civil society campaigns that have influenced social media platform policies and led to the permanent ‘de-platforming’ of high-profile purveyors of harmful misinformation. King’s research on the public health implications of social media misinformation has been drawn on by the UK’s Department for Digital, Culture, Media and Sport (DCMS) and referenced in the Scientific Advisory Group for Emergencies’ (SAGE) pandemic documentation. King’s digital research has also directly informed the development of the concept of ‘hateful extremism’, which is now central to the work of the UK Government’s Commission for Countering Extremism.

2. Underpinning research

King’s researchers have used digital research methods to map how misinformation and disinformation circulate online, producing new insights into digital platform infrastructure. King’s researchers have also evidenced the harmful consequences of online misinformation in the context of the coronavirus pandemic, and the ongoing digital transformation of the public sphere.

Collaborative, practice-based research at King’s has illuminated the role of online platform infrastructure in the circulation of viral misinformation, and developed digital methods for investigating the phenomenon

King’s researchers (Bounegru, Gray and Venturini) have extensive industry-collaborative research experience in the field of data journalism. Their work on data visualisation practices has demonstrated how non-conventional formats such as network graphs can be used in journalistic investigations [1]. The development of these techniques led to further research to build new methodological tactics for exploring the infrastructure of viral online mis/disinformation – also known as ‘fake news’. This research demonstrates the importance of the link economy, the metrification of engagement through ‘likes’ and the tracker economy for the circulation of misinformation. Conclusions here show the importance of shifting focus away from the misleading content of fake news, drawing attention instead to the technical conditions of its circulation [2]. These insights also emerged within the Field Guide to ‘Fake News’ [3], a research project whose investigation, compilation and writing was led by Bounegru, Gray and Venturini. The material in the Field Guide was produced through research workshops and ‘data sprints’: a collaboration format drawing on approaches associated with open-source software development, open data and civic hacking in order to convene different participants to co-produce data and research projects. The innovative 216-page Field Guide is presented as a usable, open-access set of original digital methods for the practical investigation and visualisation of flows of online misinformation. Designed to appeal to journalistic, civil society, policy-orientated and academic researchers, the Field Guide presents practice-based research findings in the form of ‘recipes’: original digital methods that can, for instance, enable users to map ‘fake news hotspots’ on specific social media platforms, or use websites’ tracker signatures to uncover the techno-commercial infrastructure of fake news sites that accelerates the viral spread of misinformation.
The Columbia Journalism Review (7 April 2017) describes how the accessible, practical ‘recipes’ of the Field Guide use “beautifully produced graphics [that] look like workflows to aid the reader step by step”. Highlighting how the Guide brings together research insights on the modern media environment with original user-orientated digital methods, the reviewer notes that “the reader who goes through these exercises with patience will not only discover stories, and grow in understanding of how news travels these days, but also gain a crucial set of skills for understanding media consumption in the new era”.
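One of the Field Guide’s ‘recipes’ involves using websites’ tracker signatures to reveal the shared techno-commercial infrastructure behind networks of sites. A minimal Python sketch of that idea follows; the site names and tracker IDs are invented for illustration, and the Guide’s actual recipes use dedicated tools and real crawled data:

```python
from collections import defaultdict

# Hypothetical mapping of sites to tracker IDs found in their page source
# (e.g. analytics or ad-network identifiers); real investigations extract
# these by crawling pages and scanning for known tracker patterns.
site_trackers = {
    "example-news-a.com": {"UA-111111", "AD-NET-42"},
    "example-news-b.com": {"UA-111111"},
    "example-news-c.com": {"UA-222222", "AD-NET-42"},
    "unrelated-site.com": {"UA-333333"},
}

def group_by_shared_tracker(site_trackers):
    """Invert the mapping: tracker ID -> set of sites that embed it.
    A tracker shared by two or more sites suggests shared ownership or
    shared monetisation infrastructure."""
    tracker_to_sites = defaultdict(set)
    for site, trackers in site_trackers.items():
        for tracker in trackers:
            tracker_to_sites[tracker].add(site)
    return {t: s for t, s in tracker_to_sites.items() if len(s) > 1}

shared = group_by_shared_tracker(site_trackers)
for tracker, sites in sorted(shared.items()):
    print(tracker, "->", sorted(sites))
```

In practice the resulting tracker–site relationships are visualised as a network graph, which is where the Field Guide’s emphasis on non-conventional visual formats comes in.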

King’s research has demonstrated the role of social media platform interfaces in the circulation of conspiracy theories

Also looking at digital platform infrastructures, Allington’s research into hateful misinformation has revealed how online platform features – such as the ranking of comments by popularity – can insulate harmful conspiracy theories from rational challenge [4]. This research deconstructed the hateful fantasies of Britain’s most high-profile conspiracy theorist, David Icke, and revealed how YouTube’s commenting interface actively disadvantaged critical responders and had the effect of amplifying misinformation and bigotry.
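The mechanism identified here can be sketched in a few lines of Python. The comments, like counts and ‘critical’ labels below are invented for illustration; the point is simply that a popularity-ranked interface surfaces whatever a video’s sympathetic core audience has already liked, pushing later critical replies out of view:

```python
# Hypothetical comment data: (text, like_count, is_critical).
comments = [
    ("Wake up, this proves everything!", 950, False),
    ("He is telling the truth they hide from us", 640, False),
    ("This claim is demonstrably false; see the linked fact-check", 12, True),
    ("These sources do not say what the video claims", 8, True),
]

# "Top comments" ordering, as used by popularity-ranked interfaces:
# sort by like count, highest first.
top_first = sorted(comments, key=lambda c: c[1], reverse=True)

# Critical responses, which arrived later and gathered fewer likes,
# end up at the bottom of the ranking.
print([text for text, _, _ in top_first])
```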

King’s research has evidenced the harmful public health and wider societal impacts of viral misinformation and hateful conspiracy beliefs

Allington has used quantitative social science methodologies to study conspiracy theories as a particularly harmful kind of misinformation. These methods include content analysis, surveys and quantitative text analysis, combined with various forms of statistical modelling [4,5]. Through systematic analysis of user comments and survey responses, Allington’s work finds evidence that social media misinformation may be having a measurable and harmful effect on attitudes. The research shows a negative relationship between coronavirus conspiracy beliefs and health-protective behaviours, and a positive relationship between coronavirus conspiracy beliefs and the use of social media as a source of information about the pandemic [5]. These findings are featured in one of the most-read articles ever published in the influential journal Psychological Medicine (impact factor 5.813). Allington’s research on online extremist discourses has also demonstrated broader socio-political harms caused by conspiracist thinking, including diversion from constructive democratic politics. In an independently peer-reviewed study for the UK Government’s Commission for Countering Extremism, Allington developed a survey instrument to examine the relationship between revolutionary far left ideology and sympathies for political violence [6]. The research emphasised that far left groups in the UK are not directly engaged in and do not advocate violence, but highlighted the dangers of conspiracist thinking and found a link between certain kinds of ‘radical’ attitudes and sympathy for violent extremist tactics.
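The kind of relationship the survey research tests for can be illustrated with a toy correlation calculation. The response pairs below are invented and far smaller than the study’s sample, and the published analysis used fuller statistical modelling rather than a single coefficient:

```python
import statistics

# Hypothetical survey responses: each pair is (conspiracy-belief score,
# count of health-protective behaviours reported by the respondent).
responses = [
    (1, 9), (2, 8), (2, 9), (3, 7), (4, 6),
    (5, 5), (5, 6), (6, 4), (7, 3), (8, 2),
]

def pearson(pairs):
    """Pearson correlation coefficient between the two columns of `pairs`."""
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(responses)
print(f"r = {r:.2f}")  # strongly negative for this illustrative sample
```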

3. References to the research

  1. Bounegru, L., Venturini, T., Gray, J. and Jacomy, M. (2016). Narrating networks: exploring the affordances of networks as storytelling devices in journalism. Digital Journalism, 5(6), 699–730. doi:10.1080/21670811.2016.1186497.

  2. Gray, J., Bounegru, L. and Venturini, T. (2020). ‘Fake news’ as infrastructural uncanny. New Media & Society, 22(2), 317–341. doi:10.1177/1461444819856912.

  3. Bounegru, L., Gray, J., Venturini, T. and Mauri, M. (compilers/authors) (2018). Field Guide to ‘Fake News’ and Other Information Disorders. Amsterdam: Public Data Lab. doi:10.2139/ssrn.3097666.

  4. Allington, D. and Joshi, T. (2020). ‘What others dare not say’: an antisemitic conspiracy fantasy and its YouTube audience. Journal of Contemporary Antisemitism, 3(1), 35–54. doi:10.26613/jca/3.1.42.

  5. Allington, D., Duffy, B., Wessely, S., Dhavan, N. and Rubin, J. (2020). Health-protective behaviour, social media usage and conspiracy belief during the COVID-19 public health emergency. Psychological Medicine, 1–7. doi:10.1017/S003329172000224X.

  6. Allington, D., McAndrew, S. and Hirsh, D. (2019). Violent Extremist Tactics and the Ideology of the Sectarian Far Left. London: Commission for Countering Extremism.

4. Details of the impact

Changes to our digital information environment have prompted global concerns about the spread of harmful and hateful misinformation. Coronavirus is the first pandemic of the social media age, and the implications of ‘viral’ misinformation have been highlighted in what WHO describes as an ‘infodemic’: an overabundance of information that includes the deliberate dissemination of falsehoods to undermine public health responses and advance harmful alternative agendas. King’s researchers have long been at the forefront of efforts to understand and tackle this. Journalists worldwide are now using digital methods from King’s [1,2,3] to investigate the digital infrastructures that facilitate viral misinformation and put pressure on tech companies, in order to more actively combat its circulation. King’s research [4] has enabled advocacy groups to successfully campaign for social media companies to ‘de-platform’ influential spreaders of misinformation. This research has also informed UK Government policymaking about coronavirus-related [5] and hateful-extremist conspiracy beliefs [6].

King’s research has directly enabled journalists in major international newsrooms to investigate online misinformation and influence the policies of major digital platforms

King’s Field Guide to ‘Fake News’ has provided journalists with original, research-derived methods to investigate the online platforms, algorithmic ranking systems and techno-commercial processes that enable the circulation of misinformation and ‘junk news’. The Field Guide has been cited as a key reference for responding to misinformation by organisations such as the BBC World Service, La Repubblica, Le Monde, Transparency International and UNESCO [A]. Insights from the Field Guide have been used and cited in collaborative investigations with NRC, Le Monde, Politico Europe and BuzzFeed News. BuzzFeed News’s collaboration with King’s researchers in researching and testing the Field Guide methods enabled its reporters to uncover how misinformation publishers were still earning money from major ad networks. This story resulted in a Google review of the sites in question and led to Google’s disabling of ads on those sites that were in violation of its policies [A].

The King’s research insights developed and accessibly presented in the Field Guide have had a direct impact on the investigative capacity of non-profit organisations, such as Media Matters for America, a research and information centre that monitors, analyses and corrects misinformation in the US media. Media Matters’ Research Director states that “having such a thorough resource available to help guide our thinking made possible rapid advancements in our research capacity – enabling researchers to effectively identify and help mitigate the effects of harmful narratives, both online and off” [A].

This research has also been put into practice through ongoing collaborations with First Draft News – a coalition brought together by the Google News Lab with the goal of fighting mis/disinformation online. The organisation’s Director states that “First Draft has embedded the effective digital methods produced by King’s research (as published in the Field Guide to Fake News) into all of our journalistic investigative and training activities. We draw in particular from their materials which support visual network analysis and extracting, analysing and visualising data from the web and online social media platforms” [B]. First Draft has been conducting regular training activities with journalists, policymakers and non-governmental organisations (NGOs), and has used the methods derived from the Field Guide and subsequent research to deliver training webinars to staff monitoring misinformation at organisations such as WHO, UNICEF and Mercy Corps [B]. First Draft’s Research Manager highlights how their use of King’s research has increased the accessibility of digital methods. He notes a “paradigm shift” among journalists and civil society actors in their realisation that data-driven approaches for investigating misinformation are both essential and actionable: “before, journalists weren’t necessarily using data, or were intimidated by data driven approaches. But you have to use them in order to investigate misinformation, the approaches that KCL [King’s] has been building for many years”. First Draft’s use of King’s research on accessible digital methods has “underline[d] that importance to journalists and normaliz[ed] these types of approaches within newsrooms” [B].

First Draft itself also conducts extensive monitoring of misinformation online and describes itself as an international ‘wire service’ on these issues for global media. First Draft’s Director states that “the digital methods developed through King’s research have directly capacitated our investigators to use new techniques to identify and track problematic online content” [B]. The results of these investigations have a wide and high-profile international reach and are disseminated to First Draft’s network of international journalists and newsrooms, for instance through their CrossCheck initiative’s Slack channel. This has over 400 members from major international newsrooms such as the BBC, AFP, CNN, CBS, NBC and Reuters. First Draft investigations using digital methods derived from King’s research are frequently cited by high-profile journalists covering issues related to misinformation, for instance NBC reporting on the spread of ‘QAnon’ conspiracy beliefs on social media in the context of the 2020 US election [B]. First Draft “believe[s] that the increased journalistic scrutiny and data collection on misinformation patterns (which First Draft’s use of King’s digital methods has significantly contributed to) has in turn influenced major social media companies’ recent steps towards more active content moderation and labelling” [B].

King’s research has directly influenced the content moderation practices of social media companies and has contributed to the permanent ‘de-platforming’ of influential purveyors of harmful coronavirus conspiracy beliefs

Prior to the coronavirus pandemic, the UK charity Campaign Against Antisemitism (CAA) used Allington’s research on the hateful content of David Icke’s online conspiracy videos in their discussions with YouTube’s policy team. CAA’s engagement with YouTube/Google influenced the platform’s removal of some of these videos in a wider take-down of white supremacist content in 2019 [C]. During the pandemic, there has been a further surge in such conspiracy theories, some of which incorporate similar hateful, antisemitic tropes. Again, David Icke has been a major purveyor of this misinformation. According to the Center for Countering Digital Hate (CCDH), an international not-for-profit NGO, one of Icke’s YouTube videos – which claims the Rothschilds were involved in planning the coronavirus pandemic – had been viewed 5.9 million times, making it the 27th most watched video about coronavirus on the platform. CCDH estimates that across social media platforms Icke’s coronavirus conspiracy videos have been viewed around 30 million times [C].

CCDH subsequently lobbied for the full, permanent removal of Icke’s social media accounts, and Allington’s research informed and influenced their (ultimately successful) campaign. As CCDH’s CEO explains, “Taking the lead from these research findings, our campaign #DeplatformIcke highlighted how social media platforms profit from such misinformation through online traffic and advertising revenue. Dr Allington’s research provided the campaign with an authoritative documentation that steered [its] direction and added significant weight to our conclusions and calls for action from tech companies. Our #DeplatformIcke report cites Dr Allington’s research extensively and we acknowledge the valuable insight that he provided to us in its preparation. As such, [this] research played a vital role in the success of CCDH’s campaign for major social media platforms to remove Icke’s accounts” [C]. The #DeplatformIcke report was launched in April 2020 and, at the beginning of May, Facebook and YouTube permanently removed Icke’s accounts (Twitter followed suit in November). CCDH’s report – for which King’s research was vital – was cited in UK and international news reporting on the decisions taken by the social media companies, and Icke himself noted the influence of CCDH’s campaign when discussing his de-platforming from YouTube [C].

High-profile King’s research about the impacts of online misinformation has directly informed Government policymaking on digital harms and ‘hateful extremism’

Allington’s survey research on social media use, conspiracism and health-protective behaviour in the coronavirus pandemic has been discussed extensively in the media, both in the UK (with coverage on the BBC’s Today programme, in The Guardian and in The Daily Telegraph) and internationally (in The New York Times and on Voice of America) [D]. The research was taken up by the campaigning organisation Avaaz, which cited it in the press release accompanying its open letter to social media platform CEOs, signed by over 100 health professionals [D]. The research findings were then cited in internal DCMS materials and in reports by SAGE on viral transmission risks in further and higher education. Referencing Allington’s research, the SAGE report notes that “lower adherence to Government guidelines [is] associated with exposure to conspiracy theories in social media” and emphasises the importance of “countering false messaging on social media” [E]. Indicating the international implications of the findings, this research has also been cited in a European Commission Science for Policy Report [E].

The policy impact of Allington’s research was further acknowledged in his invitation to join the Counter Disinformation Policy Forum convened by the UK Government’s Minister of State for Digital and Culture, and hosted by DCMS. Held in December 2020, the first Forum focused on the Government’s communication strategy for the coronavirus vaccine rollout, and Allington was one of 22 invited participants, alongside journalistic, civil society and tech company representatives. Allington directly contributed his research insight to the Forum, focusing on his findings that people who get their information from social media are more likely to believe in conspiracy theories and less likely to agree with the lockdown; and that vaccine hesitancy is higher among those who get their information about coronavirus from social media rather than traditional media [E].

Considering the wider impacts of misinformation and conspiracy beliefs on society, Allington’s research on the influence of politically extremist discourse has had a direct influence on the development and adoption of ‘hateful extremism’ as the key concept in the Commission for Countering Extremism’s work. The Commission was established in March 2018 to support society in challenging all forms of extremism and to provide impartial advice to government. The Commission engaged Allington as lead researcher for an independent study on the relationship between revolutionary far left ideology and sympathy for violent extremism in 2019. The Commission has acknowledged that this research fed directly into and informed their flagship 2019 report Challenging Hateful Extremism [F]. The report’s articulation of this new category of ‘hateful extremism’ – informed by King’s research – has underpinned the Commission’s subsequent commitment to undertake a review of relevant laws. The Former Assistant Commissioner for Specialist Operations of the Metropolitan Police Service, Sir Mark Rowley, is leading this review and has stated that he is “convinced that the Commission’s clarity of focus on ‘hateful extremism’ can help identify the gaps that exist at the boundaries of current laws, such as hate crime and terrorism, which are being exploited daily by extremists” [F].

5. Sources to corroborate the impact

A. Sources on global use of King’s Field Guide to ‘Fake News’: report by BBC World Service (2018); BuzzFeed News, 4 April 2017; La Repubblica, 5 April 2017; Le Monde, 4 December 2017; report by Transparency International (2018), Fake News and Anti-Corruption; report by UNESCO (2018), World Trends in Freedom of Expression: 2017/2018 Global Report; testimonial email from Vice President of Media Matters for America.

B. Sources on use of KCL research by First Draft News: testimonial letter from Director; testimonial interview with Research Manager; NBC News, 21 August 2020.

C. Sources on impact of Allington’s research on Campaign Against Antisemitism (CAA) and Center for Countering Digital Hate (CCDH): testimonial letter from CAA Chief Executive; report by CCDH (2020), #DeplatformIcke: How Big Tech Powers and Profits from David Icke’s Lies and Hate; testimonial letter from CCDH’s CEO; Sky News, 3 May 2020.

D. Sources on how Allington’s survey research has informed public debate: BBC News, 17 June 2020; The Guardian, 8 April 2020; The Telegraph, 18 June 2020; The New York Times, 17 August 2020; Voice of America, 1 July 2020; Health Professionals Make Urgent Call to Social Media CEOs, Avaaz, 7 May 2020.

E. Sources on how King’s research has informed Government policy: UK SAGE report, Principles for Managing SARS-CoV-2 Transmission, 3 September 2020; EC JRC Policy Report (2020), Technology and Democracy; UK Counter Disinformation Policy Forum Readout, 2 December 2020.

F. Sources on impact of King’s research on UK Government’s Commission for Countering Extremism (CCE): testimonial letter from CCE; CCE report (2019), Challenging Hateful Extremism; ‘Commission for Countering Extremism launches a legal review’ Gov.uk webpage, 10 June 2020.
