
Create Converge: Transforming digital content creation, kickstarting careers and informing public debate through innovations in cinematic visual effects methodologies

1. Summary of the impact

Peter Richardson’s research into the development and deployment of new, more economical and accessible methodologies for the production of complex cinematic visual effects has led to impact across three substantive areas:

  1. New Cinematic Methods. New, innovative methods for creating complex visual effects at lower price points have increased opportunities for emergent filmmakers to produce high-quality cinematic films. Richardson’s programmes have supported young people aged 16-24 to make films, and have provided training, development and production support for over 70 innovative short film projects with worldwide audiences.

  2. Policy and critical public debate. The film Slaughterbots has raised awareness of the danger of lethal autonomous (AI) weapons. It was screened at the November 2017 meeting of the UN Convention on Certain Conventional Weapons (CCW) in Geneva, has reached over 4 million views on YouTube and continues to contribute to public debate on the issue.

  3. Capacity and Audience Building. Research into the size and scale of creative industries in the North Sea Region (NSR) and the potential impact of virtual reality (VR) technologies (particularly in non-entertainment contexts) has led to the development of new audiences and markets.

2. Underpinning research

Richardson’s research into innovative visual effects techniques has been funded by a series of substantial grants from the European Union commencing with the €5.2 million North Sea Screen Partnership project (2009-2012 at Dundee, then 2012-2014 at the University of Hertfordshire; all underpinning research described herein was undertaken at Hertfordshire) and latterly a €1.5 million Create Converge grant (2016-2020). The research falls into three main themes:

1. New Cinematic Methods

The Games and Visual Effects Lab (G+VERL) at the University of Hertfordshire, led by Richardson (Principal Investigator) with David Tree (Research Fellow), researches methods and workflows for attaining high-quality cinematic visual effects (VFX) and immersive experiences. VFX is now ubiquitous in cinema; however, the price point for cinematic-quality VFX is prohibitively high, with the market dominated by large multinational companies such as Framestore and ILM. The research asked: can high-quality cinematic VFX sequences be achieved in the short to medium format? What new methodologies and processes can deliver these sequences in a time- and cost-effective manner? Richardson developed a novel combination of existing technologies, specifically the application of a new on-set algorithm to stitch high dynamic range (HDRI) images together quickly into a photo-realistic environment into which CGI assets can be placed. The ease of use and cost-effectiveness of this novel combination opens the door to high-quality VFX for users who would not previously have had access.
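The specific on-set algorithm is not detailed here, but the general building block such an HDRI workflow relies on, combining bracketed exposures into a single high dynamic range environment image that can later light CGI assets, can be sketched with standard tools. The snippet below is a minimal illustration using OpenCV’s HDR module rather than Richardson’s algorithm; the file names and exposure times are placeholders.

```python
# Minimal sketch (assumed workflow, not Richardson's published algorithm):
# merge bracketed exposures of an on-set view into a single HDR image of the
# kind used as a photo-realistic lighting environment for CGI assets.
# File names and exposure times below are placeholders.
import cv2
import numpy as np

paths = ["exposure_short.jpg", "exposure_mid.jpg", "exposure_long.jpg"]
exposure_times = np.array([1 / 500.0, 1 / 60.0, 1 / 8.0], dtype=np.float32)

images = [cv2.imread(p) for p in paths]

# Align hand-held brackets, then merge them into a radiance (HDR) map.
cv2.createAlignMTB().process(images, images)
hdr = cv2.createMergeDebevec().process(images, exposure_times)

# Write the raw HDR map (used to light CGI) and a tone-mapped preview.
cv2.imwrite("environment.hdr", hdr)
preview = cv2.createTonemap(gamma=2.2).process(hdr)
cv2.imwrite("environment_preview.jpg", np.clip(preview * 255, 0, 255).astype("uint8"))
```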

These new processes were developed iteratively through a series of Richardson’s films, disseminated worldwide via film festivals, screenings, conferences and academic papers: ‘Miss Donnithorne’s Maggott’ (2012) [3.1], ‘Molly Makes A Dress’ (2015) [3.2], ‘Caravan at The Edge Of The World’ (2017), ‘The Slotin Paradox’ (2019) and ‘Singularity’ (2020) [3.3].

2. Policy and critical public debate

Richardson co-produced the film ‘Slaughterbots’ (2017), commissioned by the Future of Life Institute (funded by Elon Musk and Professor Stephen Hawking, c.£250,000) to increase public understanding of the threat posed by the weaponisation of artificial intelligence (AI) technology. Filmed at the University of Hertfordshire, Slaughterbots portrays a near-future scenario in which swarms of autonomous drones use AI and facial recognition to assassinate political opponents, and utilises cinematic tropes closely associated with Hollywood disaster films. The research explored whether cinematic drama can be used to increase public awareness of the dangers posed by the unfettered development of lethal autonomous weapons [3.4].

3. Capacity and Audience Building

‘Create Converge’ is an EU Interreg-funded project which brings visualisation and games technology together across sectors, aiming to boost digital industries in the North Sea Region, which includes Scotland, England, Denmark, Germany, the Netherlands and Sweden. Richardson’s work on capacity building for the NSR has produced many significant outputs showcasing the scope of VR technologies within and beyond entertainment:

Zero Point VR (2017) was an immersive VR installation at the Barbican, London – a collaboration between choreographer Darren Johnston (ArrayUK), Richardson and Tree (G+VERL) and physicist Merritt Moore (Oxford University). The research investigated novel methods to visualise, in the form of a virtual reality installation, the concept of zero-point energy (the lowest possible energy that a quantum mechanical system may have) in a form understandable by the general public. The experience was presented via the HTC Vive VR system with room-scale tracking; to reduce distraction, motion controllers were not used. Instead, the participant interacts by focussing their gaze on ‘totems’: when the gaze is maintained for more than 10 seconds, the totem transports the participant to the next level [3.5].
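The dwell-based gaze selection described above can be sketched as a simple timer that fires once a target has been looked at continuously for the required duration. The following is an illustrative sketch only, not the installation’s code: the class name and per-frame interface are assumptions, and only the roughly 10-second dwell threshold comes from the description above. In an engine such as Unity or Unreal the same logic would sit in a per-frame update, with the gazed target supplied by a ray cast from the headset’s forward vector.

```python
# Illustrative sketch of dwell-based gaze selection (not the installation's
# code): an action fires once the user's gaze has rested on the same target
# for a set duration. The ~10 second threshold comes from the description
# above; everything else is assumed for illustration.
import time

DWELL_SECONDS = 10.0


class GazeDwellSelector:
    """Tracks how long the gaze rests on a target and fires once per dwell."""

    def __init__(self, dwell_seconds: float = DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self._target = None
        self._gaze_start = None

    def update(self, gazed_target):
        """Call every frame with the currently gazed-at target (or None).

        Returns the target when the dwell completes, otherwise None.
        """
        now = time.monotonic()
        if gazed_target != self._target:
            # Gaze moved to a new target (or away): restart the timer.
            self._target = gazed_target
            self._gaze_start = now if gazed_target is not None else None
            return None
        if self._target is not None and now - self._gaze_start >= self.dwell_seconds:
            selected = self._target
            self._target, self._gaze_start = None, None
            return selected  # e.g. trigger the transition to the next level
        return None
```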

‘Aarhus Walks On Water’ (2017). Research into emergent forms of immersive technology led to an invitation from the Aarhus 2017 European Capital of Culture board for Richardson to participate in and shape, as co-creative director, the flagship event Aarhus Walks On Water (AWOW). Richardson contributed the overarching immersive concept and the opening film, ‘Stereo-Lith-Hydro-D.98’. The film tells the story of a group of women who take a journey through a dystopian modernist city; as the women each arrive at a high tower, they signal to a lone high-board diver. The diver performs the perfect dive; as she hits the water she enters a surreal underwater landscape. As she emerges from the water a live immersive fashion/tech show begins, in which models walk on a specially engineered catwalk, giving the effect of literally walking on water [3.6].

Richardson (with the G+VERL) produced the data visualisation project ‘Visualising Creative Work’ (2019), which investigated novel methods for visualising the scale, economic benefit and societal impact of creative industries in the Danish North Sea regions. The team developed algorithmic interrogation software, which gathered a dataset of businesses in Denmark involved in the creative industries. A virtual reality map, viewable using a VR headset, was then produced using a bespoke algorithm developed at G+VERL. The tool allows such datasets to be viewed in many novel ways – for instance, as a cityscape in which the heights of buildings represent the capacity, net worth and outputs of each company.
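The cityscape encoding described above, where a company metric is mapped to a building height, can be illustrated with a short sketch. This is not the G+VERL tool; the field names, sample records and log-scaling choice are assumptions made purely for illustration.

```python
# Illustrative sketch (not the G+VERL tool): map a company metric, here net
# worth, to a building height for a data cityscape. Sample records, field
# names and the log scaling are assumptions for illustration only.
import math

companies = [
    {"name": "Studio A", "net_worth_dkk": 4_500_000},
    {"name": "Studio B", "net_worth_dkk": 600_000},
    {"name": "Studio C", "net_worth_dkk": 22_000_000},
]

MAX_HEIGHT_METRES = 120.0  # tallest building in the virtual city


def building_height(company, all_companies):
    """Log-scale net worth so a few large firms do not flatten the rest."""
    worths = [math.log10(c["net_worth_dkk"]) for c in all_companies]
    lo, hi = min(worths), max(worths)
    w = math.log10(company["net_worth_dkk"])
    return MAX_HEIGHT_METRES * (w - lo) / (hi - lo) if hi > lo else MAX_HEIGHT_METRES


for c in companies:
    print(f"{c['name']}: {building_height(c, companies):.1f} m")
```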

3. References to the research

3.1 Richardson, P. (2013) A ‘Real Time Image Conductor’ or a Kind of Cinema?: Towards Live Visual Effects. Special Issue Leonardo Electronic Almanac, [online] 19(3). MIT Press. Available: https://www.leoalmanac.org/wp-content/uploads/2013/11/LEAVol19No3-Richardson.pdf ISBN 978-1-906897-22-2.

3.2 ‘Molly Makes A Dress’ (2015) [Film]. Richardson, P. (Director). 2016 Premiere: Bokeh South African Fashion Film Festival. Nominations: ‘Best Fashion Film’ 2016 (runner-up) and Mercedes Benz Award. https://www.youtube.com/watch?v=2rXjBc5_WOc

3.3 ‘Singularity’ (2020) [Virtual Reality Artwork]. Richardson, P. (Exec Producer), BBC Arts. Funding: BBC R&D, New Creatives, Screen South, Arts Council England (£20,000). Available Dec 2020 on Steam / BBC Arts. https://store.steampowered.com/app/1429890/Singularity/

3.4 ‘Slaughterbots’ (2017) [Film]. Richardson, P. (Co Producer). Funding: commissioned by the Future of Life Institute (c.£250,000). https://www.youtube.com/watch?v=9CO6M2HsoIA

3.5 ‘Zero Point VR’ (2017) [Virtual Reality Installation]. Richardson, P., Array (Darren Johnston Choreography), Oxford University Clarendon Physics Lab (Merritt Moore). The Barbican Centre, London, 25-27 May 2017. Output submitted to REF2021.

3.6 ‘Aarhus Walks On Water’ (2017) [Digital Immersive Technology Event]. Richardson, P. (Co-Creative Director). Funding: European Capital of Culture, Dansk Bank and Interactive Denmark (€240,000). Output submitted to REF2021.

Funding: ‘Create Converge’ (2016-2020) http://createconverge.eu. Funded by EU North Sea Region Interreg (€1.5M). Partners: University of Hertfordshire, Abertay University Scotland, Filmby Aarhus Denmark, Tayscreen Scotland, Digital Dundee, Media Evolution Sweden, Filmförderung Hamburg Schleswig-Holstein Germany, Screen South England, Subatomic Netherlands and VIA University College Denmark.

4. Details of the impact

Richardson’s research is designed purposefully to help industry adapt to and work with emergent technologies and workflows. It has led to streamlined VFX methods with significant cost savings, opportunities for emergent filmmakers, new understandings and opportunities for the creative industries, and has contributed to critical public and policy debate.

1. New cinematic methods – impacts on filmmakers and the industry

Young and emergent filmmakers are often excluded from using VFX in their films because of the technology’s prohibitively high price point. The new workflows and methodologies for high-quality, cost-effective cinematic VFX developed by Richardson have delivered significant cost savings, driven schemes that support emergent filmmakers aged 16-24, contributed to innovation and enabled access to careers in the creative industries:

The ‘Innovation Shorts’ programme (2014, £121,000 EU grant), a collaboration with Screen South (UK) and FilmFyn (Denmark), co-produced four visual-effects-driven short films. Richardson acted as Executive Producer and Visual Effects Supervisor for the films, which premiered at the 2014 Berlin Film Festival.

  • Noted as “a great success for Screen South”, the programme “served as proof of concept to the European film industry that cost effective VFX could be achieved” using these methods, with “a cost saving for VFX of approximately one third the market price (as of 2016)” – Jo Nolan, Managing Director, Screen South [5.1].

  • It further “launched and enhanced the careers of the four filmmakers and teams involved”, including: the Director, who went on to form Fire Panda Ltd, an award-winning VR and AR development studio creating innovative interactive content and experiences, which has a staff of four and employs many freelancers; and the Writer/Director, who has since written and directed film and TV series and won an International Emmy Award for ‘USS Callister’, his episode of Black Mirror [5.1].

‘Innovation Shorts’ served as the blueprint for the Random Acts Ignition Network (2015-2017), set up by Richardson and Screen South and funded by the Arts Council and Channel 4 (£400,000). Over 100 young filmmakers in the south of England received funding, training and mentorship from industry professionals. Richardson worked as executive producer alongside Screen South and Fly Film, and the scheme produced 72 innovative short films which were either broadcast or shown online as part of Channel 4’s Random Acts series [5.1]. The scheme kick-started many careers in the UK film industry. A qualitative impact analysis commissioned by Screen South found that 100% of the lead creatives (111) and crew (302) believe it assisted with their next job, career or study steps. Participants in the scheme have gone on to have their films screened at festivals worldwide and to win awards and grants, including from the BFI and the London Independent Film Festival [5.2].

Richardson’s innovative, research-driven approach to film was “pivotal in developing new workflows and paradigms for storytelling in the rapidly emerging field of virtual reality. His research showed that high quality short form VR can be achieved at a lower price point (under £20,000)” [5.1]. New Creatives (2019-2021) is part of a Screen South / BBC Arts / Arts Council England programme (£600,000): a talent development scheme offering artists aged 16-30 the chance to develop their creative and technical skills by producing fresh and innovative short films, audio and interactive works. Richardson’s research was “a major contributor to Screen South working with BBC Arts and BBC R&D to agree the inclusion of the interactive / VR strand” of the scheme [5.1]. As Executive Producer and Production Partner (interactive gaming / VR), Richardson oversaw the development and production of four new interactive VR experiences: ‘Singularity’ (2020), ‘Soil Will Save Us’ (2020), ‘Dichotomy’ (2021) and ‘Phillip 21’ (2021).

Screen South’s Managing Director concludes that without Richardson’s “long term commitment to innovative and novel technologies in film and now VR it is hard to imagine that Screen South would have been able to deliver the quality of projects expected of our funders” [5.1]. Further, Matthew Nelson, Managing Director of Space Digital, states that since the release of Slaughterbots (below), their “worldwide reputation for making highly impactful campaigning films has grown, leading to an increase in production and postproduction work for our company” [5.3].

2. Policy and critical public debate

Co-produced by Richardson in collaboration with Space Digital (Producer Matthew Nelson), Slaughterbots (2017) used Richardson’s novel methodology for the production of low-cost, high-quality cinematic visual effects. The film, which depicts lethal attacks by AI-controlled drone weapons, was launched anonymously to go viral, gaining 200,000 views in its first day alone. The work has generated vast media attention and public debate: by the end of 2020 it had over 4 million views on YouTube [5.4] and has been featured on the BBC, CNN, the South China Morning Post, Fox News and The Economist, to name a small selection. Science Alert (November 2017) called it “the best warning against autonomous weapons” [5.5].

The film has also contributed to global policy debate on fully autonomous weapons. It was presented by world-renowned AI researcher Professor Stuart Russell (UC Berkeley) to a meeting of over 70 countries at the UN Convention on Certain Conventional Weapons (CCW) in Geneva (13-17 November 2017) [5.6]. The film concludes with a statement from Russell that the technology depicted in the film already exists and could be deployed in the way imagined. The meeting marked an initial step towards agreeing a new CCW protocol on lethal autonomous weapons systems (LAWS), with meetings ongoing. According to the Campaign to Stop Killer Robots, the film was “central in generating a slew of media coverage throughout the two weeks of CCW meetings” [5.7]. A study by the US Air Force, published in Future Warfare Series No. 60 (May 2020), cites the film in arguing that LAWS are weapons of mass destruction. In January 2020, US Presidential candidate Andrew Yang shared a link to Slaughterbots when he called for a global ban on the use of autonomous weaponry [5.8]. The film was “instrumental in securing an additional 20,000 signatures to the Future of Life Institute’s petition to ban autonomous weapons” [5.3].

Slaughterbots continues to contribute to critical public debate, and is frequently featured in mainstream media discussions of autonomous weaponry worldwide. A Guardian ‘long read’ in October 2020, ‘“Machines set loose to slaughter”: the dangerous rise of military AI’, states: “The lesson that the film, Slaughterbots, is trying to impart is clear: tiny killer robots are either here or a small technological advance away…when it comes to the future of war, the line between science fiction and industrial fact is often blurry.” Scenes from Slaughterbots also appeared in a New York Times video, ‘A.I. Is Making It Easier to Kill (You). Here’s How’ (December 2019; over 900,000 YouTube views), in which Professor Russell discusses the danger of LAWS and the slow process of achieving international consensus on them [5.5].

3. Capacity and Audience Building

As part of the Create Converge project (2016-2020), VFX techniques have been showcased to businesses and the public in the EU North Sea Region, raising the profile of these emergent technologies, facilitating collaboration and encouraging growth in non-entertainment contexts.

Two 2017 events described in Section 2, Zero Point VR (a VR installation held at the Barbican) and Aarhus Walks on Water (a live immersive technology event), attracted over 600 and over 3,000 visitors respectively. The Aarhus event received an additional 400 viewings through a 360° VR live feed. An impact survey conducted by Aarhus University found that the audience considered the event “particularly inspiring (49%), sensuous (45%) and funny (35%)”, and 72% considered it a “rethinking event”, demonstrating viewers’ reactions to this new technology [5.9; 5.10].

The ‘Visualising Creative Work’ (VCW) (2017-2019) data visualisation project is a virtual reality map which allows the viewer to explore data on local creative industries in novel ways. It was disseminated as a prototype at events across Denmark. These events, particularly the workshop ‘VR Ready - AR You?’ led by Richardson (250 attendees), helped businesses identify opportunities for the use of VR in regional government, human resources, procurement, the Danish national health sector, design, TV and cinema. Businesses were asked whether they would consider VR as a way of expanding their businesses and whether key business issues could be solved via VR solutions. The responses were overwhelmingly positive, with many businesses planning to incorporate VR into their existing business model, e.g.: “As a company we are now investigating VR as a way to allow juries to better understand complex circumstances of marine accidents in courtroom situations” – Managing Partner, Solis Marine (UK). As of July 2020, Solis Marine had commissioned six VR simulations which have been used in marine accident investigations in Malta, Gibraltar, the UK and the Pacific.

The VCW tool has since been expanded and developed further, with Dundee City Council/TayScreen reporting that: “The industrial communities of Create Converge partners Filmby Aarhus and Media Evolution each manage physical and virtual hubs for around 80 and over 400 companies respectively. The VCW provides invaluable visual data on companies and business relationships to inform potential business and collaboration opportunities. Examples include Mannd, Ate VR and Digital Devotion Games in Denmark; IKEA of Sweden AB, Beyond Innovation and W Communications Agency in Sweden” [5.11].

5. Sources to corroborate the impact

5.1 Letter from Jo Nolan, Managing Director, Screen South

5.2 Random Acts Analysis Report, Greenshoot

5.3 Letter from Matthew Nelson, Managing Director, Space Digital Limited

5.4 Slaughterbots: https://youtu.be/9CO6M2HsoIA; https://youtu.be/HipTO_7mUOw

5.5 Slaughterbots media coverage document

5.6 Stuart Russell, remarks delivered at UN: https://meetings.unoda.org/section/ccw-gge-2017-statements/

5.7 Report from the Campaign to Stop Killer Robots: https://www.stopkillerrobots.org/wp-content/uploads/2018/02/CCW_Report_Nov2017_posted.pdf

5.8 Slaughterbots policy coverage document

5.9 Zero Point VR attendance data

5.10 Aarhus Walks on Water Impact Survey / Evaluation & attendance data

5.11 Letter from Julie Craik, Business Development, Dundee City Council/TayScreen

Additional contextual information

Grant funding

Grant number: 1
Value of grant: £1,395,507