
Impact case study database

The impact case study database allows you to browse and search for impact case studies submitted to the REF 2021. Use the search and filters below to find the impact case studies you are looking for.

Showing impact case studies 1 to 3 of 3
Submitting institution
The University of Bath
Unit of assessment
11 - Computer Science and Informatics
Summary impact type
Societal
Is this case study continued from a case study submitted in 2014?
No

1. Summary of the impact

Research in Artificial Intelligence at the University of Bath has shaped tools and design techniques, public policy, and industry practice in understanding ethics and mitigating unintended algorithmic bias. This research has:

  • Influenced UK, European and International standards in robot ethics, including British Standards for Robots and Robotic Devices and standards for the Institute of Electrical and Electronics Engineers (IEEE).

  • Improved the understanding of policymakers and influenced decision-making globally (USA, Canada, Europe, UK) through direct application of Bath research, advisory and expert roles on key committees (e.g., UK Government All-Party Parliamentary Group, International Committee of the Red Cross).

  • Reduced gender bias in Google Translate, which, drawing directly on examples from Bath research, now provides feminine and masculine translations for some gender-neutral words.

2. Underpinning research

As Artificial Intelligence (AI) has transitioned from model-based approaches to increasingly data-driven machine learning techniques, the resulting technologies have become more prone to various forms of bias. This has raised serious concerns over the ethical implications of AI and increased demand from designers and policymakers to understand, mitigate and legislate for these biases.

Research by Bryson at the University of Bath has focused on exploring and addressing the ethical implications of AI and bias; together with colleagues, she developed ethical principles for robotics.

As AI research increasingly shifted towards data-driven approaches, some in the field presumed that model-based work was of decreasing importance, whereas Bryson (with collaborators at Princeton University) recognised that the decreasing legibility of these systems would make accountability an even more pressing question in AI ethics and bias. This culminated in work demonstrating that models trained on natural language corpora embody cultural biases in a variety of forms, including gender and race. The authors demonstrated that tools such as Google Translate embed bias, for example when translating from a language without gendered pronouns into one with them: Google Translate renders sentences using Turkish’s gender-free pronoun as ‘he is a doctor’ but ‘she is a nurse’ [REF1].
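The measurement behind [REF1] is the Word Embedding Association Test (WEAT), which quantifies bias as a difference in cosine similarities between a target word and two sets of attribute words. A minimal sketch of that association score, using hand-made toy 2-D vectors purely for illustration (real tests use pretrained embeddings such as GloVe or word2vec; the vectors below are not from the paper):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two word vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(w, A, B):
    # WEAT-style association s(w, A, B): mean similarity of word w to
    # attribute set A minus its mean similarity to attribute set B.
    return (np.mean([cosine(w, a) for a in A])
            - np.mean([cosine(w, b) for b in B]))

# Toy "embeddings" chosen by hand so the geometry is easy to see.
he, she = np.array([1.0, 0.1]), np.array([0.1, 1.0])
doctor = np.array([0.9, 0.3])    # hypothetical vector closer to "he"
nurse = np.array([0.2, 0.95])    # hypothetical vector closer to "she"

print(association(doctor, [he], [she]))  # positive: "doctor" leans male
print(association(nurse, [he], [she]))   # negative: "nurse" leans female
```

With real embeddings trained on web-scale corpora, occupation words show exactly this kind of systematic gender association, which is how the study linked training data to the translation behaviour described above.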

In further research (2017), Bryson and colleagues across the globe considered how we can guide the way technology impacts society. The methodologies that underpin these demonstrations have led to the development of new ethical principles, new standards and standardization processes to ensure the safety, security, and reliability of AI [REF2, REF3, REF4]. Bryson argues that making AI moral agents or patients is an intentional and avoidable action, and that avoidance would be our most ethical choice. Bryson argued (2010) that the potential of robotics should be understood as the potential to extend our own abilities and to address our own goals [REF5]. However, robots should not be described as persons, nor given legal or moral responsibility for their actions [REF2]. Robots should also not have a deceptive appearance - they should not fool people into thinking they are similar to empathy-deserving moral patients. Bryson also argues that clear, generally comprehensible descriptions of an artefact’s goals and intelligence should be available to any owner, operator, or other concerned party [REF4]. Finally, Bryson’s work has shown that the transparency of machine learning can be radically improved by providing real-time visualisation of a robot’s AI, an approach that also helps an observer to understand the robot’s behaviour [REF6].

3. References to the research

[REF1] Caliskan, A, Bryson, JJ & Narayanan, A 2017, 'Semantics derived automatically from language corpora contain human-like biases', Science, vol. 356, no. 6334, pp. 183-186. https://doi.org/10.1126/science.aal4230

[REF2] Bryson, JJ & Winfield, A 2017, 'Standardizing Ethical Design for Artificial Intelligence and Autonomous Systems', Computer, vol. 50, no. 5, 7924235, pp. 116-119. https://doi.org/10.1109/MC.2017.154

[REF3] Boden, M, Bryson, J, Caldwell, D, Dautenhahn, K, Edwards, L, Kember, S, Newman, P, Parry, V, Pegman, G, Rodden, T, Sorrell, T, Wallis, M, Whitby, B & Winfield, A 2017, 'Principles of robotics: regulating robots in the real world', Connection Science, vol. 29, no. 2, pp. 124-129. https://doi.org/10.1080/09540091.2016.1271400

[REF4] Bryson, JJ 2018, 'Patiency Is Not a Virtue: The Design of Intelligent Systems and Systems of Ethics', Ethics and Information Technology, vol. 20, no. 1, pp. 15-26. https://doi.org/10.1007/s10676-018-9448-6

[REF5] Bryson, JJ 2010, Robots should be slaves. in Y Wilks (ed.), Close engagements with artificial companions: key social, psychological, ethical and design issues. Natural Language Processing, vol. 8, John Benjamins Publishing Company, Amsterdam, pp. 63-74. https://doi.org/10.1075/nlp.8.11bry

[REF6] Wortham, RH, Theodorou, A & Bryson, JJ 2017, Improving robot transparency: real-time visualisation of robot AI substantially improves understanding in naive observers. in 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)., 8172491, IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), vol. 26, IEEE, Lisbon, Portugal, 28/08/17. https://doi.org/10.1109/ROMAN.2017.8172491

4. Details of the impact

1) Influencing standards in robot ethics in the UK and worldwide

Bath research has directly influenced the development and content of the British Standard BS8611. BS8611 is the “earliest explicit ethical standard in robotics” and the “one standard [that] specifically addresses AI”, providing “guidance on how designers can identify potential ethical harm, undertake an ethical risk assessment of their robot or AI, and mitigate any ethical risks identified” [A, p.4, 66]. “At the heart of BS8611 is a set of 20 distinct ethical hazards and risks […] Advice on measures to mitigate the impact of each risk is given alongside suggestions on how such measures might be verified or validated” [REF4]. The drafting of the standard resulted from Bryson’s invitation to the UK Robot Ethics Forum (London, 2015); she provided further consultation on the development of BS8611 during 2015, with 5 of the 9 ethical principles set out in clause 5.1.1 coming directly from Bryson’s research [REF4]. The standard, BS8611:2016, is available to purchase via the BSI website [K].

The Institute of Electrical and Electronics Engineers (IEEE) is the world’s largest technical professional organisation, representing more than 400,000 members in over 160 countries. University of Bath research influenced the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems and the IEEE P7000 series of standards, providing engineers and technologists with an implementable process to minimize ethical risk for their organizations, stakeholders and end users [B]. Bryson’s work, cited in the IEEE manifesto Ethically Aligned Design [B], informed principles around transparency, embedding values into autonomous intelligent systems, and Affective Computing. Since 2016 Bryson has co-chaired the IEEE Affective Computing Committee [B, p.14] and sits on the Sustainable Development group, which informs the P7000 standards.

2) Improving the understanding of policymakers

Bath research has led to improved understanding, mitigation and legislation for AI biases among designers and policymakers. This influence was achieved through Bryson’s expert advice to UK, pan-European, and international governmental, inter-governmental and professional bodies.

UK Government Policy: Drawing on her research, Bryson influenced UK AI policymaking, “advising the Government to enable the on-demand and routine auditing of AI and algorithmic systems” through her appointment as Expert Advisor to the All-Party Parliamentary Group on Artificial Intelligence (2017), in which she “discussed the issue of algorithmic biases” [L], and through other Expert Panel memberships [e.g., C]. Former Deputy Prime Minister Sir Nick Clegg stated: “Your presentation to the group (Open Reason Round Table Event in November 2017) and your contributions to the following discussion, was immensely useful…the round table played an integral role in preparing for the speech on the politics around artificial intelligence which I delivered at the end of last year (2017)” [D].

Pan-European Policy: Bryson’s research [REF5, REF6] is cited in the European Parliamentary Research Service (EPRS) study on the ethical implications and moral questions arising from the development and implementation of AI technologies, including the wealth inequality and political upheaval that could result from the rise of AI and the immorality of giving robots moral agency [A, p. 12, 14, 20, 35]. Further, the European Committee for Standardisation and the European Committee for Electrotechnical Standardisation (CEN-CENELEC) cite Bryson’s work in their Roadmap [REF3; E, p.29, 34] and note that her research “has been vital in making the case for AI standards in Europe” [J].

International Representation: Among a number of influential roles, including the UN Centre for Policy Research (AI legal standards), the Canadian Institute for Advanced Research, and the Mindfire (Switzerland) Ethics Board, Bryson’s research has informed the International Committee of the Red Cross (ICRC) on ethical issues raised by autonomous weapon systems and the requirement for human control over the use of force [F, p.1]. This work emphasised that states must urgently establish limits on autonomy in weapon systems and was cited in the ICRC 2020 report [G], commended “primarily to government decision makers in the realms of international law, arms control, defence and foreign affairs”. This also led to the publication of an Accountability and Transparency report [F, p.4].

Bryson’s influence is widely recognised: in 2017 she was ranked as one of the Top 50 female artificial intelligence influencers in the world (Onalytica), and in 2019 she was listed by Siliconrepublic as one of 10 AI influencers you should be following on Twitter [H].

3) Reducing gender bias in Google Translate

Bryson’s research [REF1] informed changes to Google Translate. The Product Manager at Google Translate quotes Bryson’s example on Turkish gender-free pronouns word for word [REF1] and explains the changes Google made: “Our latest development in this effort addresses gender bias by providing feminine and masculine translations for some gender-neutral words on the Google Translate website … Now you’ll get both a feminine and masculine translation for a single word […] For example, if you type ‘o bir doktor’ in Turkish, you’ll now get ‘she is a doctor’ and ‘he is a doctor’ as the gender-specific translations” [I]. Google Translate is the world’s most-used machine translation system, translating more than 100,000,000,000 words a day for 500,000,000 users.

5. Sources to corroborate the impact

[A] European Parliament. March 2020. “The ethics of artificial intelligence: Issues and initiatives”. Panel for the future of science and technology. https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf

[B] IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, IEEE, 2018. https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf

[C] Beard, Simon. APPG for Future Generations event: How do we make AI safe for humans? 20 July 2018. https://www.cser.ac.uk/news/appg-ai-safe/

[D] Letter from Former Deputy Prime Minister, 16 January 2018.

[E] CEN-CENELEC. “Focus Group on Artificial Intelligence (AI): Final draft for commenting: Roadmap from Focus Group on AI”, 11 July 2020.

[F] Invited participant, “Ethics and autonomous weapon systems: An ethical basis for human control?” A roundtable panel of International Committee of the Red Cross (ICRC) “Accountability and Transparency.” Humanitarium, Geneva, Switzerland, 28-29 August 2017. Contributed to ICRC report based on that meeting: Ethics and autonomous weapon systems: An ethical basis for human control? 3 April 2018. https://www.icrc.org/en/download/file/69961/icrc_ethics_and_autonomous_weapon_systems_report_3_april_2018.pdf

[G] Boulanin, V., Davison, N., Goussac, N., and Carlsson, M. P. 2020. Limits on Autonomy in Weapon Systems: Identifying Practical Elements of Human Control. https://www.icrc.org/en/document/limits-autonomous-weapons

[H] Influence awards: Dunmore, L. Women in Tech: Hot Topics and Top Influencers, 2017. https://onalytica.com/blog/posts/women-tech-hot-topics-top-influencers/; Darmody, J. 10 AI influencers you should be following on Twitter, 2019. https://www.siliconrepublic.com/people/ai-influencers-twitter

[I] James Kuczmarski, “Reducing gender bias in Google Translate”, Google Translate Blog Post, December 6th 2018. https://www.blog.google/products/translate/reducing-gender-bias-google-translate/

[J] Testimonial Letter from Convenor of CEN-CENELEC AI Focus Group, 22 December 2020.

[K] BS 8611:2016. Robots and robotic devices. Guide to the ethical design and application of robots and robotic systems, April 2016.

[L] Governance, Social and Organisational Perspective on AI. 11 September 2017. http://appg-ai.org/wp-content/uploads/2017/12/appgai_theme-report-5.pdf

Submitting institution
The University of Bath
Unit of assessment
11 - Computer Science and Informatics
Summary impact type
Cultural
Is this case study continued from a case study submitted in 2014?
No

1. Summary of the impact

Facial animation is used widely across the entertainment industry for movies and video games. Prof. Cosker’s research has led to the development of new tools and technologies for creating facial animation, enabling typically time-consuming tasks to be performed quickly and within limited resources. These tools have been used, through the EPSRC Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) at the University of Bath, by a major broadcasting corporation (BBC), a world-leading film and animation studio (Aardman Animations) and creative studios (Marshmallow Laser Feast and Satore Studios) to produce higher quality facial animation more efficiently. This has led to the creation of award-winning animations that have enriched cultural experiences for audiences across the globe.

2. Underpinning research

It has always been a challenge across the video game and movie industries to create facial animation quickly and at high quality without expensive resources (e.g., human effort and facilities) and/or specialist expertise. Over the last 12 years, Prof. Cosker has performed research in this area – understanding faces and their movement – and developed methodologies to create facial animation more efficiently.

Between 2007 and 2012, Prof. Cosker used 4D imaging technology to acquire the first database [R1] of psychologically validated facial Action Units (AUs) according to the Facial Action Coding System (FACS) created by Dr. Paul Ekman – an American Psychologist who pioneered our understanding of facial emotion with the publication of FACS in the 1970s. This allowed Cosker to observe, for the first time, dynamic facial movements in 3D during AUs. Part of processing the data included a method to track faces using a novel Active Appearance Model (AAM) formulation that was trained on the fly using key-frames [R1]. The method for tracking movement in faces was later improved by introducing mesh based optical flow [R2], which greatly improved the 3D alignment and registration of facial meshes.

While working on a RAEng Fellowship [G1] that developed methods for 3D facial capture and tracking, Cosker began collaborating with Oliver James (Chief Scientist) and Martin Parsons (Head of Creative Animation) at Double Negative Visual Effects – now DNEG; Martin Parsons now heads the CAMERA studio. DNEG, one of the world’s leading visual effects studios with multiple Oscar wins, required a solution to track actors’ faces from Head Mounted Cameras (HMCs) while working on the Disney movie John Carter and turned to Prof Cosker. Testing the feasibility of Cosker’s facial tracking solutions on DNEG test footage led to the development of a tool called ‘Kojak’, used in the movie on over 200 shots. Prof. Cosker then won a four year Royal Society Industry fellowship (50% hosted at DNEG [G2]), during which his research added an assessment of modern camera tracking approaches in VFX and further developed facial animation [R2].

Between 2011 and 2013, Cosker developed a new method for real-time animation that preserves rigid areas defined in the texture map; this was shown to work on facial animations where the character was reptilian or had scales [R3]. Cosker proposed use cases for this technology leading to patents with Disney Research.

This research underpinned later work (2014 – present) to automatically estimate action units on facial animation models using motion capture [R4]. In an InnovateUK project with The Imaginarium [G3] the motion capture research was developed further and Cosker devised an end-to-end pipeline to build facial models for actors from a single 3D scan of their face which could then be animated quickly at high quality using motion capture. This collaboration underpinned the establishment of CAMERA (now a GBP20,000,000 research centre) and resulted in a novel optimisation scheme and a new method to acquire motion information ‘in between’ motion capture markers, i.e., on skin features. More recently, Cosker extended the work further to enable faces to be animated quickly by unskilled users – even children [R5]. This is being explored in experiments with psychiatric patients with Prof. Essi Viding (UCL) and Dr. Isabelle Mareschal (QMU) as part of an MRC grant [G4].
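The facial models described above, animated at high quality from motion capture, are commonly formulated as blendshape models: the animated mesh is the neutral face plus a weighted sum of per-expression offsets, with the weights driven frame by frame (e.g. solved from marker data). The sketch below illustrates that standard formulation in general terms; it is not the specific pipeline developed at Bath, and the meshes and names are hypothetical.

```python
import numpy as np

def animate(neutral, blendshapes, weights):
    """Blendshape evaluation.
    neutral: (V, 3) neutral-pose vertices; blendshapes: (K, V, 3) target
    meshes, one per expression; weights: (K,) activation per expression.
    Returns the animated mesh: neutral + sum_k weights[k] * offset[k]."""
    deltas = blendshapes - neutral           # (K, V, 3) offsets from neutral
    return neutral + np.tensordot(weights, deltas, axes=1)

# Tiny hypothetical example: 2 expressions on a 3-vertex "mesh".
neutral = np.zeros((3, 3))
smile = neutral + np.array([[0.0, 0.1, 0.0]] * 3)  # illustrative offsets
brow = neutral + np.array([[0.0, 0.0, 0.2]] * 3)
frame = animate(neutral, np.stack([smile, brow]), np.array([0.5, 0.25]))
print(frame[0])  # first vertex: 0.5 * smile offset + 0.25 * brow offset
```

In a mocap-driven pipeline, the weights would be re-solved each frame so the model tracks the actor's performance between markers.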

3. References to the research

[R1] Cosker, D, Krumhuber, E & Hilton, A 2012, A FACS valid 3D dynamic action unit database with applications to 3D dynamic morphable facial modeling. in 2011 IEEE International Conference on Computer Vision (ICCV)., 6126510, International Conference on Computer Vision, IEEE, pp. 2296-2303, 13th International Conference on Computer Vision (ICCV), Barcelona, Spain, 6/11/11. https://doi.org/10.1109/ICCV.2011.6126510

[R2] Li, W, Cosker, D, Brown, M & Tang, R 2013, Optical flow estimation using Laplacian Mesh Energy. in 2013 IEEE Conference on Computer Vision and Pattern Recognition. IEEE Conference on Computer Vision and Pattern Recognition, vol. 2013, IEEE, pp. 2435-2442, IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), Oregon, US, 25/06/13. https://doi.org/10.1109/CVPR.2013.315

[R3] Koniaris, C, Cosker, D, Yang, X, Mitchell, K & Matthews, I 2013, Real-time content-aware texturing for deformable surfaces. in CVMP '13: Proceedings of the 10th European Conference on Visual Media Production., 11, Association for Computing Machinery, pp. 1-10. https://doi.org/10.1145/2534008.2534016

[R4] Ravikumar, S, Davidson, C, Kit, D, Campbell, NDF, Benedetti, L & Cosker, D 2016, Reading Between The Dots: Combining 3D Markers And FACS Classification For High-Quality Blendshape Facial Animation. in K Moffatt & T Popa (eds), Proceedings of Graphics Interface 2016: Victoria, British Columbia, Canada, 1-3 June 2016. Canadian Human-Computer Communications Society, Ontario, Canada, pp. 143-151. https://doi.org/10.20380/GI2016.18

[R5] Reed, K & Cosker, D 2019, 'User-Guided Facial Animation through an Evolutionary Interface', Computer Graphics Forum, vol. 38, no. 6, pp. 165-176. https://doi.org/10.1111/cgf.13612

The research underpinning this case study was based on the following grants (all with Prof. Cosker as PI).

[G1] 2007-2012: Exploiting 4D Data for Creating Next Generation Facial Modelling and Animation Techniques (GBP460,640FEC). The Royal Academy of Engineering Research Fellowship.

[G2] 2012-2016: Next Generation Facial Capture and Animation (GBP100,887 FEC). Partner: Double Negative Visual Effects. The Royal Society Industry Fellowship.

[G3] 2015-2017: Goal Oriented Real Time Intelligent Performance Retargeting (GBP29,997 FEC). Partner: The Imaginarium. Innovate UK.

[G4] 2019-2022. A tool to reveal Individual Differences in Facial Perception (GBP402,113) Medical Research Council (MRC)

[G5] 2016-2018. HARPC: HMC for Augmented Reality Performance Capture (GBP119,025, Total project value GBP517,616 FEC). Partner: The Imaginarium. Innovate UK

[G6] 2015-2020: Centre for the Analysis of Motion, Entertainment Research and Applications - CAMERA 1.0 (GBP4,998,728 FEC).

[G7] 2020-2025: Centre for the Analysis of Motion, Entertainment Research and Applications – CAMERA 2.0 (GBP4,150,000 FEC).

4. Details of the impact

Through CAMERA, Cosker’s research insights into facial motion capture (R1 – R5) have enhanced VR production in resource constrained environments, yielding high quality, realistic, award-winning films and games that have reached world-wide audiences.

Enhancing the quality and efficiency of VR film production

‘Chameleon’ – Marshmallow Laser Feast. In 2016, a collaboration between the commissioner (British Council), creative agency Marshmallow Laser Feast (MLF) and local artists co-created ‘Chameleon’, a VR film about a community in Mexico City. The film used Cosker’s ‘3D scan to actor’ tool to generate high quality 3D facial models on a time-scale that met the tight constraints of the project: “without the use of CAMERA’s technology, MLF would not have been able to deliver the 3D facial models for the project quickly enough and at the level of realism desired” [S1; Executive Producer, MLF]. ‘Chameleon’ has attracted world-wide audiences and enriched cultural experiences through exhibitions such as the 15-day International Documentary Film Festival Amsterdam (IDFA; 2016), which included a Dutch royal visit, 2 exhibits at MUTEK (Mexico City, 2016) and the VR World Congress 2016, with approximately 14,000 visitors across these exhibits. MLF have demonstrated the film to the BBC audience development/marketing team, and CAMERA have demonstrated it to more than 200 visitors [S1].

‘Is Anna OK?’ – Aardman Animation. ‘Is Anna OK?’ is a VR experience created through a commercial project collaboration between CAMERA, Aardman and the BBC, utilising the ‘3D scan to actor’ tools together with motion capture to produce a user- and production-friendly facial motion capture and modelling solution [S2] for creating lifelike characters. It tells a true story of the personal repercussions of brain trauma from different perspectives. In 2018 ‘Is Anna OK?’ was made available to download on the Oculus Store, was part of a national press campaign by the BBC, toured at multiple festivals and was used by Headway to raise awareness of brain injury [S3]; the related YouTube video has more than 16,000 views. CAMERA’s role was crucial: “Collaborating with CAMERA and making use of their facial animation technology made the final experience richer and possible within a tight budget. The animation CAMERA produced was core to delivering the experience of Anna and her family. They have been touched by what has been created and are glad others will get to hear their story and that awareness of brain injury will be raised further” [S2; Head of Interactive Production, Aardman Animation].

‘Cosmos Within Us’ – Satore Studios. CAMERA worked with Satore Studios [S4] on ‘Cosmos Within Us’ (2019) to tell the story of a 60-year-old man suffering from Alzheimer’s. Satore required photorealistic characters to achieve their vision but had no access to affordable technology that would deliver the required quality. The University of Bath’s facial motion capture tools created performances that were key to the immersive experience, allowing Satore to deliver high quality animation on time and on budget. CAMERA tools “were critical to delivery of the experience due to the presence of the character during the piece. The use of the grandmother in the space, made for a new form of interaction between viewer and haptics, which would be very hard to replicate every time, unless it was recorded” [S4; Creative Director and Founder, Satore Studio]. ‘Cosmos Within Us’ has won critical acclaim, debuting at the Cannes film festival and subsequently being shown 132 times at the Venice Film Festival (2019) [S5] and 40 times at the Raindance Film Festival (2019), where it won the prestigious best immersive experience award [S6].

Related projects:

  • ‘11-11: Memories Retold’. CAMERA created all of the motion capture for this video game, made by Aardman (their first video game [S7]) with Bandai Namco; it was nominated for a BAFTA in 2019 [S8], won the TIGA award for Best Educational or Serious Game [S9] and is available on Xbox One, PC and PS4 [S7].

  • ‘Magic Butterfly’, a VR experience created with the Welsh National Opera and REWIND in which the viewer could watch lead soprano Kara Son perform in the opera Madame Butterfly. CAMERA created all of the animation of Kara performing using its motion capture technology, contributing the essential part of this experience. The piece has won several awards, including the Creative Technology Award (Gold Winner) and Immersive Art Award (Gold Winner), and was attended by more than 12,000 people in 2017, touring in the UK and to Hong Kong, Copenhagen and Dubai [S10].

  • The Johnnie Walker HIK+ campaign with Satore Studio, including the creation of dance animation for a performer, captured by CAMERA and projected for 5 days onto the side of a large public building (Torre Reforma) on one of the most important avenues in central Mexico City in 2017. The experience engaged 120,000 people and generated 63,000,000 brand impacts [S11].

5. Sources to corroborate the impact

[S1] Evidence letter: Executive Producer of Marshmallow Laser Feast, 8 December 2020

[S2] Evidence letter: Head of Interactive Production at Aardman Animation, 8 December 2020.

[S3] Headway Press Release, 25 October 2018 Headway teams up with BBC Stories to explain brain injury in VR project. https://www.headway.org.uk/news-and-campaigns/news/2018/headway-teams-up-with-bbc-stories-to-explain-brain-injury-in-vr-project/

[S4] Evidence letter: Creative Director and Founder of Satore Studio, 31 December 2020.

[S5] Venice Biennale website, 2019. Cosmos within us: Venice virtual reality. https://www.labiennale.org/en/cinema/2019/venice-virtual-reality/cosmos-within-us

[S6] Raindance Film Festival website, 2019. Raindance immersive award winners presented by Bose. 29 September. https://www.raindance.org/raindance-immersive-award-winners-presented-by-bose/

[S7] Aardman website, 2019, 11-11: Memories Retold. https://www.aardman.com/work/11-11-memories-retold-game/

[S8] BAFTA website, 2019. https://www.bafta.org/games/awards/british-game-2019

[S9] TIGA Games Industry Awards website, 2019 Winners https://tiga.org/awards/2019-winners

[S10] Welsh National Opera website, 2017/2018. Magic Butterfly. https://wno.org.uk/archive/2017-2018/magic-butterfly

[S11] Satore Studio website, 2017. HIK+. https://satorestudio.com/portfolio_page/hik/

Submitting institution
The University of Bath
Unit of assessment
11 - Computer Science and Informatics
Summary impact type
Societal
Is this case study continued from a case study submitted in 2014?
No

1. Summary of the impact

University of Bath research using participatory approaches to design and evaluate digital technologies, including mobile systems, and augmented and virtual reality for interactivity and learning, has:

  • Informed and shaped British Broadcasting Corporation (BBC) R&D projects and programmes, including the flagship micro:bit programme that distributed over 1,000,000 micro:bits to year 7 pupils across the UK in 2016, rising to 2,000,000 by 2018;

  • Supported development of award-winning digital educational products leading to investment in SMEs including ScienceScope, circa GBP500,000 in 2020;

  • Informed the design of an award-winning virtual reality non-fiction documentary, ‘The Waiting Room VR’, showcased at the 76th Venice film festival in 2019.

2. Underpinning research

Participatory research plays an important role in facilitating deployments of digital technologies into real world settings that enhance developmental and creative goals and applications. University of Bath research, led by Professor Stanton Fraser, focuses on the design and evaluation of mobile and ubiquitous technologies to enhance interactivity and learning.

For the past 15 years, Professor Stanton Fraser and her team have collaborated with industry partners and schools to explore the role of technology in children’s learning, evaluating its use and improving its design. In 2005-2006, Stanton Fraser carried out a school-based Participatory Design (PD) exercise with children, to develop an environmental pollution sensor that could be used with a mobile phone. This led to a prototype which connected to SME partner ScienceScope’s logbook datalogger via Bluetooth. This research demonstrated that large-scale co-design carried out ‘in the wild’ in everyday classrooms was a potentially useful design technique [1].

This research acted as an important pre-cursor to the follow up Participate project, a large-scale multi-partner research project in which Bath collaborated with the BBC, Microsoft, British Telecom, ScienceScope, Blast Theory (an internationally renowned group of interactive media artists) and the University of Nottingham (2005-2008, GBP3,000,000 funded by EPSRC) [A]. The research explored how pervasive computing could support nationwide campaigns and education [1-3] using research ‘in the wild’, iterative public trials and observational studies of emerging technologies [1]. University of Bath led the formal education workstream, building on the original mobile device prototype to explore the role that local environmental sensing could have on learning within schools. Through this project they developed the MobGeoSens hardware and integrated software to enable data collection and visualisation using mobile phone (Nokia), GPS receiver and ScienceScope’s Bluetooth-linked Sensor data logger [2]. They used a two-stage trial design, a pilot in two schools followed by a larger trial in 13 schools with children aged 13 to 15 years during 2006-2007 [3, 4] using a participatory approach. The research demonstrated that MobGeoSens enhanced engagement in learning [1-3] and identified the contextual factors to help the implementation of technology in educational settings including resources, teacher/school engagement and support [3].
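The core of a MobGeoSens-style pipeline is pairing each sensor reading with the most recent GPS fix so the data can later be visualised on a map. A simplified sketch of that pairing step, under stated assumptions (the class, field names and readings below are hypothetical illustrations, not from the project):

```python
from dataclasses import dataclass

@dataclass
class GeoSample:
    # One geotagged measurement, ready for map visualisation.
    timestamp: float
    lat: float
    lon: float
    sensor: str
    value: float

def geotag(readings, gps_fixes):
    """Pair each (time, sensor, value) reading with the latest
    (time, lat, lon) GPS fix taken at or before the reading's time."""
    samples = []
    for t, sensor, value in readings:
        fixes = [f for f in gps_fixes if f[0] <= t]
        if not fixes:
            continue                      # no position known yet: skip
        _, lat, lon = max(fixes)          # most recent fix (max by time)
        samples.append(GeoSample(t, lat, lon, sensor, value))
    return samples

# Hypothetical pollution readings and GPS track.
readings = [(10.0, "CO2_ppm", 415.0), (20.0, "CO2_ppm", 430.0)]
gps_fixes = [(9.0, 51.378, -2.326), (19.0, 51.379, -2.325)]
samples = geotag(readings, gps_fixes)
for s in samples:
    print(s.sensor, s.value, s.lat, s.lon)
```

In the actual system the readings arrived over Bluetooth from the ScienceScope data logger and the fixes from a GPS receiver attached to the phone; the join-by-timestamp idea is the same.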

The EPSRC Virtual Realities (VR) project (2017-2020, GBP1,300,000), with the University of Bristol and UWE, explored the design characteristics of VR and how to address interaction design challenges. Professor Stanton Fraser had previously demonstrated the role of spatial cognition in VR and the benefits of an egocentric (first-person) viewpoint for spatial tasks [4]. Drawing on this work, the VR research project analysed a representative sample of 150 Virtual Reality non-fiction (VRNF) titles released between 2012 and 2018 and identified 64 characteristics [5]. This was the first time the characteristic features of VRNF had been analysed at this scale. The research identified two critical aspects of VRNF (the viewer role, and embodiment and social interaction), shedding light on new audience roles as interactive participants in these new forms of immersive environments. However, the study found that few titles fully exploited the potential of egocentric perspectives, and that titles rarely provided visible evidence of physical embodiment; where this was provided, it enhanced the experience [5].

3. References to the research

1. Kanjo, E, Benford, S, Paxton, M, Chamberlain, A, Stanton Fraser, D, Woodgate, D, Crellin, D & Woolard, A 2008, 'MobGeoSen: facilitating personal geosensor data collection and visualization using mobile phones', Personal and Ubiquitous Computing, vol. 12, no. 8, pp. 599-607. https://doi.org/10.1007/s00779-007-0180-1 Joint publication with industrial partners Sciencescope and BBC.

2. Chamberlain, A, Paxton, M, Glover, K, Flintham, M, Price, D, Greenhalgh, C, Benford, S, Tolmie, P, Kanjo, E, Gower, A, Gower, A, Woodgate, D & Stanton Fraser, D 2014, 'Understanding mass participatory pervasive computing systems for environmental campaigns', Personal and Ubiquitous Computing, vol. 18, no. 7, pp. 1775-1792. https://doi.org/10.1007/s00779-013-0756-x Joint publication with industrial partners BT.

3. Woodgate, D, Stanton Fraser, D, Gower, A, Glancy, M, Gower, AP, Chamberlain, A, Dillon, T & Crellin, D 2010, Using mobile and pervasive technologies to engage formal and informal learners in scientific debate. in T Goh (ed.), Multiplatform E-Learning Systems and Technologies: Mobile Devices for Ubiquitous ICT-Based Education. Information Science Reference, pp. 196-214. https://doi.org/10.4018/978-1-60566-703-4.ch012 Joint publication with industrial partners Sciencescope and the BBC.

4. Foreman, N, Stanton Fraser, D, Wilson, PN, Duffy, H & Parnell, R 2005, 'Transfer of spatial knowledge to a two-level shopping mall in older people, following virtual exploration', Environment and Behavior, vol. 37, no. 2, pp. 275-292. https://doi.org/10.1177/0013916504269649

5. Bevan, C, Green, DP, Farmer, H, Rose, M, Cater, K, Stanton Fraser, D & Brown, H 2019, Behind the Curtain of the 'Ultimate Empathy Machine': On the Composition of Virtual Reality Nonfiction Experiences. in CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 2019 edn, 506, CHI Conference Proceedings, Association for Computing Machinery, Glasgow, United Kingdom, pp. 1-12. https://doi.org/10.1145/3290605.3300736 (approx. 20% acceptance rate and honourable mention).

Grants underpinning this research include:

[A] Nov 2005-Nov 2008. GBP3,500,000 TSB/EPSRC The Participate project: Pervasive Computing for Mass Participation in Environmental Monitoring. With University of Nottingham, BT, Microsoft, BBC, Blast Theory, Sciencescope

[B] Jan 2012-Jan 2016. GBP5,000,000 AHRC. REACT Knowledge Exchange Hub for the Creative Economy. With UWE, Universities of Bristol, Exeter and Cardiff and the Pervasive Media Studio.

[C] July 2017-Jan 2020 GBP1,300,000 EPSRC Virtual Realities – Immersive Documentary Encounters. With University of Bristol and UWE. EP/P025595/1.

4. Details of the impact

Influencing BBC Research and Development projects, programmes, and products

The BBC and the University of Bath have collaborated for the past 15 years (most recently within the Virtual Realities and Bristol and Bath Creative R&D Cluster projects) on research into mobile systems and augmented and virtual realities, which has provided the pathway to a variety of impacts from the Bath research. The Head of BBC Research and Development Northern Lab states that the work with Bath: “creates an ecosystem for research that […] enables us to be plugged into regional networks in the UK leading the way in Human Computer Interactions (HCI)” [A]. The strength of this long-term relationship was reflected in the University of Bath being selected by the BBC as 1 of 6 partners in its User Experience (UX) Research Partnership in 2013 [A, B]. This led to a programme of collaborative research, knowledge exchange projects and placements of Bath researchers and students over the past 7 years (between 2013 and 2020), giving the BBC access to “expertise and research insights in Human Computer Interaction and psychology together with undertaking in-depth and rigorous analysis, research and evaluation” [A], and leading to additional collaborative research projects such as those noted above [A, B].

Insights from these collaborative research projects have directly benefited the BBC. The work from the Participate project on formal education led by Bath [1-3, A] continues to have an impact on BBC school-based/formal education programmes and on their delivery of educational products. The Head of BBC Research and Development (Northern Lab) notes a “ripple effect in that the Participate project led to the Microbit project [..] From the Participate project […] we knew what it would take to engage teachers, train teachers to be comfortable in using the technology in their lessons addressing the barriers to adoption and delivering tech at scale” [A, 2]. The micro:bit is a pocket-sized codeable computer with motion detection, a built-in compass and Bluetooth technology. The BBC’s flagship programme “Make it Digital” distributed the micro:bit to every child in year 7 (or equivalent) across the UK in 2016. 1,000,000 micro:bits were distributed in 2016, rising to 2,000,000 by 2018, and they are now being distributed internationally [C].

Supporting product development and investment in SMEs

Bath’s collaborative research programme with ScienceScope [2-4], an SME that develops educational sensors, led to ScienceScope gaining “substantial benefits from working with Professor Stanton Fraser at the University of Bath[...] Our collaborative research on the mobile sensing and mobile data and the insights from this has very much led to our ground-breaking Internet of Things work including informing our IoT sensing kits. On the international stage ScienceScope punches well above its weight in part through our partnership with the University of Bath. We have recently won a number of contracts in both the UAE (Project Class Explore 2018-2020) and Singapore (IOT@School 2015-2018), worth GBP200,000 and GBP300,000 respectively, to carry out novel product development for internet of things and research, as a result” [D, ScienceScope CEO]. ScienceScope has become an international player in educational sensor distribution, winning GBP1,200,000 of export deals in 2018 [E].

The GBP5,000,000 AHRC REACT Knowledge Exchange Hub (between 2012 and 2017) delivered substantial impact, including 76 new pieces of software and 10 new companies, with a further GBP5,353,569 invested in projects [F, pp.4-5]. As Co-Investigator and expert adviser on REACT, Stanton Fraser’s research insights were key to the research programme design, leading to specific benefits for end users. In relation to the Play Sandbox as part of REACT (2014 to 2015), “Professor Stanton Fraser’s insights and expertise around codesign with children, including digital educational tools and ‘in the wild’ approaches has fed into the research design and associated projects of the REACT programme, including Sensible Object (Beast of Balance), BioBeats (Breathing Stone) and Enabling Play (Millie Moreorless)” [G, REACT Executive Producer]. These projects were successful, securing further investment and funding [F] to support commercially successful products and businesses.

Enriching creative processes and cultural experiences through virtual reality documentaries

The EPSRC VR project led to the commissioning of three non-fiction VR documentaries, two of which had producers new to VR. One of these documentaries, ‘The Waiting Room VR’, presents a personal story of the Director’s breast cancer journey from diagnosis through treatment to recovery [H, I]. A key design feature of the piece, rooted in the Bath research, was the viewpoint of the participant; the Director praised the support and insights she received, stating that “Professor Stanton Fraser’s (Danaë’s) research around spatial viewpoint, cognition and behaviour [4], in real and virtual environments, was very useful to me when developing The Waiting Room and considering this perspective (…) Danaë’s research around spatial viewpoint and cognition was crucial in these discussions and decisions” [J]. ‘The Waiting Room’ was selected to showcase at the 76th Venice Film Festival (2019) and at IDFA in Amsterdam (2019). In addition to reaching audiences at these major forums for film, the documentary won The IDFA DocLab Award for Digital Storytelling (2019) and was selected for Forbes’ Top 50 XR experiences of 2019 [H].

5. Sources to corroborate the impact

[A] Testimonial Letter, Head of BBC Research and Development Northern Lab, BBC, 15 January 2021.

[B] Testimonial Letter, Lead Research Scientist, BBC, 7 December 2020, including collated screenshots of:

Bath Echo, BBC Team up with University of Bath for Research Project, 19 July 2013. https://www.bathecho.co.uk/news/bbc-team-up-with-university-of-bath-for-research-project-51396/; and

BBC Research and Development blog, BBC R&D launches the UX Research Partnership, 2013,

https://www.bbc.co.uk/rd/blog/2013-07-bbc-rd-launches-the-ux-research-partnership

[C] Micro:Bit website, accessed 20 January 2021, https://microbit.org/impact/case-studies/milestones-for-the-bbc-microbit/ and https://microbit.org/

[D] Testimonial Letter, Chief Executive Officer, ScienceScope, 20 January 2021.

[E] UK Government website, Department for International Trade, “Bath tech firm secures £1.2 million worth of export deals”, 23 July 2018. Available at: https://www.gov.uk/government/news/bath-tech-firm-secures-12-million-worth-of-export-deals

[F] REACT report 2012-2016. A playground for new ideas. A place to collaborate. Experiment, explore, create. Spark change. Go disrupt. http://www.react-hub.org.uk/publications/react-report/ REACT impact report: https://www.watershed.co.uk/sites/default/files/publications/2018-10-08/reactreport.pdf

[G] Testimonial e-mail, Chief Executive Officer Watershed, Executive Producer of the REACT programme, 26 March 2021.

[H] ‘The Waiting Room VR’: A Film and VR Experience. 2019. https://eastcityfilms.com/the-waiting-room-vr

[I] Immerse UK website, ‘The Waiting Room: One Year On’. https://www.immerseuk.org/case-study/the-waiting-room-one-year-on/ accessed 20 January 2021.

[J] Testimonial Letter, Director, ‘The Waiting Room’, 7 December 2020.
