Shaping Digital Industry Best Practice to Improve User-Experience for Children
1. Summary of the impact
The University of Central Lancashire’s Child Computer Interaction Group (ChiCI) has pioneered interaction design with children, influencing digital industries across the world. Our research into Child User-Experience (ChildUX) contributes the methods, practices and policies by which software design with children takes place. Impact occurs through our research feeding into unique courses and ready-to-use materials for ChildUX. The courses draw on our findings to show industry how to design, evaluate and interact with technology ethically alongside children. Over 450 delegates from the USA, India, Canada and Singapore have attended these events, which also target academics engaged in teaching ChildUX, increasing the capacity of universities globally to teach the subject. IT organisations including Amberlight and Kano have adopted our methods and changed their practice and policies, and, in collaboration with Amberlight, we have improved the usability, accessibility and content of games delivered by the BBC to children on tablet computers.
2. Underpinning research
User experience is a well-established discipline and practice in both industry and research. The User Experience Professionals' Association (UXPA) and the British Computer Society (BCS) are professional bodies that accredit practitioners in user experience and in evaluating technology for usability. However, their procedures are specifically for, and about, adult-facing methods. Professor Read’s research goes back three decades and shows that children as young as seven can use survey tools, such as ‘The Fun Toolkit’ which we designed in 2002, and other methods to evaluate computer products [1]. Our research is practical and open source, and our methods and applications are widely used by the digital industries. Assessing children’s user preferences is difficult because children tend to acquiesce towards what they feel an adult questioner expects. In our research [1] a democratic approach was adopted so that children were able to express a choice for one particular design over another.
In retrospect, however, we concluded that an inclusive approach was required to ensure that we captured the voices, artwork and ideas of every child who engaged in our surveys. We pioneered the use of drawings to gather user-experience feedback from children and to discover what they considered to be fun. This drawing methodology was also used to evaluate ‘goal fit’, the method used to establish how users work out what to do in games and entertainment interfaces. ‘Experience it, Draw it and Rate it’ refined the traditional evaluation methods for the design of new technologies for children. The methods were shown to be highly reliable and could be coded, resulting in a straightforward user-experience method for involving children as young as 5-7 years old [2].
Our research then moved on to construct methods by which researchers and analysts can clarify their ethical objectives in design and examine whose values are being served – their own or the children’s. This was the first method devised to enable researchers to communicate with children about their research and to facilitate its ethical use with children and teachers. Our approach involved creating a series of questions that challenge the designer to consider the appropriateness of the technical solution selected and whether it was appropriate for children to be involved. This work on theoretically derived and empirically studied methods for working ethically with vulnerable populations resulted in the ‘CHECk’ Tool, which has been adopted by other researchers [3].
By 2016 our methods had been refined and we were able to demonstrate that it was possible to obtain long-term feedback from children on their use of computer games and software over a three-year period. We also established that interviewing on a one-to-one basis, as in our ‘MemoLine’ trial, produced much better data for analysts. Again, a combination of democratic and inclusive approaches was used: children were interviewed about the coloured timelines they had filled in as part of the ‘MemoLine’ process. This allows researchers to explore why children’s experiences of a particular product change over time [4].
Our research with the BBC on children with special needs studied what happens in the home with children’s recreational use of tablet computers. The children who participated had cognitive, sensory or physical needs, and these multiple challenges brought to light many of the difficulties parents experience when mediating tablet use; these groups are not typical technology users. The study reported on home use across four interacting areas of common concern: supporting family play; fitting technology use into the family day; staying turned on or off; and assisting the parent in determining rules and systems. The study identified and raised many challenges for human-computer interaction and the complexity of usability. Considered collectively, the many kinds of physical, sensory and cognitive challenges presented by the participants illuminated difficulties that all families have. This study, co-written with the BBC, therefore demonstrated how to approach designing user experience successfully from the perspective of those on the margins, rather than via the typical users found in the middle ground [5].
We then went on to create a process called ‘RAId’ (Rapid Analysis of design Ideas), which enables the ethical and inclusive analysis of large sets of design data. We developed a method that was as inclusive as possible when dealing fairly, effectively and reasonably with the large volume of participants’ designs. These designs had been created in what we termed a ‘fast and furious’ design process involving 120 teenagers working in small groups in the space of 90 minutes. This was participative design on a grand scale and had not been attempted before. Furthermore, it required investigators to consider design ideas with appropriate care and respect, while the method also allowed novel ideas to be tracked from, and attributed back to, each young contributor [6].
3. References to the research
1. Read, J. C., & MacFarlane, S. (2006). ‘Using the fun toolkit and other survey methods to gather opinions in child computer interaction.’ In Proceedings of the 2006 Conference on Interaction Design and Children (pp. 81-88). ACM. https://doi.org/10.1145/1139073.1139096
2. Xu, D., Read, J. C., Sim, G., & McManus, B. (2009). ‘Experience it, draw it, rate it: capture children's experiences with their drawings.’ In Proceedings of the 8th International Conference on Interaction Design and Children (pp. 266-270). ACM. https://doi.org/10.1145/1551788.1551849
3. Read, J. C., Horton, M., Sim, G., Gregory, P., Fitton, D., & Cassidy, B. (2013). ‘CHECk: a tool to inform and encourage ethical practice in participatory design with children.’ In CHI '13 Extended Abstracts on Human Factors in Computing Systems (pp. 187-192). ACM. https://doi.org/10.1145/2468356.2468391
4. Sim, G., Nouwen, M., Vissers, J., Horton, M., Slegers, K., & Zaman, B. (2016). ‘Using the MemoLine to capture changes in user experience over time with children.’ International Journal of Child-Computer Interaction, 8(1), 1-14. https://doi.org/10.1016/j.ijcci.2016.07.001
5. Read, J. C., Horton, M., Clarke, S., Jones, R., Fitton, D., & Sim, G. (2018). ‘Designing for the “at home” experience of parents and children with tablet games.’ In Proceedings of the 17th ACM Conference on Interaction Design and Children (IDC '18) (pp. 441-448). ACM, New York, NY, USA. https://doi.org/10.1145/3202185.3202769
6. Read, J. C., Fitton, D., Sim, G., & Horton, M. (2016). ‘How Ideas make it through to Designs: Process and Practice.’ In Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI '16), Article 16, 10 pages. ACM, New York, NY, USA. https://doi.org/10.1145/2971485.2971560
All articles are peer reviewed.
4. Details of the impact
The work carried out by the University of Central Lancashire’s Child-Computer Interaction Group (ChiCI) impacts on the IT design industry and, consequently, on the children who use, consume and interact with computer-based games and applications. We have changed industry practice and policy, and that of software designers, by providing free-to-use tools that enable ethical and practical child-computer interaction research and product development [A, C, D, F and J]. Our research activities and courses delivered worldwide have also helped shape industry practice and policy, for example with the BBC, which has changed how it delivers apps to its 4.1 million child users within the UK [A]. The Indian Institute of Technology Guwahati, India, has said that our “course has enabled us to acquire the skills and competencies … thus increasing our capacity to conduct participatory design sessions with children.” [B]
Use of UCLan User Experience methods to shape BBC policy
Our work with the BBC, in conjunction with Amberlight [C], has changed how the corporation delivers content via tablet apps to 4.1 million children in the UK and to others around the world. We designed a survey tool for twenty families whose children had mild physical impairments or learning difficulties and asked them to fill in diaries over a two-week period. Based on the analysis of the data they provided, the BBC changed its policy. Before our work the BBC’s main delivery route for an app was via its website, and children were not accessing games through this platform. Our research contributed to the decision to place the games in app stores such as the Google Play Store to ensure all children could find the games. In addition, modifications were made to the accessibility settings of the games so that parents could more easily interact and assist all children to gain the best user experience, further enabling their participation in online activities. The modifications to the games and to their delivery have therefore also reached many of the 73,000 children in the UK with complex cognitive or physical needs who use BBC software [C], a further impact of our study that has improved social inclusion, in the form of access to games, for these children and their families.
The study is seen as good practice by the sector because the participatory design survey methods involved users ‘from the margins’ rather than those who represent the middle-ground experience. The work with the BBC instigated changes to the corporation’s games accessibility guidelines. These guidelines are used by UX teams and games developers within the Children’s BBC team for products distributed across various platforms, online and in app stores. The findings also influenced the reshaping of the CBeebies games delivery roadmap to include a new evaluation milestone covering children’s accessibility needs [A]. Suzanne Clarke, Senior UX Designer, BBC Research & Development, said: “As a direct response to the findings from research conducted by Amberlight and UCLAN in 2017, the BBC updated its mobile accessibility guidelines to include more specific requirements around designing and testing for devices with alternative inputs.” This is now part of the BBC’s Mobile Accessibility Guidelines. She goes on to describe how further BBC development occurred because, “The work also informed our ‘How to’ guide on the BBC Global Experience Language blog, and enabled us to communicate problems faced by users as we designed games for motor accessibility exemplars.” [D]
ChiCI UX Playbook and website and Fun Toolkit applications
The ChiCI UX Playbook and website have been developed to enable practitioners to use and adopt our methods. Five of our methods have been transformed into educational materials and can be accessed from the site: the Fun Toolkit [1]; the Drawing Intervention [2]; the CHECk Tool [3]; MemoLine [4]; and RAId [6]. These tools have enabled organisations and researchers across the world to work ethically to obtain data and make informed decisions about the user experience of technologies for children [H]. Prof. Panos Markopoulos, Vice-Dean and Director of Research at Eindhoven University of Technology, commented: “In my research projects and collaborative projects with industry concerning the design of outdoor games for children and games to support social interactions between children, I have…used the Fun Toolkit quite regularly. I estimate that my team has applied it to evaluate at least 20 different interactive prototypes over the last ten years. A recent example of a successful industrial application concerned the evaluation of Oopsie Heroes (https://oopsieheroes.com/) an application to help young children stop bedwetting. The system which has been launched as a commercial product was co-designed with children, and the Fun Toolkit was an essential instrument for obtaining feedback regarding the interactive prototypes developed during the design and development process, and thus helping ensure children would enjoy interacting with this product that concerns a very sensitive part of their life.” [E]
The Fun Toolkit ensures that accurate, practical and reliable feedback on children’s user experience can be fed back into computer design. It comprises three separate tools that can be used together or singly: the ‘Smileyometer’, the ‘Fun Sorter’ and the ‘Again, Again’ scale. It has been promoted to over 200 practitioners and 250 academics and industrial software engineers at courses and workshops in the USA, India, Indonesia and the UK. Dr Eunice Sari, CEO of UX Indonesia, reports that the ChiCI group’s methods “have been widely adopted by practitioners within the UX community in Indonesia.” [F] It is featured on industry-standard websites, such as ‘ALL ABOUT UX: Information for user experience professionals’, as a validated method of ensuring that the feedback gained from evaluative workshops with children contributes successfully to innovations in the design and delivery of new products [G].
These methods have been transformed into training and development materials for IT researchers and industry professionals. Alongside the Fun Toolkit we have the ‘Drawing Intervention’, an easy-to-use method of evaluating children’s drawings that gives reliable scores across different interfaces and coders; the CHECk Tool, which instructs and encourages ethical practice in participatory design with children; MemoLine, which captures children’s user experience longitudinally; and finally RAId, which enables large numbers of young participants to engage in feedback and evaluative processes. This suite of methods and tools has enabled organisations, including the digital consultancies Kano and Amberlight [C], and child-computer interaction researchers to work ethically with young people and obtain data that supports informed decisions about children’s user experience of technologies. Shu Ting Huang, Software Product Lead for Kano, said: “As an organisation we have a tradition of working with children in usability and UX work. The usability tools from the ChiCI group at UCLan are the first we have seen that are specifically designed for children that are appropriate to industries like ourselves. Before these tools being made available, we have had to rely on usability and UX tools that were intended for adults and have had to make adjustments to make them work with children. The UX Playbook and the associated tools will enable us to carry out more effective and more child-friendly usability and UX evaluations.” [I]
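To illustrate how Smileyometer responses might be handled once collected, the short Python sketch below records ratings on the tool’s usual five-point scale, from ‘awful’ to ‘brilliant’, and summarises them per activity. It is an illustrative sketch only, not part of the published Fun Toolkit materials; the function names, identifiers and sample data are hypothetical.

```python
# Illustrative sketch (not part of the published Fun Toolkit): recording
# Smileyometer ratings on the five-point scale and summarising them per
# activity. All identifiers and sample data below are hypothetical.
from collections import defaultdict
from statistics import mean

# The Smileyometer scale is conventionally anchored from "awful" to "brilliant".
SMILEYOMETER = {1: "awful", 2: "not very good", 3: "good", 4: "really good", 5: "brilliant"}

def summarise(ratings):
    """Group (child_id, activity, score) tuples by activity and report counts and means."""
    by_activity = defaultdict(list)
    for child_id, activity, score in ratings:
        if score not in SMILEYOMETER:
            raise ValueError(f"score must be 1-5, got {score!r} for child {child_id}")
        by_activity[activity].append(score)
    return {
        activity: {
            "n": len(scores),
            "mean": round(mean(scores), 2),
            "labels": [SMILEYOMETER[s] for s in scores],
        }
        for activity, scores in by_activity.items()
    }

if __name__ == "__main__":
    # Hypothetical responses: (child id, activity evaluated, Smileyometer score)
    sample = [("c1", "drawing game", 5), ("c2", "drawing game", 4), ("c1", "quiz", 3)]
    for activity, stats in summarise(sample).items():
        print(activity, stats)
```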
Capacity Building
The Smileyometer is popular across industry in applications where feedback from children is required. Examples include a Royal Holloway research project, ‘Children and the Police: Investigating children’s perceptions of the police, and the way that the police work with young children’; the Paris-based freelance UX designer David Phanouvong, who has used the tool to help the company Pandacraft evaluate the themed play ‘box’ app it sends each month to its 3-to-12-year-old subscribers; the ‘influential’ Malaysian blogger ‘Flying Dance’, who recommends its use for blog evaluation; a study in which children with Autistic Spectrum Disorder (ASD) played drums and used the tool to evaluate their feelings of enjoyment and fun; and the Finnish Aalto University, which lists the Fun Toolkit on its ‘Experience Platform’, an ‘open community for people interested in experience research’. Further international usage is demonstrated by the Fun Toolkit being available from the Brazilian website for UX designers Célula de Design e Multimídia [J].
Use of the Fun Toolkit is paired with the courses and workshops we deliver in association with the universities that host these events around the world. This has increased the capacity of academics to deliver child-computer interaction methods. For instance, our workshop ‘Teaching the Next Generation of Child-Computer Interaction Researchers and Designers’ at the Interaction Design and Children (IDC) conference in 2020 was attended by 40 participants. The Indian Institute of Technology Guwahati hosted our Advanced Course in Methods for Child Computer Interaction and has incorporated our methods into its postgraduate curriculum, resulting in joint publications between the Child Computer Interaction Group and its Masters students. Abhishek Shrivastava, Assistant Professor in the Department of Design, comments that: “the course proved highly novel and helped the delegates in learning newer methods for evaluation with children. Across several different informal feedback sessions following the course, we have learnt from delegates that the course was effective in terms of imparting relevant knowledge aimed at engaging children and designing products for them. … Overall having this course at IIT Guwahati has benefited the staff and students who attended and helped increase our understanding and awareness of methods that are appropriate for working with children.” [B]
The methods from the course have also been adopted into university teaching. At Eindhoven University of Technology, Prof. Panos Markopoulos, Vice-Dean and Director of Research, said that his “colleagues and I have been using the Fun Toolkit quite extensively: … Between 2008 and 2015 myself and my colleague Prof. Tilde Bekker delivered a one-week master’s level course, (total effort for students 40 hours, on average 20 participants per year) covering Interaction Design and Children. The Fun Toolkit was taught to students who used it in their practical assignments. At TU/e we follow a challenge-based education model in which students learn by taking on real-world challenges. Roughly 40-50 student projects every year concern children as users of technology for learning, entertainment and health. I estimate that at least half of these projects have applied one or the other part of the Fun Toolkit because of its versatility…In short, I would like to express my warm appreciation for the Fun Toolkit, but also the other works of the ChiCI group relating to co-designing with children. They have been inspiring and directly useful for our own work.” [E]
A number of schools and companies have been unable to complete their projects with us as a result of the Covid-19 pandemic. As Kano says: “Since being introduced to the UX Playbook by the ChiCI team in February 2020, our work with children has been limited due to the ongoing COVID crisis” [I]. We have also been prevented from collecting further feedback data from schools due to the pandemic.
5. Sources to corroborate the impact
A. BBC website for designers: How to design accessible games
B. Testimonial letter from Head of School, Indian Institute of Technology at Guwahati
C. BBC designer website: Alternative input methods along with joint document with Amberlight for the BBC
D. Testimonial from Suzanne Clarke of the BBC on how ChiCI findings were applied at the BBC.
E. Testimonial from Prof. Panos Markopoulos, Vice-Dean and Director of Research, Eindhoven University of Technology
F. Testimony from Dr Eunice Sari, UX Indonesia
G. Listing of Fun Toolkit on All about UX website
H. ChiCI UX Playbook containing the Fun Toolkit
I. Email testimony, Shu Ting Huang, Software Product Lead - Kano
J. Examples of the use of the Smileyometer by industry