
Efficient and Robust Signal/Image Processing and Machine Learning Methods for Real Life Challenging Scenarios

1. Summary of the impact

Novel signal/image analysis and machine learning methods have been developed and efficiently implemented in a range of applications and devices, including devices with low computational capability. Highly innovative UWS research is achieving significant impact in several areas, including the electronics and computing industries, where top multinational companies such as Cirrus Logic and SMEs such as Codeplay Software Ltd, MODO Systems Ltd and Lumen Research Ltd have implemented our methods to realise socio-economic benefits. The applicability of our developments has gained recognition from the Government, has led to implementations in the Health and Social Care sector, and has informed the standardisation and open access communities.

2. Underpinning research

The Affective and Human Computing for Smart Environment (AHCSE) research centre aims to create state-of-the-art knowledge in signal processing (1D/2D) and machine learning, with a specific focus on making all of our developments applicable in challenging real-life scenarios. Combining these disciplines, we have created accurate, robust and efficient algorithms and applications:

Emotion recognition from physiological signals (EEG and ECG), implemented in the Emotional Gym for the social care sector; we developed efficient and robust algorithms for the automated recognition of human affect (emotion). To address this, we created DREAMER, a pioneering open-access database consisting of recordings of EEG and ECG signals captured while audio-visual stimuli were presented to participants in order to elicit specific emotions [3.1]. Portable, wearable and wireless devices were used for both the EEG and ECG capture, allowing affect recognition algorithms to be evaluated on signals that can be conveniently captured in everyday scenarios and providing the means to integrate affective computing methods into a wide variety of tasks. After assessing the quality of the participants' ratings, a baseline for this database was established by evaluating EEG- and ECG-based features through participant-wise supervised classification experiments. We then demonstrated the feasibility of integrating affective computing methods into everyday applications [3.2] through the use of portable/wearable equipment, as in the Emotional Gym project. The initial research was carried out through an innovation grant funded by Loretto Care and was followed by a research grant from the Construction Scotland Innovation Centre with the Wheatley Group, a leading and award-winning provider of care and support services [3.I].
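
To illustrate what a participant-wise supervised classification baseline of this kind looks like, the sketch below trains a separate classifier for each participant under leave-one-trial-out cross-validation. It is a minimal sketch under assumptions: feature extraction is replaced by synthetic data, and the classifier, sizes and labels are illustrative rather than the exact published protocol.

```python
# Minimal sketch of a participant-wise supervised classification baseline, in the spirit
# of the DREAMER evaluation described above. Feature extraction (e.g. EEG band power or
# ECG heart-rate features) is stubbed out with synthetic data; the classifier, the sizes
# and the binary labels are illustrative assumptions, not the published protocol.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_participants, n_trials, n_features = 23, 18, 64   # illustrative sizes only

accuracies = []
for participant in range(n_participants):
    # Placeholder per-trial feature vectors and balanced binary labels (e.g. low/high valence).
    X = rng.normal(size=(n_trials, n_features))
    y = rng.permutation(np.repeat([0, 1], n_trials // 2))

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    correct = 0
    # Leave-one-trial-out cross-validation within each participant's own recordings.
    for train_idx, test_idx in LeaveOneOut().split(X):
        clf.fit(X[train_idx], y[train_idx])
        correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
    accuracies.append(correct / n_trials)

print(f"Mean participant-wise accuracy: {np.mean(accuracies):.2f}")
```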

Continuous real-time monitoring of respiratory patients in connected-health scenarios by detecting and analysing cough episodes from audio signals acquired in challenging noisy conditions; we proposed a more robust approach to cough detection, with high sensitivity (up to 88.51%) and specificity (up to 99.77%) in a variety of noisy environments [3.3], based on two-dimensional spectrograms and feature extraction techniques. However, implementing such methods together with easily customisable machine learning algorithms drained the mobile device's battery quickly (in less than 2 hours), so we proposed efficient machine learning implementations for the smartphone that offer accuracy of more than 93% when continuously monitoring cough for 48 hours while keeping the smartphone fully functional [3.4]. The original research was carried out during the ground-breaking SMARTCOUGH project funded by the Scottish Funding Council (SFC) [3.A]. We then exploited this technology for the early detection of lung cancer, funded by Cancer Research UK [3.B].
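
A simplified sketch of this spectrogram-plus-classifier pipeline is given below. The published work uses local Hu moments of the spectrogram [3.3] and a smartphone-optimised k-NN implementation [3.4]; here a pooled log-spectrogram and scikit-learn's standard k-NN act as placeholders, with synthetic audio frames, so the sampling rate, frame length and feature choice are assumptions.

```python
# Simplified sketch of spectrogram-based cough detection with a k-NN classifier.
# The published method uses local Hu moments of the spectrogram [3.3] and a
# smartphone-optimised k-NN [3.4]; here a pooled log-spectrogram and scikit-learn's
# standard k-NN act as placeholders, and the audio frames are synthetic.
import numpy as np
from scipy.signal import spectrogram
from sklearn.neighbors import KNeighborsClassifier

FS = 16_000          # assumed sampling rate (Hz)
FRAME_LEN = FS       # one-second analysis frames (assumption)

def frame_features(frame: np.ndarray) -> np.ndarray:
    """Log-spectrogram of one audio frame, averaged over time into a fixed-length vector."""
    _, _, sxx = spectrogram(frame, fs=FS, nperseg=512, noverlap=256)
    return np.log(sxx + 1e-10).mean(axis=1)

rng = np.random.default_rng(1)
# Placeholder training data: loud noise bursts stand in for coughs, quiet noise for background.
cough_frames = [rng.normal(scale=1.0, size=FRAME_LEN) for _ in range(20)]
other_frames = [rng.normal(scale=0.1, size=FRAME_LEN) for _ in range(20)]
X = np.array([frame_features(f) for f in cough_frames + other_frames])
y = np.array([1] * 20 + [0] * 20)

knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# Classify a new (synthetic) frame.
test_frame = rng.normal(scale=1.0, size=FRAME_LEN)
label = knn.predict(frame_features(test_frame).reshape(1, -1))[0]
print("cough detected" if label == 1 else "no cough")
```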

Other major research strands include video decoder optimisation, based on a parallel decoder architecture for the then-latest video coding standard, High Efficiency Video Coding (HEVC), proposed for mobile platforms in 2014. In addition, subjective and objective quality evaluation of the HEVC standard was conducted under the global standardisation body MPEG, a working group of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). MPEG is responsible for developing international standards for the compression, decompression, processing and coded representation of moving pictures, audio and their combination [3.5, 3.C, 3.G]. A real-time content-based image retrieval (CBIR) framework was developed, based on highly efficient deep learning, for mobile platforms [3.6]; alongside the CBIR framework, a state-of-the-art person identification and recognition framework was developed and implemented on mobile platforms [3.D, 3.F, 3.I]. The research team has also developed the first real-time mobile eye-tracking framework based on deep learning. It implements the stages of data acquisition, large-scale data storage, data cleansing and mining, feature extraction, development of new machine learning models, testing of these models, tuning of the model parameters, and deployment of the newly invented AI models in a distributed environment [3.E].
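
As a rough illustration of the retrieval step in a CBIR framework of this kind, the sketch below indexes compact image descriptors and answers queries by cosine similarity. In the actual framework the descriptors come from a deep convolutional network and are compacted for mobile use [3.6]; here random vectors stand in for the CNN features and PCA stands in for the compaction step, so all dimensions and names are illustrative assumptions.

```python
# Minimal sketch of the retrieval step in content-based image retrieval (CBIR) with
# compact descriptors, in the spirit of [3.6]. In the actual framework the descriptors
# come from a deep convolutional network; here random vectors stand in for CNN features
# and PCA stands in for the compaction step, so all dimensions are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import normalize

rng = np.random.default_rng(2)
n_images, deep_dim, compact_dim = 1000, 2048, 64     # assumed sizes

gallery = rng.normal(size=(n_images, deep_dim))      # placeholder deep descriptors
pca = PCA(n_components=compact_dim).fit(gallery)     # learn a compact projection
index = normalize(pca.transform(gallery))            # L2-normalise for cosine similarity

def retrieve(query_descriptor: np.ndarray, top_k: int = 5) -> np.ndarray:
    """Return indices of the top-k gallery images most similar to the query."""
    q = normalize(pca.transform(query_descriptor.reshape(1, -1)))
    scores = index @ q.ravel()                        # cosine similarity against the index
    return np.argsort(scores)[::-1][:top_k]

print(retrieve(rng.normal(size=deep_dim)))
```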

3. References to the research

3.1 Katsigiannis, S. and Ramzan, N., (2018) DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals from Wireless Low-cost Off-the-Shelf Devices. IEEE Journal of Biomedical and Health Informatics, 22 (2): 98-107. https://doi.org/10.1109/JBHI.2017.2688239.

3.2 Katsigiannis, S., Willis, R. and Ramzan, N., (2019) A QoE and Simulator Sickness Evaluation of a Smart-Exercise-Bike Virtual Reality System via User Feedback and Physiological Signals. IEEE Transactions on Consumer Electronics, 65(1): 119-127. https://doi.org/10.1109/TCE.2018.2879065.

3.3 Monge-Alvarez, J., Hoyos-Barcelo, C., Lesso, P. and Casaseca-de-la-Higuera, P., (2019) Robust Detection of Audio-Cough Events Using Local Hu Moments. IEEE Journal of Biomedical and Health Informatics, 23(1): 184-196. https://doi.org/10.1109/JBHI.2018.2800741.

3.4 Hoyos Barceló, C., Monge-Álvarez, J., Shakir, M. Z., Alcaraz Calero, J. M., & Casaseca, J. P. (2017). Efficient k-NN implementation for real-time detection of cough events in smartphones. IEEE Journal of Biomedical and Health Informatics, 22(5): 1672-1771. https://doi.org/10.1109/JBHI.2017.2768162.

3.5 Tan, T., Weerakkody, R., Mrak, M., Ramzan, N., Baroncini, V., Ohm, J. and Sullivan, G., (2016) Video Quality Evaluation Methodology and Verification Testing of HEVC Compression Performance. IEEE Transactions on Circuits and Systems for Video Technology, 26(1): 76-90. https://doi.org/10.1109/TCSVT.2015.2477916.

3.6 Alzu'bi, A., Amira, A. and Ramzan, N., (2017) Content-based Image Retrieval with Compact Deep Convolutional Features. Neurocomputing, 249: 95-105. https://doi.org/10.1016/j.neucom.2017.03.072.

Grants

3.A Casaseca, P., SMARTCOUGH: Continuous intelligent cough detection and identification using smartphones, Scottish Funding Council (Digital Health & Care Institute (DHI)), June 2015 to November 2016, GBP69,940.84

3.B Casaseca, P., Usoro, I., Dahal, K., Audio technology to detect lung cancer earlier, Cancer Research UK, June 2016 to November 2018, GBP19,244.00

3.C Ramzan, N., Video processing and computer vision for mobile environment, Innovate UK: KTP with Codeplay Software Ltd, September 2015 to September 2017, GBP85,821

3.D Ramzan, N., Shakir, M. Z., Pervez, Z., To develop the next generation of digital content management platform, Innovate UK: KTP with MODO Systems Ltd, July 2018 to June 2020, GBP109,918

3.E Ramzan, N., Keir, P., To develop a software tool for predicting a viewer's likely response to a range of visual communications, Innovate UK: KTP with Lumen Research, April 2018 to September 2020, GBP144,733

3.F Ramzan, N., Pervez, Z., To develop an automated safeguarding platform to prevent misuse of collaborative platforms used within education, Innovate UK: KTP with Seric Systems, August 2019 to January 2022, GBP149,569

3.G Ramzan, N., Gisbert, H., To develop the next generation of automated surveillance system to enhance security and public safety, Innovate UK: KTP with Visual Management Systems Limited, February 2020 to January 2023, GBP180,278

3.H Ramzan, N., Pervez, Z., Keir, P., Johnstone, J., To develop an advanced remote health monitoring and behavioural analysis solution utilising machine vision and artificial intelligence, Innovate UK: KTP with Kibble Education and Care Centre, April 2020 to December 2022, GBP124,366

3.I Ramzan, N., Shakir, M. Z., Pervez, Z., Advanced Training in Health Innovation Knowledge Alliance (ATHIKA), Erasmus+, January 2019 to December 2021, EUR945,060

4. Details of the impact

Significant socio-economic impacts have been achieved by the UWS research team.

Societal Impact: A pioneering new care home solution for the elderly has been developed directly from our research [5.9]. Our Emotional Gym solution, deployed at two different care home sites and funded by the Construction Scotland Innovation Centre (CSIC) and Loretto Care, has enabled care home residents to improve their mobility, exercise and stay physically active for longer periods. Our research has created new emotion-technology-driven health and care management systems, and its originality and reach were recognised in 2018 with the following awards:

Laing Buisson Awards: Innovation & Leaders Category – Innovation in Care Award

Holyrood Connect Digital Health and Care Awards - Digital Innovation Award

Our expertise in mobile app development and signal processing has resulted in the development of a novel, publicly available mobile app, Stay Safe Scotland, in partnership with the Council of Ethnic Minority Voluntary Sector Organisations (CEMVO) Scotland, with funding from the Scottish Government, to help tackle the disproportionate number of ethnic minority people affected by Covid-19 [5.10]. The app seeks to overcome major barriers faced by ethnic minority communities in Scotland by providing social distancing guidelines in a variety of languages and by predicting footfall at nearly 100 supermarkets throughout Scotland, helping users schedule visits at quieter times and avoid queuing and overcrowding.

Public Impact: Loretto Care and the Wheatley Group have implemented our life-changing research on emotion recognition from physiological signals, resulting in the development of an innovative stationary exercise bike integrated with in-house-developed, highly interactive games to monitor the health and wellbeing of members of the public with limited mobility (e.g. elderly people). Our results show that, as a result, more than 30% of care home residents are highly likely to participate in physical activities and sports, thereby improving the health and wellbeing of the public.

Our pioneering SmartCough app achieves over 93% accuracy in detecting cough, enabling health service operators and the general public to monitor respiratory conditions remotely and to accurately predict changes in the severity of respiratory diseases [5.10]. This robust and energy-efficient smartphone-based cough detection app exploits advanced machine learning algorithms.

Knowledge Impact: Our emotion recognition research has produced the first publicly available open-source dataset (DREAMER) that can be used to create new industrial research and development programmes based on physiological signals [5.1]. Since its publication (https://zenodo.org/record/546113), the dataset has attracted significant attention from international professional audiences and researchers (e.g. Facebook/Oculus). Up to 21/08/2020, 3,757 requests for access to the dataset had been made via the zenodo.org platform, and the publication had received 5,000 full-text views via the IEEE Xplore platform.

Furthermore, our work on video quality evaluation has been recognised worldwide and accelerated the verification of the HEVC video coding standard. This research resulted in two contributions to the MPEG standard [5.3]; the associated publication was selected for the Best of IET and IBC papers [5.2], received the Best Paper Award of IEEE Transactions on Circuits and Systems for Video Technology [5.2], and has been downloaded more than 21,000 times.

Economic Impact: Codeplay Software Ltd has achieved ten times the predicted financial impact and doubled its business size (headcount), gaining international recognition in the field of AI that positions it alongside Google, Intel, Arm and NVIDIA. This has resulted in both significant business growth (e.g. the company has established long-term collaborative partnerships with Arm, Intel, Imagination and Renesas [5.6]) and significant profit (e.g. the company was awarded a seven-figure contract during the period of the KTP project). This global success resulted directly from the Innovate UK-funded project with the UWS research team.

Our cloud computing project with MODO Systems Ltd, funded by Innovate UK, integrated advanced machine learning algorithms (e.g. CBIR and person identification) into MODO's core product architecture, thereby enhancing the overall customer proposition. Our results have enabled the company to secure a six-figure investment and have significantly raised the company's valuation and profile for future growth [5.4]. The project received the top 'outstanding' rating from Innovate UK [5.5].

Lumen Research Ltd has leveraged AI/ML within its innovative product, helping it to generate 50% more revenue and almost double its headcount [5.7] as a result of the research with UWS. This success stems from the Innovate UK-funded project that enabled our team to develop the first mobile eye-tracking framework on the market. This real-time mobile eye-tracking framework is based on deep learning and has generated significant industry interest, boosting Lumen's visibility and market share.

These significant Innovate UK successes have substantially increased industry interest in pioneering UWS research, resulting in three further KTP projects (VMS, Seric with Education Scotland, and Kibble) as well as research interest demonstrated in two Erasmus+ projects (ATHIKA and Remote Health Monitoring).

The industry-focused successes of the team and its leadership were crowned with the 2020 Knowledge Exchange Champion award [5.8], presented to Ramzan for contributing over GBP15,000,000 to the UK economy via eight KTP projects with leading industrial and commercial partners in the UK.

5. Sources to corroborate the impact

5.1 Open-source datasets:

  • DREAMER (https://zenodo.org/record/546113)

  • BED (https://zenodo.org/record/4309472#.YF35Xq_7Tcu)

5.2 Best Paper awards:

  • IEEE Transactions on Circuits and Systems for Video Technology

  • Best of IET and IBC paper

5.3 Standards contributions:

  • T. Tan, M. Mrak, V. Baroncini, N. Ramzan, "Report on HEVC compression performance verification testing", ISO/IEC JTC1/SC29/WG11 N14420, April 2014

  • V. Baroncini, G. S. Blasi, T. Ebrahimi, P. Hanhart, N. Ramzan, I. Zupancic, "Formal subjective assessment of Screen Content Coding (SCC)", ISO/IEC JTC1/SC29/WG11, April 2014

5.4 Testimonial from MODO

5.5 Outstanding KTP: Innovate UK evaluation of the UWS and MODO KTP, highest ranking

5.6 Codeplay KTP: Best of the Best KTP submission

5.7 Testimonial from the Vice President of Research at Lumen Research

5.8 Media coverage: SFC Knowledge Exchange Champion Award, Paisley Daily Express

5.9 Testimonial from Wheatley Group

5.10 Testimonial from CEMVO; media coverage: Paisley team's app to help BAME people through pandemic, Daily Express, UK, 25 August 2020

5.11 Case study for cough detection

5.12 Case study

5.13 Presentation at TEDx

Additional contextual information

Grant funding

Grant number Value of grant
INFO906 £69,941
C59355/A22878 £19,244
509567 £85,821
511138 £144,733
511839 £180,278
11465 £149,569
12215 £124,366
601106-EPP-1-2018-1-ES-EPPKA2-KA £829,000