Multimodal Classification of Stressful Environments in Visually Impaired Mobility Using EEG and Peripheral Biosignals
- Submitting institution
-
Queen Mary University of London
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 523
- Type
- D - Journal article
- DOI
-
10.1109/TAFFC.2018.2866865
- Title of journal
- IEEE Transactions on Affective Computing
- Article number
- -
- First page
- 203
- Volume
- 12
- Issue
- 1
- ISSN
- 1949-3045
- Open access status
- Deposit exception
- Month of publication
- August
- Year of publication
- 2018
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
1
- Research group(s)
-
-
- Citation count
- -
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Introduced new feature-level fusion models for brain and peripheral biosignals, able to predict affective state in-the-wild, with novel insight into AI-assisted mobility and navigation design for the well-being of visually impaired people. Work received the Best Paper Award at HCI 2016 (https://bit.ly/3e8tOEX), contributed to the H2020 project 'Sound of Vision' winning the 'Tech for Society' Award at ICT 2018 (https://bit.ly/2TpxQRt), and was featured in the November 2017 issue of the business magazine Platinum (https://bit.ly/36ipeS2). The work led to an invited talk (Kalimeri) at the ACM ICMR 2017 Workshop on Wearable MultiMedia (https://bit.ly/36iWR6o, p. iii).
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -