Synch-Graph: multisensory emotion recognition through neural synchrony via graph convolutional networks
- Submitting institution
- University of St Andrews
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 266336143
- Type
- E - Conference contribution
- DOI
- 10.1609/aaai.v34i02.5491
- Title of conference / published proceedings
- Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20)
- First page
- 1351
- Volume
- -
- Issue
- -
- ISSN
- 2159-5399
- Open access status
- Compliant
- Month of publication
- April
- Year of publication
- 2020
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 1
- Research group(s)
- A - Artificial Intelligence
- Citation count
- -
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- To the best of our knowledge, this paper is the first to model neural synchrony using a graph convolutional network (GCN) and the first to combine a spiking neural network (SNN) with a GCN for multisensory integration. The GCN enables the classification of neural synchrony patterns regardless of the type of input data. This work opens new opportunities in multisensory learning and integration research: it is applicable not only to emotion recognition but also to sensor-fusion problems such as human-robot interaction, and to the computational neuroscience research field.
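- The following is a minimal illustrative sketch, not the authors' implementation, of the kind of GCN step the statement describes: a graph whose edge weights stand in for pairwise synchrony scores between neuron populations, with node features propagated through the standard symmetrically normalised GCN rule (Kipf & Welling) and pooled into class scores. All names, sizes, and the random inputs (`gcn_layer`, `synchrony`, `node_feats`) are hypothetical placeholders.

```python
import numpy as np

def gcn_layer(adj, feats, weights):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    a_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt           # symmetric normalisation
    return np.maximum(a_norm @ feats @ weights, 0.0)   # ReLU activation

rng = np.random.default_rng(0)
n_nodes, n_feats, n_hidden, n_classes = 8, 16, 32, 3   # hypothetical sizes

# Hypothetical synchrony graph: edge weights stand in for pairwise synchrony
# scores between neuron populations (e.g. derived from SNN spike trains).
synchrony = rng.uniform(size=(n_nodes, n_nodes))
synchrony = (synchrony + synchrony.T) / 2.0            # make the matrix symmetric

node_feats = rng.normal(size=(n_nodes, n_feats))       # per-node spike-train features

w1 = rng.normal(scale=0.1, size=(n_feats, n_hidden))
w2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))

h = gcn_layer(synchrony, node_feats, w1)
logits = gcn_layer(synchrony, h, w2).mean(axis=0)      # mean-pool to graph-level class scores
print("emotion-class scores:", logits)
```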
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -