Audio-Visual-Olfactory Resource Allocation for Tri-modal Virtual Environments
- Submitting institution
- Birmingham City University
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 11Z_OP_D2030
- Type
- D - Journal article
- DOI
- 10.1109/TVCG.2019.2898823
- Title of journal
- IEEE Transactions on Visualization and Computer Graphics
- Article number
- -
- First page
- 1865
- Volume
- 25
- Issue
- 5
- ISSN
- 1077-2626
- Open access status
- Compliant
- Month of publication
- -
- Year of publication
- 2019
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- -
- Research group(s)
- -
- Citation count
- 0
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Via experiments, this work identified how perceptual attention is distributed across the audio-visual-olfactory spectrum under different computational constraints. Users’ expectations change depending on the computation available. A regression model was developed to encode these observations and can successfully predict sensory attention. The model can be used to target appropriate simulation fidelity for multi-sensory virtual worlds, for example in digital twins or applied virtual therapy: https://doi.org/10.3389/frvir.2020.585993
This is an output of EPSRC grant EP/K014056/1 and was showcased at EPSRC’s ‘Science for a Successful Nation’ event on Wednesday 21 February 2018, where it was exhibited to over 400 attendees, including members of the House of Lords.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -