Audio-Visual-Olfactory Resource Allocation for Tri-modal Virtual Environments
- Submitting institution
-
Royal College of Art (The)
- Unit of assessment
- 32 - Art and Design: History, Practice and Theory
- Output identifier
- Asadipour2
- Type
- E - Conference contribution
- DOI
-
10.1109/TVCG.2019.2898823
- Title of conference / published proceedings
- IEEE Transactions on Visualization and Computer Graphics
- First page
- 1865
- Volume
- 25
- Issue
- 5
- ISSN
- 1077-2626
- Open access status
- Technical exception
- Month of publication
- -
- Year of publication
- 2019
- URL
-
https://ieeexplore.ieee.org/document/8642346
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
6
- Research group(s)
-
-
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This peer-reviewed journal article presents a novel resource-distribution technique for achieving an optimal perceptual experience (visual, auditory, olfactory) in virtual environments within a given computational budget. The research formed a key part of the joint EPSRC-Jaguar Land Rover grant EP/K014056/1.
Limited computational resources are often a key reason multimodal virtual reality simulations fail to provide an authentic and immersive experience. This article investigates how users prefer to allocate computational budgets across the different modalities in a range of scenarios.
The method introduced a reproducible and innovative technique for identifying the recommended resolution at which to present each sensory stimulus and for optimising resource allocation in real time without compromising user requirements. Three studies with human participants were designed: one to identify key olfactory parameters (simulation), one to collect subjectively allocated budgets across tri-modal stimuli for load balancing (modelling, n=25), and one to validate the resource allocation model under restricted computational budgets (validation, n=6).
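For illustration only, the sketch below shows one way a budget-allocation step of this kind could work: a greedy scheme that funds whichever modality gains the most perceived quality from the next slice of budget. This is not the authors' model; the quality curves, their parameters, and the greedy strategy are invented assumptions standing in for the perceptual data gathered in the modelling study.

```python
# Illustrative sketch only -- not the published model. Distributes a fixed
# computational budget across visual, auditory and olfactory rendering by
# greedily funding the modality with the highest marginal perceptual gain.
# The diminishing-returns quality curves below are hypothetical placeholders.
import math

QUALITY = {
    "visual":    lambda b: 1 - math.exp(-6.0 * b),   # assumed curve shapes
    "auditory":  lambda b: 1 - math.exp(-4.0 * b),
    "olfactory": lambda b: 1 - math.exp(-2.5 * b),
}

def allocate(total_budget: float, step: float = 0.01) -> dict:
    """Greedily allocate `total_budget` (normalised so 1.0 = full budget)."""
    alloc = {m: 0.0 for m in QUALITY}
    spent = 0.0
    while spent + step <= total_budget + 1e-9:
        # Fund whichever modality gains the most quality from one more step.
        best = max(QUALITY,
                   key=lambda m: QUALITY[m](alloc[m] + step) - QUALITY[m](alloc[m]))
        alloc[best] += step
        spent += step
    return alloc

if __name__ == "__main__":
    for budget in (0.3, 0.6, 1.0):  # restricted vs. full budgets
        shares = allocate(budget)
        print(budget, {m: round(v, 2) for m, v in shares.items()})
```

Under these assumed curves, restricted budgets skew heavily toward the visual channel, with the auditory and olfactory shares growing as the budget relaxes; the published model derives such allocations from participant data rather than fixed curves.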
Asadipour was a co-author and technical consultant (co-supervisor) to the two PhD students on this project, Dhokia and Doukakis, mentoring both on the design, coding/fabrication, and usability testing of the multisensory display used to simulate reality accurately. The calibrated olfactory system developed by Dhokia [1] is used in this work. In addition, Doukakis introduced a graphical user interface (GUI), four virtual scenes, and the resource allocation model.
Other dissemination:
[1] A peer-reviewed conference paper on olfaction, presented at Computer Graphics & Visual Computing (CGVC'16), Eurographics Association, Bournemouth, UK, 2016.
[2] This project was presented (by Asadipour, Harvey, and Chalmers) at the Science for a Successful Nation event and was selected as a top choice by EPSRC, 2018.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -