Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality
- Submitting institution
-
The University of Kent
- Unit of assessment
- 12 - Engineering
- Output identifier
- 10116
- Type
- D - Journal article
- DOI
-
10.1145/2829944
- Title of journal
- ACM Transactions on Computer-Human Interaction
- Article number
- 3
- First page
- -
- Volume
- 23
- Issue
- 1
- ISSN
- 1073-0516
- Open access status
- Out of scope for open access requirements
- Month of publication
- February
- Year of publication
- 2016
- URL
-
https://kar.kent.ac.uk/58619/
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
4
- Research group(s)
-
-
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This paper investigates the use of multilevel auditory displays to enable eyes-free mobile interaction with indoor location-based information in non-guided audio-augmented environments. The study's results provide practical guidelines for designing effective eyes-free interactions with richer auditory soundscapes, and are relevant to augmented-reality applications in cultural-historical contexts such as archaeological sites and exhibitions. The research has been cited (2018) in ACM TOMM and at CHI.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -