Tap the ShapeTones: Exploring the effects of crossmodal congruence in an audio-visual interface
- Submitting institution
-
University of Greenwich
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 20968
- Type
- E - Conference contribution
- DOI
-
10.1145/2858036.2858456
- Title of conference / published proceedings
- Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
- First page
- 1055
- Volume
- 0
- Issue
- -
- ISSN
- -
- Open access status
- -
- Month of publication
- -
- Year of publication
- 2016
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
4
- Research group(s)
-
-
- Citation count
- 6
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- The paper examines the impact of crossmodal congruence in a memory task on a touchscreen interface, comparing audio, visual, and audiovisual stimuli. It combines two methodological perspectives: task performance and user engagement. The focus on engagement when applying crossmodal perception to interface design is novel, with important implications for accessible design, as demonstrated by the work that has cited this paper so far. The full paper was presented in 2016 at the CHI conference, the premier international conference on Human-Computer Interaction; the acceptance rate that year was 23%.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -