TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
- Submitting institution
- University of Glasgow
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 11-09910
- Type
- E - Conference contribution
- DOI
- 10.1145/2971648.2971679
- Title of conference / published proceedings
- 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2016)
- First page
- 274
- Volume
- -
- Issue
- -
- ISSN
- -
- Open access status
- -
- Month of publication
- September
- Year of publication
- 2016
- URL
- http://eprints.gla.ac.uk/170225/
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 5
- Research group(s)
- -
- Citation count
- 30
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- ORIGINALITY: First to use smooth pursuit eye movements for (1) implicit calibration of eye trackers while reading text and (2) calibration-free selection of text content. RIGOUR: Demonstrated through rigorous implementation and evaluation of two novel applications, Read2Calibrate and EyeVote. Two detailed empirical studies identified the configurations yielding the highest selection and calibration accuracy. SIGNIFICANCE: Resulted in five significant recommendations that enable gaze-based interaction on ubiquitous displays using implicit calibration, opening a wide range of natural and hygienic (touch-free) interaction opportunities in public spaces. Published at UbiComp, the top venue for ubiquitous computing.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -