EgoCap: egocentric marker-less motion capture with two fisheye cameras
- Submitting institution
- The University of Bath
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 146414024
- Type
- D - Journal article
- DOI
- 10.1145/2980179.2980235
- Title of journal
- ACM Transactions on Graphics
- Article number
- 162
- First page
- 1
- Volume
- 35
- Issue
- 6
- ISSN
- 0730-0301
- Open access status
- Compliant
- Month of publication
- November
- Year of publication
- 2016
- URL
- -
- Supplementary information
- https://dl.acm.org/action/downloadSupplement?doi=10.1145%2F2980179.2980235&file=a162-rhodin.zip&download=true
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 7
- Research group(s)
- -
- Citation count
- 30
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This work is published at SIGGRAPH Asia, one of the leading conferences in computer graphics. It proposes a radically new motion-capture paradigm in which cameras mounted on a helmet or VR headset track the user's motion without any markers. This enables very large capture volumes, e.g. outdoors, and outperforms existing optical motion-capture approaches in crowded scenes. It promises easy motion capture for virtual reality scenarios in which users can finally see their own body. Our work inspired follow-up research by Facebook/Oculus, a leading VR headset manufacturer (xR-EgoPose, Tome et al., ICCV 2019).
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -