Synthesizing benchmarks for predictive modeling
- Submitting institution
- The University of Leeds
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- UOA11-4034
- Type
- E - Conference contribution
- DOI
- 10.1109/CGO.2017.7863731
- Title of conference / published proceedings
- Proceedings of the 2017 IEEE/ACM International Symposium on Code Generation and Optimization (CGO)
- First page
- 86
- Volume
- -
- Issue
- -
- ISSN
- 2164-2397
- Open access status
- Technical exception
- Month of publication
- February
- Year of publication
- 2017
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 3
- Research group(s)
- E - DSS (Distributed Systems and Services)
- Citation count
- 15
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Received the single Best Paper Award at CGO 2017, selected from 116 submissions (top 1%). The paper tackles the shortage of realistic benchmarks in systems research, an outstanding problem for over 30 years. Follow-up work won a Distinguished Paper Award at ISSTA 2019 (not submitted by UoL for REF 2021). The work has had major influence and is taught in compiler courses at UC Berkeley and the Australian National University. It also helped to secure a prestigious RAEng Research Fellowship, a Royal Society collaboration grant, and industry research investment of over £330K for follow-up work. It forms part of a PhD thesis that won the 2020 SICSA Best PhD Dissertation Award.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -