A game-based approximate verification of deep neural networks with provable guarantees
- Submitting institution
- University of Exeter
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 6407
- Type
- D - Journal article
- DOI
- 10.1016/j.tcs.2019.05.046
- Title of journal
- Theoretical Computer Science
- Article number
- -
- First page
- 298
- Volume
- 807
- Issue
- -
- ISSN
- 0304-3975
- Open access status
- Technical exception
- Month of publication
- February
- Year of publication
- 2020
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 4
- Research group(s)
- -
- Citation count
- 7
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Using a novel game-theoretic approach, this paper proposes and implements a general approximate verification tool for quantifying the robustness of modern deep neural networks (DNNs) with provable guarantees. It is one of the most important research outcomes of a £5 million EPSRC Mobile Autonomy Programme Grant (Safety, Trust and Integrity), and its methodology directly led to a £2 million ERC Advanced Grant (FUN2MODEL: From Function-based to model-based automated probabilistic reasoning for Deep Learning). It is the extended version of a TACAS'18 conference paper, which was among the very first works on approximate safety verification of DNNs.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -