Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR
- Submitting institution
-
The University of Surrey
- Unit of assessment
- 12 - Engineering
- Output identifier
- 9024422_3
- Type
- D - Journal article
- DOI
-
10.2139/ssrn.3063289
- Title of journal
- Harvard Journal of Law & Technology
- Article number
- -
- First page
- 841
- Volume
- 31
- Issue
- -
- ISSN
- 1556-5068
- Open access status
- Compliant
- Month of publication
- -
- Year of publication
- 2017
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
-
- Research group(s)
-
-
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
The paper underpins the What-If Tool (TensorFlow), one of the most widely used machine learning toolkits in the world (https://pair-code.github.io/what-if-tool/). It is featured in the UKRI AI Review (https://www.ukri.org/about-us/what-we-do/ai-review-transforming-our-world-with-ai/); was cited in GDPR guidelines on the right to explanation; and was cited by multiple experts in evidence to the House of Commons Select Committee on AI. Google, Vodafone, Flock, IBM and Accenture have implemented counterfactual explanations in their products.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -