Deep Convolutional Networks as shallow Gaussian Processes
- Submitting institution
- University of Bristol
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 232650853
- Type
- E - Conference contribution
- DOI
- -
- Title of conference / published proceedings
- International Conference on Learning Representations
- First page
- 1
- Volume
- -
- Issue
- -
- ISSN
- -
- Open access status
- -
- Month of publication
- May
- Year of publication
- 2019
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 2
- Research group(s)
- A - Artificial Intelligence and Autonomy
- Citation count
- -
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This paper was very timely: work on the same topic from Google Brain was published at the same venue (Novak et al., ICLR 2019), and it inspired follow-up work (Aitchison, ICML 2019) reconciling the properties of finite and infinite networks. The work also inspired a series of papers (e.g. Yang, NeurIPS 2019) showing that other neural network architectures, such as Transformers, are also Gaussian-process distributed. The results are incorporated into the popular "Neural Tangents" library from Google Brain, which has 1k GitHub stars (Novak et al., ICLR 2020). The approach enables a stronger theoretical understanding of neural network behaviour and generalisation (e.g. Lee et al., NeurIPS 2019).
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -