Space of Functions Computed by Deep-Layered Machines
- Submitting institution
- Aston University
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 46281308
- Type
- D - Journal article
- DOI
- 10.1103/PhysRevLett.125.168301
- Title of journal
- Physical Review Letters
- Article number
- 168301
- First page
- -
- Volume
- 125
- Issue
- 16
- ISSN
- 0031-9007
- Open access status
- Compliant
- Month of publication
- October
- Year of publication
- 2020
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 2
- Research group(s)
- A - Aston Institute of Urban Technology and the Environment (ASTUTE)
- Citation count
- 1
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Layered networks, such as deep-learning machines, play a significant role in modern machine learning, yet little is known about how they operate. The paper uses a statistical physics framework to explore the relation between layered and recurrent architectures and the change in entropy of the functions they implement, depending on the number of layers and the activation functions used. It shows that single-layer recurrent architectures, with far fewer free parameters, cover the same function space as deep layered networks, which could be of practical importance (a minimal sketch of this parameter-count contrast appears after this record). An article about the paper was published in the online magazine insideBIGDATA: https://insidebigdata.com/2020/10/16/whats-under-the-hood-of-neural-networks/.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -
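
The parameter-count contrast noted in the Additional information field can be illustrated with a minimal sketch. This is not the paper's statistical physics formalism; the width N, depth L, and sign activation below are illustrative assumptions, showing only that a recurrent network reusing one weight matrix for L steps has the same compositional depth as an L-layer network while carrying a fraction of its free parameters.

```python
# Minimal illustrative sketch (assumed setup, not the paper's formalism):
# a deep layered network with distinct weights per layer versus a
# single-layer recurrent network reusing one weight matrix.
import numpy as np

rng = np.random.default_rng(0)
N, L = 8, 5  # layer width and depth -- illustrative choices

# Deep layered network: one NxN weight matrix per layer -> L*N*N parameters.
deep_weights = [rng.standard_normal((N, N)) for _ in range(L)]

# Recurrent network: a single NxN weight matrix applied L times -> N*N parameters.
recurrent_weight = rng.standard_normal((N, N))

def deep_forward(x):
    # Each layer applies its own weights, then a sign activation.
    for W in deep_weights:
        x = np.sign(W @ x)
    return x

def recurrent_forward(x):
    # The same weights are reused at every one of the L steps.
    for _ in range(L):
        x = np.sign(recurrent_weight @ x)
    return x

x = np.sign(rng.standard_normal(N))  # a random +/-1 input vector
print("deep layered:      params =", L * N * N, " output =", deep_forward(x))
print("recurrent (shared): params =", N * N, " output =", recurrent_forward(x))
```

Both networks map {-1, +1}^N inputs through L rounds of the same kind of nonlinearity; the recurrent one does so with 1/L of the weights, which is the practical observation the paper makes precise in entropic terms.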