Scalable Training of Artificial Neural Networks with Adaptive Sparse Connectivity: A Network Science Perspective
- Submitting institution
-
University of Derby
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 785861-3
- Type
- D - Journal article
- DOI
-
10.1038/s41467-018-04316-3
- Title of journal
- Nature Communications
- Article number
- 2383
- First page
- -
- Volume
- 9
- Issue
- -
- ISSN
- 2041-1723
- Open access status
- Compliant
- Month of publication
- June
- Year of publication
- 2018
- URL
-
https://www.nature.com/articles/s41467-018-04316-3
- Supplementary information
-
https://static-content.springer.com/esm/art%3A10.1038%2Fs41467-018-04316-3/MediaObjects/41467_2018_4316_MOESM1_ESM.pdf
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
5
- Research group(s)
-
-
- Citation count
- 48
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This paper has made a breakthrough in machine learning and led to significant further research. First proposed in this paper, sparse evolutionary training (SET) avoids overfitting and improves classification accuracy even on datasets exceeding 20,000 dimensions with fewer than 100 samples. With SET, models with one million neurons can be trained on a laptop without a GPU (Shiwei Liu et al., Neural Computing and Applications 2020, DOI:10.1007/s00521-020-05136-7). The code has been extensively reused by the community, the article was featured as paper of the week for several consecutive weeks, and it has been downloaded over 21,000 times.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -