Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science
- Submitting institution
-
Edinburgh Napier University
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 2005945
- Type
- D - Journal article
- DOI
-
10.1038/s41467-018-04316-3
- Title of journal
- Nature Communications
- Article number
- 2383
- First page
- 1
- Volume
- 9
- Issue
- 1
- ISSN
- 2041-1723
- Open access status
- Compliant
- Month of publication
- June
- Year of publication
- 2018
- URL
-
-
- Supplementary information
-
https://github.com/dcmocanu/sparse-evolutionary-artificial-neural-networks
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
5
- Research group(s)
-
-
- Citation count
- 48
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This paper made a breakthrough in machine learning and has led to significant further research. The sparse evolutionary training (SET) procedure, first proposed in this paper, avoids overfitting, improves classification accuracy even on datasets exceeding 20,000 dimensions with fewer than 100 samples, and has set new records in machine learning (Shiwei Liu et al., Neural Computing and Applications, 2020, https://doi.org/10.1007/s00521-020-05136-7). The code has been extensively reused by the community. The paper was featured as a Nature Communications paper of the week for several consecutive weeks and has already been downloaded over 22,000 times.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -