Group sparse regularization for deep neural networks
- Submitting institution
- Edinburgh Napier University
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 1792511
- Type
- D - Journal article
- DOI
- 10.1016/j.neucom.2017.02.029
- Title of journal
- Neurocomputing
- Article number
- -
- First page
- 81
- Volume
- 241
- Issue
- -
- ISSN
- 0925-2312
- Open access status
- Technical exception
- Month of publication
- February
- Year of publication
- 2017
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 3
- Research group(s)
- -
- Citation count
- 99
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This influential paper is the first to propose a highly general sparse-regularization strategy for the challenging task of simultaneously selecting input variables and hidden nodes in deep neural networks (DNNs). Representative of a body of work from EP/M026981/1, the study showed that our proposed sparse group Lasso penalty can produce extremely compact DNNs without sacrificing performance on benchmark classification tasks. Our work has transformed theoretical and applied research on sparse DNNs in a range of contexts, including a learning method for optimising neural networks that has been patented by Toshiba (20200012945US; 2020008993JP).
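- For context, the sparse group Lasso penalty referred to above combines an element-wise L1 term (weight-level sparsity) with a group-wise Euclidean-norm term that can drive all outgoing weights of a unit to zero at once, removing that input variable or hidden node entirely. A minimal NumPy sketch, assuming row-wise grouping of a weight matrix and an illustrative mixing parameter `alpha` (neither taken from the paper):

```python
import numpy as np

def group_lasso_penalty(W):
    """Group Lasso term: sum over groups of the Euclidean norm of each
    group, scaled by the square root of the group size. Here each row of
    W (all outgoing weights of one input variable or hidden node) is one
    group, so zeroing a whole row prunes that unit from the network."""
    group_size = W.shape[1]
    return np.sqrt(group_size) * np.linalg.norm(W, axis=1).sum()

def sparse_group_lasso_penalty(W, alpha=0.5):
    """Sparse group Lasso: convex combination of an element-wise L1 term
    (weight-level sparsity) and the group Lasso term (unit-level
    sparsity). `alpha` balances the two; its value here is illustrative."""
    l1 = np.abs(W).sum()
    return alpha * l1 + (1.0 - alpha) * group_lasso_penalty(W)

# Toy weight matrix: 3 units, 4 outgoing weights each.
W = np.array([[0.0, 0.0, 0.0, 0.0],   # a fully pruned unit: contributes 0
              [1.0, -2.0, 0.0, 0.5],
              [0.3, 0.0, 0.0, 0.0]])
penalty = sparse_group_lasso_penalty(W, alpha=0.5)
```

Because the group term is non-differentiable exactly at a zero row, gradient-based training with this penalty tends to produce rows that are identically zero, which is what yields the compact networks described above.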
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -