Bounding the search space for global optimization of neural networks learning error: an interval analysis approach
- Submitting institution
-
Birkbeck College
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 174
- Type
- D - Journal article
- DOI
-
- Title of journal
- Journal of Machine Learning Research
- Article number
- -
- First page
- 1
- Volume
- 17
- Issue
- -
- ISSN
- 1533-7928
- Open access status
- Compliant
- Month of publication
- September
- Year of publication
- 2016
- URL
-
https://jmlr.org/papers/volume17/14-350/14-350.pdf
- Supplementary information
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
3
- Research group(s)
-
1 - Algorithms, Verification and Software
- Citation count
- 3
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This paper addresses the problem of defining guaranteed bounds on the search space within which a global search method can seek a global minimiser of the loss function before training. It proposes a novel approach based on interval analysis, presenting both theoretical results and empirical verification on well-known benchmarks. Outer approximations of the solutions to the interval equations arising from the functions implemented by the network nodes are derived as convex polytopes that enclose the connection weights corresponding to the global minimisers. The survey “Global optimization issues in deep network regression” (Journal of Global Optimization, 2019) dedicates a section to our method.
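- To illustrate the interval-analysis idea underlying the method (this is a minimal sketch, not the paper's algorithm: the `Interval` class, the single sigmoid neuron, and all numeric values are assumptions for illustration), the following shows how interval arithmetic propagates boxes of connection weights through a node to obtain a guaranteed outer approximation of its output range:

```python
import math

class Interval:
    """Closed interval [lo, hi] with elementwise-bounding arithmetic."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of intervals: endpoints add.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product of intervals: extremes lie among the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output_interval(weights, inputs, bias):
    """Outer approximation of a sigmoid node's output range:
    propagate intervals through the weighted sum, then exploit the
    monotonicity of the sigmoid to map the two endpoints."""
    pre = bias
    for w, x in zip(weights, inputs):
        pre = pre + w * x
    return Interval(sigmoid(pre.lo), sigmoid(pre.hi))

# Example: inputs known exactly, weights only known to lie in boxes.
w = [Interval(-1.0, 1.0), Interval(0.5, 2.0)]
x = [Interval(0.2, 0.2), Interval(1.0, 1.0)]
b = Interval(0.0, 0.0)
out = neuron_output_interval(w, x, b)  # encloses every reachable output
```

Every weight assignment inside the boxes yields an output inside `out`; intersecting such enclosures with the constraints imposed by the training targets is what, in the paper's framework, cuts the weight space down to bounded convex regions containing the global minimisers.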
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -