Sequence Classification Restricted Boltzmann Machines with Gated Units
- Submitting institution
- City, University of London
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 841
- Type
- D - Journal article
- DOI
- 10.1109/TNNLS.2019.2958103
- Title of journal
- IEEE Transactions on Neural Networks and Learning Systems
- Article number
- -
- First page
- 4806
- Volume
- 31
- Issue
- 11
- ISSN
- 2162-237X
- Open access status
- Compliant
- Month of publication
- January
- Year of publication
- 2020
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 5
- Research group(s)
- -
- Citation count
- 0
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- This work is significant because it is the first to combine the advantages of dynamic Bayesian networks and recurrent neural networks. The paper addresses the exponential cost of gradient computations, and the resulting intractability of current approaches, by introducing a conditional probability distribution for the optimisation of sequences using gated Boltzmann machines. The proposed model is evaluated on optical character recognition, chunking, and multi-resident activity recognition in smart homes, showing performance comparable to the state of the art in all cases while requiring far fewer parameters, and therefore less computation, for optimisation and application.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -