An Attentive Neural Architecture for Fine-grained Entity Type Classification
- Submitting institution
-
University College London
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 16205
- Type
- E - Conference contribution
- DOI
-
10.18653/v1/W16-1313
- Title of conference / published proceedings
- Proceedings of the 5th Workshop on Automated Knowledge Base Construction (AKBC) at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
- First page
- 69
- Volume
- W16-1313
- Issue
- -
- ISSN
- 0000-0000
- Open access status
- Out of scope for open access requirements
- Month of publication
- June
- Year of publication
- 2016
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
3
- Research group(s)
-
-
- Citation count
- -
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Introduced a new state-of-the-art method for the long-established task of fine-grained entity type classification by investigating a series of neural network approaches. It was also one of the pioneering works in model analysis, relating neural method behaviour to established linguistic features, a practice that has since become commonplace across the field. For this, the work received an outstanding paper award and a plenary talk at a tier-A conference.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -