Neural AMR: Sequence-to-Sequence Models for Parsing and Generation
- Submitting institution
- Heriot-Watt University
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 16927875
- Type
- E - Conference contribution
- DOI
- 10.18653/v1/P17-1014
- Title of conference / published proceedings
- Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- First page
- 146
- Volume
- -
- Issue
- -
- ISSN
- -
- Open access status
- -
- Month of publication
- July
- Year of publication
- 2017
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 4
- Research group(s)
- -
- Citation count
- 33
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Originality: The first deep learning models for Natural Language Generation (NLG) and semantic parsing from Abstract Meaning Representation (AMR). It was the first work to use large pretraining corpora for this task and it influenced subsequent publications on AMR. Rigour: The NLG model we proposed is still the state of the art on a benchmark dataset (AMR LDC2015T12). Significance: AMR is a widely adopted meaning representation in the Computational Linguistics community (68 papers since its release in 2012). ACL is the top conference in Natural Language Processing (25% acceptance rate).
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -