Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing
Abstract
Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, with competitive downstream performance and non-trivial induced trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.
- Anthology ID:
- W18-2903
- Volume:
- Proceedings of the Workshop on the Relevance of Linguistic Structure in Neural Architectures for NLP
- Month:
- July
- Year:
- 2018
- Address:
- Melbourne, Australia
- Venues:
- ACL | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 13–18
- URL:
- https://www.aclweb.org/anthology/W18-2903
- DOI:
- 10.18653/v1/W18-2903
- PDF:
- http://aclanthology.lst.uni-saarland.de/W18-2903.pdf
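For readers unfamiliar with the mechanism the abstract refers to, shift-reduce composition builds a sentence representation by shifting word vectors onto a stack and reducing (merging) the top two entries according to a transition sequence that encodes a binary parse tree. The sketch below is illustrative only: the function names are made up here, and the averaging `compose` stands in for the paper's learned composition function.

```python
def compose(left, right):
    # Placeholder for a learned composition (e.g., a TreeLSTM cell);
    # here we simply average the two child vectors.
    return [(l + r) / 2.0 for l, r in zip(left, right)]


def shift_reduce(words, transitions):
    """Build a sentence vector from a SHIFT/REDUCE transition sequence.

    words: list of word vectors (lists of floats), in sentence order
    transitions: list of "SHIFT" / "REDUCE" actions encoding a binary tree
    """
    buffer = list(words)  # words waiting to be shifted, left to right
    stack = []            # partially built subtree representations
    for action in transitions:
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        else:  # REDUCE: merge the top two subtrees into one node
            right = stack.pop()
            left = stack.pop()
            stack.append(compose(left, right))
    return stack[0]       # a single vector for the whole sentence
```

For example, the transition sequence `["SHIFT", "SHIFT", "REDUCE", "SHIFT", "REDUCE"]` over three words corresponds to the left-branching tree `((w1 w2) w3)`. In the latent tree learning setting, the transition sequence itself is not given but induced, and the composition parameters are trained on the downstream task.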