Syntax-driven Iterative Expansion Language Models for Controllable Text Generation

Noe Casas, José A. R. Fonollosa, Marta R. Costa-jussà


Abstract
The dominant language modeling paradigm handles text as a sequence of discrete tokens. While that approach can capture the latent structure of the text, it is inherently constrained to sequential dynamics for text generation. We propose a new paradigm for introducing a syntactic inductive bias into neural text generation, in which the dependency parse tree drives a Transformer model to generate sentences iteratively. Our experiments show that this paradigm is effective at text generation, with quality between that of LSTMs and Transformers and comparable diversity, while requiring fewer than half their decoding steps. Moreover, its generation process allows direct control over the syntactic constructions of the generated text, enabling the induction of stylistic variations.
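The tree-driven iterative decoding the abstract describes can be illustrated with a toy, model-free sketch. The `Node` structure and level-by-level expansion schedule below are assumptions for illustration, not the paper's actual architecture: at each step the partial sentence grows by one level of the dependency tree, so the number of decoding steps is bounded by tree depth rather than sentence length.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A dependency-tree node: a head token with ordered left/right dependents.
    (Hypothetical structure for illustration, not the paper's implementation.)"""
    token: str
    left: list = field(default_factory=list)
    right: list = field(default_factory=list)

def linearize(node):
    """In-order traversal: left dependents, then head, then right dependents."""
    out = []
    for child in node.left:
        out.extend(linearize(child))
    out.append(node.token)
    for child in node.right:
        out.extend(linearize(child))
    return out

def tree_depth(node):
    kids = node.left + node.right
    return 0 if not kids else 1 + max(tree_depth(c) for c in kids)

def iterative_expansion(root):
    """Yield the partial sentence after each expansion step: step d reveals
    the tree truncated to depth d, mimicking depth-bounded decoding."""
    def prune(node, depth):
        if depth == 0:
            return Node(node.token)
        return Node(node.token,
                    [prune(c, depth - 1) for c in node.left],
                    [prune(c, depth - 1) for c in node.right])
    for d in range(tree_depth(root) + 1):
        yield " ".join(linearize(prune(root, d)))

# Toy parse of "the cat sat on the mat", rooted at "sat":
root = Node("sat",
            left=[Node("cat", left=[Node("the")])],
            right=[Node("on", right=[Node("mat", left=[Node("the")])])])
for step in iterative_expansion(root):
    print(step)
# → sat
#   cat sat on
#   the cat sat on mat
#   the cat sat on the mat
```

In a real model, each expansion step would be predicted by the Transformer conditioned on the current partial tree; here the tree is fixed, so the sketch only shows why the number of steps scales with tree depth (4 steps for this 6-token sentence) rather than with token count.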
Anthology ID:
2020.spnlp-1.1
Volume:
Proceedings of the Fourth Workshop on Structured Prediction for NLP
Month:
November
Year:
2020
Address:
Online
Venues:
EMNLP | spnlp
Publisher:
Association for Computational Linguistics
Note:
Pages:
1–10
URL:
https://www.aclweb.org/anthology/2020.spnlp-1.1
DOI:
10.18653/v1/2020.spnlp-1.1
PDF:
http://aclanthology.lst.uni-saarland.de/2020.spnlp-1.1.pdf
Optional supplementary material:
 2020.spnlp-1.1.OptionalSupplementaryMaterial.pdf