Improving Neural Machine Translation with Soft Template Prediction

Jian Yang, Shuming Ma, Dongdong Zhang, Zhoujun Li, Ming Zhou


Abstract
Although neural machine translation (NMT) has achieved significant progress in recent years, most previous NMT models depend only on the source text to generate the translation. Inspired by the success of template-based and syntax-based approaches in other fields, we propose to use templates extracted from tree structures as soft target templates to guide the translation procedure. To learn the syntactic structure of the target sentences, we adopt constituency-based parse trees to generate candidate templates. We incorporate the template information into the encoder-decoder framework to jointly utilize the templates and the source text. Experiments show that our model significantly outperforms the baseline models on four benchmarks, demonstrating the effectiveness of soft target templates.
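The abstract does not spell out how templates are extracted from the parse trees, but one common way to turn a constituency tree into a "soft template" is to prune the tree at a fixed depth, keeping surface words above the cutoff and replacing deeper subtrees with their phrase labels. The sketch below is illustrative only (not the paper's exact algorithm); the tree string, the `parse`/`template` helpers, and the depth choice are all assumptions for demonstration.

```python
# Illustrative sketch (NOT the paper's exact extraction procedure):
# derive a soft target template by pruning a constituency parse tree
# at a fixed depth, collapsing deeper subtrees into their labels.

def parse(sexp):
    """Parse a bracketed tree like '(S (NP ...) (VP ...))' into nested
    (label, children) tuples; leaf words stay plain strings."""
    tokens = sexp.replace("(", " ( ").replace(")", " ) ").split()
    def helper(i):
        assert tokens[i] == "("
        label = tokens[i + 1]
        i += 2
        children = []
        while tokens[i] != ")":
            if tokens[i] == "(":
                child, i = helper(i)
            else:
                child, i = tokens[i], i + 1
            children.append(child)
        return (label, children), i + 1
    tree, _ = helper(0)
    return tree

def template(tree, depth):
    """Keep the top `depth` levels of the tree: words above the cutoff
    survive, subtrees at the cutoff are replaced by their phrase label."""
    if isinstance(tree, str):   # a surface word: keep it
        return [tree]
    label, children = tree
    if depth == 0:              # cutoff reached: emit the label instead
        return [label]
    out = []
    for child in children:
        out.extend(template(child, depth - 1))
    return out

tree = parse("(S (NP (PRP they)) (VP (VBD closed) (NP (DT the) (NN border))))")
print(" ".join(template(tree, 3)))  # → they closed DT NN
```

A shallower cutoff yields a more abstract template (e.g. depth 2 gives `PRP VBD NP`), while a deeper cutoff keeps more target words; such templates can then be fed to the decoder alongside the source text.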
Anthology ID:
2020.acl-main.531
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5979–5989
URL:
https://www.aclweb.org/anthology/2020.acl-main.531
DOI:
10.18653/v1/2020.acl-main.531
PDF:
http://aclanthology.lst.uni-saarland.de/2020.acl-main.531.pdf
Video:
http://slideslive.com/38929072