Semantic Neural Machine Translation Using AMR

Linfeng Song, Daniel Gildea, Yue Zhang, Zhiguo Wang, Jinsong Su


Abstract
It is intuitive that semantic representations can be useful for machine translation, mainly because they can help enforce meaning preservation and handle the data sparsity (many sentences correspond to one meaning) of machine translation models. However, little work has been done on leveraging semantics for neural machine translation (NMT). In this work, we study the usefulness of AMR (abstract meaning representation) for NMT. Experiments on a standard English-to-German dataset show that incorporating AMR as additional knowledge significantly improves a strong attention-based sequence-to-sequence neural translation model.
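The abstract describes feeding AMR into an attention-based sequence-to-sequence model as additional knowledge. The snippet below is a minimal sketch, assuming PyTorch, of one common way such extra knowledge can be exposed to a decoder: attend separately over source-token encodings and AMR-node encodings, then fuse the two context vectors. The module names, tensor sizes, and fusion scheme here are illustrative assumptions, not the paper's exact architecture.

    # Minimal sketch (assumed PyTorch): a decoder step with dual attention over
    # source-token states and AMR-node states. Illustrative only.
    import torch
    import torch.nn as nn

    class DualAttentionDecoderStep(nn.Module):
        def __init__(self, hidden_size: int):
            super().__init__()
            # One attention module over the sentence encoder, one over the AMR encoder.
            self.src_attn = nn.MultiheadAttention(hidden_size, num_heads=1, batch_first=True)
            self.amr_attn = nn.MultiheadAttention(hidden_size, num_heads=1, batch_first=True)
            # Fuse decoder state with both context vectors.
            self.combine = nn.Linear(3 * hidden_size, hidden_size)

        def forward(self, dec_state, src_states, amr_states):
            # dec_state:  (batch, 1, hidden)        current decoder hidden state
            # src_states: (batch, src_len, hidden)  encodings of source tokens
            # amr_states: (batch, amr_len, hidden)  encodings of AMR graph nodes
            src_ctx, _ = self.src_attn(dec_state, src_states, src_states)
            amr_ctx, _ = self.amr_attn(dec_state, amr_states, amr_states)
            fused = torch.cat([dec_state, src_ctx, amr_ctx], dim=-1)
            return torch.tanh(self.combine(fused))

    # Toy usage with random tensors standing in for real encoder outputs.
    step = DualAttentionDecoderStep(hidden_size=8)
    out = step(torch.randn(2, 1, 8), torch.randn(2, 5, 8), torch.randn(2, 4, 8))
    print(out.shape)  # torch.Size([2, 1, 8])

In the paper itself, the AMR side is encoded with a graph-based encoder rather than the generic modules shown here; the sketch only illustrates the dual-attention idea of letting the decoder consult both the sentence and the semantic graph.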
Anthology ID: Q19-1002
Volume: Transactions of the Association for Computational Linguistics, Volume 7
Month: March
Year: 2019
Venue: TACL
Pages: 19–31
URL: https://www.aclweb.org/anthology/Q19-1002
DOI: 10.1162/tacl_a_00252
PDF: http://aclanthology.lst.uni-saarland.de/Q19-1002.pdf