Dependency-Based Relative Positional Encoding for Transformer NMT

Yutaro Omote, Akihiro Tamura, Takashi Ninomiya


Abstract
This paper proposes a new Transformer neural machine translation model that incorporates the syntactic distance between two source words into the relative position representations of the self-attention mechanism. Specifically, the proposed model encodes pairwise relative depths on the source dependency tree, i.e., the differences between the depths of the two source words, in the encoder's self-attention. Experiments show that the proposed model achieves a 0.5-point BLEU gain on the Asian Scientific Paper Excerpt Corpus (ASPEC) Japanese-to-English translation task.
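The page contains no code, so the following is only a minimal, hypothetical sketch of the idea described in the abstract: per-token depths are computed from dependency heads, pairwise depth differences are clipped and mapped to learned relative-position embeddings, and those embeddings are added to the attention scores in the style of Shaw et al. (2018) relative position representations. The single-head simplification and all names (e.g. `DependencyRelativeSelfAttention`, `max_rel_depth`) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def dependency_depths(heads):
    """Depth of each token in a dependency tree.

    heads[i] is the index of token i's head, or -1 for the root.
    """
    depths = []
    for i in range(len(heads)):
        d, j = 0, i
        while heads[j] != -1:   # walk up to the root, counting arcs
            j = heads[j]
            d += 1
        depths.append(d)
    return depths

class DependencyRelativeSelfAttention(nn.Module):
    """Single-head self-attention with dependency-depth relative embeddings.

    The pairwise relative depth (depth[i] - depth[j]) is clipped to
    [-max_rel_depth, max_rel_depth] and used to index learned key-side
    relative position embeddings (hypothetical sketch).
    """
    def __init__(self, d_model, max_rel_depth=4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.max_rel_depth = max_rel_depth
        # one embedding per clipped relative depth in [-k, k]
        self.rel_emb = nn.Embedding(2 * max_rel_depth + 1, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x, depths):
        # x: (seq_len, d_model); depths: per-token tree depths
        q, k, v = self.q(x), self.k(x), self.v(x)
        depths = torch.as_tensor(depths)
        rel = depths[:, None] - depths[None, :]            # pairwise relative depth
        rel = rel.clamp(-self.max_rel_depth, self.max_rel_depth) + self.max_rel_depth
        a_k = self.rel_emb(rel)                            # (seq, seq, d_model)
        # content-content term plus content-position term
        scores = (q @ k.T + torch.einsum('id,ijd->ij', q, a_k)) * self.scale
        attn = F.softmax(scores, dim=-1)
        return attn @ v

# Toy usage: "she reads books" with "reads" as the root
heads = [1, -1, 1]
x = torch.randn(3, 16)
layer = DependencyRelativeSelfAttention(d_model=16)
out = layer(x, dependency_depths(heads))   # (3, 16)
```

In this sketch the relative depths play the role that clipped token-offset buckets play in standard relative positional encoding; the actual multi-head formulation and clipping range used in the paper may differ.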
Anthology ID: R19-1099
Volume: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month: September
Year: 2019
Address: Varna, Bulgaria
Venue: RANLP
Publisher: INCOMA Ltd.
Pages: 854–861
URL: https://www.aclweb.org/anthology/R19-1099
DOI: 10.26615/978-954-452-056-4_099
PDF: http://aclanthology.lst.uni-saarland.de/R19-1099.pdf