Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019

Aizhan Imankulova, Masahiro Kaneko, Mamoru Komachi


Abstract
We introduce our system submitted to the News Commentary task (Japanese<->Russian) of the 6th Workshop on Asian Translation. The goal of this shared task is to study extremely low-resource situations for distant language pairs. It is known that using parallel corpora of different language pairs as training data is effective for multilingual neural machine translation (NMT) models in extremely low-resource scenarios. Therefore, to improve the translation quality of the Japanese<->Russian language pair, our method leverages other in-domain Japanese-English and English-Russian parallel corpora as additional training data for our multilingual NMT model.
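The abstract describes combining Japanese-English and English-Russian corpora with the Japanese-Russian data to train one multilingual model. A common way to do this is to merge all corpora into a single training set, marking each source sentence with a target-language token so the model learns which language to produce. The sketch below illustrates that data-preparation step; the `<2xx>` token convention and the toy sentences are assumptions for illustration, not details from the paper.

```python
def tag_source(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token so one model serves all directions.

    The <2xx> convention is a common multilingual-NMT choice (an assumption
    here; the paper does not specify its exact tagging scheme).
    """
    return f"<2{tgt_lang}> {src_sentence}"


def build_multilingual_corpus(corpora):
    """Merge several parallel corpora into one tagged training set.

    Each corpus is a (source_sentences, target_sentences, tgt_lang) triple.
    Returns a list of (tagged_source, target) pairs.
    """
    training_pairs = []
    for src_sentences, tgt_sentences, tgt_lang in corpora:
        for src, tgt in zip(src_sentences, tgt_sentences):
            training_pairs.append((tag_source(src, tgt_lang), tgt))
    return training_pairs


# Toy one-sentence corpora standing in for the Ja-Ru, Ja-En, and En-Ru data.
ja_ru = (["こんにちは"], ["Здравствуйте"], "ru")
ja_en = (["ありがとう"], ["Thank you"], "en")
en_ru = (["Good morning"], ["Доброе утро"], "ru")

corpus = build_multilingual_corpus([ja_ru, ja_en, en_ru])
```

All three language pairs then share one encoder-decoder, which is what lets the Ja-En and En-Ru data improve the low-resource Ja-Ru directions.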
Anthology ID:
D19-5221
Volume:
Proceedings of the 6th Workshop on Asian Translation
Month:
November
Year:
2019
Address:
Hong Kong, China
Venues:
EMNLP | WAT | WS
Publisher:
Association for Computational Linguistics
Pages:
165–170
URL:
https://www.aclweb.org/anthology/D19-5221
DOI:
10.18653/v1/D19-5221
PDF:
http://aclanthology.lst.uni-saarland.de/D19-5221.pdf