Jun Ogata


2019

Our Neural Machine Translation Systems for WAT 2019
Wei Yang | Jun Ogata
Proceedings of the 6th Workshop on Asian Translation

In this paper, we describe our Neural Machine Translation (NMT) systems for the WAT 2019 translation tasks. This year we participate in the scientific paper tasks, focusing on the language pair of English and Japanese. We use the Transformer model throughout this work to explore the power of an architecture that relies on the self-attention mechanism. We train the Transformer model with several different NMT toolkits/libraries, and for word segmentation we apply a different subword segmentation strategy with each toolkit/library. We report not only the translation accuracy obtained with the absolute position encodings introduced in the original Transformer model, but also the improvements in translation accuracy achieved by replacing absolute position encodings with relative position representations. We also ensemble several independently trained Transformer models to further improve translation accuracy.
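The relative position representations mentioned in the abstract replace the Transformer's fixed absolute position encodings with learned embeddings of the clipped offset between query and key positions, added inside the attention score. A minimal NumPy sketch of the key-side variant is below; the function names, the single-head shapes, and the `max_distance` clipping parameter are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def relative_position_ids(seq_len, max_distance):
    # Pairwise offsets j - i, clipped to [-max_distance, max_distance],
    # then shifted into [0, 2 * max_distance] for embedding-table lookup.
    positions = np.arange(seq_len)
    offsets = positions[None, :] - positions[:, None]
    return np.clip(offsets, -max_distance, max_distance) + max_distance

def attention_with_relative_positions(q, k, v, rel_key_emb, max_distance):
    # q, k, v: (seq_len, d) for a single head (hypothetical shapes).
    # rel_key_emb: (2 * max_distance + 1, d) learned relative embeddings.
    seq_len, d = q.shape
    ids = relative_position_ids(seq_len, max_distance)        # (L, L)
    # Score = content term q_i . k_j  +  relative term q_i . a_{j-i}
    scores = q @ k.T + np.einsum('id,ijd->ij', q, rel_key_emb[ids])
    scores = scores / np.sqrt(d)
    # Row-wise softmax over key positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

With absolute encodings the position signal is added once to the input embeddings; here the offset embedding participates directly in every attention score, which is the change the abstract evaluates.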

2010

PodCastle: A Spoken Document Retrieval Service Improved by Anonymous User Contributions
Masataka Goto | Jun Ogata
Proceedings of the 24th Pacific Asia Conference on Language, Information and Computation