Sarah’s Participation in WAT 2019

Raymond Hendy Susanto, Ohnmar Htun, Liling Tan


Abstract
This paper describes our MT systems' participation in WAT 2019. We participated in the (i) Patent, (ii) Timely Disclosure, (iii) Newswire and (iv) Mixed-domain tasks. Our main focus is to explore how similar Transformer models perform on these various tasks. We observed that for tasks with smaller datasets, our best setup is a shallower model with fewer attention heads. We investigated practical issues in NMT that often appear in production settings, such as coping with multilinguality and simplifying the pre- and post-processing pipeline in deployment.
Anthology ID:
D19-5219
Volume:
Proceedings of the 6th Workshop on Asian Translation
Month:
November
Year:
2019
Address:
Hong Kong, China
Venues:
EMNLP | WAT | WS
Publisher:
Association for Computational Linguistics
Pages:
152–158
URL:
https://www.aclweb.org/anthology/D19-5219
DOI:
10.18653/v1/D19-5219
PDF:
http://aclanthology.lst.uni-saarland.de/D19-5219.pdf