Chantal Amrhein
2020

On Romanization for Model Transfer Between Scripts in Neural Machine Translation
Chantal Amrhein | Rico Sennrich
Findings of the Association for Computational Linguistics: EMNLP 2020

Transfer learning is a popular strategy to improve the quality of low-resource machine translation. For an optimal transfer of the embedding layer, the child and parent model should share a substantial part of the vocabulary. This is not the case when transferring to languages with a different script. We explore the benefit of romanization in this scenario. Our results show that romanization entails information loss and is thus not always superior to simpler vocabulary transfer methods, but can improve the transfer between related languages with different scripts. We compare two romanization tools and find that they exhibit different degrees of information loss, which affects translation quality. Finally, we extend romanization to the target side, showing that this can be a successful strategy when coupled with a simple deromanization model.
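To illustrate the core idea of the abstract, here is a minimal sketch, assuming the unidecode Python package as a stand-in romanizer (it is not one of the two tools compared in the paper). It shows how romanization moves source words into Latin script so parent and child vocabularies can overlap, and how distinct characters can collapse to the same romanized form, which is the information loss that makes a separate deromanization model necessary on the target side.

# Minimal sketch of romanization for vocabulary sharing, using the
# `unidecode` package purely for illustration (hypothetical choice;
# not one of the romanization tools compared in the paper).
from unidecode import unidecode

# Romanization maps Cyrillic words into Latin script, so a parent model
# trained on Latin-script text can share subword vocabulary with them.
for word in ["машина", "перевод"]:
    print(word, "->", unidecode(word))   # mashina, perevod

# Information loss: distinct source characters can collapse to the same
# romanized form, so the mapping is not invertible in general.
print(unidecode("е"), unidecode("э"))    # both romanize to "e"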

2019

Post-editing Productivity with Neural Machine Translation: An Empirical Assessment of Speed and Quality in the Banking and Finance Domain
Samuel Läubli | Chantal Amrhein | Patrick Düggelin | Beatriz Gonzalez | Alena Zwahlen | Martin Volk
Proceedings of Machine Translation Summit XVII Volume 1: Research Track

2017

C-3MA: Tartu-Riga-Zurich Translation Systems for WMT17
Matīss Rikters | Chantal Amrhein | Maksym Del | Mark Fishel
Proceedings of the Second Conference on Machine Translation