POSTECH Submission on Duolingo Shared Task

Junsu Park, Hongseok Kwon, Jong-Hyeok Lee


Abstract
In this paper, we propose a transfer-learning-based simultaneous translation model built by extending BART. We pre-trained BART on Korean Wikipedia and a Korean news dataset, then fine-tuned it on an additional web-crawled parallel corpus together with the 2020 Duolingo official training dataset. On the 2020 Duolingo test dataset, our submission achieves a weighted macro F1 score of 0.312, ranking second among the submitted En-Ko systems.
Anthology ID:
2020.ngt-1.16
Volume:
Proceedings of the Fourth Workshop on Neural Generation and Translation
Month:
July
Year:
2020
Address:
Online
Venues:
ACL | NGT | WS
Publisher:
Association for Computational Linguistics
Pages:
139–143
URL:
https://www.aclweb.org/anthology/2020.ngt-1.16
DOI:
10.18653/v1/2020.ngt-1.16
PDF:
http://aclanthology.lst.uni-saarland.de/2020.ngt-1.16.pdf
Video:
http://slideslive.com/38929830