Go Figure! Multi-task transformer-based architecture for metaphor detection using idioms: ETS team in 2020 metaphor shared task

Xianyang Chen, Chee Wee (Ben) Leong, Michael Flor, Beata Beigman Klebanov


Abstract
This paper describes the ETS entry to the 2020 Metaphor Detection shared task. Our contribution consists of a sequence of experiments with BERT: we start with a baseline, strengthen it by spell-correcting the TOEFL corpus, and then move to a multi-task learning setting, where one task is the token-level metaphor classification required by the shared task and the other provides additional training that we hypothesized to be relevant to the main task. In one case, the auxiliary task uses out-of-domain data manually annotated for metaphor; in the other, it uses in-domain data automatically annotated for idioms. Both multi-task experiments yield promising results.
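The multi-task setup described above can be sketched as a shared encoder feeding two per-token classification heads whose losses are combined. The numpy sketch below is an illustration of that pattern only, not the authors' implementation: the dimensions, the `idiom` task weight of 0.5, and all function names are hypothetical, and a random matrix stands in for the BERT token representations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a shared encoder (BERT in the paper) emits one
# hidden vector per token; each task gets its own linear head on top.
HIDDEN, N_TOKENS = 8, 5

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multitask_loss(hidden, heads, labels, weights):
    """Weighted sum of per-task, per-token binary cross-entropy losses.

    hidden : (N_TOKENS, HIDDEN) shared token representations
    heads  : per-task (W, b) linear-head parameters
    labels : per-task 0/1 token labels (e.g. metaphor, idiom)
    weights: per-task loss weights (illustrative values only)
    """
    total = 0.0
    for task, (W, b) in heads.items():
        p = sigmoid(hidden @ W + b).ravel()          # per-token probability
        y = labels[task]
        bce = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
        total += weights[task] * bce
    return total

# Stand-in for encoder output and the two tasks' token-level labels.
hidden = rng.normal(size=(N_TOKENS, HIDDEN))
heads = {t: (rng.normal(size=(HIDDEN, 1)) * 0.1, np.zeros(1))
         for t in ("metaphor", "idiom")}
labels = {"metaphor": np.array([0, 1, 0, 0, 1]),
          "idiom":    np.array([0, 1, 1, 0, 0])}

loss = multitask_loss(hidden, heads, labels,
                      {"metaphor": 1.0, "idiom": 0.5})
print(loss > 0.0)
```

In training, gradients from both losses would update the shared encoder, which is how the auxiliary annotation (metaphor in one experiment, idioms in the other) can inform the main metaphor-classification task.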
Anthology ID:
2020.figlang-1.32
Volume:
Proceedings of the Second Workshop on Figurative Language Processing
Month:
July
Year:
2020
Address:
Online
Venues:
ACL | Fig-Lang | WS
Publisher:
Association for Computational Linguistics
Pages:
235–243
URL:
https://www.aclweb.org/anthology/2020.figlang-1.32
DOI:
10.18653/v1/2020.figlang-1.32
PDF:
http://aclanthology.lst.uni-saarland.de/2020.figlang-1.32.pdf
Video:
http://slideslive.com/38929727