Parsing in the absence of related languages: Evaluating low-resource dependency parsers on Tagalog
Angelina Aquino, Franz de Leon
Proceedings of the Fourth Workshop on Universal Dependencies (UDW 2020)
Cross-lingual and multilingual methods have been widely suggested for dependency parsing of low-resource languages; however, they typically require annotated data in related high-resource languages. In this paper, we evaluate the performance of these methods against monolingual parsing of Tagalog, an Austronesian language which shares little typological similarity with any existing high-resource language. We show that a monolingual model trained on minimal target-language data consistently outperforms all cross-lingual and multilingual models when no closely related source languages exist for a low-resource language.
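Dependency parsers such as those compared here are conventionally evaluated with unlabeled and labeled attachment scores (UAS/LAS) — the fraction of tokens assigned the correct head, and the correct head plus dependency label. The abstract does not specify its metrics, so the sketch below is a generic illustration of these standard scores, not the paper's own evaluation code; the example sentence and labels are invented for demonstration.

```python
def attachment_scores(gold, pred):
    """Compute (UAS, LAS) for one sentence.

    gold, pred: lists of (head_index, deprel) pairs, one per token,
    where head_index 0 denotes the artificial root.
    """
    assert len(gold) == len(pred) and len(gold) > 0
    total = len(gold)
    # UAS: head index matches, regardless of the dependency label.
    uas = sum(1 for (gh, _), (ph, _) in zip(gold, pred) if gh == ph) / total
    # LAS: both head index and dependency label match.
    las = sum(1 for g, p in zip(gold, pred) if g == p) / total
    return uas, las

# Hypothetical 4-token sentence: prediction gets 3/4 heads right,
# but only 2/4 head+label pairs right.
gold = [(2, "nsubj"), (0, "root"), (2, "obj"), (3, "nmod")]
pred = [(2, "nsubj"), (0, "root"), (2, "obl"), (2, "nmod")]
print(attachment_scores(gold, pred))  # (0.75, 0.5)
```

In practice these scores are computed corpus-wide over CoNLL-U treebanks (e.g. with the official CoNLL 2018 shared task evaluation script) rather than per sentence, but the per-token comparison is the same.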