Find or Classify? Dual Strategy for Slot-Value Predictions on Multi-Domain Dialog State Tracking

Jianguo Zhang, Kazuma Hashimoto, Chien-Sheng Wu, Yao Wang, Philip Yu, Richard Socher, Caiming Xiong


Abstract
Dialog state tracking (DST) is a core component in task-oriented dialog systems. Existing approaches for DST mainly fall into one of two categories, namely, ontology-based and ontology-free methods. An ontology-based method selects a value from a candidate-value list for each target slot, while an ontology-free method extracts spans from dialog contexts. Recent work introduced a BERT-based model to strike a balance between the two methods by pre-defining categorical and non-categorical slots. However, it remains unclear which slots are better handled by which of the two slot types, and how best to use the pre-trained model has not been well investigated. In this paper, we propose a simple yet effective dual-strategy model for DST, adapting a single BERT-style reading comprehension model to jointly handle both the categorical and non-categorical slots. Our experiments on the MultiWOZ datasets show that our method significantly outperforms the BERT-based counterpart, and we find that the key is a deep interaction between the domain-slot and context information. When evaluated on noisy (MultiWOZ 2.0) and cleaner (MultiWOZ 2.1) settings, our method performs competitively and robustly across the two different settings. Our method sets the new state of the art in the noisy setting, while performing more robustly than the best model in the cleaner setting. We also conduct a comprehensive error analysis on the dataset, including the effects of the dual strategy for each slot, to facilitate future research.
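The dual strategy the abstract describes can be illustrated with a minimal sketch: categorical slots are handled by classifying over a candidate-value list ("classify"), while non-categorical slots are handled by extracting a span from the dialog context ("find"). This is NOT the authors' implementation; the slot names, the toy scorer, and the naive span matcher are hypothetical stand-ins for the BERT-based components.

```python
# Illustrative sketch of the dual strategy: route each slot either to
# classification over an ontology's candidate values, or to span extraction
# from the dialog context. Toy heuristics replace the BERT-based model.

CATEGORICAL_SLOTS = {"hotel-parking": ["yes", "no", "dontcare"]}  # ontology-based
NON_CATEGORICAL_SLOTS = {"taxi-departure"}                        # span-extractive


def predict_slot(slot: str, context: str) -> str:
    """Predict a slot value via the strategy appropriate to the slot type."""
    if slot in CATEGORICAL_SLOTS:
        # Classify: pick the candidate value best supported by the context
        # (here, a simple occurrence count stands in for a learned scorer).
        candidates = CATEGORICAL_SLOTS[slot]
        return max(candidates, key=lambda v: context.lower().count(v))
    # Find: extract a span from the dialog context
    # (here, a naive keyword match stands in for a learned span predictor).
    tokens = context.split()
    for i, tok in enumerate(tokens[:-1]):
        if tok == "from":
            return tokens[i + 1]
    return "none"


print(predict_slot("hotel-parking", "user: does the hotel have parking? yes it does"))
print(predict_slot("taxi-departure", "user: i need a taxi from cambridge"))
```

In the real model, both branches share a single BERT-style reading-comprehension encoder; only the prediction head differs by slot type.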
Anthology ID: 2020.starsem-1.17
Volume: Proceedings of the Ninth Joint Conference on Lexical and Computational Semantics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Venues: *SEMEVAL | COLING | starsem
SIG: SIGLEX
Publisher: Association for Computational Linguistics
Pages: 154–167
URL: https://www.aclweb.org/anthology/2020.starsem-1.17
PDF: http://aclanthology.lst.uni-saarland.de/2020.starsem-1.17.pdf