Feng Shi


2020

Structured Attention for Unsupervised Dialogue Structure Induction
Liang Qiu | Yizhou Zhao | Weiyan Shi | Yuan Liang | Feng Shi | Tao Yuan | Zhou Yu | Song-Chun Zhu
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Inducing a meaningful structural representation from one or a set of dialogues is a crucial but challenging task in computational linguistics. Advances in this area are critical for dialogue system design and discourse analysis, and the task can also be extended to grammatical inference. In this work, we propose to incorporate structured attention layers into a Variational Recurrent Neural Network (VRNN) model with discrete latent states to learn dialogue structure in an unsupervised fashion. Compared to a vanilla VRNN, structured attention enables the model to focus on different parts of the source sentence embeddings while enforcing a structural inductive bias. Experiments show that on two-party dialogue datasets, the VRNN with structured attention learns semantic structures similar to the templates used to generate the dialogue corpus, while on multi-party dialogue datasets, our model learns an interactive structure that demonstrates its capability of distinguishing speakers or addressees, automatically disentangling dialogues without explicit human annotation.
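The sketch below is a minimal, illustrative rendering of the kind of architecture the abstract describes: one recurrent step with a discrete latent dialogue state and an attention pool over token embeddings of an utterance. It is not the authors' code; a plain softmax attention stands in for the paper's structured attention, and all module names, dimensions, and the Gumbel-softmax relaxation are assumptions made for illustration.

```python
# Illustrative sketch only (not the paper's implementation): one VRNN-style step
# with a discrete latent state and a simple softmax attention pool standing in
# for structured attention. Sizes and module choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveDiscreteVRNNCell(nn.Module):
    def __init__(self, emb_dim=64, hid_dim=64, n_states=10):
        super().__init__()
        self.attn_score = nn.Linear(emb_dim, 1)                    # token-level attention scores
        self.prior = nn.Linear(hid_dim, n_states)                  # p(z_t | h_{t-1})
        self.posterior = nn.Linear(hid_dim + emb_dim, n_states)    # q(z_t | h_{t-1}, x_t)
        self.rnn = nn.GRUCell(emb_dim + n_states, hid_dim)

    def forward(self, tok_emb, h_prev, tau=1.0):
        # tok_emb: (batch, seq_len, emb_dim) token embeddings of one utterance
        # h_prev:  (batch, hid_dim) recurrent state summarizing the dialogue so far
        attn = F.softmax(self.attn_score(tok_emb), dim=1)          # (batch, seq_len, 1)
        utt = (attn * tok_emb).sum(dim=1)                          # attended utterance vector
        prior_logits = self.prior(h_prev)
        post_logits = self.posterior(torch.cat([h_prev, utt], dim=-1))
        # Discrete latent dialogue state via a Gumbel-softmax relaxation (assumption).
        z = F.gumbel_softmax(post_logits, tau=tau, hard=True)
        h = self.rnn(torch.cat([utt, z], dim=-1), h_prev)
        # KL between posterior and prior over the discrete states (ELBO term).
        kl = (F.softmax(post_logits, -1)
              * (F.log_softmax(post_logits, -1) - F.log_softmax(prior_logits, -1))).sum(-1)
        return h, z, kl


# Usage: one dialogue turn for a batch of 2 utterances of 5 tokens each.
cell = AttentiveDiscreteVRNNCell()
tok_emb = torch.randn(2, 5, 64)
h0 = torch.zeros(2, 64)
h1, z1, kl1 = cell(tok_emb, h0)
print(z1.argmax(-1), kl1.shape)
```

In the paper's setting, the softmax pool would be replaced by a structured attention layer whose weights come from marginals of a structured model, which is what supplies the structural inductive bias the abstract mentions.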

2014

Personal Attributes Extraction in Chinese Text Bakeoff in CLP 2014: Overview
Ruifeng Xu | Shuai Wang | Feng Shi | Jian Xu
Proceedings of the Third CIPS-SIGHAN Joint Conference on Chinese Language Processing