Cut to the Chase: A Context Zoom-in Network for Reading Comprehension

Sathish Reddy Indurthi, Seunghak Yu, Seohyun Back, Heriberto Cuayáhuitl


Abstract
In recent years, many deep neural networks have been proposed to solve Reading Comprehension (RC) tasks. Most of these models struggle to reason over long documents and do not readily generalize to cases where the answer is not present as a span in a given document. We present a novel neural architecture that extracts relevant regions of a document based on a given question-document pair and generates a well-formed answer. To demonstrate the effectiveness of our architecture, we conducted several experiments on the recently proposed and challenging RC dataset NarrativeQA. The proposed architecture outperforms the state of the art by a 12.62% relative improvement in ROUGE-L.
Anthology ID:
D18-1054
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
570–575
URL:
https://www.aclweb.org/anthology/D18-1054
DOI:
10.18653/v1/D18-1054
PDF:
http://aclanthology.lst.uni-saarland.de/D18-1054.pdf
Attachment:
 D18-1054.Attachment.zip
Video:
 https://vimeo.com/305205548