
Gated Group Self-Attention for Answer Selection

2019-05-26
Dong Xu, Jianhui Ji, Haikuan Huang, Hongbo Deng, Wu-Jun Li

Abstract

Answer selection (answer ranking) is one of the key steps in many kinds of question answering (QA) applications, where deep models have achieved state-of-the-art performance. Among these deep models, recurrent neural network (RNN) based models are the most popular and typically perform better than convolutional neural network (CNN) based models. Nevertheless, it is difficult for RNN based models to capture long-range dependencies among words in the sentences of questions and answers. In this paper, we propose a new deep model, called gated group self-attention (GGSA), for answer selection. GGSA is inspired by global self-attention, which was originally proposed for machine translation and has not previously been explored for answer selection. GGSA tackles a limitation of global self-attention: local and global information cannot be well distinguished. Furthermore, an interaction mechanism between questions and answers, based on a residual structure, is proposed to enhance GGSA. Experimental results on two popular QA datasets show that GGSA can outperform existing answer selection models and achieve state-of-the-art performance. Moreover, GGSA achieves higher accuracy than global self-attention on the answer selection task, with a lower computation cost.
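The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of what a gated group self-attention layer might look like, assuming the sequence is partitioned into fixed-size groups, scaled dot-product self-attention is computed both within each group (local) and over the whole sequence (global), and a learned sigmoid gate mixes the two outputs. The class name, group size, and gating form here are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedGroupSelfAttention(nn.Module):
    """Hypothetical sketch of a gated group self-attention (GGSA) layer.

    Assumptions (not from the paper): the sequence is split into
    fixed-size groups; self-attention runs within each group (local)
    and over the whole sequence (global); a learned sigmoid gate
    mixes the two outputs per position.
    """

    def __init__(self, dim, group_size=8):
        super().__init__()
        self.group_size = group_size
        self.qkv = nn.Linear(dim, 3 * dim)   # shared Q/K/V projection
        self.gate = nn.Linear(2 * dim, dim)  # gate over [local; global]
        self.scale = dim ** -0.5

    def attend(self, q, k, v):
        # Standard scaled dot-product attention.
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        return torch.matmul(F.softmax(scores, dim=-1), v)

    def forward(self, x):
        # x: (batch, seq_len, dim); seq_len assumed divisible by group_size.
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        # Global self-attention over the full sequence.
        global_out = self.attend(q, k, v)

        # Local self-attention within each fixed-size group.
        g = self.group_size
        qg, kg, vg = (t.reshape(b, n // g, g, d) for t in (q, k, v))
        local_out = self.attend(qg, kg, vg).reshape(b, n, d)

        # Sigmoid gate decides, per position, how much local vs.
        # global information to keep.
        gate = torch.sigmoid(self.gate(torch.cat([local_out, global_out], dim=-1)))
        return gate * local_out + (1 - gate) * global_out


# Usage example (hypothetical shapes):
# layer = GatedGroupSelfAttention(dim=64, group_size=8)
# out = layer(torch.randn(2, 32, 64))  # -> (2, 32, 64)
```

Under these assumptions, the within-group attention scores cost O(n·g) rather than O(n²) per sequence, which is consistent with the abstract's claim of lower computation cost relative to pure global self-attention.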

URL

http://arxiv.org/abs/1905.10720

PDF

http://arxiv.org/pdf/1905.10720

