
Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding

2019-04-20
Guokan Shang (1 and 2), Antoine Jean-Pierre Tixier (1), Michalis Vazirgiannis (1 and 3), Jean-Pierre Lorré (2) ((1) École Polytechnique, (2) Linagora, (3) AUEB)

Abstract

Abstractive Community Detection is an important Spoken Language Understanding task, whose goal is to group utterances in a conversation according to whether they can be jointly summarized by a common abstractive sentence. This paper provides a novel approach to this task. We first introduce a neural contextual utterance encoder featuring three types of self-attention mechanisms. We then evaluate it against multiple baselines within the powerful siamese and triplet energy-based meta-architectures. Moreover, we propose a general sampling scheme that enables the triplet architecture to capture subtle clustering patterns, such as overlapping and nested communities. Experiments on the AMI corpus show that our system improves on the state-of-the-art and that our triplet sampling scheme is effective. Code and data are publicly available.
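To make the triplet energy-based idea concrete, below is a minimal, hypothetical sketch (in PyTorch) of how utterance embeddings could be trained with a triplet objective: an anchor utterance is pulled toward a positive utterance from the same abstractive community and pushed away from a negative one. The encoder, vocabulary size, and hyperparameters here are illustrative stand-ins and do not reproduce the paper's actual architecture or sampling scheme.

```python
# Hypothetical sketch: triplet energy-based training of utterance embeddings.
# All names and sizes are illustrative, not the paper's implementation.
import torch
import torch.nn as nn


class UtteranceEncoder(nn.Module):
    """Toy self-attentive utterance encoder (simplified stand-in)."""

    def __init__(self, vocab_size=10000, embed_dim=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, token_ids):
        x = self.embed(token_ids)            # (batch, seq_len, embed_dim)
        attended, _ = self.attn(x, x, x)     # self-attention over the utterance
        return attended.mean(dim=1)          # pooled utterance embedding


encoder = UtteranceEncoder()
# Energy = distance between embeddings; the triplet loss pushes the anchor
# closer to a positive (same community) than to a negative (different one).
triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)

# Dummy token-id batches standing in for anchor / positive / negative utterances.
anchor = encoder(torch.randint(0, 10000, (8, 20)))
positive = encoder(torch.randint(0, 10000, (8, 20)))
negative = encoder(torch.randint(0, 10000, (8, 20)))

loss = triplet_loss(anchor, positive, negative)
loss.backward()
```

In a siamese variant, the same shared encoder would instead score pairs of utterances as belonging to the same community or not; the triplet formulation above is what allows the sampling scheme to express subtler patterns such as overlapping and nested communities.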

URL

http://arxiv.org/abs/1904.09491

PDF

http://arxiv.org/pdf/1904.09491
