Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding

Guokan Shang1, Antoine Tixier1, Michalis Vazirgiannis1, Jean-Pierre Lorré2
1École Polytechnique, Palaiseau, France, 2Linagora


Abstract

Abstractive community detection is an important spoken language understanding task whose goal is to group the utterances of a conversation according to whether they can be jointly summarized by a common abstractive sentence. This paper proposes a novel approach to this task. We first introduce a neural contextual utterance encoder featuring three types of self-attention mechanisms. We then train it using the siamese and triplet energy-based meta-architectures. Experiments on the AMI corpus show that our system outperforms multiple energy-based and non-energy-based baselines from the state of the art. Code and data are publicly available.
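To give a rough intuition for the triplet energy-based meta-architecture mentioned above, the sketch below shows a generic triplet margin loss over utterance embeddings: it pulls an utterance toward another utterance from the same abstractive community and pushes it away from one belonging to a different community. This is a minimal, hypothetical illustration with toy 2-D vectors and a Euclidean energy, not the paper's exact formulation or encoder.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Generic triplet margin loss (illustrative only):
    # penalize when the anchor is not closer to the positive
    # (same community) than to the negative (different community)
    # by at least `margin`, using Euclidean distance as the energy.
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Toy 2-D utterance embeddings (hypothetical values).
a = np.array([0.0, 0.0])   # anchor utterance
p = np.array([0.1, 0.0])   # same abstractive community as the anchor
n = np.array([3.0, 0.0])   # different community

print(triplet_loss(a, p, n))  # satisfied triplet -> loss 0.0
print(triplet_loss(a, n, p))  # violated triplet -> positive loss
```

During training, such a loss shapes the embedding space so that a simple clustering step can then recover the communities; the siamese variant works analogously on pairs instead of triplets.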