
Semantically Conditioned Dialog Response Generation via Hierarchical Disentangled Self-Attention

2019-05-30
Wenhu Chen, Jianshu Chen, Pengda Qin, Xifeng Yan, William Yang Wang

Abstract

Semantically controlled neural response generation in limited domains has achieved strong performance. However, moving toward multi-domain, large-scale scenarios is difficult because the number of possible combinations of semantic inputs grows exponentially with the number of domains. To alleviate this scalability issue, we exploit the structure of dialog acts to build a multi-layer hierarchical graph, where each act is represented as a root-to-leaf route on the graph. We then incorporate this graph structure as an inductive bias to build a hierarchical disentangled self-attention network, where attention heads are disentangled to model designated nodes on the dialog act graph. By activating different (disentangled) heads at each layer, combinatorially many dialog act semantics can be modeled to control the neural response generation. On the large-scale Multi-Domain-WOZ dataset, our algorithm yields an improvement of over 5.0 BLEU score, and in human evaluation it also significantly outperforms other baselines on various metrics, including consistency.
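The core idea in the abstract is that each attention head is tied to a node of the dialog act graph and is switched on or off depending on which acts are present. Below is a minimal, hedged sketch of that "disentangled head" mechanism, assuming a PyTorch implementation; the class and argument names (`DisentangledSelfAttention`, `act_switch`, the layer sizes) are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: one head per dialog-act-graph node, gated by a
# binary switch vector derived from the input dialog act (assumed interface).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DisentangledSelfAttention(nn.Module):
    """One self-attention layer whose heads correspond to the nodes of one
    level of the dialog act graph; inactive nodes have their heads zeroed."""

    def __init__(self, d_model: int, num_nodes: int):
        super().__init__()
        self.num_nodes = num_nodes            # one head per graph node at this level
        self.d_head = d_model // num_nodes
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor, act_switch: torch.Tensor) -> torch.Tensor:
        # x:          (batch, seq_len, d_model)
        # act_switch: (batch, num_nodes), entries in {0, 1}; a 1 activates the
        #             head that models the corresponding dialog act node.
        B, T, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split into per-node heads: (batch, num_nodes, seq_len, d_head)
        split = lambda t: t.view(B, T, self.num_nodes, self.d_head).transpose(1, 2)
        q, k, v = split(q), split(k), split(v)
        attn = F.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        heads = attn @ v                                  # (B, nodes, T, d_head)
        # Disentanglement: zero out heads whose graph node is not active.
        heads = heads * act_switch[:, :, None, None]
        merged = heads.transpose(1, 2).reshape(B, T, -1)  # back to (B, T, d_model)
        return self.out(merged)


# Usage sketch: stack layers mirroring a hypothetical 3-level act graph
# (e.g. domain -> act type -> slot), each with its own switch vector.
if __name__ == "__main__":
    layers = nn.ModuleList([
        DisentangledSelfAttention(d_model=256, num_nodes=8),   # domain level
        DisentangledSelfAttention(d_model=256, num_nodes=8),   # act-type level
        DisentangledSelfAttention(d_model=256, num_nodes=16),  # slot level
    ])
    x = torch.randn(2, 10, 256)
    switches = [torch.randint(0, 2, (2, n)).float() for n in (8, 8, 16)]
    for layer, s in zip(layers, switches):
        x = layer(x, s)
    print(x.shape)  # torch.Size([2, 10, 256])
```

Because activating a different subset of heads at each level selects a different root-to-leaf route through the graph, the same small set of heads can express combinatorially many dialog act configurations, which is the scalability argument the paper makes.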

URL

http://arxiv.org/abs/1905.12866

PDF

http://arxiv.org/pdf/1905.12866

