
BERT for Joint Intent Classification and Slot Filling

2019-02-28
Qian Chen, Zhu Zhuo, Wen Wang

Abstract

Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently, a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), has facilitated pre-training deep bidirectional representations on large-scale unlabeled corpora, and has produced state-of-the-art models for a wide variety of natural language processing tasks after simple fine-tuning. However, there has not been much effort in exploring BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvement on intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to attention-based recurrent neural network models and slot-gated models.
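To make the joint setup concrete, below is a minimal sketch of how a BERT-based model can handle both tasks at once: the pooled [CLS] representation drives intent classification, the per-token hidden states drive slot tagging, and the two cross-entropy losses are summed for joint fine-tuning. This is an illustration assuming PyTorch and the Hugging Face transformers library, not the authors' released implementation; the class name JointBert, the label counts, and the dropout value are assumptions.

```python
# Hypothetical sketch of joint intent classification and slot filling on BERT.
# Not the paper's official code; names and hyperparameters are assumptions.
import torch.nn as nn
from transformers import BertModel


class JointBert(nn.Module):
    def __init__(self, num_intents: int, num_slot_labels: int, dropout: float = 0.1):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(dropout)
        # Intent head: one label per utterance, read from the pooled [CLS] vector.
        self.intent_classifier = nn.Linear(hidden, num_intents)
        # Slot head: one label per token (e.g. a BIO tagging scheme).
        self.slot_classifier = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask, intent_labels=None, slot_labels=None):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_classifier(self.dropout(outputs.pooler_output))
        slot_logits = self.slot_classifier(self.dropout(outputs.last_hidden_state))

        loss = None
        if intent_labels is not None and slot_labels is not None:
            # ignore_index masks padding and sub-word positions in the slot labels.
            ce = nn.CrossEntropyLoss(ignore_index=-100)
            intent_loss = ce(intent_logits, intent_labels)
            slot_loss = ce(slot_logits.view(-1, slot_logits.size(-1)), slot_labels.view(-1))
            # Joint objective: sum of the two task losses.
            loss = intent_loss + slot_loss
        return loss, intent_logits, slot_logits
```

Fine-tuning would then minimize this joint loss over labeled utterances, with evaluation by intent accuracy, slot F1, and sentence-level semantic frame accuracy, the metrics reported in the paper.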

URL

http://arxiv.org/abs/1902.10909

PDF

http://arxiv.org/pdf/1902.10909

