
Multi-Task Deep Neural Networks for Natural Language Understanding

2019-01-31
Xiaodong Liu, Pengcheng He, Weizhu Chen, Jianfeng Gao

Abstract

In this paper, we present a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple natural language understanding (NLU) tasks. MT-DNN not only leverages large amounts of cross-task data, but also benefits from a regularization effect that leads to more general representations in order to adapt to new tasks and domains. MT-DNN extends the model proposed in Liu et al. (2015) by incorporating a pre-trained bidirectional transformer language model, known as BERT (Devlin et al., 2018). MT-DNN obtains new state-of-the-art results on ten NLU tasks, including SNLI, SciTail, and eight out of nine GLUE tasks, pushing the GLUE benchmark to 82.2% (1.8% absolute improvement). We also demonstrate using the SNLI and SciTail datasets that the representations learned by MT-DNN allow domain adaptation with substantially fewer in-domain labels than the pre-trained BERT representations. Our code and pre-trained models will be made publicly available.
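To make the multi-task setup concrete, here is a minimal sketch of the general idea the abstract describes: a single shared text encoder with one small task-specific output layer per NLU task, trained on shuffled mini-batches drawn from all tasks. This is not the authors' implementation; it assumes plain PyTorch, uses a tiny Transformer encoder as a stand-in for the pre-trained BERT encoder, and the task names and sizes are hypothetical toy values.

# Minimal multi-task sketch (assumption: PyTorch; a small Transformer
# encoder stands in for the pre-trained BERT encoder used in the paper).
import random
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Shared encoder plus one classification head per task."""
    def __init__(self, vocab_size, d_model, task_num_labels):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # shared across tasks
        self.heads = nn.ModuleDict({
            name: nn.Linear(d_model, n) for name, n in task_num_labels.items()
        })

    def forward(self, token_ids, task):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, d_model)
        pooled = h[:, 0]                         # first-token, "[CLS]"-style pooling
        return self.heads[task](pooled)

# Hypothetical toy setup: two tasks with different label spaces.
tasks = {"mnli": 3, "sst": 2}
model = MultiTaskModel(vocab_size=1000, d_model=64, task_num_labels=tasks)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One pass of mini-batch multi-task training: batches from all tasks are
# shuffled together, and each batch updates the shared encoder plus the
# head of the task it came from.
batches = [(name, torch.randint(0, 1000, (8, 16)), torch.randint(0, n, (8,)))
           for name, n in tasks.items() for _ in range(5)]
random.shuffle(batches)
for task, x, y in batches:
    opt.zero_grad()
    loss = loss_fn(model(x, task), y)
    loss.backward()
    opt.step()

The key design point, reflected above, is that the encoder parameters receive gradients from every task, which is the source of the cross-task data leverage and regularization effect the abstract mentions, while each head stays task-specific.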

URL

http://arxiv.org/abs/1901.11504

PDF

http://arxiv.org/pdf/1901.11504

