
Structural Supervision Improves Learning of Non-Local Grammatical Dependencies

2019-03-03
Ethan Wilcox, Peng Qian, Richard Futrell, Miguel Ballesteros, Roger Levy

Abstract

State-of-the-art LSTM language models trained on large corpora learn sequential contingencies in impressive detail and have been shown to acquire a number of non-local grammatical dependencies with some success. Here we investigate whether supervision with hierarchical structure enhances learning of a range of grammatical dependencies, a question that has previously been addressed only for subject-verb agreement. Using controlled experimental methods from psycholinguistics, we compare the performance of word-based LSTM models versus Recurrent Neural Network Grammars (RNNGs) (Dyer et al., 2016), which represent hierarchical syntactic structure and use neural control to deploy it in left-to-right processing, on two classes of non-local grammatical dependencies in English, negative polarity licensing and filler-gap dependencies, tested in a range of configurations. Using the same training data for both models, we find that the RNNG outperforms the LSTM on both types of grammatical dependencies and even learns many of the island constraints on the filler-gap dependency. Structural supervision thus provides data-efficiency advantages over purely string-based training of neural language models in acquiring human-like generalizations about non-local grammatical dependencies.
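
The evaluation paradigm referenced in the abstract is worth making concrete: the model is shown minimally different sentence pairs, and a model that has learned a dependency should assign higher surprisal (negative log probability) to the ill-formed member of the pair. Below is a minimal sketch of that paradigm in Python, using GPT-2 via the HuggingFace transformers library as a stand-in language model; the paper itself evaluates LSTMs and RNNGs trained on matched data, and the NPI sentence pair here is an illustrative simplification, not one of the paper's stimuli.

# Minimal sketch of the surprisal-based evaluation paradigm (illustrative;
# GPT-2 stands in for the paper's LSTM/RNNG models, and the sentence pair
# is a simplified NPI-licensing item, not one of the paper's stimuli).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def total_surprisal(sentence: str) -> float:
    """Sum of per-token surprisals, -log2 p(token | prefix), in bits."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # The logit at position t predicts token t+1, so align accordingly.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    token_log_probs = log_probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)
    return -(token_log_probs.sum() / torch.log(torch.tensor(2.0))).item()

# NPI licensing: "ever" requires a licensor such as negation earlier in the
# sentence; the second sentence lacks one and is ill-formed.
licensed = "No student who liked the class has ever failed."
unlicensed = "The student who liked the class has ever failed."

# A model that has learned the non-local dependency should assign
# higher surprisal to the unlicensed sentence.
print(total_surprisal(licensed))
print(total_surprisal(unlicensed))

The paper's analyses target surprisal at critical regions (for example, at the NPI itself) rather than whole sentences, but the comparison logic is the same: grammatical knowledge shows up as a surprisal difference between minimally contrasting conditions.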

URL

http://arxiv.org/abs/1903.00943

PDF

http://arxiv.org/pdf/1903.00943

