
Assessing BERT's Syntactic Abilities

2019-01-16
Yoav Goldberg

Abstract

I assess the extent to which the recently introduced BERT model captures English syntactic phenomena, using (1) naturally-occurring subject-verb agreement stimuli; (2) "colorless green ideas" subject-verb agreement stimuli, in which content words in natural sentences are randomly replaced with words sharing the same part-of-speech and inflection; and (3) manually crafted stimuli for subject-verb agreement and reflexive anaphora phenomena. The BERT model performs remarkably well on all cases.
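For readers curious how such an agreement test can be run against a masked language model, the sketch below shows one possible setup: mask the target verb, then compare BERT's probabilities for the grammatical and ungrammatical verb forms. This is a minimal illustration, not the paper's released code; it assumes the HuggingFace `transformers` library, the `bert-base-uncased` checkpoint, and an invented example sentence.

```python
# Minimal sketch of a masked-LM subject-verb agreement probe (illustrative only).
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def verb_probabilities(masked_sentence, verb_pair):
    """Return P(verb) at the [MASK] position for each candidate verb form."""
    inputs = tokenizer(masked_sentence, return_tensors="pt")
    # Locate the [MASK] token in the input.
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits[0, mask_pos], dim=-1).squeeze(0)
    return {v: probs[tokenizer.convert_tokens_to_ids(v)].item() for v in verb_pair}

# The stimulus counts as "passed" if the grammatical form gets the higher probability.
# Hypothetical example sentence, with an agreement attractor ("cabinet") between subject and verb:
print(verb_probabilities("the keys to the cabinet [MASK] on the table", ("are", "is")))
```

Over a set of stimuli, the fraction of items where the correct form wins gives an accuracy figure comparable in spirit to the numbers reported in the paper.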

URL

http://arxiv.org/abs/1901.05287

PDF

http://arxiv.org/pdf/1901.05287
