
Cross-topic distributional semantic representations via unsupervised mappings

2019-04-11
Eleftheria Briakou, Nikos Athanasiou, Alexandros Potamianos

Abstract

In traditional Distributional Semantic Models (DSMs) the multiple senses of a polysemous word are conflated into a single vector space representation. In this work, we propose a DSM that learns multiple distributional representations of a word based on different topics. First, a separate DSM is trained for each topic and then each of the topic-based DSMs is aligned to a common vector space. Our unsupervised mapping approach is motivated by the hypothesis that words preserving their relative distances in different topic semantic sub-spaces constitute robust *semantic anchors* that define the mappings between them. Aligned cross-topic representations achieve state-of-the-art results for the task of contextual word similarity. Furthermore, evaluation on NLP downstream tasks shows that multiple topic-based embeddings outperform single-prototype models.
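The abstract describes a two-stage pipeline: train a separate DSM per topic, then align the topic sub-spaces to a common space using anchor words whose relative distances are preserved across sub-spaces. Below is a minimal NumPy sketch of one plausible reading of that pipeline; the function names, the anchor-selection heuristic (comparing each word's distance profile against a random set of reference words), and the orthogonal-Procrustes mapping are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def select_anchors(emb_a, emb_b, k=1000, n_ref=50, seed=0):
    """Pick words whose relative distances to a set of reference words
    agree most across the two topic sub-spaces (the paper's hypothesis:
    such distance-preserving words make robust semantic anchors).
    emb_a, emb_b: (vocab_size, dim) matrices over a shared vocabulary.
    NOTE: this selection heuristic is an assumption, not the paper's."""
    rng = np.random.default_rng(seed)
    refs = rng.choice(emb_a.shape[0], size=n_ref, replace=False)
    # Cosine-normalise both embedding matrices.
    A = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    B = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    # Each word's similarity profile against the reference words.
    prof_a = A @ A[refs].T          # (vocab_size, n_ref)
    prof_b = B @ B[refs].T
    # Smaller profile difference => distances better preserved.
    agreement = -np.linalg.norm(prof_a - prof_b, axis=1)
    return np.argsort(agreement)[-k:]

def map_subspaces(emb_a, emb_b, anchors):
    """Learn an orthogonal map from sub-space A to sub-space B over the
    anchor words (orthogonal Procrustes, a standard choice for
    unsupervised embedding alignment; assumed here, not confirmed)."""
    X, Y = emb_a[anchors], emb_b[anchors]
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt   # apply as: emb_a @ W

# Usage sketch: align topic sub-space A into topic sub-space B.
# anchors = select_anchors(emb_a, emb_b)
# W = map_subspaces(emb_a, emb_b, anchors)
# emb_a_aligned = emb_a @ W
```

An orthogonal map is a natural fit here because it preserves pairwise distances within each sub-space, which is exactly the property the semantic-anchor hypothesis relies on.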

URL

http://arxiv.org/abs/1904.05674

PDF

http://arxiv.org/pdf/1904.05674

