
Semantic Hilbert Space for Text Representation Learning

2019-02-26
Benyou Wang, Qiuchi Li, Massimo Melucci, Dawei Song

Abstract

Capturing the meaning of sentences has long been a challenging task. Current models tend to apply linear combinations of word features to conduct semantic composition for larger-granularity units, e.g., phrases, sentences, and documents. However, semantic linearity does not always hold in human language. For instance, the meaning of the phrase 'ivory tower' cannot be deduced by linearly combining the meanings of 'ivory' and 'tower'. To address this issue, we propose a new framework that models different levels of semantic units (e.g., sememe, word, sentence, and semantic abstraction) on a single Semantic Hilbert Space, which naturally admits a non-linear semantic composition by means of a complex-valued word representation. An end-to-end neural network (https://github.com/wabyking/qnn) is proposed to implement the framework in the text classification task, and evaluation results on six benchmark text classification datasets demonstrate the effectiveness, robustness and self-explanation power of the proposed model. Furthermore, intuitive case studies are conducted to help end users understand how the framework works.
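The key idea above is that complex-valued word vectors compose non-linearly: when two word states are superposed, the phases introduce an interference term, so the resulting distribution over coordinates is not a linear average of the individual word distributions. The toy sketch below (not the authors' implementation; the embedding construction and normalization are illustrative assumptions) demonstrates this effect with random unit-norm complex embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

def complex_embedding(rng, dim):
    # Illustrative: a word state as amplitude * e^{i * phase} per coordinate.
    amp = rng.random(dim)
    amp /= np.linalg.norm(amp)              # unit amplitude vector
    phase = rng.uniform(0, 2 * np.pi, dim)  # per-coordinate phase
    return amp * np.exp(1j * phase)

ivory = complex_embedding(rng, dim)
tower = complex_embedding(rng, dim)

# Compose the phrase as a superposition of the two word states, renormalized.
phrase = ivory + tower
phrase /= np.linalg.norm(phrase)

# Measurement probabilities: squared moduli of the coordinates.
p_phrase = np.abs(phrase) ** 2
p_linear = 0.5 * (np.abs(ivory) ** 2 + np.abs(tower) ** 2)

# The cross term 2*Re(ivory * conj(tower)) makes p_phrase differ from the
# linear mixture p_linear whenever the phases do not align.
interference = p_phrase - p_linear
```

Because the phases are random, the interference term is almost surely non-zero, so the composed "phrase" carries information beyond a linear combination of its word features.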

URL

http://arxiv.org/abs/1902.09802

PDF

http://arxiv.org/pdf/1902.09802

