
Compositionality for Recursive Neural Networks

2019-01-30
Martha Lewis

Abstract

Modelling compositionality has been a longstanding area of research in the field of vector space semantics. The categorical approach to compositionality maps grammar onto vector spaces in a principled way, but comes under fire for requiring the formation of very high-dimensional matrices and tensors, and therefore being computationally infeasible. In this paper I show how a linear simplification of recursive neural tensor network models can be mapped directly onto the categorical approach, giving a way of computing the required matrices and tensors. This mapping suggests a number of lines of research for both categorical compositional vector space models of meaning and for recursive neural network models of compositionality.
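The abstract does not spell out the composition equations, but the following minimal sketch illustrates the kind of model it refers to. It assumes the standard recursive neural tensor network composition step (a parent vector computed as a nonlinearity applied to a bilinear plus a linear function of the concatenated child vectors) and shows the effect of dropping the nonlinearity, so that the parent becomes a (multi)linear function of its children of the sort that can be read as the matrices and tensors of the categorical model. All names, dimensions, and parameter shapes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rntn_compose(a, b, V, W, nonlinearity=np.tanh):
    """One recursive neural tensor network composition step (assumed form).

    a, b : child vectors of dimension d
    V    : tensor of shape (d, 2d, 2d) giving the bilinear term
    W    : matrix of shape (d, 2d) giving the linear term
    """
    c = np.concatenate([a, b])                   # stacked children, shape (2d,)
    bilinear = np.einsum('ijk,j,k->i', V, c, c)  # c^T V[i] c for each output dimension i
    return nonlinearity(bilinear + W @ c)

def linear_compose(a, b, V, W):
    """Linear simplification: the same composition with the nonlinearity
    removed, so the parent is a (multi)linear map of the children."""
    c = np.concatenate([a, b])
    return np.einsum('ijk,j,k->i', V, c, c) + W @ c

# Toy usage with random parameters (illustrative only).
d = 4
rng = np.random.default_rng(0)
a, b = rng.normal(size=d), rng.normal(size=d)
V = rng.normal(size=(d, 2 * d, 2 * d))
W = rng.normal(size=(d, 2 * d))
print(rntn_compose(a, b, V, W))
print(linear_compose(a, b, V, W))
```

In the linear variant, the tensor V and matrix W themselves play the role of the grammar-derived maps in the categorical approach, which is the sense in which training such a network offers a way of computing the required matrices and tensors.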

URL

http://arxiv.org/abs/1901.10723

PDF

http://arxiv.org/pdf/1901.10723

