
Improving Tree-LSTM with Tree Attention

2019-01-01
Mahtab Ahmed, Muhammad Rifayat Samee, Robert E. Mercer

Abstract

In Natural Language Processing (NLP), we often need to extract information from a tree topology. Sentence structure can be represented via a dependency tree or a constituency tree. For this reason, a variant of the LSTM, named the Tree-LSTM, was proposed to work on tree topologies. In this paper, we design a generalized attention framework for both dependency and constituency trees by encoding variants of decomposable attention inside a Tree-LSTM cell. We evaluate our models on a semantic relatedness task and achieve notable results compared to Tree-LSTM-based methods with no attention, as well as to other neural and non-neural methods, and good results compared to Tree-LSTM-based methods with attention.
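To make the idea concrete, the following is a minimal sketch of a Child-Sum Tree-LSTM node update in which the plain sum over child hidden states is replaced by an attention-weighted combination. The attention scoring function (a learned vector `v` dotted with each child state) and all parameter names here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class AttentiveChildSumTreeLSTM:
    """Sketch of a Child-Sum Tree-LSTM cell whose child hidden states are
    combined with attention weights rather than summed directly.
    Hypothetical parameterization for illustration only."""

    def __init__(self, in_dim, mem_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        # Gate parameters: input (i), output (o), cell update (u), forget (f).
        self.W = {g: rng.normal(0, s, (mem_dim, in_dim)) for g in "iouf"}
        self.U = {g: rng.normal(0, s, (mem_dim, mem_dim)) for g in "iouf"}
        self.b = {g: np.zeros(mem_dim) for g in "iouf"}
        # Assumed attention scoring vector (one scalar score per child).
        self.v = rng.normal(0, s, mem_dim)

    def node_forward(self, x, child_h, child_c):
        # Attention over children replaces the plain child-state sum.
        scores = child_h @ self.v            # (num_children,)
        alpha = softmax(scores)              # attention weights
        h_tilde = alpha @ child_h            # attended child summary
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_tilde + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_tilde + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_tilde + self.b["u"])
        # One forget gate per child, as in the standard Child-Sum Tree-LSTM.
        f = sigmoid(child_h @ self.U["f"].T + self.W["f"] @ x + self.b["f"])
        c = i * u + (f * child_c).sum(axis=0)
        h = o * np.tanh(c)
        return h, c
```

Applied bottom-up over a dependency or constituency tree, each node consumes its children's `(h, c)` pairs this way; the attention weights let the cell focus on the most relevant child subtrees instead of weighting them uniformly.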

URL

https://arxiv.org/abs/1901.00066

PDF

https://arxiv.org/pdf/1901.00066
