
Simplified Gating in Long Short-term Memory Recurrent Neural Networks

2017-01-12
Yuzhen Lu, Fathi M. Salem

Abstract

The standard LSTM recurrent neural network, while very powerful in long-range dependency sequence applications, has a highly complex structure and a relatively large number of (adaptive) parameters. In this work, we present an empirical comparison between the standard LSTM recurrent neural network architecture and three new parameter-reduced variants obtained by eliminating combinations of the input signal, bias, and hidden unit signals from the individual gating signals. Experiments on two sequence datasets show that the three new variants, called simply LSTM1, LSTM2, and LSTM3, can achieve performance comparable to the standard LSTM model with fewer (adaptive) parameters.
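To make the parameter reduction concrete, here is a minimal sketch of how dropping terms from a gating signal shrinks the gate parameter count. A standard LSTM gate has the form σ(Wx + Uh + b); the abstract says the variants eliminate combinations of the input signal, bias, and hidden signal from the gates. The specific combination assigned to each variant below (LSTM1 drops the input, LSTM2 drops the input and bias, LSTM3 keeps only the bias) is an illustrative assumption, not a quote from the paper, and the helper names are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Standard LSTM gating signal: depends on input x, hidden state h, and bias b.
def gate_standard(W, U, b, x, h):
    return sigmoid(W @ x + U @ h + b)

# Assumed reduced variants (illustrative split, see lead-in):
def gate_lstm1(U, b, h):
    return sigmoid(U @ h + b)    # input signal x removed from the gate

def gate_lstm2(U, h):
    return sigmoid(U @ h)        # input signal and bias removed

def gate_lstm3(b):
    return sigmoid(b)            # only the (adaptive) bias remains

# Parameter count for the three gates (input, forget, output) of one layer
# with n_in inputs and n_h hidden units, under the assumed split above.
def gate_params(n_in, n_h, variant="standard"):
    per_gate = {
        "standard": n_h * n_in + n_h * n_h + n_h,  # W, U, b
        "lstm1":    n_h * n_h + n_h,               # U, b
        "lstm2":    n_h * n_h,                     # U only
        "lstm3":    n_h,                           # b only
    }[variant]
    return 3 * per_gate
```

For example, with 10 inputs and 20 hidden units, the three gates of a standard LSTM layer use 1860 parameters, while the reduced forms above use 1260, 1200, and 60 respectively, which is the kind of saving the abstract refers to.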

URL

https://arxiv.org/abs/1701.03441

PDF

https://arxiv.org/pdf/1701.03441

