
Simplified Long Short-term Memory Recurrent Neural Networks: part II

2017-07-14
Atra Akandeh, Fathi M. Salem

Abstract

This is part II of a three-part work. Here, we present a second set of five inter-related variants of simplified Long Short-term Memory (LSTM) recurrent neural networks, obtained by further reducing the number of adaptive parameters. Two of these models were introduced in part I of this work. We evaluate and verify our model variants on the benchmark MNIST dataset and find that these models are comparable to the base LSTM model while using progressively fewer parameters. Moreover, we observe that, when the ReLU activation is used, the test accuracy of the standard LSTM drops after a number of epochs as the learning parameter becomes larger, whereas all of the new model variants sustain their performance.
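To make the parameter-reduction idea concrete, the sketch below contrasts one step of a standard LSTM cell with a hypothetical slimmed variant whose gates drop their input weight matrices and are driven by the previous hidden state and a bias alone. This illustrates the general direction of the paper (fewer adaptive parameters per gate); the function and parameter names here are illustrative assumptions, not the authors' exact five variants, which are defined in the paper itself.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, p):
    """One step of a standard LSTM cell; parameters in dict p."""
    i = sigmoid(p["Wi"] @ x + p["Ui"] @ h + p["bi"])  # input gate
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h + p["bf"])  # forget gate
    o = sigmoid(p["Wo"] @ x + p["Uo"] @ h + p["bo"])  # output gate
    g = np.tanh(p["Wg"] @ x + p["Ug"] @ h + p["bg"])  # candidate cell
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def slim_lstm_step(x, h, c, p):
    """Hypothetical slimmed variant: gates omit the W* input matrices,
    cutting the adaptive parameters per gate; the candidate stays full."""
    i = sigmoid(p["Ui"] @ h + p["bi"])
    f = sigmoid(p["Uf"] @ h + p["bf"])
    o = sigmoid(p["Uo"] @ h + p["bo"])
    g = np.tanh(p["Wg"] @ x + p["Ug"] @ h + p["bg"])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Minimal usage of the slimmed step with random parameters.
rng = np.random.default_rng(0)
n_in, n_h = 4, 3
p = {k: 0.1 * rng.standard_normal((n_h, n_in if k == "Wg" else n_h))
     for k in ("Wg", "Ui", "Uf", "Uo", "Ug")}
p.update({b: np.zeros(n_h) for b in ("bi", "bf", "bo", "bg")})
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = slim_lstm_step(rng.standard_normal(n_in), h, c, p)
```

Dropping the input matrices from the three gates removes roughly three input-by-hidden weight matrices per cell, which is why such variants can use progressively fewer parameters while keeping the cell-state update intact.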

URL

https://arxiv.org/abs/1707.04623

PDF

https://arxiv.org/pdf/1707.04623

