
Memory Visualization for Gated Recurrent Neural Networks in Speech Recognition

2017-02-27
Zhiyuan Tang, Ying Shi, Dong Wang, Yang Feng, Shiyue Zhang

Abstract

Recurrent neural networks (RNNs) have shown clear superiority in sequence modeling, particularly those with gated units, such as long short-term memory (LSTM) and gated recurrent unit (GRU). However, the dynamic properties behind the remarkable performance remain unclear in many applications, e.g., automatic speech recognition (ASR). This paper employs visualization techniques to study the behavior of LSTM and GRU when performing speech recognition tasks. Our experiments show some interesting patterns in the gated memory, and some of them have inspired simple yet effective modifications to the network structure. We report two such modifications: (1) lazy cell update in LSTM, and (2) shortcut connections for residual learning. Both modifications lead to more comprehensible and powerful networks.
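
The abstract names the two modifications but does not spell out their equations, so the NumPy sketch below is only one plausible reading, not the paper's definition. It shows a standard LSTM step extended with (a) a hypothetical "lazy" cell update, interpreted here as keeping the old cell state wherever the input gate is weaker than a threshold `tau`, and (b) a residual shortcut that adds the layer's input to its output. The threshold, the form of the lazy rule, and the requirement that input and hidden sizes match for the shortcut are all assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b, tau=0.5):
    """One LSTM step with a hypothetical 'lazy' cell update.

    W, U, b hold the stacked input/forget/output/candidate parameters.
    The lazy rule (an assumption, not the paper's exact definition)
    refreshes the cell state only where the input gate exceeds `tau`.
    """
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)

    c_new = f * c + i * g                  # standard LSTM cell update
    lazy_mask = (i > tau).astype(c.dtype)  # update only where input gate is active
    c = lazy_mask * c_new + (1 - lazy_mask) * c

    h_new = o * np.tanh(c)
    return h_new, c

def residual_lstm_layer(xs, W, U, b, n_hidden):
    """Run a sequence through the LSTM and add a shortcut connection,
    so the layer learns a residual on top of its input (this simple
    form requires input and hidden sizes to match)."""
    h = np.zeros(n_hidden)
    c = np.zeros(n_hidden)
    outs = []
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        outs.append(x + h)  # shortcut: layer output = input + LSTM output
    return np.stack(outs)

# Toy usage with matching input/hidden sizes for the shortcut.
rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.1, size=(4 * n, n))
U = rng.normal(scale=0.1, size=(4 * n, n))
b = np.zeros(4 * n)
xs = rng.normal(size=(5, n))
print(residual_lstm_layer(xs, W, U, b, n).shape)  # (5, 8)
```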

URL

https://arxiv.org/abs/1609.08789

PDF

https://arxiv.org/pdf/1609.08789
