
On Compression of Unsupervised Neural Nets by Pruning Weak Connections

2019-01-21
Zhiwen Zuo, Lei Zhao, Liwen Zuo, Feng Jiang, Wei Xing, Dongming Lu

Abstract

Unsupervised neural nets such as Restricted Boltzmann Machines (RBMs) and Deep Belief Networks (DBNs) are powerful tools for automatic feature extraction, unsupervised weight initialization, and density estimation. In this paper, we demonstrate that the parameters of these neural nets can be dramatically reduced without affecting their performance. We describe a method to reduce the parameters required by the RBM, which is the basic building block for deep architectures. Further, we propose an unsupervised sparse deep architecture selection algorithm to form sparse deep neural networks. Experimental results show that there is virtually no loss in either generative or discriminative performance.
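
As a rough illustration of the "pruning weak connections" idea described in the abstract, the sketch below applies simple magnitude-based pruning to an RBM weight matrix. The dimensions, sparsity level, and thresholding rule are assumptions made for the example; they are not the authors' exact procedure.

```python
# Minimal sketch of magnitude-based pruning of an RBM weight matrix.
# Illustrative only: the sparsity level and threshold rule are assumptions,
# not the algorithm from the paper.
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 784, 500                            # e.g. an MNIST-sized RBM
W = rng.normal(scale=0.01, size=(n_visible, n_hidden))    # trained weights would go here

sparsity = 0.9                                            # fraction of connections to remove (assumed)
threshold = np.quantile(np.abs(W), sparsity)

mask = np.abs(W) >= threshold                             # keep only the strongest connections
W_pruned = W * mask

print(f"kept {mask.mean():.1%} of connections")
# In practice, the pruned RBM would then be retrained with the mask held fixed,
# so that removed connections remain at zero.
```
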

URL

http://arxiv.org/abs/1901.07066

PDF

http://arxiv.org/pdf/1901.07066

