papers AI Learner

Neural Rejuvenation: Improving Deep Network Training by Enhancing Computational Resource Utilization

2018-12-02
Siyuan Qiao, Zhe Lin, Jianming Zhang, Alan Yuille

Abstract

In this paper, we study the problem of improving the computational resource utilization of neural networks. Deep neural networks are usually over-parameterized for their tasks in order to achieve good performance, and thus are likely to have underutilized computational resources. This observation motivates many research directions, e.g., network pruning and architecture search. As models with higher computational costs (e.g., more parameters or more computations) usually perform better, we study how to improve the resource utilization of neural networks so that their potential can be further realized. To this end, we propose a novel optimization method named Neural Rejuvenation. As its name suggests, our method detects dead neurons and computes resource utilization in real time, rejuvenates dead neurons by resource reallocation and reinitialization, and trains them with new training schemes. By simply replacing standard optimizers with Neural Rejuvenation, we improve the performance of neural networks by a large margin while using similar training effort and maintaining their original resource usage.
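The abstract does not spell out how dead neurons are detected or reinitialized. Below is a minimal PyTorch sketch of the general idea, under the assumption that a channel counts as "dead" when the absolute value of its BatchNorm scale falls below a small threshold; the threshold, the detection criterion, and the reinitialization scheme here are illustrative assumptions, not the paper's exact procedure (which also reallocates the freed capacity across layers).

```python
import torch
import torch.nn as nn


@torch.no_grad()
def rejuvenation_step(model: nn.Module, dead_threshold: float = 1e-2) -> float:
    """Detect 'dead' channels and reinitialize them in place.

    Assumption: a channel is treated as dead when |gamma| of its BatchNorm
    scale is below `dead_threshold`. Returns the fraction of live channels
    as a rough utilization measure.
    """
    total, dead = 0, 0
    for module in model.modules():
        if isinstance(module, nn.BatchNorm2d):
            gamma = module.weight.abs()
            mask = gamma < dead_threshold          # dead channels
            total += gamma.numel()
            dead += int(mask.sum())
            # Reinitialize the affine parameters and running statistics of
            # dead channels so they can be trained again ("rejuvenated").
            module.weight[mask] = 1.0
            module.bias[mask] = 0.0
            module.running_mean[mask] = 0.0
            module.running_var[mask] = 1.0
    return 1.0 - dead / max(total, 1)


# Hypothetical usage: call periodically during training and trigger
# rejuvenation (or resource reallocation) when utilization drops.
# model = torchvision.models.resnet18()
# util = rejuvenation_step(model)
# print(f"live-channel utilization: {util:.2%}")
```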

URL

https://arxiv.org/abs/1812.00481

PDF

https://arxiv.org/pdf/1812.00481

