papers AI Learner

Cramnet: Layer-wise Deep Neural Network Compression with Knowledge Transfer from a Teacher Network

2019-04-11
Jon Hoffman

Abstract

Neural networks accomplish amazing things, but they suffer from computational and memory bottlenecks that restrict their usage. Nowhere is this clearer than in the mobile space, where specialized hardware is being created just to satisfy the demand for neural networks. Previous studies have shown that neural networks have vastly more connections than they actually need to do their work. This thesis develops a method that can compress networks to less than 10% of their original memory footprint and less than 25% of their computational cost, without loss of accuracy, and without creating sparse networks that require special code to run.
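The abstract does not detail the layer-wise procedure, but the teacher-network knowledge transfer it builds on is commonly implemented as a distillation loss: the compressed student is trained to match the temperature-softened output distribution of the full teacher. The sketch below shows that standard loss in plain NumPy; all names and the temperature value are illustrative, not taken from the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between the softened teacher and student distributions.

    Minimizing this pushes the smaller student's outputs toward the
    teacher's, which is the usual mechanism for transferring knowledge
    from a large teacher network into a compressed student.
    (Generic distillation sketch, not the paper's exact objective.)
    """
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student's softened predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Toy check: a student that matches the teacher exactly incurs zero loss,
# while a mismatched student incurs a positive loss.
teacher = np.array([2.0, 1.0, 0.1])
print(distillation_loss(teacher.copy(), teacher))              # 0.0
print(distillation_loss(np.array([0.1, 1.0, 2.0]), teacher))   # > 0
```

Because the student is a smaller dense network rather than a pruned sparse one, the result runs with ordinary dense-matrix code, matching the abstract's "no special code" claim.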

URL

http://arxiv.org/abs/1904.05982

PDF

http://arxiv.org/pdf/1904.05982

