
Dream Distillation: A Data-Independent Model Compression Framework

2019-05-17
Kartikeya Bhardwaj, Naveen Suda, Radu Marculescu

Abstract

Model compression is eminently suited for deploying deep learning on IoT devices. However, existing model compression techniques rely on access to the original dataset or some alternate dataset. In this paper, we address the model compression problem when no real data is available, e.g., when data is private. To this end, we propose Dream Distillation, a data-independent model compression framework. Our experiments show that Dream Distillation can achieve 88.5% accuracy on the CIFAR-10 test set without actually training on the original data!
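At its core, the framework distills a large teacher network into a smaller student using generated ("dream") inputs in place of the private training set. The paper's actual image-generation procedure is not described in this abstract; the sketch below only illustrates the distillation objective itself (temperature-softened soft-target matching in the style of Hinton et al.), with numpy standing in for a deep learning framework. All names and shapes here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2.

    The student is trained to match the teacher's soft predictions;
    no ground-truth labels (and hence no real data) are required.
    """
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())

# Hypothetical setup: synthetic "dream" inputs are fed to the teacher,
# whose logits then supervise the student. Here we fake both logit sets.
rng = np.random.default_rng(0)
teacher_logits = rng.normal(size=(8, 10))   # teacher outputs on 8 synthetic inputs
student_logits = teacher_logits.copy()      # a student that matches perfectly
print(distillation_loss(student_logits, teacher_logits))  # ~0.0 when matched
```

A mismatched student yields a strictly positive loss, which gradient descent would drive toward zero during distillation.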

URL

http://arxiv.org/abs/1905.07072

PDF

http://arxiv.org/pdf/1905.07072
