
Aggregated Learning: A Deep Learning Framework Based on Information-Bottleneck Vector Quantization

2019-02-12
Hongyu Guo, Yongyi Mao, Ali Al-Bashabsheh, Richong Zhang

Abstract

Based on the notion of the information bottleneck (IB), we formulate a quantization problem called “IB quantization”. We show that IB quantization is equivalent to learning based on the IB principle. Under this equivalence, standard neural network models can be viewed as scalar (single-sample) IB quantizers. It is known from conventional rate-distortion theory that scalar quantizers are inferior to vector (multi-sample) quantizers. This deficiency inspires us to develop a novel learning framework, AgrLearn, which corresponds to vector IB quantizers for learning with neural networks. Unlike standard networks, AgrLearn simultaneously optimizes against multiple data samples. We experimentally verify that AgrLearn can yield significant improvements when applied to several current deep learning architectures for image recognition and text classification. We also empirically show that AgrLearn can reduce the number of training samples needed for ResNet training by up to 80%.
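The core idea of "simultaneously optimizing against multiple data samples" can be illustrated with a minimal sketch. The grouping scheme below (concatenating every n consecutive feature vectors into one aggregated input, with a single linear map standing in for the deep network that the paper actually uses) is an illustrative assumption, not the paper's exact architecture:

```python
import numpy as np

def aggregate(batch, n):
    """Group every n consecutive samples into one aggregated input by
    concatenating their feature vectors. The grouping scheme here is an
    illustrative assumption; the paper's aggregation may differ."""
    m, d = batch.shape
    assert m % n == 0, "batch size must be divisible by the fold n"
    # row-major reshape: row i holds samples i*n .. i*n + n - 1 concatenated
    return batch.reshape(m // n, n * d)

def agrlearn_forward(W, b, agg_inputs, n, num_classes):
    """Map each aggregated input to n sets of class logits, so the model
    predicts labels for all n constituent samples jointly. A single
    linear layer stands in for a deep network."""
    logits = agg_inputs @ W + b               # shape (m/n, n*num_classes)
    return logits.reshape(-1, n, num_classes) # one logit row per sample

# toy example: 4 samples of dimension 3, aggregation fold n = 2, 5 classes
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
n, k = 2, 5
agg = aggregate(x, n)                         # shape (2, 6)
W = rng.normal(size=(n * 3, n * k))
b = np.zeros(n * k)
out = agrlearn_forward(W, b, agg, n, k)       # shape (2, 2, 5)
```

Training would then minimize the summed per-sample classification loss over the n predictions of each aggregated input, so the model is always fit jointly to groups of samples rather than to one sample at a time.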

URL

http://arxiv.org/abs/1807.10251

PDF

http://arxiv.org/pdf/1807.10251

