
Channel Locality Block: A Variant of Squeeze-and-Excitation

2019-01-06
Huayu Li

Abstract

The attention mechanism is an active topic in the deep learning field. Using a channel attention model is an effective way to improve the performance of convolutional neural networks. The Squeeze-and-Excitation block exploits channel dependence, selectively emphasizing the important channels and suppressing the relatively useless ones. In this paper, we propose a variant of the SE block based on channel locality. Instead of using fully connected layers to explore the global channel dependence, we adopt convolutional layers to learn the correlation between nearby channels. We term this new module the Channel Locality (C-Local) block. We evaluate the SE block and the C-Local block by applying them to different CNN architectures on the CIFAR-10 dataset. We observe that our C-Local block achieves higher accuracy than the SE block.
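The abstract does not give implementation details, but the core idea it describes, replacing the SE block's fully connected layers with convolutional layers that act on nearby channels, can be sketched as follows. This is a minimal PyTorch sketch, not the paper's implementation: it assumes global average pooling is kept as the squeeze step and that the local channel interaction is modeled by a 1D convolution with a hypothetical kernel size; the class names `SEBlock` and `CLocalBlock` and the kernel size are illustrative.

```python
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Standard Squeeze-and-Excitation block: global average pooling
    followed by two fully connected layers with a reduction ratio."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w


class CLocalBlock(nn.Module):
    """Channel Locality sketch (assumption): the fully connected layers are
    replaced by a 1D convolution over the channel dimension, so each
    channel's weight is computed from its k nearby channels only."""

    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        b, c, _, _ = x.shape
        # (b, c, 1, 1) -> (b, 1, c): treat the channel descriptors as a 1D sequence
        s = self.pool(x).view(b, 1, c)
        w = self.sigmoid(self.conv(s)).view(b, c, 1, 1)
        return x * w


if __name__ == "__main__":
    x = torch.randn(8, 64, 32, 32)
    print(SEBlock(64)(x).shape, CLocalBlock(64)(x).shape)  # both (8, 64, 32, 32)
```

Both blocks return a tensor of the same shape as their input and can be dropped in after a convolutional stage. Under this reading, the local variant would also use far fewer parameters (on the order of the kernel size, rather than the two C×C/r weight matrices of the SE block), though the paper's exact configuration may differ.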

URL

http://arxiv.org/abs/1901.01493

PDF

http://arxiv.org/pdf/1901.01493

