Abstract
Flow-based generative models have recently become one of the most efficient approaches to modeling data generation, as they are constructed from a sequence of invertible and tractable transformations. Glow first introduced a simple type of generative flow using an invertible 1x1 convolution. However, the 1x1 convolution offers limited flexibility compared to standard convolutions. In this paper, we propose a novel invertible nxn convolution approach that overcomes the limitations of the invertible 1x1 convolution. In addition, our proposed network is not only tractable and invertible but also uses fewer parameters than standard convolutions. Experiments on the CIFAR-10, ImageNet, and Celeb-HQ datasets show that our invertible nxn convolution significantly improves the performance of generative models.
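For context, the following is a minimal NumPy sketch of the Glow-style invertible 1x1 convolution that the paper builds on (not the paper's proposed nxn method). The class and variable names are illustrative assumptions; it shows why the layer is tractable: the forward pass applies a channel-mixing matrix W at every spatial position, the log-determinant of the Jacobian is H*W*log|det W|, and the inverse simply applies W^{-1}.

```python
import numpy as np

class Invertible1x1Conv:
    """Glow-style invertible 1x1 convolution over channels (illustrative sketch).

    Applies the same c x c weight matrix W to the channel vector at every
    spatial position; the log-determinant of the Jacobian is H*W*log|det W|.
    """

    def __init__(self, num_channels, seed=0):
        rng = np.random.default_rng(seed)
        # Random orthogonal initialization keeps W well-conditioned and invertible.
        q, _ = np.linalg.qr(rng.standard_normal((num_channels, num_channels)))
        self.W = q

    def forward(self, x):
        # x: (N, C, H, W) -> z = W applied along the channel axis
        n, c, h, w = x.shape
        z = np.einsum('ij,njhw->nihw', self.W, x)
        logdet = h * w * np.log(np.abs(np.linalg.det(self.W)))
        return z, logdet

    def inverse(self, z):
        # Exact inversion: apply W^{-1} along the channel axis
        W_inv = np.linalg.inv(self.W)
        return np.einsum('ij,njhw->nihw', W_inv, z)


# Round-trip check: inverse(forward(x)) recovers x
layer = Invertible1x1Conv(num_channels=3)
x = np.random.default_rng(1).standard_normal((2, 3, 8, 8))
z, logdet = layer.forward(x)
assert np.allclose(x, layer.inverse(z), atol=1e-8)
```

Because the 1x1 convolution only mixes channels at each pixel independently, it cannot capture spatial correlations within its receptive field; this is the limited flexibility that the paper's invertible nxn convolution aims to address.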
URL
http://arxiv.org/abs/1905.10170