
Global Convergence to the Equilibrium of GANs using Variational Inequalities

2018-09-11
Ian Gemp, Sridhar Mahadevan

Abstract

In optimization, the negative gradient of a function denotes the direction of steepest descent. Furthermore, traveling in any direction orthogonal to the gradient maintains the value of the function. In this work, we show that these orthogonal directions that are ignored by gradient descent can be critical in equilibrium problems. Equilibrium problems have drawn heightened attention in machine learning due to the emergence of the Generative Adversarial Network (GAN). We use the framework of Variational Inequalities to analyze popular training algorithms for a fundamental GAN variant: the Wasserstein Linear-Quadratic GAN. We show that the steepest descent direction causes divergence from the equilibrium, and guaranteed convergence to the equilibrium is achieved through following a particular orthogonal direction. We call this successful technique Crossing-the-Curl, named for its mathematical derivation as well as its intuition: identify the game’s axis of rotation and move “across” space in the direction towards smaller “curling”.
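As a minimal numerical sketch of the phenomenon the abstract describes (an illustration of the idea, not the paper's Crossing-the-Curl algorithm itself): for the bilinear game min_x max_y f(x, y) = xy, whose unique equilibrium is (0, 0), the simultaneous-gradient field is a pure rotation. Gradient descent therefore spirals away from the equilibrium, while stepping along a direction orthogonal to the gradient contracts straight toward it.

```python
import numpy as np

def grad_field(z):
    # Simultaneous-gradient field for the bilinear game min_x max_y x*y:
    # V(x, y) = (df/dx, -df/dy) = (y, -x), a pure rotation about the
    # equilibrium (0, 0), so -V is everywhere tangent to circles around it.
    x, y = z
    return np.array([y, -x])

def orthogonal(v):
    # 90-degree counter-clockwise rotation: a direction orthogonal to the
    # gradient field. For this game it points radially, straight at (0, 0).
    return np.array([-v[1], v[0]])

eta = 0.1
z_gd = np.array([1.0, 1.0])    # simultaneous gradient descent/ascent
z_orth = np.array([1.0, 1.0])  # follow the orthogonal direction instead

for _ in range(100):
    z_gd = z_gd - eta * grad_field(z_gd)                    # |z| grows each step
    z_orth = z_orth - eta * orthogonal(grad_field(z_orth))  # |z| shrinks each step

gd_norm = np.linalg.norm(z_gd)      # grew past its initial value: divergence
orth_norm = np.linalg.norm(z_orth)  # near zero: convergence to the equilibrium
print(gd_norm, orth_norm)
```

Here each gradient step multiplies the squared distance to the equilibrium by (1 + eta^2), while each orthogonal step multiplies the distance by (1 - eta), matching the abstract's claim that the steepest-descent direction diverges and a particular orthogonal direction converges.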

URL

https://arxiv.org/abs/1808.01531

PDF

https://arxiv.org/pdf/1808.01531

