
Relaxed Wasserstein with Applications to GANs

2019-05-04
Xin Guo, Johnny Hong, Tianyi Lin, Nan Yang

Abstract

Comparing probability distributions is an integral part of many modern data-driven applications, such as generative adversarial networks (GANs) and distributionally robust optimization (DRO). We propose a novel class of statistical divergences called Relaxed Wasserstein (RW) divergences, which generalize the Wasserstein divergence and are parametrized by the class of strictly convex, differentiable functions. We establish several probabilistic properties of RW divergences, many of which are crucial to the success of the Wasserstein divergence. In addition, we derive theoretical results showing that the underlying convex function in RW plays an important role in variance stabilization, shedding light on the choice of an appropriate convex function. We develop a version of GANs based on RW divergence and demonstrate via empirical experiments that RW-based GANs (RWGANs) outperform existing approaches in image generation problems. In particular, in our experiments RWGANs generate meaningful images faster than other GANs. We also illustrate the use of RW divergence in constructing ambiguity sets for DRO problems, and in the robust portfolio problem under the mean-variance framework.
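The abstract does not reproduce the formal definition. As a rough sketch, assuming the Bregman-divergence construction that the parametrization by strictly convex, differentiable functions suggests (not quoted from the paper itself), the RW divergence between distributions $\mu$ and $\nu$ would take the form:

```latex
% Sketch of the RW divergence, assuming a Bregman ground cost in place of
% the usual quadratic cost of the Wasserstein-2 distance (an assumption
% inferred from the abstract, not a quote from the paper).
W_\phi(\mu, \nu) \;=\; \inf_{\pi \in \Pi(\mu, \nu)}
  \mathbb{E}_{(X, Y) \sim \pi}\bigl[ D_\phi(X, Y) \bigr],
\qquad
D_\phi(x, y) \;=\; \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle,
```

where $\Pi(\mu, \nu)$ is the set of couplings of $\mu$ and $\nu$, and $D_\phi$ is the Bregman divergence of a strictly convex, differentiable $\phi$. With $\phi(x) = \lVert x \rVert^2$, $D_\phi$ reduces to the squared Euclidean distance, recovering the squared Wasserstein-2 distance.

For concreteness, here is a minimal numerical sketch of such a Bregman ground cost; the function names are hypothetical illustrations, not the authors' code:

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>.

    `phi` must be strictly convex and differentiable; `grad_phi` is its gradient.
    """
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# With phi(x) = ||x||^2, the Bregman divergence equals the squared Euclidean
# distance, so the corresponding RW divergence recovers squared Wasserstein-2.
phi = lambda x: np.dot(x, x)
grad_phi = lambda x: 2.0 * x

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
assert np.isclose(bregman_divergence(phi, grad_phi, x, y),
                  np.sum((x - y) ** 2))
```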

URL

https://arxiv.org/abs/1705.07164

PDF

https://arxiv.org/pdf/1705.07164

