
Implicit Generation and Generalization in Energy-Based Models

2019-03-20
Yilun Du, Igor Mordatch

Abstract

Energy-based models (EBMs) are appealing due to their generality and simplicity in likelihood modeling, but they have traditionally been difficult to train. We present techniques to scale MCMC-based EBM training on continuous neural networks, and show its success on the high-dimensional data domains of ImageNet32x32, ImageNet128x128, CIFAR-10, and robotic hand trajectories, achieving significantly better samples than other likelihood models and on par with contemporary GAN approaches, while covering all modes of the data. We highlight unique capabilities of implicit generation, such as energy compositionality and corrupt-image reconstruction and completion. Finally, we show that EBMs generalize well: they achieve state-of-the-art out-of-distribution classification, exhibit adversarially robust classification, produce coherent long-term predicted trajectory roll-outs, and generate zero-shot compositions of models.
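The "implicit generation" in the title refers to drawing samples by running MCMC on the model's energy function rather than through an explicit generator network. As a minimal sketch of the idea (not the paper's implementation), the snippet below runs Langevin-dynamics sampling against a toy quadratic energy standing in for a learned neural energy function; the energy function, step size, and noise scale here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def energy(x):
    # Toy quadratic energy with its minimum at x = 2.0; in the paper this
    # role is played by a learned neural network E_theta(x).
    return 0.5 * np.sum((x - 2.0) ** 2, axis=-1)

def grad_energy(x):
    # Analytic gradient of the toy energy; for a neural energy function,
    # automatic differentiation would supply this.
    return x - 2.0

def langevin_sample(x0, n_steps=200, step_size=0.1, noise_scale=0.01, rng=None):
    """Langevin MCMC: x <- x - (step/2) * dE/dx + Gaussian noise."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = x0.copy()
    for _ in range(n_steps):
        x = x - 0.5 * step_size * grad_energy(x) \
              + noise_scale * rng.normal(size=x.shape)
    return x

# Start a batch of chains from the origin; they drift toward low energy.
samples = langevin_sample(np.zeros((64, 2)))
```

After enough steps the chains concentrate near the energy minimum, with the injected noise keeping the sampler stochastic; in EBM training, samples produced this way serve as the "negative" examples in the maximum-likelihood gradient.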

URL

http://arxiv.org/abs/1903.08689

PDF

http://arxiv.org/pdf/1903.08689
