arXiv Analytics

arXiv:1705.08868 [cs.LG]

Flow-GAN: Bridging implicit and prescribed learning in generative models

Aditya Grover, Manik Dhar, Stefano Ermon

Published 2017-05-24 (Version 1)

Evaluating the performance of generative models for unsupervised learning is inherently challenging due to the lack of well-defined and tractable objectives. This is particularly difficult for implicit models such as generative adversarial networks (GANs), which perform extremely well in practice on tasks such as sample generation but sidestep the explicit characterization of a density. We propose Flow-GANs, generative adversarial networks whose generator is specified as a normalizing flow model, enabling exact likelihood evaluation. We then learn a Flow-GAN using a hybrid objective that integrates adversarial training with maximum likelihood estimation. We empirically demonstrate the benefits of Flow-GANs on the MNIST and CIFAR-10 datasets: they learn generative models that attain low generalization error, as measured by log-likelihoods, while generating high-quality samples. Finally, we present a simple yet hard-to-beat baseline for GANs based on Gaussian mixture models.
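A minimal sketch of the hybrid objective described above, assuming a toy one-layer affine flow and a fixed logistic "discriminator" (all function names, the ML weight `lam`, and the single-parameter flow are illustrative simplifications; the paper uses deep normalizing flows and jointly trained discriminators):

```python
import numpy as np

# Toy invertible affine "flow": x = exp(s) * z + t.
# A real Flow-GAN generator stacks many invertible layers, but the
# change-of-variables likelihood computation has the same shape.
def forward(z, s, t):
    return np.exp(s) * z + t

def inverse(x, s, t):
    return (x - t) * np.exp(-s)

def log_likelihood(x, s, t):
    # Exact likelihood via change of variables with a standard
    # normal base density: log p(x) = log N(f^{-1}(x); 0, 1) - s,
    # where s is the log-determinant of the Jacobian of f.
    z = inverse(x, s, t)
    return -0.5 * (z ** 2 + np.log(2.0 * np.pi)) - s

# Fixed logistic score standing in for a discriminator (illustrative).
def disc(x, w, b):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

def hybrid_loss(x_real, z, s, t, w, b, lam=1.0):
    # Hybrid objective: non-saturating adversarial generator loss
    # plus lam times the negative log-likelihood of the real data
    # under the flow (lam is an assumed name for the ML weight).
    x_fake = forward(z, s, t)
    adv = -np.mean(np.log(disc(x_fake, w, b) + 1e-12))
    nll = -np.mean(log_likelihood(x_real, s, t))
    return adv + lam * nll
```

Setting `lam=0` recovers a purely adversarial objective, while large `lam` approaches plain maximum likelihood estimation; the paper studies this trade-off empirically.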

Related articles:
arXiv:1901.07667 [cs.LG] (Published 2019-01-23)
Composition and decomposition of GANs
arXiv:1109.3940 [cs.LG] (Published 2011-09-19)
Learning Discriminative Metrics via Generative Models and Kernel Learning
arXiv:1911.05020 [cs.LG] (Published 2019-11-12)
Generative adversarial networks (GAN) based efficient sampling of chemical space for inverse design of inorganic materials