arXiv Analytics

arXiv:1909.12598 [cs.LG]

"Best-of-Many-Samples" Distribution Matching

Apratim Bhattacharyya, Mario Fritz, Bernt Schiele

Published 2019-09-27 (Version 1)

Generative Adversarial Networks (GANs) can achieve state-of-the-art sample quality in generative modelling tasks but suffer from the mode collapse problem. Variational Autoencoders (VAEs), on the other hand, explicitly maximize a reconstruction-based data log-likelihood, forcing them to cover all modes, but suffer from poorer sample quality. Recent works have proposed hybrid VAE-GAN frameworks which integrate a GAN-based synthetic likelihood into the VAE objective to address both the mode collapse and sample quality issues, with limited success. This is because the VAE objective forces a trade-off between the data log-likelihood and the divergence to the latent prior. The synthetic likelihood ratio term also shows instability during training. We propose a novel objective with a "Best-of-Many-Samples" reconstruction cost and a stable direct estimate of the synthetic likelihood. This enables our hybrid VAE-GAN framework to achieve high data log-likelihood and low divergence to the latent prior at the same time, and shows significant improvement over both hybrid VAE-GANs and plain GANs in mode coverage and sample quality.
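The core idea of the "Best-of-Many-Samples" reconstruction cost can be sketched as follows: draw several latent samples from the approximate posterior, decode each, and penalize only the best reconstruction per data point, rather than the average. This is a minimal NumPy sketch, not the authors' implementation; the linear `decode` function, the squared-error reconstruction cost, and all parameter names are illustrative assumptions standing in for the paper's decoder network and likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode(z):
    # Hypothetical linear decoder standing in for the VAE's decoder network.
    W = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, -0.5]])
    return z @ W.T

def best_of_many_reconstruction_cost(x, mu, sigma, n_samples=10):
    """Draw n_samples latents from the approximate posterior q(z|x)
    and keep only the best (lowest-error) reconstruction per example."""
    errs = []
    for _ in range(n_samples):
        z = mu + sigma * rng.standard_normal(mu.shape)   # reparameterised sample
        x_hat = decode(z)
        errs.append(np.sum((x - x_hat) ** 2, axis=-1))   # per-example squared error
    errs = np.stack(errs)            # shape: (n_samples, batch)
    return errs.min(axis=0).mean()   # "best of many": min over samples, mean over batch
```

Because only the minimum-error sample is penalized, the posterior is free to place probability mass away from the data point, which relaxes the usual VAE trade-off between reconstruction quality and divergence to the latent prior.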

Related articles:
arXiv:2006.11432 [cs.LG] (Published 2020-06-19): Online Kernel based Generative Adversarial Networks
arXiv:2411.09642 [cs.LG] (Published 2024-11-14): On the Limits of Language Generation: Trade-Offs Between Hallucination and Mode Collapse
arXiv:2307.09742 [cs.LG] (Published 2023-07-19): Improved Distribution Matching for Dataset Condensation