arXiv Analytics

arXiv:1912.03915 [stat.ML]

Learning Disentangled Representations via Mutual Information Estimation

Eduardo Hugo Sanchez, Mathieu Serrurier, Mathias Ortner

Published 2019-12-09 (Version 1)

In this paper, we investigate the problem of learning disentangled representations. Given a pair of images sharing some attributes, we aim to learn a low-dimensional representation that is split into two parts: a shared representation that captures the common information between the images and an exclusive representation that contains the information specific to each image. To address this problem, we propose a model based on mutual information estimation that relies on neither image reconstruction nor image generation. Mutual information maximization is performed to capture the attributes of the data in the shared and exclusive representations, while the mutual information between the shared and exclusive representations is minimized to enforce disentanglement. We show that these representations are useful for downstream tasks such as image classification and image retrieval based on the shared or the exclusive component. Moreover, classification results show that our model outperforms the state-of-the-art VAE/GAN-based model in representation disentanglement.
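The abstract's objective (maximize mutual information between the data and each representation, minimize it between the shared and exclusive parts) can be sketched with a sample-based MI lower bound. The sketch below is a minimal, hypothetical illustration, not the paper's implementation: it uses a Jensen-Shannon-style MI estimator with a fixed bilinear critic on random toy features, whereas the paper would train a statistics network on image features; all variable names (`critic`, `W`, `data_feat`) are assumptions.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def jsd_mi_lower_bound(scores_joint, scores_marginal):
    # Jensen-Shannon MI estimator:
    # E_joint[-softplus(-T)] - E_marginal[softplus(T)]
    return (-softplus(-scores_joint)).mean() - softplus(scores_marginal).mean()

def critic(x, y, W):
    # Bilinear critic T(x, y) = x^T W y -- a stand-in for the learned
    # statistics network; in practice W would be trained by gradient ascent.
    return np.einsum('bi,ij,bj->b', x, W, y)

rng = np.random.default_rng(0)
n, d = 256, 8
shared = rng.normal(size=(n, d))                     # shared representation
exclusive = rng.normal(size=(n, d))                  # exclusive representation
data_feat = shared + 0.1 * rng.normal(size=(n, d))   # toy "image" features
W = rng.normal(size=(d, d)) * 0.1

# Shuffled pairs approximate samples from the product of marginals.
perm = rng.permutation(n)
mi_data_shared = jsd_mi_lower_bound(critic(data_feat, shared, W),
                                    critic(data_feat, shared[perm], W))
mi_shared_excl = jsd_mi_lower_bound(critic(shared, exclusive, W),
                                    critic(shared, exclusive[perm], W))

# Objective sketched in the abstract: maximize MI with the data,
# minimize MI between the shared and exclusive representations.
loss = -mi_data_shared + mi_shared_excl
```

Because the estimator needs only critic scores on joint versus shuffled pairs, no decoder is required, which is the sense in which the approach avoids image reconstruction or generation.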

Related articles:
arXiv:2010.03459 [stat.ML] (Published 2020-10-07)
Learning disentangled representations with the Wasserstein Autoencoder
arXiv:1808.06670 [stat.ML] (Published 2018-08-20)
Learning deep representations by mutual information estimation and maximization
arXiv:2006.12204 [stat.ML] (Published 2020-06-22)
Telescoping Density-Ratio Estimation