arXiv:2306.07255 [cs.LG]

Conditional Matrix Flows for Gaussian Graphical Models

Marcello Massimo Negri, F. Arend Torres, Volker Roth

Published 2023-06-12 (Version 1)

Studying the conditional independence structure among many variables with few observations is a challenging task. Gaussian Graphical Models (GGMs) tackle this problem by encouraging sparsity in the precision matrix through an $l_p$ regularization with $p\leq1$. However, since the objective is highly non-convex for sub-$l_1$ pseudo-norms, most approaches rely on the $l_1$ norm. In this case, frequentist approaches allow the solution path to be computed elegantly as a function of the shrinkage parameter $\lambda$. Instead of optimizing the penalized likelihood, the Bayesian formulation introduces a Laplace prior on the precision matrix. However, posterior inference for different $\lambda$ values requires repeated runs of expensive Gibbs samplers. We propose a general framework for variational inference in GGMs that unifies the benefits of the frequentist and Bayesian approaches. Specifically, we approximate the posterior with a matrix-variate normalizing flow defined on the space of symmetric positive definite matrices. As a key improvement on previous work, we train a continuum of sparse regression models jointly for all regularization parameters $\lambda$ and all $l_p$ norms, including non-convex sub-$l_1$ pseudo-norms. This is achieved by conditioning the flow on $p>0$ and on the shrinkage parameter $\lambda$. With a single model we then have access to (i) the evolution of the posterior for any $\lambda$ and any $l_p$ (pseudo-)norm, (ii) the marginal log-likelihood for model selection, and (iii) the frequentist solution paths, recovered as the MAP via simulated annealing.
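For orientation, here is a brief sketch of the two formulations the abstract contrasts, written in standard graphical-lasso notation (sample covariance $S$, precision matrix $\Omega \succ 0$, $n$ observations); the exact parametrization used in the paper may differ. The frequentist estimator penalizes the Gaussian log-likelihood with an element-wise $l_p$ penalty,

$$\hat{\Omega}(\lambda, p) = \arg\max_{\Omega \succ 0}\; \log\det\Omega - \operatorname{tr}(S\Omega) - \lambda \sum_{i \le j} |\Omega_{ij}|^p,$$

while the Bayesian formulation places the corresponding (generalized) Laplace prior on the entries of $\Omega$, giving the posterior

$$\pi(\Omega \mid X, \lambda, p) \;\propto\; \det(\Omega)^{n/2} \exp\!\Big(-\tfrac{n}{2}\operatorname{tr}(S\Omega)\Big)\, \exp\!\Big(-\lambda \sum_{i \le j} |\Omega_{ij}|^p\Big).$$

The conditional matrix flow $q_\phi(\Omega \mid \lambda, p)$ is then trained to approximate this posterior jointly over $(\lambda, p)$, so that the frequentist path $\hat{\Omega}(\lambda, p)$ can be recovered as the MAP in the annealing limit.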

Related articles:
arXiv:1207.4255 [cs.LG] (Published 2012-07-18, updated 2015-10-24)
On the Statistical Efficiency of $\ell_{1,p}$ Multi-Task Learning of Gaussian Graphical Models
arXiv:1711.05391 [cs.LG] (Published 2017-11-15)
Semiblind subgraph reconstruction in Gaussian graphical models
arXiv:2006.12598 [cs.LG] (Published 2020-06-22)
Support Union Recovery in Meta Learning of Gaussian Graphical Models