arXiv Analytics

arXiv:1207.4255 [cs.LG]

On the Statistical Efficiency of $\ell_{1,p}$ Multi-Task Learning of Gaussian Graphical Models

Jean Honorio, Tommi Jaakkola, Dimitris Samaras

Published 2012-07-18, updated 2015-10-24 (Version 2)

In this paper, we present $\ell_{1,p}$ multi-task structure learning for Gaussian graphical models. We analyze the sufficient number of samples for the correct recovery of the support union and edge signs. We also analyze the necessary number of samples for any conceivable method by providing information-theoretic lower bounds. We compare the statistical efficiency of multi-task learning versus that of single-task learning. For experiments, we use a block coordinate descent method that is provably convergent and generates a sequence of positive definite solutions. We provide experimental validation on synthetic data as well as on two publicly available real-world data sets, including functional magnetic resonance imaging and gene expression data.
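The abstract describes a per-task Gaussian log-likelihood coupled by an $\ell_{1,p}$ penalty that takes the $p$-norm of each edge's coefficients across tasks and sums over edges, encouraging a shared support union. A minimal sketch of that objective is below, assuming NumPy; the function name `multitask_objective` and the parameters `lam` and `p` are illustrative, and the exact formulation is given in the paper itself.

```python
import numpy as np

def multitask_objective(omegas, covs, lam=0.1, p=2):
    """Sketch of an l_{1,p} multi-task GGM objective (illustrative only).

    omegas: list of K positive definite precision matrices, each n x n
    covs:   list of K empirical covariance matrices, each n x n
    """
    # Per-task Gaussian negative log-likelihood: -log det(Omega) + tr(S Omega)
    nll = sum(-np.linalg.slogdet(O)[1] + np.trace(S @ O)
              for O, S in zip(omegas, covs))
    # l_{1,p} penalty: p-norm across tasks for each off-diagonal entry,
    # then an l_1 sum over entries (each edge counted twice by symmetry,
    # which only rescales the regularizer in this sketch).
    stacked = np.stack(omegas)                        # shape (K, n, n)
    off_diag = ~np.eye(stacked.shape[1], dtype=bool)  # mask diagonal out
    penalty = np.linalg.norm(stacked[:, off_diag], ord=p, axis=0).sum()
    return nll + lam * penalty
```

With $p = 2$ this is the group-Lasso-style coupling: an edge is either active in the support union or zeroed out jointly across all $K$ tasks, which is what drives the sample-complexity comparison against single-task learning.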

Comments: Submitted on October 21, 2015 to the Journal of Machine Learning Research
Categories: cs.LG, stat.ML
Related articles:
arXiv:1711.05391 [cs.LG] (Published 2017-11-15)
Semiblind subgraph reconstruction in Gaussian graphical models
arXiv:2006.12598 [cs.LG] (Published 2020-06-22)
Support Union Recovery in Meta Learning of Gaussian Graphical Models
arXiv:2306.07255 [cs.LG] (Published 2023-06-12)
Conditional Matrix Flows for Gaussian Graphical Models