arXiv Analytics

arXiv:2207.08670 [stat.CO]

Gradient-based data and parameter dimension reduction for Bayesian models: an information theoretic perspective

Ricardo Baptista, Youssef Marzouk, Olivier Zahm

Published 2022-07-18, Version 1

We consider the problem of reducing the dimensions of parameters and data in non-Gaussian Bayesian inference problems. Our goal is to identify an "informed" subspace of the parameters and an "informative" subspace of the data so that a high-dimensional inference problem can be approximately reformulated in low-to-moderate dimensions, thereby improving the computational efficiency of many inference techniques. To do so, we exploit gradient evaluations of the log-likelihood function. Furthermore, we use an information-theoretic analysis to derive a bound on the posterior error due to parameter and data dimension reduction. This bound relies on logarithmic Sobolev inequalities, and it reveals the appropriate dimensions of the reduced variables. We compare our method with classical dimension reduction techniques, such as principal component analysis and canonical correlation analysis, on applications ranging from mechanics to image processing.
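The gradient-based approach described above can be illustrated with a minimal sketch (not the authors' code): for a toy linear-Gaussian model, one Monte Carlo-estimates a diagnostic matrix built from outer products of log-likelihood gradients over prior samples, then takes its leading eigenvectors as a basis for the "informed" parameter subspace. All names (`G`, `sigma`, `grad_log_likelihood`, the prior, and the rank choice `r`) are assumptions made for this example.

```python
import numpy as np

# Illustrative sketch of gradient-based parameter dimension reduction.
# Toy linear-Gaussian model (an assumption for this example):
#   y = G x + noise,  x ~ N(0, I) prior,  noise ~ N(0, sigma^2 I).
rng = np.random.default_rng(0)
d, m, n_samples = 20, 5, 2000
G = rng.standard_normal((m, d)) / np.sqrt(d)      # hypothetical forward operator
sigma = 0.1                                       # noise standard deviation
y_obs = G @ rng.standard_normal(d) + sigma * rng.standard_normal(m)

def grad_log_likelihood(x):
    """Gradient of log N(y_obs | G x, sigma^2 I) with respect to x."""
    return G.T @ (y_obs - G @ x) / sigma**2

# Diagnostic matrix H = E_prior[ (grad log L)(grad log L)^T ],
# estimated by Monte Carlo over prior samples x ~ N(0, I).
H = np.zeros((d, d))
for _ in range(n_samples):
    g = grad_log_likelihood(rng.standard_normal(d))
    H += np.outer(g, g)
H /= n_samples

# Leading eigenvectors of H span the informed parameter subspace;
# a gap in the spectrum suggests how many reduced coordinates to keep.
eigvals, eigvecs = np.linalg.eigh(H)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
r = m                   # in this linear model at most m directions are informed
U_r = eigvecs[:, :r]    # basis of the reduced parameter subspace
print("leading eigenvalues:", np.round(eigvals[: r + 2], 3))
```

In this linear model every gradient lies in the range of `G.T`, so `H` has rank at most `m` and the spectrum drops to (numerical) zero after the first `m` eigenvalues, signaling that `r = m` reduced coordinates suffice. The paper's bound, based on logarithmic Sobolev inequalities, makes this choice of reduced dimension rigorous for non-Gaussian problems.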

Related articles: Most relevant | Search more
arXiv:1908.06687 [stat.CO] (Published 2019-08-19)
Bayesian models for survival data of clinical trials: Comparison of implementations using R software
arXiv:2308.06845 [stat.CO] (Published 2023-08-13)
csSampling: An R Package for Bayesian Models for Complex Survey Data
arXiv:1904.05886 [stat.CO] (Published 2019-04-11)
Markov chain Monte Carlo importance samplers for Bayesian models with intractable likelihoods