arXiv Analytics

arXiv:2102.12976 [stat.CO]

A Hybrid Approximation to the Marginal Likelihood

Eric Chuu, Debdeep Pati, Anirban Bhattacharya

Published 2021-02-24 (Version 1)

Computing the marginal likelihood or evidence is one of the core challenges in Bayesian analysis. While there are many established methods for estimating this quantity, they predominantly rely on a large number of posterior samples obtained from a Markov Chain Monte Carlo (MCMC) algorithm. As the dimension of the parameter space increases, however, many of these methods become prohibitively slow and potentially inaccurate. In this paper, we propose a novel method in which we use the MCMC samples to learn a high-probability partition of the parameter space and then form a deterministic approximation over each of the partition sets. This two-step procedure, which combines a probabilistic and a deterministic component, is termed a Hybrid approximation to the marginal likelihood. We demonstrate its versatility in a range of examples with varying dimension and sample size, and we also highlight the Hybrid approximation's effectiveness in situations where only a limited number of MCMC samples, or only approximate MCMC samples, are available.
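To make the two-step structure concrete, the following is a minimal sketch in a one-dimensional toy setting. It assumes a conjugate normal model (so the true evidence is available for comparison), exact posterior draws standing in for MCMC output, quantile-based bins as the learned high-probability partition, and a simple midpoint rule as the deterministic per-set approximation; these are illustrative simplifications, not the partitioning or approximation scheme used in the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ N(theta, sigma^2), theta ~ N(mu0, tau0^2),
# chosen so the true marginal likelihood has a closed form for checking.
n, sigma, mu0, tau0 = 50, 1.0, 0.0, 2.0
y = rng.normal(0.7, sigma, size=n)

def log_joint(theta):
    """Unnormalized log posterior: log-likelihood plus log-prior."""
    return norm.logpdf(y, loc=theta, scale=sigma).sum() + norm.logpdf(theta, mu0, tau0)

# Stand-in for MCMC output: exact draws from the (known) conjugate posterior.
tau_n2 = 1.0 / (n / sigma**2 + 1.0 / tau0**2)
mu_n = tau_n2 * (y.sum() / sigma**2 + mu0 / tau0**2)
samples = rng.normal(mu_n, np.sqrt(tau_n2), size=2000)

# Step 1 (probabilistic): use the samples to learn a high-probability
# partition of the parameter space -- here, simple quantile-based bins.
K = 20
edges = np.quantile(samples, np.linspace(0.001, 0.999, K + 1))

# Step 2 (deterministic): approximate the integral of exp(log_joint) over
# each partition set -- here, a midpoint rule per bin -- and sum the pieces.
log_terms = []
for a, b in zip(edges[:-1], edges[1:]):
    mid = 0.5 * (a + b)
    log_terms.append(log_joint(mid) + np.log(b - a))
log_evidence_hybrid = np.logaddexp.reduce(log_terms)

# Closed-form evidence for the conjugate model, as a reference value.
cov = sigma**2 * np.eye(n) + tau0**2 * np.ones((n, n))
log_evidence_exact = multivariate_normal.logpdf(y, mean=np.full(n, mu0), cov=cov)

print(f"hybrid-style estimate: {log_evidence_hybrid:.4f}")
print(f"exact log evidence  : {log_evidence_exact:.4f}")
```

In higher dimensions the per-set deterministic approximation would replace the midpoint rule above, and the partition would be learned from the MCMC samples rather than from fixed quantiles; this sketch only illustrates how the probabilistic and deterministic components fit together.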
