arXiv Analytics


arXiv:1401.1667 [stat.CO]

On general sampling schemes for Particle Markov chain Monte Carlo methods

Eduardo F. Mendes, Christopher K. Carter, Robert Kohn

Published 2014-01-08, updated 2015-01-12 (Version 2)

Particle Markov chain Monte Carlo (PMCMC) methods [Andrieu et al., 2010] are used to carry out inference in non-linear and non-Gaussian state space models, where the posterior density of the states is approximated using particles. Current approaches usually carry out Bayesian inference using a particle marginal Metropolis-Hastings algorithm, a particle Gibbs sampler, or a particle Metropolis-within-Gibbs sampler. Our article gives a general approach for constructing sampling schemes that converge to the target distributions given in Andrieu et al. [2010] and Olsson and Ryden [2011]. Our approach shows how the three ways of generating variables mentioned above can be combined flexibly. The advantage of our general approach is that the sampling scheme can be tailored to obtain good results in different applications. We investigate the properties of the general sampling scheme, including conditions for uniform convergence to the posterior. We illustrate our methods with examples of state space models in which one group of parameters can be generated straightforwardly in a particle Gibbs step by conditioning on the states, but is cumbersome and inefficient to generate when the states are integrated out. Conversely, it may be necessary to generate a second group of parameters without conditioning on the states because of the high dependence between those parameters and the states. A particularly important case, illustrated by our examples, occurs when there are many unknown parameters: it is then usually highly inefficient to generate all the parameters without conditioning on the states, and preferable to generate as many parameters as possible using particle Gibbs, generating only those parameters that are highly correlated with the states using particle marginal Metropolis-Hastings.
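The hybrid strategy described in the abstract can be illustrated on a toy AR(1)-plus-noise model: a bootstrap particle filter supplies both a marginal-likelihood estimate (used in a PMMH step for the autoregressive parameter, which is highly correlated with the states) and a sampled state trajectory (used in a Gibbs-style conjugate update of the innovation variance). This is a minimal sketch, not the paper's algorithm: the model, priors, proposal scale, and particle count are all illustrative assumptions, and a faithful particle Gibbs step would retain the conditioned trajectory via conditional SMC, which this sketch omits for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(y, phi, sig2, N=200):
    # Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, sig2),
    # y_t = x_t + N(0, 1). Returns a log-likelihood estimate and one
    # trajectory drawn from the particle approximation of p(x_{1:T} | y).
    T = len(y)
    x = rng.normal(0.0, np.sqrt(sig2 / (1.0 - phi**2)), N)  # stationary init
    paths = np.zeros((T, N))
    loglik = 0.0
    for t in range(T):
        if t > 0:
            x = phi * x + rng.normal(0.0, np.sqrt(sig2), N)
        logw = -0.5 * (y[t] - x) ** 2          # N(y_t | x_t, 1), up to a constant
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        idx = rng.choice(N, N, p=w / w.sum())  # multinomial resampling
        x = x[idx]
        paths = paths[:, idx]
        paths[t] = x
    return loglik, paths[:, rng.integers(N)]

def hybrid_sweep(y, phi, sig2, n_iter=20):
    # Alternate a PMMH update for phi with a Gibbs-style update for sig2.
    a, b = 2.0, 1.0                 # assumed inverse-gamma prior on sig2
    loglik, _ = particle_filter(y, phi, sig2)
    for _ in range(n_iter):
        # PMMH: random-walk proposal for phi, accepted using the ratio of
        # particle likelihood estimates (flat prior on |phi| < 1 assumed).
        phi_p = phi + 0.05 * rng.normal()
        if abs(phi_p) < 1.0:
            ll_p, _ = particle_filter(y, phi_p, sig2)
            if np.log(rng.uniform()) < ll_p - loglik:
                phi, loglik = phi_p, ll_p
        # Gibbs-style step: condition on a sampled state trajectory and draw
        # sig2 from its conjugate inverse-gamma full conditional.
        _, xs = particle_filter(y, phi, sig2)
        resid = xs[1:] - phi * xs[:-1]
        sig2 = 1.0 / rng.gamma(a + len(resid) / 2.0,
                               1.0 / (b + resid @ resid / 2.0))
        loglik, _ = particle_filter(y, phi, sig2)  # refresh for next PMMH step
    return phi, sig2
```

The sweep mirrors the division of labour advocated in the abstract: the variance parameter, which is conditionally conjugate given the states, is cheap to update in a Gibbs-style step, while the persistence parameter, whose posterior depends strongly on the state path, is updated marginally via PMMH.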

Related articles:
arXiv:1610.08962 [stat.CO] (Published 2016-10-27)
On embedded hidden Markov models and particle Markov chain Monte Carlo methods
arXiv:1110.2873 [stat.CO] (Published 2011-10-13, updated 2012-03-13)
On the use of backward simulation in particle Markov chain Monte Carlo methods
arXiv:1011.2437 [stat.CO] (Published 2010-11-10)
Efficient Bayesian Inference for Switching State-Space Models using Discrete Particle Markov Chain Monte Carlo Methods