{ "id": "1401.1667", "version": "v2", "published": "2014-01-08T11:25:24.000Z", "updated": "2015-01-12T07:00:23.000Z", "title": "On general sampling schemes for Particle Markov chain Monte Carlo methods", "authors": [ "Eduardo F. Mendes", "Christopher K. Carter", "Robert Kohn" ], "categories": [ "stat.CO" ], "abstract": "Particle Markov chain Monte Carlo (PMCMC) methods [Andrieu et al. 2010] are used to carry out inference in non-linear and non-Gaussian state space models, where the posterior density of the states is approximated using particles. Current approaches usually carry out Bayesian inference using a particle marginal Metropolis-Hastings algorithm, a particle Gibbs sampler, or a particle Metropolis within Gibbs sampler. Our article gives a general approach for constructing sampling schemes that converge to the target distributions given in Andrieu et al. [2010] and Olsson and Ryden [2011]. Our approach shows how the three ways of generating variables mentioned above can be combined flexibly. The advantage of our general approach is that the sampling scheme can be tailored to obtain good results for different applications. We investigate the properties of the general sampling scheme, including conditions for uniform convergence to the posterior. We illustrate our methods with examples of state space models where one group of parameters can be generated in a straightforward manner in a particle Gibbs step by conditioning on the states, but where it is cumbersome and inefficient to generate such parameters when the states are integrated out. Conversely, it may be necessary to generate a second group of parameters without conditioning on the states because of the high dependence between such parameters and the states. A particularly important case that is illustrated by our examples occurs when there are many unknown parameters. 
In this case it is usually highly inefficient to generate all the parameters without conditioning on the states, and it is preferable to generate as many parameters as possible using particle Gibbs and to generate only those parameters that are highly correlated with the states using particle marginal Metropolis-Hastings.", "revisions": [ { "version": "v1", "updated": "2014-01-08T11:25:24.000Z", "abstract": "Particle Markov chain Monte Carlo methods [Andrieu et al. 2010] are used to carry out inference in non-linear and non-Gaussian state space models, where the posterior density of the states is approximated using particles. Current approaches have usually carried out Bayesian inference using a particle Metropolis-Hastings algorithm or a particle Gibbs sampler. In this paper, we give a general approach to constructing sampling schemes that converge to the target distributions given in Andrieu et al. [2010] and Olsson and Ryden [2011]. We describe our methods as a particle Metropolis within Gibbs sampler (PMwG). The advantage of our general approach is that the sampling scheme can be tailored to obtain good results for different applications. We investigate the properties of the general sampling scheme, including conditions for uniform convergence to the posterior. We illustrate our methods with examples of state space models where one group of parameters can be generated in a straightforward manner in a particle Gibbs step by conditioning on the states, but where it is cumbersome and inefficient to generate such parameters when the states are integrated out. Conversely, it may be necessary to generate a second group of parameters without conditioning on the states because of the high dependence between such parameters and the states. 
Our examples include state space models with diffuse initial conditions, where we introduce two methods to deal with the initial conditions.", "comment": null, "journal": null, "doi": null }, { "version": "v2", "updated": "2015-01-12T07:00:23.000Z" } ], "analyses": { "keywords": [ "markov chain monte carlo methods", "particle markov chain monte carlo", "general sampling scheme", "state space models" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable", "adsabs": "2014arXiv1401.1667M" } } }