{
  "id": "2108.10277",
  "version": "v1",
  "published": "2021-08-23T16:39:05.000Z",
  "updated": "2021-08-23T16:39:05.000Z",
  "title": "Conditional sequential Monte Carlo in high dimensions",
  "authors": [
    "Axel Finke",
    "Alexandre H. Thiery"
  ],
  "comment": "47 pages, 5 figures",
  "categories": [
    "stat.CO"
  ],
  "abstract": "The iterated conditional sequential Monte Carlo (i-CSMC) algorithm from Andrieu, Doucet and Holenstein (2010) is an MCMC approach for efficiently sampling from the joint posterior distribution of the $T$ latent states in challenging time-series models, e.g. in non-linear or non-Gaussian state-space models. It is also the main ingredient in particle Gibbs samplers which infer unknown model parameters alongside the latent states. In this work, we first prove that the i-CSMC algorithm suffers from a curse of dimension in the dimension of the states, $D$: it breaks down unless the number of samples (\"particles\"), $N$, proposed by the algorithm grows exponentially with $D$. Then, we present a novel \"local\" version of the algorithm which proposes particles using Gaussian random-walk moves that are suitably scaled with $D$. We prove that this iterated random-walk conditional sequential Monte Carlo (i-RW-CSMC) algorithm avoids the curse of dimension: for arbitrary $N$, its acceptance rates and expected squared jumping distance converge to non-trivial limits as $D \\to \\infty$. If $T = N = 1$, our proposed algorithm reduces to a Metropolis--Hastings or Barker's algorithm with Gaussian random-walk moves and we recover the well-known scaling limits for such algorithms.",
  "revisions": [
    {
      "version": "v1",
      "updated": "2021-08-23T16:39:05.000Z"
    }
  ],
  "analyses": {
    "keywords": [
      "high dimensions",
      "gaussian random-walk moves",
      "squared jumping distance converge",
      "random-walk conditional sequential monte carlo",
      "infer unknown model parameters alongside"
    ],
    "note": {
      "typesetting": "TeX",
      "pages": 47,
      "language": "en",
      "license": "arXiv",
      "status": "editable"
    }
  }
}