arXiv:2009.10831 [stat.CO]
Bayesian Update with Importance Sampling: Required Sample Size
Daniel Sanz-Alonso, Zijian Wang
Published 2020-09-22 (Version 1)
Importance sampling is used to approximate Bayes' rule in many computational approaches to Bayesian inverse problems, data assimilation and machine learning. This paper reviews and further investigates the required sample size for importance sampling in terms of the $\chi^2$-divergence between target and proposal. We develop general abstract theory and illustrate through numerous examples the roles that dimension, noise-level and other model parameters play in approximating the Bayesian update with importance sampling. Our examples also facilitate a new direct comparison of standard and optimal proposals for particle filtering.
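The abstract describes approximating the Bayesian update by importance sampling, with the required sample size governed by the $\chi^2$-divergence between target (posterior) and proposal (prior). The snippet below is a minimal sketch of this idea, not the paper's method: it uses a toy scalar Gaussian model (hypothetical values for the noise variance and observation), self-normalized weights, and the standard effective-sample-size diagnostic $1/\sum_i w_i^2$, whose expected degradation is controlled by $1 + \chi^2(\text{target}\,\|\,\text{proposal})$.

```python
import numpy as np

# Sketch: self-normalized importance sampling for a Bayesian update,
# using the prior as proposal (the "standard proposal" in filtering terms).
# Toy model for illustration only: u ~ N(0, 1), y = u + eta, eta ~ N(0, sigma2).
rng = np.random.default_rng(0)

N = 10_000        # number of importance samples
sigma2 = 0.1      # observation noise variance (assumed, for illustration)
y = 1.5           # observed data (assumed, for illustration)

u = rng.standard_normal(N)               # draws from the prior (proposal)
log_w = -0.5 * (y - u) ** 2 / sigma2     # unnormalized log-likelihood weights
w = np.exp(log_w - log_w.max())          # subtract max before exponentiating for stability
w /= w.sum()                             # self-normalize the weights

posterior_mean_est = np.sum(w * u)       # importance sampling estimate of the posterior mean

# Effective sample size: smaller ESS signals a larger chi^2-divergence between
# posterior and prior, hence a larger required sample size N.
ess = 1.0 / np.sum(w ** 2)
print(f"posterior mean estimate: {posterior_mean_est:.3f}, ESS: {ess:.1f} of {N}")
```

Shrinking `sigma2` (a more informative, lower-noise observation) concentrates the posterior, increases the $\chi^2$-divergence from the prior proposal, and drives the ESS down, which is the kind of noise-level effect the abstract alludes to.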
Categories: stat.CO
Related articles:
arXiv:2310.18488 [stat.CO] (Published 2023-10-27)
Variance-based sensitivity of Bayesian inverse problems to the prior distribution
arXiv:1709.09763 [stat.CO] (Published 2017-09-27)
Multilevel Sequential${}^2$ Monte Carlo for Bayesian Inverse Problems
arXiv:1906.08850 [stat.CO] (Published 2019-06-20)
Pushing the Limits of Importance Sampling through Iterative Moment Matching