arXiv:1902.03394 [stat.ML]

A stochastic version of Stein Variational Gradient Descent for efficient sampling

Lei Li, Jian-Guo Liu, Zibu Liu, Jianfeng Lu

Published 2019-02-09 (Version 1)

We propose in this work RBM-SVGD, a stochastic version of the Stein Variational Gradient Descent (SVGD) method for efficiently sampling from a given probability measure, and hence useful for Bayesian inference. The method applies the Random Batch Method (RBM) for interacting particle systems, proposed by Jin et al., to the interacting particle system in SVGD. While preserving the behavior of SVGD, it reduces the computational cost, especially when the interaction kernel is long-range. Numerical examples verify the efficiency of this new version of SVGD.
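
As a rough illustration of the idea in the abstract (a sketch, not the authors' reference implementation), the code below applies a random-batch update to a standard RBF-kernel SVGD step: at each iteration the particles are shuffled into small batches, and kernel interactions are evaluated only within each batch, cutting the per-step cost from O(n^2) to O(n * batch_size). The bandwidth h, step size eps, and batch_size values are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def rbf_kernel(X, h):
    """Pairwise RBF kernel k(x, y) = exp(-|x - y|^2 / h) and its gradients."""
    diff = X[:, None, :] - X[None, :, :]           # diff[j, i] = x_j - x_i, shape (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)    # K[j, i] = k(x_j, x_i), shape (n, n)
    grad_K = -2.0 / h * diff * K[:, :, None]       # grad_K[j, i] = grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_direction(X, grad_logp, h):
    """Standard SVGD update direction; cost O(n^2 d) for n particles."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    return (K @ grad_logp(X) + grad_K.sum(axis=0)) / n

def rbm_svgd_step(X, grad_logp, h, eps, batch_size, rng):
    """One random-batch SVGD step: kernel interactions only within random batches,
    cost O(n * batch_size * d) instead of O(n^2 d)."""
    n = X.shape[0]
    perm = rng.permutation(n)
    X_new = X.copy()
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]       # a random batch of particles
        X_new[idx] = X[idx] + eps * svgd_direction(X[idx], grad_logp, h)
    return X_new

# Toy usage: sample from a standard 2D Gaussian target, for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
for _ in range(500):
    X = rbm_svgd_step(X, grad_logp=lambda x: -x, h=1.0, eps=0.1, batch_size=20, rng=rng)
```

In this sketch the average inside each batch replaces the full average over all n particles, mirroring how RBM substitutes within-batch interactions for the full sum; the exact normalization, step-size schedule, and bandwidth choice used in the paper may differ.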

Related articles:
arXiv:2102.12956 [stat.ML] (Published 2021-02-25)
Stein Variational Gradient Descent: many-particle and long-time asymptotics
arXiv:2011.10480 [stat.ML] (Published 2020-11-20)
On the coercivity condition in the learning of interacting particle systems
arXiv:1810.11693 [stat.ML] (Published 2018-10-27)
Stein Variational Gradient Descent as Moment Matching