arXiv:2006.09797 [stat.ML]

A Non-Asymptotic Analysis for Stein Variational Gradient Descent

Anna Korba, Adil Salim, Michael Arbel, Giulia Luise, Arthur Gretton

Published 2020-06-17 (Version 1)

We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution $\pi\propto e^{-V}$ on $\mathbb{R}^d$. In the population limit, SVGD performs gradient descent in the space of probability distributions on the KL divergence with respect to $\pi$, where the gradient is smoothed through a kernel integral operator. In this paper, we provide a novel finite-time analysis for the SVGD algorithm. We obtain a descent lemma establishing that the algorithm decreases the objective at each iteration and provably converges, under less restrictive assumptions on the step size than those required in earlier analyses. We further provide a guarantee on the convergence rate in Kullback-Leibler divergence, assuming $\pi$ satisfies a Stein log-Sobolev inequality as in Duncan et al. (2019), which takes into account the geometry induced by the smoothed KL gradient.
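
For intuition, a minimal sketch of one SVGD update is given below. It uses an RBF kernel with a fixed bandwidth and a toy Gaussian target; the function and parameter names (`svgd_step`, `bandwidth`, `step_size`) are illustrative choices of this sketch, not notation from the paper, whose analysis applies to general kernels and does not depend on this particular implementation.

```python
import numpy as np

def svgd_step(x, grad_log_pi, step_size=0.1, bandwidth=1.0):
    """One SVGD update with an RBF kernel (illustrative; fixed bandwidth for simplicity).

    x            : (n, d) array of particles
    grad_log_pi  : callable mapping (n, d) particles to their scores,
                   i.e. -grad V(x) when pi is proportional to exp(-V)
    """
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]                # diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)
    K = np.exp(-sq_dists / bandwidth)                    # K[j, i] = k(x_j, x_i)
    grad_K = (-2.0 / bandwidth) * diffs * K[..., None]   # grad_{x_j} k(x_j, x_i)

    score = grad_log_pi(x)                               # (n, d)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log pi(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ score + grad_K.sum(axis=0)) / n
    return x + step_size * phi

if __name__ == "__main__":
    # Toy target: standard Gaussian, V(x) = ||x||^2 / 2, so grad log pi(x) = -x
    rng = np.random.default_rng(0)
    particles = rng.normal(loc=5.0, scale=1.0, size=(200, 2))
    for _ in range(500):
        particles = svgd_step(particles, lambda x: -x, step_size=0.1)
    print(particles.mean(axis=0))  # should move close to the target mean [0, 0]
```

The first term of the update drives particles toward high-density regions of $\pi$ through the kernel-smoothed score, while the second (repulsive) term keeps them spread out; the paper's descent lemma and rate are stated for this smoothed KL gradient flow rather than for any specific kernel or bandwidth choice.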

Related articles:
arXiv:1707.03663 [stat.ML] (Published 2017-07-12)
Underdamped Langevin MCMC: A non-asymptotic analysis
arXiv:2102.12956 [stat.ML] (Published 2021-02-25)
Stein Variational Gradient Descent: many-particle and long-time asymptotics
arXiv:1910.12794 [stat.ML] (Published 2019-10-28)
Stein Variational Gradient Descent With Matrix-Valued Kernels