arXiv:2203.13284 [stat.ML]

Local optimisation of Nyström samples through stochastic gradient descent

Matthew Hutchings, Bertrand Gauthier

Published 2022-03-24 (Version 1)

We study a relaxed version of the column-sampling problem for the Nyström approximation of kernel matrices, where approximations are defined from multisets of landmark points in the ambient space; such multisets are referred to as Nyström samples. We consider an unweighted variation of the radial squared-kernel discrepancy (SKD) criterion as a surrogate for the classical criteria used to assess the Nyström approximation accuracy; in this setting, we discuss how Nyström samples can be efficiently optimised through stochastic gradient descent. We perform numerical experiments which demonstrate that the local minimisation of the radial SKD yields Nyström samples with improved Nyström approximation accuracy.
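To make the idea concrete, here is a minimal sketch (not the authors' code) of gradient-based optimisation of a Nyström sample: the landmark points are treated as free parameters in the ambient space and updated by stochastic gradient descent on a surrogate loss. The Gaussian kernel, the minibatch Frobenius error of the Nyström residual (standing in for the radial SKD criterion of the paper), and all function names and hyperparameters below are illustrative assumptions, written in JAX for automatic differentiation.

```python
# Hedged sketch: SGD on the landmark points of a Nystrom approximation.
# The loss here is a generic minibatch Frobenius residual, NOT the paper's
# radial SKD; kernel choice and hyperparameters are illustrative assumptions.
import jax
import jax.numpy as jnp

def gaussian_kernel(X, Y, lengthscale=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    sq = jnp.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-sq / (2.0 * lengthscale ** 2))

def nystrom_residual_loss(S, X_batch, jitter=1e-6):
    """Squared Frobenius norm of K - K_nm K_mm^{-1} K_mn on a minibatch.

    S: (m, d) landmark points (the Nystrom sample); X_batch: (n, d) data.
    """
    K_nn = gaussian_kernel(X_batch, X_batch)
    K_nm = gaussian_kernel(X_batch, S)
    K_mm = gaussian_kernel(S, S) + jitter * jnp.eye(S.shape[0])
    K_hat = K_nm @ jnp.linalg.solve(K_mm, K_nm.T)
    return jnp.sum((K_nn - K_hat) ** 2)

# Gradient of the surrogate loss with respect to the landmark points.
loss_grad = jax.jit(jax.grad(nystrom_residual_loss))

def optimise_landmarks(X, m=20, steps=500, batch_size=128, lr=1e-2, seed=0):
    """Local SGD optimisation of a Nystrom sample initialised from the data."""
    key = jax.random.PRNGKey(seed)
    key, sub = jax.random.split(key)
    idx = jax.random.choice(sub, X.shape[0], (m,), replace=False)
    S = X[idx]  # initialise landmarks at a random column sample
    for _ in range(steps):
        key, sub = jax.random.split(key)
        batch_idx = jax.random.choice(sub, X.shape[0], (batch_size,), replace=False)
        S = S - lr * loss_grad(S, X[batch_idx])  # plain SGD step on the landmarks
    return S
```

The optimisation pattern (initialise the landmarks from a random column sample, then apply plain SGD updates to the landmark coordinates) mirrors the approach the abstract describes; the paper's actual surrogate is the unweighted radial SKD rather than the Frobenius residual used above.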

Comments: 14 pages, 5 figures. Submitted to LOD 2022 conference
Categories: stat.ML, cs.LG
Related articles:
arXiv:1803.00195 [stat.ML] (Published 2018-03-01)
The Regularization Effects of Anisotropic Noise in Stochastic Gradient Descent
arXiv:1908.07607 [stat.ML] (Published 2019-08-20)
Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient Descent
arXiv:2502.06719 [stat.ML] (Published 2025-02-10)
Gaussian Approximation and Multiplier Bootstrap for Stochastic Gradient Descent