arXiv:2007.09426 [stat.ML]

Improved Convergence Speed of Fully Symmetric Learning Rules for Principal Component Analysis

Ralf Möller

Published 2020-07-18 (Version 1)

Fully symmetric learning rules for principal component analysis can be derived from a novel objective function suggested in our previous work. We observed that these learning rules suffer from slow convergence for covariance matrices where some principal eigenvalues are close to each other. Here we describe a modified objective function with an additional term which mitigates this convergence problem. We show that the learning rule derived from the modified objective function inherits all fixed points from the original learning rule (but may introduce additional ones). The stability of the inherited fixed points also remains unchanged; only the steepness of the objective function is increased in some directions. Simulations confirm that the convergence speed can be noticeably improved, depending on the weight factor of the additional term.
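
As a rough illustration of the pattern the abstract describes (gradient ascent on a PCA objective with an additional term scaled by a weight factor), the following minimal Python sketch uses Oja's symmetric subspace rule as a generic stand-in for the paper's fully symmetric rule; the orthonormality term weighted by alpha is a hypothetical placeholder for the paper's additional term, not a reproduction of it.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic covariance with two close principal eigenvalues
# (the slow-convergence case described in the abstract).
eigvals = np.array([3.0, 2.9, 1.0, 0.5])
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
C = Q @ np.diag(eigvals) @ Q.T

m = 2                                  # number of principal directions sought
W = 0.1 * rng.standard_normal((4, m))  # weight matrix
eta = 0.05                             # step size
alpha = 0.5                            # weight factor of the extra term

for _ in range(2000):
    base = C @ W - W @ (W.T @ C @ W)   # Oja subspace update (stand-in rule)
    extra = W @ (np.eye(m) - W.T @ W)  # hypothetical orthonormality term
    W += eta * (base + alpha * extra)

# W now approximately spans the leading principal subspace of C.
print(np.round(W.T @ C @ W, 3))

For rules that converge to individual eigenvectors, nearly equal eigenvalues make the objective almost flat along rotations within the corresponding eigenspace; steepening the objective in those directions, as the additional term is meant to do, is what speeds up convergence.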
