arXiv Analytics

arXiv:2302.12761 [math.NA]

Randomized low-rank approximation of parameter-dependent matrices

Daniel Kressner, Hei Yin Lam

Published 2023-02-24 (Version 1)

This work considers the low-rank approximation of a matrix $A(t)$ depending on a parameter $t$ in a compact set $D \subset \mathbb{R}^d$. Application areas that give rise to such problems include computational statistics and dynamical systems. Randomized algorithms are an increasingly popular approach for performing low-rank approximation and they usually proceed by multiplying the matrix with random dimension reduction matrices (DRMs). Applying such algorithms directly to $A(t)$ would involve different, independent DRMs for every $t$, which is not only expensive but also leads to inherently non-smooth approximations. In this work, we propose to use constant DRMs, that is, $A(t)$ is multiplied with the same DRM for every $t$. The resulting parameter-dependent extensions of two popular randomized algorithms, the randomized singular value decomposition and the generalized Nystr\"{o}m method, are computationally attractive, especially when $A(t)$ admits an affine linear decomposition with respect to $t$. We perform a probabilistic analysis for both algorithms, deriving bounds on the expected value as well as failure probabilities for the $L^2$ approximation error when using Gaussian random DRMs. Both the theoretical results and the numerical experiments show that the use of constant DRMs does not impair their effectiveness; our methods reliably return quasi-best low-rank approximations.
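
As a rough illustration (not the authors' implementation), the sketch below shows how the two algorithms described in the abstract might look with constant Gaussian DRMs shared across all parameter values. The callable interface `A(t)`, the function names, and the oversampling default `p=5` are assumptions made for this sketch only.

```python
import numpy as np

def randomized_svd_param(A, ts, r, p=5, seed=0):
    """Parameter-dependent randomized SVD with one constant Gaussian DRM.

    A is a callable returning the m x n matrix A(t); ts is a list of
    parameter values; r is the target rank, p an oversampling parameter.
    Returns rank-(r+p) factors (Q, B) with A(t) ~= Q @ B for each t.
    """
    rng = np.random.default_rng(seed)
    m, n = A(ts[0]).shape
    Omega = rng.standard_normal((n, r + p))   # single DRM, reused for every t
    factors = {}
    for t in ts:
        At = A(t)
        Q, _ = np.linalg.qr(At @ Omega)       # orthonormal basis of the sketch A(t) Omega
        factors[t] = (Q, Q.T @ At)            # A(t) ~= Q (Q^T A(t))
    return factors

def generalized_nystrom_param(A, ts, r, p=5, seed=0):
    """Parameter-dependent generalized Nystroem with constant DRMs Omega, Psi.

    Returns factors (A(t) Omega, X(t)) such that
    A(t) ~= (A(t) Omega) (Psi^T A(t) Omega)^+ (Psi^T A(t)) = (A(t) Omega) X(t).
    """
    rng = np.random.default_rng(seed)
    m, n = A(ts[0]).shape
    Omega = rng.standard_normal((n, r))       # right sketch, constant in t
    Psi = rng.standard_normal((m, r + p))     # left sketch, constant in t
    factors = {}
    for t in ts:
        At = A(t)
        AOmega, PsiA = At @ Omega, Psi.T @ At
        # Apply the pseudo-inverse of Psi^T A(t) Omega via a least-squares solve.
        X = np.linalg.lstsq(Psi.T @ AOmega, PsiA, rcond=None)[0]
        factors[t] = (AOmega, X)
    return factors
```

If $A(t)$ admits an affine linear decomposition $A(t) = f_1(t) A_1 + \dots + f_k(t) A_k$, the products $A_i \Omega$ (and $A_i^T \Psi$) could be precomputed once, so that each new parameter value only requires small linear-algebra operations; the sketch above does not exploit this and recomputes the sketches for every $t$.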

Related articles:
arXiv:2210.13160 [math.NA] (Published 2022-10-24)
Algorithms for geometrical operations with NURBS surfaces
arXiv:2308.05814 [math.NA] (Published 2023-08-10)
Randomized low-rank approximations beyond Gaussian random matrices
arXiv:2212.01127 [math.NA] (Published 2022-12-02)
Randomized low-rank approximation for symmetric indefinite matrices