arXiv Analytics

arXiv:2203.07329 [math.NA]

Randomized algorithms for Tikhonov regularization in linear least squares

Maike Meier, Yuji Nakatsukasa

Published 2022-03-14, Version 1

We describe two sketching-based algorithms to efficiently solve regularized linear least squares problems. The algorithms compute preconditioners for $\min \|Ax-b\|^2_2 + \lambda \|x\|^2_2$, where $A\in\mathbb{R}^{m\times n}$ and $\lambda>0$ is a regularization parameter, such that LSQR converges in $\mathcal{O}(\log(1/\epsilon))$ iterations to accuracy $\epsilon$. We focus on the setting where the optimal regularization parameter is unknown and the system must be solved for a number of parameters $\lambda$. Our algorithms apply in both the underdetermined ($m\ll n$) and the overdetermined ($m\gg n$) setting. First, we propose a Cholesky-based sketch-to-precondition algorithm that uses a `partly exact' sketch and requires only one sketch for a set of $N$ regularization parameters $\lambda$. The complexity of solving for $N$ parameters is $\mathcal{O}(mn\log(\max(m,n)) +N(\min(m,n)^3 + mn\log(1/\epsilon)))$. Second, we introduce an algorithm that uses a sketch of size $\mathcal{O}(\text{sd}_{\lambda}(A))$ for the case where the statistical dimension satisfies $\text{sd}_{\lambda}(A)\ll\min(m,n)$. The proposed scheme does not require computing the Gram matrix, making it more stable than existing algorithms in this context. We can solve for $N$ values of $\lambda_i$ in $\mathcal{O}(mn\log(\max(m,n)) + \min(m,n)\,\text{sd}_{\min\lambda_i}(A)^2 + Nmn\log(1/\epsilon))$ operations.
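As a rough illustration of the sketch-to-precondition idea (not the paper's `partly exact' sketch or its reuse of one sketch across many $\lambda$), the following NumPy/SciPy sketch builds a triangular preconditioner from a Gaussian sketch of $A$ in the overdetermined case and solves the regularized problem with LSQR on the augmented system $[A; \sqrt{\lambda} I]$. The embedding dimension $s = 2n$ and the Gaussian sketch are illustrative assumptions; dimensions and tolerances are arbitrary.

```python
import numpy as np
from scipy.linalg import qr, solve_triangular
from scipy.sparse.linalg import lsqr, LinearOperator

rng = np.random.default_rng(0)
m, n, lam = 2000, 50, 1e-2            # overdetermined: m >> n
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Gaussian sketch with s ~ 2n rows (an assumption; faster subspace
# embeddings give the O(mn log max(m,n)) sketching cost in the abstract).
s = 2 * n
S = rng.standard_normal((s, m)) / np.sqrt(s)
SA = S @ A

# Preconditioner: R is the triangular factor of [SA; sqrt(lam) I],
# so R^T R = (SA)^T (SA) + lam I approximates A^T A + lam I.
R = qr(np.vstack([SA, np.sqrt(lam) * np.eye(n)]), mode="economic")[1]

# Right-preconditioned augmented operator: y -> [A; sqrt(lam) I] R^{-1} y.
def mv(y):
    z = solve_triangular(R, y)
    return np.concatenate([A @ z, np.sqrt(lam) * z])

def rmv(w):
    return solve_triangular(R, A.T @ w[:m] + np.sqrt(lam) * w[m:], trans="T")

M = LinearOperator((m + n, n), matvec=mv, rmatvec=rmv)
b_aug = np.concatenate([b, np.zeros(n)])

# LSQR on the well-conditioned operator, then undo the preconditioning.
y = lsqr(M, b_aug, atol=1e-12, btol=1e-12)[0]
x = solve_triangular(R, y)

# Sanity check against the (less stable) normal-equations solution.
x_ref = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
print(np.linalg.norm(x - x_ref) / np.linalg.norm(x_ref))
```

Because $R$ makes the augmented operator nearly orthonormal, LSQR needs only a handful of iterations; the expensive sketch is computed once, which is the property the paper's first algorithm extends to a whole set of regularization parameters.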

Related articles:
arXiv:1602.03307 [math.NA] (Published 2016-02-10)
Some matrix nearness problems suggested by Tikhonov regularization
arXiv:2012.14875 [math.NA] (Published 2020-12-29)
Estimating solution smoothness and data noise with Tikhonov regularization
arXiv:1901.10382 [math.NA] (Published 2019-01-29)
Tikhonov Regularization Within Ensemble Kalman Inversion