arXiv Analytics

arXiv:2212.12674 [math.NA]

Data-Driven Linear Complexity Low-Rank Approximation of General Kernel Matrices: A Geometric Approach

Difeng Cai, Edmond Chow, Yuanzhe Xi

Published 2022-12-24, Version 1

A general, {\em rectangular} kernel matrix may be defined as $K_{ij} = \kappa(x_i,y_j)$, where $\kappa(x,y)$ is a kernel function and $X=\{x_i\}_{i=1}^m$ and $Y=\{y_j\}_{j=1}^n$ are two sets of points. In this paper, we seek a low-rank approximation to a kernel matrix where the point sets $X$ and $Y$ are large and are not well-separated (e.g., the points in $X$ and $Y$ may be ``intermingled''). Such rectangular kernel matrices arise, for example, in Gaussian process regression, where $X$ corresponds to the training data and $Y$ to the test data; in this setting the points are often high-dimensional. Because the point sets are large, we must exploit the fact that the matrix arises from a kernel function and avoid forming the matrix explicitly, which rules out most algebraic techniques. In particular, we seek methods that scale linearly, i.e., with computational complexity $O(m)$ or $O(n)$ for a fixed accuracy or rank. The main idea in this paper is to {\em geometrically} select appropriate subsets of points to construct a low-rank approximation. An analysis in this paper guides how this selection should be performed.
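The idea summarized above can be sketched as a cross-type (CUR/Nyström-style) approximation $K \approx C\,U\,R$ built from selected point subsets. The sketch below is illustrative only: the farthest-point sampling rule and the Gaussian kernel are hypothetical stand-ins for the paper's actual geometric selection criterion and kernel; the point is that only $O((m+n)k)$ kernel entries are evaluated for rank $k$, and the full $m \times n$ matrix is never formed.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Example kernel: kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def farthest_point_sample(P, k, seed=0):
    # Greedy geometric selection: repeatedly pick the point farthest from
    # the points chosen so far (an illustrative stand-in for the paper's
    # geometric selection rule). Costs O(len(P) * k).
    rng = np.random.default_rng(seed)
    idx = [int(rng.integers(len(P)))]
    d = np.linalg.norm(P - P[idx[0]], axis=1)
    for _ in range(k - 1):
        idx.append(int(np.argmax(d)))
        d = np.minimum(d, np.linalg.norm(P - P[idx[-1]], axis=1))
    return np.array(idx)

def cross_low_rank(X, Y, k, kernel):
    # Cross approximation K ~ C @ U @ R with
    #   C = K[:, J]   (m x k), U = pinv(K[I, J])  (k x k), R = K[I, :]  (k x n),
    # where I, J index geometrically selected rows/columns. Only the thin
    # factors are ever evaluated, so cost is linear in m and n for fixed k.
    I = farthest_point_sample(X, k)
    J = farthest_point_sample(Y, k)
    C = kernel(X, Y[J])                      # m x k
    U = np.linalg.pinv(kernel(X[I], Y[J]))   # k x k (pinv truncates tiny singular values)
    R = kernel(X[I], Y)                      # k x n
    return C, U, R
```

For a smooth kernel on intermingled point sets, a modest $k$ already gives small relative error, which is the regime the paper's analysis is concerned with.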

Related articles: Most relevant | Search more
arXiv:2407.10600 [math.NA] (Published 2024-07-15)
Spectral Properties of Infinitely Smooth Kernel Matrices in the Single Cluster Limit, with Applications to Multivariate Super-Resolution
arXiv:2109.04330 [math.NA] (Published 2021-09-09)
On iterated interpolation
arXiv:1304.6991 [math.NA] (Published 2013-04-25)
A Geometric Approach Towards Momentum Conservation