arXiv Analytics


arXiv:2203.12532 [math.NA]

Stability of convergence rates: Kernel interpolation on non-Lipschitz domains

Tizian Wenzel, Gabriele Santin, Bernard Haasdonk

Published 2022-03-23 (Version 1)

Error estimates for kernel interpolation in Reproducing Kernel Hilbert Spaces (RKHS) usually assume quite restrictive properties on the shape of the domain, especially in the case of infinitely smooth kernels like the popular Gaussian kernel. In this paper we leverage an analysis of greedy kernel algorithms to prove that it is possible to obtain convergence results (in the number of interpolation points) for kernel interpolation on arbitrary domains $\Omega \subset \mathbb{R}^d$, thus allowing for non-Lipschitz domains including, e.g., cusps and irregular boundaries. In particular, we show that, when passing to a smaller domain $\tilde{\Omega} \subset \Omega \subset \mathbb{R}^d$, the convergence rate does not deteriorate, i.e., the convergence rates are stable with respect to restriction to a subset. The impact of this result is illustrated for kernels of finite as well as infinite smoothness, such as the Gaussian kernel. A comparison to approximation in Sobolev spaces is drawn, where the shape of the domain $\Omega$ does have an impact on the approximation properties. Numerical experiments illustrate and confirm the theoretical findings.
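As an informal illustration of the setting (not taken from the paper), the following Python sketch interpolates a smooth function with the Gaussian kernel on points sampled from a cusp-shaped, non-Lipschitz domain and reports the maximum error as the number of interpolation points grows. The domain, target function, point selection, kernel width and the small regularization term are illustrative assumptions, not the authors' setup.

# Minimal sketch: Gaussian kernel interpolation on a non-Lipschitz (cusp) domain
# Omega = {(x, y) in [0,1] x [-1,1] : |y| <= x^2}. All choices below are illustrative.
import numpy as np

def gaussian_kernel(X, Y, eps=3.0):
    # Kernel matrix k(x, y) = exp(-eps^2 * ||x - y||^2)
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-eps ** 2 * d2)

def kernel_interpolant(X_train, f_train, eps=3.0):
    # Solve for coefficients of s(x) = sum_j c_j k(x, x_j);
    # a tiny diagonal shift stabilizes the ill-conditioned kernel matrix.
    K = gaussian_kernel(X_train, X_train, eps)
    coeffs = np.linalg.solve(K + 1e-10 * np.eye(len(X_train)), f_train)
    return lambda X: gaussian_kernel(X, X_train, eps) @ coeffs

rng = np.random.default_rng(0)

# Rejection-sample candidate points from the cusp-shaped domain.
pts = rng.uniform(0.0, 1.0, size=(20000, 2))
pts[:, 1] = 2.0 * pts[:, 1] - 1.0                    # map y to [-1, 1]
Omega = pts[np.abs(pts[:, 1]) <= pts[:, 0] ** 2]     # keep points with |y| <= x^2

f = lambda X: np.sin(4.0 * X[:, 0]) * np.cos(3.0 * X[:, 1])   # illustrative target

for n in [10, 20, 40, 80]:
    X_train = Omega[:n]              # simple (non-greedy) point choice, for brevity
    s = kernel_interpolant(X_train, f(X_train))
    err = np.max(np.abs(s(Omega) - f(Omega)))
    print(f"n = {n:3d}, max error on the cusp domain: {err:.2e}")

In the spirit of the paper's stability result, the same error computation could be repeated on a subset of Omega (e.g., the points with x <= 0.5) to observe that the decay of the error in n is not worse there.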
