arXiv:2209.02305 [stat.ML]

Rates of Convergence for Regression with the Graph Poly-Laplacian

Nicolás García Trillos, Ryan Murray, Matthew Thorpe

Published 2022-09-06 (Version 1)

In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularisation. Higher-order regularity can be obtained by replacing the Laplacian regulariser with a poly-Laplacian regulariser. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularisation in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset $\{x_i\}_{i=1}^n$ and a set of noisy labels $\{y_i\}_{i=1}^n\subset\mathbb{R}$, we let $u_n:\{x_i\}_{i=1}^n\to\mathbb{R}$ be the minimiser of an energy consisting of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When $y_i = g(x_i)+\xi_i$, for iid noise $\xi_i$, and using the geometric random graph, we identify (with high probability) the rate of convergence of $u_n$ to $g$ in the large data limit $n\to\infty$. Furthermore, our rate, up to logarithms, coincides with the known rate of convergence in the usual smoothing spline model.
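
A minimal numerical sketch may help fix ideas. The ε-neighbourhood graph with unit weights, the unnormalised graph Laplacian, and the particular values of the poly-degree $s$ and the scaling $\tau$ below are illustrative assumptions, not the paper's exact construction:

```python
# Sketch of graph poly-Laplacian regression (illustrative assumptions:
# eps-neighbourhood graph with unit weights, unnormalised graph Laplacian,
# poly-degree s and scaling tau chosen for demonstration only).
import numpy as np

rng = np.random.default_rng(0)
n, eps, s, tau = 500, 0.15, 2, 1e-5

# Geometric random graph on iid samples in [0, 1]^2.
x = rng.uniform(size=(n, 2))
dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=2)
W = ((dist < eps) & (dist > 0)).astype(float)  # adjacency matrix
L = np.diag(W.sum(axis=1)) - W                 # unnormalised graph Laplacian

# Noisy labels y_i = g(x_i) + xi_i.
g = np.sin(2 * np.pi * x[:, 0])
y = g + 0.1 * rng.standard_normal(n)

# Minimise (1/n) * sum_i (u_i - y_i)^2 + tau * u^T L^s u.
# First-order condition: (I + n * tau * L^s) u = y.
u = np.linalg.solve(np.eye(n) + n * tau * np.linalg.matrix_power(L, s), y)

print("mean squared error of u_n against g:", np.mean((u - g) ** 2))
```

The closed-form solve reflects that the energy is quadratic in $u$: setting its gradient to zero gives the linear system $(I + n\tau L^s)u = y$, so the estimator is a spectral shrinkage of the labels along the graph.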

Related articles:
arXiv:2409.06938 [stat.ML] (Published 2024-09-11)
k-MLE, k-Bregman, k-VARs: Theory, Convergence, Computation
arXiv:2009.00089 [stat.ML] (Published 2020-08-31)
Random Forest (RF) Kernel for Regression, Classification and Survival
arXiv:2409.18804 [stat.ML] (Published 2024-09-27)
Convergence of Diffusion Models Under the Manifold Hypothesis in High-Dimensions