{ "id": "2209.02305", "version": "v1", "published": "2022-09-06T08:59:15.000Z", "updated": "2022-09-06T08:59:15.000Z", "title": "Rates of Convergence for Regression with the Graph Poly-Laplacian", "authors": [ "Nicolás García Trillos", "Ryan Murray", "Matthew Thorpe" ], "categories": [ "stat.ML", "cs.LG", "math.AP" ], "abstract": "In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularisation. Higher order regularity can be obtained via replacing the Laplacian regulariser with a poly-Laplacian regulariser. The methodology is readily adapted to graphs and here we consider graph poly-Laplacian regularisation in a fully supervised, non-parametric, noise corrupted, regression problem. In particular, given a dataset $\\{x_i\\}_{i=1}^n$ and a set of noisy labels $\\{y_i\\}_{i=1}^n\\subset\\mathbb{R}$ we let $u_n:\\{x_i\\}_{i=1}^n\\to\\mathbb{R}$ be the minimiser of an energy which consists of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When $y_i = g(x_i)+\\xi_i$, for iid noise $\\xi_i$, and using the geometric random graph, we identify (with high probability) the rate of convergence of $u_n$ to $g$ in the large data limit $n\\to\\infty$. Furthermore, our rate, up to logarithms, coincides with the known rate of convergence in the usual smoothing spline model.", "revisions": [ { "version": "v1", "updated": "2022-09-06T08:59:15.000Z" } ], "analyses": { "keywords": [ "convergence", "regression", "quadratic data fidelity penalty", "data fidelity term", "appropriately scaled graph poly-laplacian term" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }