arXiv:1503.02352 [math.NA]

Infinite-dimensional $\ell^1$ minimization and function approximation from pointwise data

Ben Adcock

Published 2015-03-09, Version 1

We consider the problem of approximating a function from finitely many pointwise samples using $\ell^1$ minimization techniques. In the first part of this paper, we introduce an infinite-dimensional approach to this problem. Three advantages of this approach are as follows. First, it provides interpolatory approximations in the absence of noise. Second, it does not require a priori bounds on the expansion tail in order to be implemented. In particular, the truncation strategy we introduce as part of this framework is completely independent of the function being approximated. Third, it allows one to explain the crucial role weights play in the minimization, namely, that of regularizing the problem and removing so-called aliasing phenomena. In the second part of this paper we present a worst-case error analysis for this approach. We provide a general recipe for analyzing the performance of such techniques for arbitrary deterministic sets of points. Finally, we apply this recipe to show that weighted $\ell^1$ minimization with Jacobi polynomials leads to an optimal, convergent method for approximating smooth, one-dimensional functions from scattered data.
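
The following is a minimal Python sketch of the kind of weighted $\ell^1$ minimization the abstract describes; it is not the paper's method. It assumes a Legendre basis (the Jacobi case $\alpha = \beta = 0$), an illustrative truncation degree N and weight choice $w_j = j + 1$, and uses the cvxpy modeling library to enforce the noiseless, interpolatory data constraints.

    # Sketch: weighted l^1 minimization for approximation from pointwise data.
    # Assumptions (not from the paper): Legendre basis, weights w_j = j + 1,
    # truncation degree N chosen independently of the target function.
    import numpy as np
    import cvxpy as cp

    def weighted_l1_approx(f, x_samples, N):
        """Recover Legendre coefficients of f on [-1, 1] from scattered
        samples by minimizing a weighted l^1 norm subject to interpolation."""
        y = f(x_samples)
        # A[i, j] = P_j(x_i) for degrees j = 0, ..., N.
        A = np.polynomial.legendre.legvander(x_samples, N)
        w = np.arange(1, N + 2)          # illustrative degree-growing weights
        c = cp.Variable(N + 1)
        # Noiseless, interpolatory formulation: the data are matched exactly.
        prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, c))),
                          [A @ c == y])
        prob.solve()
        return c.value

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = np.sort(rng.uniform(-1.0, 1.0, size=20))   # scattered data
        coeffs = weighted_l1_approx(np.exp, x, N=60)   # N exceeds #samples
        x_test = np.linspace(-1, 1, 200)
        approx = np.polynomial.legendre.legval(x_test, coeffs)
        print("max error:", np.max(np.abs(approx - np.exp(x_test))))

Because the constraint A c = y is enforced exactly, the recovered expansion interpolates the samples, mirroring the interpolatory property noted in the abstract; the weights penalize high-degree coefficients and play the regularizing, anti-aliasing role described above.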

Related articles:
arXiv:2303.05262 [math.NA] (Published 2023-03-09, updated 2023-04-17)
Fredholm integral equations for function approximation and the training of neural networks
arXiv:2311.18333 [math.NA] (Published 2023-11-30)
Spherical Designs for Function Approximation and Beyond
arXiv:1806.05492 [math.NA] (Published 2018-06-14)
Approximate and integrate: Variance reduction in Monte Carlo integration via function approximation