arXiv:2403.12187 [stat.ML]

Approximation of RKHS Functionals by Neural Networks

Tian-Yi Zhou, Namjoon Suh, Guang Cheng, Xiaoming Huo

Published 2024-03-18 (Version 1)

Motivated by the abundance of functional data such as time series and images, there has been growing interest in integrating such data into neural networks and learning maps from function spaces to R (i.e., functionals). In this paper, we study the approximation of functionals on reproducing kernel Hilbert spaces (RKHS's) by neural networks. We establish the universality of such approximations and derive explicit error bounds for the RKHS's induced by inverse multiquadric, Gaussian, and Sobolev kernels. Moreover, we apply our findings to functional regression, proving that neural networks can accurately approximate the regression maps in generalized functional linear models. Existing works on functional learning require integration-type basis function expansions with a set of pre-specified basis functions. By leveraging interpolating orthogonal projections in RKHS's, our proposed network is much simpler: it replaces basis function expansions with point evaluations of the input function.
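
The abstract's key architectural point, replacing basis function expansions with point evaluations, can be illustrated with a short sketch. The following is a minimal toy example, not the authors' construction: it samples random functions from a Gaussian-kernel RKHS, evaluates each at a fixed grid of points, and trains an ordinary MLP to approximate the toy functional F(f) = integral of f(t)^2 over [0,1] from those evaluations. The target functional, network size, and all hyperparameters are assumptions chosen for illustration.

```python
# Minimal sketch (not the paper's code): approximate a functional F(f)
# from point evaluations of f, where f is a Gaussian-kernel RKHS element.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
m = 32                              # number of evaluation points (assumption)
grid = np.linspace(0.0, 1.0, m)     # fixed evaluation points t_1, ..., t_m

def sample_rkhs_function(n_centers=10, gamma=20.0):
    """Random f(t) = sum_j c_j * exp(-gamma * (t - s_j)^2), a Gaussian-kernel RKHS element."""
    centers = rng.uniform(0.0, 1.0, n_centers)
    coefs = rng.normal(0.0, 1.0, n_centers)
    def f(t):
        return np.sum(coefs * np.exp(-gamma * (t[:, None] - centers) ** 2), axis=1)
    return f

def make_batch(batch_size=256):
    """Pairs (point evaluations of f, F(f)) for the toy target F(f) = int_0^1 f(t)^2 dt."""
    X = np.empty((batch_size, m))
    y = np.empty(batch_size)
    fine = np.linspace(0.0, 1.0, 1000)        # fine grid for the target integral
    for i in range(batch_size):
        f = sample_rkhs_function()
        X[i] = f(grid)                        # network input: point evaluations of f
        y[i] = np.mean(f(fine) ** 2)          # Riemann approximation of the integral
    return (torch.tensor(X, dtype=torch.float32),
            torch.tensor(y, dtype=torch.float32))

# An ordinary MLP acting on the vector (f(t_1), ..., f(t_m)); no basis expansion.
net = nn.Sequential(
    nn.Linear(m, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(500):                       # plain regression on (evaluations, F(f)) pairs
    X, y = make_batch()
    loss = nn.functional.mse_loss(net(X).squeeze(-1), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The design point this sketch makes concrete is that the network's input layer consumes raw function values at fixed points, so no pre-specified basis or integration against basis functions is needed to turn f into a finite-dimensional input.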

Related articles:
arXiv:2007.12826 [stat.ML] (Published 2020-07-25)
The Interpolation Phase Transition in Neural Networks: Memorization and Generalization under Lazy Training
arXiv:2205.08609 [stat.ML] (Published 2022-05-17)
Bagged Polynomial Regression and Neural Networks
arXiv:2009.13500 [stat.ML] (Published 2020-09-28)
A priori estimates for classification problems using neural networks