arXiv Analytics

arXiv:2212.02223 [stat.ML]

Limitations on approximation by deep and shallow neural networks

Guergana Petrova, Przemysław Wojtaszczyk

Published 2022-11-30 (version 1)

We prove Carl-type inequalities for the error of approximation of compact sets K by deep and shallow neural networks. These inequalities in turn give lower bounds on how well the functions in K can be approximated when the approximants are required to be outputs of such networks. Our results are obtained as a byproduct of the study of the recently introduced Lipschitz widths.
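For context, the classical Carl inequality (stated here in its standard operator form, relating entropy numbers to approximation numbers; the paper proves analogues of this type for approximation of compact sets by network outputs, so this is only an illustration of the shape of such inequalities):

```latex
% Classical Carl inequality: for a bounded linear operator T between
% Banach spaces, the entropy numbers e_k(T) are controlled by the
% approximation numbers a_k(T).
\[
  \sup_{1 \le k \le n} k^{\alpha}\, e_k(T)
  \;\le\; C_{\alpha} \sup_{1 \le k \le n} k^{\alpha}\, a_k(T),
  \qquad \alpha > 0,\; n \in \mathbb{N}.
\]
% Consequently, a rate a_k(T) = O(k^{-\alpha}) forces e_k(T) = O(k^{-\alpha});
% read contrapositively, slow decay of the entropy numbers of a compact set K
% yields lower bounds on how fast any method of the given complexity
% (here: deep or shallow networks) can approximate the functions in K.
```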

Related articles:
arXiv:2106.15002 [stat.ML] (Published 2021-06-28)
Characterization of the Variation Spaces Corresponding to Shallow Neural Networks
arXiv:2106.14997 [stat.ML] (Published 2021-06-28)
Sharp Lower Bounds on the Approximation Rate of Shallow Neural Networks
arXiv:2011.10487 [stat.ML] (Published 2020-11-20)
Normalization effects on shallow neural networks and related asymptotic expansions