arXiv:1706.03301 [cs.LG]

Neural networks and rational functions

Matus Telgarsky

Published 2017-06-11 (Version 1)

Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree $O(\text{polylog}(1/\epsilon))$ which is $\epsilon$-close, and similarly for any rational function there exists a ReLU network of size $O(\text{polylog}(1/\epsilon))$ which is $\epsilon$-close. By contrast, polynomials need degree $\Omega(\text{poly}(1/\epsilon))$ to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
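The polylog-degree direction can be illustrated with a classical ingredient of results of this type: Newman's rational approximation of |x| on [-1, 1]. Since ReLU(x) = (x + |x|)/2, an ε-close rational approximation of |x| yields one for the ReLU. The sketch below is a numerical illustration under that assumption, not the paper's exact construction (which builds on Newton iteration); the names newman_abs and newman_relu are hypothetical. A polynomial least-squares baseline is included for contrast: its error decays only polynomially in the degree, mirroring the Ω(poly(1/ε)) lower bound, while Newman's rational error decays roughly like exp(-sqrt(n)).

```python
# Illustrative sketch (not the paper's construction): Newman's degree-n
# rational approximation of |x| on [-1, 1], converted to a ReLU
# approximation via ReLU(x) = (x + |x|) / 2. Error decays roughly like
# exp(-sqrt(n)), so degree O(log^2(1/eps)) suffices -- consistent with the
# polylog(1/eps) bound in the abstract. The Chebyshev polynomial fit is a
# baseline whose error decays only like 1/n, matching Omega(poly(1/eps)).
import numpy as np

def newman_abs(x, n):
    """Newman's degree-n rational approximation of |x| on [-1, 1]."""
    xi = np.exp(-1.0 / np.sqrt(n))
    nodes = xi ** np.arange(1, n + 1)  # xi, xi^2, ..., xi^n

    def p(t):
        return np.prod(t[..., None] + nodes, axis=-1)

    # r(x) = x * (p(x) - p(-x)) / (p(x) + p(-x)). The denominator is
    # positive on [-1, 1]: for x >= 0, |xi^k - x| <= xi^k + x gives
    # |p(-x)| <= p(x), and the denominator is an even function of x.
    return x * (p(x) - p(-x)) / (p(x) + p(-x))

def newman_relu(x, n):
    return (x + newman_abs(x, n)) / 2.0

xs = np.linspace(-1.0, 1.0, 20001)
relu = np.maximum(xs, 0.0)
for n in (4, 16, 64):
    rat_err = np.max(np.abs(newman_relu(xs, n) - relu))
    poly = np.polynomial.Chebyshev.fit(xs, relu, deg=n)  # polynomial baseline
    poly_err = np.max(np.abs(poly(xs) - relu))
    print(f"degree {n:3d}: rational err {rat_err:.1e}, polynomial err {poly_err:.1e}")
```

Running this should show the rational error shrinking far faster with degree than the polynomial error, the same separation the abstract states between polylog(1/ε) and poly(1/ε).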

Related articles:
arXiv:1602.04485 [cs.LG] (Published 2016-02-14)
Benefits of depth in neural networks
arXiv:1706.02690 [cs.LG] (Published 2017-06-08)
Principled Detection of Out-of-Distribution Examples in Neural Networks
arXiv:1811.12273 [cs.LG] (Published 2018-11-29)
On the Transferability of Representations in Neural Networks Between Datasets and Tasks