arXiv:2009.06132 [cs.LG]
Complexity Measures for Neural Networks with General Activation Functions Using Path-based Norms
Published 2020-09-14 (Version 1)
A simple approach is proposed to obtain complexity controls for neural networks with general activation functions. The approach is motivated by approximating a general activation function with a one-dimensional ReLU network, which reduces the problem to controlling the complexity of ReLU networks. Specifically, we consider two-layer networks and deep residual networks, for which path-based norms are derived to control complexity. We also provide preliminary analyses of the function spaces induced by these norms and a priori estimates for the corresponding regularized estimators.
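The reduction described above can be illustrated concretely. The sketch below (an illustration, not the paper's construction) interpolates tanh with a one-hidden-layer ReLU network, then evaluates the standard two-layer path norm ‖θ‖_P = Σ_k |a_k|(‖w_k‖₁ + |b_k|); the paper's path-based norms for general activations may differ in detail.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def fit_relu_1d(f, knots):
    """Piecewise-linear interpolant of f as a 1D ReLU network:
    g(x) = f(x_0) + sum_k c_k * relu(x - x_k), exact at the knots."""
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)
    # c_0 = s_0, c_k = s_k - s_{k-1}: each unit switches on a new slope
    coeffs = np.diff(slopes, prepend=0.0)
    y0 = y[0]
    def g(x):
        return y0 + relu(np.subtract.outer(x, knots[:-1])) @ coeffs
    return g, coeffs

knots = np.linspace(-4.0, 4.0, 65)
g, coeffs = fit_relu_1d(np.tanh, knots)

# Approximation error of the ReLU surrogate on the interval
xs = np.linspace(-4.0, 4.0, 1001)
err = np.max(np.abs(g(xs) - np.tanh(xs)))

# Two-layer path norm: here a_k = c_k, w_k = 1, b_k = -x_k
path_norm = np.sum(np.abs(coeffs) * (1.0 + np.abs(knots[:-1])))
```

With 64 knots on [-4, 4] the surrogate matches tanh to within about 10⁻³ on the interval, so complexity statements proved for the ReLU surrogate transfer to the smooth activation up to this approximation error.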
Comments: 47 pages