arXiv:1402.1869 [stat.ML]

On the Number of Linear Regions of Deep Neural Networks

Guido Montúfar, Razvan Pascanu, Kyunghyun Cho, Yoshua Bengio

Published 2014-02-08, updated 2014-06-07 (Version 2)

We study the complexity of functions computable by deep feedforward neural networks with piecewise linear activations, in terms of the symmetries and the number of linear regions that they have. Deep networks are able to sequentially map portions of each layer's input space to the same output. In this way, deep models compute functions that react equally to complicated patterns of different inputs. The compositional structure of these functions enables them to reuse pieces of computation exponentially often in terms of the network's depth. This paper investigates the complexity of such compositional maps and contributes new theoretical results regarding the advantage of depth for neural networks with piecewise linear activation functions. In particular, our analysis is not specific to a single family of models; as an example, we employ it for rectifier and maxout networks. We improve complexity bounds from pre-existing work and investigate the behavior of units in higher layers.
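The exponential reuse of computation described in the abstract can be made concrete with a one-dimensional folding construction in the spirit of the paper's space-folding argument. The sketch below is illustrative Python, not the authors' code; the helpers fold, deep_fold, and count_linear_regions are hypothetical names chosen for this demonstration. Each two-unit rectifier layer folds the interval [0, 1] onto itself, so composing L such layers produces a piecewise linear function with 2^L linear regions, which the script verifies numerically.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def fold(x):
    # One hidden layer of two rectifier units computing the "hat" map
    # g(x) = 2*relu(x) - 4*relu(x - 0.5): g is 2x on [0, 1/2] and
    # 2 - 2x on [1/2, 1], so it folds [0, 1] onto itself.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def deep_fold(x, depth):
    # Compose `depth` fold layers; each composition doubles the number
    # of linear pieces, giving 2**depth pieces on [0, 1].
    for _ in range(depth):
        x = fold(x)
    return x

def count_linear_regions(f, samples=2**20 + 1):
    # Count linear pieces by detecting slope changes on a dyadic grid.
    # The grid points i / 2**20 include every breakpoint k / 2**depth
    # (for depth <= 20), so each kink is registered exactly once.
    xs = np.linspace(0.0, 1.0, samples)
    ys = f(xs)
    slopes = np.diff(ys) / np.diff(xs)
    changes = np.sum(~np.isclose(slopes[1:], slopes[:-1]))
    return int(changes) + 1

for depth in range(1, 11):
    found = count_linear_regions(lambda x: deep_fold(x, depth))
    print(f"depth {depth}: {found} linear regions (expected {2**depth})")

For comparison, a single rectifier layer with n units and one input can produce at most n + 1 linear regions, so matching the 2^L regions of the deep construction with a shallow network requires exponentially many units; this gap is the kind of depth advantage the paper quantifies for rectifier and maxout networks.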

Related articles:
arXiv:1611.08083 [stat.ML] (Published 2016-11-24)
Survey of Expressivity in Deep Neural Networks
arXiv:1508.04422 [stat.ML] (Published 2015-08-18)
Scalable Out-of-Sample Extension of Graph Embeddings Using Deep Neural Networks
arXiv:1509.07385 [stat.ML] (Published 2015-09-24)
Provable approximation properties for deep neural networks