arXiv Analytics

arXiv:1911.11691 [cs.LG]

Emergent Structures and Lifetime Structure Evolution in Artificial Neural Networks

Siavash Golkar

Published 2019-11-26 (Version 1)

Motivated by the flexibility of biological neural networks, whose connectivity structure changes significantly over their lifetime, we introduce the Unstructured Recursive Network (URN) and demonstrate that it can exhibit similar flexibility during training via gradient descent. We show empirically that many of the neural network structures commonly used in practice today (including fully connected, locally connected, and residual networks of different depths and widths) can emerge dynamically from the same URN. These different structures can be derived using gradient descent on a single general loss function, where the structure of the data and the relative strengths of the various regulator terms determine the structure of the emergent network. We show that this loss function and the regulators arise naturally when considering the symmetries of the network as well as the geometric properties of the input data.
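The core mechanism described in the abstract — structure emerging from gradient descent on a regularized loss rather than being fixed in advance — can be illustrated in miniature. The sketch below is not the paper's URN: it is a plain L1-regularized linear model (all parameter names and values are illustrative) showing how a sparsity regulator prunes the connections that the data does not use, so that the effective connectivity pattern is an outcome of training.

```python
import numpy as np

# Toy illustration (not the paper's code): gradient descent on a loss with an
# L1 "regulator" term prunes connections the data does not use, so the
# effective connectivity emerges from training instead of being hard-wired.
rng = np.random.default_rng(0)
n_samples, n_units = 200, 6

# Only the first two input units actually drive the target.
X = rng.normal(size=(n_samples, n_units))
w_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ w_true

w = np.zeros(n_units)        # start with no structure: all weights at zero
lr, lam = 0.05, 0.1          # learning rate and regulator strength
for _ in range(500):
    grad = X.T @ (X @ w - y) / n_samples    # data-fitting term
    w -= lr * (grad + lam * np.sign(w))     # plus L1 subgradient

# Connections to the informative units survive (slightly shrunk by lam);
# the rest are driven toward zero, i.e. pruned away.
print(np.round(w, 2))
```

Varying the regulator strength `lam` trades off fit against sparsity, which loosely mirrors the abstract's point that the relative strengths of the regulator terms determine which structure emerges.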

Comments: Proceedings of the NeurIPS workshop on Real Neurons & Hidden Units; 5 pages, 6 figures
Categories: cs.LG, cs.NE, q-bio.NC, stat.ML
Related articles:
arXiv:2001.09040 [cs.LG] (Published 2020-01-24)
Estimation for Compositional Data using Measurements from Nonlinear Systems using Artificial Neural Networks
arXiv:2101.09957 [cs.LG] (Published 2021-01-25)
Activation Functions in Artificial Neural Networks: A Systematic Overview
arXiv:2108.01724 [cs.LG] (Published 2021-08-03)
Approximating Attributed Incentive Salience In Large Scale Scenarios. A Representation Learning Approach Based on Artificial Neural Networks