arXiv Analytics

arXiv:2206.03927 [cond-mat.dis-nn]

Boundary between noise and information applied to filtering neural network weight matrices

Max Staats, Matthias Thamm, Bernd Rosenow

Published 2022-06-08 (Version 1)

Deep neural networks have been successfully applied to a broad range of problems where overparametrization yields weight matrices which are partially random. A comparison of weight matrix singular vectors to the Porter-Thomas distribution suggests that there is a boundary between randomness and learned information in the singular value spectrum. Inspired by this finding, we introduce an algorithm for noise filtering, which both removes small singular values and reduces the magnitude of large singular values to counteract the effect of level repulsion between the noise and the information part of the spectrum. For networks trained in the presence of label noise, we indeed find that the generalization performance improves significantly due to noise filtering.
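The abstract describes an SVD-based filter that discards small (noise) singular values and slightly shrinks the large (information-carrying) ones. The snippet below is a minimal sketch of that idea, assuming an illustrative cutoff rank `rank_cut` and a hypothetical shrinkage factor `shrink`; the paper derives its actual boundary and correction from the Porter-Thomas comparison, which is not reproduced here.

```python
import numpy as np

def filter_weight_matrix(W, rank_cut, shrink=0.95):
    """Sketch of SVD-based noise filtering of a weight matrix.

    rank_cut and shrink are illustrative parameters, not the criteria
    used in the paper: small singular values are removed entirely, and
    the retained large ones are mildly reduced to counteract level
    repulsion with the discarded noise bulk.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s_filtered = np.zeros_like(s)
    s_filtered[:rank_cut] = shrink * s[:rank_cut]  # keep and shrink the leading part
    return U @ np.diag(s_filtered) @ Vt

# Example usage on a random stand-in for a trained weight matrix
W = np.random.randn(256, 128)
W_filtered = filter_weight_matrix(W, rank_cut=10)
```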

Related articles:
arXiv:cond-mat/0306018 (Published 2003-06-02, updated 2006-09-20)
Processing of information in synchronously firing chains in networks of neurons
arXiv:1106.5862 [cond-mat.dis-nn] (Published 2011-06-29)
Comment on "Energy and information in Hodgkin-Huxley neurons"
arXiv:cond-mat/9709219 (Published 1997-09-19, updated 1997-11-09)
Stability of the replica symmetric solution for the information conveyed by a neural network