
arXiv:2304.03096 [stat.ML]

Spectral Gap Regularization of Neural Networks

Edric Tam, David Dunson

Published 2023-04-06, Version 1

We introduce Fiedler regularization, a novel approach for regularizing neural networks that utilizes spectral/graphical information. Existing regularization methods often penalize weights in a global/uniform manner that ignores the connectivity structure of the neural network. We propose to use the Fiedler value of the neural network's underlying graph as a tool for regularization. We provide theoretical motivation for this approach via spectral graph theory. We demonstrate several properties of the Fiedler value that make it well-suited as a regularization tool. We provide an approximate, variational approach for faster computation during training. We provide an alternative formulation of this framework in the form of a structurally weighted $\text{L}_1$ penalty, thus linking our approach to sparsity induction. We provide uniform generalization error bounds for Fiedler regularization via a Rademacher complexity analysis. We perform experiments on datasets comparing Fiedler regularization with classical regularization methods such as dropout and weight decay. Results demonstrate the efficacy of Fiedler regularization. This is a journal extension of the conference paper by Tam and Dunson (2020).
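As a rough illustration of the core idea only (not the paper's variational approximation or its weighted $\text{L}_1$ formulation), the sketch below treats a feedforward network's units as graph nodes, uses the absolute connection weights between adjacent layers as edge weights, forms the graph Laplacian, and adds its second-smallest eigenvalue (the Fiedler value) to the training loss. The names `fiedler_penalty` and `reg_strength` are illustrative, not taken from the paper.

```python
import torch

def fiedler_penalty(weights):
    """Fiedler value (second-smallest Laplacian eigenvalue) of the graph whose
    nodes are the network's units and whose edge weights are the absolute
    values of the connection weights between adjacent layers."""
    # weights: list of (out_features, in_features) tensors for consecutive layers
    sizes = [weights[0].shape[1]] + [W.shape[0] for W in weights]
    offsets = [0]
    for s in sizes[:-1]:
        offsets.append(offsets[-1] + s)
    n = sum(sizes)
    A = weights[0].new_zeros(n, n)
    for k, W in enumerate(weights):
        r, c = offsets[k + 1], offsets[k]      # output-layer rows, input-layer cols
        A[r:r + W.shape[0], c:c + W.shape[1]] = W.abs()
        A[c:c + W.shape[1], r:r + W.shape[0]] = W.abs().t()
    L = torch.diag(A.sum(dim=1)) - A           # unnormalized graph Laplacian
    eigvals = torch.linalg.eigvalsh(L)         # eigenvalues in ascending order
    return eigvals[1]                          # Fiedler value

# Usage sketch: add the penalty to the task loss during training.
# loss = task_loss + reg_strength * fiedler_penalty([l.weight for l in linear_layers])
```

Penalizing the Fiedler value pushes the network's graph toward disconnection, which is what links this spectral view to the sparsity-inducing weighted $\text{L}_1$ formulation mentioned in the abstract.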

Comments: This is a journal extension of the ICML conference paper by Tam and Dunson (2020), arXiv:2003.00992
Categories: stat.ML, cs.LG
Related articles:
arXiv:2106.13682 [stat.ML] (Published 2021-06-25)
Prediction of Hereditary Cancers Using Neural Networks
arXiv:2003.00992 [stat.ML] (Published 2020-03-02)
Fiedler Regularization: Learning Neural Networks with Graph Sparsity
arXiv:1505.05424 [stat.ML] (Published 2015-05-20)
Weight Uncertainty in Neural Networks