arXiv:2012.06969 [stat.ML]
Predicting Generalization in Deep Learning via Local Measures of Distortion
Abhejit Rajagopal, Vamshi C. Madala, Shivkumar Chandrasekaran, Peder E. Z. Larson
Published 2020-12-13, updated 2020-12-16 (Version 2)
We study generalization in deep learning by appealing to complexity measures originally developed in approximation and information theory. While these concepts are challenged by the high-dimensional and data-defined nature of deep learning, we show that simple vector quantization approaches such as PCA, GMMs, and SVMs capture their spirit when applied layer-wise to deep-extracted features, giving rise to relatively inexpensive complexity measures that correlate well with generalization performance. We discuss our results in the context of the 2020 NeurIPS PGDL challenge.
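As a rough illustration of the kind of layer-wise, quantization-based measure the abstract describes, the following minimal Python sketch fits a PCA per layer and uses reconstruction error as a local distortion score; the function names, the choice of reconstruction error as the criterion, and the simple averaging over layers are assumptions for illustration, not the authors' exact procedure.

# Minimal sketch of a layer-wise PCA distortion measure (illustrative only).
import numpy as np
from sklearn.decomposition import PCA

def layer_distortion(features, n_components=32):
    """Quantize one layer's activations with PCA and return the mean
    reconstruction error, a crude local measure of distortion."""
    pca = PCA(n_components=min(n_components, features.shape[1]))
    reduced = pca.fit_transform(features)
    reconstructed = pca.inverse_transform(reduced)
    return float(np.mean(np.sum((features - reconstructed) ** 2, axis=1)))

def complexity_measure(layerwise_features):
    """Aggregate per-layer distortions into a single scalar that could be
    correlated with generalization gap across a collection of trained models."""
    return float(np.mean([layer_distortion(f) for f in layerwise_features]))

# Usage: `layerwise_features` is assumed to be a list of (n_samples, n_features)
# activation matrices extracted from a trained network on held-out inputs.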
Comments: Added preprint footnote
Related articles:
arXiv:1804.10988 [stat.ML] (Published 2018-04-29)
SHADE: Information-Based Regularization for Deep Learning
arXiv:1805.05814 [stat.ML] (Published 2018-05-14)
SHADE: Information-Based Regularization for Deep Learning
arXiv:2004.13612 [stat.ML] (Published 2020-04-28)
Denise: Deep Learning based Robust PCA for Positive Semidefinite Matrices