arXiv Analytics

arXiv:1708.00631 [cs.LG]

On the Importance of Consistency in Training Deep Neural Networks

Chengxi Ye, Yezhou Yang, Cornelia Fermüller, Yiannis Aloimonos

Published 2017-08-02 (Version 1)

We explain that the difficulties of training deep neural networks come from a syndrome of three consistency issues, and this paper describes our efforts to analyze and treat them. The first issue is the inconsistency of training speed across layers. We propose to address it with an intuitive, simple-to-implement, low-footprint second-order method. The second issue is the scale inconsistency between layer inputs and layer residuals. We explain how second-order information makes this roadblock straightforward to remove. The third and most challenging issue is the inconsistency in residual propagation. Based on the fundamental theorem of linear algebra, we give a mathematical characterization of the well-known vanishing gradient problem, from which an important design principle for future optimization methods and neural network architectures is derived. We conclude the paper by constructing a novel contractive neural network.
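
To make the third issue concrete, here is a hedged sketch of how the fundamental theorem of linear algebra can characterize vanishing gradients; the notation ($W_l$ for the layer-$l$ weight matrix, $\delta_l$ for the back-propagated residual) is illustrative and need not match the paper's own formulation. Backpropagation sends the residual through the transposed weights:

\[
  \delta_l = W_l^{\top}\,\delta_{l+1},
  \qquad W_l \in \mathbb{R}^{m \times n},\;
  \delta_{l+1} \in \mathbb{R}^{m}.
\]

By the fundamental theorem of linear algebra, $\mathbb{R}^{m} = \operatorname{col}(W_l) \oplus \operatorname{null}(W_l^{\top})$, so the residual splits uniquely as

\[
  \delta_{l+1} = \delta_{l+1}^{\mathrm{col}} + \delta_{l+1}^{\mathrm{null}},
  \qquad
  \delta_{l+1}^{\mathrm{col}} \in \operatorname{col}(W_l),\;
  \delta_{l+1}^{\mathrm{null}} \in \operatorname{null}(W_l^{\top}),\;
  W_l^{\top}\,\delta_{l+1}^{\mathrm{null}} = 0.
\]

Under this reading, any residual mass lying in the left null space of $W_l$ is annihilated at that layer, and residuals that repeatedly align with these null spaces vanish as they propagate backward. How the paper turns this observation into its design principle and contractive network is detailed in the full text.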

Related articles:
arXiv:1909.09868 [cs.LG] (Published 2019-09-21)
On the Importance of Delexicalization for Fact Verification
arXiv:2102.13651 [cs.LG] (Published 2021-02-26)
On the Importance of Hyperparameter Optimization for Model-based Reinforcement Learning
Baohe Zhang et al.
arXiv:2305.14375 [cs.LG] (Published 2023-05-20)
Learning to Rank the Importance of Nodes in Road Networks Based on Multi-Graph Fusion