arXiv:2108.12604 [cs.CV]

Threshold: Pruning Tool for Densely Connected Convolutional Networks

Rui-Yang Ju, Ting-Yu Lin, Jen-Shiun Chiang

Published 2021-08-28 (Version 1)

Deep neural networks have made significant progress in the field of computer vision. Recent studies have shown that the depth, width, and shortcut connections of neural network architectures play a crucial role in their performance. One of the most advanced architectures, DenseNet, achieves an excellent convergence rate through dense connections; however, it still has an obvious shortcoming in its memory usage. In this paper, we introduce a new pruning tool, the threshold, inspired by the threshold-voltage principle of the MOSFET. This work employs the method to connect blocks of different depths in different ways so as to reduce memory usage; the resulting network is denoted ThresholdNet. We compare ThresholdNet with other networks in terms of FLOPs and memory usage, and the experiments show that ThresholdNet uses 70% less memory than the original DenseNet.
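The abstract gives no implementation details, so the following is only a rough sketch of the threshold idea it describes: a connection is kept only if it clears a cutoff, by analogy with a MOSFET conducting only above its threshold voltage. The function name `threshold_prune`, the weight-level granularity, and the toy matrix are all assumptions for illustration; the paper's ThresholdNet applies the idea to DenseNet block connectivity rather than to individual weights.

```python
import numpy as np

def threshold_prune(weights, threshold):
    # Keep a connection only if its magnitude clears the threshold,
    # analogous to a MOSFET conducting only above its threshold voltage.
    # (Illustrative sketch, not the paper's actual ThresholdNet procedure.)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Toy weight matrix standing in for one dense connection's weights (assumed).
w = np.array([[0.1, -0.8],
              [0.6, -0.2]])
pruned, mask = threshold_prune(w, 0.5)
print(f"kept {int(mask.sum())} of {mask.size} connections")
```

Connections pruned this way need not be stored or computed, which is the mechanism by which a threshold on DenseNet's dense connections could translate into the memory savings the abstract reports.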
