arXiv Analytics

arXiv:1906.03986 [cs.LG]

Unit Impulse Response as an Explainer of Redundancy in a Deep Convolutional Neural Network

Rachana Sathish, Debdoot Sheet

Published 2019-06-10 (Version 1)

Convolutional neural networks (CNNs) are generally designed with a heuristically chosen architecture and trained for a certain task. This often leaves the network overparameterized after learning and induces redundancy in the information flow paths within it. While this redundancy lends robustness and reliability to the network's performance, it comes at the increased cost of redundant computations. Several methods have been proposed that leverage metrics quantifying the redundancy in each layer. However, the layer-wise evaluation in these methods disregards the long-range redundancy that exists across depth on account of the distributed nature of the features learned by the model. In this paper, we propose (i) a mechanism to empirically demonstrate the robustness in performance of a CNN that arises from redundancy across its depth, and (ii) a method to identify the systemic redundancy in the response of a CNN across depth using the notion of unit impulse response. We subsequently demonstrate the use of these methods to interpret redundancy in a few example networks. These techniques provide better insight into the internal dynamics of a CNN.
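As a rough illustration of the unit-impulse probing idea described above (a minimal sketch, not the authors' implementation), the Python/PyTorch snippet below feeds a centred unit impulse into a toy CNN, records each convolutional layer's response with forward hooks, and compares responses across depth with cosine similarity. The toy architecture, the choice of layers probed, and the similarity measure are all illustrative assumptions.

# Hedged sketch (not the paper's code): probe a small CNN with a unit
# impulse input, collect per-layer responses, and compare them across
# depth as a crude indicator of redundant information flow paths.
import torch
import torch.nn as nn

# Toy CNN standing in for any trained model (hypothetical architecture).
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
)
model.eval()

# Unit impulse: a single 1 at the centre of an otherwise all-zero image.
impulse = torch.zeros(1, 1, 32, 32)
impulse[0, 0, 16, 16] = 1.0

# Record the output of every convolutional layer via forward hooks.
responses = {}
def make_hook(name):
    def hook(module, inputs, output):
        responses[name] = output.detach().flatten()
    return hook

for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        module.register_forward_hook(make_hook(name))

with torch.no_grad():
    model(impulse)

# Pairwise cosine similarity of impulse responses across depth; values
# near 1 suggest layers responding in a similar (possibly redundant) way.
names = list(responses)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a, b = responses[names[i]], responses[names[j]]
        sim = torch.nn.functional.cosine_similarity(a, b, dim=0)
        print(f"{names[i]} vs {names[j]}: cosine similarity = {sim.item():.3f}")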

Comments: Workshop on Explainable AI, CVPR 2019
Categories: cs.LG, stat.ML
Related articles:
arXiv:1901.07761 [cs.LG] (Published 2019-01-23)
A deep Convolutional Neural Network for topology optimization with strong generalization ability
arXiv:1811.00170 [cs.LG] (Published 2018-11-01)
PerceptionNet: A Deep Convolutional Neural Network for Late Sensor Fusion
arXiv:1805.10769 [cs.LG] (Published 2018-05-28)
Universality of Deep Convolutional Neural Networks