arXiv Analytics


arXiv:1810.13098 [cs.LG]

Low-Rank Embedding of Kernels in Convolutional Neural Networks under Random Shuffling

Chao Li, Zhun Sun, Jinshi Yu, Ming Hou, Qibin Zhao

Published 2018-10-31 (Version 1)

Although convolutional neural networks (CNNs) have recently become popular for various image processing and computer vision tasks, reducing the storage cost of their parameters on resource-limited platforms remains a challenging problem. In previous studies, tensor decomposition (TD) has achieved promising compression performance by embedding the kernel of a convolutional layer into a low-rank subspace. However, TD has so far been applied naively to the kernel or to hand-specified variants of it. Unlike these conventional approaches, this paper shows that the kernel can be embedded into more general, or even random, low-rank subspaces. We demonstrate this by compressing convolutional layers via randomly-shuffled tensor decomposition (RsTD) on a standard classification task using CIFAR-10. In addition, we analyze how the spatial similarity of the training data influences the low-rank structure of the kernels. The experimental results show that the CNN can be significantly compressed even when the kernels are randomly shuffled. Furthermore, the RsTD-based method yields more stable classification accuracy than conventional TD-based methods over a wide range of compression ratios.
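The abstract does not spell out the RsTD procedure itself, so the following is a minimal NumPy sketch of one plausible reading: fix a random permutation of the kernel entries, apply a low-rank approximation to the shuffled kernel (a truncated SVD of one matricization stands in here for a full tensor decomposition), and invert the shuffle. The function rstd_compress and its parameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rstd_compress(kernel, rank, seed=0):
    """Hypothetical simplification of RsTD: shuffle kernel entries with a
    fixed random permutation, matricize, truncate via SVD, un-shuffle."""
    rng = np.random.default_rng(seed)
    flat = kernel.reshape(-1)
    perm = rng.permutation(flat.size)              # fixed random shuffle
    shuffled = flat[perm]

    # Matricize as (out_channels) x (in_channels * kh * kw)
    mat = shuffled.reshape(kernel.shape[0], -1)
    U, s, Vt = np.linalg.svd(mat, full_matrices=False)
    low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # rank-r approximation

    # Invert the shuffle so the result is usable by the original conv layer
    inv = np.empty_like(perm)
    inv[perm] = np.arange(perm.size)
    return low_rank.reshape(-1)[inv].reshape(kernel.shape)

# Example: approximate a 64x64x3x3 conv kernel at rank 8 and report the
# relative reconstruction error
kernel = np.random.randn(64, 64, 3, 3)
approx = rstd_compress(kernel, rank=8)
print(np.linalg.norm(kernel - approx) / np.linalg.norm(kernel))
```

Under this reading, the compression ratio is governed by rank: the truncated factors U[:, :rank], s[:rank], and Vt[:rank] are what would be stored in place of the full kernel, together with the permutation seed.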

Related articles:
arXiv:1707.09641 [cs.LG] (Published 2017-07-30)
Visual Explanations for Convolutional Neural Networks via Input Resampling
arXiv:1802.08250 [cs.LG] (Published 2018-02-22)
Overcoming Catastrophic Forgetting in Convolutional Neural Networks by Selective Network Augmentation
arXiv:1606.09375 [cs.LG] (Published 2016-06-30)
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering