arXiv:1901.02182 [stat.ML]

Comments on "Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?"

Talha Cihad Gulcu, Alper Gungor

Published 2019-01-08 (Version 1)

In a recently published paper [1], it is shown that deep neural networks (DNNs) with random Gaussian weights preserve the metric structure of the data, with the property that the distance between two data points shrinks more when the angle between them is smaller. We agree that the random projection setup considered in [1] preserves distances with high probability. However, in our view the relation between the angle of the data points and the output distances is the opposite: smaller angles result in weaker distance shrinkage. This leads us to conclude that Theorem 3 and Figure 5 in [1] are inaccurate. Hence, the use of random Gaussian weights in DNNs cannot provide the ability to perform universal classification or to treat in-class and out-of-class data differently. Consequently, the behavior of networks consisting solely of random Gaussian weights does not explain how DNNs achieve state-of-the-art results in a wide variety of problems.
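
The disputed relation between input angle and output-distance shrinkage can be probed numerically. Below is a minimal sketch of such a probe (ours, not an experiment from [1] or from this comment): it draws one random-Gaussian ReLU layer and prints the output-to-input distance ratio for unit vectors at several angles. The dimensions d and m, the N(0, 1/m) weight scaling, and the single-layer ReLU model are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 256, 4096  # input dimension and layer width (our choices, not from [1])

# One random-Gaussian ReLU layer: f(x) = max(Wx, 0), with W_ij ~ N(0, 1/m),
# so that E||Wx||^2 = ||x||^2 and the linear map roughly preserves distances.
W = rng.normal(scale=1.0 / np.sqrt(m), size=(m, d))

def relu(z):
    return np.maximum(z, 0.0)

def distance_ratio(theta):
    """||f(x) - f(y)|| / ||x - y|| for unit vectors x, y at angle theta."""
    x = np.zeros(d); x[0] = 1.0
    y = np.zeros(d); y[0] = np.cos(theta); y[1] = np.sin(theta)
    return np.linalg.norm(relu(W @ x) - relu(W @ y)) / np.linalg.norm(x - y)

for deg in (5, 30, 60, 90, 150):
    print(f"angle {deg:3d} deg -> output/input distance ratio "
          f"{distance_ratio(np.deg2rad(deg)):.3f}")
```

In runs of this sketch, the ratio decreases monotonically with the angle, from roughly 1/sqrt(2) ≈ 0.71 near 0 degrees toward 1/2 at 180 degrees; that is, smaller angles produce weaker shrinkage, which is the direction argued in this comment rather than the one stated in Theorem 3 of [1].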

Comments: A shortened version was submitted to IEEE Transactions on Signal Processing as a comment correspondence.
Categories: stat.ML, cs.LG
Related articles:
arXiv:1606.05340 [stat.ML] (Published 2016-06-16)
Exponential expressivity in deep neural networks through transient chaos
arXiv:1707.07287 [stat.ML] (Published 2017-07-23)
Learning uncertainty in regression tasks by deep neural networks
arXiv:1702.07790 [stat.ML] (Published 2017-02-24)
Activation Ensembles for Deep Neural Networks