arXiv:2102.06064 [cs.LG]

Uncertainty Propagation in Convolutional Neural Networks: Technical Report

Christos Tzelepis, Ioannis Patras

Published 2021-02-11 (Version 1)

In this technical report, we study the propagation of uncertainty (in terms of the variances of given univariate normal random variables) through typical building blocks of a Convolutional Neural Network (CNN). These include layers that perform linear operations, such as 2D convolution, fully-connected, and average pooling layers, as well as layers that act non-linearly on their input, such as the Rectified Linear Unit (ReLU). Finally, we discuss the sigmoid function, for which we give approximations of its first- and second-order moments, as well as the binary cross-entropy loss function, for which we approximate its expected value under normal random inputs.
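As a minimal illustration of the kind of moment propagation the abstract describes, the sketch below propagates means and variances of independent Gaussian inputs through an affine (fully-connected) layer and a ReLU, and adds a probit-style approximation of the sigmoid mean. This is a sketch under an independence assumption, not the report's reference implementation (see the linked uacnn repository for that); the function names are ours, and the report's own sigmoid approximation need not coincide with the one shown here.

import math
import torch

def linear_moments(mu, var, weight, bias=None):
    # Exact propagation through an affine map y = x W^T + b for independent
    # Gaussian inputs: E[y] = E[x] W^T + b, Var[y] = Var[x] (W**2)^T.
    out_mu = mu @ weight.t()
    if bias is not None:
        out_mu = out_mu + bias
    out_var = var @ (weight ** 2).t()
    return out_mu, out_var

def relu_moments(mu, var):
    # Closed-form first and second moments of ReLU(X) for X ~ N(mu, var),
    # expressed via the standard normal pdf and cdf.
    sigma = var.clamp_min(1e-12).sqrt()
    z = mu / sigma
    cdf = 0.5 * (1.0 + torch.erf(z / math.sqrt(2.0)))
    pdf = torch.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    out_mu = mu * cdf + sigma * pdf
    second_moment = (mu ** 2 + var) * cdf + mu * sigma * pdf
    out_var = (second_moment - out_mu ** 2).clamp_min(0.0)
    return out_mu, out_var

def sigmoid_mean(mu, var):
    # A common probit-style approximation of E[sigmoid(X)] for X ~ N(mu, var);
    # used here only as a placeholder for the report's own approximation.
    return torch.sigmoid(mu / torch.sqrt(1.0 + math.pi * var / 8.0))

For example, with mu and var of shape (batch, in_features) and weight of shape (out_features, in_features), linear_moments returns mean and variance tensors of shape (batch, out_features), which can then be passed to relu_moments.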

Comments: A PyTorch implementation is available under the MIT license here: https://github.com/chi0tzp/uacnn
Categories: cs.LG
Related articles:
arXiv:2206.13100 [cs.LG] (Published 2022-06-27)
Zero Stability Well Predicts Performance of Convolutional Neural Networks
arXiv:1912.03789 [cs.LG] (Published 2019-12-08)
Feature Engineering Combined with 1 D Convolutional Neural Network for Improved Mortality Prediction
arXiv:2107.05941 [cs.LG] (Published 2021-07-13)
Multi-Scale Label Relation Learning for Multi-Label Classification Using 1-Dimensional Convolutional Neural Networks