arXiv Analytics


arXiv:2002.09779 [cs.LG]

Stochasticity in Neural ODEs: An Empirical Study

Viktor Oganesyan, Alexandra Volokhova, Dmitry Vetrov

Published 2020-02-22 (Version 1)

Stochastic regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization. Despite its success, continuous-time models, such as neural ordinary differential equations (ODEs), usually rely on a completely deterministic feed-forward operation. This work provides an empirical study of stochastically regularized neural ODEs on several image-classification tasks (CIFAR-10, CIFAR-100, TinyImageNet). Building upon the formalism of stochastic differential equations (SDEs), we demonstrate that a neural SDE is able to outperform its deterministic counterpart. Further, we show that data augmentation during training improves the performance of both the deterministic and stochastic versions of the same model. However, the improvements obtained with data augmentation completely eliminate the empirical gains of the stochastic regularization, making the difference in performance between neural ODEs and neural SDEs negligible.
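To make the SDE formalism concrete, below is a minimal sketch of a stochastically regularized continuous-time block: the hidden state follows dh = f(h, t) dt + sigma dW and is integrated with a fixed-step Euler-Maruyama scheme, with the noise injected only at training time. This is an illustrative assumption, not the authors' implementation; the paper's models are image classifiers, and the choice of solver, the constant additive diffusion, the class name NeuralSDEBlock, and all hyperparameters here are hypothetical.

```python
import torch
import torch.nn as nn

class NeuralSDEBlock(nn.Module):
    """Continuous-time block integrated with Euler-Maruyama:
    dh = f(h, t) dt + sigma dW. With sigma = 0 (or at eval time) this
    reduces to a plain neural ODE solved with fixed Euler steps."""

    def __init__(self, dim, hidden_dim=64, sigma=0.1, n_steps=10, t_final=1.0):
        super().__init__()
        # Drift network f(h, t); time t is appended as an extra input feature.
        self.drift = nn.Sequential(
            nn.Linear(dim + 1, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, dim),
        )
        self.sigma = sigma        # constant diffusion scale (assumed form)
        self.n_steps = n_steps    # number of Euler-Maruyama steps
        self.t_final = t_final    # integration horizon

    def forward(self, h):
        dt = self.t_final / self.n_steps
        t = torch.zeros(h.size(0), 1, device=h.device)
        for _ in range(self.n_steps):
            drift = self.drift(torch.cat([h, t], dim=1))
            h = h + drift * dt
            if self.training and self.sigma > 0:
                # Stochastic regularization: additive Brownian increment,
                # injected only during training (analogous to dropout noise).
                h = h + self.sigma * (dt ** 0.5) * torch.randn_like(h)
            t = t + dt
        return h

# Hypothetical usage on flattened features, e.g. as a block inside a classifier.
block = NeuralSDEBlock(dim=128)
features = torch.randn(32, 128)
out = block(features)  # shape (32, 128); stochastic in train mode, deterministic in eval mode
```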

Related articles:
arXiv:2206.14483 [cs.LG] (Published 2022-06-29)
Data augmentation for learning predictive models on EEG: a systematic comparison
arXiv:2203.16481 [cs.LG] (Published 2022-03-30)
On Uncertainty, Tempering, and Data Augmentation in Bayesian Classification
arXiv:2207.07875 [cs.LG] (Published 2022-07-16)
On the Importance of Hyperparameters and Data Augmentation for Self-Supervised Learning