arXiv Analytics

arXiv:2107.07564 [cs.LG]

On the Importance of Regularisation & Auxiliary Information in OOD Detection

John Mitros, Brian Mac Namee

Published 2021-07-15Version 1

Neural networks are often utilised in safety-critical domains (e.g., self-driving cars, financial markets, and aerospace engineering), even though they exhibit overconfident predictions for ambiguous inputs. This deficiency reveals a fundamental flaw: neural networks often overfit on spurious correlations. To address this problem, in this work we present two novel objectives that improve a network's ability to detect out-of-distribution samples and therefore avoid overconfident predictions for ambiguous inputs. We empirically demonstrate that our methods outperform the baseline and the majority of existing approaches, while remaining competitive with those they do not outperform. Additionally, we empirically demonstrate the robustness of our approach against common corruptions and the importance of regularisation and auxiliary information in out-of-distribution detection.
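The abstract does not specify the two objectives, but the baseline against which OOD detectors are typically compared is the maximum softmax probability (MSP) score: flag an input as out-of-distribution when the network's top class probability is low. The sketch below illustrates that baseline only, not the authors' method; the function names and the 0.7 threshold are illustrative assumptions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: higher means the network is
    # more confident the input is in-distribution.
    return softmax(logits).max(axis=-1)

def flag_ood(logits, threshold=0.7):
    # Inputs whose top softmax probability falls below the
    # (illustrative) threshold are flagged as potential OOD samples.
    return msp_score(logits) < threshold

# A sharply peaked logit vector (confident prediction) versus a
# nearly flat one (the "ambiguous input" case from the abstract).
confident = np.array([[8.0, 0.5, 0.2]])   # flag_ood -> [False]
ambiguous = np.array([[1.1, 1.0, 0.9]])   # flag_ood -> [True]
```

Overconfidence means this score can stay high even for ambiguous inputs; the regularisation and auxiliary-information objectives the paper proposes are aimed at making such confidence scores better separated between in- and out-of-distribution data.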

Related articles: Most relevant | Search more
arXiv:1708.00631 [cs.LG] (Published 2017-08-02)
On the Importance of Consistency in Training Deep Neural Networks
arXiv:1909.09868 [cs.LG] (Published 2019-09-21)
On the Importance of Delexicalization for Fact Verification
arXiv:2012.04550 [cs.LG] (Published 2020-12-08)
In-N-Out: Pre-Training and Self-Training using Auxiliary Information for Out-of-Distribution Robustness