arXiv:2308.13792 [cs.LG]

Out-of-distribution detection using normalizing flows on the data manifold

Seyedeh Fatemeh Razavi, Mohammad Mahdi Mehmanchi, Reshad Hosseini, Mostafa Tavassolipour

Published 2023-08-26, Version 1

A common approach to out-of-distribution detection is to estimate the underlying data distribution, which should assign lower likelihood values to out-of-distribution data. Normalizing flows are likelihood-based generative models that provide tractable density estimation via dimension-preserving invertible transformations. However, conventional normalizing flows often fail at out-of-distribution detection because of the well-known curse-of-dimensionality problem affecting likelihood-based models. According to the manifold hypothesis, real-world data often lie on a low-dimensional manifold. This study investigates the effect of manifold learning with normalizing flows on out-of-distribution detection. We estimate the density on a low-dimensional manifold and, alongside it, measure the distance from that manifold, using the two together as criteria for out-of-distribution detection; individually, each is insufficient for the task. Extensive experimental results show that manifold learning improves the out-of-distribution detection ability of a class of likelihood-based models known as normalizing flows. This improvement is achieved without modifying the model structure or using auxiliary out-of-distribution data during training.
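
To make the two criteria concrete, here is a minimal, self-contained Python sketch of the scoring idea. It is not the authors' implementation: a linear projection (PCA) stands in for the manifold-learning normalizing flow, a Gaussian stands in for the learned latent density, and the weighting `alpha` between the two criteria is an assumed hyperparameter.

```python
import numpy as np

# Sketch of the two OOD criteria described in the abstract:
#   (1) density of a point's projection onto a low-dimensional manifold,
#   (2) distance between the point and that projection (off-manifold error).
# PCA and a Gaussian are illustrative stand-ins for the trained flow.

rng = np.random.default_rng(0)

# In-distribution training data lying near a 2-D subspace of R^10.
latent = rng.normal(size=(1000, 2))
basis = np.linalg.qr(rng.normal(size=(10, 2)))[0]        # orthonormal 10x2
x_train = latent @ basis.T + 0.01 * rng.normal(size=(1000, 10))

# "Manifold learning": top principal directions of the training data.
mean = x_train.mean(axis=0)
_, _, vt = np.linalg.svd(x_train - mean, full_matrices=False)
W = vt[:2].T                                             # 10x2 manifold basis

def encode(x):
    return (x - mean) @ W                                # manifold coordinates

def decode(z):
    return z @ W.T + mean                                # lift back to R^10

# "Density on the manifold": Gaussian fit to the latent codes
# (a normalizing flow would give a more flexible exact log-density here).
z_train = encode(x_train)
mu, cov = z_train.mean(axis=0), np.cov(z_train.T)
cov_inv, logdet = np.linalg.inv(cov), np.linalg.slogdet(cov)[1]

def ood_score(x, alpha=1.0):
    """Higher score = more out-of-distribution; combines both criteria."""
    z = encode(x)
    d = z - mu
    nll = 0.5 * np.einsum("ij,jk,ik->i", d, cov_inv, d) + 0.5 * logdet
    recon = np.linalg.norm(x - decode(z), axis=1)        # distance from manifold
    return nll + alpha * recon

x_in = latent[:5] @ basis.T                              # on the manifold
x_out = 3.0 * rng.normal(size=(5, 10))                   # generic ambient points
print("in-dist:", ood_score(x_in))
print("OOD    :", ood_score(x_out))
```

In this toy setup, off-manifold points receive a large reconstruction term even when their projected latent code is likely, while on-manifold but atypical points are caught by the latent negative log-likelihood, which is why neither criterion suffices alone.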

Related articles:
arXiv:2204.08624 [cs.LG] (Published 2022-04-19)
Topology and geometry of data manifold in deep learning
arXiv:2006.01272 [cs.LG] (Published 2020-06-01)
Shapley-based explainability on the data manifold
arXiv:2210.07100 [cs.LG] (Published 2022-10-13)
Dissipative residual layers for unsupervised implicit parameterization of data manifolds