{
  "id": "2308.13792",
  "version": "v1",
  "published": "2023-08-26T07:35:16.000Z",
  "updated": "2023-08-26T07:35:16.000Z",
  "title": "Out-of-distribution detection using normalizing flows on the data manifold",
  "authors": [
    "Seyedeh Fatemeh Razavi",
    "Mohammad Mahdi Mehmanchi",
    "Reshad Hosseini",
    "Mostafa Tavassolipour"
  ],
  "categories": [ "cs.LG", "cs.CV" ],
  "abstract": "A common approach to out-of-distribution detection involves estimating an underlying data distribution, which assigns a lower likelihood value to out-of-distribution data. Normalizing flows are likelihood-based generative models that provide tractable density estimation via dimension-preserving invertible transformations. Conventional normalizing flows are prone to fail at out-of-distribution detection because of the well-known curse-of-dimensionality problem of likelihood-based models. According to the manifold hypothesis, real-world data often lie on a low-dimensional manifold. This study investigates the effect of manifold learning using normalizing flows on out-of-distribution detection. We estimate the density on a low-dimensional manifold, coupled with the distance from the manifold, as criteria for out-of-distribution detection; individually, each criterion is insufficient for this task. Extensive experimental results show that manifold learning improves the out-of-distribution detection ability of a class of likelihood-based models known as normalizing flows. This improvement is achieved without modifying the model structure or using auxiliary out-of-distribution data during training.",
  "revisions": [
    {
      "version": "v1",
      "updated": "2023-08-26T07:35:16.000Z"
    }
  ],
  "analyses": {
    "keywords": [
      "data manifold",
      "low-dimensional manifold",
      "likelihood-based models",
      "out-of-distribution detection ability",
      "auxiliary out-of-distribution data"
    ],
    "note": {
      "typesetting": "TeX",
      "pages": 0,
      "language": "en",
      "license": "arXiv",
      "status": "editable"
    }
  }
}