{ "id": "2210.09909", "version": "v1", "published": "2022-10-18T14:49:44.000Z", "updated": "2022-10-18T14:49:44.000Z", "title": "Uncertainty estimation for out-of-distribution detection in computational histopathology", "authors": [ "Lea Goetz" ], "comment": "8 pages, 3 figures", "categories": [ "cs.CV", "cs.LG" ], "abstract": "In computational histopathology, algorithms now outperform humans on a range of tasks, but to date none are employed for automated diagnosis in the clinic. Before algorithms can be involved in such high-stakes decisions, they need to \"know when they don't know\", i.e., they need to estimate their predictive uncertainty. This allows them to defer potentially erroneous predictions to a human pathologist, thereby increasing their safety. Here, we evaluate the predictive performance and calibration of several uncertainty estimation methods on clinical histopathology data. We show that a distance-aware uncertainty estimation method outperforms commonly used approaches such as Monte Carlo dropout and deep ensembles. However, we observe a drop in predictive performance and calibration on novel samples across all uncertainty estimation methods tested. We also investigate the use of uncertainty thresholding to reject out-of-distribution samples for selective prediction. We demonstrate the limitations of this approach and suggest areas for future research.", "revisions": [ { "version": "v1", "updated": "2022-10-18T14:49:44.000Z" } ], "analyses": { "keywords": [ "out-of-distribution detection", "distance-aware uncertainty estimation method outperforms", "predictive performance", "computational histopathology algorithms", "monte carlo dropout" ], "note": { "typesetting": "TeX", "pages": 8, "language": "en", "license": "arXiv", "status": "editable" } } }