arXiv:2303.16296 [cs.CV]

Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels

Zifu Wang, Teodora Popordanoska, Jeroen Bertels, Robin Lemmens, Matthew B. Blaschko

Published 2023-03-28, Version 1

The soft Dice loss (SDL) has taken a pivotal role in many automated segmentation pipelines in the medical imaging community. In recent years, some of the reasons behind its superior performance have been uncovered and further optimizations have been explored. However, there is currently no implementation that supports its direct use in settings with soft labels. Hence, a synergy between SDL and research leveraging soft labels, also in the context of model calibration, is still missing. In this work, we introduce Dice semimetric losses (DMLs), which (i) are by design identical to SDL in the standard setting with hard labels, but (ii) can also be employed in settings with soft labels. Our experiments on the public QUBIQ, LiTS and KiTS benchmarks confirm the potential synergy of DMLs with soft labels (e.g., averaging, label smoothing, and knowledge distillation) over hard labels (e.g., majority voting and random selection). As a result, we obtain superior Dice scores and model calibration, which supports the wider adoption of DMLs in practice. Code is available at https://github.com/zifuwanggg/JDTLosses.
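Since the abstract hinges on why SDL misbehaves with soft labels while a semimetric variant does not, a worked sketch may help. The Python snippet below contrasts the standard soft Dice loss with an L1-based reformulation, |x - y|_1 / (|x|_1 + |y|_1), which coincides with SDL for hard labels but is minimized at pred == target even for soft targets. This is an illustrative sketch only: the function names are hypothetical, and the authoritative implementation is the linked JDTLosses repository.

import torch

def soft_dice_loss(pred, target, eps=1e-6):
    """Standard soft Dice loss: 1 - 2<x, y> / (|x|_1 + |y|_1)."""
    intersection = (pred * target).sum()
    return 1 - (2 * intersection + eps) / (pred.sum() + target.sum() + eps)

def dice_semimetric_loss(pred, target, eps=1e-6):
    """L1-based reformulation: |x - y|_1 / (|x|_1 + |y|_1).

    For a binary target y this equals the soft Dice loss, since
    |x - y|_1 = |x|_1 + |y|_1 - 2<x, y> whenever y is 0/1; unlike SDL,
    it also vanishes at pred == target when the target is soft.
    """
    return (pred - target).abs().sum() / (pred.sum() + target.sum() + eps)

# With a soft target, only the semimetric variant is zero at a perfect match.
soft = torch.tensor([0.9, 0.1, 0.8])
print(soft_dice_loss(soft, soft))        # ~0.19, nonzero despite pred == target
print(dice_semimetric_loss(soft, soft))  # ~0.0

The nonzero SDL value at a perfect soft match is the failure mode motivating the paper; the semimetric form removes it while leaving the hard-label behavior unchanged.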

Comments: Submitted to MICCAI 2023. Code is available at https://github.com/zifuwanggg/JDTLosses
Categories: cs.CV, cs.AI, cs.LG
Related articles:
arXiv:2303.06268 [cs.CV] (Published 2023-03-11, updated 2024-01-13)
Trust your neighbours: Penalty-based constraints for model calibration
arXiv:2207.06224 [cs.CV] (Published 2022-07-13)
Beyond Hard Labels: Investigating data label distributions
arXiv:2302.05666 [cs.CV] (Published 2023-02-11)
Jaccard Metric Losses: Optimizing the Jaccard Index with Soft Labels