arXiv Analytics

arXiv:2208.03209 [cs.CV]

Bias and Fairness in Computer Vision Applications of the Criminal Justice System

Sophie Noiret, Jennifer Lumetzberger, Martin Kampel

Published 2022-08-05Version 1

Discriminatory practices involving AI-driven police work have been the subject of much controversy in the past few years, with algorithms such as COMPAS, PredPol and ShotSpotter being accused of unfairly impacting minority groups. At the same time, the issues of fairness in machine learning, and in particular in computer vision, have been the subject of a growing number of academic works. In this paper, we examine how these areas intersect. We provide information on how these practices have come to exist and the difficulties in alleviating them. We then examine three applications currently in development to understand what risks they pose to fairness and how those risks can be mitigated.

Comments: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works
Categories: cs.CV