arXiv Analytics

arXiv:1504.08219 [cs.CV]

Hierarchical Subquery Evaluation for Active Learning on a Graph

Oisin Mac Aodha, Neill D. F. Campbell, Jan Kautz, Gabriel J. Brostow

Published 2015-04-30 (Version 1)

To train good supervised and semi-supervised object classifiers, it is critical that we not waste the time of the human experts who are providing the training labels. Existing active learning strategies can have uneven performance, being efficient on some datasets but wasteful on others, or inconsistent just between runs on the same dataset. We propose perplexity-based graph construction and a new hierarchical subquery evaluation algorithm to combat this variability, and to release the potential of Expected Error Reduction. Under some specific circumstances, Expected Error Reduction has been one of the strongest-performing informativeness criteria for active learning. Until now, it has also been prohibitively costly to compute for sizeable datasets. We demonstrate our highly practical algorithm, comparing it to other active learning measures on classification datasets that vary in sparsity, dimensionality, and size. Our algorithm is consistent over multiple runs and achieves high accuracy, while querying the human expert for labels at a frequency that matches their desired time budget.
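To make the Expected Error Reduction (EER) criterion mentioned above concrete, the sketch below shows the standard brute-force form of EER on a graph, using plain label propagation as a stand-in semi-supervised classifier. Everything here is an illustrative assumption rather than the paper's method: the function names (`propagate_labels`, `eer_score`), the toy Gaussian-affinity graph, and the parameters `alpha` and `n_iter` are invented for the example, and the exhaustive loop over every candidate node is exactly the cost that the paper's perplexity-based graph construction and hierarchical subquery evaluation are designed to avoid.

```python
# Minimal sketch of Expected Error Reduction (EER) for active learning on a graph.
# Label propagation is used as a simple semi-supervised classifier; the brute-force
# loop over all candidates is what makes naive EER expensive on sizeable datasets.
import numpy as np

def propagate_labels(W, labels, n_classes, alpha=0.99, n_iter=50):
    """Iterative label propagation on affinity matrix W.
    labels[i] is the class index for labelled nodes, -1 for unlabelled."""
    n = W.shape[0]
    D_inv = 1.0 / np.maximum(W.sum(axis=1), 1e-12)
    S = D_inv[:, None] * W                       # row-normalised transition matrix
    Y = np.zeros((n, n_classes))
    Y[labels >= 0, labels[labels >= 0]] = 1.0    # one-hot seeds for labelled nodes
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y
    return F / np.maximum(F.sum(axis=1, keepdims=True), 1e-12)  # class posteriors

def expected_error(F):
    """Risk proxy: total residual uncertainty, sum of (1 - max posterior) per node."""
    return np.sum(1.0 - F.max(axis=1))

def eer_score(W, labels, n_classes, q):
    """Expected error after querying node q, weighted by its current posterior."""
    F = propagate_labels(W, labels, n_classes)
    score = 0.0
    for y in range(n_classes):
        labels_hyp = labels.copy()
        labels_hyp[q] = y                        # hypothesise that q has label y
        F_hyp = propagate_labels(W, labels_hyp, n_classes)
        score += F[q, y] * expected_error(F_hyp)
    return score

# Toy usage: query the unlabelled node whose (hypothetical) label minimises expected error.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (10, 2)), rng.normal(4, 1, (10, 2))])
W = np.exp(-np.square(X[:, None] - X[None, :]).sum(-1))   # dense Gaussian affinities
np.fill_diagonal(W, 0.0)
labels = np.full(20, -1)
labels[0], labels[10] = 0, 1                               # one seed label per cluster
candidates = np.where(labels < 0)[0]
best = min(candidates, key=lambda q: eer_score(W, labels, 2, q))
print("query node", best)
```

Note the cost: each candidate requires one re-propagation per possible label, so scoring all candidates is quadratic-plus in the number of nodes. The paper's contribution is to keep EER's informativeness while evaluating only a hierarchy of subqueries, which is what makes it fit within a human expert's time budget.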

Related articles:
arXiv:2007.06364 [cs.CV] (Published 2020-07-13)
On uncertainty estimation in active learning for image segmentation
arXiv:2104.07791 [cs.CV] (Published 2021-04-15)
Learning User's confidence for active learning
arXiv:2104.09315 [cs.CV] (Published 2021-04-19)
A Mathematical Analysis of Learning Loss for Active Learning in Regression