arXiv Analytics

arXiv:2211.04956 [stat.ML]

A Characterization of List Learnability

Moses Charikar, Chirag Pabbaraju

Published 2022-11-07 (Version 1)

A classical result in learning theory shows the equivalence of PAC learnability of binary hypothesis classes and the finiteness of VC dimension. Extending this to the multiclass setting was an open problem, which was settled in a recent breakthrough result characterizing multiclass PAC learnability via the DS dimension introduced earlier by Daniely and Shalev-Shwartz. In this work we consider list PAC learning, where, instead of a single label, the learner outputs a list of $k$ candidate predictions and is correct whenever the true label appears in the list. List learning algorithms have been developed in several settings before, and indeed played an important role in the recent characterization of multiclass learnability. We ask: when is it possible to $k$-list learn a hypothesis class? We completely characterize $k$-list learnability in terms of a generalization of DS dimension that we call the $k$-DS dimension. Generalizing the recent characterization of multiclass learnability, we show that a hypothesis class is $k$-list learnable if and only if its $k$-DS dimension is finite.
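To make the list-learning objective concrete, here is a minimal illustrative sketch (not from the paper): an example counts as correctly handled if and only if the true label appears in the predicted $k$-list. The `top_k_list` rule below, which forms a list from hypothetical per-label scores, is just one possible way to produce such lists; the empirical 0/1 list loss is the quantity a $k$-list learner drives down.

```python
def top_k_list(scores, k):
    """Form a k-list prediction: the k labels with the highest scores.

    `scores` maps each candidate label to a real-valued score
    (a hypothetical scoring function, used only for illustration).
    """
    return sorted(scores, key=scores.get, reverse=True)[:k]

def empirical_list_error(predict, examples, k):
    """Empirical 0/1 list loss: the fraction of (x, y) examples whose
    true label y does NOT appear in the predicted k-list for x."""
    mistakes = sum(1 for x, y in examples if y not in predict(x, k))
    return mistakes / len(examples)

# Toy usage: a constant scorer over three labels, so every k-list is top-k.
scores = {"a": 0.5, "b": 0.3, "c": 0.2}
predict = lambda x, k: top_k_list(scores, k)
examples = [(0, "a"), (1, "c")]
print(top_k_list(scores, 2))                      # the 2-list ["a", "b"]
print(empirical_list_error(predict, examples, 2)) # "c" is missed: loss 0.5
```

Note that for $k = 1$ this recovers the usual multiclass 0/1 loss, which is why $k$-list learnability generalizes multiclass PAC learnability.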
