arXiv:1911.05806 [cs.LG]

Coarse-Refinement Dilemma: On Generalization Bounds for Data Clustering

Yule Vaz, Rodrigo Fernandes de Mello, Carlos Henrique Grossi

Published 2019-11-13 (Version 1)

The Data Clustering (DC) problem is of central importance to Machine Learning (ML), given its usefulness in representing the structural similarities of data from input spaces. Unlike Supervised Machine Learning (SML), which relies on the theoretical frameworks of Statistical Learning Theory (SLT) and Algorithmic Stability (AS), DC has scarce literature on general-purpose learning guarantees, which hinders conclusive statements about how such algorithms should be designed and about the validity of their results. In this context, this manuscript introduces a new concept, based on multidimensional persistent homology, to analyze the conditions under which a clustering model is capable of generalizing data. As a first step, we propose a more general definition of the DC problem that relies on Topological Spaces rather than the metric spaces typically assumed in the literature. From that, we show that the DC problem presents a dilemma analogous to the Bias-Variance one, referred to here as the Coarse-Refinement (CR) dilemma. CR clarifies the contrast between: (i) highly refined partitions and clustering instability (overfitting); and (ii) overly coarse partitions and a lack of representativeness (underfitting). Consequently, the CR dilemma suggests the need for a relaxation of Kleinberg's richness axiom. Experimental results illustrate that multidimensional persistent homology supports the measurement of divergences among DC models, leading to a consistency criterion.
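The flavor of the coarse-versus-refined contrast can be sketched with 0-dimensional persistence, a much simpler object than the paper's multidimensional construction: for 0-dimensional homology of a Vietoris-Rips filtration, each connected component is born at scale 0 and dies at a single-linkage merge distance, so a dendrogram already encodes the barcode. The example below is an illustrative assumption (synthetic Gaussian blobs, SciPy single linkage), not the manuscript's method.

```python
# Hedged sketch: read a 0-dimensional persistence barcode off a
# single-linkage dendrogram. Long-lived bars correspond to coarse
# structure that survives coarsening; short bars correspond to fine
# "refinement" that dies quickly (noise-level detail).
import numpy as np
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(0)
# Three well-separated Gaussian blobs, 60 points each.
centers = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(60, 2)) for c in centers])

# Row i of Z merges two components at distance Z[i, 2]; for single
# linkage these merge distances are exactly the death times of the
# 0-dimensional features (all births occur at scale 0).
Z = linkage(X, method="single")
deaths = Z[:, 2]

# The last two (large) merge distances join the three blobs; every
# intra-blob merge happens at a much smaller scale.
long_bars = int(np.sum(deaths > 2.0))
print("long-lived components:", long_bars)  # prints 2 for three blobs
```

Under this reading, an over-refined partition keeps the short bars (unstable under resampling, akin to overfitting), while an over-coarse partition collapses even the long bars (loss of representativeness, akin to underfitting); comparing barcodes gives one concrete way to quantify divergence between clustering models.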

Comments: 52 pages (in which 5 pages contain references, 1 contains notation, 1 contains dictionary of terms, 2 contain proofs, 5 contain dataset images and 7 contain results)
Categories: cs.LG, stat.ML