arXiv Analytics

arXiv:2304.03720 [cs.LG]

Representer Theorems for Metric and Preference Learning: A Geometric Perspective

Peyman Morteza

Published 2023-04-07 (Version 1)

We study the metric and preference learning problem in Hilbert spaces and obtain a novel representer theorem for this simultaneous task. Our key observation is that the representer theorem can be formulated with respect to the norm induced by the inner product inherent in the problem structure. We further demonstrate how our framework applies to metric learning from triplet comparisons, where it yields a simple and self-contained representer theorem. In the case of reproducing kernel Hilbert spaces (RKHS), we show that the solution to the learning problem can be expressed using kernel terms, akin to classical representer theorems.
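As context for the kernel-expansion claim, recall the classical representer theorem (Kimeldorf-Wahba; Scholkopf-Herbrich-Smola): a regularized empirical risk minimizer over an RKHS lies in the span of the kernel sections at the training points. The second display below is only an illustrative sketch of what a kernelized metric analogue can look like; the feature map \Phi, the positive operator G, and the coefficient matrix B are expository assumptions, not the paper's exact statement.

% Classical representer theorem over an RKHS \mathcal{H} with kernel k:
\[
  f^\star \in \operatorname*{arg\,min}_{f \in \mathcal{H}}
    \sum_{i=1}^{n} \ell\bigl(f(x_i), y_i\bigr) + \lambda \lVert f \rVert_{\mathcal{H}}^{2}
  \;\Longrightarrow\;
  f^\star(\cdot) = \sum_{i=1}^{n} \alpha_i\, k(x_i, \cdot), \qquad \alpha \in \mathbb{R}^{n}.
\]

% Illustrative analogue for metric learning from triplets (x_a, x_p, x_n),
% read as "x_a should be closer to x_p than to x_n": if the learned
% operator G admits the finite expansion below, the metric is computable
% purely from kernel evaluations at the training points.
\[
  d_G^{2}(x, x') = \bigl\langle \Phi(x) - \Phi(x'),\; G\,(\Phi(x) - \Phi(x')) \bigr\rangle,
  \qquad
  G = \sum_{i,j=1}^{n} B_{ij}\, \Phi(x_i) \otimes \Phi(x_j).
\]

With this expansion, d_G^{2}(x, x') reduces to the finite sum \sum_{i,j} B_{ij} (k(x_i, x) - k(x_i, x'))(k(x_j, x) - k(x_j, x')), which is the sense in which such a solution is "expressed using kernel terms."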

Related articles:
arXiv:2501.08679 [cs.LG] (Published 2025-01-15)
Diagonal Over-parameterization in Reproducing Kernel Hilbert Spaces as an Adaptive Feature Model: Generalization and Adaptivity
arXiv:2408.04405 [cs.LG] (Published 2024-08-08)
Probabilistic energy forecasting through quantile regression in reproducing kernel Hilbert spaces
arXiv:1901.09206 [cs.LG] (Published 2019-01-26)
Witnessing Adversarial Training in Reproducing Kernel Hilbert Spaces