arXiv:2402.07070 [math.OC]

Efficient Algorithms for Sum-of-Minimum Optimization

Lisang Ding, Ziang Chen, Xinshang Wang, Wotao Yin

Published 2024-02-11, updated 2024-06-09 (version 2)

In this work, we propose a novel optimization model termed "sum-of-minimum" optimization. This model seeks to minimize the sum or average of $N$ objective functions over $k$ parameters, where each objective is the minimum value that its predefined sub-function attains over the $k$ parameters. This universal framework encompasses numerous clustering applications in machine learning and related fields. We develop efficient algorithms for solving sum-of-minimum optimization problems, inspired by the randomized $k$-means++ initialization (Arthur & Vassilvitskii, 2007) and Lloyd's algorithm (Lloyd, 1982). We establish a new tight bound for the generalized initialization algorithm and prove a gradient-descent-like convergence rate for the generalized Lloyd's algorithm. The efficiency of our algorithms is examined numerically on multiple tasks, including generalized principal component analysis, mixed linear regression, and small-scale neural network training. Our approach compares favorably to previous ones based on simpler but less precise optimization reformulations.
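In symbols, with parameters $x_1, \dots, x_k$ and sub-functions $f_1, \dots, f_N$, the model described above can be written as follows (a reconstruction from the abstract's wording; the paper's exact notation may differ):

$$ \min_{x_1, \dots, x_k} \; \frac{1}{N} \sum_{i=1}^{N} \min_{1 \le j \le k} f_i(x_j). $$

Taking $f_i(x) = \|a_i - x\|^2$ for data points $a_i$ recovers the classic $k$-means objective, which is why $k$-means++ seeding and Lloyd's algorithm admit natural generalizations in this setting.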
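A minimal sketch of how $k$-means++-style seeding extends to sum-of-minimum problems, assuming a hypothetical oracle minimize(f) that returns an (approximate) minimizer of a single sub-function; the sampling weights mirror the $D^2$ weighting of Arthur & Vassilvitskii (2007), but this is an illustration, not the paper's stated procedure:

    import numpy as np

    def seed_params(fs, minimize, k, rng=None):
        # k-means++-style seeding for sum-of-minimum (illustrative sketch;
        # `minimize(f)` is an assumed oracle, and the sub-function values
        # are assumed nonnegative so they can serve as sampling weights).
        rng = np.random.default_rng() if rng is None else rng
        N = len(fs)
        # Seed the first parameter from a uniformly random sub-function.
        x = [minimize(fs[rng.integers(N)])]
        for _ in range(1, k):
            # Current loss of each objective under its best parameter so far.
            cost = np.array([min(f(xj) for xj in x) for f in fs])
            # Sample an index with probability proportional to its loss,
            # mirroring the D^2 weighting of k-means++.
            i = rng.choice(N, p=cost / cost.sum())
            x.append(minimize(fs[i]))
        return x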
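The abstract's "gradient-descent-like convergence rate" suggests an alternating scheme whose update step is gradient-based. The sketch below is one plausible such generalization of Lloyd's algorithm, with an illustrative fixed step size; the paper's actual update rule and step-size choices may differ:

    import numpy as np

    def generalized_lloyd(fs, grads, x, iters=100, lr=0.1):
        # Alternating scheme in the spirit of Lloyd's algorithm (sketch).
        # fs[i](x[j]) -> scalar sub-function value; grads[i](x[j]) -> its gradient.
        # x is a (k, d) array holding the k parameters.
        N, k = len(fs), x.shape[0]
        for _ in range(iters):
            # Assignment step: each objective picks its best parameter.
            vals = np.array([[fs[i](x[j]) for j in range(k)] for i in range(N)])
            assign = vals.argmin(axis=1)
            # Update step: one gradient step on each group's average loss.
            for j in range(k):
                idx = np.flatnonzero(assign == j)
                if idx.size:
                    g = sum(grads[i](x[j]) for i in idx) / idx.size
                    x[j] = x[j] - lr * g
        return x, assign

For $k$-means, $f_i(x) = \|a_i - x\|^2$ with gradient $2(x - a_i)$, and the fixed point of the update step is the cluster mean, so the sketch reduces to a gradient-step variant of the classic Lloyd iteration.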

Related articles
arXiv:1907.01543 [math.OC] (Published 2019-07-02)
Efficient Algorithms for Smooth Minimax Optimization
arXiv:2405.20744 [math.OC] (Published 2024-05-31)
On the sequential convergence of Lloyd's algorithms
arXiv:1309.6976 [math.OC] (Published 2013-09-26)
Efficient Algorithms for Robust and Stable Principal Component Pursuit Problems