arXiv Analytics

arXiv:1907.01543 [math.OC]

Efficient Algorithms for Smooth Minimax Optimization

Kiran Koshy Thekumparampil, Prateek Jain, Praneeth Netrapalli, Sewoong Oh

Published 2019-07-02 (Version 1)

This paper studies first-order methods for solving smooth minimax optimization problems $\min_x \max_y g(x,y)$, where $g(\cdot,\cdot)$ is smooth and $g(x,\cdot)$ is concave for each $x$. In terms of $g(\cdot,y)$, we consider two settings -- strongly convex and nonconvex -- and improve upon the best known rates in both. When $g(\cdot, y)$ is strongly convex for every $y$, we propose a new algorithm combining Mirror-Prox and Nesterov's AGD, and show that it finds the global optimum at a rate of $\tilde{O}(1/k^2)$ after $k$ iterations, improving over the current state-of-the-art rate of $O(1/k)$. Combining this result with an inexact proximal point method, we obtain an $\tilde{O}(1/k^{1/3})$ rate for finding stationary points in the nonconvex setting, where $g(\cdot, y)$ can be nonconvex; this improves over the current best-known rate of $O(1/k^{1/5})$. Finally, we instantiate our result for finite nonconvex minimax problems, i.e., $\min_x \max_{1\leq i\leq m} f_i(x)$ with nonconvex $f_i(\cdot)$, to obtain a convergence rate of $O(m(\log m)^{3/2}/k^{1/3})$ in terms of total gradient evaluations for finding a stationary point.
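As a concrete illustration of the smooth strongly-convex-concave setting, the sketch below runs the extragradient method (Euclidean Mirror-Prox, the baseline scheme the paper accelerates, not the proposed algorithm itself) on a hypothetical toy problem $g(x,y)=\tfrac{1}{2}x^2 + xy - \tfrac{1}{2}y^2$, whose saddle point is $(0,0)$; the step size and iteration count are illustrative assumptions.

```python
# Toy smooth minimax problem, strongly convex in x and concave in y:
#   g(x, y) = 0.5*x**2 + x*y - 0.5*y**2,  saddle point at (0, 0).
# Extragradient (Euclidean Mirror-Prox) sketch; eta and iters are
# illustrative choices, not values prescribed by the paper.

def grad_x(x, y):
    return x + y          # partial derivative of g with respect to x

def grad_y(x, y):
    return x - y          # partial derivative of g with respect to y

def extragradient(x0, y0, eta=0.1, iters=200):
    x, y = x0, y0
    for _ in range(iters):
        # extrapolation (half) step: descend in x, ascend in y
        xh = x - eta * grad_x(x, y)
        yh = y + eta * grad_y(x, y)
        # update step uses gradients at the extrapolated point
        x = x - eta * grad_x(xh, yh)
        y = y + eta * grad_y(xh, yh)
    return x, y

x_star, y_star = extragradient(1.0, -1.0)
print(x_star, y_star)  # both iterates approach the saddle point (0, 0)
```

The extrapolation step is what distinguishes Mirror-Prox from plain gradient descent-ascent, which can cycle on such problems; evaluating gradients at the look-ahead point damps the rotation around the saddle.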
