{ "id": "1907.01543", "version": "v1", "published": "2019-07-02T17:50:34.000Z", "updated": "2019-07-02T17:50:34.000Z", "title": "Efficient Algorithms for Smooth Minimax Optimization", "authors": [ "Kiran Koshy Thekumparampil", "Prateek Jain", "Praneeth Netrapalli", "Sewoong Oh" ], "categories": [ "math.OC", "cs.LG", "stat.ML" ], "abstract": "This paper studies first order methods for solving smooth minimax optimization problems $\\min_x \\max_y g(x,y)$ where $g(\\cdot,\\cdot)$ is smooth and $g(x,\\cdot)$ is concave for each $x$. In terms of $g(\\cdot,y)$, we consider two settings -- strongly convex and nonconvex -- and improve upon the best known rates in both. For strongly-convex $g(\\cdot, y),\\ \\forall y$, we propose a new algorithm combining Mirror-Prox and Nesterov's AGD, and show that it can find global optimum in $\\tilde{O}(1/k^2)$ iterations, improving over current state-of-the-art rate of $O(1/k)$. We use this result along with an inexact proximal point method to provide $\\tilde{O}(1/k^{1/3})$ rate for finding stationary points in the nonconvex setting where $g(\\cdot, y)$ can be nonconvex. This improves over current best-known rate of $O(1/k^{1/5})$. Finally, we instantiate our result for finite nonconvex minimax problems, i.e., $\\min_x \\max_{1\\leq i\\leq m} f_i(x)$, with nonconvex $f_i(\\cdot)$, to obtain convergence rate of $O(m(\\log m)^{3/2}/k^{1/3})$ total gradient evaluations for finding a stationary point.", "revisions": [ { "version": "v1", "updated": "2019-07-02T17:50:34.000Z" } ], "analyses": { "keywords": [ "smooth minimax optimization problems", "efficient algorithms", "paper studies first order methods", "finite nonconvex minimax problems", "inexact proximal point method" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }