arXiv Analytics


arXiv:2006.09097 [math.OC]

Accelerated Alternating Minimization and Adaptability to Strong Convexity

Nazarii Tupitsa

Published 2020-06-16, Version 1

In the first part of the paper we consider an accelerated first-order optimization method for convex functions with $L$-Lipschitz-continuous gradient that automatically adapts to problems satisfying the Polyak-{\L}ojasiewicz condition or that are strongly convex with parameter $\mu$. In these cases the method enjoys linear convergence with factor $\frac{\mu}{L}$ even if $\mu$ is unknown. If the problem is strongly convex and $\mu$ is known, the method enjoys linear convergence with factor $\sqrt{\frac{\mu}{L}}$. Otherwise, the method converges at the rate $O(1/k^2)$. The second part generalizes the method to problems that allow alternating minimization and proves the same asymptotic convergence rates. We also consider the Adaptive Catalyst approach, which raises the convergence rate to $O(1/k^2)$, and provide an experimental comparison of this approach with AGM, Alternating AGM, APDAGD, and Sinkhorn's algorithm on the dual of the discrete entropically regularized optimal transport problem. The result of the work is an attempt to explain why Alternating AGM converges faster than AGM or Adaptive Catalyst despite the same asymptotic theoretical rate $O(1/k^2)$. The hypothesis is that Alternating AGM adapts to strong convexity; it was tested on quadratic functions, on which Alternating AGM also showed faster convergence.
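To make the gap between the two linear contraction factors concrete, here is a minimal sketch (not the paper's algorithm; the problem and all names are illustrative) comparing plain gradient descent, whose error contracts roughly by $1-\frac{\mu}{L}$ per step, with Nesterov's accelerated method with known $\mu$, whose error contracts roughly by $1-\sqrt{\frac{\mu}{L}}$, on an ill-conditioned quadratic:

```python
import math

# Illustrative test problem: a mu-strongly convex, L-smooth quadratic
# f(x) = 0.5*(mu*x[0]**2 + L*x[1]**2), minimized at the origin.
mu, L = 0.01, 1.0

def f(x):
    return 0.5 * (mu * x[0] ** 2 + L * x[1] ** 2)

def grad(x):
    return (mu * x[0], L * x[1])

def gd(x0, steps):
    # Plain gradient descent with step size 1/L: contraction ~ (1 - mu/L).
    x = x0
    for _ in range(steps):
        g = grad(x)
        x = tuple(xi - gi / L for xi, gi in zip(x, g))
    return x

def agm(x0, steps):
    # Nesterov's method with known mu: constant momentum coefficient
    # beta = (sqrt(L)-sqrt(mu))/(sqrt(L)+sqrt(mu)); contraction ~ (1 - sqrt(mu/L)).
    beta = (math.sqrt(L) - math.sqrt(mu)) / (math.sqrt(L) + math.sqrt(mu))
    x = y = x0
    for _ in range(steps):
        g = grad(y)
        x_next = tuple(yi - gi / L for yi, gi in zip(y, g))
        y = tuple(xn + beta * (xn - xi) for xn, xi in zip(x_next, x))
        x = x_next
    return x

x0 = (1.0, 1.0)
# With condition number L/mu = 100, the accelerated method reaches a far
# smaller function value than plain gradient descent in the same budget.
print(f(agm(x0, 200)) < f(gd(x0, 200)))  # → True
```

With $L/\mu = 100$, gradient descent shrinks the ill-conditioned coordinate only by $0.99$ per step, while the momentum term lets the accelerated iterates contract at roughly $0.9$ per step, which is the $\sqrt{\mu/L}$ versus $\mu/L$ distinction the abstract refers to.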

Related articles:
arXiv:1906.03622 [math.OC] (Published 2019-06-09)
Accelerated Alternating Minimization
arXiv:2312.03583 [math.OC] (Published 2023-12-06)
Strong Convexity of Sets in Riemannian Manifolds
arXiv:2103.10854 [math.OC] (Published 2021-03-19)
Unbalanced Multi-Marginal Optimal Transport