arXiv Analytics

arXiv:1710.07367 [math.OC]

Convergence Analysis of the Frank-Wolfe Algorithm and Its Generalization in Banach Spaces

Hong-Kun Xu

Published 2017-10-19 (Version 1)

The Frank-Wolfe algorithm, one of the earliest optimization methods, also known as the conditional gradient method, was introduced by Frank and Wolfe in 1956. Owing to its simple linear subproblems, the Frank-Wolfe algorithm has recently received much attention for solving large-scale structured optimization problems arising in applied areas such as signal processing and machine learning. In this paper we discuss in detail the convergence analysis of the Frank-Wolfe algorithm in Banach spaces. Two stepsize selection rules are discussed: the line minimization search method and the open loop rule. In both cases, we prove convergence of the Frank-Wolfe algorithm when the objective function $f$ has a Fr\'echet derivative $f'$ that is uniformly continuous on bounded sets. We introduce the notion of the curvature constant of order $\sigma\in (1,2]$ and obtain the rate $O(\frac{1}{k^{\sigma-1}})$ of convergence of the Frank-Wolfe algorithm. In particular, this rate reduces to $O(\frac{1}{k^{\nu}})$ if $f'$ is $\nu$-H\"older continuous for $\nu\in (0,1]$, and to $O(\frac{1}{k})$ if $f'$ is Lipschitz continuous. A generalized Frank-Wolfe algorithm is also introduced to address the problem of minimizing a composite objective function. Convergence of the iterates of both the Frank-Wolfe and generalized Frank-Wolfe algorithms is investigated.
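As a rough illustration (not taken from the paper, whose analysis is carried out in Banach spaces), the following Python sketch implements the classical Frank-Wolfe iteration with the open-loop stepsize rule $\gamma_k = \frac{2}{k+2}$ in a finite-dimensional setting. The feasible set, the helper names `linear_oracle` and `frank_wolfe`, and the toy objective are all assumptions chosen so that the linear subproblem has a closed-form solution.

```python
import numpy as np

def linear_oracle(g):
    # argmin of <g, s> over the probability simplex: attained at a vertex,
    # i.e. the standard basis vector of the smallest gradient coordinate.
    # (Illustrative choice of feasible set, not the paper's general setting.)
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def frank_wolfe(grad_f, x0, num_iters=100):
    x = x0.copy()
    for k in range(num_iters):
        s = linear_oracle(grad_f(x))  # the simple linear subproblem
        gamma = 2.0 / (k + 2.0)       # open-loop stepsize rule
        x = x + gamma * (s - x)       # convex combination keeps x feasible
    return x

# Toy usage: minimize f(x) = 0.5 * ||x - b||^2 over the simplex,
# whose gradient is f'(x) = x - b.
b = np.array([0.1, 0.5, 0.4])
x = frank_wolfe(lambda x: x - b, np.ones(3) / 3)
print(x)  # approaches b, which already lies in the simplex
```

With a Lipschitz continuous gradient, as in the abstract's last rate statement, this open-loop scheme exhibits the $O(\frac{1}{k})$ decrease of $f(x_k) - f^*$; the paper's weaker H\"older and uniform-continuity assumptions yield the slower rates quoted above.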

Related articles:
arXiv:1705.01913 [math.OC] (Published 2017-05-04)
ADMM for monotone operators: convergence analysis and rates
arXiv:1312.0534 [math.OC] (Published 2013-12-02)
The method of cyclic intrepid projections: convergence analysis and numerical experiments
arXiv:1508.03899 [math.OC] (Published 2015-08-17)
Convergence Analysis of Algorithms for DC Programming