arXiv:1409.3257 [math.OC]

Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization

Yuchen Zhang, Lin Xiao

Published 2014-09-10 (Version 1)

We consider a generic convex optimization problem associated with regularized empirical risk minimization of linear predictors. The problem structure allows us to reformulate it as a convex-concave saddle-point problem. We propose a stochastic primal-dual coordinate (SPDC) method, which alternates between maximizing over a randomly chosen dual variable and minimizing over the primal variable. An extrapolation step on the primal variable is performed to obtain an accelerated convergence rate. We also develop a mini-batch version of the SPDC method, which facilitates parallel computing, and an extension with weighted sampling probabilities on the dual variables, which achieves better complexity than uniform sampling when the data are unnormalized. Both theoretically and empirically, we show that the SPDC method performs comparably to or better than several state-of-the-art optimization methods.
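To make the alternating update pattern concrete, below is a minimal sketch of an SPDC-style iteration applied to ridge-regularized least squares, where both proximal steps have closed forms. The saddle-point instance, the step-size and extrapolation choices, and all names (spdc_ridge_least_squares, lam, tau, sigma, theta) are illustrative assumptions for this toy problem, not the paper's exact algorithm or recommended parameter settings.

```python
import numpy as np

def spdc_ridge_least_squares(A, b, lam=0.1, n_epochs=50, seed=0):
    """SPDC-style sketch for (1/n)*sum_i 0.5*(a_i^T x - b_i)^2 + (lam/2)*||x||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape

    # Step sizes and extrapolation parameter: conservative choices in the spirit of
    # primal-dual methods; the paper prescribes its own formulas in terms of the
    # data norm, the regularization parameter, and the loss smoothness.
    R = np.max(np.linalg.norm(A, axis=1))
    tau = 1.0 / (2.0 * R) * np.sqrt(1.0 / (n * lam))
    sigma = 1.0 / (2.0 * R) * np.sqrt(n * lam)
    theta = 1.0 - 1.0 / (n + R * np.sqrt(n / lam))

    x = np.zeros(d)        # primal variable
    x_bar = x.copy()       # extrapolated primal variable
    y = np.zeros(n)        # dual variables, one per training example
    u = A.T @ y / n        # running average u = (1/n) * A^T y

    for _ in range(n_epochs * n):
        k = rng.integers(n)
        a_k = A[k]

        # Dual ascent on a randomly chosen coordinate k: proximal step on the
        # conjugate of the squared loss, phi_k*(beta) = 0.5*beta^2 + b_k*beta.
        y_k_new = (sigma * (a_k @ x_bar - b[k]) + y[k]) / (sigma + 1.0)
        delta = y_k_new - y[k]

        # Primal descent with a variance-corrected gradient estimate and the
        # proximal map of g(x) = (lam/2)*||x||^2, i.e. scaling by 1/(1 + lam*tau).
        x_new = (x - tau * (u + delta * a_k)) / (1.0 + lam * tau)

        # Extrapolation step on the primal variable.
        x_bar = x_new + theta * (x_new - x)

        # Bookkeeping: fold the dual change into y and the running average u.
        y[k] = y_k_new
        u += (delta / n) * a_k
        x = x_new

    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d, lam = 200, 10, 0.1
    A = rng.standard_normal((n, d))
    b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)
    x_spdc = spdc_ridge_least_squares(A, b, lam=lam)
    x_star = np.linalg.solve(A.T @ A / n + lam * np.eye(d), A.T @ b / n)
    print("distance to closed-form ridge solution:", np.linalg.norm(x_spdc - x_star))
```

A mini-batch variant of this sketch would update several dual coordinates per iteration, and the weighted-sampling extension mentioned in the abstract would draw k with probability depending on per-example norms; both are omitted here for brevity.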

Related articles:
arXiv:1703.00439 [math.OC] (Published 2017-03-01)
Doubly Accelerated Stochastic Variance Reduced Dual Averaging Method for Regularized Empirical Risk Minimization
arXiv:1905.01020 [math.OC] (Published 2019-05-02)
Stochastic Primal-Dual Coordinate Method with Large Step Size for Composite Optimization with Composite Cone-constraints
arXiv:1407.1296 [math.OC] (Published 2014-07-04)
An Accelerated Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization