arXiv Analytics

arXiv:1803.07300 [cs.LG]

Risk and parameter convergence of logistic regression

Ziwei Ji, Matus Telgarsky

Published 2018-03-20, Version 1

The logistic loss is strictly convex and does not attain its infimum; consequently the solutions of logistic regression are in general off at infinity. This work provides a convergence analysis of gradient descent applied to logistic regression under no assumptions on the problem instance. Firstly, the risk is shown to converge at a rate $\mathcal{O}(\ln(t)^2/t)$. Secondly, the parameter convergence is characterized along a unique pair of complementary subspaces defined by the problem instance: one subspace along which strong convexity induces parameters to converge at rate $\mathcal{O}(\ln(t)^2/\sqrt{t})$, and its orthogonal complement along which separability induces parameters to converge in direction at rate $\mathcal{O}(\ln\ln(t) / \ln(t))$.
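The phenomenon the abstract describes can be seen on a toy problem. The sketch below (an illustration, not the paper's code; the dataset, step size, and iteration count are assumptions) runs plain gradient descent on the empirical logistic risk over a linearly separable dataset: the risk approaches its infimum of zero while the parameter norm grows without bound, i.e. the solutions are "off at infinity".

```python
import numpy as np

# Illustrative sketch: gradient descent on the empirical logistic risk
# for a separable toy dataset (not the paper's experiments).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) + np.array([3.0, 0.0])  # shifted cloud, labels all +1
y = np.ones(50)

def risk(w):
    # empirical logistic risk: mean of ln(1 + exp(-y <w, x>)),
    # computed stably via logaddexp
    margins = y * (X @ w)
    return np.mean(np.logaddexp(0.0, -margins))

def grad(w):
    margins = y * (X @ w)
    # d/dm ln(1 + e^{-m}) = -1 / (1 + e^{m}), written stably
    s = -np.exp(-np.logaddexp(0.0, margins))
    return (X.T @ (s * y)) / len(y)

w = np.zeros(2)
eta = 0.1
for t in range(5000):
    w -= eta * grad(w)

# Risk tends to its infimum (0 here) while ||w|| keeps growing,
# so no minimizer is ever attained.
print(risk(w), np.linalg.norm(w))
```

On separable data the iterates' norm grows roughly logarithmically in the number of steps, which is consistent with the slow $\mathcal{O}(\ln\ln(t)/\ln(t))$ directional convergence rate stated in the abstract.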

Related articles: Most relevant | Search more
arXiv:1402.4512 [cs.LG] (Published 2014-02-18, updated 2014-09-04)
Classification with Sparse Overlapping Groups
arXiv:1903.00816 [cs.LG] (Published 2019-03-03)
Stability of decision trees and logistic regression
arXiv:0910.4627 [cs.LG] (Published 2009-10-24)
Self-concordant analysis for logistic regression