arXiv:1803.07300 [cs.LG]
Risk and parameter convergence of logistic regression
Published 2018-03-20 (Version 1)
The logistic loss is strictly convex and does not attain its infimum; consequently, the solutions of logistic regression are in general off at infinity. This work provides a convergence analysis of gradient descent applied to logistic regression under no assumptions on the problem instance. Firstly, the risk is shown to converge at rate $\mathcal{O}(\ln(t)^2/t)$. Secondly, parameter convergence is characterized along a unique pair of complementary subspaces defined by the problem instance: one subspace along which strong convexity induces the parameters to converge at rate $\mathcal{O}(\ln(t)^2/\sqrt{t})$, and its orthogonal complement, along which separability induces the parameters to converge in direction at rate $\mathcal{O}(\ln\ln(t)/\ln(t))$.
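The phenomenon described in the abstract can be observed numerically. Below is a minimal sketch (not taken from the paper): gradient descent on the empirical logistic risk of a small, hand-picked linearly separable dataset. The dataset, step size, and iteration schedule are illustrative assumptions. Since the infimum is not attained on separable data, the risk keeps decreasing while the parameter norm $\|w_t\|$ diverges, and only the direction $w_t/\|w_t\|$ stabilizes.

```python
import numpy as np

# Illustrative assumptions: a tiny linearly separable dataset with
# labels in {-1, +1}; e.g. w = (1, 1) separates it with positive margin.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])

def risk(w):
    # Empirical logistic risk: mean of ln(1 + exp(-y_i <w, x_i>)).
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

def grad(w):
    # Gradient of the risk: -(1/n) sum_i y_i sigmoid(-y_i <w, x_i>) x_i.
    margins = y * (X @ w)
    s = 1.0 / (1.0 + np.exp(margins))  # sigmoid(-margins)
    return -(X.T @ (y * s)) / len(y)

w = np.zeros(2)
eta = 1.0  # assumed constant step size, small enough for this smooth risk
for t in range(1, 100001):
    w -= eta * grad(w)
    if t in (10, 100, 1000, 10000, 100000):
        d = w / np.linalg.norm(w)
        print(f"t={t:6d}  risk={risk(w):.6f}  ||w||={np.linalg.norm(w):7.3f}  direction={d}")
```

Running this prints a risk that keeps shrinking, a parameter norm that grows without bound (roughly logarithmically in $t$), and a normalized direction that changes ever more slowly, consistent with the flavor of the stated $\mathcal{O}(\ln(t)^2/t)$ risk rate and the slow $\mathcal{O}(\ln\ln(t)/\ln(t))$ directional rate.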