{ "id": "2002.06848", "version": "v1", "published": "2020-02-17T09:16:30.000Z", "updated": "2020-02-17T09:16:30.000Z", "title": "SingCubic: Cyclic Incremental Newton-type Gradient Descent with Cubic Regularization for Non-Convex Optimization", "authors": [ "Ziqiang Shi" ], "categories": [ "math.OC" ], "abstract": "In this work, we generalized and unified two recent completely different works of~\\cite{shi2015large} and~\\cite{cartis2012adaptive} respectively into one by proposing the cyclic incremental Newton-type gradient descent with cubic regularization (SingCubic) method for optimizing non-convex functions. Through the iterations of SingCubic, a cubic regularized global quadratic approximation using Hessian information is kept and solved. Preliminary numerical experiments show the encouraging performance of the SingCubic algorithm when compared to basic incremental or stochastic Newton-type implementations. The results and technique can be served as an initiate for the research on the incremental Newton-type gradient descent methods that employ cubic regularization. The methods and principles proposed in this paper can be used to do logistic regression, autoencoder training, independent components analysis, Ising model/Hopfield network training, multilayer perceptron, deep convolutional network training and so on. We will open-source parts of our implementations soon.", "revisions": [ { "version": "v1", "updated": "2020-02-17T09:16:30.000Z" } ], "analyses": { "keywords": [ "cyclic incremental newton-type gradient descent", "cubic regularization", "non-convex optimization", "newton-type gradient descent methods", "regularized global quadratic approximation" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }