{ "id": "2001.07999", "version": "v1", "published": "2020-01-22T13:20:54.000Z", "updated": "2020-01-22T13:20:54.000Z", "title": "Curiosities and counterexamples in smooth convex optimization", "authors": [ "Jerome Bolte", "Edouard Pauwels" ], "categories": [ "math.OC" ], "abstract": "Counterexamples to some old-standing optimization problems in the smooth convex coercive setting are provided. We show that block-coordinate, steepest descent with exact search or Bregman descent methods do not generally converge. Other failures of various desirable features are established: directional convergence of Cauchy's gradient curves, convergence of Newton's flow, finite length of Tikhonov path, convergence of central paths, or smooth Kurdyka-Lojasiewicz inequality. All examples are planar. These examples are based on general smooth convex interpolation results. Given a decreasing sequence of positively curved C k convex compact sets in the plane, we provide a level set interpolation of a C k smooth convex function where k $\\ge$ 2 is arbitrary. If the intersection is reduced to one point our interpolant has positive definite Hessian, otherwise it is positive definite out of the solution set. Furthermore , given a sequence of decreasing polygons we provide an interpolant agreeing with the vertices and whose gradients coincide with prescribed normals.", "revisions": [ { "version": "v1", "updated": "2020-01-22T13:20:54.000Z" } ], "analyses": { "keywords": [ "smooth convex optimization", "counterexamples", "general smooth convex interpolation results", "curiosities", "smooth kurdyka-lojasiewicz inequality" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }