arXiv Analytics


arXiv:2011.06064 [math.OC]

Non-local Optimization: Imposing Structure on Optimization Problems by Relaxation

Nils Müller, Tobias Glasmachers

Published 2020-11-11 (Version 1)

In stochastic optimization, particularly in evolutionary computation and reinforcement learning, the optimization of a function $f: \Omega \to \mathbb{R}$ is often addressed by optimizing a so-called relaxation $\theta \in \Theta \mapsto \mathbb{E}_\theta(f)$ of $f$, where $\Theta$ parameterizes a family of probability measures on $\Omega$. We investigate the structure of such relaxations by means of measure theory and Fourier analysis, which allows us to shed light on the success of many stochastic optimization methods. The main structural traits we derive, which allow fast and reliable optimization of relaxations, are the preservation of the optimal values of $f$, Lipschitzness of gradients, and convexity.
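
A minimal sketch of the relaxation idea (not taken from the paper; the Gaussian family, the toy objective, and all names below are illustrative assumptions): it estimates $\mathbb{E}_\theta(f)$ by Monte Carlo for $\theta = (\mu, \log\sigma)$ parameterizing $\mathcal{N}(\mu, \sigma^2)$, and descends a score-function estimate of the relaxation's gradient, in the spirit of evolution-strategy-style methods.

import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy non-convex, non-smooth objective on Omega = R (illustrative choice).
    return np.abs(x) + np.sin(5.0 * x)

def relaxation(theta, n_samples=4096):
    # Monte Carlo estimate of E_theta(f) = E_{x ~ N(mu, sigma^2)}[f(x)].
    mu, log_sigma = theta
    x = mu + np.exp(log_sigma) * rng.standard_normal(n_samples)
    return f(x).mean()

def search_gradient(theta, n_samples=4096):
    # Score-function (log-likelihood-ratio) estimate of grad_theta E_theta(f):
    #   grad_theta E_theta(f) = E_theta[ f(x) * grad_theta log p_theta(x) ],
    # with theta = (mu, log_sigma) and p_theta = N(mu, sigma^2).
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    x = mu + sigma * eps
    fx = f(x)
    g_mu = np.mean(fx * eps / sigma)          # d/d mu of log p is eps / sigma
    g_log_sigma = np.mean(fx * (eps**2 - 1))  # d/d log_sigma of log p is eps^2 - 1
    return np.array([g_mu, g_log_sigma])

# Plain gradient descent on the relaxation; the minimizer of f is approached
# through the parameters theta rather than through x directly.
theta = np.array([3.0, np.log(1.0)])
for _ in range(300):
    theta -= 0.05 * search_gradient(theta)
print("mu:", theta[0], "sigma:", np.exp(theta[1]), "E_theta(f) ~", relaxation(theta))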

Related articles:
arXiv:2105.08317 [math.OC] (Published 2021-05-18, updated 2022-04-19)
An Augmented Lagrangian Method for Optimization Problems with Structured Geometric Constraints
arXiv:2309.00515 [math.OC] (Published 2023-09-01)
Directional Tykhonov well-posedness for optimization problems and variational inequalities
arXiv:2110.04882 [math.OC] (Published 2021-10-10, updated 2022-04-29)
First- and Second-Order Analysis for Optimization Problems with Manifold-Valued Constraints