arXiv:2207.14480 [math.OC]

Convergence analysis of critical point regularization with non-convex regularizers

Daniel Obmann, Markus Haltmeier

Published 2022-07-29 (Version 1)

In recent years, several methods have been developed that use regularizers defined by neural networks as penalty terms in variational methods. A key assumption in the stability and convergence analysis of these methods is the ability to find global minimizers. However, this assumption is often not satisfiable in practice: when the regularizer is a black box or non-convex, finding global minimizers of the involved Tikhonov functional is a challenging task. Instead, standard minimization schemes are applied, which typically only guarantee that a critical point is found. To address this issue, in this paper we study stability and convergence properties of critical points of Tikhonov functionals with possibly non-convex regularizers. To this end, we introduce the concept of relative sub-differentiability and study its basic properties. Based on this concept, we develop a convergence analysis assuming relative sub-differentiability of the regularizer. For the case where the noise level tends to zero, we derive a limiting problem representing the first-order optimality conditions of a related restricted optimization problem. Finally, we provide numerical simulations that support our theoretical findings and demonstrate the need for the type of analysis developed in this paper.
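For orientation, the following is a minimal sketch of the variational setting the abstract refers to, written in the standard notation of Tikhonov regularization (forward operator F, exact data y, noisy data y^delta, noise level delta, regularization parameter alpha, regularizer R); these symbols are generic assumptions and the paper's precise formulation may differ.

% Standard Tikhonov functional with a (possibly non-convex) regularizer R,
% assuming the usual data-fidelity term and noise model:
\[
  \mathcal{T}_{\alpha}^{\delta}(x) \;=\; \lVert F(x) - y^{\delta} \rVert^{2} + \alpha \, R(x),
  \qquad \lVert y^{\delta} - y \rVert \le \delta .
\]
% For non-convex R, iterative solvers typically guarantee only a critical point,
% i.e. a point x^* satisfying the first-order condition
\[
  0 \;\in\; \partial \mathcal{T}_{\alpha}^{\delta}(x^{*}),
\]
% where \partial denotes a suitable generalized subdifferential; the paper's
% notion of relative sub-differentiability presumably plays this role in its analysis.

The convergence question then concerns the behavior of such critical points as delta tends to zero under a suitable parameter choice alpha = alpha(delta), which is the regime in which the limiting problem mentioned in the abstract arises.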
