arXiv Analytics

arXiv:1908.03006 [math.NA]

Sparse $\ell^q$-regularization of inverse problems with deep learning

Markus Haltmeier, Linh Nguyen, Daniel Obmann, Johannes Schwab

Published 2019-08-08 (Version 1)

We propose a sparse reconstruction framework for solving inverse problems. In contrast to existing sparse reconstruction techniques, which are based on linear sparsifying transforms, we train an encoder-decoder network $D \circ E$ with $E$ acting as a nonlinear sparsifying transform. We minimize a Tikhonov functional that uses a learned regularization term formed by the $\ell^q$-norm of the encoder coefficients and a penalty for the distance to the data manifold. For this augmented sparse $\ell^q$-approach, we present a full convergence analysis, derive convergence rates, and describe a training strategy. As a main ingredient of the analysis, we establish the coercivity of the augmented regularization term.
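The abstract does not state the functional explicitly; a plausible sketch, writing $A$ for the forward operator, $y$ for the measured data, and $\alpha, \beta > 0$ for regularization weights (all notation assumed here, not taken from the paper), is

$$ \mathcal{T}_{\alpha,\beta}(x) = \lVert A x - y \rVert^2 + \alpha \lVert E(x) \rVert_q^q + \beta \lVert D(E(x)) - x \rVert^2 , $$

where $\lVert E(x) \rVert_q^q = \sum_i \lvert E(x)_i \rvert^q$ promotes sparsity of the encoder coefficients and the last term penalizes the distance of $x$ to the manifold of decoder outputs $D(E(x))$.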

Related articles:
arXiv:2210.14764 [math.NA] (Published 2022-10-26)
Towards a machine learning pipeline in reduced order modelling for inverse problems: neural networks for boundary parametrization, dimensionality reduction and solution manifold approximation
arXiv:2006.05311 [math.NA] (Published 2020-06-04)
Deep learning of free boundary and Stefan problems
arXiv:2210.17048 [math.NA] (Published 2022-10-31)
A replica exchange preconditioned Crank-Nicolson Langevin dynamic MCMC method for Bayesian inverse problems