Browsing by Author "Offtermatt, Jonas"
Item (Open Access): A projection and a variational regularization method for sparse inverse problems (2012)
Offtermatt, Jonas; Kaltenbacher, Barbara (Prof. Dr.)

The solution of sparse inverse problems has become a highly active topic over the past decade. This thesis aims at providing new methods for the regularization of sparse and possibly ill-posed inverse problems. In this work a projection method and a variational regularization method for the solution of sparse inverse problems are presented. The description and analysis of each of these two methods is complemented by an additional related topic.

The projection method, developed in Chapter 4, is based on an adaptive regularization method for a distributed parameter in a parabolic PDE, originally introduced by Chavent and coauthors. Here we adapt this approach to general sparse inverse problems. Furthermore, a well-definedness result is presented and it is proven that the minimizer computed by the algorithm solves the original problem in a least-squares sense. Additionally, we illustrate the efficiency of the algorithm with two numerical examples from applications in systems biology and data analysis.

The sequence of subspaces adaptively chosen by the introduced algorithm leads us to the analysis of regularization by discretization in preimage space. This regularization method is known to converge only under additional assumptions on the solution. In Chapter 5, regularization by discretization in the case of noisy data under a suitable source condition is considered. We present results on well-definedness, stability and convergence for linear and nonlinear inverse problems in the case that the regularization subspace is chosen by the discrepancy principle.

In Chapter 6 the second main part of this thesis starts. There we present a variational method for sparse inverse problems. Before introducing a new regularization functional, we take a closer look at Bayesian regularization theory. We give a brief introduction and present the connection between deterministic Tikhonov regularization and stochastic Bayesian inversion in the case of Gaussian densities, developed by Kaipio and Somersalo. Then we discuss the convergence results of Hofinger and Pikkarainen for the stochastic theory, which are based on this close connection. We also outline a concept for a general convergence result and prove a generalization result for the existence of an R-minimizing solution. Again we illustrate the obtained results with some numerical examples.

We use the close connection between stochastic and deterministic regularization to develop a new regularization functional for sparse inverse problems in Chapter 8. There we establish well-definedness, stability and convergence proofs for this functional, based on the results of Hofmann et al. Additionally, we prove convergence rates for the new functional, albeit only with respect to a generalized Bregman distance introduced by Grasmair, since the resulting regularization term is not convex. The proposed functional is differentiable and can therefore be used in gradient-based optimization methods, e.g. a quasi-Newton method. We illustrate the efficiency and accuracy of this approach again with some numerical examples.

The thesis starts with a general and detailed introduction to inverse problems. First, a motivation and introduction to inverse problems is given in Chapter 1. Then a brief overview of recent results in regularization theory is presented in Chapter 2. Finally, Chapter 3 closes the introductory part with a motivation and some first notations on sparsity in inverse problems.
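As context for the discrepancy principle mentioned for Chapter 5, the following is a minimal, purely illustrative sketch (not the thesis' algorithm): for a linear problem with noisy data, the discretization level k is increased until the residual drops below tau times the noise level delta. The operator, the truncated-SVD subspaces, and all parameter values are assumptions chosen only for this example.

```python
import numpy as np

# Illustrative sketch of the discrepancy principle for regularization by
# discretization: solve A x = y_delta on the span of the first k singular
# directions and pick the smallest k with ||A x_k - y_delta|| <= tau * delta.
rng = np.random.default_rng(1)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(0.9 ** np.arange(n)) @ V.T      # mildly ill-conditioned operator
x_true = V[:, 0] + 0.5 * V[:, 3]                # sparse in the singular basis
delta = 1e-3
y_delta = A @ x_true + delta * rng.standard_normal(n) / np.sqrt(n)

tau = 1.1
Uh, s, Vt = np.linalg.svd(A)
for k in range(1, n + 1):
    # least-squares solution restricted to the first k singular directions
    coeffs = (Uh[:, :k].T @ y_delta) / s[:k]
    x_k = Vt[:k].T @ coeffs
    if np.linalg.norm(A @ x_k - y_delta) <= tau * delta:
        break

print("chosen subspace dimension k =", k)
print("relative error:", np.linalg.norm(x_k - x_true) / np.linalg.norm(x_true))
```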
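The connection between deterministic Tikhonov regularization and Bayesian inversion in the Gaussian case, attributed above to Kaipio and Somersalo, can be sketched as follows; the notation (covariances, prior mean) is generic and not necessarily that of the thesis.

```latex
% Gaussian case: data y = F(x) + eta with eta ~ N(0, Gamma_noise) and
% prior x ~ N(x_0, Gamma_prior). The maximum a posteriori (MAP) estimate
% coincides with a Tikhonov minimizer whose penalty is induced by the prior:
\[
  x_{\mathrm{MAP}}
  = \operatorname*{arg\,max}_{x} \, \pi_{\mathrm{post}}(x \mid y)
  = \operatorname*{arg\,min}_{x} \left(
      \tfrac{1}{2} \bigl\| \Gamma_{\mathrm{noise}}^{-1/2} \bigl( y - F(x) \bigr) \bigr\|^{2}
    + \tfrac{1}{2} \bigl\| \Gamma_{\mathrm{prior}}^{-1/2} ( x - x_{0} ) \bigr\|^{2}
    \right).
\]
```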
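To illustrate why differentiability matters for the gradient-based optimization mentioned for Chapter 8, here is a minimal sketch of a smooth sparsity-promoting penalty minimized with a quasi-Newton method. The surrogate sum_i sqrt(x_i^2 + eps) is a common differentiable stand-in for the l1 norm and is not the functional proposed in the thesis; all names and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative linear inverse problem A x = y with a smooth sparsity penalty,
# minimized by a quasi-Newton method (L-BFGS). Not the thesis' functional.
rng = np.random.default_rng(0)
n, m = 100, 40
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, size=5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(m)

alpha, eps = 1e-2, 1e-6  # regularization weight and smoothing parameter

def objective(x):
    residual = A @ x - y
    penalty = np.sum(np.sqrt(x**2 + eps))          # smooth surrogate of ||x||_1
    return 0.5 * residual @ residual + alpha * penalty

def gradient(x):
    return A.T @ (A @ x - y) + alpha * x / np.sqrt(x**2 + eps)

result = minimize(objective, np.zeros(n), jac=gradient, method="L-BFGS-B")
print("relative error:", np.linalg.norm(result.x - x_true) / np.linalg.norm(x_true))
```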