
Multi-parameter regularisation for solving inverse problems of unmixing type: theoretical and practical aspects


If you have a question about this talk, please contact Carola-Bibiane Schoenlieb.

Motivated by real-life applications in signal processing and image analysis, where the quantity of interest is generated by several sources that need to be accurately modelled and separated, as well as by recent advances in sparse regularisation theory and optimisation, we present a theoretical and algorithmic framework for optimal support recovery in inverse problems of unmixing type by means of multi-penalty regularisation. While multi-penalty regularisation is not a novel technique [1], we aim to provide precise reconstruction guarantees and methods for adaptive regularisation parameter choice.

We consider and analyse a regularisation functional composed of a data-fidelity term, in which signal and noise are additively mixed, a non-smooth, convex, sparsity-promoting term, and a convex penalty term modelling the noise. We prove not only that the well-established theory for sparse recovery in the single-parameter case can be translated to the multi-penalty setting, but also demonstrate the enhanced support-identification properties of multi-penalty regularisation compared to $\ell^1$-minimisation alone. We further propose an iterative alternating algorithm based on simple iterative thresholding steps to minimise the extended multi-penalty functional, which contains a non-smooth and non-convex sparsity-promoting term. Extending the notion of the Lasso path, we additionally propose an efficient procedure for an adaptive parameter choice in multi-penalty regularisation, focusing on the recovery of the correct support of the solution. The approach essentially enables a fast construction of a tiling over the parameter space such that each tile corresponds to a different sparsity pattern of the solution. Finally, we briefly discuss the applicability of multi-penalty regularisation to the recovery of low-rank matrices with approximately sparse singular vectors from a limited number of measurements.
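
For illustration, the following minimal sketch (Python/NumPy) shows one way such an alternating scheme can look, assuming the common two-penalty formulation $J(u, v) = \tfrac{1}{2}\|A(u+v) - y\|_2^2 + \alpha\|u\|_1 + \tfrac{\beta}{2}\|v\|_2^2$, in which $u$ carries the sparse signal and $v$ models the noise. The operator $A$, the parameters $\alpha$ and $\beta$, and the particular update rules below are illustrative assumptions and do not reproduce the exact functional, penalties, or parameter choices of the talk or of [2].

import numpy as np

def soft_threshold(x, t):
    # Proximal map of t * ||.||_1 (component-wise soft thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def multi_penalty_alternating(A, y, alpha, beta, n_iter=500):
    # Illustrative alternating minimisation of the assumed functional
    #   J(u, v) = 0.5*||A(u + v) - y||_2^2 + alpha*||u||_1 + 0.5*beta*||v||_2^2,
    # where u is the sparse component and v models the noise.
    n = A.shape[1]
    u = np.zeros(n)
    v = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # step size from the spectral norm of A
    reg = A.T @ A + beta * np.eye(n)            # system matrix of the Tikhonov step for v
    for _ in range(n_iter):
        # u-step: one proximal-gradient (soft-thresholding) step with v held fixed
        grad_u = A.T @ (A @ (u + v) - y)
        u = soft_threshold(u - step * grad_u, step * alpha)
        # v-step: closed-form minimiser of 0.5*||A v - (y - A u)||^2 + 0.5*beta*||v||^2
        v = np.linalg.solve(reg, A.T @ (y - A @ u))
    return u, v

In the non-convex setting discussed in the talk (an $\ell^p$ penalty with $p < 1$), the soft-thresholding step would typically be replaced by the thresholding operator associated with that penalty, and $\alpha$, $\beta$ would be selected along the tiling of the parameter space rather than fixed in advance.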

To exemplify the robustness and effectiveness of the multi-penalty framework, we provide an extensive numerical analysis of our method and compare it with state-of-the-art single-penalty algorithms for compressed sensing problems.  

This is joint work with Markus Grasmair [3, 4], Norwegian University of Science and Technology; Timo Klock [4], Simula Research Laboratory; and Steffen Peter [2] and Johannes Maly [5], Technical University of Munich.

References:

1. Y. Meyer. Oscillating Patterns in Image Processing and Nonlinear Evolution Equations: The Fifteenth Dean Jacqueline B. Lewis Memorial Lectures. University Lecture Series, vol. 22, 2001.
2. V. Naumova and S. Peter. Minimization of multi-penalty functionals by alternating iterative thresholding and optimal parameter choices. Inverse Problems, 30:125003, 1–34, 2014.
3. M. Grasmair and V. Naumova. Conditions on optimal support recovery in unmixing problems by means of multi-penalty regularization. Inverse Problems, 32(10):104007, 2016.
4. M. Grasmair, T. Klock, and V. Naumova. Multiple parameter learning with regularization path algorithms, submitted, 2017.
5. M. Fornasier, J. Maly, and V. Naumova. A-T-LAS: A multi-penalty approach to compressed sensing of low-rank matrices with sparse decompositions, submitted, 2018.

This talk is part of the Applied and Computational Analysis series.
