Iteratively Reweighted Least Squares for $\ell_1$-minimization with Global Linear Convergence Rate

Abstract

Iteratively Reweighted Least Squares (IRLS), whose history goes back more than 80 years, represents an important family of algorithms for non-smooth optimization, as it solves such problems through a sequence of linear systems.
In 2010, Daubechies, DeVore, Fornasier, and Güntürk proved that IRLS for $\ell_1$-minimization, an optimization program ubiquitous in the field of compressed sensing, globally converges to a sparse solution. While this algorithm has been popular in applications in engineering and statistics, fundamental algorithmic questions have remained unanswered. In fact, existing convergence guarantees establish only global convergence without any rate, except in the case that the support of the underlying signal has already been identified.
In this paper, we prove that IRLS for $\ell_1$-minimization converges to a sparse solution with a global linear rate. We support our theory by numerical experiments indicating that our linear rate essentially captures the correct dimension dependence.
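The core iteration described above, solving a weighted least-squares problem at each step, can be sketched as follows. This is a minimal illustration of IRLS for the basis pursuit problem $\min \|x\|_1$ s.t. $Ax = y$, using a random Gaussian measurement matrix, a hand-picked sparse ground truth, and a simple geometric decay of the smoothing parameter $\epsilon$ (the paper and the Daubechies et al. analysis use a more careful $\epsilon$-update rule); it is not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 10, 20                     # measurements, ambient dimension

A = rng.standard_normal((m, n))   # random Gaussian measurement matrix
x_true = np.zeros(n)
x_true[[3, 11]] = [1.5, -2.0]     # a 2-sparse ground truth (illustrative)
y = A @ x_true

x = np.linalg.pinv(A) @ y         # least-squares initialization
eps = 1.0                         # smoothing parameter
for _ in range(100):
    # Weights w_i = (x_i^2 + eps^2)^{-1/2}; d holds the inverse weights.
    d = np.sqrt(x**2 + eps**2)
    DA = d[:, None] * A.T                 # D A^T with D = diag(d)
    # Weighted least-squares step: x = D A^T (A D A^T)^{-1} y,
    # the minimizer of sum_i w_i x_i^2 subject to Ax = y.
    x = DA @ np.linalg.solve(A @ DA, y)
    eps *= 0.9                    # simplified decay (assumption, not the paper's rule)

print(np.round(x, 4))             # concentrates on the true support {3, 11}
```

Each iteration costs one $m \times m$ linear solve, which is what makes IRLS attractive in practice; the contribution of the paper is a global linear rate for this iteration.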

Publication
In Neural Information Processing Systems (NeurIPS) 2021 (Spotlight, top 3% of submitted papers)
Christian Kümmerle
Postdoctoral Fellow