Superlinear Convergence

A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a …

Understanding and Enhancing Data Recovery Algorithms - From Noise-Blind Sparse Recovery to Reweighted Methods for Low-Rank Matrix Optimization

We prove new results about the robustness of noise-blind decoders for the problem of reconstructing a sparse vector from underdetermined linear measurements. Our results imply provable robustness of equality-constrained $\ell_1$-minimization for random …
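A standard example of such a noise-blind decoder is equality-constrained $\ell_1$-minimization (basis pursuit), $\min_x \|x\|_1$ subject to $Ax = y$, which requires no estimate of the noise level. The sketch below, which is only an illustration and not taken from the paper, recasts this program as a linear program and solves it with SciPy; all names and parameter choices are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Equality-constrained l1-minimization (basis pursuit):
        min ||x||_1  subject to  A x = y.
    Solved as a linear program in the variables z = [x, t] with |x_i| <= t_i.
    Illustrative sketch only, not the decoders analyzed in the paper."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize sum(t)
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            #  x - t <= 0  and  -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # A x = y
    bounds = [(None, None)] * n + [(0, None)] * n   # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=bounds, method="highs")
    return res.x[:n]

# small demo: reconstruct a 3-sparse vector from 20 random Gaussian measurements
rng = np.random.default_rng(0)
n, m, s = 60, 20, 3
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = basis_pursuit(A, A @ x_true)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```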

Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery

We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \in \mathbb{C}^{d_1 \times d_2}$ of rank $r \ll \min(d_1,d_2)$ from incomplete linear observations, solving a sequence of low complexity linear …
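As a rough illustration of the general IRLS template, the Python sketch below alternates between updating a weight matrix from the current iterate and solving a weighted least-squares problem for a matrix completion instance. It uses a standard one-sided weight choice, not the harmonic-mean weighting proposed in the paper, and all function names and parameter values are assumptions made for the example.

```python
import numpy as np

def irls_completion(Y, mask, lam=1e-2, n_iter=100, eps=1.0):
    """Generic IRLS sketch for low-rank matrix completion.
    Uses the one-sided weight W = (X X^T + eps^2 I)^{-1/2}, NOT the
    harmonic-mean weighting of the paper; illustration only.

    Y    : (d1, d2) array with the observed entries (zeros elsewhere)
    mask : (d1, d2) boolean array, True where Y is observed
    """
    d1, d2 = Y.shape
    X = np.where(mask, Y, 0.0).astype(float)
    for _ in range(n_iter):
        # weight matrix from the current iterate; penalizing tr(X^T W X)
        # smoothly approximates the nuclear norm of X
        evals, evecs = np.linalg.eigh(X @ X.T + eps**2 * np.eye(d1))
        W = (evecs / np.sqrt(evals)) @ evecs.T
        # reweighted least squares, solved column by column:
        #   min_x  sum_{i observed} (x_i - y_i)^2 + lam * x^T W x
        for j in range(d2):
            S = np.diag(mask[:, j].astype(float))   # selects observed rows of column j
            rhs = np.where(mask[:, j], Y[:, j], 0.0)
            X[:, j] = np.linalg.solve(S + lam * W, rhs)
        eps = max(0.9 * eps, 1e-8)                  # shrink the smoothing parameter
    return X
```

The paper's algorithm differs in the choice of weight operator (a harmonic mean of left- and right-sided weights, per the title) and in its convergence guarantees; this sketch only shows the overall reweight-then-solve structure shared by IRLS methods.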

Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery

This is the first conference version of the paper on Harmonic Mean Iteratively Reweighted Least Squares.