Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method


We propose an iterative algorithm for low-rank matrix completion that can be interpreted both as an iteratively reweighted least squares (IRLS) algorithm and as a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. It combines the favorable data efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. Our method attains a local quadratic convergence rate already for a number of samples close to the information-theoretic limit. We show in numerical experiments that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with condition numbers of up to $10^{10}$ from few samples.

In Workshop on Beyond First Order Methods in ML Systems at the $37^{th}$ International Conference on Machine Learning (ICML 2020)
Christian Kümmerle