Seminar 09/09/2019: LOW-RANK OPTIMIZATION VIA ITERATIVELY REWEIGHTED LEAST SQUARES

Speaker: Christian Kümmerle (Technical University of Munich)

Time and place: Thu, 09/09/2019, 14:00-15:00, ELEC seminar room

Abstract: We propose and discuss new Iteratively Reweighted Least Squares (IRLS) algorithms for the optimization of non-smooth rank surrogates. Like previous IRLS algorithms for similar problems, they minimize quadratic upper bounds of the rank surrogates with an updated smoothing parameter, but we obtain considerably tighter quadratic bounds by drawing a connection to the second-order derivative structure of spectral functions. This leads to improved theoretical behavior of the IRLS algorithm, in the form of fast convergence rates, and to improved practical behavior in terms of data efficiency.
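To make the general idea concrete, the following is a minimal sketch of an IRLS iteration for low-rank matrix completion in the spirit of the classical reweighted least squares approach of [6]: a weight matrix derived from a smoothed surrogate is recomputed from the current iterate, and the resulting weighted least squares problem with the observed entries as constraints is solved in closed form. This is not the speaker's algorithm (which uses tighter, harmonic-mean-type quadratic bounds); function names and parameter choices below are illustrative assumptions.

```python
# Minimal IRLS sketch for low-rank matrix completion with a smoothed
# nuclear-norm surrogate, in the spirit of [6]; NOT the tighter
# harmonic-mean weighting discussed in the talk. Names/defaults are illustrative.
import numpy as np

def irls_complete(M_obs, mask, rank, n_iters=100, eps=1.0):
    """M_obs: observed entries (zeros elsewhere); mask: boolean array of same shape."""
    d1, d2 = M_obs.shape
    X = M_obs.astype(float).copy()
    for _ in range(n_iters):
        # SVD of the current iterate; pad the singular values to length d1
        U, s, _ = np.linalg.svd(X, full_matrices=True)
        s_full = np.zeros(d1)
        s_full[:s.size] = s
        # Decrease the smoothing parameter based on the (rank+1)-st singular value
        if rank < d1:
            eps = max(min(eps, s_full[rank]), 1e-12)
        # Weight matrix of the quadratic upper bound: W = (X X^T + eps^2 I)^(-1/2)
        W = (U * (1.0 / np.sqrt(s_full**2 + eps**2))) @ U.T
        # Weighted least squares step: for each column, minimize x^T W x
        # subject to the observed entries of that column (closed-form solution).
        for j in range(d2):
            obs, un = mask[:, j], ~mask[:, j]
            if un.any() and obs.any():
                X[un, j] = -np.linalg.solve(W[np.ix_(un, un)],
                                            W[np.ix_(un, obs)] @ M_obs[obs, j])
    return X

# Tiny synthetic example: rank-2 matrix, about half of the entries observed.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(A.shape) < 0.5
X_hat = irls_complete(np.where(mask, A, 0.0), mask, rank=2)
print("relative error:", np.linalg.norm(X_hat - A) / np.linalg.norm(A))
```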

Within this framework, we propose an IRLS algorithm for the recovery of low-rank matrices that additionally carry a linear structure, such as Hankel or Toeplitz matrices. Such matrices are relevant for the harmonic retrieval problem and for system identification.
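As a small, self-contained illustration of why this structure arises in harmonic retrieval (not taken from the talk): a signal that is a sum of r complex exponentials generates a Hankel matrix of rank r, so recovering the signal from few samples can be cast as structured low-rank matrix completion. The frequencies below are hypothetical.

```python
# Illustrative only: a sum of r complex exponentials yields a rank-r Hankel matrix.
import numpy as np
from scipy.linalg import hankel

n, freqs = 64, [0.11, 0.23, 0.37]                 # hypothetical frequencies, r = 3
t = np.arange(2 * n - 1)
signal = sum(np.exp(2j * np.pi * f * t) for f in freqs)

H = hankel(signal[:n], signal[n - 1:])            # n x n Hankel matrix from 2n - 1 samples
print(np.linalg.matrix_rank(H))                   # prints 3 = number of exponentials
```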

For Hankel matrix completion, we establish a local convergence guarantee with quadratic rate for the IRLS strategy, provided that the given entries of the matrix are sampled uniformly at random and the matrix to be completed satisfies a suitable coherence condition. Our strategy combines computational efficiency, since the dimensionality of its optimization variables scales sub-linearly in the matrix dimensions, with a favorable data efficiency that can be observed in experiments on completion tasks with few samples.
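The sub-linear scaling mentioned above can be made plausible with a back-of-the-envelope count (illustrative, not from the talk): an n x n Hankel matrix is fully determined by its generating vector of length 2n - 1, so a structured parametrization needs only on the order of n optimization variables even though the matrix itself has n^2 entries.

```python
# Entries of an n x n matrix vs. parameters of its Hankel parametrization.
for n in (100, 1_000, 10_000):
    print(f"n = {n}: {n * n} matrix entries vs {2 * n - 1} Hankel parameters")
```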

Joint work with Juliane Sigl and Claudio Mayrink Verdun (Technical University of Munich).
This talk is based on the works [1, 2, 3, 4]. Related work: [5, 6, 7].
1. C. Kümmerle and J. Sigl, “Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery,” Journal of Machine Learning Research, vol. 19, no. 47, pp. 1–49, 2018.
2. C. Kümmerle and C. M. Verdun, “Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares,” in 13th International Conference on Sampling Theory and Applications (SampTA), 2019.
3. C. Kümmerle and C. M. Verdun, “On the Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares,” preprint in preparation.
4. C. Kümmerle, Understanding and Enhancing Data Recovery Algorithms: From Noise-Blind Sparse Recovery to Reweighted Methods for Low-Rank Matrix Optimization. PhD thesis, Technical University of Munich, in preparation.
5. I. Daubechies, R. DeVore, M. Fornasier, and C. Güntürk, “Iteratively reweighted least squares minimization for sparse recovery,” Commun. Pure Appl. Math., vol. 63, pp. 1–38, 2010.
6. M. Fornasier, H. Rauhut, and R. Ward, “Low-rank matrix recovery via iteratively reweighted least squares minimization,” SIAM J. Optim., vol. 21, no. 4, pp. 1614–1640, 2011.
7. K. Mohan and M. Fazel, “Iterative reweighted algorithms for matrix rank minimization,” J. Mach. Learn. Res., vol. 13, no. 1, pp. 3441–3473, 2012.
