Course ID ENS Lyon: Inverse problems and high dimension - MATH5212

Course ID ECL: Parcimonie et grande dimension - S22_ING_S09_MIR3_1

Course Description

Sparsity and convexity are ubiquitous notions in Machine Learning and Statistics. In this course, we study the mathematical foundations of some powerful methods based on convex relaxation: L1-regularisation techniques in Statistics and Signal Processing. The theoretical part of the course will focus on the guarantees of these algorithms under the sparsity assumption. The practical part of this course will present the standard solvers of these learning problems.
Keywords: L1-regularisation; proximal methods.
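
To make these keywords concrete, here is a minimal sketch (illustrative only, not course material) of a proximal method for an L1-regularised problem: ISTA applied to the lasso, min_x 0.5||Ax − b||² + λ||x||₁. The soft-thresholding step is the proximal operator of the L1 norm. All sizes, the regularisation weight, and the function names below are assumptions made for the example.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1: shrink each entry toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    # ISTA for min_x 0.5 * ||Ax - b||_2^2 + lam * ||x||_1:
    # gradient step on the smooth term, then soft-thresholding.
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Synthetic example: a 3-sparse vector recovered from 50 noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 27, 64]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

The step size 1/L guarantees descent; accelerated variants (FISTA) and the convergence rates behind them are the subject of the optimization lectures below.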

Bibliography

  • [Boyd] Convex Optimization, Stephen Boyd and Lieven Vandenberghe, Cambridge University Press
  • [Foucart] A Mathematical Introduction to Compressive Sensing, Simon Foucart and Holger Rauhut, Applied and Numerical Harmonic Analysis (ANHA), Springer
  • [Wainwright] High-Dimensional Statistics: A Non-Asymptotic Viewpoint, Martin J. Wainwright, Cambridge University Press
  • [Bubeck] Convex Optimization: Algorithms and Complexity, Sébastien Bubeck, Foundations and Trends in Machine Learning.
  • [Udell] Generalized Low Rank Models, Madeleine Udell, Corinne Horn, Reza Zadeh, and Stephen Boyd, Foundations and Trends in Machine Learning.

Course Organizers

Yohann De Castro and Rémi Gribonval (see Contact Info below).

Projects

The guidelines can be found in this document. The pairs ([reference paper], {student names}) are:
  • [FW], GARCIA CRISTOBAL Julia, DOUTRES Mathieu, LAMBERT Guerlain, LUTZ Killian;
  • [PCA], OUZZANE Aniss, BRAULT Baptiste, AGBULUT Akin, TAHLIL Abdelilah;
  • [Rep.SR], SCANVIC Jérémy, OZDEMIR Afsin, CHAPPON Edouard, SECHAUD Victor;
  • [Sketch], OUNNOUGHENE Gregoire, BESSONE Guillaume, GALTIER Timothée, TAIEB Ruben;
  • [Rep.IR], TAGHADOUI Said, EZ-ZAHRAOUY Ibrahim, CHABALI Kamal, BEN AMRANE Aymen, LIMOURI Yassir;
  • [Group], LABE Jean-Christophe, CHANET Gaëtan, PETITJEAN Antoine, ESPUNA Victor.
Based on the references:
  • [Gap Safe] E. Ndiaye, O. Fercoq, A. Gramfort, and J. Salmon, “Gap Safe Screening Rules for Sparsity Enforcing Penalties,” J. Mach. Learn. Res., 2017.
  • [Low-Rank NSP] M. Kabanava, R. Kueng, and H. Rauhut, “Stable low-rank matrix recovery via null space properties,” Information and Inference, vol. 5, no. 4, pp. 405–441, 2016.
  • [FW] Q. Denoyelle, V. Duval, G. Peyré, and E. Soubies, “The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy,” Inverse problems, vol. 36, no. 1, p. 014001, Jan. 2020.
  • [PCA] Bouwmans, T., & Zahzah, E.-H. (2014). Robust PCA via Principal Component Pursuit - A review for a comparative evaluation in video surveillance. Computer Vision and Image Understanding, 122, 22–34. http://doi.org/10.1016/j.cviu.2013.11.009
  • [Rep.SR] J. Yang, J. Wright, T. Huang, and Y. Ma, Image Super-Resolution Via Sparse Representation, IEEE Trans. Image Process., vol. 19, no. 11, pp. 2861–2873, November 2010. http://doi.org/10.1109/TIP.2010.2050625
  • [Rep.IR] Mairal, J., Elad, M., & Sapiro, G. (2008). Sparse Representation for Color Image Restoration. IEEE Transactions on Image Processing, 17(1), 53–69. http://doi.org/10.1109/tip.2007.911828
  • [Group] Jacob, L., Obozinski, G., & Vert, J. P. (2009, June). Group lasso with overlap and graph lasso. In Proceedings of the 26th annual international conference on machine learning (pp. 433-440). http://members.cbio.mines-paristech.fr/~ljacob/documents/overlasso.pdf
  • [Stat. Low-Rank] Vladimir Koltchinskii, Alexandre B. Tsybakov, Karim Lounici. Nuclear norm penalization and optimal rates for noisy low rank matrix completion. The Annals of Statistics, 39(5), 2302-2329, 2011. https://arxiv.org/pdf/1011.6256.pdf
  • [Sketch] Two papers on Sketching:
    Antoine Chatalic, Rémi Gribonval, Nicolas Keriven. Large-Scale High-Dimensional Clustering with Fast Sketching. In ICASSP, 2018.
    Rémi Gribonval, Antoine Chatalic, Nicolas Keriven, Vincent Schellekens, Laurent Jacques, Philip Schniter. Sketching Datasets for Large-Scale Learning (long version). arXiv preprint arXiv:2008.01839.

Class Time and Location

Winter quarter (January - March, 2023).
Lecture: Monday afternoon 2:00-5:15
Ecole Centrale Lyon, Building W1, Room 101 or 102 (second floor)

Contact Info

Yohann: yohann.de-castro@ec-lyon.fr
Rémi: remi.gribonval@inria.fr

Schedule

| Students involved | Date and time | Lecturer | Description | Materials and references |
| --- | --- | --- | --- | --- |
| S22_ING_S09_MIR3_1 (ECL students only) | Jan. 9, 2-5pm, room 101 | Julian | Optimization toolbox | Notes |
| S22_ING_S09_MIR3_1 (ECL students only) | Jan. 10, 9am-12pm | Julian | Optimization toolbox | [Boyd] Part I; Notebooks: Real Image; Conv. Opt. |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Jan. 23, 2-5pm | Rémi | Lecture 1 | [Foucart] Chapter 2 |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Jan. 30, 2-5pm | Yohann | Lecture 2 | [Boyd] Part I; [Foucart] Chapter 3; Convexity (theory) |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Feb. 6, 2-5pm | Rémi | Lecture 3 | |
| | Feb. 13 | | BREAK | |
| S22_ING_S09_MIR3_1 (ECL students only) | Feb. 20, 2-5pm | Yohann | Rates in Convex Optimization; Low Rank Models | [Bubeck] Chapter 3; [Udell] Chapter 2 |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Feb. 27, 2-5pm | Yohann | Lecture 4 | [Foucart] Chapters 4 and 6; Notes |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Mar. 6, 2-5pm | Rémi | Lecture 5 | |
| S22_ING_S09_MIR3_1 (ECL students only) | Mar. 13, 2-5pm | Yohann | Low Rank Models, course and practical | Notebooks: mc_solve.py; mc_util.py; Low Rank |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Mar. 20, 2-5pm | Examiners: Rémi and Yohann | Projects | Guidelines |
| S22_ING_S09_MIR3_1 and MATH5212 (full group) | Mar. 27, 2-5pm | Examiner: ECL staff | Exam | |
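
The Low Rank Models sessions use nuclear-norm penalisation for matrix completion (cf. [Udell] and [Stat. Low-Rank]). As a hedged illustration only, here is a minimal NumPy sketch of singular value thresholding (SVT, Cai, Candès and Shen) for matrix completion; the matrix sizes, threshold tau, and step size are illustrative choices, and this is not the code of mc_solve.py or mc_util.py.

```python
import numpy as np

def svt_complete(M_obs, mask, tau, step=1.2, n_iter=500):
    # SVT for matrix completion: iterate
    #   X = shrink singular values of Y by tau;
    #   Y = Y + step * P_Omega(M - X)   (dual ascent on observed entries).
    Y = np.zeros_like(M_obs)
    X = np.zeros_like(M_obs)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt   # singular value shrinkage
        Y += step * mask * (M_obs - X)
    return X

# Rank-2 ground truth, 60% of entries observed at random.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = (rng.random(M.shape) < 0.6).astype(float)
X_hat = svt_complete(M * mask, mask, tau=5 * np.sqrt(M.size))
rel_err = np.linalg.norm(X_hat - M) / np.linalg.norm(M)
```

The threshold tau = 5 * sqrt(n1 * n2) and the step size in (0, 2) follow the standard heuristics for SVT; with a rank-2 matrix and this sampling rate, the unobserved entries are recovered accurately.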