Master 2 — Sparsity and High Dimensions

Winter quarter (January–March 2026) · École Centrale Lyon · Monday 2:00–5:15pm

Course Description

Sparsity and convexity are ubiquitous concepts in machine learning and statistics. In this course, we study the mathematical foundations of powerful methods based on convex relaxation, such as L1-regularization techniques in Statistics and Signal Processing. The theoretical part of the course focuses on the performance guarantees of these algorithms under the sparsity assumption. The practical part presents standard solvers for these learning problems.
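To make the "standard solvers" mentioned above concrete, here is a minimal sketch (not taken from the course materials; the problem data A, b and the parameters lam, n_iter are hypothetical) of ISTA, the iterative shrinkage-thresholding algorithm, a basic proximal-gradient solver for the L1-regularized least-squares (Lasso) problem, written with NumPy.

```python
# Illustrative sketch only: ISTA for the Lasso problem
#   min_x 0.5 * ||A x - b||_2^2 + lam * ||x||_1
# A, b, lam and n_iter below are hypothetical toy values.
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (entrywise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    """Proximal gradient (ISTA) iterations for the Lasso objective."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of 0.5 * ||A x - b||^2
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy usage: recover a sparse vector from a few random linear measurements.
rng = np.random.default_rng(0)
n, d, s = 50, 200, 5                       # measurements, dimension, sparsity
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[rng.choice(d, s, replace=False)] = 1.0
b = A @ x_true
x_hat = ista(A, b, lam=0.01)
print("estimated support:", np.nonzero(np.abs(x_hat) > 1e-3)[0])
```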

Bibliography

  • [Foucart] A Mathematical Introduction to Compressive Sensing, Simon Foucart and Holger Rauhut, Applied and Numerical Harmonic Analysis (ANHA), Springer.
  • [Bubeck] Convex Optimization: Algorithms and Complexity, Sébastien Bubeck, Foundations and Trends in Machine Learning.
  • [Blondel] The Elements of Differentiable Programming, Mathieu Blondel and Vincent Roulet.
  • [Wainwright] High-Dimensional Statistics: A Non-Asymptotic Viewpoint, Martin J. Wainwright, Cambridge University Press.
  • [Boyd] Convex Optimization, Stephen Boyd and Lieven Vandenberghe, Cambridge University Press.
  • [d'Aspremont] Acceleration Methods, Alexandre d'Aspremont, Damien Scieur and Adrien Taylor.

Projects

Guidelines: projects.pdf.
Please email Rémi and Yohann the names of your group members and your 3 reference papers, ranked by preference.

Suggested references:

  • [SR-Lasso] C. Poon and G. Peyré, "Super-Resolved Lasso". link
  • [Prox-P&P] S. Hurault, A. Leclaire, and N. Papadakis, "Proximal Denoiser for Convergent Plug-and-Play Optimization with Nonconvex Regularization". link
  • [Gap Safe] E. Ndiaye, O. Fercoq, A. Gramfort, and J. Salmon, “Gap Safe Screening Rules for Sparsity Enforcing Penalties”, J. Mach. Learn. Res., 2017. link
  • [Low-Rank NSP] M. Kabanava, R. Kueng, and H. Rauhut, “Stable low-rank matrix recovery via null space properties,” Information and Inference, vol. 5, no. 4, pp. 405–441, 2016. link
  • [FW] Q. Denoyelle, V. Duval, G. Peyré, and E. Soubies, "The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy", Inverse Problems, vol. 36, no. 1, p. 014001, Jan. 2020. link
  • [Rep.IR] J. Mairal, M. Elad, and G. Sapiro, "Sparse Representation for Color Image Restoration", IEEE Transactions on Image Processing, vol. 17, no. 1, pp. 53–69, 2008. doi:10.1109/tip.2007.911828
  • [Group] L. Jacob, G. Obozinski, and J.-P. Vert, "Group lasso with overlap and graph lasso", in Proceedings of the 26th Annual International Conference on Machine Learning (ICML), 2009, pp. 433–440. pdf
  • [Sketch] Two papers on sketching:
    A. Chatalic, R. Gribonval, and N. Keriven, "Large-Scale High-Dimensional Clustering with Fast Sketching", in ICASSP, 2018.
    R. Gribonval, A. Chatalic, N. Keriven, V. Schellekens, L. Jacques, and P. Schniter, "Sketching Datasets for Large-Scale Learning (long version)", arXiv preprint arXiv:2008.01839.
  • [Gen-Langevin] T. V. Nguyen, G. Jagatap, and C. Hegde, "Provable Compressed Sensing with Generative Priors via Langevin Dynamics", IEEE Transactions on Information Theory, vol. 68, no. 11, pp. 7410–7422, 2022. link
  • [Implicit Bias] C. You, Z. Zhu, Q. Qu, and Y. Ma, "Robust Recovery via Implicit Bias of Discrepant Learning Rates", Advances in Neural Information Processing Systems (NeurIPS), 2020. link
  • [Sparse iOT] F. Andrade, G. Peyré, and C. Poon, "Sparsistency for Inverse Optimal Transport", International Conference on Learning Representations (ICLR), 2024. link

Past exams

Clarifications (ECL only)

We offer two Master-level courses, « Optimisation et Représentation Parcimonieuse » (ORP/Master) and « Parcimonie et Grande Dimension » (PGD/Master), as well as a third-year Centrale course « Parcimonie et Grande Dimension » (PGD/Option). All of these courses appear under the code I_G_S09_FO_MIR3_1 on the registrar's portal. The schedule below covers the Master courses. In case of doubt, the information on the official portal prevails.

There are two exams:
Exam1 -- TBA, for everyone (Centrale, ENS, and Lyon 1 students, whether in a Master or not, in the option or not)
Exam2 -- TBA, for Centrale students in the option (whether in a Master or not)

In PGD/Option there are therefore two exams, and the final grade = (Exam1 + Exam2)/2.
In ORP/Master, the MeA Master grade will be based on Exam2 and a forthcoming exam/lab assignment (3 students as far as I know).
In PGD/Master, Exam1 and a project grade (the project session took place yesterday) make up the final grade = (Exam1 + Projects)/2.

Centrale students in the GRAF Master will receive a GRAF grade = (Exam1 + Projects)/2.

In short, there is a single course, PGD, with different audiences and grading schemes.

For EXAM1, everyone must attend: Centrale students and the MeA, MA, and GRAF Master students.
For EXAM2, only Centrale students must attend.

Schedule

Students     | Date & time    | Lecturer      | Type      | Materials / location
All students | Jan 26 (2–5pm) | Yohann        | Lecture 1 | SALLE 19, W1bis building
All students | Feb 02 (2–5pm) | Yohann        | Lecture 2 | AMPHI 203, W1 building
All students | Feb 09 (2–5pm) | Rémi          | Lecture 3 | AMPHI 203, W1 building
All students | Feb 23 (2–5pm) | Rémi          | Lecture 4 | AMPHI 203, W1 building
All students | Mar 02 (2–5pm) | Rémi          | Lecture 5 | AMPHI 203, W1 building
All students | Mar 16 (2–5pm) | Rémi & Yohann | Projects  | Guidelines; SALLE 112, W1bis building
All students | TBA            | ECL           | Exam 1    |

Class time and location

Winter quarter (January–March 2026).
Lecture: Monday 2:00–5:15pm.
École Centrale Lyon.