Course ID ENS Lyon: Inverse problems and parsimony - MATH5232

Course ID ECL: Parcimonie et grande dimension - I_G_S09_FO_MIR3_1, M_MAS_MEA_S3_OPT_01

Course Description

Sparsity and convexity are ubiquitous notions in Machine Learning and Statistics. In this course, we study the mathematical foundations of some powerful methods based on convex relaxation: L1-regularisation techniques in Statistics and Signal Processing. The theoretical part of the course focuses on the guarantees of these algorithms under the sparsity assumption. The practical part presents standard solvers for these learning problems.
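
For illustration only, here is a minimal, self-contained sketch of the kind of L1-regularised problem the course studies: the Lasso, min_x 0.5*||Ax - y||^2 + lambda*||x||_1, solved by iterative soft-thresholding (ISTA). This is not part of the course material; NumPy, the choice of ISTA, and all names and parameter values below are illustrative assumptions.

```python
# Illustrative sketch (not course material): ISTA for the Lasso
#   min_x  0.5*||A x - y||^2 + lam*||x||_1
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*||.||_1 (component-wise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    # Proximal-gradient (ISTA) iterations with constant step 1/L.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of 0.5*||A x - y||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy usage: recover a sparse vector from a few random linear measurements.
rng = np.random.default_rng(0)
n, d, s = 50, 200, 5                       # measurements, dimension, sparsity
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[rng.choice(d, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x_true
x_hat = ista(A, y, lam=0.01)
print("estimated support:", np.nonzero(np.abs(x_hat) > 1e-3)[0])
```

Accelerated or coordinate-descent variants would typically be preferred in practice; ISTA is used here only because it is the shortest correct example.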

Bibliography

  • [Foucart] A Mathematical Introduction to Compressive Sensing, Simon Foucart and Holger Rauhut, Applied and Numerical Harmonic Analysis (ANHA), Springer.
  • [Bubeck] Convex Optimization: Algorithms and Complexity, Sébastien Bubeck, Foundations and Trends in Machine Learning.
  • [Blondel] The Elements of Differentiable Programming, Mathieu Blondel and Vincent Roulet.
  • [Wainwright] High-Dimensional Statistics: A Non-Asymptotic Viewpoint, Martin J. Wainwright, Cambridge University Press.
  • [Boyd] Convex Optimization, Stephen Boyd and Lieven Vandenberghe, Cambridge University Press.
  • [d'Aspremont] Acceleration Methods, Alexandre d'Aspremont, Damien Scieur and Adrien Taylor.

Course Organizers

Yohann De Castro (École Centrale de Lyon) and Rémi Gribonval (Inria); see Contact Info below.
Projects

The guidelines can be found in this document. Please send a list of 3 reference papers, chosen from the list below, together with your names, by email to Rémi and Yohann.
The projects are based on the following references:
  • [SR-Lasso] C. Poon and G. Peyré, "Super-Resolved Lasso" (https://arxiv.org/abs/2311.09928)
  • [Prox-P&P] S. Hurault, A. Leclaire, and N. Papadakis, "Proximal Denoiser for Convergent Plug-and-Play Optimization with Nonconvex Regularization" (https://arxiv.org/abs/2201.13256)
  • [Gap Safe] E. Ndiaye, O. Fercoq, A. Gramfort, and J. Salmon, "Gap Safe Screening Rules for Sparsity Enforcing Penalties", J. Mach. Learn. Res., 2017.
  • [Low-Rank NSP] M. Kabanava, R. Kueng, and H. Rauhut, "Stable low-rank matrix recovery via null space properties", Information and Inference, vol. 5, no. 4, pp. 405–441, 2016.
  • [FW] Q. Denoyelle, V. Duval, G. Peyré, and E. Soubies, "The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy", Inverse Problems, vol. 36, no. 1, p. 014001, Jan. 2020.
  • [Rep.IR] J. Mairal, M. Elad, and G. Sapiro, "Sparse Representation for Color Image Restoration", IEEE Transactions on Image Processing, vol. 17, no. 1, pp. 53–69, 2008. http://doi.org/10.1109/tip.2007.911828
  • [Group] L. Jacob, G. Obozinski, and J.-P. Vert, "Group lasso with overlap and graph lasso", in Proceedings of the 26th Annual International Conference on Machine Learning, 2009, pp. 433–440. http://members.cbio.mines-paristech.fr/~ljacob/documents/overlasso.pdf
  • [Sketch] Two papers on Sketching:
    A. Chatalic, R. Gribonval, and N. Keriven, "Large-Scale High-Dimensional Clustering with Fast Sketching", in ICASSP, 2018.
    R. Gribonval, A. Chatalic, N. Keriven, V. Schellekens, L. Jacques, and P. Schniter, "Sketching Datasets for Large-Scale Learning" (long version), arXiv preprint arXiv:2008.01839.

Past exams

2023_exam

Clarifications

We have two Master's courses, "Optimisation et Représentation Parcimonieuse" (ORP/Master) and "Parcimonie et Grande Dimension" (PGD/Master), and one third-year Centrale course, "Parcimonie et Grande Dimension" (PGD/Option).

There are two exams:
Exam1 -- Monday 24 March 2025, 2pm to 4pm, for everyone (Centrale, ENS and Lyon 1 students, whether in a Master or not, in the option or not)
Exam2 -- TBA, for Centrale students in the option (whether in a Master or not)

In PGD/Option there are therefore two exams, and the final grade = (Exam1+Exam2)/2.
In ORP/Master, the MeA Master grade will be based on Exam2 and a forthcoming exam/project assignment (3 students as far as I know).
In PGD/Master, the grade of the first exam, Exam1, and a project grade (the project session took place yesterday) make up the final grade = (Exam1+Projects)/2.

Centrale students in the GRAF Master will receive a grade GRAF = (Exam1+Projects)/2.

There is therefore a single course, the PGD course, with differentiated audiences and grading.

For Monday 24 March 2025, everyone is expected to attend: Centrale students and the MeA, MA, and GRAF Master students.
For the TBA date in March 2025, only Centrale students are expected to attend.

Class Time and Location

Winter quarter (January - March, 2025).
Lectures: Monday afternoons, 2:00-5:15 pm
École Centrale de Lyon

Contact Info

Yohann: yohann.de-castro@ec-lyon.fr
Rémi: remi.gribonval@inria.fr

Schedule

Students involved | Date and time | Lecturer | Description | Materials / location
Full group | Jan. 20, 2-5pm | Rémi | Lecture 1 | -
Full group | Jan. 27, 2-5pm | Rémi | Lecture 2 | Salle 19, W1bis building
Full group | Feb. 3, 2-5pm | Yohann | Lecture 3 | Salle 19, W1bis building
Full group | Feb. 10, 2-5pm | Rémi | Lecture 4 | Salle 19, W1bis building
Full group | Feb. 17, 2-5pm | Yohann | Lecture 5 | Amphi 201, W1 building
None | Feb. 24 | None | BREAK | -
None | Mar. 3, 2-5pm | None | BREAK | -
Centrale only | Mar. 10, 2-5pm | Yohann | Course on optimization | Salle 103
Full group | Mar. 17, 2-5pm | Examiners: Rémi and Yohann | Projects | Guidelines; Salle 19, W1bis building
Full group | Mar. 24, 2-5pm | Examiner: ECL staff | Exam | -