Course ID ENS Lyon: Inverse problems and high dimension - MATH5212

Course ID ECL: S9 Advanced Tools for Learning: when Convexity meets Sparsity - S9 MD fo IM3.1

Course Description

Sparsity and convexity are ubiquitous notions in Machine Learning and Statistics. In this course, we study the mathematical foundations of some powerful methods based on convex relaxation: L1-regularisation techniques in Statistics and Signal Processing; Nuclear Norm minimization in Matrix Completion. These approaches turn out to be representable as Semi-Definite Programs (SDP) and hence tractable in practice. The theoretical part of the course will focus on the guarantees of these algorithms under the sparsity assumption. The practical part of this course will present the standard solvers of these learning problems.
Keywords: L1-regularisation; Matrix Completion; Semi-Definite Programming; Proximal methods;

"Nothing is more practical than a good theory." - V. Vapnik

Bibliography

  • [Giraud] Introduction to High-Dimensional Statistics, Christophe Giraud, Chapman and Hall/CRC
  • [Wainwright] High-Dimensional Statistics: A Non-Asymptotic Viewpoint, Martin J. Wainwright, Cambridge University Press
  • [Foucart] A Mathematical Introduction to Compressive Sensing, Simon Foucart and Holger Rauhut, Birkhäuser
  • [Le Gall] Intégration, Probabilités et Processus Aléatoires (PDF, 248 pages)

Course Organizers

Announcements

Due to the health situation, this course is online unless otherwise specified. Please follow the link below to join the course on Zoom.

Projects

The guidelines can be found in this document. Please send an email to both lecturers with your group (4 names) and your ordered list of three subjects by January 26th.
  • [1] Beck, A., & Teboulle, M. (2009). A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems. SIAM Journal on Imaging Sciences, 2(1), 183–202. http://doi.org/10.1137/080716542
  • [2] E. Ndiaye, O. Fercoq, A. Gramfort, and J. Salmon, “Gap Safe Screening Rules for Sparsity Enforcing Penalties.,” J. Mach. Learn. Res., 2017.
  • [3] M. Kabanava, R. Kueng, and H. Rauhut, “Stable low-rank matrix recovery via null space properties,” Information and Inference, vol. 5, no. 4, pp. 405–441, 2016.
  • [4] Q. Denoyelle, V. Duval, G. Peyré, and E. Soubies, “The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy,” Inverse problems, vol. 36, no. 1, p. 014001, Jan. 2020.
  • [5] Bouwmans, T., & Zahzah, E.-H. (2014). Robust PCA via Principal Component Pursuit - A review for a comparative evaluation in video surveillance. Computer Vision and Image Understanding, 122, 22–34. http://doi.org/10.1016/j.cviu.2013.11.009
  • [6] J. Yang, J. Wright, T. Huang, and Y. Ma, Image Super-Resolution Via Sparse Representation, IEEE Trans. Image Process., vol. 19, no. 11, pp. 2861 –2873, November 2010. http://doi.org/10.1109/TIP.2010.2050625
  • [7] Mairal, J., Elad, M., & Sapiro, G. (2008). Sparse Representation for Color Image Restoration. IEEE Transactions on Image Processing, 17(1), 53–69. http://doi.org/10.1109/tip.2007.911828
  • [8] Patrik O. Hoyer. Non-negative Matrix Factorization with Sparseness Constraints. Journal of machine learning research, 5(Nov):1457--1469, 2004. https://www.jmlr.org/papers/volume5/hoyer04a/hoyer04a.pdf
  • [9] Jacob, L., Obozinski, G., & Vert, J. P. (2009, June). Group lasso with overlap and graph lasso. In Proceedings of the 26th annual international conference on machine learning (pp. 433-440). http://members.cbio.mines-paristech.fr/~ljacob/documents/overlasso.pdf
  • [10] Loh, P. L., & Wainwright, M. J. (2011). High-dimensional regression with noisy and missing data: Provable guarantees with non-convexity. In Advances in Neural Information Processing Systems (pp. 2726-2734). https://papers.nips.cc/paper/2011/hash/ab541d874c7bc19ab77642849e02b89f-Abstract.html
  • [11] Berthet, Q., & Rigollet, P. (2013). Optimal detection of sparse principal components in high dimension. The Annals of Statistics, 41(4), 1780-1815.
  • [12] Vladimir Koltchinskii, Alexandre B. Tsybakov, Karim Lounici. Nuclear norm penalization and optimal rates for noisy low rank matrix completion. The Annals of Statistics, 39(5), 2302-2329. https://arxiv.org/pdf/1011.6256.pdf
  • [13] Two papers on Sketching:
    Antoine Chatalic, Rémi Gribonval, Nicolas Keriven. Large-Scale High-Dimensional Clustering with Fast Sketching. In ICASSP, 2018.
    Rémi Gribonval, Antoine Chatalic, Nicolas Keriven, Vincent Schellekens, Laurent Jacques, Philip Schniter. Sketching Datasets for Large-Scale Learning (long version). arXiv preprint arXiv:2008.01839.
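Several project topics above ([1], [5], [12]) revolve around nuclear norm minimization. As a hedged sketch (a toy noiseless low-rank completion setup of my own choosing, not taken from any of the papers listed), the proximal operator of the nuclear norm is singular value thresholding, which plugs into the same proximal gradient scheme as the Lasso:

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def complete(M_obs, mask, tau=0.5, n_iter=300):
    """Low-rank matrix completion by proximal gradient on the observed entries."""
    X = np.zeros_like(M_obs)
    for _ in range(n_iter):
        grad = mask * (X - M_obs)   # gradient of 0.5 * ||P_Omega(X - M)||_F^2
        X = svt(X - grad, tau)      # step size 1: this gradient is 1-Lipschitz
    return X

# Rank-2 matrix, roughly 60% of entries observed
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random((30, 30)) < 0.6
X_hat = complete(mask * M, mask)
```

The thresholding level `tau` and the number of iterations are arbitrary here; the theoretical choice of the penalty level is precisely what reference [12] analyzes.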

Videos Links

Yohann: Zoom link to Yohann's lectures

Rémi: BBB link to Rémi's lectures (be sure to have an account on ENSL)

Class Time and Location

Winter quarter (January-March 2021).
Lecture: Tuesday afternoons, 2:00-5:00pm or 2:00-6:00pm (see the schedule below)
Ecole Centrale Lyon, Building W1, Room 101 (second floor)

Contact Info

Yohann: yohann.de-castro@ec-lyon.fr
Rémi: remi.gribonval@inria.fr
| Students involved | Date and time | Lecturer | Description | Materials and references |
| --- | --- | --- | --- | --- |
| S9 MD fo IM3.1 (ECL students only) | Jan. 5, 2-6pm | Yohann | Probability toolbox | [Giraud; pp. 17, 22, 24, 219, 221, 223]; [Wainwright; Chap. 2]; Video 1; Video 2; Notes |
| S9 MD fo IM3.1 (ECL students only) | Jan. 12, 2-6pm | Yohann | Optimization toolbox | [Foucart; App. B]; Video 1; Notes |
| S9 MD fo IM3.1 and MATH5213 (full group) | Jan. 19, 2-5pm | Rémi | Lecture 1 | Video on BBB |
| S9 MD fo IM3.1 and MATH5213 (full group) | Jan. 26, 2-5pm | Rémi | Lecture 2 | |
| S9 MD fo IM3.1 (ECL students only) | Feb. 2, 2-6pm | Yohann | Statistical Learning toolbox | |
| S9 MD fo IM3.1 and MATH5213 (full group) | Feb. 9, 2-5pm | Yohann | Lecture 3 | |
| None | Feb. 16 | None | BREAK | |
| S9 MD fo IM3.1 and MATH5213 (full group) | Feb. 23, 2-5pm | Yohann | Lecture 4 | |
| S9 MD fo IM3.1 and MATH5213 (full group) | March 2, 2-5pm | Rémi | Lecture 5 + 1h of project presentation by each group (training) | |
| MATH5213 (ENSL students only) | March 9, 2-5pm | Rémi | Lecture 6 | |
| S9 MD fo IM3.1 and MATH5213 (full group) | March 16, 2-5pm | Rémi and Yohann (examiners) | Project evaluation | |
| S9 MD fo IM3.1 and MATH5213 (full group) | March 23, 2-4pm | Yohann (examiner) | Final examination | |