The aim of this course is to introduce the main problems and theoretical aspects of Machine Learning.

The focus will be mainly on supervised classification, with brief extensions to unsupervised learning (clustering) and regression.

Each lecture will also be an opportunity to focus on a particular technique or tool of general interest (such as deviation inequalities, statistical tests, stochastic optimization, etc.).

Reinforcement learning will not be addressed in this course: it is at the core of the course CR01: Optimal Decision Making and Online Optimization. Online learning will also be studied during the first winter school.

- 09.10 1. Introduction, nearest-neighbor classification
- 09.17 Pot du DI at IFE Descartes (meet your tutor)
- 09.24 2. k-nearest neighbors, deviations for averages
- 10.01 3. Kullback-Leibler divergence, PAC learning theory
- 10.08 4. No-free-lunch theorem, uniform convergence, VC dimension
- 10.15 5. Fundamental theorem of statistical learning, proof via the uniform convergence theorem for finite-VC-dimension classes
- 10.22 6. Computational complexity of learning
- 10.29 holidays
- 11.05 7. Linear classifiers, surrogate losses
- 11.12 winter school on Algorithmic aspects of data analysis and machine learning
- 11.19 8. Convex optimization for Machine Learning
- 11.26 winter school on Computer virology
- 12.03 Exam (2 hours)
- 12.10 9. Stochastic gradient descent
- 12.17 10. Regularization, stability and generalization; Support Vector Machines and Kernels
- 01.07 11. Classification Trees, Boosting, Bagging, Random Forests
- 01.14 12. Presentations of the research papers and challenge methodology
- 01.25 Dimensionality reduction (Master Math.en.Action), numerical experimentation on the MNIST dataset.
- 02.15 Penalized regression: LASSO (slides by Vivian Viallon)
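As a taste of the first lecture's topic (nearest-neighbor classification), here is a minimal 1-nearest-neighbor classifier sketched in NumPy; the data below is a purely illustrative toy example, not the MNIST experiment:

```python
import numpy as np

def nn_classify(X_train, y_train, x):
    """Predict the label of x as the label of its single nearest
    training point, under the Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)  # distances to all training points
    return y_train[np.argmin(dists)]             # label of the closest one

# Toy illustration (hypothetical data)
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
y_train = np.array([0, 1, 1])
print(nn_classify(X_train, y_train, np.array([0.9, 0.8])))  # nearest point is (1, 1), so prints 1
```

The k-nearest-neighbor rule of lecture 2 generalizes this by taking a majority vote among the k closest training points.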

Basic knowledge of probability theory, linear algebra, and real analysis.

In addition to homework and in-class exercises, students will choose between:

- a research article to analyze
- participation in the Défi IA 2019 (the presentation videos are available).
**Warning: the competition closes on 01.13 at 9pm. After that, no submission will be possible. Only the last submission will be taken into account in the final ranking.**

In both cases, students will prepare a written report and an oral presentation. The final grade will be based on all of these.

- An internship offer at Airbus
- Airbus is launching a new data science challenge, open to students (and to seasoned researchers too!)
- A very accessible yet rather informative keynote
- Internship propositions at Institut de Mathématiques de Toulouse : Statistical Inference under Differential Privacy, Theoretical challenges in deep learning, Sequential learning theory
- Internship and PhD proposition: Deep learning for texts and knowledge bases access at IRIT and Renault
- Internship: Active and semi-supervised learning using topological data analysis in Grenoble
- Internship: Recommendation / bandits at Moobifun
- Internship: Mathematics of deep learning at Institut de Mathématiques de Toulouse
- Internship: Christoffel functions for Machine Learning at Institut de Mathématiques de Toulouse
- Internship: Explainable AI at Institut de Mathématiques de Toulouse

- Understanding Machine Learning: From Theory to Algorithms, *by Shai Shalev-Shwartz and Shai Ben-David*
- A Probabilistic Theory of Pattern Recognition, *by Luc Devroye, Laszlo Gyorfi and Gabor Lugosi*
- The Elements of Statistical Learning, *by Trevor Hastie, Robert Tibshirani and Jerome Friedman*
- Introduction to Nonparametric Estimation, *by Alexander Tsybakov*
- Lecture notes on advanced Statistical Learning, *by Martin Wainwright*