Course information

Machine learning is the construction and study of systems that can automatically learn from data. With the massive datasets commonly encountered today, the need for powerful machine learning methods is acute. Examples of successful applications include effective web search, anti-spam software, computer vision, robotics, practical speech recognition, and a deeper understanding of the human genome. This course gives an introduction to this exciting field. We will focus on supervised learning problems, such as classification and ranking, and unsupervised learning problems, such as clustering and dimensionality reduction. We will study classical algorithms and introduce tools to measure their performance as well as their computational complexity.

Related courses

  • CR01: Using Randomness in Science
  • CR07: Algorithms for Molecular Biology
  • CR08: Combinatorial Scientific Computing


Grading

  • Homeworks: 33%
  • Project: 67%

Course outline


Learning theory

  • Empirical risk minimization
  • Risk convexification and regularization
  • Bias-variance trade-off, and risk bounds
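
As a small illustration of regularized empirical risk minimization (not part of the course material; names and data are made up), minimizing the empirical squared risk plus an ℓ2 penalty, i.e. ridge regression from the outline below, admits a closed-form minimizer:

```python
import numpy as np

def ridge(X, y, lam):
    """Regularized ERM with squared loss:
        minimize (1/n) * ||X w - y||^2 + lam * ||w||^2
    Closed-form minimizer: w = (X^T X + n*lam*I)^{-1} X^T y."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

# Toy data: y is an exact linear function of X, so lam = 0 recovers the
# true weights, while lam > 0 shrinks the solution toward zero.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = X @ np.array([1.0, 2.0])
w_ls = ridge(X, y, 0.0)   # ordinary least squares
w_reg = ridge(X, y, 1.0)  # regularized: smaller norm
```

With `lam = 0` this is plain empirical risk minimization; increasing `lam` trades a larger empirical risk for a smaller-norm (lower-variance) estimator, which is the bias-variance trade-off in its simplest form.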

Supervised learning

  • Ridge regression
  • Logistic regression
  • Perceptron and neural networks
  • Support vector machines
  • Other methods: nearest-neighbors, kernel methods, etc.
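
For a flavor of the simplest method on this list, here is a minimal sketch of the perceptron (an illustrative implementation, not course code): whenever a point is misclassified, the weight vector is nudged toward that point with the sign of its label.

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """Rosenblatt's perceptron: learn w so that sign(X @ w) matches y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w) <= 0:  # misclassified (or on the decision boundary)
                w += yi * xi        # move the hyperplane toward xi
    return w

# Toy linearly separable data: the label is the sign of the first coordinate
X = np.array([[2.0, 1.0], [1.5, -0.5], [-1.0, 0.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
preds = np.sign(X @ w)
```

On linearly separable data the update provably converges after finitely many mistakes; on non-separable data it cycles, which motivates the convexified losses used by logistic regression and support vector machines.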

Unsupervised learning

  • Principal component analysis
  • Data clustering
  • Other methods: canonical correlation analysis, sparse coding, etc.
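
Principal component analysis, the first item above, can be sketched via the SVD of the centered data matrix (the function name and toy data below are illustrative assumptions, not course material):

```python
import numpy as np

def pca(X, k):
    """Project centered data onto its top-k principal directions via SVD."""
    Xc = X - X.mean(axis=0)  # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]  # low-dimensional scores and principal axes

# Points lying almost on a line: a single component captures nearly all variance
X = np.array([[0.0, 0.0], [1.0, 1.1], [2.0, 1.9], [3.0, 3.05]])
Z, axes = pca(X, 1)  # Z: (4, 1) scores, axes: (1, 2) principal direction
```

The rows of `Vt` are the eigenvectors of the empirical covariance matrix, so keeping the top `k` of them gives the rank-`k` projection with the smallest reconstruction error.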

Reading material

Machine Learning and Statistics

  • V. Vapnik. The Nature of Statistical Learning Theory. Springer.
  • T. Hastie, R. Tibshirani, J. Friedman. The Elements of Statistical Learning. (free online)
  • L. Devroye, L. Györfi, G. Lugosi. A Probabilistic Theory of Pattern Recognition. Springer.
  • D. Dubhashi, A. Panconesi. Concentration of Measure for the Analysis of Randomized Algorithms. Cambridge University Press.
  • J. Shawe-Taylor, N. Cristianini. Kernel Methods for Pattern Analysis. 2004.


Optimization

  • S. Boyd, L. Vandenberghe. Convex Optimization. 2004. (free online)
  • D. Bertsekas. Nonlinear Programming. 2003.


Schedule

Date   Lecturer  Topic                                        Scribes
20/09  ZH        Introduction + Perceptron                    Sergio Peignier, Semen Marchuk
04/10  ZH        Kernel Methods + Stochastic Learning         Sébastien Maulat, Massimiliano Fasi
18/10  JS        Stochastic Learning + Risk Bounds            Bertrand Simon, Antoine Plet
08/11  JM        Supervised Learning - Methodology            Jordan Frecon, Merve Ünlü
22/11  JM        Examples of Kernels - Unsupervised Learning  Rémi De Joannis de Verclos, Karthik Srikanta
06/12  JS        Random Projections - Nearest Neighbors

Scribe notes

For each course, a pair of students commits to turning their notes into LaTeX. A handy package for producing nice notes, due to S. Maulat and A. Plet, can be found here.


Homework

During each course, a few exercises are given. They should be handed in to the lecturers during the last course, on December 6th (no need to use LaTeX here).


Project

The project consists of implementing the method from an article, running some experiments, and writing a short report (fewer than 10 pages). It is also possible to study a theoretical paper instead of implementing a method. All reports should be written in LaTeX, and a PDF should be sent to the lecturers before January 5th.
Project                                           Student(s)                                    Coach
Supervised classification of text documents       Rémi De Joannis de Verclos, Sébastien Maulat
Machine learning for games                        Massimiliano Fasi                             JS
Face recognition                                  Semen Marchuk, Bertrand Simon
Predicting Molecular Activity with Graph Kernels  Sergio Peignier                               JM
Speaker Recognition                               Antoine Plet, Thomas Sibut-Pinote
Active Learning                                   Guillaume Sergent                             ZH, JS
Spectral Clustering Theory                        Karthik Srikanta                              ZH
Supervised classification of Flickr images        Merve Ünlü                                    JS

Jobs / Internships Opportunities

We have many internship and PhD opportunities in machine learning and its applications, such as image processing, bioinformatics, and computer vision. Please contact the lecturers for more information. Some topics are detailed below:
  • Large-Scale Machine Learning for Statistical Genetics with Julien Mairal and Michael Blum (Grenoble)
  • Machine Learning for web-link prediction, with applications to Knowledge Graphs, with Z. Harchaoui, M. Amini, and A. Juditsky (Grenoble)
  • Statistical aggregation: greedy algorithms vs. MCMC, with Joseph Salmon (Paris)