Course information

Statistical learning is about the construction and study of systems that can automatically learn from data. With the massive datasets commonly encountered today, the need for powerful machine learning methods is acute. Examples of successful applications include effective web search, anti-spam software, computer vision, robotics, practical speech recognition, and a deeper understanding of the human genome. This course gives an introduction to this exciting field, with a strong focus on kernel methods as a versatile tool to represent data, and on recent convolutional and recurrent neural network models for visual recognition and sequence modeling.

Evaluation

  • For MSIAM (3 ECTS): homework (1/2) + project (1/2)
  • For ENSIMAG-MMIS (1.75 ECTS): homework or project

Course outline

Introduction

  • Motivating example applications
  • Empirical risk minimization (formalized in the sketch after this list)
  • Bias-variance trade-off, and risk bounds
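For reference, the empirical risk minimization principle listed above, in its standard textbook form (generic notation, not necessarily the course's own):

```latex
% Empirical risk minimization (ERM): given samples (x_1,y_1),...,(x_n,y_n),
% a loss function \ell, and a hypothesis class \mathcal{F}, choose the
% function minimizing the average loss on the training data.
\hat{f} \in \operatorname*{arg\,min}_{f \in \mathcal{F}}
    \frac{1}{n} \sum_{i=1}^{n} \ell\bigl( f(x_i),\, y_i \bigr)
```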

Supervised learning with linear models

  • Risk convexification and regularization
  • Logistic regression (see the sketch after this list)
  • Support vector machines
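As a concrete taste of this part, here is a minimal, self-contained sketch (illustrative only, not course code) of l2-regularized logistic regression trained by gradient descent on synthetic data; the convex logistic loss plus the quadratic regularizer is exactly the kind of convexified, regularized risk discussed in the lectures:

```python
import numpy as np

# Synthetic binary classification data with labels in {-1, +1}.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true + 0.1 * rng.normal(size=n))

# l2-regularized logistic risk:
#   (1/n) * sum_i log(1 + exp(-y_i * w.x_i)) + (lam/2) * ||w||^2
# The objective is convex, so plain gradient descent converges.
lam, lr = 0.1, 0.5
w = np.zeros(d)
for _ in range(500):
    margins = y * (X @ w)
    sigma = 1.0 / (1.0 + np.exp(margins))   # = sigmoid(-margins)
    grad = -(X * (y * sigma)[:, None]).mean(axis=0) + lam * w
    w -= lr * grad

print("training accuracy:", np.mean(np.sign(X @ w) == y))
```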

Kernel Methods

  • Theory of RKHS and kernels
  • Supervised learning with kernels (see the sketch after this list)
  • Unsupervised learning with kernels
  • Kernels for structured data
  • Kernels for generative models
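To make the "supervised learning with kernels" item concrete, here is a minimal kernel ridge regression sketch with a Gaussian (RBF) kernel, written directly from the standard closed-form solution. The squared loss, ridge penalty, and all parameter values are illustrative assumptions, not the course's own code:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

# Kernel ridge regression: fit f(x) = sum_i alpha_i k(x_i, x)
# by solving the linear system (K + lam * n * I) alpha = y.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

lam = 1e-2
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)

# Predict on a small test grid.
X_test = np.linspace(-3, 3, 5)[:, None]
print(rbf_kernel(X_test, X) @ alpha)
```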

Deep learning models

  • Convolutional neural networks (see the sketch after this list)
  • Recurrent neural networks
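As a small illustration of the convolution operation at the heart of convolutional networks, here is a plain numpy sketch of a valid-mode 2D cross-correlation (the operation most deep learning libraries call "convolution"); the toy image and edge filter are assumptions for the example:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core op in CNN layers."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# Apply an edge-detecting filter to a toy image with a vertical step edge.
image = np.zeros((6, 6))
image[:, 3:] = 1.0
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
print(conv2d(image, sobel_x))   # responds strongly along the edge
```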

Reading material

Machine Learning and Statistics

  • Vapnik, The nature of statistical learning theory. Springer.
  • Hastie, Tibshirani, Friedman, The elements of statistical learning. Springer (free online).
  • Devroye, Györfi, Lugosi, A probabilistic theory of pattern recognition. Springer.
  • Shawe-Taylor, Cristianini, Kernel methods for pattern analysis. Cambridge University Press, 2004.
  • Bishop, Pattern recognition and machine learning. Springer, 2006.
  • Slides by Jean-Philippe Vert on kernel methods.

Calendar

Date      Lecturer  Topic
10/12/15  JV        Motivation + (non-)linear classification + introduction to kernels [slides] + bias-variance trade-off [slides]
17/12/15  JM        Kernels 1 [slides]
7/1/16    JM        Kernels 2
14/1/16   JM        Kernels 3
21/1/16   JV        Fisher kernels [slides] and convolutional neural networks [slides]
28/1/16   JV        Recurrent neural networks [slides]

Homework

Homework will be assigned during the course and is due on January 21, 2016. The homework is now available here. It may be submitted in printed form or electronically, and must be done individually.

Projects

The project consists of experimenting with a learning approach of your choice to solve a given prediction problem. You must submit a short two-page report describing what you did and the results you obtained, along with your code and results. Details can be found here. Projects can be done alone or in groups of two. Projects are due on February 14, 2016, on the Kaggle website. Reports are to be handed in by email by February 18, 2016 at the latest.

Jobs / Internship Opportunities

We have several internship/PhD opportunities in machine learning, image processing, bioinformatics, and computer vision. It is best to discuss this with us early, since the number of positions is limited.