### Course information

Statistical learning is about the construction and study of systems that can automatically learn from data. With the emergence of the massive datasets commonly encountered today, the need for powerful machine learning methods is of acute importance. Examples of successful applications include effective web search, anti-spam software, computer vision, robotics, practical speech recognition, and a deeper understanding of the human genome. This course gives an introduction to this exciting field, with a strong focus on kernel methods as a versatile tool to represent data, and on recent convolutional and recurrent neural network models for visual recognition and sequence modeling.

#### Evaluation

- homework (1/2) + project (1/2)

### Course outline

#### Introduction

- Motivating example applications
- Linear classification models

#### Deep learning models

- Convolutional neural networks
- Recurrent neural networks

#### Kernel Methods

- Theory of RKHS and kernels
- Supervised learning with kernels
- Unsupervised learning with kernels
- Kernels for structured data
- Kernels for generative models
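The kernel trick underlying the topics above can be illustrated with a minimal NumPy sketch (function names are hypothetical, not part of the course material): a degree-2 polynomial kernel on 2-D inputs yields the same value as an inner product in an explicit quadratic feature space, which is what lets kernel methods work in that space without ever constructing it.

```python
import numpy as np

def poly_kernel(x, y):
    # Homogeneous polynomial kernel of degree 2: k(x, y) = (x . y)^2
    return np.dot(x, y) ** 2

def feature_map(x):
    # Explicit feature map phi for 2-D inputs such that
    # phi(x) . phi(y) = (x . y)^2
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

k = poly_kernel(x, y)                          # (1*3 + 2*(-1))^2 = 1
k_explicit = feature_map(x) @ feature_map(y)   # same value, computed in feature space
assert np.isclose(k, k_explicit)
```

For degree-d kernels in n dimensions the explicit feature space has O(n^d) coordinates, while the kernel evaluates in O(n) time, which is the practical point of the trick.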

### Calendar

Classes take place from 8:15 to 11:15 on the following dates, except on 17/1/2017 (2-5 PM).

Date | Lecturer | Room | Topic
---|---|---|---
1/12/16 | JV | H203 | Introduction: linear classification, kernel trick [slides], Fisher kernel [slides]
8/12/16 | | | NO COURSE
15/12/16 | JV | H203 | Neural networks, convolutional and recurrent [slides]
5/1/17 | JM | H204 | Kernels, RKHS, kernel trick [slides]
12/1/17 | JM | H204 | Kernels, supervised learning
17/1/17 (2-5 PM) | JV | H203 | Recurrent and unsupervised deep learning [slides]
19/1/17 | JM | H204 | |

### Homeworks

There is one homework given during the course, **to be handed in on February 6th, 2017.** The homework is now available here and will count for 50% of the grade. It can be done in groups of 2 students, and should be sent by e-mail (a PDF file compiled from the LaTeX template) to julien.mairal@m4x.org. A LaTeX template is available here.

### Projects

The project consists of experimenting with a learning approach of your choice to solve a given prediction problem. A short (2-page max) written report has to be submitted, describing what you did and the results obtained. Code and results should also be submitted. **Details can be found here.** Projects can be done alone or in groups of two people, but you cannot do your homework and the data challenge with the same person.

**Projects are due on February 15, 2017, on the Kaggle website. Reports are to be handed in by e-mail by February 17, 2017 at the latest.**

### Reading material

#### Machine Learning and Statistics

- Vapnik, The Nature of Statistical Learning Theory. Springer.
- Hastie, Tibshirani, Friedman, The Elements of Statistical Learning. Springer (free online).
- Devroye, Györfi, Lugosi, A Probabilistic Theory of Pattern Recognition. Springer.
- Shawe-Taylor, Cristianini, Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
- Bishop, Pattern Recognition and Machine Learning. Springer, 2006.