PhD scholarship at Inria - LEAR research group in Grenoble
The goal of this PhD is to design efficient memory representations for visual recognition. Memorizing representations of the visual world is becoming increasingly important: we cannot afford to recompute representations over and over again, as such computation is extremely costly and, more importantly, requires storing all the underlying data.
This PhD will investigate to what extent deep architectures can be used for efficient memory representations and how to adapt such deep representations over time. This includes determining the appropriate structure of the network [1], updating and fine-tuning the learnt model over time [2], as well as changing the architecture over time. The PhD will also investigate how to design and model the interactions between different networks, in particular to model structural components. Most importantly, this PhD will study and build on recent memory models [3, 4] to design an innovative visual model with novel memory representations.
Your profile:
- Master's degree (preferably in Computer Science or Applied Mathematics; Electrical Engineering will also be considered)
- Solid programming skills; the project involves programming in C
- Solid mathematics knowledge (especially linear algebra and statistics)
- Creative and highly motivated
- Fluent in English, both written and spoken
- Prior knowledge in the areas of computer vision, machine learning or data mining is a plus
Start date: As soon as possible. No later than October 2015.
Location: Inria Grenoble, France. Grenoble lies in the French Alps and offers ideal conditions for skiing, hiking, climbing, etc.
Contact: Cordelia Schmid (Cordelia.Schmid@inria.fr)
Please send applications via email, including:
- a complete CV
- grades for Bachelor's and MSc courses and thesis
- two to three letters of reference
References
- [1] Very deep convolutional networks for large-scale image recognition. K. Simonyan and A. Zisserman. arXiv 2014.
- [2] Computational baby learning. X. Liang et al. arXiv 2014.
- [3] Memory networks. J. Weston, S. Chopra and A. Bordes. arXiv 2014.
- [4] Neural Turing machines. A. Graves, G. Wayne and I. Danihelka. arXiv 2014.