Each day is organized around two sessions: a morning session from 8:50am to 11:30am and an evening session from 5:00pm to 6:30pm. Lunch is at 12:30pm, and the afternoon
(from 2pm to 5pm) is free for scientific discussions, possibly on skis.
Note that there is only one session on Wednesday from 8:50am to 11:30am. The rest of the day is free for scientific discussions or leisure.
Program
Monday, January 7:
Session 1
- 08:50–09:00 Welcome
- 09:00–9:45 Emmanuel Candes (Stanford University)
Super-resolution via SDP with noiseless and noisy results, [Slides]
- 09:45–10:30 Michael Jordan (University of California, Berkeley)
Optimization formulations from Bayesian nonparametrics via small-variance
asymptotics
- 10:30–10:45 Break
- 10:45–11:30 Peter Bühlmann (ETH Zurich)
High-dimensional graphical modeling and causal inference, [Slides]
Session 2
- 17:00–17:30 Philippe Rigollet (Princeton University)
Optimal detection of a sparse principal component, [Slides]
- 17:30–18:00 Laurent El Ghaoui (University of California, Berkeley)
Sparse and Robust Optimization and Applications, [Slides]
- 18:00–18:30 Francis Bach (École Normale Supérieure)
A stochastic gradient method with an exponential convergence rate for
strongly-convex optimization with finite training sets, [Slides]
Tuesday, January 8:
Session 1
- 09:00–09:45 Sara van de Geer (ETH Zurich)
The additive model revisited, [Slides]
- 09:45–10:30 Yurii Nesterov (Université Catholique de Louvain)
TBA
- 10:30–10:45 Break
- 10:45–11:30 Boris Polyak (Institute for Control Science)
New random sampling: billiard walks, [Slides]
Session 2
- 17:00–17:30 Amit Singer (Princeton University)
Some optimization and statistical learning problems in structural biology, [Slides]
- 17:30–18:00 Nati Srebro (TTI Chicago)
Learning and optimization: lower bounds and tight connections, [Slides]
- 18:00–18:30 Caroline Chaux (Université Aix-Marseille)
Some applications of proximal methods, [Slides]
Wednesday, January 9:
- 09:00–09:45 Marc Teboulle (Tel-Aviv University)
Conditional gradient algorithms for the sparsity constrained rank-one matrix
approximation problem, [Slides]
- 17:30–18:15 Shai Shalev-Shwartz (The Hebrew University)
The sample-computational tradeoff, [Slides]
- 18:15–19:00 Pablo Parrilo (Massachusetts Institute of Technology)
Convex sets, conic matrix factorizations
and conic rank lower bounds, [Slides]
Thursday, January 10:
Session 1
- 09:00–09:45 Jean-Philippe Vert (Mines ParisTech)
Fast sparse methods for genomic data, [Slides]
- 09:45–10:30 Robert Nowak (University of Wisconsin–Madison)
Statistical Learning and Optimization Based on Comparative Judgements, [Slides]
- 10:30–10:45 Break
- 10:45–11:30 Noureddine El Karoui (University of California, Berkeley)
TBA
Session 2
- 17:00–17:30 Elad Hazan (Technion)
Sublinear optimization, [Slides]
- 17:30–18:00 Alexander Rakhlin (University of Pennsylvania)
On learning and estimation
- 18:00–18:30 Gabriel Peyré (Université Paris-Dauphine)
Robust sparse analysis regularization, [Slides]
Friday, January 11:
- 09:00–09:30 Sébastien Bubeck (Princeton University)
Discrete stochastic optimization, [Slides]
- 09:30–10:00 Peter Richtárik (University of Edinburgh)
Randomized lock-free methods for minimizing partially separable convex functions, [Slides]
- 10:00–10:30 Break
- 10:30–11:00 Alekh Agarwal (Microsoft Research)
Stochastic optimization and sparse statistical recovery: An optimal algorithm in high
dimensions, [Slides]
- 11:00–11:30 Afonso Bandeira (Princeton University)
Angular Synchronization and its application in Phase Retrieval, [Slides]