In-depth tutorials with practical sessions will take place on January 4th, 5th, and 9th.

Each session will host 10 to 12 participants; registration opens after application acceptance.
The sessions will be 3 hours long.

Below is the list of confirmed sessions as of today, with detailed information for each session.

INTRODUCTION TO DEEP LEARNING WITH KERAS AND TENSORFLOW

Organizers: Olivier GRISEL

Abstract:

This session will introduce the main deep learning concepts with worked examples using Keras. In particular, we will cover the following concepts:

  • feed-forward fully connected network trained with stochastic gradient descent (see the sketch after this list),
  • convolution networks for image classification with transfer learning,
  • embeddings (continuous vectors as a representation for symbolic/discrete variables such as words, tags…),
  • if time allows: Recurrent Neural Networks for NLP.
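
As a concrete starting point for the first item above, here is a minimal sketch of a feed-forward fully connected network trained with stochastic gradient descent, using the tf.keras API listed in the requirements. It is not the session's teaching material: the toy random data, layer sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal illustrative sketch (not the official session material): a small
# feed-forward fully connected network trained with SGD via tf.keras.
import numpy as np
from tensorflow import keras

# Toy random data standing in for a real dataset: 1000 samples, 20 features, 3 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = rng.integers(0, 3, size=1000)

model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(3, activation="softmax"),
])

model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.1),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

model.fit(X, y, batch_size=32, epochs=5, validation_split=0.2)
```

The same compile/fit pattern carries over to the convolutional and embedding models mentioned in the later items of the list.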

Requirements:

  • Working knowledge of Python programming with NumPy
  • Basics of linear algebra and statistics
  • Environment: Python Jupyter
  • Packages: numpy, matplotlib, tensorflow
  • Follow the instructions here

Teaching material

OPTIMIZATION FOR MACHINE LEARNING "HANDS ON"

Organizers: Alexandre GRAMFORT, Quentin BERTRAND

Abstract:

Modern machine learning relies heavily on optimization tools, typically to minimize so-called loss functions on training sets. The objective of this course is to give an overview of the most commonly used gradient-based algorithms: proximal / accelerated gradient descent, (proximal) coordinate descent, L-BFGS, and stochastic gradient descent. As the course is meant to be practical, we will see how all these algorithms can be implemented in Python on regression and classification problems. Jupyter notebooks will be used in the programming session.
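
To make this concrete, below is one possible illustration (not the course notebook) of proximal gradient descent, one of the algorithms listed above, applied to a Lasso regression problem in NumPy; the function names ista and soft_threshold and the toy data are our own assumptions.

```python
# Illustrative sketch (assumed setup, not the official notebook): proximal
# gradient descent (ISTA) for the Lasso, i.e. minimizing
#   0.5 * ||y - X @ w||^2 + alpha * ||w||_1.
import numpy as np

def soft_threshold(x, threshold):
    # Proximal operator of threshold * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - threshold, 0.0)

def ista(X, y, alpha, n_iter=200):
    n_features = X.shape[1]
    w = np.zeros(n_features)
    # Step size = 1 / Lipschitz constant of the smooth (quadratic) part.
    step = 1.0 / np.linalg.norm(X, ord=2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                           # gradient of the quadratic term
        w = soft_threshold(w - step * grad, step * alpha)  # proximal step
    return w

# Toy sparse regression problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 100))
w_true = np.zeros(100)
w_true[:5] = 1.0
y = X @ w_true + 0.01 * rng.normal(size=50)
print(ista(X, y, alpha=1.0)[:10])  # the first five coefficients should dominate
```

Replacing this full-gradient loop with accelerated, coordinate-wise, or stochastic updates gives the other algorithm families listed in the abstract.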

Requirements:

Python with numpy, scipy and matplotlib

STOCHASTIC OPTIMIZATION

Organizers: Aymeric DIEULEVEUT, Eric MOULINES

Abstract: TBA

Requirements: TBA

MODERN BAYESIAN METHODS: PRINCIPLES AND PRACTICE

Organizers: Vinayak RAO

Abstract: TBA

Requirements: TBA

TOPOLOGICAL DATA ANALYSIS

Organizers: Vitaliy KURLIN, Krasen SAMARDZHIEV, Pawel DLOTKO, Vincent ROUVREAU

Abstract: TBA

Requirements: TBA

KERNEL METHODS: FROM BASICS TO MODERN APPLICATIONS

Organizers: Dougal SUTHERLAND

Abstract: TBA

Requirements: TBA

SAFE LEARNING FOR CONTROL

Organizers: Melanie ZEILINGER, Lukas HEWLING

Abstract: TBA

Requirements: TBA

DEEP GENERATIVE MODELS FOR REPRESENTATION LEARNING

Organizers: Vincent FORTUIN

Abstract:

Like many areas of machine learning, generative modeling has seen great progress through the introduction of deep learning. These deep generative models can not only produce realistic synthetic data, but can also learn useful representations of the data. These representations may then be used for downstream tasks, such as clustering, missing data imputation, semi-supervised learning, anomaly detection or conditional generation. In this tutorial, we will review different model architectures (such as variational autoencoders and normalizing flows) and implement them in PyTorch. We will then try them out in different application scenarios and discuss their strengths and weaknesses.
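
As a rough sketch of one of the architectures mentioned above, here is a minimal variational autoencoder in PyTorch, assuming flat 784-dimensional inputs; the class, layer sizes, and toy data are illustrative assumptions, not the tutorial's notebook.

```python
# Rough sketch (assumed toy setup, not the tutorial's notebook): a minimal
# variational autoencoder (VAE) in PyTorch for flat 784-dimensional inputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, input_dim=784, hidden_dim=256, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.fc_mu = nn.Linear(hidden_dim, latent_dim)      # mean of q(z|x)
        self.fc_logvar = nn.Linear(hidden_dim, latent_dim)  # log-variance of q(z|x)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, input_dim),               # returns logits
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(x_logits, x, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# One optimization step on random data standing in for real images.
model = VAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 784)
optimizer.zero_grad()
x_logits, mu, logvar = model(x)
loss = vae_loss(x_logits, x, mu, logvar)
loss.backward()
optimizer.step()
print(loss.item())
```

The latent means mu act as the learned representation that downstream tasks such as clustering or anomaly detection would operate on.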

Requirements:

Basic knowledge of probabilistic modeling and linear algebra; familiarity with Python, PyTorch, and Jupyter notebooks; a Google account to use Google Colab.