In-depth tutorials with practical sessions will take place on January 4th, 5th, and 9th
Each session is limited to 10 to 12 participants; registration takes place after application acceptance.
The sessions will be 3 hours long.
Below is the list of confirmed sessions as of today (click on the session title to see detailed information).
INTRODUCTION TO DEEP LEARNING WITH KERAS AND TENSORFLOW
Organizers: Olivier GRISEL
This session will introduce the main deep learning concepts with worked examples using Keras. In particular, we will cover the following concepts:
- feed-forward fully connected networks trained with stochastic gradient descent,
- convolutional networks for image classification with transfer learning,
- embeddings (continuous vectors as a representation for symbolic/discrete variables such as words, tags…),
- if time allows: Recurrent Neural Networks for NLP.
Prerequisites:
- Working knowledge of Python programming with NumPy
- Basics of linear algebra and statistics
Setup:
- Environment: Python with Jupyter
- Packages: numpy, matplotlib, tensorflow
- Follow the instructions here
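The session's own notebooks use Keras, but the first concept listed above, a feed-forward fully connected network trained with stochastic gradient descent, can be sketched in plain NumPy. This is an illustrative toy (XOR data, hand-derived backpropagation), not the session's actual materials:

```python
import numpy as np

# Toy data: learn XOR with a tiny fully connected network.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer (tanh), sigmoid output for binary classification.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, t):
    # Binary cross-entropy loss, averaged over the batch.
    return float(np.mean(-t * np.log(p) - (1 - t) * np.log(1 - p)))

def predict(X):
    return sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)

initial_loss = bce(predict(X), y)

lr = 0.5
for step in range(2000):
    i = rng.integers(len(X))              # stochastic: one sample per step
    x, t = X[i:i + 1], y[i:i + 1]
    # Forward pass
    h = np.tanh(x @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: for sigmoid + BCE, dL/dlogit = p - t
    dp = p - t
    dW2 = h.T @ dp
    dh = dp @ W2.T * (1 - h ** 2)         # tanh derivative
    dW1 = x.T @ dh
    # SGD update
    W2 -= lr * dW2; b2 -= lr * dp.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * dh.sum(axis=0)

final_loss = bce(predict(X), y)
```

After training, `final_loss` is well below `initial_loss`; in Keras the same model would be a two-layer `Sequential` compiled with an SGD optimizer.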
OPTIMIZATION FOR MACHINE LEARNING "HANDS ON"
Modern machine learning relies heavily on optimization tools, typically to minimize so-called loss functions on training sets. The objective of this course is to give an overview of the most commonly employed gradient-based algorithms: proximal / accelerated gradient descent, (proximal) coordinate descent, L-BFGS, and stochastic gradient descent. As the course is meant to be practical, we will see how all these algorithms can be implemented in Python on regression and classification problems. Jupyter notebooks will be used in the programming session.
Prerequisites: Python with numpy, scipy and matplotlib
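To give a flavor of what "proximal gradient descent" looks like in code, here is a hypothetical NumPy sketch of ISTA applied to the lasso; the course's own notebooks may structure this differently:

```python
import numpy as np

# Proximal gradient descent (ISTA) for the lasso:
#   min_w  (0.5 / n) * ||X w - y||^2 + lam * ||w||_1
rng = np.random.default_rng(0)
n, d = 100, 20
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]                # sparse ground truth
y = X @ w_true + 0.1 * rng.normal(size=n)

lam = 0.1
L = np.linalg.norm(X, 2) ** 2 / n            # Lipschitz constant of the smooth part
step = 1.0 / L

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(w):
    return 0.5 / n * np.sum((X @ w - y) ** 2) + lam * np.sum(np.abs(w))

w = np.zeros(d)
for _ in range(300):
    grad = X.T @ (X @ w - y) / n             # gradient of the smooth part
    w = soft_threshold(w - step * grad, step * lam)
```

The gradient step handles the smooth least-squares term; the soft-thresholding step is the proximal operator of the non-smooth L1 penalty, which is what makes the iterates sparse.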
MODERN BAYESIAN METHODS: PRINCIPLES AND PRACTICE
Organizers: Vinayak RAO
TOPOLOGICAL DATA ANALYSIS
KERNEL METHODS: FROM BASICS TO MODERN APPLICATIONS
Organizers: Dougal SUTHERLAND
SAFE LEARNING FOR CONTROL
DEEP GENERATIVE MODELS FOR REPRESENTATION LEARNING
Organizers: Vincent FORTUIN
Like many areas of machine learning, generative modeling has seen great progress through the introduction of deep learning. These deep generative models can not only produce realistic synthetic data, but can also learn useful representations of the data. These representations may then be used for downstream tasks, such as clustering, missing data imputation, semi-supervised learning, anomaly detection or conditional generation. In this tutorial, we will review different model architectures (such as variational autoencoders and normalizing flows) and implement them in PyTorch. We will then try them out in different application scenarios and discuss their strengths and weaknesses.
Prerequisites: Basic knowledge of probabilistic modeling and linear algebra; familiarity with Python, PyTorch, and Jupyter notebooks; a Google account to use Google Colab.
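The session implements its models in PyTorch, but two ingredients at the heart of a variational autoencoder, the reparameterization trick and the closed-form Gaussian KL term of the ELBO, can be sketched in plain NumPy (the encoder outputs below are random placeholders, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend encoder outputs for a batch of 4 inputs, latent dimension 2.
mu = rng.normal(size=(4, 2))        # posterior means
log_var = rng.normal(size=(4, 2))   # posterior log-variances

# Reparameterization trick: z = mu + sigma * eps. The randomness (eps) is
# independent of the encoder parameters, so gradients can flow through
# mu and log_var during training.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

# Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims,
# one value per batch element. This is the regularization term of the ELBO.
kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=1)
```

In a full VAE, `kl` is added to a reconstruction loss and minimized; normalizing flows, also covered in the session, instead transform a simple base density through invertible maps and use the change-of-variables formula.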