Selected Slides and Talks

Tutorial Slides: Deep Generative Models: Foundations, applications and open problems, CCN 2018

keywords: variational inference, generative models, VAE, approximate inference, normalizing-flows, R-NVP, IAF, density estimation, marginalization, GANs, moment-matching, uncertainty estimation, stochastic optimisation, probability divergences, adversarial, causality, autoregressive.

Abstract:

This tutorial will be a review of recent advances in deep generative models. Generative models have a long history and recent methods have combined the generality of probabilistic reasoning with the scalability of deep learning to develop learning algorithms that have been applied to a wide variety of problems, giving state-of-the-art results in image generation, text-to-speech synthesis, and image captioning, amongst many others. Advances in deep generative models are at the forefront of deep learning research because of the promise they offer for allowing data-efficient learning and for model-based reinforcement learning. At the end of this tutorial, audience members will have a full understanding of the latest advances in generative modelling covering three of the active types of models: Markov models, latent variable models and implicit models, and how these models can be scaled to high-dimensional data. The tutorial will expose many questions that remain in this area, and for which there remains a great deal of opportunity for researchers.

Video & Slides: Approximate Inference and Deep Generative Models, CERN 2018

keywords: variational inference, generative models, VAE, approximate inference, normalizing-flows, R-NVP, IAF, density estimation, marginalization.

Abstract:

Advances in deep generative models are at the forefront of deep learning research because of the promise they offer for allowing data-efficient learning and for model-based reinforcement learning. In this talk I’ll review a few standard methods for approximate inference and introduce modern approximations which allow for efficient large-scale training of a wide variety of generative models. Finally, I’ll demonstrate several important applications of these models to density estimation, missing data imputation, data compression and planning.
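As a rough illustration of the kind of large-scale approximation discussed above, the sketch below computes a single-sample Monte Carlo estimate of the evidence lower bound (ELBO) with a reparameterized Gaussian approximate posterior. This is a minimal sketch and not material from the talk: the stand-in decoder likelihood `log_lik`, the dimensions, and the parameter values are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_lik(x, z):
    # Hypothetical stand-in decoder: a fixed linear map of z gives the mean of a
    # unit-variance Gaussian over x (purely illustrative, not a trained model).
    W = np.ones((x.size, z.size)) / z.size
    mean = W @ z
    return -0.5 * np.sum((x - mean) ** 2 + np.log(2 * np.pi))

def elbo_estimate(x, mu, log_sigma):
    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I).
    eps = rng.normal(size=mu.shape)
    z = mu + np.exp(log_sigma) * eps
    # Analytic KL( N(mu, diag(sigma^2)) || N(0, I) ).
    kl = 0.5 * np.sum(np.exp(2 * log_sigma) + mu ** 2 - 1.0 - 2.0 * log_sigma)
    return log_lik(x, z) - kl

x = rng.normal(size=10)
print(elbo_estimate(x, mu=np.zeros(4), log_sigma=np.zeros(4)))
```

In practice the decoder and the posterior parameters would be neural networks trained by stochastic gradient ascent on this estimate; the snippet only shows how a single ELBO sample is assembled.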

 

Slides: Tutorial on Deep Generative Models, UAI 2017

Video: Tutorial on Deep Generative Models, UAI 2017

keywords: variational inference, generative models, VAEs, GANs, approximate inference, normalizing-flows, R-NVP, IAF, density estimation.

Abstract:

This tutorial will be a review of recent advances in deep generative models. Generative models have a long history at UAI and recent methods have combined the generality of probabilistic reasoning with the scalability of deep learning to develop learning algorithms that have been applied to a wide variety of problems, giving state-of-the-art results in image generation, text-to-speech synthesis, and image captioning, amongst many others. Advances in deep generative models are at the forefront of deep learning research because of the promise they offer for allowing data-efficient learning and for model-based reinforcement learning. At the end of this tutorial, audience members will have a full understanding of the latest advances in generative modeling covering three of the active types of models: Markov models, latent variable models and implicit models, and how these models can be scaled to high-dimensional data. The tutorial will expose many questions that remain in this area, and for which there remains a great deal of opportunity for members of the UAI community.

 

Video: One-Shot Generalization in Deep Generative Models, ICML 2016

keywords: variational inference, generative models, one-shot learning, one-shot density estimation.

Abstract:

Humans have an impressive ability to reason about new concepts and experiences from just a single example. In particular, humans have an ability for one-shot generalization: an ability to encounter a new concept, understand its structure, and then be able to generate compelling alternative variations of the concept. We develop machine learning systems with this important capacity by developing new deep generative models, models that combine the representational power of deep learning with the inferential power of Bayesian reasoning. We develop a class of sequential generative models that are built on the principles of feedback and attention. These two characteristics lead to generative models that are among the state-of-the-art in density estimation and image generation. We demonstrate the one-shot generalization ability of our models using three tasks: unconditional sampling, generating new exemplars of a given concept, and generating new exemplars of a family of concepts. In all cases our models are able to generate compelling and diverse samples, having seen new examples just once, providing an important class of general-purpose models for one-shot machine learning.
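For intuition only, here is a toy sketch of the general idea of a sequential generative model that builds a sample up over several steps. It is not the architecture from the talk: the step count, dimensions, and the random linear "write" map are illustrative assumptions, and a real model would use learned networks and attention.

```python
import numpy as np

rng = np.random.default_rng(0)
T, latent_dim, canvas_dim = 8, 16, 28 * 28   # illustrative sizes

# Toy "write" operation: a fixed random linear map from the latent draw to a
# canvas update (illustrative stand-in for a learned, attentive writer).
W = rng.normal(scale=0.1, size=(canvas_dim, latent_dim))

canvas = np.zeros(canvas_dim)
for t in range(T):
    z_t = rng.normal(size=latent_dim)   # latent variable for this step
    canvas += W @ z_t                   # feedback: each step refines the running canvas

x = 1.0 / (1.0 + np.exp(-canvas))       # squash the accumulated canvas to pixel intensities
print(x.shape)                          # (784,)
```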

 

Video: Variational Inference with Normalizing Flows, ICML 2015

keywords: variational inference, generative models, normalizing flows, log-det-Jacobian.

Abstract:

The choice of the approximate posterior distribution is one of the core problems in variational inference. Most applications of variational inference employ simple families of posterior approximations in order to allow for efficient inference, focusing on mean-field or other simple structured approximations. This restriction has a significant impact on the quality of inferences made using variational methods. We introduce a new approach for specifying flexible, arbitrarily complex and scalable approximate posterior distributions. Our approximations are distributions constructed through a normalizing flow, whereby a simple initial density is transformed into a more complex one by applying a sequence of invertible transformations until a desired level of complexity is attained. We use this view of normalizing flows to develop categories of finite and infinitesimal flows and provide a unified view of approaches for constructing rich posterior approximations. We demonstrate that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in performance and applicability of variational inference.
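The sketch below is a minimal numpy illustration of the construction described above, using a single planar-style invertible map f(z) = z + u * tanh(w^T z + b) and the corresponding log-det-Jacobian correction. The parameter values are arbitrary (untrained), and the constraint on u and w that guarantees invertibility is omitted for brevity; stacking several such maps and accumulating the log-det terms gives the normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 2
u = rng.normal(size=dim)   # arbitrary (untrained) flow parameters
w = rng.normal(size=dim)
b = 0.0

def planar_step(z):
    # One invertible transformation f(z) = z + u * tanh(w^T z + b).
    a = np.tanh(w @ z + b)
    psi = (1.0 - a ** 2) * w                   # gradient of tanh(w^T z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + u @ psi))    # log |det df/dz| via the matrix determinant lemma
    return z + u * a, log_det

# Transform a sample from a simple base density and track its log density.
z0 = rng.normal(size=dim)
log_q = -0.5 * (z0 @ z0 + dim * np.log(2 * np.pi))   # log N(z0; 0, I)
z1, log_det = planar_step(z0)
log_q -= log_det                                     # change-of-variables correction
print(z1, log_q)
```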

 

Slides: The Helmholtz Machine Revisited, EPFL 2012

keywords: variational inference, generative models, temporal models, Helmholtz machine, Boltzmann Machine, wake-sleep, REINFORCE, variance-reduction.

Abstract: In this talk, given at EPFL in 2012, I introduced deep latent Gaussian models (DLGMs), recurrent DLGM temporal models (later named VRNNs), and the application of the REINFORCE algorithm to variational inference.
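As a rough sketch of the REINFORCE-style estimator mentioned above (not code from the talk), the snippet below estimates the gradient of an expectation under a Gaussian using the score function, with a baseline estimated from independent samples for variance reduction. The objective f, the distribution, and all values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z):
    return (z - 2.0) ** 2   # toy objective whose expectation we differentiate

def reinforce_grad(mu, n_samples=10_000):
    # Score-function (REINFORCE) estimate of d/d_mu E_{z ~ N(mu, 1)}[ f(z) ].
    z = rng.normal(loc=mu, scale=1.0, size=n_samples)
    score = z - mu                                   # d/d_mu log N(z; mu, 1)
    # Baseline from independent samples: reduces variance, keeps the estimator unbiased.
    baseline = f(rng.normal(loc=mu, scale=1.0, size=n_samples)).mean()
    return np.mean((f(z) - baseline) * score)

# For this toy f, the true gradient is 2 * (mu - 2); at mu = 0 it is -4.
print(reinforce_grad(mu=0.0))
```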
