This course provides an overview of classical methods for solving sampling-related problems in generative modeling. Python exercises and notebooks are provided to illustrate the practical challenges in classical settings.
- Target distributions and examples
- Variational Autoencoders
- Score-based diffusion models
- Basics in Markov chains (invariant probability measures, ergodicity and law of large numbers)
- Metropolis-Hastings algorithm
- Pseudo-marginal algorithms and Hamiltonian Monte Carlo
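To make the Metropolis-Hastings topic concrete, here is a minimal sketch of a random-walk Metropolis-Hastings sampler in NumPy. The function name `metropolis_hastings` and the choice of a standard-normal target are illustrative assumptions, not part of the course material.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis-Hastings with Gaussian proposals.

    log_target: unnormalized log-density of the target distribution.
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        # Symmetric Gaussian proposal centered at the current state
        proposal = x + step_size * rng.standard_normal()
        # Accept with probability min(1, pi(proposal) / pi(x))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Illustrative target: standard normal, log pi(x) = -x^2/2 up to a constant
samples = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0, n_steps=20000)
```

The empirical mean and standard deviation of `samples` should be close to 0 and 1, consistent with the law of large numbers for ergodic Markov chains covered in the course.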
- VAE basics: training of a simple VAE with detailed building blocks and an example from Keras
- Score-based diffusion models: full training of a score-based diffusion model on a toy example
- Random walk Metropolis-Hastings algorithm: introduction to invariant distributions and MH algorithms
- MALA and Hamiltonian Monte Carlo: more advanced MCMC algorithms using Hamiltonian and Langevin dynamics
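As a taste of the Langevin-based methods in the last notebook, below is a hedged sketch of the Metropolis-adjusted Langevin algorithm (MALA): a Langevin proposal driven by the gradient of the log-target, corrected by a Metropolis accept/reject step. The function name `mala` and the standard-normal test target are assumptions for illustration only.

```python
import numpy as np

def mala(log_target, grad_log_target, x0, n_steps, step=0.5, seed=0):
    """Metropolis-adjusted Langevin algorithm (1D sketch).

    Proposal: y = x + step * grad_log_target(x) + sqrt(2 * step) * noise,
    followed by a Metropolis correction using the asymmetric proposal densities.
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        # Forward Langevin proposal
        mean_fwd = x + step * grad_log_target(x)
        y = mean_fwd + np.sqrt(2.0 * step) * rng.standard_normal()
        # Reverse proposal mean, needed for the correction ratio
        mean_bwd = y + step * grad_log_target(y)
        # Log Gaussian proposal densities q(y|x) and q(x|y), up to constants
        log_q_fwd = -((y - mean_fwd) ** 2) / (4.0 * step)
        log_q_bwd = -((x - mean_bwd) ** 2) / (4.0 * step)
        log_alpha = log_target(y) - log_target(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x)
    return np.array(samples)

# Illustrative target: standard normal, log pi(x) = -x^2/2, gradient -x
samples = mala(lambda x: -0.5 * x**2, lambda x: -x, x0=0.0, n_steps=20000)
```

Compared with random-walk Metropolis-Hastings, the gradient drift lets MALA propose moves toward high-probability regions, which typically improves mixing on smooth targets.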