
A tutorial on diffusion models



Score-based generative models (SGMs) are a powerful class of generative models with remarkable empirical performance. Although SGMs gained widespread popularity by performing astonishingly well on text-to-image generation tasks (DALL-E, Stable Diffusion), it has recently been shown that diffusion-based models can reach state-of-the-art quality in many other generative modelling domains, including computer vision, chemistry, NLP, and climate modelling. Our talk is designed as a tutorial, with no prior knowledge of diffusion models required. We will cover: i) the basic techniques SGMs rely on (Langevin dynamics and score matching); ii) discrete-time and then iii) continuous-time diffusion models; and iv) the relationship between the variational and score-based perspectives.
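To give a flavour of the first topic, Langevin dynamics draws samples from a density p(x) using only its score ∇ log p(x), via the update x ← x + ε∇ log p(x) + √(2ε) z with z ~ N(0, I). Below is a minimal sketch for a 1D standard Gaussian, where the score is known analytically (∇ log p(x) = −x); all names and step sizes are illustrative, and in a real SGM the analytic score would be replaced by a learned score network.

```python
import numpy as np

def score(x):
    # Analytic score of a standard Gaussian: log p(x) = -x^2/2 + const,
    # so the score is d/dx log p(x) = -x. In SGMs this is a neural network.
    return -x

def langevin_sample(n_steps=1000, step=0.01, n_samples=5000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-10.0, 10.0, size=n_samples)  # arbitrary initialisation
    for _ in range(n_steps):
        # Unadjusted Langevin update: drift along the score, plus noise.
        x = x + step * score(x) + np.sqrt(2.0 * step) * rng.standard_normal(n_samples)
    return x

samples = langevin_sample()
print(samples.mean(), samples.std())  # both should be close to 0 and 1
```

With enough steps the chain forgets its initialisation and the empirical mean and standard deviation approach those of N(0, 1), up to discretisation error in ε.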

There is no required reading, but for more engagement we suggest reading this paper beforehand: https://arxiv.org/abs/2011.13456

Below is a non-exhaustive list of relevant papers, which we will mention or cover:

• A. Hyvärinen. Estimation of Non-Normalized Statistical Models by Score Matching. 2005

• J. Sohl-Dickstein, E. Weiss, N. Maheswaranathan, and S. Ganguli. Deep unsupervised learning using nonequilibrium thermodynamics. 2015

• P. Vincent. A connection between score matching and denoising autoencoders. 2011

• Y. Song and S. Ermon. Generative modeling by estimating gradients of the data distribution. 2019

• J. Ho, A. Jain, and P. Abbeel. Denoising diffusion probabilistic models. 2020

• V. De Bortoli, J. Thornton, J. Heng, and A. Doucet. Diffusion Schrödinger bridge with applications to score-based generative modeling. 2021

• Y. Song, J. Sohl-Dickstein, D. P. Kingma, A. Kumar, S. Ermon, and B. Poole. Score-based generative modeling through stochastic differential equations. 2021

• C.-W. Huang, J. H. Lim, and A. C. Courville. A variational perspective on diffusion-based generative models and score matching. 2021

• Y. Song, C. Durkan, I. Murray, and S. Ermon. Maximum likelihood training of score-based diffusion models. 2021

This talk is part of the Machine Learning Reading Group @ CUED series.

