Learning and Regularizing Score-Based Diffusion Models
- 🎤 Speaker: Ricardo Baptista (California Institute of Technology)
- 📅 Date & Time: Tuesday 16 July 2024, 10:00–11:00
- 📍 Venue: External
Abstract
Diffusion models have emerged as a powerful framework for generative modeling that relies on score matching to learn the gradient of the data distribution's log-density. A key element in the success of diffusion models is that the optimal score function is not identified when solving the denoising score matching problem. In fact, the optimal score in both the unconditional and conditional settings leads to a diffusion model that returns the training samples and effectively memorizes the data distribution. In this presentation, we study the dynamical system associated with the optimal score and describe its long-term behavior relative to the training samples. We then show how three forms of score-function regularization avoid memorization: restricting the score's approximation space, early stopping of the training process, and early stopping of the diffusion process during sample generation. Moreover, we establish a connection between early stopping of the diffusion and explicit Tikhonov regularization of the score matching problem. These results are numerically validated on distributions with and without densities, including image-based problems.
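The memorization phenomenon described above can be illustrated in closed form: for an empirical (finite-sample) data distribution, the optimal score of the Gaussian-smoothed density is a softmax-weighted pull toward the training points, and running the reverse diffusion all the way to time zero collapses samples onto them. The following is a minimal NumPy sketch, not the speaker's implementation; the variance-exploding noise schedule, probability-flow Euler step, step counts, and toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D training set: the "data distribution" is just these five points.
X = rng.normal(size=(5, 2))

def empirical_score(x, sigma):
    """Optimal score of the Gaussian-smoothed empirical distribution:
    grad log (1/n) sum_i N(x; x_i, sigma^2 I) = sum_i w_i (x_i - x) / sigma^2,
    with softmax weights w_i proportional to exp(-||x - x_i||^2 / (2 sigma^2))."""
    d2 = np.sum((X - x) ** 2, axis=1)
    w = np.exp(-(d2 - d2.min()) / (2.0 * sigma**2))  # shifted for stability
    w /= w.sum()
    return (w[:, None] * (X - x)).sum(axis=0) / sigma**2

def sample(t_stop=0.0, n_steps=200, sigma_max=3.0, sigma_min=1e-3):
    """Deterministic reverse dynamics (probability-flow Euler steps) for a
    variance-exploding diffusion driven by the optimal empirical score.
    t_stop > 0 implements early stopping of the diffusion process."""
    sigmas = np.geomspace(sigma_max, max(sigma_min, t_stop), n_steps)
    x = rng.normal(scale=sigma_max, size=2)
    for s_hi, s_lo in zip(sigmas[:-1], sigmas[1:]):
        # Euler step of dx = -sigma sigma' * score dt over [s_hi, s_lo].
        x = x + 0.5 * (s_hi**2 - s_lo**2) * empirical_score(x, s_hi)
    return x

# Run the diffusion (nearly) to time zero: the sample lands on a training point.
x_full = sample(t_stop=0.0)
dist = np.min(np.linalg.norm(X - x_full, axis=1))

# Early stopping keeps the terminal noise level away from zero, so the
# sample stays a smoothed version of the data rather than an exact copy.
x_early = sample(t_stop=0.5)
dist_early = np.min(np.linalg.norm(X - x_early, axis=1))
```

With a geometric schedule each Euler step contracts the distance to the locally dominant training point by a fixed factor, which is why `dist` ends up near zero; stopping at a positive noise level (`t_stop`) breaks that contraction before the collapse completes.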
Series: This talk is part of the Isaac Newton Institute Seminar Series.