Nonparametric Generative Modeling via Optimal Transport and Diffusions with Provable Guarantees

If you have a question about this talk, please contact Eric T Nalisnick.

Building on recent theory that established the connection between implicit generative modeling and optimal transport, in this talk I will present a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them. The proposed algorithm is based on a functional optimization problem that aims to find a measure that is 'as close to the data distribution as possible' and also 'expressive enough' for generative modeling purposes. The problem is formulated as a gradient flow in the space of probability measures. The connection between gradient flows and stochastic differential equations lets us develop a computationally efficient algorithm for solving the optimization problem; the resulting algorithm resembles recent dynamics-based Markov Chain Monte Carlo algorithms. I will then present finite-time error guarantees for the proposed algorithm. Finally, I will present experimental results that support our theory and show that our algorithm is able to capture the structure of challenging distributions.
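To make the particle scheme described above concrete, here is a minimal sketch of one sliced-Wasserstein flow update, in the spirit of the Sliced-Wasserstein Flows article the talk is based on. This is not the authors' implementation: the function name swf_step, the quantile-matching estimator of the 1D transport maps, and all hyperparameter values are illustrative assumptions.

    import numpy as np

    def swf_step(particles, data, n_projections=50, step_size=0.1,
                 noise_level=0.0, rng=None):
        """One illustrative sliced-Wasserstein flow update (a sketch).

        For each random direction, the 1D optimal transport map between
        the projected particles and the projected data is estimated by
        quantile matching, and particles move along the averaged
        displacement. With noise_level > 0, Gaussian noise is added so the
        iteration resembles an Euler discretisation of an SDE.
        """
        rng = np.random.default_rng() if rng is None else rng
        n, d = particles.shape
        drift = np.zeros_like(particles)
        for _ in range(n_projections):
            # Random direction on the unit sphere.
            theta = rng.standard_normal(d)
            theta /= np.linalg.norm(theta)
            proj_p = particles @ theta   # particle projections, shape (n,)
            proj_d = data @ theta        # data projections
            # 1D OT map via quantile matching: send each particle's
            # projected value to the data quantile of the same rank.
            ranks = np.argsort(np.argsort(proj_p))
            quantiles = (ranks + 0.5) / n
            targets = np.quantile(proj_d, quantiles)
            # Displacement along theta toward the matched data quantiles.
            drift += np.outer(targets - proj_p, theta)
        drift /= n_projections
        noise = np.sqrt(2.0 * step_size * noise_level) \
            * rng.standard_normal(particles.shape)
        return particles + step_size * drift + noise

    # Toy usage: flow standard-Gaussian particles toward a shifted,
    # scaled Gaussian "dataset".
    rng = np.random.default_rng(0)
    data = rng.standard_normal((2000, 2)) * 0.5 + np.array([2.0, -1.0])
    particles = rng.standard_normal((500, 2))
    for _ in range(200):
        particles = swf_step(particles, data, rng=rng)
    print(particles.mean(axis=0))  # should approach [2, -1]

Setting noise_level to zero gives a deterministic flow; a positive value yields the Langevin-like stochastic dynamics alluded to in the abstract.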

If time permits, I will also talk about possible extensions of this approach.

The talk will be based on these two articles: 1) Sliced-Wasserstein Flows; 2) Generalized Sliced Wasserstein Distances.

This talk is part of the Machine Learning @ CUED series.

