
An introduction to Bayesian nonparametrics: some inference schemes for infinite mixture models


If you have a question about this talk, please contact Yingzhen Li.

An introduction to this exciting research area will be given, focusing on two basic (and famous) processes: the Dirichlet process and the Pitman-Yor process (or, via their corresponding exchangeable random partition representations, the Chinese restaurant process and the two-parameter Chinese restaurant process, respectively). These two processes will be used as building blocks for an infinite mixture model. Next, I will review how their different constructions/representations can be used to derive different inference algorithms. Then, I will discuss the two Markov chain Monte Carlo schemes in detail. Finally, if time permits, I will talk about a novel MCMC scheme that exploits the advantages of the two existing MCMC schemes (Lomeli et al., 2015).
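To make the partition representations mentioned above concrete, the following is a minimal sketch (not from the talk itself) of the sequential seating rule shared by the Chinese restaurant process and its two-parameter generalisation: customer i+1 joins an occupied table k with probability proportional to (n_k - d) and starts a new table with probability proportional to (alpha + d*K), where d = 0 recovers the ordinary CRP (Dirichlet process) and d > 0 gives the Pitman-Yor case. The function name `crp_partition` is illustrative only.

```python
import random

def crp_partition(n, alpha, d=0.0, seed=None):
    """Sample a random partition of n customers via the (two-parameter)
    Chinese restaurant process.

    alpha: concentration parameter
    d:     discount parameter in [0, 1); d = 0 gives the ordinary CRP
           (Dirichlet process), d > 0 the Pitman-Yor / two-parameter CRP.
    Returns the list of table occupancy counts (block sizes).
    """
    rng = random.Random(seed)
    tables = []  # tables[k] = number of customers seated at table k
    for i in range(n):  # customer arrives; i customers already seated
        # weight for joining existing table k: n_k - d
        # weight for opening a new table:      alpha + d * K
        weights = [n_k - d for n_k in tables]
        weights.append(alpha + d * len(tables))
        k = rng.choices(range(len(tables) + 1), weights=weights)[0]
        if k == len(tables):
            tables.append(1)   # new table
        else:
            tables[k] += 1     # join existing table
    return tables

# One draw each; Pitman-Yor (d > 0) tends to produce more, smaller blocks.
print(crp_partition(100, alpha=1.0, seed=0))
print(crp_partition(100, alpha=1.0, d=0.5, seed=0))
```

In a mixture-model context, each table corresponds to a mixture component, so the number of occupied tables (which grows with n) is the number of components used by the data.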

References (no pre-reading required, but these are very useful in general):

Yee Whye Teh's tutorial

Zoubin Ghahramani's tutorial

Peter Orbanz's webpage (very thorough list of references therein)

Lomeli et al. (2015) (for the last part of the talk)

This talk is part of the Machine Learning Reading Group @ CUED series.


© 2006-2024 Talks.cam, University of Cambridge.