Bayesian Nonparametrics: Latent Feature and Prediction Models, and Efficient Inference

If you have a question about this talk, please contact Zoubin Ghahramani.

Nonparametric Bayesian approaches offer a flexible modeling paradigm that does not limit model complexity a priori: the complexity can instead grow adaptively with the data. The Indian Buffet Process (IBP) is an example of a nonparametric Bayesian model in which a set of observations is assumed to be generated from a small set of latent features, and the number of latent features need not be known a priori. In this talk, I will describe some of my recent work on IBP-based models; in particular, (1) a variant of the IBP which removes the assumption of independent latent features and allows the latent features to be related via a hierarchy, (2) a nonparametric Bayesian multitask learning model which uses a combination of the Dirichlet Process mixture model and the IBP as the prior distribution on the weight vectors of multiple tasks, and (3) an efficient, search-based inference method for finding an approximate MAP estimate of the latent feature assignment matrix in IBP-based models.
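To make the IBP's "number of features need not be known a priori" property concrete, here is a minimal sketch (not from the talk; function names and the concentration parameter value are illustrative) of the standard IBP culinary-metaphor generative process: customer i takes each previously sampled dish k with probability m_k / i, where m_k is how many earlier customers took it, and then tries a Poisson(alpha / i) number of new dishes, so the number of columns of the binary feature matrix Z grows with the data.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's method; adequate for the small rates alpha / i used here.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def sample_ibp(num_customers, alpha, seed=0):
    """Draw a binary feature-assignment matrix Z from the IBP.

    Z[i][k] = 1 iff observation (customer) i possesses latent
    feature (dish) k; the number of features is not fixed a priori.
    """
    rng = random.Random(seed)
    dish_counts = []  # m_k: how many customers have taken dish k so far
    Z = []
    for i in range(1, num_customers + 1):
        row = []
        # take each existing dish in proportion to its popularity
        for k, m_k in enumerate(dish_counts):
            take = rng.random() < m_k / i
            row.append(int(take))
            if take:
                dish_counts[k] += 1
        # sample a Poisson(alpha / i) number of brand-new dishes
        new_dishes = sample_poisson(alpha / i, rng)
        row.extend([1] * new_dishes)
        dish_counts.extend([1] * new_dishes)
        Z.append(row)
    # pad earlier rows with zeros so Z is rectangular
    total_dishes = len(dish_counts)
    return [r + [0] * (total_dishes - len(r)) for r in Z]

Z = sample_ibp(num_customers=10, alpha=2.0)
# the number of columns (features) varies from run to run
```

Running the sampler repeatedly with different seeds shows the adaptive-complexity behavior: larger datasets and larger alpha yield more active features, roughly alpha * H_n of them in expectation for n customers.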

This talk is part of the Machine Learning @ CUED series.
