
To infinity and beyond with nonparametric Bayesian methods


If you have a question about this talk, please contact Dr Fabien Petitcolas.

Abstract: Probabilistic models in machine learning are widely used in science and industry. Traditionally, these models are set up with a small set of unknowns that need to be learned from data. As the amount of data we learn from grows, more data just leads to a few extra digits of accuracy in our estimates. Nonparametric Bayesian methods are a family of techniques that make better use of data by allowing models to have an infinite number of parameters and letting the data decide how many to actually learn. In this talk I will illustrate how these techniques can be used to build a part-of-speech tagger without knowing anything about parts of speech!
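As a rough illustration of the "let the data decide" idea (not taken from the talk itself), a minimal sketch of the Chinese restaurant process is given below. This process underlies many nonparametric Bayesian models, such as Dirichlet process mixtures, and shows how the number of components grows with the amount of data rather than being fixed in advance. The function name crp_partition and its parameters are hypothetical choices for this example.

import random

def crp_partition(n_points, alpha=1.0, seed=0):
    """Chinese restaurant process: assign n_points items to clusters,
    letting the data (and the concentration alpha) determine how many
    clusters appear. Illustrative sketch only."""
    rng = random.Random(seed)
    counts = []        # counts[k] = number of items in cluster k
    assignments = []
    for i in range(n_points):
        # Existing cluster k is chosen with probability counts[k] / (i + alpha);
        # a new cluster is opened with probability alpha / (i + alpha).
        r = rng.random() * (i + alpha)
        cumulative = 0.0
        chosen = len(counts)     # default: open a new cluster
        for k, c in enumerate(counts):
            cumulative += c
            if r < cumulative:
                chosen = k
                break
        if chosen == len(counts):
            counts.append(0)
        counts[chosen] += 1
        assignments.append(chosen)
    return assignments, counts

# The number of clusters grows slowly (roughly alpha * log n) as data grows,
# instead of being fixed up front.
for n in (10, 100, 1000):
    _, counts = crp_partition(n)
    print(n, "points ->", len(counts), "clusters")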

Biography: Jurgen is a PhD candidate in the Machine Learning Group of the Computational and Biological Learning Lab at the Department of Engineering at the University of Cambridge, where his advisor is Professor Zoubin Ghahramani. He is supported by a Microsoft Research PhD Scholarship and as such is co-advised by Ralf Herbrich. Before starting his PhD he was a Master's student at the University of Wisconsin in Madison, working with Professor Jerry Zhu. He holds an undergraduate degree from Leuven. At Cambridge, he is a member of Wolfson College.

This talk is part of the Microsoft Research Summer School series.
