An introduction to clustering and the expectation maximisation algorithm Part 2

If you have a question about this talk, please contact Microsoft Research Cambridge Talks Admins.

Please note, this event will be recorded. Microsoft will own the copyright of any recording and reserves the right to distribute it as required.

Clustering methods assign ‘similar’ data points to the same cluster, and ‘dissimilar’ data points to different clusters. They are used in a diverse range of application areas, including data-driven understanding of disease sub-types, identification of communities in social networks, and email spam filtering. Clustering is therefore one of the central tasks in unsupervised machine learning.

In this second lecture I will describe how learning in the mixture of Gaussians model is handled with the Expectation Maximisation (EM) algorithm, which can be applied to many latent variable models. We will approach this from the general variational viewpoint, connecting it to the wider class of variational inference methods. Finally, we will examine the behaviour of the EM algorithm for the mixture of Gaussians and identify the strengths and weaknesses of the approach.
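For readers who want a concrete picture of the E- and M-steps mentioned above, below is a minimal, illustrative NumPy sketch of EM for a one-dimensional mixture of Gaussians. It is a sketch under assumptions: the one-dimensional setting, the function name em_gmm, and all variable names are illustrative choices and are not taken from the lecture material.

# Minimal illustrative sketch of EM for a 1-D mixture of Gaussians.
# Names (em_gmm, pi, mu, var) and the 1-D setting are assumptions, not lecture material.
import numpy as np

def em_gmm(x, k, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture to data x with EM."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Initialise mixing weights, means and variances.
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = posterior of component j for point i,
        # computed in log space for numerical stability.
        log_p = (-0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (x[:, None] - mu) ** 2 / var
                 + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate parameters from the responsibilities.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Example: two well-separated clusters.
data = np.concatenate([np.random.normal(-2, 0.5, 200),
                       np.random.normal(3, 1.0, 300)])
weights, means, variances = em_gmm(data, k=2)
print(weights, means, variances)

The E-step computes per-point responsibilities; the M-step then re-estimates the mixing weights, means and variances from those responsibilities, which is the alternation the lecture analyses from the variational viewpoint.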

This talk is part of the Microsoft Research Cambridge, public talks series.
