
Nonparametric Bayesian time series models: infinite HMMs and beyond


If you have a question about this talk, please contact Mustapha Amrani.

Statistical Theory and Methods for Complex, High-Dimensional Data

Hidden Markov models (HMMs) are one of the most widely used statistical models for time series. Traditionally, HMMs have a known structure with a fixed number of states and are trained using maximum likelihood techniques. The infinite HMM (iHMM) allows a potentially unbounded number of hidden states, letting the model use as many states as it needs for the data (Beal, Ghahramani and Rasmussen 2002). Teh, Jordan, Beal and Blei (2006) showed that a form of the iHMM could be derived from the Hierarchical Dirichlet Process, and described a Gibbs sampling algorithm based on this for the iHMM. I will talk about recent work we have done on infinite HMMs. In particular: we now have a much more efficient inference algorithm based on dynamic programming, called ‘Beam Sampling’, which should make it possible to apply iHMMs to larger problems. We have also developed a factorial version of the iHMM which makes it possible to have an unbounded number of binary state variables, and can be thought of as a time-series generalization of the Indian buffet process.
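As an illustrative sketch only (not the authors' code), the HDP construction behind the iHMM can be approximated by truncating the stick-breaking prior to a finite number of states: a shared top-level weight vector is drawn by stick-breaking, and each transition row is a Dirichlet draw centred on it, so all rows share the same set of states. The concentration parameters `gamma` and `alpha`, the truncation level `K`, and the Gaussian emissions are arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, K):
    """Truncated GEM(alpha) stick-breaking: K weights summing to ~1."""
    betas = rng.beta(1.0, alpha, size=K)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

gamma, alpha, K = 5.0, 3.0, 20

# Shared top-level state weights (the 'mother' DP of the HDP).
beta = stick_breaking(gamma, K)
beta /= beta.sum()  # renormalise after truncation

# Each transition row is a DP draw centred on beta:
# pi[k] ~ Dirichlet(alpha * beta), so rows share the same states.
pi = rng.dirichlet(alpha * beta, size=K)

# Sample a short state/observation sequence from the truncated model.
means = rng.normal(0.0, 3.0, size=K)  # per-state Gaussian emission means
s, xs = rng.choice(K, p=beta), []
for _ in range(10):
    s = rng.choice(K, p=pi[s])
    xs.append(rng.normal(means[s], 1.0))
```

In the untruncated iHMM, `K` is unbounded and inference (e.g. beam sampling) adaptively instantiates only the states the data require; the truncation here is purely for illustration.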

Joint work with Jurgen van Gael (Cambridge), Yunus Saatci (Cambridge) and Yee Whye Teh (Gatsby Unit, UCL).

This talk is part of the Isaac Newton Institute Seminar Series.




© 2006-2022, University of Cambridge.