
Inference in Stochastic Processes


If you have a question about this talk, please contact Elre Oldewage.

In parametric models, probabilistic inference is most often approached by computing a posterior distribution over the model weights, which is then marginalised to obtain a distribution over functions and hence predictions. If the goal is solely to make good predictions, an appealing alternative is to perform inference directly over the ‘function-space’ or predictive posterior distribution of the model, without ever representing a posterior over the weights.

Using Gaussian Processes (GPs) as motivation, the talk starts by introducing a method for constructing more general stochastic processes by combining basis functions with random weights. We then discuss recent research on performing approximate inference in the function space of neural networks.

Finally, we provide a brief introduction to Stochastic Differential Equations (SDEs), discuss the connection of linear SDEs to GPs and to Kalman filtering and smoothing, and present a recent method for performing inference and learning in nonlinear SDEs.
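
To make the weight-space route concrete, here is a minimal Bayesian linear regression sketch in Python/NumPy, assuming a standard normal prior over the weights and a fixed noise variance; the function names and toy data are illustrative, not from the talk. The weight posterior is available in closed form and is then marginalised to give a Gaussian predictive distribution.

    import numpy as np

    def weight_posterior(Phi, y, noise_var=0.1):
        # Phi: (n, d) design matrix; y: (n,) targets; assumed prior w ~ N(0, I).
        d = Phi.shape[1]
        S = np.linalg.inv(np.eye(d) + Phi.T @ Phi / noise_var)  # posterior covariance
        m = S @ Phi.T @ y / noise_var                           # posterior mean
        return m, S

    def predictive(phi_star, m, S, noise_var=0.1):
        # Marginalising the weight posterior gives a Gaussian over the output
        # at a test point with feature vector phi_star of shape (d,).
        return phi_star @ m, phi_star @ S @ phi_star + noise_var

    # Toy usage: cubic polynomial features on noisy sine data.
    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 20)
    Phi = np.vander(x, 4)
    y = np.sin(3.0 * x) + 0.1 * rng.normal(size=x.size)
    m, S = weight_posterior(Phi, y)
    print(predictive(np.vander(np.array([0.5]), 4)[0], m, S))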
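The basis-functions-with-random-weights construction can be sketched with random Fourier features (Rahimi & Recht), one standard choice of random basis: with frequencies drawn from the assumed RBF spectral density below, f(x) = Φ(x)w with w ~ N(0, I) approximates a draw from a GP prior with an RBF kernel. The lengthscale and feature count are arbitrary illustrative values.

    import numpy as np

    rng = np.random.default_rng(0)
    ell, num_features = 0.5, 200                             # illustrative choices
    omega = rng.normal(0.0, 1.0 / ell, size=num_features)    # spectral frequencies
    phase = rng.uniform(0.0, 2.0 * np.pi, size=num_features)

    def features(x):
        # Map inputs of shape (n,) to a basis-function matrix (n, num_features).
        return np.sqrt(2.0 / num_features) * np.cos(np.outer(x, omega) + phase)

    # A stochastic process as basis functions with random weights:
    # f(x) = features(x) @ w, w ~ N(0, I), approximates GP prior samples.
    x = np.linspace(-3.0, 3.0, 200)
    w = rng.normal(size=(num_features, 5))                   # five independent weight draws
    samples = features(x) @ w                                # (200, 5) approximate samples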
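For the linear-SDE connection, a minimal sketch assuming the Ornstein-Uhlenbeck process, whose exact discretisation is known in closed form: GP regression under the corresponding Matérn-1/2 prior reduces to a linear-time Kalman filter over the discretised state (a smoothing pass over the filtered quantities would recover the full GP posterior). All constants below are illustrative.

    import numpy as np

    # The Ornstein-Uhlenbeck SDE dx = -lam * x dt + sigma * dB defines a GP with
    # exponential (Matern-1/2) covariance k(t, t') = sigma^2 / (2 lam) * exp(-lam |t - t'|).
    lam, sigma, obs_var, dt = 1.0, 1.0, 0.1, 0.05            # illustrative values
    A = np.exp(-lam * dt)                                    # exact discrete-time transition
    Q = sigma**2 / (2 * lam) * (1 - np.exp(-2 * lam * dt))   # process noise variance

    def kalman_filter(ys):
        # Forward filter: p(x_k | y_1, ..., y_k) in O(n) time.
        m, P = 0.0, sigma**2 / (2 * lam)                     # stationary prior as initial state
        means, variances = [], []
        for y in ys:
            m, P = A * m, A * P * A + Q                      # predict
            K = P / (P + obs_var)                            # Kalman gain
            m, P = m + K * (y - m), (1 - K) * P              # update with observation y
            means.append(m)
            variances.append(P)
        return np.array(means), np.array(variances)

    rng = np.random.default_rng(1)
    t = np.arange(0.0, 5.0, dt)
    ys = np.sin(t) + np.sqrt(obs_var) * rng.normal(size=t.size)
    filt_mean, filt_var = kalman_filter(ys)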

Recommended reading

  1. Rasmussen & Williams, “Gaussian Processes for Machine Learning”, Chapter 2.2: “Function-space View”, pages 13-18
  2. Burt et al., “Understanding Variational Inference in Function-Space”, 2020
  3. Archambeau et al., “Variational Inference for Diffusion Processes”, 2008

This talk is part of the Machine Learning Reading Group @ CUED series.
