
Learning rates in Bayesian nonparametrics: Gaussian process priors


If you have a question about this talk, please contact Richard Nickl.

This talk has been canceled/deleted

This talk is joint with the Probability Series.

The sample path of a Gaussian process can be used as a prior model for an unknown function that we wish to estimate. For instance, one might model a regression function or log density a priori as the sample path of a Brownian motion or its primitive, or some stationary process. Viewing this prior model as a formal prior distribution in a Bayesian set-up, we obtain a posterior distribution in the usual way, which, given the observations, is a probability distribution on a function space.
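As a minimal sketch of this set-up (illustrative only, not code from the talk; it assumes NumPy, and all variable names are hypothetical), one can place a Brownian motion prior on a regression function observed in Gaussian noise and form the conjugate Gaussian posterior on a grid:

import numpy as np

# Represent the unknown function on a grid of [0, 1].
grid = np.linspace(0.0, 1.0, 200)

# Brownian motion prior: covariance kernel K(s, t) = min(s, t).
K = np.minimum.outer(grid, grid)

# Simulated data: y_i = f0(x_i) + noise, with f0 a given "true" function.
rng = np.random.default_rng(0)
f0 = np.sin(2 * np.pi * grid)
sigma = 0.3
y = f0 + sigma * rng.standard_normal(grid.size)

# Gaussian prior plus Gaussian likelihood give a Gaussian posterior:
# mean = K (K + sigma^2 I)^{-1} y, covariance = K - K (K + sigma^2 I)^{-1} K.
A = K + sigma**2 * np.eye(grid.size)
post_mean = K @ np.linalg.solve(A, y)
post_cov = K - K @ np.linalg.solve(A, K)

Comparing post_mean with f0 illustrates, informally, the kind of posterior concentration studied in the talk.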

We study this posterior distribution under the assumption that the data are generated by some given true function, and we are interested in whether the posterior contracts to the true function as the informativeness of the data increases indefinitely, and, if so, at what speed.
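In the usual formulation (standard notation, not quoted from the abstract), contraction at rate \varepsilon_n means that for every sufficiently large constant M,

\[
\Pi\bigl(f : d(f, f_0) > M \varepsilon_n \;\big|\; X^{(n)}\bigr) \;\to\; 0
\quad \text{in } P_{f_0}\text{-probability as } n \to \infty,
\]

where f_0 denotes the true function, d a suitable metric, and X^{(n)} the data.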

For Gaussian process priors, this rate of contraction can be described in terms of the small ball probability of the Gaussian process and the position of the true parameter relative to its reproducing kernel Hilbert space. Typically the prior has a strong influence on the contraction rate. This dependence can be alleviated by rescaling the sample paths. For instance, an infinitely smooth, stationary Gaussian process scaled by an inverse Gamma variable yields a prior distribution on functions such that the posterior distribution adapts to the unknown smoothness of the true parameter, in the sense that contraction takes place at the minimax rate for the true smoothness.
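For reference, this characterization is commonly written via the concentration function of the Gaussian process W with reproducing kernel Hilbert space \mathbb{H} (standard notation from the literature, not reproduced from the abstract):

\[
\varphi_{f_0}(\varepsilon) \;=\; \inf_{h \in \mathbb{H} :\, \|h - f_0\| \le \varepsilon} \|h\|_{\mathbb{H}}^2 \;-\; \log \Pr\bigl(\|W\| \le \varepsilon\bigr),
\]

the second term being the (negative log) small ball probability and the first measuring the position of f_0 relative to \mathbb{H}. The posterior then contracts at any rate \varepsilon_n satisfying \varphi_{f_0}(\varepsilon_n) \le n \varepsilon_n^2.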

This talk is part of the Statistics series.
