
Probabilistic programs and the computability and complexity of Bayesian reasoning



This event may be recorded and made available internally or externally via http://research.microsoft.com. Microsoft will own the copyright of any recordings made. If you do not wish to have your image or voice recorded, please consider this before attending.

When is Bayesian reasoning possible? And when is it efficient?

There has been a recent surge of interest in probabilistic programming languages and general-purpose inference engines. By providing languages for specifying large, modular, potentially recursively-defined probabilistic models, these systems make it possible to tackle very complex inductive inference problems, opening up new avenues for AI and applications.

We present recent results elucidating aspects of the theoretical underpinnings of probabilistic programming systems, especially those whose underlying probabilistic languages have, in a technical sense, the same expressive power as probabilistic Turing machines. This algorithmic perspective on probability distributions, combined with new general-purpose Monte Carlo inference strategies, presents a host of challenges and opportunities for theoretical computer science.

In particular, we investigate the class of computable probability distributions, those for which there exists a probabilistic program generating exact samples, and explore the fundamental limitations on performing Bayesian reasoning with probabilistic program representations. We ask: when does a conditional distribution have such a representation, and when can we compute it? In addition to highlighting positive results demonstrating that Bayesian reasoning is possible when the prior distribution has additional structure, such as exchangeability or noise, we prove the nonexistence of algorithms (even inefficient ones) for Bayesian reasoning in the general case. We close with some complexity-theoretic aspects.
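To make the abstract's central notions concrete, here is a minimal sketch, not from the talk itself, of a probabilistic program as an exact sampler, conditioned on a noisy observation via rejection sampling. All names and parameters (`prior`, `noisy_observe`, the two-point prior, the 0.1 flip probability) are illustrative assumptions; the role of the noise is that exact matching of the observed value has positive probability, so rejection sampling terminates.

```python
import random

random.seed(0)

def prior():
    """Hypothetical generative program: sample a coin bias
    from a simple two-point prior."""
    return 0.2 if random.random() < 0.5 else 0.8

def noisy_observe(bias, flip_prob=0.1):
    """Observe one coin flip whose outcome is corrupted:
    it is inverted with probability flip_prob."""
    heads = random.random() < bias
    if random.random() < flip_prob:  # observation noise
        heads = not heads
    return heads

def condition_by_rejection(observed, n=100_000):
    """Approximate the posterior over the bias given a noisy
    observation: run the program n times and keep only the
    runs whose simulated observation matches the data."""
    kept = []
    for _ in range(n):
        b = prior()
        if noisy_observe(b) == observed:
            kept.append(b)
    return sum(kept) / len(kept)  # posterior mean of the bias

posterior_mean = condition_by_rejection(observed=True)
```

By Bayes' rule the exact posterior mean here is 0.2 · 0.26 + 0.8 · 0.74 ≈ 0.644, and the sampler's estimate converges to it; the talk's negative results concern priors without such noise structure, where no such conditioning algorithm exists at all.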

This talk is part of the Microsoft Research Machine Learning and Perception Seminars series.


© 2006-2019 Talks.cam, University of Cambridge.