
Pizza & AI April 2019


If you have a question about this talk, please contact Microsoft Research Cambridge Talks Admins.

Please note, this event may be recorded. Microsoft will own the copyright of any recording and reserves the right to distribute it as required.

Speaker 1 – Andrey Malinin

Title – This is the EnDD: Ensemble Distribution Distillation

Abstract – Ensembles of Neural Network (NN) models are known to yield improvements in accuracy as well as robust measures of uncertainty. However, ensembles come at a high computational and memory cost, which may be prohibitive for certain applications. Previously, the distillation of an ensemble into a single model has been investigated. Such approaches decrease computational cost and allow a single model to achieve accuracy comparable to that of an ensemble. However, information about the diversity of the ensemble, which can yield estimates of epistemic uncertainty, is lost. Recently, a new type of model, called a Prior Network, has been introduced, which allows a single DNN to explicitly model a distribution over output distributions conditioned on the input by parameterizing a Dirichlet distribution. This work proposes an approach called Ensemble Distribution Distillation, which distils an ensemble into a single Prior Network model, retaining both the improved classification performance and the measures of diversity of the ensemble. The properties of Ensemble Distribution Distillation are investigated on a synthetic spiral dataset.
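To make the core idea concrete, the sketch below evaluates the negative log-likelihood of an ensemble's categorical predictions under a Dirichlet whose concentration parameters a Prior Network would output for a given input. The function name, toy values, and NumPy implementation are illustrative assumptions, not code from the talk; distillation would minimize this quantity over the training data.

```python
import math
import numpy as np

def dirichlet_nll(alphas, ensemble_probs):
    """Negative log-likelihood of ensemble members' predictions under
    a Dirichlet with concentration parameters `alphas`.

    alphas: (K,) concentrations a Prior Network outputs for one input.
    ensemble_probs: (M, K) softmax outputs of M ensemble members.
    """
    # Log normalizer of the Dirichlet: ln G(sum a_k) - sum ln G(a_k)
    log_norm = math.lgamma(alphas.sum()) - sum(math.lgamma(a) for a in alphas)
    # Log-density of each ensemble member's probability vector
    log_lik = log_norm + ((alphas - 1.0) * np.log(ensemble_probs)).sum(axis=1)
    return -log_lik.mean()

# Concentrations peaked on class 0: ensemble predictions agreeing with
# class 0 score a much lower NLL than predictions peaked elsewhere.
alphas = np.array([10.0, 1.0, 1.0])
p_match = np.array([[0.9, 0.05, 0.05]])
p_mismatch = np.array([[0.05, 0.05, 0.9]])
```

A sharply peaked Dirichlet (large, concentrated alphas) signals low diversity among ensemble members, while a flat one signals high epistemic uncertainty — which is exactly the information ordinary distillation discards.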

Speaker 2 – Yingzhen Li

Title – Meta-Learning for Stochastic Gradient MCMC

Abstract – Stochastic gradient Markov chain Monte Carlo (SG-MCMC) has become increasingly popular for simulating posterior samples in large-scale Bayesian modeling. However, existing SG-MCMC schemes are not tailored to any specific probabilistic model, and even a simple modification of the underlying dynamical system requires significant physical intuition. This paper presents the first meta-learning algorithm that allows automated design of the underlying continuous dynamics of an SG-MCMC sampler. The learned sampler generalizes Hamiltonian dynamics with state-dependent drift and diffusion, enabling fast traversal and efficient exploration of neural network energy landscapes. Experiments validate the proposed approach on both Bayesian fully connected neural network and Bayesian recurrent neural network tasks, showing that the learned sampler outperforms generic, hand-designed SG-MCMC algorithms and generalizes to different datasets and larger architectures.

This is joint work with Wenbo Gong and Jose Miguel Hernandez-Lobato from the University of Cambridge. The paper will be presented at ICLR 2019.
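For background, stochastic gradient Langevin dynamics (SGLD) is the simplest member of the SG-MCMC family that the meta-learned sampler generalizes: a gradient step plus injected Gaussian noise whose variance matches the step size. The sketch below runs SGLD on a toy standard-Gaussian target; the step size, target, and sample counts are illustrative assumptions, not the paper's learned sampler.

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One SGLD update: half-step along the (stochastic) gradient of the
    log posterior, plus Gaussian noise with variance `step_size`."""
    noise = rng.normal(scale=np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_post(theta) + noise

# Toy target: standard Gaussian, so grad log p(theta) = -theta.
rng = np.random.default_rng(0)
theta = np.zeros(1)
samples = []
for _ in range(5000):
    theta = sgld_step(theta, lambda t: -t, 0.1, rng)
    samples.append(theta.copy())
samples = np.asarray(samples)[1000:]  # discard burn-in
```

The meta-learning approach replaces this fixed, hand-designed dynamics with learned state-dependent drift and diffusion terms, while keeping the same simulate-and-sample structure.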

This talk is part of the AI+Pizza series.



© 2006-2019 Talks.cam, University of Cambridge.