
Deep Learning


If you have a question about this talk, please contact Zoubin Ghahramani.

I will describe an efficient, unsupervised learning procedure for a simple type of two-layer neural network called a Restricted Boltzmann Machine. I will then show how this algorithm can be used recursively to learn multiple layers of features without requiring any supervision. After this unsupervised “pre-training”, the features in all layers can be fine-tuned to discriminate better between classes by using the standard backpropagation procedure from the 1980s. Unsupervised pre-training greatly improves generalization to new data, especially when the number of labelled examples is small.

Ten years ago, the pre-training approach initiated a revival of research on deep, feedforward neural networks. I will describe some of the major successes of deep networks in speech recognition, object recognition, and machine translation, and I will speculate about where this research is headed. The fact that backpropagation is now the learning method of choice for a wide variety of very difficult tasks means that neuroscientists may need to reconsider their well-worn arguments about why it cannot possibly be occurring in cortex. I shall conclude by undermining two of the commonest objections to the idea that cortex is actually backpropagating error derivatives through a hierarchy of cortical areas, and I shall show that spike-timing-dependent plasticity is a signature of backpropagation.
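
As a rough illustration of the procedure described in the abstract, the following is a minimal NumPy sketch of contrastive-divergence (CD-1) training for a binary Restricted Boltzmann Machine, together with the greedy layer-wise stacking that yields multiple feature layers. The class structure, hyperparameters, and the pretrain_stack helper are illustrative assumptions, not code from the talk.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        # Small random weights and zero biases are a common starting point
        # (an assumption here, not a prescription from the talk).
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible-unit biases
        self.b_h = np.zeros(n_hidden)    # hidden-unit biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0):
        # Positive phase: sample hidden units conditioned on the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one step of Gibbs sampling (the "1" in CD-1).
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Approximate gradient: <v h>_data - <v h>_reconstruction.
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / batch
        self.b_v += self.lr * (v0 - pv1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)

def pretrain_stack(data, layer_sizes, epochs=5):
    # Greedy layer-wise pre-training: each RBM's hidden activities become
    # the "data" for the next RBM, giving a stack of feature layers.
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_update(x)
        rbms.append(rbm)
        x = rbm.hidden_probs(x)  # feed features upward to the next layer
    return rbms

# Toy usage on random binary "data" (shapes only; not a real dataset):
# data = (rng.random((100, 784)) > 0.5).astype(float)
# rbms = pretrain_stack(data, [256, 64])

In a full pipeline, the weights learned by pretrain_stack would initialise a feedforward network that is then fine-tuned with backpropagation on the labelled examples, as the abstract describes.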

This talk is part of the Machine Learning @ CUED series.
