Deep Learning in Practice
- Speaker: Nick Rogers, Churchill College
- Date & Time: Wednesday 20 January 2016, 19:00-19:40
- Venue: Wolfson Hall, Churchill College
Abstract
This talk will explore some of the problems that arise in deep learning and the mathematical techniques used to overcome them. We will begin by examining how the vanishing gradient problem threatens the very existence of deep networks, and then look at how researchers over the past decade have overcome these issues using a number of distinct methods that drastically improve the rate of learning and, ultimately, the resulting accuracy of these networks. We will investigate how the cross-entropy cost function improves learning speed, and show how using an alternative to sigmoid neurons can avoid the problem of neuron saturation. We will also analyse the motivation behind regularisation and show how it can be used to combat overfitting. These techniques are used by pioneering neural networks, such as the winners of the Large Scale Visual Recognition Challenge, and can result in networks that achieve human-level performance on some tasks.
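The abstract's claim that the cross-entropy cost improves learning speed can be sketched numerically (this example is not from the talk; it is a minimal illustration of the standard argument for a single sigmoid neuron). With the quadratic cost, the gradient with respect to the weighted input carries a factor of the sigmoid's derivative, which vanishes when the neuron saturates; with the cross-entropy cost, that factor cancels and the gradient stays proportional to the raw error:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def quadratic_grad(z, y):
    # dC/dz for C = (a - y)^2 / 2 with a = sigmoid(z).
    # The extra a * (1 - a) factor (i.e. sigma'(z)) shrinks the
    # gradient towards zero when the neuron saturates.
    a = sigmoid(z)
    return (a - y) * a * (1 - a)

def cross_entropy_grad(z, y):
    # dC/dz for C = -[y*ln(a) + (1 - y)*ln(1 - a)] with a = sigmoid(z).
    # The sigma'(z) factor cancels, leaving just the error a - y.
    a = sigmoid(z)
    return a - y

# A badly saturated neuron: output near 1 (z = 5) but target 0.
z, y = 5.0, 0.0
print(quadratic_grad(z, y))      # tiny gradient: learning has slowed down
print(cross_entropy_grad(z, y))  # gradient stays proportional to the error
```

For this saturated neuron the cross-entropy gradient is over a hundred times larger than the quadratic one, so gradient descent corrects the mistake far more quickly.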
Series: This talk is part of the Churchill CompSci Talks series.