Reading group: Underpinning techniques of the most widely used DNN architectures
- Speaker: Nicolai Baldin (Statslab)
- Date & Time: Tuesday 31 October 2017, 14:00 - 15:00
- Venue: Centre for Mathematical Sciences, MR2
Abstract
In the first session we will look more closely at common techniques used in widely deployed neural network (NN) architectures, such as batch normalisation, dropout and stochastic optimisers. We shall also touch upon regularisation ideas and various activation functions.
It will be roughly based upon the following papers:
1. Ioffe & Szegedy (2015), "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift"
2. Srivastava et al. (2014), "Dropout: A Simple Way to Prevent Neural Networks from Overfitting"
3. Kingma & Ba (2015), "Adam: A Method for Stochastic Optimization"
The first session will be given by the organisers, but participants are expected to be familiar with the papers. More information about the reading group can be found at mathsml.com
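As a concrete reference point for the discussion, here is a minimal sketch, assuming PyTorch is available, of how the techniques above fit together in practice: a small fully connected network with batch normalisation, a ReLU activation and dropout, trained for one step with the Adam optimiser and L2 regularisation via weight decay. The architecture and hyperparameters are purely illustrative and not taken from the papers.

```python
# Illustrative sketch only: a toy network showing batch normalisation,
# dropout, a ReLU activation, weight decay and the Adam optimiser.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),   # batch normalisation (Ioffe & Szegedy, 2015)
    nn.ReLU(),             # a common activation function
    nn.Dropout(p=0.5),     # dropout (Srivastava et al., 2014)
    nn.Linear(256, 10),
)

# Adam optimiser (Kingma & Ba, 2015); weight_decay adds L2 regularisation.
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One training step on random data, purely to show the shape of the loop.
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))
model.train()              # enables dropout and batch-norm batch statistics
optimiser.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimiser.step()
```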
Series: This talk is part of the Mathematics and Machine Learning series.