LSTM and Recurrent Neural Networks
- 👤 Speaker: Shixiang Gu; Andrey Malinin
- 📅 Date & Time: Thursday 20 November 2014, 15:00 - 16:30
- 📍 Venue: Engineering Department, CBL Room 438
Abstract
Long Short-Term Memory (LSTM) is a type of Recurrent Neural Network (RNN) that has recently succeeded in many sequence learning tasks, such as speech recognition, online handwriting recognition, and statistical machine translation. An RNN, effectively a very deep neural network with many tied weights, exhibits the vanishing and exploding gradient problems common to deep models trained with the backpropagation algorithm (BP). LSTM addresses this problem by introducing an internal cell memory and access-control gates, enabling the learning of long-range temporal dependencies. In this talk, we introduce the basics of RNNs and LSTM, and recent successes achieved by LSTM models.
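The cell memory and gating mechanism described above can be sketched in a few lines. This is a minimal NumPy illustration (not code from the talk); the function and variable names are my own, and the single stacked weight matrix layout is one common convention among several:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step with hidden size H and input size D.
    W stacks the four gate weight matrices, shape (4*H, D + H);
    b is the stacked bias, shape (4*H,)."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[:H])        # input gate: what to write to the cell
    f = sigmoid(z[H:2*H])     # forget gate: what to keep from c_prev
    o = sigmoid(z[2*H:3*H])   # output gate: what to expose as h
    g = np.tanh(z[3*H:])      # candidate cell update
    c = f * c_prev + i * g    # additive memory update (key to gradient flow)
    h = o * np.tanh(c)        # gated hidden state
    return h, c

# Usage: run a tiny cell (input dim 3, hidden dim 2) over 5 random steps.
rng = np.random.default_rng(0)
D, H = 3, 2
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, b)
```

Because the cell state is updated additively (`c = f * c_prev + i * g`) rather than through a squashing nonlinearity at every step, gradients can flow across many time steps when the forget gate stays near one, which is the mechanism behind LSTM's long-range dependency learning.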
Series This talk is part of the Machine Learning Reading Group @ CUED series.
Included in Lists
- All Talks (aka the CURE list)
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Cambridge University Engineering Department Talks
- Centre for Smart Infrastructure & Construction
- Chris Davis' list
- Computational Continuum Mechanics Group Seminars
- custom
- Engineering Department, CBL Room 438
- Featured lists
- Guy Emerson's list
- Hanchen DaDaDash
- Inference Group Journal Clubs
- Inference Group Summary
- Information Engineering Division seminar list
- Interested Talks
- Machine Learning Reading Group
- Machine Learning Reading Group @ CUED
- Machine Learning Summary
- ML
- ndk22's list
- ob366-ai4er
- Quantum Matter Journal Club
- Required lists for MLG
- rp587
- School of Technology
- Simon Baker's List
- TQS Journal Clubs
- Trust & Technology Initiative - interesting events
- yk373's list
- yk449