LSTM and Recurrent Neural Networks

If you have a question about this talk, please contact mv310.

Long Short-Term Memory (LSTM) is a type of Recurrent Neural Network (RNN) that has recently succeeded in many sequence learning tasks, such as speech recognition, online handwriting recognition, and statistical machine translation. An RNN, which unrolls into a very deep neural network with many tied weights, exhibits the vanishing and exploding gradient problems common in deep models trained with the backpropagation algorithm (BP). LSTM resolves this problem by introducing an internal cell memory and access control gates, enabling the learning of long-range temporal dependencies. In this talk, we introduce the basics of RNNs and LSTMs, and review recent successes achieved by LSTM models.
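The cell memory and gates mentioned above can be sketched in a few lines of NumPy. This is a minimal, illustrative implementation of one LSTM step in its standard formulation; the variable names and tiny dimensions are chosen for clarity, not taken from the talk.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: sigmoid gates control what the internal cell
    memory c forgets, adds, and exposes. The additive cell update is
    what mitigates the vanishing-gradient problem of plain RNNs."""
    z = W @ np.concatenate([x, h_prev]) + b       # all four pre-activations at once
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input, forget, output gates
    g = np.tanh(g)                                # candidate cell update
    c = f * c_prev + i * g                        # gated internal cell memory
    h = o * np.tanh(c)                            # gated hidden output
    return h, c

# Illustrative usage with random weights (input size 3, hidden size 4).
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
W = rng.standard_normal((4 * n_h, n_in + n_h)) * 0.1
b = np.zeros(4 * n_h)
h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.standard_normal((5, n_in)):          # a length-5 input sequence
    h, c = lstm_step(x, h, c, W, b)
```

Because h = o * tanh(c) with a sigmoid-valued gate o, every entry of the hidden state stays strictly inside (-1, 1), while the cell state c itself is unbounded and can carry information across many steps.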

This talk is part of the Machine Learning Reading Group @ CUED series.


© 2006-2022, University of Cambridge.