LSTM (Long Short-Term Memory) is a special type of Recurrent Neural Network. A simple RNN struggles to remember the context of a long sentence because it loses information quickly, which is why a simple RNN effectively has only short-term memory.

LSTM has both long-term and short-term memory, so it can retain contextual information over long sequences.

LSTM has 2 internal states:
1.) Cell State, which acts as the long-term memory
2.) Hidden State, which acts as the short-term memory

The main working components of an LSTM are its gates. There are 3 types of gates in an LSTM (a small code sketch follows this list):
1.) Forget Gate
2.) Input Gate
3.) Output Gate
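
Below is a minimal sketch of one LSTM step written in NumPy, just to make the interplay between the gates and the two states concrete. The weight layout, names, and sizes here are illustrative assumptions, not the exact formulation used in the video.

# A minimal sketch of one LSTM step (illustrative only; weight names and sizes are assumptions).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    # Concatenate previous hidden state (short-term memory) with current input.
    z = np.concatenate([h_prev, x_t])

    # Each gate gets its own weight matrix and bias.
    f = sigmoid(W["f"] @ z + b["f"])   # Forget Gate: what to erase from the cell state
    i = sigmoid(W["i"] @ z + b["i"])   # Input Gate: how much new information to write
    g = np.tanh(W["g"] @ z + b["g"])   # Candidate values to add to the cell state
    o = sigmoid(W["o"] @ z + b["o"])   # Output Gate: what part of the cell state to expose

    c_t = f * c_prev + i * g           # Cell State: long-term memory
    h_t = o * np.tanh(c_t)             # Hidden State: short-term memory / output
    return h_t, c_t

# Tiny usage example with random weights (input size 3, hidden size 4).
rng = np.random.default_rng(0)
in_size, hid = 3, 4
W = {k: rng.standard_normal((hid, hid + in_size)) for k in ("f", "i", "g", "o")}
b = {k: np.zeros(hid) for k in ("f", "i", "g", "o")}
h, c = np.zeros(hid), np.zeros(hid)
h, c = lstm_step(rng.standard_normal(in_size), h, c, W, b)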

In this video, we go through the LSTM Recurrent Neural Network in detail.



Timestamps:
0:00 Intro
1:36 Problem with RNN
5:30 LSTM Overview
7:42 Forget Gate
10:39 Input Gate
13:39 Equations and other details
16:41 Summary of LSTM
18:23 LSTM through different times
19:01 End



Quiz: https://forms.gle/no29DhL1pF1dsFw28



Follow my entire playlist on Recurrent Neural Networks (RNN):

RNN Playlist: https://www.youtube.com/watch?v=lWPkNkShNbo&list=PLuhqtP7jdD8ARBnzj8SZwNFhwWT89fAFr&t=0s



CNN Playlist: https://www.youtube.com/watch?v=E5Z7FQp7AQQ&list=PLuhqtP7jdD8CD6rOWy20INGM44kULvrHu&t=0s

Complete Neural Network: https://www.youtube.com/watch?v=mlk0rddP3L4&list=PLuhqtP7jdD8CftMk831qdE8BlIteSaNzD&t=0s

Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s

Complete Linear Regression Playlist: https://www.youtube.com/watch?v=nwD5U2WxTdk&list=PLuhqtP7jdD8AFocJuxC6_Zz0HepAWL9cF&t=0s



If you want to ride on the Lane of Machine Learning, then Subscribe to my channel here: https://www.youtube.com/channel/UCJFAF6IsaMkzHBDdfriY-yQ?sub_confirmation=1