The Gated Recurrent Unit (GRU) is a type of Recurrent Neural Network (RNN) that performs better than a simple RNN on longer input sequences. A GRU is an advanced RNN that can retain valuable information for a long period of time without overwriting it, so its memory context combines long-term and short-term memory. The update gate helps retain a valuable context for as long as it is needed, while the reset gate helps forget a context when it is no longer needed, making room for new information to come in.
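The gating described above can be sketched in a few lines of NumPy. This is a minimal, hypothetical single-step GRU cell, not the exact formulation from the video: the parameter names (Wz, Uz, bz, ...) and the convention that the update gate z blends old state and candidate are assumptions, and one of two common sign conventions for z is used.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU time step. Shapes: x is (n_in,), h_prev is (n_hid,)."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    # Update gate: how much of the candidate to mix in (z near 0 keeps the old state).
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)
    # Reset gate: how much of the old state feeds the candidate (r near 0 forgets it).
    r = sigmoid(Wr @ x + Ur @ h_prev + br)
    # Candidate state, computed from the input and the (reset-scaled) old state.
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)
    # Blend old state and candidate; this is how a GRU retains context long-term.
    return (1 - z) * h_prev + z * h_tilde
```

With this convention, driving the update gate toward 0 (e.g., a strongly negative bz) makes the cell carry h_prev forward unchanged, which is exactly the "retain a valuable context" behavior described above.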
The Gated Recurrent Unit (GRU) is used in many real-world applications. This video gives a detailed explanation of the GRU, covering both how it works and the equations it operates by.
Timestamps:
0:00 Intro
0:24 RNN limitations
3:38 LSTM vs GRU
4:42 GRU explained
9:13 Update Gate
10:45 Reset Gate
13:09 Summary
Follow my entire playlist on Recurrent Neural Networks (RNN), along with my other playlists:
RNN Playlist: https://www.youtube.com/watch?v=lWPkNkShNbo&list=PLuhqtP7jdD8ARBnzj8SZwNFhwWT89fAFr&t=0s
CNN Playlist: https://www.youtube.com/watch?v=E5Z7FQp7AQQ&list=PLuhqtP7jdD8CD6rOWy20INGM44kULvrHu&t=0s
Complete Neural Network: https://www.youtube.com/watch?v=mlk0rddP3L4&list=PLuhqtP7jdD8CftMk831qdE8BlIteSaNzD&t=0s
Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s
Complete Linear Regression Playlist: https://www.youtube.com/watch?v=nwD5U2WxTdk&list=PLuhqtP7jdD8AFocJuxC6_Zz0HepAWL9cF&t=0s
If you want to ride on the Lane of Machine Learning, then Subscribe to my channel here: https://www.youtube.com/channel/UCJFAF6IsaMkzHBDdfriY-yQ?sub_confirmation=1