In this video, we will understand backpropagation in RNNs. It is also called Backpropagation Through Time (BPTT), because the gradients are propagated backwards through the time steps of the sequence.

Understanding backpropagation in RNNs helps us see how Recurrent Neural Networks learn. It is also essential for understanding the vanishing gradient problem that occurs in RNNs.

We will look at the general equations for computing backpropagation in RNNs, and we will also see how to find the gradients of the loss with respect to the weights Wya, Waa, and Wax.
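As a companion to the video, here is a minimal NumPy sketch of the forward pass and of Backpropagation Through Time for a vanilla RNN, using the same weight names as the video (Wax, Waa, Wya). The toy dimensions, the squared-error loss, and the random data are illustrative assumptions, not the exact example derived in the video:

```python
# Minimal NumPy sketch of forward propagation and Backpropagation Through
# Time (BPTT) for a vanilla RNN. Weight names follow the video (Wax, Waa,
# Wya); the loss, sizes, and data below are assumptions for illustration.
import numpy as np

np.random.seed(0)
n_x, n_a, n_y, T = 3, 4, 2, 5          # input, hidden, output sizes; time steps

Wax = np.random.randn(n_a, n_x) * 0.1  # input-to-hidden weights
Waa = np.random.randn(n_a, n_a) * 0.1  # hidden-to-hidden weights
Wya = np.random.randn(n_y, n_a) * 0.1  # hidden-to-output weights
ba, by = np.zeros((n_a, 1)), np.zeros((n_y, 1))

xs = [np.random.randn(n_x, 1) for _ in range(T)]  # toy input sequence
ys = [np.random.randn(n_y, 1) for _ in range(T)]  # toy targets

# Forward propagation: a_t = tanh(Waa a_{t-1} + Wax x_t + ba),
#                      y_hat_t = Wya a_t + by  (linear output for simplicity)
a = {-1: np.zeros((n_a, 1))}
y_hat, loss = {}, 0.0
for t in range(T):
    a[t] = np.tanh(Waa @ a[t - 1] + Wax @ xs[t] + ba)
    y_hat[t] = Wya @ a[t] + by
    loss += 0.5 * np.sum((y_hat[t] - ys[t]) ** 2)  # assumed squared-error loss

# BPTT: walk backwards in time and accumulate the gradient of every step
dWax, dWaa, dWya = np.zeros_like(Wax), np.zeros_like(Waa), np.zeros_like(Wya)
da_next = np.zeros((n_a, 1))            # gradient flowing back from step t+1
for t in reversed(range(T)):
    dy = y_hat[t] - ys[t]               # dL/dy_hat_t for squared error
    dWya += dy @ a[t].T                 # gradient w.r.t. Wya
    da = Wya.T @ dy + da_next           # total gradient reaching a_t
    dz = (1 - a[t] ** 2) * da           # backprop through tanh
    dWaa += dz @ a[t - 1].T             # gradient w.r.t. Waa
    dWax += dz @ xs[t].T                # gradient w.r.t. Wax
    da_next = Waa.T @ dz                # pass gradient on to step t-1

print("loss:", loss)
print("||dWya||, ||dWaa||, ||dWax|| =",
      np.linalg.norm(dWya), np.linalg.norm(dWaa), np.linalg.norm(dWax))
```

Note how da_next is repeatedly multiplied by Waa.T and by the tanh derivative as it travels back through the time steps; these repeated products are exactly what can shrink towards zero and cause the vanishing gradient problem discussed in the video.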



Timestamps:
0:00 Intro
0:30 Forward Propagation
3:24 Gradient w.r.t. Wya
5:08 Gradient w.r.t. Waa
7:27 Gradient w.r.t. Wax
8:23 End



Follow my entire playlist on Recurrent Neural Networks (RNN):

RNN Playlist: https://www.youtube.com/watch?v=lWPkNkShNbo&list=PLuhqtP7jdD8ARBnzj8SZwNFhwWT89fAFr&t=0s



CNN Playlist: https://www.youtube.com/watch?v=E5Z7FQp7AQQ&list=PLuhqtP7jdD8CD6rOWy20INGM44kULvrHu&t=0s

Complete Neural Network: https://www.youtube.com/watch?v=mlk0rddP3L4&list=PLuhqtP7jdD8CftMk831qdE8BlIteSaNzD&t=0s

Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s

Complete Linear Regression Playlist: https://www.youtube.com/watch?v=nwD5U2WxTdk&list=PLuhqtP7jdD8AFocJuxC6_Zz0HepAWL9cF&t=0s



If you want to ride on the lane of Machine Learning, then subscribe to my channel here: https://www.youtube.com/channel/UCJFAF6IsaMkzHBDdfriY-yQ?sub_confirmation=1