Adam Optimizer Explained in Detail. The Adam Optimizer is an optimization algorithm that reduces the time taken to train a model in Deep Learning.
In mini-batch gradient descent, the path of learning zig-zags instead of moving straight toward the minimum, so time is wasted on these oscillations. The Adam Optimizer increases movement along the useful (horizontal) direction and damps the oscillating (vertical) direction, making the path straighter and reducing the time taken to train the model.
The Adam Optimizer is formed by combining two other Deep Learning optimizers: the Momentum Optimizer and the RMSprop Optimizer.
This combination makes Adam one of the most effective and widely used optimizers in Deep Learning.
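As a rough illustration of how the two ideas fit together, here is a minimal sketch of an Adam-style update step in Python (the variable names and default values such as lr, beta1 and beta2 are my own illustrative choices, not taken from the video):

import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Momentum part: exponentially weighted average of the gradients
    m = beta1 * m + (1 - beta1) * grad
    # RMSprop part: exponentially weighted average of the squared gradients
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for early steps, when m and v are still close to zero
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Combined update: momentum direction, scaled by the RMSprop denominator
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

Called once per mini-batch with t starting at 1, this damps the oscillating direction (through v) while building up speed along the consistent direction (through m), which is exactly the zig-zag straightening described above.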
The concept behind the Adam Optimizer can be difficult to understand, so in this video I have done my best to give you a detailed explanation of how it works.
Momentum Optimizer in Deep Learning: https://youtu.be/Vce8w1sy0e8
RMSprop Optimizer in Deep Learning: https://youtu.be/ajI_HTyaCu8
Improving Neural Network Playlist: https://www.youtube.com/watch?v=SOI39DEHGSk&list=PLuhqtP7jdD8DKUBtucBD0mGS7y0rT9alz&t=0s
Complete Neural Network Playlist: https://www.youtube.com/watch?v=vtx1iwmOx10&t=284s
Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s
Complete Linear Regression Playlist: https://www.youtube.com/watch?v=mlk0r...
Timestamps:
0:00 Agenda
1:52 Adam Optimizer Explained
4:35 End
Subscribe to my channel, because I upload a new Machine Learning video every week: https://www.youtube.com/channel/UCJFA...