Mini Batch Gradient Descent is an algorithm that speeds up learning when you are working with a large dataset.
Instead of updating the weight parameters only after evaluating the entire dataset, Mini Batch Gradient Descent updates them after evaluating each small batch of the dataset. This lets the model make a lot of progress before it has even seen the whole dataset, so learning can be much faster.
In the video, we also look at Stochastic Gradient Descent, which updates the weight parameters after evaluating every single data point. Stochastic Gradient Descent has its own disadvantages, which Mini Batch Gradient Descent overcomes.
So, in practice, we don't use Stochastic Gradient Descent; we use Mini Batch Gradient Descent when dealing with large datasets, and plain Batch Gradient Descent when dealing with small datasets.
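To make the idea concrete, here is a minimal NumPy sketch of mini-batch gradient descent for linear regression (the function name, learning rate, and batch size are illustrative assumptions, not code from the video). Setting batch_size=1 turns it into Stochastic Gradient Descent, and batch_size=len(X) turns it into plain Batch Gradient Descent.

```python
import numpy as np

def mini_batch_gradient_descent(X, y, batch_size=32, lr=0.01, epochs=10):
    # X: (n_samples, n_features), y: (n_samples,) -- illustrative example only
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(epochs):
        # Shuffle once per epoch so each epoch sees different batches.
        idx = np.random.permutation(n_samples)
        for start in range(0, n_samples, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of mean squared error computed on this batch only.
            error = Xb @ w + b - yb
            grad_w = Xb.T @ error / len(batch)
            grad_b = error.mean()
            # Weights are updated after every batch, not after the full dataset.
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# batch_size=1        -> Stochastic Gradient Descent
# batch_size=len(X)   -> plain Batch Gradient Descent
```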
Improving Neural Network Playlist: https://www.youtube.com/watch?v=SOI39DEHGSk&list=PLuhqtP7jdD8DKUBtucBD0mGS7y0rT9alz
Complete Neural Network Playlist: https://www.youtube.com/watch?v=vtx1iwmOx10&t=284s
Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s
Complete Linear Regression Playlist: https://www.youtube.com/watch?v=mlk0rddP3L4&list=PLuhqtP7jdD8CftMk831qdE8BlIteSaNzD&t=0s
Timestamp:
0:00 Agenda
0:38 When Gradient Descent will Fail
2:24 Mini Batch Gradient Descent Deep Learning
3:31 Drawback of Mini Batch Gradient Descent
5:47 Stochastic Gradient Descent
6:30 When to use Mini Batch Gradient Descent
This is Your Lane to Machine Learning
Subscribe to my channel, because I upload a new Machine Learning video every week: https://www.youtube.com/channel/UCJFAF6IsaMkzHBDdfriY-yQ?sub_confirmation=1