In this video, we will train word embeddings by writing Python code. To train word embeddings, we solve a fake problem: a task we do not actually care about. What we do care about are the weights obtained after training the model on that task; these weights are extracted and act as the word embeddings.
This is part 2 of 2 on training word embeddings. In part 1 we covered the theory behind training word embeddings; in this part, we implement it in Python.
Complete Code: https://github.com/Coding-Lane/Training-Word-Embeddings---Scratch
Timestamps:
0:00 Intro
2:13 Loading Data
3:25 Removing stop words and tokenizing
5:11 Creating Bigrams
7:37 Creating Vocabulary
9:29 One-hot Encoding
14:41 Model
19:35 Checking results
21:57 Useful Tips
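The steps in the timestamps above can be sketched end to end in plain NumPy. This is a minimal illustration, not the video's exact code (see the GitHub link for that): the toy corpus, embedding dimension, and learning rate here are assumptions chosen just to make the sketch runnable. We build bigram (input, context) pairs, one-hot encode them, train a tiny one-hidden-layer network on the fake prediction task, and then read the embeddings off the first weight matrix.

```python
import numpy as np

# Toy corpus (assumed for illustration); stop words already removed.
corpus = ["best way to success is hard work", "hard work beats talent"]

# Tokenize, then build bigrams: (input word, next word) pairs.
tokens = [sentence.split() for sentence in corpus]
bigrams = [(sent[i], sent[i + 1]) for sent in tokens for i in range(len(sent) - 1)]

# Vocabulary and one-hot encoding.
vocab = sorted({w for sent in tokens for w in sent})
word_to_idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

def one_hot(word):
    v = np.zeros(V)
    v[word_to_idx[word]] = 1.0
    return v

X = np.array([one_hot(a) for a, _ in bigrams])  # input words
Y = np.array([one_hot(b) for _, b in bigrams])  # context words to predict

# Tiny network V -> dim -> V; the fake task is next-word prediction.
rng = np.random.default_rng(0)
dim = 5                                     # embedding size (assumed)
W1 = rng.normal(scale=0.1, size=(V, dim))   # these weights become the embeddings
W2 = rng.normal(scale=0.1, size=(dim, V))

lr = 0.05
for _ in range(500):
    H = X @ W1                                              # hidden layer (linear)
    logits = H @ W2
    expz = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = expz / expz.sum(axis=1, keepdims=True)          # softmax
    dlogits = (probs - Y) / len(X)                          # cross-entropy gradient
    dW2 = H.T @ dlogits
    dH = dlogits @ W2.T
    W1 -= lr * (X.T @ dH)
    W2 -= lr * dW2

# After training, each row of W1 is the embedding of one vocabulary word.
embeddings = {w: W1[word_to_idx[w]] for w in vocab}
```

The fake prediction task is thrown away at the end; only `W1` (one `dim`-sized row per word) is kept, which is exactly the "extract the weights" idea from the intro.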
Follow my entire playlist on Recurrent Neural Networks (RNN):
RNN Playlist: https://www.youtube.com/watch?v=lWPkNkShNbo&list=PLuhqtP7jdD8ARBnzj8SZwNFhwWT89fAFr&t=0s
CNN Playlist: https://www.youtube.com/watch?v=E5Z7FQp7AQQ&list=PLuhqtP7jdD8CD6rOWy20INGM44kULvrHu&t=0s
Complete Neural Network: https://www.youtube.com/watch?v=mlk0rddP3L4&list=PLuhqtP7jdD8CftMk831qdE8BlIteSaNzD&t=0s
Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s
Complete Linear Regression Playlist: https://www.youtube.com/watch?v=nwD5U2WxTdk&list=PLuhqtP7jdD8AFocJuxC6_Zz0HepAWL9cF&t=0s
If you want to ride on the Lane of Machine Learning, then Subscribe to my channel here: https://www.youtube.com/channel/UCJFAF6IsaMkzHBDdfriY-yQ?sub_confirmation=1