Why are the terms Query, Key, and Value used in self-attention mechanisms? In Part 4 of our Transformers series, we break down the intuitive reasoning behind the names: Query, Key, and Value.
By the end, you’ll have a clear understanding of why these terms were chosen by the authors of the "Attention Is All You Need" paper and how they contribute to the self-attention mechanism.
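
As a quick reference, here is a minimal NumPy sketch of scaled dot-product attention (an illustrative example, not the exact code from the video): each query is scored against every key, and the resulting weights mix the values.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Score each query against every key (similarity), scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax turns scores into attention weights that sum to 1 per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the value vectors.
    return weights @ V
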
Timestamps:
0:00 Intro
0:43 Query, Key & Value in Computer Science
1:36 Query, Key & Value in Self-Attention
3:30 Outro
Follow my complete playlists:
Transformers Playlist: https://www.youtube.com/watch?v=lRylkiFdUdk&list=PLuhqtP7jdD8CQTxwVsuiFYGvHtFpNhlR3&index=1&t=0s
RNN Playlist: https://www.youtube.com/watch?v=lWPkNkShNbo&list=PLuhqtP7jdD8ARBnzj8SZwNFhwWT89fAFr&t=0s
CNN Playlist: https://www.youtube.com/watch?v=E5Z7FQp7AQQ&list=PLuhqtP7jdD8CD6rOWy20INGM44kULvrHu&t=0s
Complete Neural Network: https://www.youtube.com/watch?v=mlk0rddP3L4&list=PLuhqtP7jdD8CftMk831qdE8BlIteSaNzD&t=0s
Complete Logistic Regression Playlist: https://www.youtube.com/watch?v=U1omz0B9FTw&list=PLuhqtP7jdD8Chy7QIo5U0zzKP8-emLdny&t=0s
Complete Linear Regression Playlist: https://www.youtube.com/watch?v=nwD5U2WxTdk&list=PLuhqtP7jdD8AFocJuxC6_Zz0HepAWL9cF&t=0s
If you find this helpful, don’t forget to like, subscribe, and share!