Learn about watsonx → https://ibm.biz/BdvxRB
Long Short-Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network, or RNN, architecture capable of learning long-term dependencies. They also offer a solution to the vanishing gradient problem that can occur when training traditional RNNs.
In this lightboard video, Martin Keen of IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates carry relevant information through a sequence chain, and a few key LSTM use cases.
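For a concrete sense of the cell state and gates the video describes, here is a minimal sketch of the standard LSTM step equations in plain NumPy. It is illustrative only, not code from the video; the function name lstm_step and the toy dimensions are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: the gates decide what the cell state
    forgets, stores, and exposes. W, U, b stack all four gates."""
    z = W @ x + U @ h_prev + b      # combine input and previous hidden state
    f, i, o, g = np.split(z, 4)
    f = sigmoid(f)                  # forget gate: what to drop from the cell state
    i = sigmoid(i)                  # input gate: what new information to store
    o = sigmoid(o)                  # output gate: what the hidden state reveals
    g = np.tanh(g)                  # candidate values for the cell state
    c = f * c_prev + i * g          # cell state carries long-term memory forward
    h = o * np.tanh(c)              # hidden state passed along the sequence chain
    return h, c

# Toy dimensions (hypothetical): input size 3, hidden size 4
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in))
U = rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):   # run a short input sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h)
```

Because the cell state is updated additively (f * c_prev + i * g), gradients can flow through long sequences without vanishing the way they do in a plain RNN.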
#LSTM #RNN #AI