Traditional artificial neural networks (ANNs) have a memory problem: they cannot recall what they learned from previous events to inform their reasoning about new ones.
If you have time-sequenced data, i.e., longitudinal data, you might want to consider a Recurrent Neural Network (RNN), which allows information to flow from the past to the present through a feedback loop, much as our senses constantly feed information back to us so we can reevaluate an unfolding situation.
Events form a connected series of vectors (or tensors), and an RNN has a way of bringing past experience to bear on the next step of its deliberation.
In recent years, RNNs have proved “unreasonably effective”, to borrow a phrase from Andrej Karpathy of Stanford University. Applications include speech recognition and categorization, language modeling and translation, image captioning, and recognition at multiple points in a sequence of images.
In short, RNNs add a feedback loop to the neurons, allowing information to persist across steps instead of being reset every time the network is executed.
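The feedback loop can be sketched in a few lines of NumPy. This is a minimal illustration, not a trainable model: the weight matrices and sizes are made-up placeholders, and the key point is simply that the hidden state `h` computed at one time step is fed back in at the next.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

# Hypothetical, randomly initialized weights (in practice these are learned).
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden: the feedback loop
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One step: the new state depends on the current input AND the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Process a sequence of 5 time steps, carrying h forward each step
# instead of resetting it.
h = np.zeros(hidden_size)
sequence = rng.standard_normal((5, input_size))
for x_t in sequence:
    h = rnn_step(x_t, h)

print(h.shape)  # (4,)
```

A feed-forward network would compute each step from `x_t` alone; here the `W_hh @ h_prev` term is what lets information from earlier inputs influence the current output.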
Long Short-Term Memory (LSTM) is a variation of the RNN that has proved most effective at processing these kinds of longitudinal data. We will discuss LSTMs further in a future entry.