Implementation of the paper *Multiplicative LSTM for sequence modelling* for Keras 2.0+.
Multiplicative LSTMs have been shown to achieve state-of-the-art or near state-of-the-art results on sequence-modelling datasets. They also outperform stacked LSTM models on the Hutter Prize dataset and the raw Wikipedia dataset.
From the paper, the changes to the equations of the general LSTM are:
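Concretely (a transcription of the equations from Krause et al., 2016, so worth checking against the paper itself), the mLSTM introduces an intermediate multiplicative state $m_t$ that replaces $h_{t-1}$ in every gate of the standard LSTM:

$$
\begin{aligned}
m_t &= (W_{mx} x_t) \odot (W_{mh} h_{t-1}) \\
\hat{h}_t &= W_{hx} x_t + W_{hm} m_t \\
i_t &= \sigma(W_{ix} x_t + W_{im} m_t) \\
o_t &= \sigma(W_{ox} x_t + W_{om} m_t) \\
f_t &= \sigma(W_{fx} x_t + W_{fm} m_t)
\end{aligned}
$$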
The size chosen for m_t is the same as that of h_t, so an mLSTM has roughly 1.25 times as many parameters as the equivalent LSTM model.
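A minimal NumPy sketch of a single mLSTM timestep may make this concrete. It is illustrative only: the names and the bias-free formulation are simplifications, not the code in this repository. It also checks the 1.25× parameter claim (ignoring biases):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlstm_step(x, h_prev, c_prev, W, U):
    """One mLSTM timestep (biases omitted for brevity).

    W[k] maps the input x_t; U[k] maps the multiplicative state m_t,
    which replaces h_{t-1} in every gate of a standard LSTM cell."""
    m = (W["m"] @ x) * (U["m"] @ h_prev)      # multiplicative state m_t
    i = sigmoid(W["i"] @ x + U["i"] @ m)      # input gate
    f = sigmoid(W["f"] @ x + U["f"] @ m)      # forget gate
    o = sigmoid(W["o"] @ x + U["o"] @ m)      # output gate
    h_hat = np.tanh(W["h"] @ x + U["h"] @ m)  # candidate update
    c = f * c_prev + i * h_hat                # new cell state
    h = o * np.tanh(c)                        # new hidden state
    return h, c

def n_weights(d_in, d_h, mlstm=True):
    """Weight-matrix entries (biases ignored) for one (m)LSTM layer."""
    per_gate = d_h * d_in + d_h * d_h         # one input and one recurrent matrix
    extra = per_gate if mlstm else 0          # the additional W_mx and W_mh pair
    return 4 * per_gate + extra

# The extra pair of matrices is exactly a quarter of the LSTM's four gates:
assert n_weights(128, 128) / n_weights(128, 128, mlstm=False) == 1.25
```

Because m_t has the same size as h_t, the extra W_mx/W_mh pair costs exactly one quarter of the LSTM's weight count, whatever the input and hidden sizes.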
Copy the multiplicative_lstm.py script into your repository and import the MultiplicativeLSTM layer, which can be used as a drop-in replacement for Keras LSTM layers:

```python
from multiplicative_lstm import MultiplicativeLSTM
```
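For example, a hypothetical IMDB-style sentiment model in which the swap is a one-line change. The layer sizes and hyperparameters here are arbitrary illustrations, not the repository's settings:

```python
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense
# Once multiplicative_lstm.py is on your path:
# from multiplicative_lstm import MultiplicativeLSTM

model = Sequential([
    Embedding(input_dim=20000, output_dim=128),
    # Drop-in swap: replace LSTM with MultiplicativeLSTM, same arguments.
    LSTM(128, dropout=0.2, recurrent_dropout=0.2),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```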
While IMDB is not an ideal dataset for comparing LSTM variants, since both models overfit rapidly, the weights for the two models have been provided to show the comparison.
- LSTM score: 83.20% (overfits after 3 epochs)
- mLSTM score: 83.27% (overfits after 7 epochs)