An RNN processes sequential data and captures context information from the dataset; using this context, it can classify time-series and text data very well. However, it suffers from the vanishing-gradient problem on long sequences. LSTM is used to address this problem: it contains gates, and with its forget gate the LSTM discards less important information.
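Below is a minimal sketch of what such a model can look like in Keras. It is not the repo's exact architecture; the vocabulary size, embedding width, and layer sizes are illustrative values only.

```python
import tensorflow as tf

vocab_size = 10000      # illustrative vocabulary size
embedding_dim = 64      # illustrative embedding width

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    # The LSTM cell's gates (including the forget gate) let the network drop
    # less important context, which mitigates the vanishing-gradient problem
    # of a plain SimpleRNN on long sequences.
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary sentiment output
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```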
In this repo, the model was trained on a sentiment analysis dataset from TensorFlow, and its performance is demonstrated in a Jupyter notebook.
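As a rough sketch of the data-loading step, the snippet below pulls a sentiment dataset from TensorFlow Datasets and vectorizes the text for the model above. The README does not name the exact dataset here, so `imdb_reviews` is assumed purely for illustration, as are the batch size and sequence length.

```python
import tensorflow as tf
import tensorflow_datasets as tfds

# Load a TensorFlow sentiment dataset; `imdb_reviews` is an assumption,
# not necessarily the dataset used in this repo's notebook.
(train_ds, test_ds), info = tfds.load(
    "imdb_reviews",
    split=["train", "test"],
    as_supervised=True,   # yields (text, label) pairs
    with_info=True,
)

# Map raw review text to integer sequences before feeding the LSTM.
vectorize = tf.keras.layers.TextVectorization(
    max_tokens=10000, output_sequence_length=200
)
vectorize.adapt(train_ds.map(lambda text, label: text))

train_batches = (
    train_ds.map(lambda text, label: (vectorize(text), label))
    .shuffle(10000)
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)
)
```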