# GRU RNN

A minimal, extensively commented implementation of a recurrent neural network with GRUs (Gated Recurrent Units; Cho et al.), applied to predicting the next character in a document given the preceding characters, in the spirit of Andrej Karpathy's minimal vanilla RNN implementation.
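
For orientation, a single GRU timestep computes an update gate, a reset gate, and a candidate state, then blends the old and new hidden states. The sketch below uses NumPy; the parameter names (`Wz`, `Uz`, etc.) are illustrative and not necessarily those used in this repository:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU timestep: x is the input vector, h_prev the previous hidden state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde              # blend old and new state
```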

Running this implementation requires Python 3.x and NumPy. The main.py script trains the model via backpropagation to predict character sequences in input.txt. Every 100 model updates, the model samples and prints a piece of fully model-generated text.
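
A character-level sampling loop in this style typically one-hot encodes the current character, steps the GRU, applies a softmax output layer, and feeds the sampled character back in. The following sketch assumes the `gru_step` above plus hypothetical output parameters `Wy` and `by`; it is illustrative, not the repository's exact code:

```python
def sample(h, seed_ix, n, params, Wy, by):
    """Sample n character indices from the model, seeded with seed_ix."""
    vocab_size = Wy.shape[0]
    x = np.zeros(vocab_size)
    x[seed_ix] = 1.0
    indices = []
    for _ in range(n):
        h = gru_step(x, h, params)
        logits = Wy @ h + by
        p = np.exp(logits - logits.max())
        p /= p.sum()                              # softmax over characters
        ix = np.random.choice(vocab_size, p=p)    # draw the next character
        x = np.zeros(vocab_size)
        x[ix] = 1.0                               # feed the sample back in
        indices.append(ix)
    return indices
```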

The provided input.txt contains 40 paragraphs of lorem ipsum. After training, the resulting model produces entertaining pseudo-Latin.