
This is a fork of https://github.com/sherjilozair/char-rnn-tensorflow with modifications that allow the trained models to be used in other environments (e.g. ofxMSATensorFlow). The reasons these changes are necessary are described here.

After training:

  • Run sample.py with the --freeze_graph argument to prune, freeze, and save the graph as a binary protobuf that can be loaded in C++. This removes unnecessary nodes used only during training and replaces variables with constants. It also saves the character-index map as a text file.

  • sample_frozen.py demonstrates inference with the frozen graph from Python. The same frozen graph also works in C++/openFrameworks.
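A consumer of the frozen graph (whether Python or C++) needs to read the saved character-index map back in. The exact file format sample.py writes is not specified here; as a hypothetical sketch, assuming one `index<TAB>character-code` pair per line, it could be loaded like this (`load_char_map` is an illustrative name, not a function from this repo):

```python
def load_char_map(path):
    """Read a character-index map text file (assumed format:
    one "<index>\t<unicode code point>" pair per line) and
    return (chars, vocab): index -> char list and char -> index dict."""
    by_index = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            idx_str, code_str = line.rstrip("\n").split("\t")
            by_index[int(idx_str)] = chr(int(code_str))
    # Dense list ordered by index, plus the inverse mapping.
    chars = [by_index[i] for i in range(len(by_index))]
    vocab = {c: i for i, c in enumerate(chars)}
    return chars, vocab
```

The inverse mapping (`vocab`) is what you need to feed characters into the model; the forward list (`chars`) turns sampled indices back into text.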


char-rnn-tensorflow

Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow.

Inspired by Andrej Karpathy's char-rnn.

Requirements

  • Tensorflow

Basic Usage

To train with default parameters on the tinyshakespeare corpus, run python train.py.
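Training on a character-level corpus begins by mapping each distinct character in the text to an integer index. A minimal sketch of that preprocessing step (the names below are illustrative, not necessarily those used in this repo's utils):

```python
from collections import Counter

def build_vocab(text):
    """Map each distinct character to an integer index, most frequent first."""
    counts = Counter(text)
    # Sort by descending frequency so common characters get small indices.
    chars = sorted(counts, key=counts.get, reverse=True)
    vocab = {c: i for i, c in enumerate(chars)}
    return chars, vocab

def encode(text, vocab):
    """Turn a string into a list of integer indices for the model."""
    return [vocab[c] for c in text]

chars, vocab = build_vocab("to be or not to be")
ids = encode("to be", vocab)
```

The model then trains on these index sequences, predicting the index of the next character from the indices seen so far.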

To sample from a checkpointed model, run python sample.py.
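At each step, character-level sampling typically draws the next character from the model's output probability distribution rather than always taking the most likely one. A cumulative-sum draw is one common way to do this; the sketch below is illustrative and uses a fixed dummy distribution in place of real model output:

```python
import random

def weighted_pick(probs, rng=random):
    """Draw an index from a discrete probability distribution
    via a cumulative sum over the probabilities."""
    r = rng.random()  # uniform in [0, 1)
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1  # guard against floating-point round-off

# Example: generate a few characters from a fixed (dummy) distribution.
chars = ["a", "b", "c"]
probs = [0.1, 0.3, 0.6]
rng = random.Random(0)
sample = "".join(chars[weighted_pick(probs, rng)] for _ in range(5))
```

In a real sampler, `probs` would be the softmax output of the network at each step, and the chosen character would be fed back in as the next input.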

Roadmap

  • Add explanatory comments
  • Expose more command-line arguments
  • Compare accuracy and performance with char-rnn
