Ch 9: Recurrent Neural Networks

  1. Introduction
  • We introduce Recurrent Neural Networks (RNNs) and show how they can consume a sequence and predict either a fixed target (categorical or numerical) or another sequence (sequence-to-sequence).
  2. Implementing an RNN Model for Spam Prediction
  • We create an RNN model to improve on our spam/ham SMS text predictions (a minimal sketch of this kind of sequence classifier appears after this list).
  3. Implementing an LSTM Model for Text Generation
  • We show how to implement an LSTM (Long Short-Term Memory) RNN for Shakespeare language generation (word-level vocabulary).
  4. Stacking Multiple LSTM Layers
  • We stack multiple LSTM layers to improve on our Shakespeare language generation (character-level vocabulary); see the stacking sketch after this list.
  5. Creating a Sequence-to-Sequence Translation Model (Seq2Seq)
  • We show how to use TensorFlow's sequence-to-sequence models to train an English-German translation model.
  6. Training a Siamese Similarity Measure
  • Here, we implement a Siamese RNN to predict the similarity of addresses and use it for record matching (a Siamese encoding sketch appears after this list). Using RNNs for record matching is very versatile: we do not have a fixed set of target categories, and the trained model can predict similarities across new addresses.
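
As a rough illustration of the spam-prediction recipe in item 2, here is a minimal sketch of an RNN sequence classifier, assuming TensorFlow 1.x-style APIs (on some 1.x minor versions the cell classes live under tf.contrib.rnn rather than tf.nn.rnn_cell). The sizes and names (x_data, y_target, etc.) are illustrative placeholders, not the notebook's actual values.

```python
import tensorflow as tf  # TensorFlow 1.x-style API assumed

# Illustrative hyperparameters (placeholders, not the notebook's actual values)
vocab_size, embedding_size, rnn_size, max_seq_len, num_classes = 10000, 50, 10, 25, 2

# Integer-encoded SMS texts padded/truncated to max_seq_len, plus spam/ham labels
x_data = tf.placeholder(tf.int32, [None, max_seq_len])
y_target = tf.placeholder(tf.int32, [None])

# Map token ids to dense embedding vectors
embedding_mat = tf.Variable(tf.random_uniform([vocab_size, embedding_size], -1.0, 1.0))
embedding_output = tf.nn.embedding_lookup(embedding_mat, x_data)

# Run a single recurrent layer over the sequence and keep the final output
cell = tf.nn.rnn_cell.BasicRNNCell(num_units=rnn_size)
outputs, _ = tf.nn.dynamic_rnn(cell, embedding_output, dtype=tf.float32)
last_output = outputs[:, -1, :]

# Project the final hidden output onto the spam/ham classes
logits = tf.layers.dense(last_output, num_classes)
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=y_target))
train_step = tf.train.RMSPropOptimizer(0.001).minimize(loss)
```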
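
For item 4, stacking LSTM layers amounts to wrapping several cells in a single composite cell that is then used exactly like one cell. A minimal sketch under the same TensorFlow 1.x assumptions (all sizes are illustrative):

```python
import tensorflow as tf  # TensorFlow 1.x-style API assumed

# Illustrative sizes only
rnn_size, num_layers, batch_size, seq_len = 128, 2, 100, 50
vocab_size, embedding_size = 8000, 128

x_data = tf.placeholder(tf.int32, [batch_size, seq_len])
embedding_mat = tf.Variable(tf.random_uniform([vocab_size, embedding_size], -1.0, 1.0))
rnn_inputs = tf.nn.embedding_lookup(embedding_mat, x_data)

# Build one LSTM cell per layer and stack them (each call creates a fresh cell object)
cells = [tf.nn.rnn_cell.BasicLSTMCell(rnn_size) for _ in range(num_layers)]
stacked_cell = tf.nn.rnn_cell.MultiRNNCell(cells)

# The stacked cell is used exactly like a single cell
initial_state = stacked_cell.zero_state(batch_size, tf.float32)
outputs, final_state = tf.nn.dynamic_rnn(
    stacked_cell, rnn_inputs, initial_state=initial_state, dtype=tf.float32)

# Per-timestep logits over the character vocabulary for language generation
logits = tf.layers.dense(outputs, vocab_size)
```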
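
For item 6, the defining feature of a Siamese model is that both inputs are encoded by the same network with shared weights, after which a similarity score is computed between the two encodings. The sketch below shows only that shared-encoder idea with a cosine similarity score; the notebook's actual architecture and training loss may differ, and the names (encode, addr1, addr2) are hypothetical.

```python
import tensorflow as tf  # TensorFlow 1.x-style API assumed

# Illustrative sizes; inputs are pre-embedded character sequences for two addresses
rnn_size, max_seq_len, embedding_size = 64, 20, 32
addr1 = tf.placeholder(tf.float32, [None, max_seq_len, embedding_size])
addr2 = tf.placeholder(tf.float32, [None, max_seq_len, embedding_size])

def encode(sequence, reuse):
    # Both branches share the same weights via variable-scope reuse (the "Siamese" part)
    with tf.variable_scope('siamese_rnn', reuse=reuse):
        cell = tf.nn.rnn_cell.BasicLSTMCell(rnn_size)
        outputs, _ = tf.nn.dynamic_rnn(cell, sequence, dtype=tf.float32)
        return outputs[:, -1, :]  # final hidden output as the address embedding

vec1 = encode(addr1, reuse=False)
vec2 = encode(addr2, reuse=True)

# Cosine similarity in [-1, 1]; higher means the addresses more likely match
similarity = tf.reduce_sum(
    tf.nn.l2_normalize(vec1, 1) * tf.nn.l2_normalize(vec2, 1), axis=1)
```

Because the score is a continuous similarity rather than a fixed set of class labels, the same trained encoder can be applied to address pairs never seen during training, which is what makes this approach useful for record matching.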