
TensorFlow Models

This repository contains machine learning models implemented in TensorFlow. The models are maintained by their respective authors. To propose a model for inclusion, please submit a pull request.

Currently, the models are compatible with TensorFlow 1.0 or later. If you are running TensorFlow 0.12 or earlier, please upgrade your installation.
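A quick way to check whether an installed TensorFlow meets the 1.0 minimum is to compare the version string exposed as `tf.__version__`. The comparison helper below is a minimal sketch, not part of TensorFlow's API; the hard-coded `(1, 0)` minimum simply mirrors the requirement stated above.

```python
def meets_minimum(version, minimum=(1, 0)):
    """Return True if a dotted version string is at least `minimum`.

    Only the major and minor components are compared, which is enough
    to distinguish the supported 1.x releases from 0.12 and earlier.
    """
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts >= minimum

# Typical usage, assuming TensorFlow is installed:
#   import tensorflow as tf
#   if not meets_minimum(tf.__version__):
#       print("Please upgrade TensorFlow to 1.0 or later.")
print(meets_minimum("1.3.0"))   # True
print(meets_minimum("0.12.1"))  # False
```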

Models

  • adversarial_crypto: protecting communications with adversarial neural cryptography.
  • adversarial_text: semi-supervised sequence learning with adversarial training.
  • attention_ocr: a model for real-world image text extraction.
  • audioset: models and supporting code for use with AudioSet.
  • autoencoder: various autoencoders.
  • cognitive_mapping_and_planning: implementation of a spatial memory based mapping and planning architecture for visual navigation.
  • compression: compressing and decompressing images using a pre-trained Residual GRU network.
  • differential_privacy: privacy-preserving student models from multiple teachers.
  • domain_adaptation: domain separation networks.
  • im2txt: image-to-text neural network for image captioning.
  • inception: deep convolutional networks for computer vision.
  • learning_to_remember_rare_events: a large-scale life-long memory module for use in deep learning.
  • lfads: sequential variational autoencoder for analyzing neuroscience data.
  • lm_1b: language modeling on the one billion word benchmark.
  • namignizer: recognize and generate names.
  • neural_gpu: highly parallel neural computer.
  • neural_programmer: neural network augmented with logic and mathematical operations.
  • next_frame_prediction: probabilistic future frame synthesis via cross convolutional networks.
  • object_detection: localizing and identifying multiple objects in a single image.
  • pcl_rl: code for several reinforcement learning algorithms, including Path Consistency Learning.
  • ptn: perspective transformer nets for 3D object reconstruction.
  • qa_kg: module networks for question answering on knowledge graphs.
  • real_nvp: density estimation using real-valued non-volume preserving (real NVP) transformations.
  • rebar: low-variance, unbiased gradient estimates for discrete latent variable models.
  • resnet: deep and wide residual networks.
  • skip_thoughts: recurrent neural network sentence-to-vector encoder.
  • slim: image classification models in TF-Slim.
  • street: identify the name of a street (in France) from an image using a Deep RNN.
  • swivel: the Swivel algorithm for generating word embeddings.
  • syntaxnet: neural models of natural language syntax.
  • textsum: sequence-to-sequence with attention model for text summarization.
  • transformer: spatial transformer network, which allows the spatial manipulation of data within the network.
  • tutorials: models described in the TensorFlow tutorials.
  • video_prediction: predicting future video frames with neural advection.
