
Commit 3ce5712

Links pointing to readme's header
1 parent 060a81f commit 3ce5712

File tree

12 files changed: +225 -225 lines

01_Introduction/readme.md

Lines changed: 8 additions & 8 deletions

@@ -2,19 +2,19 @@

 This chapter introduces the main objects and concepts in TensorFlow. We also introduce how to access the data for the rest of the book and provide additional resources for learning about TensorFlow.

-1. [How TensorFlow Works (General Outline of TF Algorithms)](01_How_TensorFlow_Works)
+1. [How TensorFlow Works (General Outline of TF Algorithms)](01_How_TensorFlow_Works#introduction-to-how-tensorflow-graphs-work)
 * Here we introduce TensorFlow and the general outline of how most TensorFlow algorithms work.
-2. [Creating and Using Tensors](02_Creating_and_Using_Tensors)
+2. [Creating and Using Tensors](02_Creating_and_Using_Tensors#creating-and-using-tensors)
 * How to create and initialize tensors in TensorFlow. We also depict how these operations appear in Tensorboard.
-3. [Using Variables and Placeholders](03_Using_Variables_and_Placeholders)
+3. [Using Variables and Placeholders](03_Using_Variables_and_Placeholders#variables-and-placeholders)
 * How to create and use variables and placeholders in TensorFlow. We also depict how these operations appear in Tensorboard.
-4. [Working with Matrices](04_Working_with_Matrices)
+4. [Working with Matrices](04_Working_with_Matrices#working-with-matrices)
 * Understanding how TensorFlow can work with matrices is crucial to understanding how the algorithms work.
-5. [Declaring Operations](05_Declaring_Operations)
+5. [Declaring Operations](05_Declaring_Operations#declaring-operations)
 * How to use various mathematical operations in TensorFlow.
-6. [Implementing Activation Functions](06_Implementing_Activation_Functions)
+6. [Implementing Activation Functions](06_Implementing_Activation_Functions#activation-functions)
 * Activation functions are unique functions that TensorFlow has built in for your use in algorithms.
-7. [Working with Data Sources](07_Working_with_Data_Sources)
+7. [Working with Data Sources](07_Working_with_Data_Sources#data-source-information)
 * Here we show how to access all the various required data sources in the book. There are also links describing the data sources and where they come from.
-8. [Additional Resources](08_Additional_Resources)
+8. [Additional Resources](08_Additional_Resources#additional-resources)
 * Mostly official resources and papers. The papers are TensorFlow papers or Deep Learning resources.
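To make the chapter's scope concrete, here is a minimal, hypothetical sketch (not taken from the book's code) of the objects this readme lists: a placeholder, a variable, an operation node, and a built-in activation function, assuming the TensorFlow 1.x graph-and-session API the book targets. Shapes and names are illustrative.

```python
import tensorflow as tf
import numpy as np

x = tf.placeholder(tf.float32, shape=[None, 3])   # placeholder: data fed at run time
A = tf.Variable(tf.random_normal(shape=[3, 1]))   # variable: state initialized/trained
y = tf.matmul(x, A)                               # operation node on the graph
act = tf.nn.relu(y)                               # built-in activation function

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Nothing computes until the session runs the graph with fed data.
    print(sess.run(act, feed_dict={x: np.random.rand(2, 3)}))
```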

02_TensorFlow_Way/readme.md

Lines changed: 8 additions & 8 deletions

@@ -2,19 +2,19 @@

 After we have established the basic objects and methods in TensorFlow, we now want to establish the components that make up TensorFlow algorithms. We start by introducing computational graphs, then move on to loss functions and back propagation. We end by creating a simple classifier and showing an example of evaluating regression and classification algorithms.

-1. [One Operation as a Computational Graph](01_Operations_as_a_Computational_Graph)
+1. [One Operation as a Computational Graph](01_Operations_as_a_Computational_Graph#operations-as-a-computational-graph)
 * We show how to create an operation on a computational graph and how to visualize it using Tensorboard.
-2. [Layering Nested Operations](02_Layering_Nested_Operations)
+2. [Layering Nested Operations](02_Layering_Nested_Operations#multiple-operations-on-a-computational-graph)
 * We show how to create multiple operations on a computational graph and how to visualize them using Tensorboard.
-3. [Working with Multiple Layers](03_Working_with_Multiple_Layers)
+3. [Working with Multiple Layers](03_Working_with_Multiple_Layers#working-with-multiple-layers)
 * Here we extend the usage of the computational graph to create multiple layers and show how they appear in Tensorboard.
-4. [Implementing Loss Functions](04_Implementing_Loss_Functions)
+4. [Implementing Loss Functions](04_Implementing_Loss_Functions#implementing-loss-functions)
 * In order to train a model, we must be able to evaluate how well it is doing. This is given by loss functions. We plot various loss functions and talk about the benefits and limitations of some.
-5. [Implementing Back Propagation](05_Implementing_Back_Propagation)
+5. [Implementing Back Propagation](05_Implementing_Back_Propagation#implementing-back-propagation)
 * Here we show how to use loss functions to iterate through data and back propagate errors for regression and classification.
-6. [Working with Stochastic and Batch Training](06_Working_with_Batch_and_Stochastic_Training)
+6. [Working with Stochastic and Batch Training](06_Working_with_Batch_and_Stochastic_Training#working-with-batch-and-stochastic-training)
 * TensorFlow makes it easy to use both batch and stochastic training. We show how to implement both and talk about the benefits and limitations of each.
-7. [Combining Everything Together](07_Combining_Everything_Together)
+7. [Combining Everything Together](07_Combining_Everything_Together#combining-everything-together)
 * We now combine everything we have learned and create a simple classifier.
-8. [Evaluating Models](08_Evaluating_Models)
+8. [Evaluating Models](08_Evaluating_Models#evaluating-models)
 * Any model is only as good as its evaluation. Here we show two examples: (1) evaluating a regression algorithm and (2) evaluating a classification algorithm.
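As a reader aid, here is a hypothetical sketch (not the book's code) of the chapter's core loop: an L2 loss minimized by gradient-descent back propagation with stochastic (batch-of-one) training, assuming TF 1.x. The toy data and learning rate are made up.

```python
import tensorflow as tf
import numpy as np

x_data = np.random.rand(100).astype(np.float32)
y_data = 3.0 * x_data + 1.0                       # target line: slope 3, intercept 1

x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)
a = tf.Variable(0.0)
b = tf.Variable(0.0)
loss = tf.reduce_mean(tf.square(a * x + b - y))   # L2 loss
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):                          # stochastic training: one point per step
        idx = np.random.choice(100)
        sess.run(train_step, feed_dict={x: x_data[idx], y: y_data[idx]})
    print(sess.run([a, b]))                       # should approach [3.0, 1.0]
```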

03_Linear_Regression/readme.md

Lines changed: 8 additions & 8 deletions

@@ -2,19 +2,19 @@

 Here we show how to implement various linear regression techniques in TensorFlow. The first two sections show how to solve linear regression with standard matrix methods in TensorFlow. The remaining six sections depict how to implement various types of regression using computational graphs.

-1. [Using the Matrix Inverse Method](01_Using_the_Matrix_Inverse_Method)
+1. [Using the Matrix Inverse Method](01_Using_the_Matrix_Inverse_Method#using-the-matrix-inverse-method)
 * How to solve a 2D regression with a matrix inverse in TensorFlow.
-2. [Implementing a Decomposition Method](02_Implementing_a_Decomposition_Method)
+2. [Implementing a Decomposition Method](02_Implementing_a_Decomposition_Method#using-the-cholesky-decomposition-method)
 * Solving a 2D linear regression with Cholesky decomposition.
-3. [Learning the TensorFlow Way of Linear Regression](03_TensorFlow_Way_of_Linear_Regression)
+3. [Learning the TensorFlow Way of Linear Regression](03_TensorFlow_Way_of_Linear_Regression#learning-the-tensorflow-way-of-regression)
 * Linear regression iterating through a computational graph with L2 loss.
-4. [Understanding Loss Functions in Linear Regression](04_Loss_Functions_in_Linear_Regressions)
+4. [Understanding Loss Functions in Linear Regression](04_Loss_Functions_in_Linear_Regressions#loss-functions-in-linear-regression)
 * L2 vs. L1 loss in linear regression. We talk about the benefits and limitations of both.
-5. [Implementing Deming Regression (Total Regression)](05_Implementing_Deming_Regression)
+5. [Implementing Deming Regression (Total Regression)](05_Implementing_Deming_Regression#implementing-deming-regression)
 * Deming (total) regression implemented in TensorFlow by changing the loss function.
-6. [Implementing Lasso and Ridge Regression](06_Implementing_Lasso_and_Ridge_Regression)
+6. [Implementing Lasso and Ridge Regression](06_Implementing_Lasso_and_Ridge_Regression#implementing-lasso-and-ridge-regression)
 * Lasso and Ridge regression are ways of regularizing the coefficients. We implement both in TensorFlow by changing the loss functions.
-7. [Implementing Elastic Net Regression](07_Implementing_Elasticnet_Regression)
+7. [Implementing Elastic Net Regression](07_Implementing_Elasticnet_Regression#implementing-elasticnet-regression)
 * Elastic net is a regularization technique that combines the L2 and L1 loss for coefficients. We show how to implement this in TensorFlow.
-8. [Implementing Logistic Regression](08_Implementing_Logistic_Regression)
+8. [Implementing Logistic Regression](08_Implementing_Logistic_Regression#implementing-logistic-regression)
 * We implement logistic regression by using an activation function in our computational graph.
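For context, the matrix inverse method in the first section solves the normal equations, w = (AᵀA)⁻¹Aᵀb. A minimal sketch of that computation in TF 1.x (not the book's script; the synthetic data is illustrative):

```python
import tensorflow as tf
import numpy as np

x_vals = np.linspace(0, 10, 100)
y_vals = x_vals + np.random.normal(0, 1, 100)     # noisy line with slope 1
A = np.column_stack((x_vals, np.ones(100)))       # design matrix [x, 1]
b = y_vals.reshape(-1, 1)

A_t = tf.constant(A, dtype=tf.float32)
b_t = tf.constant(b, dtype=tf.float32)
AtA = tf.matmul(tf.transpose(A_t), A_t)
# Normal-equations solution: (A^T A)^-1 A^T b
solution = tf.matmul(tf.matmul(tf.matrix_inverse(AtA), tf.transpose(A_t)), b_t)

with tf.Session() as sess:
    slope, intercept = sess.run(solution)
    print(slope, intercept)                        # near 1.0 and 0.0
```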

04_Support_Vector_Machines/readme.md

Lines changed: 6 additions & 6 deletions

@@ -2,15 +2,15 @@

 This chapter shows how to implement various SVM methods with TensorFlow. We first create a linear SVM and also show how it can be used for regression. We then introduce kernels (the Gaussian RBF kernel) and show how to use them to split up non-linear data. We finish with a multi-dimensional implementation of non-linear SVMs to work with multiple classes.

-1. [Introduction](01_Introduction)
+1. [Introduction](01_Introduction#support-vector-machine-introduction)
 * We introduce the concept of SVMs and how we will go about implementing them in the TensorFlow framework.
-2. [Working with Linear SVMs](02_Working_with_Linear_SVMs)
+2. [Working with Linear SVMs](02_Working_with_Linear_SVMs#working-with-linear-svms)
 * We create a linear SVM to separate I. setosa based on sepal length and petal width in the Iris data set.
-3. [Reduction to Linear Regression](03_Reduction_to_Linear_Regression)
+3. [Reduction to Linear Regression](03_Reduction_to_Linear_Regression#svm-reduction-to-linear-regression)
 * The heart of SVMs is separating classes with a line. We tweak the algorithm slightly to perform SVM regression.
-4. [Working with Kernels in TensorFlow](04_Working_with_Kernels)
+4. [Working with Kernels in TensorFlow](04_Working_with_Kernels#working-with-kernels)
 * In order to extend SVMs into non-linear data, we explain and show how to implement different kernels in TensorFlow.
-5. [Implementing Non-Linear SVMs](05_Implementing_Nonlinear_SVMs)
+5. [Implementing Non-Linear SVMs](05_Implementing_Nonlinear_SVMs#implementing-nonlinear-svms)
 * We use the Gaussian kernel (RBF) to separate non-linear classes.
-6. [Implementing Multi-class SVMs](06_Implementing_Multiclass_SVMs)
+6. [Implementing Multi-class SVMs](06_Implementing_Multiclass_SVMs#implementing-multiclass-svms)
 * SVMs are inherently binary predictors. We show how to extend them in a one-vs-all strategy in TensorFlow.
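One way a linear SVM can be trained on a computational graph is with a soft-margin (hinge plus L2 penalty) loss. The sketch below is a hypothetical illustration of that formulation in TF 1.x, not the book's implementation; labels are assumed to be in {-1, +1} and the constant C is arbitrary.

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 2])   # two features, e.g. sepal length, petal width
y = tf.placeholder(tf.float32, shape=[None, 1])   # labels in {-1, +1}
W = tf.Variable(tf.random_normal([2, 1]))
b = tf.Variable(tf.random_normal([1, 1]))
output = tf.subtract(tf.matmul(x, W), b)          # signed distance from the separating line

# Soft-margin loss: mean hinge loss plus an L2 penalty on the weights.
C = 0.01
hinge = tf.reduce_mean(tf.maximum(0.0, 1.0 - y * output))
loss = hinge + C * tf.reduce_sum(tf.square(W))
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
# Classification rule after training: sign(output).
```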

05_Nearest_Neighbor_Methods/readme.md

Lines changed: 6 additions & 6 deletions

@@ -2,15 +2,15 @@

 Nearest neighbor methods are a very popular class of ML algorithms. We show how to implement k-Nearest Neighbors, weighted k-Nearest Neighbors, and k-Nearest Neighbors with mixed distance functions. In this chapter we also show how to use the Levenshtein distance (edit distance) in TensorFlow and use it to calculate the distance between strings. We end this chapter by showing how to use k-Nearest Neighbors for categorical prediction on the MNIST handwritten digit recognition task.

-1. [Introduction](01_Introduction)
+1. [Introduction](01_Introduction#nearest-neighbor-methods-introduction)
 * We introduce the concepts and methods needed for performing k-Nearest Neighbors in TensorFlow.
-2. [Working with Nearest Neighbors](02_Working_with_Nearest_Neighbors)
+2. [Working with Nearest Neighbors](02_Working_with_Nearest_Neighbors#working-with-nearest-neighbors)
 * We create a nearest neighbor algorithm that tries to predict housing worth (regression).
-3. [Working with Text Based Distances](03_Working_with_Text_Distances)
+3. [Working with Text Based Distances](03_Working_with_Text_Distances#working-with-text-distances)
 * In order to use a distance function on text, we show how to use edit distances in TensorFlow.
-4. [Computing with Mixed Distance Functions](04_Computing_with_Mixed_Distance_Functions)
+4. [Computing with Mixed Distance Functions](04_Computing_with_Mixed_Distance_Functions#computing-with-mixed-distance-functions)
 * Here we implement scaling of the distance function by the standard deviation of the input features for k-Nearest Neighbors.
-5. [Using Address Matching](05_An_Address_Matching_Example)
+5. [Using Address Matching](05_An_Address_Matching_Example#an-address-matching-example)
 * We use a mixed distance function to match addresses: numerical distance for zip codes, and string edit distance for street names. The street names are allowed to have typos.
-6. [Using Nearest Neighbors for Image Recognition](06_Nearest_Neighbors_for_Image_Recognition)
+6. [Using Nearest Neighbors for Image Recognition](06_Nearest_Neighbors_for_Image_Recognition#nearest-neighbors-for-image-recognition)
 * The MNIST digit image collection is a great data set for illustrating how to perform k-Nearest Neighbors for an image classification task.
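A common TF 1.x idiom for k-Nearest Neighbors, sketched hypothetically below (not the book's code): compute an L1 distance matrix between test and training points, then recover the k smallest distances with tf.nn.top_k by negating them, since top_k returns the largest values. Feature count and k are illustrative.

```python
import tensorflow as tf

x_train = tf.placeholder(tf.float32, shape=[None, 4])
y_train = tf.placeholder(tf.float32, shape=[None])
x_test = tf.placeholder(tf.float32, shape=[None, 4])

# L1 distance between every test point and every training point: [num_test, num_train]
dist = tf.reduce_sum(
    tf.abs(tf.expand_dims(x_test, 1) - tf.expand_dims(x_train, 0)), axis=2)

# k nearest neighbors: top_k picks the largest values, so negate the distances.
k = 4
_, top_k_idx = tf.nn.top_k(tf.negative(dist), k=k)
prediction = tf.reduce_mean(tf.gather(y_train, top_k_idx), axis=1)  # regression: average targets
```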

06_Neural_Networks/readme.md

Lines changed: 8 additions & 8 deletions

@@ -2,19 +2,19 @@

 Neural networks are very important in machine learning and are growing in popularity due to major breakthroughs on previously unsolved problems. We start with 'shallow' neural networks, which are very powerful and can help us improve our prior ML algorithm results. We introduce the very basic NN unit, the operational gate, gradually add more and more to the network, and end by training a model to play tic-tac-toe.

-1. [Introduction](01_Introduction)
+1. [Introduction](01_Introduction#neural-networks-introduction)
 * We introduce the concept of neural networks and how TensorFlow is built to easily handle these algorithms.
-2. [Implementing Operational Gates](02_Implementing_an_Operational_Gate)
+2. [Implementing Operational Gates](02_Implementing_an_Operational_Gate#implementing-an-operational-gate)
 * We implement an operational gate with one operation. Then we show how to extend this to multiple nested operations.
-3. [Working with Gates and Activation Functions](03_Working_with_Activation_Functions)
+3. [Working with Gates and Activation Functions](03_Working_with_Activation_Functions#working-with-activation-functions)
 * Next we introduce activation functions on the gates and show how different activation functions operate.
-4. [Implementing a One Layer Neural Network](04_Single_Hidden_Layer_Network)
+4. [Implementing a One Layer Neural Network](04_Single_Hidden_Layer_Network#implementing-a-one-layer-neural-network)
 * We have all the pieces to start implementing our first neural network. We do so here with regression on the Iris data set.
-5. [Implementing Different Layers](05_Implementing_Different_Layers)
+5. [Implementing Different Layers](05_Implementing_Different_Layers#implementing-different-layers)
 * This section introduces the convolution layer and the max-pool layer. We show how to chain these together in a 1D and 2D example with fully connected layers as well.
-6. [Using Multi-layer Neural Networks](06_Using_Multiple_Layers)
+6. [Using Multi-layer Neural Networks](06_Using_Multiple_Layers#using-multiple-layers)
 * Here we show how to functionalize different layers and variables for a cleaner multi-layer neural network.
-7. [Improving Predictions of Linear Models](07_Improving_Linear_Regression)
+7. [Improving Predictions of Linear Models](07_Improving_Linear_Regression#improving-linear-regression)
 * We show how we can improve the convergence of our prior logistic regression with a set of hidden layers.
-8. [Learning to Play Tic-Tac-Toe](08_Learning_Tic_Tac_Toe)
+8. [Learning to Play Tic-Tac-Toe](08_Learning_Tic_Tac_Toe#learning-to-play-tic-tac-toe)
 * Given a set of tic-tac-toe boards and corresponding optimal moves, we train a neural network classification model to play. At the end of the script, we can attempt to play against the trained model.
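A one-hidden-layer network of the kind section 4 describes can be sketched in a few lines of TF 1.x. This is an illustrative sketch under assumed shapes (three input features, five hidden units, one regression output), not the chapter's script.

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 3])     # e.g. three input features
y = tf.placeholder(tf.float32, shape=[None, 1])

hidden_units = 5
W1 = tf.Variable(tf.random_normal([3, hidden_units]))
b1 = tf.Variable(tf.random_normal([hidden_units]))
W2 = tf.Variable(tf.random_normal([hidden_units, 1]))
b2 = tf.Variable(tf.random_normal([1]))

hidden = tf.nn.relu(tf.add(tf.matmul(x, W1), b1))   # single hidden layer with ReLU gates
output = tf.add(tf.matmul(hidden, W2), b2)          # linear output for regression

loss = tf.reduce_mean(tf.square(output - y))        # L2 loss
train_step = tf.train.GradientDescentOptimizer(0.005).minimize(loss)
```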
Lines changed: 7 additions & 7 deletions

@@ -1,16 +1,16 @@
 ## Ch 7: Natural Language Processing

-1. [Introduction](01_Introduction)
+1. [Introduction](01_Introduction#natural-language-processing-introduction)
 * We introduce methods for turning text into numerical vectors. We introduce the TensorFlow 'embedding' feature as well.
-2. [Working with Bag-of-Words](02_Working_with_Bag_of_Words)
+2. [Working with Bag-of-Words](02_Working_with_Bag_of_Words#working-with-bag-of-words)
 * Here we use TensorFlow to do a one-hot encoding of words called bag-of-words. We use this method and logistic regression to predict if a text message is spam or ham.
-3. [Implementing TF-IDF](03_Implementing_tf_idf)
+3. [Implementing TF-IDF](03_Implementing_tf_idf#implementing-tf-idf)
 * We implement Term Frequency - Inverse Document Frequency (TF-IDF) with a combination of Scikit-Learn and TensorFlow. We perform logistic regression on TF-IDF vectors to improve on our spam/ham text-message predictions.
-4. [Working with CBOW](04_Working_With_Skip_Gram_Embeddings)
+4. [Working with Skip-Gram](04_Working_With_Skip_Gram_Embeddings#working-with-skip-gram-embeddings)
 * Our first implementation of Word2Vec, called "skip-gram", on a movie review database.
-5. [Working with Skip-Gram](05_Working_With_CBOW_Embeddings)
+5. [Working with CBOW](05_Working_With_CBOW_Embeddings#working-with-cbow-embeddings)
 * Next, we implement a form of Word2Vec called "CBOW" (Continuous Bag of Words) on a movie review database. We also introduce a method for saving and loading word embeddings.
-6. [Implementing Word2Vec Example](06_Using_Word2Vec_Embeddings)
+6. [Implementing Word2Vec Example](06_Using_Word2Vec_Embeddings#using-word2vec-embeddings)
 * In this example, we use the previously saved CBOW word embeddings to improve on our TF-IDF logistic regression of movie review sentiment.
-7. [Performing Sentiment Analysis with Doc2Vec](07_Sentiment_Analysis_With_Doc2Vec)
+7. [Performing Sentiment Analysis with Doc2Vec](07_Sentiment_Analysis_With_Doc2Vec#sentiment-analysis-with-doc2vec)
 * Here, we introduce a Doc2Vec method (concatenation of document and word embeddings) to improve our logistic model of movie review sentiment.
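The TensorFlow 'embedding' feature mentioned in the introduction boils down to an embedding matrix plus a lookup. A minimal, hypothetical sketch in TF 1.x (vocabulary size and dimension are made up):

```python
import tensorflow as tf

vocab_size = 10000
embedding_dim = 50

# One trainable row vector per word in the vocabulary.
embeddings = tf.Variable(tf.random_uniform([vocab_size, embedding_dim], -1.0, 1.0))
word_ids = tf.placeholder(tf.int32, shape=[None])          # integer word indices
word_vectors = tf.nn.embedding_lookup(embeddings, word_ids)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    vecs = sess.run(word_vectors, feed_dict={word_ids: [42, 7, 42]})
    print(vecs.shape)   # (3, 50); identical ids map to identical rows
```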
Lines changed: 6 additions & 6 deletions

@@ -1,14 +1,14 @@
 ## Ch 8: Convolutional Neural Networks

-1. [Introduction](01_Intro_to_CNN)
+1. [Introduction](01_Intro_to_CNN#introduction-to-convolutional-neural-networks)
 * We introduce convolutional neural networks (CNNs) and how we can use them in TensorFlow.
-2. [Implementing a Simple CNN.](02_Intro_to_CNN_MNIST)
+2. [Implementing a Simple CNN.](02_Intro_to_CNN_MNIST#introduction-to-cnn-with-mnist)
 * Here, we show how to create a CNN architecture that performs well on the MNIST digit recognition task.
-3. [Implementing an Advanced CNN.](03_CNN_CIFAR10)
+3. [Implementing an Advanced CNN.](03_CNN_CIFAR10#cifar-10-cnn)
 * In this example, we show how to replicate an architecture for the CIFAR-10 image recognition task.
-4. [Retraining an Existing Architecture.](04_Retraining_Current_Architectures)
+4. [Retraining an Existing Architecture.](04_Retraining_Current_Architectures#retraining-fine-tuning-current-cnn-architectures)
 * We show how to download and set up the CIFAR-10 data for the [TensorFlow retraining/fine-tuning tutorial](https://github.com/tensorflow/models/tree/master/inception).
-5. [Using Stylenet/NeuralStyle.](05_Stylenet_NeuralStyle)
+5. [Using Stylenet/NeuralStyle.](05_Stylenet_NeuralStyle#stylenet--neural-style)
 * In this recipe, we show a basic implementation of using Stylenet or Neuralstyle.
-6. [Implementing Deep Dream.](06_Deepdream)
+6. [Implementing Deep Dream.](06_Deepdream#deepdream-in-tensorflow)
 * This script gives a line-by-line explanation of TensorFlow's deepdream tutorial, taken from [Deepdream on TensorFlow](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/tutorials/deepdream). Note that the code here is converted to Python 3.
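The convolution and max-pool building blocks behind these recipes look roughly like the following TF 1.x sketch, a hypothetical example on MNIST-sized input (filter count and sizes are illustrative, not taken from the book):

```python
import tensorflow as tf
import numpy as np

images = tf.placeholder(tf.float32, shape=[None, 28, 28, 1])   # MNIST-sized grayscale input
conv_filter = tf.Variable(tf.truncated_normal([4, 4, 1, 16], stddev=0.1))

conv = tf.nn.conv2d(images, conv_filter, strides=[1, 1, 1, 1], padding='SAME')
relu = tf.nn.relu(conv)                                        # non-linearity after convolution
pool = tf.nn.max_pool(relu, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(pool, feed_dict={images: np.zeros((1, 28, 28, 1), np.float32)})
    print(out.shape)   # (1, 14, 14, 16) after 2x2 max-pooling
```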
