Updates to quickstart main page, build model page based on feedback #31


Merged: 85 commits, Dec 10, 2020
Changes from all commits
ce31579
added basic devcontainer
sethjuarez Nov 2, 2020
a0ce8e1
added more complete devcontainer
sethjuarez Nov 2, 2020
390e283
added auto reload for in browser editing
sethjuarez Nov 2, 2020
6271276
moved pip packages for codespaces to the end
sethjuarez Nov 2, 2020
762ca9d
corrected auto preview
sethjuarez Nov 2, 2020
31cb408
added better ignore criteria for watch
sethjuarez Nov 2, 2020
97300e3
added quickstart and folder for sub-docs
sethjuarez Nov 3, 2020
5aa2c24
added staging build for reviews
sethjuarez Nov 4, 2020
1a023b4
corrected yaml error
sethjuarez Nov 4, 2020
3588209
another yaml correction
sethjuarez Nov 4, 2020
ebf5aa8
revised yaml
sethjuarez Nov 4, 2020
fa26938
changed job name
sethjuarez Nov 4, 2020
4cb5ce2
added tutorial files and work on data and tensors
cassiebreviu Nov 4, 2020
f86760a
fixing syntax issues
cassiebreviu Nov 5, 2020
fccac71
fix syntax
cassiebreviu Nov 5, 2020
d527c8c
renamed file
cassiebreviu Nov 5, 2020
031b9df
syntax fix
cassiebreviu Nov 5, 2020
2bcdc8c
Merge pull request #1 from cassieview/seth-blitz
sethjuarez Nov 5, 2020
bd782b0
Add tensors, autograd tutorials
shwars Nov 5, 2020
40a261c
more work on quickstart
cassiebreviu Nov 5, 2020
1fff4d1
updated optimization tutorial
cassiebreviu Nov 5, 2020
41e866a
added more links to quickstart main page
cassiebreviu Nov 5, 2020
86e5834
Merge pull request #2 from shwars/seth-blitz
sethjuarez Nov 5, 2020
ef51239
Merge branch 'seth-blitz' of https://github.com/cassieview/tutorials …
sethjuarez Nov 5, 2020
6cf32e3
Merge branch 'cassieview-seth-blitz' into seth-blitz
sethjuarez Nov 5, 2020
7d62395
merged content
sethjuarez Nov 5, 2020
ad410b6
removed links at bottom of page causing error
cassiebreviu Nov 5, 2020
9a740b6
Merge branch 'seth-blitz' into seth-blitz
cassiebreviu Nov 5, 2020
3e60a85
Merge pull request #4 from cassieview/seth-blitz
cassiebreviu Nov 5, 2020
c7d4159
fix data formatting
cassiebreviu Nov 6, 2020
d4e4cfd
Merge branch 'seth-blitz' of https://github.com/cassieview/tutorials …
cassiebreviu Nov 6, 2020
506eac2
fix on optimization formatting
cassiebreviu Nov 6, 2020
edffc4a
more updates on sections and formatting
cassiebreviu Nov 6, 2020
45767c2
Merge pull request #5 from cassieview/seth-blitz
cassiebreviu Nov 6, 2020
72f5848
work on autograd
cassiebreviu Nov 6, 2020
486bf5b
more format work
cassiebreviu Nov 6, 2020
675bfe5
Merge pull request #6 from cassieview/seth-blitz
cassiebreviu Nov 6, 2020
2c72883
add images
cassiebreviu Nov 6, 2020
8f249aa
add images to data and optimization
cassiebreviu Nov 6, 2020
54a5bfb
Merge pull request #7 from cassieview/seth-blitz
cassiebreviu Nov 6, 2020
911a07a
moved images fixed links
cassiebreviu Nov 9, 2020
783b83c
Merge pull request #8 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
c881fe3
more work on quickstart
cassiebreviu Nov 9, 2020
2a1d814
Merge pull request #9 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
8592289
more updates to make quickstart page
cassiebreviu Nov 9, 2020
f610061
Merge pull request #10 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
89d5902
fix subsections on main quickstart
cassiebreviu Nov 9, 2020
52285c6
Merge pull request #11 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
ad5023a
updates to main, data and model
cassiebreviu Nov 9, 2020
67bba76
Merge pull request #12 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
5175725
format updates
cassiebreviu Nov 9, 2020
dc8bc32
Merge pull request #13 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
3bd01bb
fix link, fix format, fix stuff
cassiebreviu Nov 9, 2020
e49761a
Merge pull request #14 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
374f8cf
more formatting
cassiebreviu Nov 9, 2020
9404252
Merge pull request #15 from cassieview/seth-blitz
cassiebreviu Nov 9, 2020
138420a
fix image and format
cassiebreviu Nov 10, 2020
3a34561
Merge pull request #16 from cassieview/seth-blitz
cassiebreviu Nov 10, 2020
b01c6c7
tensor updates
cassiebreviu Nov 10, 2020
8e06cf1
Merge pull request #17 from cassieview/seth-blitz
cassiebreviu Nov 10, 2020
ee60f9f
format fixes
cassiebreviu Nov 10, 2020
99f3613
more formatting
cassiebreviu Nov 10, 2020
398a0ed
Merge pull request #18 from cassieview/seth-blitz
cassiebreviu Nov 10, 2020
43716fb
fix note formatting and optimzation text
cassiebreviu Nov 10, 2020
7e2881d
img rename optmization formatting
cassiebreviu Nov 10, 2020
5485db3
Merge pull request #20 from cassieview/seth-blitz
cassiebreviu Nov 10, 2020
0823df8
updated links, next text, model format
cassiebreviu Nov 10, 2020
316a91b
Merge pull request #21 from cassieview/seth-blitz
cassiebreviu Nov 10, 2020
6319625
fix transforms and autograd link
cassiebreviu Nov 11, 2020
91249f2
Merge pull request #22 from cassieview/seth-blitz
cassiebreviu Nov 11, 2020
d18db14
fixers gonna fix
cassiebreviu Nov 11, 2020
9fb1450
Fixed formatting, move autograd before optimization
shwars Nov 11, 2020
feeb669
Merge pull request #23 from shwars/seth-blitz
cassiebreviu Nov 11, 2020
5bb6be1
Merge remote-tracking branch 'upstream/seth-blitz' into seth-blitz
cassiebreviu Nov 11, 2020
ab22cdc
Merge branch 'seth-blitz' into seth-blitz
cassiebreviu Dec 7, 2020
ed51b90
Merge branch 'seth-blitz' of https://github.com/sethjuarez/tutorials …
cassiebreviu Dec 7, 2020
45a02c6
Added more detail to the intro of the quickstart
cassiebreviu Dec 7, 2020
ac96721
primitive dataset text update
cassiebreviu Dec 8, 2020
0a3c261
fix build model
cassiebreviu Dec 8, 2020
ab9c2ce
updated quickstart to class model
cassiebreviu Dec 9, 2020
202bda0
Merge branch 'seth-blitz' into seth-blitz
cassiebreviu Dec 9, 2020
31efd4b
updates to load class model, build model page work
cassiebreviu Dec 10, 2020
b36cdad
Merge branch 'seth-blitz' of https://github.com/cassieview/tutorials …
cassiebreviu Dec 10, 2020
0c595c0
more work on build model
cassiebreviu Dec 10, 2020
48f0315
Merge branch 'seth-blitz' into seth-blitz
cassiebreviu Dec 10, 2020
96 changes: 66 additions & 30 deletions beginner_source/quickstart/build_model_tutorial.py
@@ -1,25 +1,22 @@
"""
Build Model Tutorial
=======================================
"""

###############################################
# The data has been loaded and transformed we can now build the model.
# We will leverage `torch.nn <https://pytorch.org/docs/stable/nn.html>`_
The data has been loaded and transformed; we can now build the model.
We will leverage predefined layers from `torch.nn <https://pytorch.org/docs/stable/nn.html>`_ that simplify our code.

# predefined layers that PyTorch has that can simplify our code.
#
# In the below example, for our FashionMNIT image dataset, we are using a `Sequential`
# container from class `torch.nn. Sequential <https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html>`_
# that allows us to define the model layers inline.
# The neural network modules layers will be added to it in the order they are passed in.
#
# Another way to bulid this model is with a class
# using `nn.Module <https://pytorch.org/docs/stable/generated/torch.nn.Module.html)>`_ This gives us more flexibility, because
# we can construct layers of any complexity, including the ones with shared weights.
#
# Lets break down the steps to build this model below
#
In the below example, for our FashionMNIST image dataset, we use a ``Sequential``
container from the class `torch.nn.Sequential <https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html>`_
that allows us to define the model layers inline. In the ``Sequential`` inline model-building format, the ``forward()``
method is created for you, and the modules you add are passed in as a list or dictionary in the order they are defined.

Another way to build this model is with a class
using `nn.Module <https://pytorch.org/docs/stable/generated/torch.nn.Module.html>`_.
A big plus of using a class that inherits from ``nn.Module`` is better parameter management across all nested submodules.
This also gives us more flexibility, because we can construct layers of any complexity, including ones with shared weights.

Let's break down the steps to build this model below.
"""

##########################################
# Inline nn.Sequential Example:
@@ -86,6 +83,15 @@ def forward(self, x):
device = 'cuda' if torch.cuda.is_available() else 'cpu'
print('Using {} device'.format(device))

##############################################
# __init__
# -------------------------
#
# The ``__init__`` function is defined on a class that inherits from ``nn.Module``,
# the base class for building neural network modules. This function defines the layers
# in your neural network and initializes the modules that will be called in the ``forward`` function.
#
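The ``__init__`` pattern described above can be sketched as a minimal class. This is a hedged sketch, not the tutorial's exact file: the layer names follow the ``forward`` excerpt shown later in this diff, and the output size assumes FashionMNIST's 10 classes.

```python
import torch
from torch import nn
import torch.nn.functional as F

class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        # layers are declared here, then called in forward()
        self.flatten = nn.Flatten()
        self.layer1 = nn.Linear(28 * 28, 512)
        self.layer2 = nn.Linear(512, 512)
        self.output = nn.Linear(512, 10)  # 10 classes in FashionMNIST

    def forward(self, x):
        x = self.flatten(x)
        x = F.relu(self.layer1(x))
        x = F.relu(self.layer2(x))
        return F.softmax(self.output(x), dim=1)

model = NeuralNetwork()
print(model)
```

Declaring the layers in ``__init__`` registers their parameters with the module, which is what makes ``model.parameters()`` and ``state_dict()`` work later.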

##############################################
# The Model Module Layers
# -------------------------
@@ -94,30 +100,29 @@ def forward(self, x):
#

##################################################
# `nn.Flatten <https://pytorch.org/docs/stable/generated/torch.nn.Flatten.html>`_ to reduce tensor dimensions to one.
# `nn.Flatten <https://pytorch.org/docs/stable/generated/torch.nn.Flatten.html>`_
# -----------------------------------------------
#
# First we call ``nn.Flatten`` to collapse the 28x28 image dimensions into a single dimension of 784 values (the first dimension is kept).
#
# From the docs:
# ``torch.nn.Flatten(start_dim: int = 1, end_dim: int = -1)``
# Here is an example using one of the training_data set items:
tensor = training_data[0][0]
print(tensor.size())

# Output: torch.Size([1, 28, 28])

model = nn.Sequential(
    nn.Flatten()
)
flattened_tensor = model(tensor)
print(flattened_tensor.size())

# Output: torch.Size([1, 784])

##############################################
# `nn.Linear <https://pytorch.org/docs/stable/generated/torch.nn.Linear.html>`_ to add a linear layer to the model.
# -------------------------------
#
# Now that we have flattened our tensor dimension we will apply a linear layer transform that will calculate/learn the weights and the bias.
# Now that we have flattened our tensor dimensions, we will apply a linear
# transformation that learns the weights and the bias.
#

# From the docs:
@@ -134,17 +139,48 @@ def forward(self, x):
output = model(input)
print(output.size())


# Output:
# torch.Size([1, 28, 28])
# torch.Size([1, 512])
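The collapsed portion of this hunk hides how ``input`` is set up; a self-contained sketch of the flatten-then-linear step, using the sizes from the tutorial and a random stand-in image, might look like:

```python
import torch
from torch import nn

input = torch.rand(1, 28, 28)     # one fake 28x28 image
model = nn.Sequential(
    nn.Flatten(),                 # (1, 28, 28) -> (1, 784)
    nn.Linear(28 * 28, 512),      # learnable weights and bias
)
output = model(input)
print(input.size())   # torch.Size([1, 28, 28])
print(output.size())  # torch.Size([1, 512])
```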

#################################################
# Activation Functions
# -------------------------
#
# - `nn.ReLU <https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html)>`_ Activation
# - `nn.Softmax <https://pytorch.org/docs/stable/generated/torch.nn.Softmax.html>`_ Activation
# After the first two linear layers we call the `nn.ReLU <https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html>`_
# activation function. Then, after the third linear layer, we call the `nn.Softmax <https://pytorch.org/docs/stable/generated/torch.nn.Softmax.html>`_
# activation to rescale the outputs to values between 0 and 1 that sum to one.
#

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28*28, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, len(classes)),
    nn.Softmax(dim=1)
).to(device)

print(model)
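As a quick sanity check on the softmax rescaling described above (a sketch: ``len(classes)`` is written as a literal 10 here, FashionMNIST's class count, and the input is a random stand-in batch):

```python
import torch
from torch import nn

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 10),   # stand-in for len(classes)
    nn.Softmax(dim=1),
)
probs = model(torch.rand(4, 1, 28, 28))
# every row is non-negative and sums to 1
print(probs.sum(dim=1))
```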


###################################################
# Forward Function
# --------------------------------
#
# In the class implementation of the neural network we define a ``forward`` function.
# We then instantiate the ``NeuralNetwork`` class and assign it to the device. When
# training the model, we call ``model`` and pass the data (``x``) into the ``forward``
# function and through each layer of our network.
#
#
def forward(self, x):
    x = self.flatten(x)
    x = F.relu(self.layer1(x))
    x = F.relu(self.layer2(x))
    x = self.output(x)
    return F.softmax(x, dim=1)

model = NeuralNetwork().to(device)


################################################
# In the next section you will learn how to train the model and about the optimization loop for this example.
#
# Next: Learn more about how the `optimization loop works with this example <optimization_tutorial.html>`_.
#
11 changes: 1 addition & 10 deletions beginner_source/quickstart/save_load_run_tutorial.py
@@ -42,16 +42,7 @@
# These two steps are illustrated here:

# recreate model
loaded_model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28*28, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, len(classes)),
    nn.Softmax(dim=1)
)

loaded_model = NeuralNetwork()
# hydrate state dictionary
loaded_model.load_state_dict(torch.load('model.pth'))
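The recreate-then-hydrate flow in this hunk can be sketched end to end (hedged: the ``NeuralNetwork`` class below is a stand-in matching the layer names used elsewhere in this diff, and the temporary file path is illustrative):

```python
import os
import tempfile
import torch
from torch import nn
import torch.nn.functional as F

class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.layer1 = nn.Linear(28 * 28, 512)
        self.layer2 = nn.Linear(512, 512)
        self.output = nn.Linear(512, 10)

    def forward(self, x):
        x = self.flatten(x)
        x = F.relu(self.layer1(x))
        x = F.relu(self.layer2(x))
        return F.softmax(self.output(x), dim=1)

model = NeuralNetwork()
path = os.path.join(tempfile.gettempdir(), 'model.pth')
torch.save(model.state_dict(), path)            # save weights only

loaded_model = NeuralNetwork()                  # recreate the architecture
loaded_model.load_state_dict(torch.load(path))  # hydrate state dictionary
loaded_model.eval()                             # switch to inference mode
```

Saving only the ``state_dict`` is why the class definition must be available at load time: the weights carry no architecture.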

10 changes: 1 addition & 9 deletions beginner_source/quickstart_tutorial.py
@@ -178,15 +178,7 @@ def test(dataloader, model):
# inference). Check out more details on `saving, loading and running models with PyTorch <quickstart/save_load_run_tutorial.html>`_
#

loaded_model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28*28, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, len(classes)),
    nn.Softmax(dim=1)
)
loaded_model = NeuralNetwork()

loaded_model.load_state_dict(torch.load('model.pth'))
loaded_model.eval()