|
1 | 1 | """ |
2 | | -Build Model Tutorial |
3 | | -============================ |
| 2 | +Build the Neural Network |
| 3 | +======================== |
4 | 4 | """ |
5 | 5 |
|
| 6 | +################################################################# |
| 7 | +# Get Started Building the Model |
| 8 | +# ------------------------------ |
| 9 | +# |
| 10 | +# The data has been loaded and transformed; we can now build the model. |
| 11 | +# We will leverage the predefined layers in `torch.nn <https://pytorch.org/docs/stable/nn.html>`_ to simplify our code. |
| 12 | +# |
| 13 | +# In the example below, for our FashionMNIST image dataset, we use a `Sequential` |
| 14 | +# container from the class `torch.nn.Sequential <https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html>`_ |
| 15 | +# that allows us to define the model layers inline. In the ``Sequential`` inline model-building format, the ``forward()`` |
| 16 | +# method is created for you, and the modules you add are applied in the order in which they are defined. |
| 17 | +# |
| 18 | +# Another way to build this model is with a class |
| 19 | +# using `nn.Module <https://pytorch.org/docs/stable/generated/torch.nn.Module.html>`_. |
| 20 | +# A big plus of using a class that inherits from ``nn.Module`` is better parameter management across all nested submodules. |
| 21 | +# This gives us more flexibility, because we can construct layers of any complexity, including ones with shared weights. |
| 22 | +# |
| 23 | +# Let's break down the steps to build this model below. |
| 24 | +# |
| 25 | + |
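The inline approach described in the added comments can be sketched as follows. This is a minimal illustration, not the tutorial's exact code; the 512-unit hidden layer and the 28x28 input size are assumptions based on the FashionMNIST setup mentioned above.

```python
import torch
from torch import nn

# Inline model definition with nn.Sequential: forward() is generated
# automatically, and the modules run in the order they are listed.
model = nn.Sequential(
    nn.Flatten(),               # (N, 28, 28) -> (N, 784)
    nn.Linear(28 * 28, 512),    # hidden size of 512 is illustrative
    nn.ReLU(),
    nn.Linear(512, 10),         # 10 FashionMNIST classes
)

# A batch containing one random "image" passes straight through the container.
logits = model(torch.rand(1, 28, 28))
print(logits.shape)             # torch.Size([1, 10])
```

Because ``forward()`` is implicit here, this style trades flexibility for brevity compared to the class-based approach described next.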
6 | 26 | ########################################## |
6 | 26 | # The data has been loaded and transformed; we can now build the model. |
7 | 27 | # We will leverage the predefined layers in `torch.nn <https://pytorch.org/docs/stable/nn.html>`_ to simplify our code. |
@@ -109,6 +129,7 @@ def forward(self, x): |
109 | 129 | # |
110 | 130 | # From the docs: |
111 | 131 | # ``torch.nn.Flatten(start_dim: int = 1, end_dim: int = -1)`` |
| 132 | +# |
112 | 133 | # Here is an example using one of the training_data set items: |
113 | 134 | tensor = training_data[0][0] |
114 | 135 | print(tensor.size()) |
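To illustrate the ``start_dim``/``end_dim`` defaults quoted from the docs above, here is a small standalone sketch using a random tensor instead of a ``training_data`` item (the shapes are assumptions matching the 28x28 images used in this tutorial):

```python
import torch
from torch import nn

# By default Flatten keeps dim 0 (the batch dimension) and collapses
# the rest, i.e. start_dim=1, end_dim=-1.
flatten = nn.Flatten()
batch = torch.rand(3, 28, 28)        # three 28x28 "images"
flat = flatten(batch)
print(flat.shape)                    # torch.Size([3, 784])

# Flattening everything, including the batch dimension:
flat_all = nn.Flatten(start_dim=0)(batch)
print(flat_all.shape)                # torch.Size([2352])
```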
@@ -168,7 +189,7 @@ def forward(self, x): |
168 | 189 | # -------------------------------- |
169 | 190 | # |
170 | 191 | # In the class implementation of the neural network we define a ``forward`` function. |
171 | | -# Then call the ``NeuralNetwork``class and assign the device. When training the model we will call ``model`` |
| 192 | +# Then call the ``NeuralNetwork`` class and assign the device. When training the model we will call ``model`` |
172 | 193 | # and pass the data (x) into the forward function and through each layer of our network. |
173 | 194 | # |
174 | 195 | # |
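The class-based flow described in this hunk — define ``forward``, instantiate ``NeuralNetwork``, assign the device, then call ``model`` on the data ``x`` — can be sketched as below. The layer sizes mirror the inline example and are illustrative, not necessarily the tutorial's exact architecture.

```python
import torch
from torch import nn

class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.layers = nn.Sequential(
            nn.Linear(28 * 28, 512),
            nn.ReLU(),
            nn.Linear(512, 10),
        )

    def forward(self, x):
        # Invoked implicitly when we call model(x); each layer is
        # applied to x in turn.
        x = self.flatten(x)
        return self.layers(x)

# Assign the device, then move the model's parameters onto it.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = NeuralNetwork().to(device)

x = torch.rand(1, 28, 28, device=device)
logits = model(x)          # calls forward(x) under the hood
print(logits.shape)        # torch.Size([1, 10])
```

Calling ``model(x)`` rather than ``model.forward(x)`` directly is the idiomatic form, since ``nn.Module.__call__`` also runs registered hooks.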
|