# Hyperparameters are adjustable parameters that let you control the model optimization process.
###############################################
# The data has been loaded and transformed; we can now build the model.
# We will leverage `torch.nn <https://pytorch.org/docs/stable/nn.html>`_,
# the predefined layers that PyTorch provides to simplify our code.
#
# In the example below, for our FashionMNIST image dataset, we are using a
# ``Sequential`` container from the class
# `torch.nn.Sequential <https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html>`_
# that allows us to define the model layers inline.
# The neural network modules (layers) are added to it in the order they are passed in.
#
# Another way to build this model is with a class using
# `nn.Module <https://pytorch.org/docs/stable/generated/torch.nn.Module.html>`_.
# This gives us more flexibility, because we can construct layers of any
# complexity, including ones with shared weights.
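# As a sketch of both approaches, the same small classifier for 28x28
# FashionMNIST images can be written with ``nn.Sequential`` or as an
# ``nn.Module`` subclass. The layer sizes here are our own illustrative
# choices, not values prescribed by the tutorial.

```python
import torch
from torch import nn

# Sequential container: layers run in the order they are passed in.
seq_model = nn.Sequential(
    nn.Flatten(),            # (N, 28, 28) -> (N, 784)
    nn.Linear(28 * 28, 512),
    nn.ReLU(),
    nn.Linear(512, 10),      # 10 clothing classes
)

# The same network as an nn.Module subclass, which allows arbitrary
# control flow and layer reuse inside forward().
class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.layers = nn.Sequential(
            nn.Linear(28 * 28, 512),
            nn.ReLU(),
            nn.Linear(512, 10),
        )

    def forward(self, x):
        return self.layers(self.flatten(x))

logits = NeuralNetwork()(torch.rand(1, 28, 28))
```

# Either form produces the same architecture; the class form simply gives
# you a place to put custom logic.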
#
# With neural networks, for example, you can configure:
#
# - **Number of Epochs** - the number of times to iterate over the dataset to update model parameters
# - **Batch Size** - the number of samples in the dataset to evaluate before updating model parameters
# - **Cost Function** - the method used to evaluate the model on a data sample in order to update the model parameters
# - **Learning Rate** - how much to update model parameters at each batch/epoch; set it too large and updates overshoot, too small and learning is very slow
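# These hyperparameters can be sketched as plain Python values. The numbers
# below are illustrative assumptions, and ``nn.CrossEntropyLoss`` is one
# common choice of cost function for multi-class labels:

```python
import torch
from torch import nn

epochs = 5            # number of full passes over the training dataset
batch_size = 64       # samples evaluated before each parameter update
learning_rate = 1e-3  # step size applied at each update

# The cost function compares predictions against target labels:
loss_fn = nn.CrossEntropyLoss()
logits = torch.randn(batch_size, 10)           # stand-in model outputs
targets = torch.randint(0, 10, (batch_size,))  # stand-in class labels
loss = loss_fn(logits, targets)
```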
#
# Let's break down the steps used to train and evaluate this model:
#
# 1. **The Train Loop** - the core loop that iterates over all the epochs, updating model parameters
# 2. **The Validation Loop** - validates loss after each parameter update; it can be used to gauge hyperparameter performance and adjust them for the next run
# 3. **The Test Loop** - evaluates the model's performance after each epoch on traditional metrics, showing how well it generalizes from the train and validation datasets to a test dataset it has never seen before
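# A minimal sketch of the three loops, using tiny synthetic batches in place
# of FashionMNIST so it runs standalone. The model, batch sizes, and epoch
# count here are illustrative assumptions, not the tutorial's settings.

```python
import torch
from torch import nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

def make_batches(n_batches, batch_size=8):
    # Random 28x28 "images" with random class labels, standing in for real data.
    return [(torch.rand(batch_size, 28, 28), torch.randint(0, 10, (batch_size,)))
            for _ in range(n_batches)]

train_data, val_data, test_data = make_batches(4), make_batches(1), make_batches(1)

for epoch in range(2):                # 1. Train loop: iterate over epochs
    model.train()
    for X, y in train_data:
        loss = loss_fn(model(X), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    model.eval()                      # 2. Validation loop: gauge hyperparameters
    with torch.no_grad():
        val_loss = sum(loss_fn(model(X), y).item() for X, y in val_data)

model.eval()                          # 3. Test loop: final generalization check
with torch.no_grad():
    correct = sum((model(X).argmax(1) == y).sum().item() for X, y in test_data)
```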
# The standard method for optimization is called Stochastic Gradient Descent; to learn more, check out this excellent video series by `3blue1brown <https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi>`_.
# There are many optimizers and variations of this method in PyTorch, such as Adam and RMSprop, that work better for different kinds of models. They are outside the scope of this Blitz, but you can check out the full list of optimizers `here <https://pytorch.org/docs/stable/optim.html>`_.
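# Optimizers in ``torch.optim`` are constructed over the same parameter list
# and are drop-in replacements for one another. A small sketch (the model and
# learning rate here are illustrative):

```python
import torch
from torch import nn

model = nn.Linear(10, 2)

# The baseline, plus two variants mentioned above:
sgd = torch.optim.SGD(model.parameters(), lr=1e-3)
adam = torch.optim.Adam(model.parameters(), lr=1e-3)
rmsprop = torch.optim.RMSprop(model.parameters(), lr=1e-3)
```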
The basic machine learning concepts in any framework include working with data, creating models, optimizing parameters, and saving and loading models. In this PyTorch Quickstart we will walk through these concepts and how to apply them with PyTorch, with links to learn more at each step. The dataset we will be using is FashionMNIST, a clothing-image dataset; with it we will train a model to predict whether an image belongs to one of the following classes: T-shirt/top, Trouser, Pullover, Dress, Coat, Sandal, Shirt, Sneaker, Bag, or Ankle boot. Let's get started!
# PyTorch has two basic data primitives: ``Dataset`` and ``DataLoader``.
# The ``Dataset`` objects in ``torchvision.datasets`` include a ``transform``
# mechanism to modify data in place. Below is an example of how to load data
# from the PyTorch open datasets and transform it to a normalized tensor.
#
# This example uses ``torchvision.datasets``, whose datasets subclass the
# primitive ``torch.utils.data.Dataset``. Note that the primitive ``Dataset``
# does not have the built-in ``transform`` parameter that the datasets in
# ``torchvision.datasets`` provide.
#
# To see more examples and details of how to work with Tensors, Datasets,
# DataLoaders, and Transforms in PyTorch, check out these resources: