Merge pull request #1068 from pritesh2000/gram-1/04
04_pytorch_custom_datasets.ipynb
mrdbourke authored Sep 6, 2024
2 parents 91eb2e0 + 545a460 commit 38e6e3e
Showing 1 changed file with 12 additions and 12 deletions.
@@ -166,7 +166,7 @@
"source": [
"## 1. Get data\n",
"\n",
-"First thing's first we need some data.\n",
+"First things first we need some data.\n",
"\n",
"And like any good cooking show, some data has already been prepared for us.\n",
"\n",
@@ -270,7 +270,7 @@
"\n",
"In our case, we have images of pizza, steak and sushi in standard image classification format.\n",
"\n",
-"Image classification format contains separate classes of images in seperate directories titled with a particular class name.\n",
+"Image classification format contains separate classes of images in separate directories titled with a particular class name.\n",
"\n",
"For example, all images of `pizza` are contained in the `pizza/` directory.\n",
"\n",
@@ -973,7 +973,7 @@
"\n",
"We'll do so using [`torch.utils.data.DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader).\n",
"\n",
-"Turning our `Dataset`'s into `DataLoader`'s makes them iterable so a model can go through learn the relationships between samples and targets (features and labels).\n",
+"Turning our `Dataset`'s into `DataLoader`'s makes them iterable so a model can go through and learn the relationships between samples and targets (features and labels).\n",
"\n",
"To keep things simple, we'll use a `batch_size=1` and `num_workers=1`.\n",
"\n",
@@ -1759,7 +1759,7 @@
"source": [
"They sure do!\n",
"\n",
-"Let's now take a lot at some other forms of data transforms."
+"Let's now take a look at some other forms of data transforms."
]
},
{
@@ -1778,7 +1778,7 @@
"\n",
"Or cropping it or randomly erasing a portion or randomly rotating them.\n",
"\n",
-"Doing this kinds of transforms is often referred to as **data augmentation**.\n",
+"Doing these kinds of transforms is often referred to as **data augmentation**.\n",
"\n",
"**Data augmentation** is the process of altering your data in such a way that you *artificially* increase the diversity of your training set.\n",
"\n",
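One augmentation from the list in this hunk, a random horizontal flip, can be sketched with plain tensor ops. `torchvision.transforms` provides ready-made versions; this hand-rolled helper is only illustrative:

```python
import torch

def random_horizontal_flip(img: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    """Flip a CHW image tensor left-right with probability p (illustrative helper)."""
    if torch.rand(1).item() < p:
        return torch.flip(img, dims=[-1])  # flip along the width dimension
    return img

img = torch.arange(3 * 2 * 2, dtype=torch.float32).reshape(3, 2, 2)
flipped = random_horizontal_flip(img, p=1.0)    # p=1.0 -> always flips
unchanged = random_horizontal_flip(img, p=0.0)  # p=0.0 -> never flips
```

Applying such transforms only at training time artificially increases the diversity of the training set without collecting new images.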
@@ -2090,7 +2090,7 @@
" self.classifier = nn.Sequential(\n",
" nn.Flatten(),\n",
" # Where did this in_features shape come from? \n",
-"    # It's because each layer of our network compresses and changes the shape of our inputs data.\n",
+"    # It's because each layer of our network compresses and changes the shape of our input data.\n",
" nn.Linear(in_features=hidden_units*16*16,\n",
" out_features=output_shape)\n",
" )\n",
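The `in_features=hidden_units*16*16` value in this hunk can be checked with plain arithmetic, assuming 64x64 input images and two blocks that each halve the spatial dimensions via a 2x2 max pool (the shapes are an assumption from context, not stated in the hunk itself):

```python
# Each 2x2 max pool halves height and width: 64 -> 32 -> 16.
hidden_units = 10
height = width = 64
for _ in range(2):  # two conv blocks, each ending in MaxPool2d(kernel_size=2)
    height //= 2
    width //= 2

# nn.Flatten() collapses (hidden_units, 16, 16) into one vector per sample.
in_features = hidden_units * height * width
print(in_features)  # 2560
```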
@@ -2361,7 +2361,7 @@
" # 5. Optimizer step\n",
" optimizer.step()\n",
"\n",
-"    # Calculate and accumulate accuracy metric across all batches\n",
+"    # Calculate and accumulate accuracy metrics across all batches\n",
" y_pred_class = torch.argmax(torch.softmax(y_pred, dim=1), dim=1)\n",
" train_acc += (y_pred_class == y).sum().item()/len(y_pred)\n",
"\n",
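The accuracy accumulation in this hunk can be exercised on a tiny hand-made batch (logits and labels below are made up; the last prediction is wrong on purpose):

```python
import torch

# Fake logits for a batch of 4 samples over 3 classes, plus true labels.
y_pred = torch.tensor([[2.0, 0.1, 0.3],
                       [0.2, 1.5, 0.1],
                       [0.1, 0.2, 3.0],
                       [1.2, 0.3, 0.4]])
y = torch.tensor([0, 1, 2, 1])  # last label deliberately mismatched

# softmax -> probabilities, argmax -> predicted class index per sample.
y_pred_class = torch.argmax(torch.softmax(y_pred, dim=1), dim=1)
batch_acc = (y_pred_class == y).sum().item() / len(y_pred)
print(batch_acc)  # 0.75
```

Since `softmax` is monotonic, the `argmax` of the logits alone would give the same classes; the pattern above matches the notebook's training-step code.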
@@ -2522,7 +2522,7 @@
"\n",
"To keep our experiments quick, we'll train our model for **5 epochs** (though you could increase this if you want).\n",
"\n",
-"As for an **optimizer** and **loss function**, we'll use `torch.nn.CrossEntropyLoss()` (since we're working with multi-class classification data) and `torch.optim.Adam()` with a learning rate of `1e-3` respecitvely.\n",
+"As for an **optimizer** and **loss function**, we'll use `torch.nn.CrossEntropyLoss()` (since we're working with multi-class classification data) and `torch.optim.Adam()` with a learning rate of `1e-3` respectively.\n",
"\n",
"To see how long things take, we'll import Python's [`timeit.default_timer()`](https://docs.python.org/3/library/timeit.html#timeit.default_timer) method to calculate the training time."
]
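The setup this hunk describes (`CrossEntropyLoss`, `Adam` at `1e-3`, and `timeit.default_timer` for timing) looks roughly like this; the `Linear` model is a stand-in, not the notebook's actual architecture:

```python
import torch
from timeit import default_timer as timer

model = torch.nn.Linear(4, 3)  # stand-in model for illustration

# Multi-class classification -> CrossEntropyLoss; Adam with lr=1e-3.
loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(params=model.parameters(), lr=1e-3)

start_time = timer()
# ... the 5-epoch training loop would run here ...
end_time = timer()
print(f"Total training time: {end_time - start_time:.3f} seconds")
```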
@@ -2772,7 +2772,7 @@
"source": [
"### 8.1 How to deal with overfitting\n",
"\n",
-"Since the main problem with overfitting is that you're model is fitting the training data *too well*, you'll want to use techniques to \"reign it in\".\n",
+"Since the main problem with overfitting is that your model is fitting the training data *too well*, you'll want to use techniques to \"reign it in\".\n",
"\n",
"A common technique of preventing overfitting is known as [**regularization**](https://ml-cheatsheet.readthedocs.io/en/latest/regularization.html).\n",
"\n",
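One concrete regularization option (an assumption for illustration, not necessarily the notebook's choice) is L2 weight decay applied through the optimizer, which penalizes large weights and so discourages fitting the training data too tightly:

```python
import torch

model = torch.nn.Linear(4, 3)  # stand-in model for illustration

# weight_decay adds an L2 penalty on the weights; dropout and data
# augmentation are other common regularization techniques.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```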
@@ -2830,7 +2830,7 @@
"\n",
"And preventing overfitting and underfitting is possibly the most active area of machine learning research.\n",
"\n",
-"Since everone wants their models to fit better (less underfitting) but not so good they don't generalize well and perform in the real world (less overfitting).\n",
+"Since everyone wants their models to fit better (less underfitting) but not so good they don't generalize well and perform in the real world (less overfitting).\n",
"\n",
"There's a fine line between overfitting and underfitting.\n",
"\n",
@@ -3180,7 +3180,7 @@
"\n",
"Even though our models our performing quite poorly, we can still write code to compare them.\n",
"\n",
-"Let's first turn our model results in pandas DataFrames."
+"Let's first turn our model results into pandas DataFrames."
]
},
{
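Turning result dictionaries into DataFrames, as this hunk describes, can be sketched like so (the dict shape mirrors a typical per-epoch training log; the numbers are made up):

```python
import pandas as pd

# Hypothetical results dict shaped like per-epoch training output.
model_0_results = {"train_loss": [1.10, 1.02],
                   "train_acc":  [0.30, 0.40],
                   "test_loss":  [1.20, 1.11],
                   "test_acc":   [0.25, 0.35]}

# Each key becomes a column, each epoch a row, ready for comparison/plotting.
model_0_df = pd.DataFrame(model_0_results)
print(model_0_df)
```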
@@ -3358,7 +3358,7 @@
"source": [
"## 11. Make a prediction on a custom image\n",
"\n",
-"If you've trained a model on a certain dataset, chances are you'd like to make a prediction on on your own custom data.\n",
+"If you've trained a model on a certain dataset, chances are you'd like to make a prediction on your own custom data.\n",
"\n",
"In our case, since we've trained a model on pizza, steak and sushi images, how could we use our model to make a prediction on one of our own images?\n",
"\n",
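The custom-image prediction this hunk introduces follows a standard pattern: scale the image to `[0, 1]`, add a batch dimension, run the model in inference mode, then map the winning logit back to a class name. A sketch with stand-in model and image tensors (a real workflow would load the image from disk instead):

```python
import torch

class_names = ["pizza", "steak", "sushi"]
model = torch.nn.Sequential(torch.nn.Flatten(),
                            torch.nn.Linear(3 * 64 * 64, 3))  # stand-in model

# Stand-in for an image loaded from disk: uint8 CHW tensor in [0, 255].
custom_image_uint8 = torch.randint(0, 256, (3, 64, 64), dtype=torch.uint8)

# Scale to [0, 1] floats and add a batch dimension before predicting.
custom_image = custom_image_uint8.type(torch.float32) / 255.0
model.eval()
with torch.inference_mode():
    pred_logits = model(custom_image.unsqueeze(dim=0))

pred_label = class_names[pred_logits.argmax(dim=1).item()]
print(pred_label)
```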
