
Learning implementations of deep learning models on the MNIST dataset.


MNIST Digit Classifier



Implemented various models on the MNIST database using different approaches, as a way to learn new techniques.

Implemented a convolutional network that learns to generate encodings of input images so as to minimize the triplet loss function, given by:

ℒ(A, P, N) = max( ‖f(A) − f(P)‖² − ‖f(A) − f(N)‖² + 𝜶, 0 )

where A is an anchor input, P is a positive input of the same class as A, N is a negative input of a different class from A, 𝜶 is the margin between positive and negative pairs, and f is the embedding function.

The network was used to implement One Shot Learning, a technique for learning representations of a class from a single sample. Images of classes 3 to 9 weren't used while training the model, i.e., they were passed to the model for the first time during testing.
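
For reference, a minimal sketch of this loss, assuming a PyTorch implementation; the tensor shapes and the margin value are illustrative assumptions, not the repo's exact code:

```python
import torch

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """Triplet loss over batches of embeddings f(A), f(P), f(N).

    anchor, positive, negative: (batch, embedding_dim) tensors.
    alpha: margin between positive and negative pairs (assumed value).
    """
    # Squared Euclidean distances between the embeddings
    pos_dist = torch.sum((anchor - positive) ** 2, dim=1)
    neg_dist = torch.sum((anchor - negative) ** 2, dim=1)
    # Hinge at zero: only triplets where the negative is not at least
    # `alpha` farther away than the positive contribute to the loss
    return torch.clamp(pos_dist - neg_dist + alpha, min=0).mean()
```

At test time, one-shot classification can then be done by embedding a single reference image per class and assigning each test image to the class of its nearest reference embedding.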

Training

| Parameter     | Value                                 |
|---------------|---------------------------------------|
| TrainSet      | 100 images each of classes 0, 1 and 2 |
| TestSet       | 60,000 images of all ten classes      |
| Loss          | Triplet Loss                          |
| Learning Rate | 0.001                                 |
| Batch Size    | 10                                    |
| Epochs        | 5                                     |
| Optimizer     | Adam                                  |

Results

| Class | Accuracy | Correct | Total |
|-------|----------|---------|-------|
| 0     | 97.99%   | 5804    | 5923  |
| 1     | 98.60%   | 6648    | 6742  |
| 2     | 97.85%   | 5830    | 5958  |
| 3     | 95.85%   | 5877    | 6131  |
| 4     | 99.79%   | 5830    | 5842  |
| 5     | 97.28%   | 5274    | 5421  |
| 6     | 99.83%   | 5908    | 5918  |
| 7     | 89.20%   | 5589    | 6265  |
| 8     | 98.73%   | 5777    | 5851  |
| 9     | 98.31%   | 5849    | 5949  |

Plot

Cost vs No. of Iterations


Trained a two-layer Convolutional Neural Network using mini-batches.
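
A minimal sketch of such a network, assuming PyTorch; the channel counts, kernel sizes, and pooling are illustrative assumptions, not the repo's exact architecture:

```python
import torch.nn as nn

class TwoLayerCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Two conv layers, each followed by ReLU and 2x2 max-pooling:
        # spatial resolution goes 28x28 -> 14x14 -> 7x7
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, 10)  # 10 digit classes

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))
```

Training such a model against the raw logits with `nn.CrossEntropyLoss` (which applies log-softmax internally) matches the cross-entropy setup in the table below.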

Training

| Parameter     | Value         |
|---------------|---------------|
| TrainSet      | 60,000 images |
| TestSet       | 10,000 images |
| Loss          | Cross Entropy |
| Learning Rate | 0.002         |
| Batch Size    | 100           |
| Epochs        | 50            |

Summary

| Result         | Value  |
|----------------|--------|
| Train Accuracy | 99.40% |
| Train Correct  | 59641  |
| Test Accuracy  | 98.59% |
| Test Correct   | 9859   |

Plot

Cost vs No. of Iterations · Accuracy vs No. of Iterations


Trained a Multi-Layer Neural Net in NumPy. The model has 4 layers with 512, 128, 32 and 10 neurons respectively.
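
A minimal NumPy forward-pass sketch matching the stated layer sizes (784 flattened inputs, then 512, 128, 32 and 10 neurons); the initialization scheme and the ReLU/softmax activations are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [784, 512, 128, 32, 10]  # flattened input plus the four layers
# He-style initialization (assumed; the repo may initialize differently)
params = [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x, params):
    # ReLU on the hidden layers
    for W, b in params[:-1]:
        x = np.maximum(x @ W + b, 0.0)
    # Softmax on the output layer (stabilized by subtracting the row max)
    W, b = params[-1]
    z = x @ W + b
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```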

Training

| Parameter     | Value         |
|---------------|---------------|
| TrainSet      | 60,000 images |
| TestSet       | 10,000 images |
| Loss          | Cross Entropy |
| Learning Rate | 0.11          |
| Batch Size    | –             |
| Epochs        | 1000          |

Summary

| Result         | Value  |
|----------------|--------|
| Train Accuracy | 98.27% |
| Train Correct  | 57291  |
| Test Accuracy  | 98.26% |
| Test Correct   | 9505   |

Plot

Cost vs No. of Iterations


A single-layer neural net implemented using the NumPy library.
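
In effect this is softmax regression. A minimal sketch of one full-batch gradient step follows; the gradient formula is standard for cross-entropy over softmax, but the exact code here is an assumption rather than the repo's implementation:

```python
import numpy as np

def train_step(X, Y, W, b, lr=0.009):
    """One gradient-descent step. X: (n, 784) images, Y: (n, 10) one-hot labels."""
    # Forward pass: softmax over a single affine layer
    z = X @ W + b
    e = np.exp(z - z.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)
    # Gradient of mean cross-entropy w.r.t. the logits is (probs - Y) / n
    grad = (probs - Y) / len(X)
    W -= lr * X.T @ grad
    b -= lr * grad.sum(axis=0)
    # Return the mean cross-entropy loss for monitoring
    return -np.mean(np.log(np.sum(probs * Y, axis=1)))
```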

Training

| Parameter     | Value         |
|---------------|---------------|
| TrainSet      | 60,000 images |
| TestSet       | 10,000 images |
| Loss          | Cross Entropy |
| Learning Rate | 0.009         |
| Batch Size    | –             |
| Epochs        | 2000          |

Summary

| Result         | Value  |
|----------------|--------|
| Train Loss     | 0.50   |
| Train Accuracy | 93.98% |
| Train Correct  | 52507  |
| Test Loss      | 0.80   |
| Test Accuracy  | 94.18% |
| Test Correct   | 8836   |

Plot

Cost vs No. of Iterations


Other Branches