This project implements a basic neural network using only NumPy and Pandas. It demonstrates core concepts like forward propagation, backpropagation, and gradient descent for training.
- Fully connected network layers implemented in NumPy
- Tanh and softmax activation functions
- Forward and backpropagation passes
- Gradient descent optimization to update weights
- Trained and tested on MNIST dataset
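The pieces above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the notebook's actual code: the 784-64-10 layer sizes, weight scales, and function names are assumptions.

```python
import numpy as np

def tanh(z):
    """Hidden-layer activation."""
    return np.tanh(z)

def softmax(z):
    """Output-layer activation; subtract the column max for numerical stability."""
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

# Hypothetical 784-64-10 fully connected network; columns are samples.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.1, (64, 784)), np.zeros((64, 1))
W2, b2 = rng.normal(0, 0.1, (10, 64)), np.zeros((10, 1))

def forward(X):
    A1 = tanh(W1 @ X + b1)      # hidden activations
    A2 = softmax(W2 @ A1 + b2)  # class probabilities, each column sums to 1
    return A1, A2
```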
The code is contained in a Jupyter notebook for easy execution and readability. To use:
- Download the notebook file and the MNIST dataset
- Ensure NumPy and Pandas are installed
- Run the cells in the notebook to train the network on the MNIST data
- The notebook walks through each step with explanations and visualizations; tweak the network configuration and training parameters to experiment with different architectures
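A minimal training loop of the kind the notebook implements might look like the sketch below. It runs on synthetic data rather than MNIST, and the hyperparameters (`hidden`, `lr`, `epochs`) and layer sizes are illustrative assumptions, but the knobs are exactly the ones you would tweak to experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters -- the knobs to tweak when experimenting.
hidden = 32    # hidden-layer width
lr = 0.5       # learning rate for gradient descent
epochs = 200

# Toy stand-in for MNIST: 100 samples, 20 features, 3 well-separated classes.
centers = rng.normal(size=(20, 3))
y = rng.integers(0, 3, size=100)
X = centers[:, y] + 0.1 * rng.normal(size=(20, 100))
Y = np.zeros((3, 100)); Y[y, np.arange(100)] = 1.0  # one-hot targets

W1, b1 = rng.normal(0, 0.1, (hidden, 20)), np.zeros((hidden, 1))
W2, b2 = rng.normal(0, 0.1, (3, hidden)), np.zeros((3, 1))

losses = []
for _ in range(epochs):
    # Forward pass: tanh hidden layer, softmax output.
    A1 = np.tanh(W1 @ X + b1)
    Z2 = W2 @ A1 + b2
    E = np.exp(Z2 - Z2.max(axis=0, keepdims=True))
    A2 = E / E.sum(axis=0, keepdims=True)
    losses.append(-np.mean(np.sum(Y * np.log(A2 + 1e-12), axis=0)))
    # Backward pass: gradient of softmax + cross-entropy is (A2 - Y).
    m = X.shape[1]
    dZ2 = (A2 - Y) / m
    dW2, db2 = dZ2 @ A1.T, dZ2.sum(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = dZ1 @ X.T, dZ1.sum(axis=1, keepdims=True)
    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

On this toy data the cross-entropy loss drops steadily; swapping in the MNIST arrays and a 784-unit input layer gives the structure the notebook trains.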
The trained model achieves 90.75% accuracy on the test set.
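Accuracy here means the fraction of test images whose predicted class (the argmax of the softmax output) matches the true label. A tiny sketch with made-up probabilities and labels:

```python
import numpy as np

# Hypothetical softmax outputs (columns = samples) and their true labels.
probs = np.array([[0.7, 0.1, 0.2, 0.1],
                  [0.2, 0.8, 0.3, 0.1],
                  [0.1, 0.1, 0.5, 0.8]])
labels = np.array([0, 1, 2, 2])

preds = np.argmax(probs, axis=0)        # predicted class per column
accuracy = np.mean(preds == labels)     # -> 1.0 for this toy example
```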
Links: View and run the Jupyter notebook on Kaggle: https://www.kaggle.com/code/nilaygaitonde/mnist-from-scratch?scriptVersionId=157118372