jakobGTO/neural-network

Implementation of a neural network from scratch, with batch normalization, ReLU activations, a softmax output layer, and cross-entropy loss, applied to the CIFAR-10 dataset.
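Below is a minimal NumPy sketch of the pipeline described above (linear layer, batch normalization, ReLU, linear layer, softmax, trained with cross-entropy). It is an illustration under stated assumptions, not this repository's actual code; every function and parameter name here is invented for the example.

```python
# Illustrative sketch only: names and shapes are assumptions, not this repo's API.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)       # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def forward(x, params, eps=1e-5):
    W1, b1, gamma, beta, W2, b2 = params
    s1 = x @ W1 + b1                            # first linear layer
    mu, var = s1.mean(axis=0), s1.var(axis=0)   # per-feature batch statistics
    s1_hat = (s1 - mu) / np.sqrt(var + eps)     # batch normalization
    h = relu(gamma * s1_hat + beta)             # learned scale/shift, then ReLU
    p = softmax(h @ W2 + b2)                    # class probabilities
    return p, (x, s1_hat, var, h)

def loss_and_grads(x, y, params, eps=1e-5):
    W1, b1, gamma, beta, W2, b2 = params
    n = x.shape[0]
    p, (x0, s1_hat, var, h) = forward(x, params, eps)
    loss = -np.log(p[np.arange(n), y] + 1e-12).mean()       # cross-entropy loss
    dz2 = p.copy()
    dz2[np.arange(n), y] -= 1.0                              # combined softmax + CE gradient
    dz2 /= n
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dh[h <= 0] = 0.0                                         # ReLU gradient mask
    dgamma = (dh * s1_hat).sum(axis=0)                       # batch-norm scale gradient
    dbeta = dh.sum(axis=0)                                   # batch-norm shift gradient
    dx_hat = dh * gamma
    ds1 = (dx_hat - dx_hat.mean(axis=0)
           - s1_hat * (dx_hat * s1_hat).mean(axis=0)) / np.sqrt(var + eps)
    dW1, db1 = x0.T @ ds1, ds1.sum(axis=0)
    return loss, (dW1, db1, dgamma, dbeta, dW2, db2)
```

A toy usage example with random data shaped like flattened CIFAR-10 images (32 x 32 x 3 = 3072 features, 10 classes):

```python
d, m, k = 3072, 50, 10                                   # input, hidden, and output sizes
params = [0.01 * rng.standard_normal((d, m)), np.zeros(m),
          np.ones(m), np.zeros(m),                       # gamma initialized to 1, beta to 0
          0.01 * rng.standard_normal((m, k)), np.zeros(k)]
x = rng.standard_normal((8, d))
y = rng.integers(0, k, size=8)
loss, grads = loss_and_grads(x, y, params)
params = [p - 0.01 * g for p, g in zip(params, grads)]   # one plain SGD step
```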