Implementation of a Neural Network from scratch using Cairo 1.0 for MNIST predictions.
The NN has a simple two-layer architecture:
- Input layer 𝑎[0] has 784 units, corresponding to the 784 pixels of each 28x28 input image.
- Hidden layer 𝑎[1] has 10 units with ReLU activation.
- Output layer 𝑎[2] has 10 units, corresponding to the ten digit classes, with softmax activation.
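For reference, the forward pass this architecture computes can be sketched in NumPy in the float domain. The names x, W1, b1, W2, b2 are illustrative; the Cairo implementation performs the same steps on 8-bit quantized integers:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    e = np.exp(z - z.max())        # subtract the max for numerical stability
    return e / e.sum()

def forward(x, W1, b1, W2, b2):
    # x:  (784,) flattened 28x28 image, a[0]
    # W1: (10, 784), b1: (10,)  ->  hidden layer a[1]
    # W2: (10, 10),  b2: (10,)  ->  output layer a[2]
    a1 = relu(W1 @ x + b1)
    a2 = softmax(W2 @ a1 + b2)
    return a2                      # scores for the ten digit classes

# the predicted digit is the index of the largest output unit:
# digit = int(np.argmax(forward(x, W1, b1, W2, b2)))
```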
Functionalities implemented in Cairo 1.0:
- Vector implementation with operations: sum, max, min, argmax.
- Matrix implementation with operations: get, dot, add, len.
- Tensor implementation.
- 8-bit weight quantization based on the ONNX quantization scheme (sketched after this list).
- ReLU activation.
- Forward propagation of NN.
- Predict method for NN.
- Pseudo-softmax activation optimized for quantized values (see the note after this list).
- Weight loading into the Cairo NN from a trained TensorFlow NN (see the export sketch after this list).
- MNIST inferences using Cairo NN.
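A minimal NumPy sketch of ONNX-style affine (linear) 8-bit quantization, assuming per-tensor min/max calibration; this is one common choice and the details may differ from the Cairo code:

```python
import numpy as np

def quantize_int8(x):
    # map the float range [x.min(), x.max()] onto the int8 range [-128, 127]
    qmin, qmax = -128, 127
    scale = float(x.max() - x.min()) / (qmax - qmin)
    if scale == 0.0:                       # constant-tensor edge case
        scale = 1.0
    zero_point = int(round(qmin - float(x.min()) / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # approximate recovery of the original floats
    return (q.astype(np.float32) - zero_point) * scale
```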
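On the pseudo-softmax: exponentials are awkward in integer-only arithmetic, and since softmax is monotonic, prediction only needs an activation that preserves the ordering of the logits. The sketch below shows one such integer-friendly normalization; it is illustrative and not necessarily the exact scheme used here:

```python
import numpy as np

def pseudo_softmax(z):
    shifted = z - z.min()              # make all scores non-negative
    total = shifted.sum()
    if total == 0:                     # all logits equal: uniform output
        return np.full(z.shape, 1.0 / z.size)
    return shifted / total             # keeps the argmax ordering, no exp

def predict(output):
    # the predicted digit is the index of the largest output unit
    return int(np.argmax(output))
```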
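Weight loading from TensorFlow can be pictured as an export step like the following hypothetical script; the checkpoint path, CSV output, and symmetric int8 scaling are all assumptions for illustration, not the repo's actual format:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("mnist_model.h5")  # assumed checkpoint path
W1, b1, W2, b2 = model.get_weights()                   # two Dense layers
                                                       # (Keras kernels are (in, out))
for name, t in [("w1", W1), ("b1", b1), ("w2", W2), ("b2", b2)]:
    scale = float(np.abs(t).max()) / 127 or 1.0        # symmetric int8 scale
    q = np.clip(np.round(t / scale), -128, 127).astype(np.int8)
    np.savetxt(f"{name}.csv", q.reshape(1, -1), fmt="%d", delimiter=",")
```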
Built with auditless/cairo-template.
Currently supports building and testing contracts.
Build the contracts:
$ make build
Run the tests in src/test:
$ make test
Format the Cairo source code (using Scarb):
$ make fmt