
# Auto Gradient



A library for storing partial derivatives and calculating loss gradients, based on Andrej Karpathy's [micrograd](https://github.com/karpathy/micrograd).

Neural network 'learning' means incrementally adjusting the model's parameters based on the gradient of the loss function with respect to those parameters.

Each adjustment is made in the direction that decreases the loss, a process called gradient descent, sketched below.
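Concretely, one gradient descent step subtracts a small multiple of each gradient from the corresponding parameter. A minimal sketch, assuming micrograd-style conventions where parameters are `Value` objects with `.data` and `.grad` fields (the function name and learning rate here are illustrative, not part of this library's API):

```python
def gradient_descent_step(parameters, learning_rate=0.01):
    # One update: nudge each parameter against its gradient.
    # `parameters` is assumed to be a list of Value objects with
    # .data and .grad fields, as in micrograd-style libraries.
    for p in parameters:
        p.data -= learning_rate * p.grad
```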

To compute these gradients, every operation and its inputs must be tracked during the forward pass so that their derivatives can be calculated during the backward pass. This process, known as backpropagation, uses the chain rule to compute all of the gradients efficiently.
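To make the tracking concrete, here is a minimal sketch of the idea (illustrative only, not necessarily this library's exact internals): each operation returns a new value that remembers its inputs and a closure that applies the local chain-rule step when the backward pass runs.

```python
class TinyValue:
    # Illustrative sketch of a scalar that tracks its lineage;
    # not this library's exact internals.
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._children = _children       # inputs recorded during the forward pass
        self._backward = lambda: None    # local chain-rule step, set by each op

    def __mul__(self, other):
        out = TinyValue(self.data * other.data, (self, other))
        def _backward():
            # chain rule: local derivative times the gradient flowing in from out
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out
```

A full `backward()` would then walk the recorded lineage in reverse topological order, invoking each stored `_backward` closure.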

All basic math operators are supported (add, multiply, exponentiate, etc.).
Calling `value.backward()` calculates the gradients and saves them to each `Value` object in the lineage:

```python
from autograd import Value

# Using basic math operators:
a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).relu()
d += 3 * d + (b - a).relu()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data:.4f}')  # prints 24.7041, the outcome of this forward pass
g.backward()            # backpropagate from g through every Value in its lineage
print(f'{a.grad:.4f}')  # prints 138.8338, the numerical value of dg/da
print(f'{b.grad:.4f}')  # prints 645.5773, the numerical value of dg/db
```
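To sanity-check a gradient like dg/da, one can compare it against a finite-difference estimate. A sketch, assuming only the `Value` class from the snippet above (the `forward` helper and `eps` are illustrative):

```python
# Rebuild the same expression as a plain function of the two inputs.
def forward(a_val, b_val):
    a, b = Value(a_val), Value(b_val)
    c = a + b
    d = a * b + b**3
    c += c + 1
    c += 1 + c + (-a)
    d += d * 2 + (b + a).relu()
    d += 3 * d + (b - a).relu()
    e = c - d
    f = e**2
    g = f / 2.0
    g += 10.0 / f
    return g.data

eps = 1e-6  # small perturbation for the numerical estimate
numeric_dg_da = (forward(-4.0 + eps, 2.0) - forward(-4.0, 2.0)) / eps
print(f'{numeric_dg_da:.4f}')  # should land close to 138.8338
```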

```python
# Using it in a neural net:
from nn import MLP

network = MLP(3, [4, 4, 1])  # multilayer perceptron: 3 inputs, two hidden layers of 4, 1 output
x = [2.0, 3.0, -1.0]         # create some input data
y = network(x)               # run a forward pass
y.backward()                 # calculate the gradients
```
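From there, a full training loop alternates forward passes, backpropagation, and parameter updates. A hedged sketch, assuming the micrograd-style conventions that `MLP` exposes a `parameters()` method, that `Value` arithmetic accepts plain Python floats, and that gradients accumulate until reset (the dataset, loss, and hyperparameters are illustrative):

```python
xs = [[2.0, 3.0, -1.0], [3.0, -1.0, 0.5]]  # toy inputs
ys = [1.0, -1.0]                           # toy targets

for step in range(20):
    # forward pass: squared-error loss over the toy dataset
    preds = [network(x) for x in xs]
    loss = sum((pred - target)**2 for pred, target in zip(preds, ys))

    # zero stale gradients, then backpropagate from the loss
    for p in network.parameters():
        p.grad = 0.0
    loss.backward()

    # gradient descent: step each parameter against its gradient
    for p in network.parameters():
        p.data -= 0.05 * p.grad
```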

## Usage

Clone the repo:

```bash
git clone https://github.com/ryansereno/autograd
```

Import:

```python
from autograd import Value
```
