mlx-micrograd

An mlx port of Karpathy's micrograd - a tiny scalar-valued autograd engine with a small PyTorch-like neural network library on top.

Installation

pip install mlx_micrograd

Example usage

An example showing a number of the supported operations:

from mlx_micrograd.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).relu()
d += 3 * d + (b - a).relu()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data}') # prints array(24.7041, dtype=float32), the outcome of this forward pass
g.backward()
print(f'{a.grad}') # prints array(138.834, dtype=float32), i.e. the numerical value of dg/da
print(f'{b.grad}') # prints array(645.577, dtype=float32), i.e. the numerical value of dg/db

Training a neural net

demo.ipynb provides a full demo of training a 2-layer neural network (MLP) binary classifier; a condensed sketch of the training loop is shown below.
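
As a quick orientation before opening the notebook, here is a minimal training-loop sketch. It assumes mlx_micrograd.nn mirrors the API of Karpathy's micrograd (an MLP class plus parameters() and zero_grad()); the toy data, learning rate, and step count are illustrative placeholders, not taken from demo.ipynb.

from mlx_micrograd.engine import Value
from mlx_micrograd.nn import MLP  # assumed to mirror micrograd's nn module

# toy 2-D points with labels in {-1, +1} (illustrative data only)
X = [[2.0, 3.0], [3.0, -1.0], [0.5, 1.0], [1.0, 1.0]]
y = [1.0, -1.0, -1.0, 1.0]

# assumed constructor signature, matching micrograd:
# 2 inputs, two hidden layers of 16 neurons, 1 output score
model = MLP(2, [16, 16, 1])

for step in range(20):
    # forward pass: score every point, accumulate a max-margin (SVM) loss
    scores = [model([Value(x0), Value(x1)]) for x0, x1 in X]
    loss = sum((1 + -yi * si).relu() for yi, si in zip(y, scores))

    # backward pass
    model.zero_grad()
    loss.backward()

    # plain SGD update on the underlying mlx arrays
    for p in model.parameters():
        p.data += -0.05 * p.grad

    print(step, loss.data)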

Running tests

To run the unit tests, cd into the tests folder. It contains test_engine.py, which sanity-checks gradients against PyTorch's autograd, and tests.py, which runs tests on some of the supported ops.

cd tests
python test_engine.py
python tests.py

License

MIT
