Update README.md
jxbz authored Aug 20, 2024
1 parent 00cc0a4 commit e274a35
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -6,7 +6,7 @@

Modula is a deep learning framework designed for graceful scaling. Neural networks written in Modula automatically transfer learning rate across scale.

-We are slowly writing [the Modula docs](https://jeremybernste.in/modula/). Check them out for an accessible introduction to scaling theory and the Modula API. Also, here are some [slides](https://docs.google.com/presentation/d/12pykmY3KT1vP_25zFYISvVMA4KQbaVdeo6cRF8SMGXM/edit?usp=sharing) for a talk that Jeremy gave, that provide a more visual introduction to Modula. And here is [Modulax](https://github.com/GallagherCommaJack/modulax/) by Jack Gallagher. And here is a barebones implementation of [Modula in NumPy](https://colab.research.google.com/drive/1lKS15RJilGsstYP5JDQKSn3Z7TUUYIDQ?usp=sharing).
+We are slowly writing [the Modula docs](https://jeremybernste.in/modula/). Check them out for an accessible introduction to scaling theory and the Modula API. Also, here are some [slides](https://docs.google.com/presentation/d/1mCp6weIty9BzFFmx7LUGk2MPmNi-m-yKjigQ9wnycng/edit?usp=sharing) for a talk that Jeremy gave, that provide a more visual introduction to Modula. And here is [Modulax](https://github.com/GallagherCommaJack/modulax/) by Jack Gallagher. And here is a barebones implementation of [Modula in NumPy](https://colab.research.google.com/drive/1lKS15RJilGsstYP5JDQKSn3Z7TUUYIDQ?usp=sharing).

Modula is an experimental framework based on our research paper: [Scalable Optimization in the Modular Norm](https://arxiv.org/abs/2405.14813).

