anhinga/2020-notes

2020 design and research notes

This is a continuation of

https://github.com/anhinga/2019-design-notes

The 2019 design notes contain the initial design for the next generation of dataflow matrix machines, a couple of preprint-like research notes, a white paper, and an interdisciplinary collaborative research agenda.

The 2020-notes repository contains:

  • a number of preprint-like research notes and drafts written in 2020
  • a couple of slide decks
  • an updated version of the interdisciplinary collaborative research agenda
  • a small collection of resources on AI-generating algorithms: AI-GAs-resources
  • a map of DMM-related programming examples and techniques: programming-overview
  • a space for research notes related to Transformers and other attention-based models and their interplay with DMMs: attention-based-models

This repository also contains an initial write-up on Julia Flux and Zygote written in February. The Julia Flux/Zygote framework seems to be a better fit for DMMs than more traditional machine learning frameworks, so I am currently focusing on the Julia ecosystem.

Python also has a sufficiently flexible machine learning framework these days: JAX, with its pytree protocol for handling tree-like containers. DeepMind has built an ecosystem around JAX: https://deepmind.com/blog/article/using-jax-to-accelerate-our-research.
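The pytree protocol mentioned above is essentially a way to map functions over arbitrarily nested containers of parameters while preserving their structure, which is what makes JAX flexible enough for non-standard architectures. A minimal pure-Python sketch of the idea (JAX's actual implementation lives in `jax.tree_util`; the `tree_map` below is only illustrative, not the real API):

```python
def tree_map(f, tree):
    """Apply f to every leaf of a nested dict/list/tuple structure,
    returning a new structure of the same shape.

    Illustrative sketch of the pytree idea; JAX's real version is
    jax.tree_util.tree_map and also supports registered custom types.
    """
    if isinstance(tree, dict):
        return {k: tree_map(f, v) for k, v in tree.items()}
    if isinstance(tree, (list, tuple)):
        return type(tree)(tree_map(f, v) for v in tree)
    return f(tree)  # anything else is treated as a leaf


# Hypothetical nested parameter container, as one might use for a DMM-style
# network whose weights do not fit a flat array:
params = {"w": [1.0, 2.0], "b": (3.0,)}
scaled = tree_map(lambda x: 2 * x, params)
# scaled == {"w": [2.0, 4.0], "b": (6.0,)}
```

In JAX the same pattern extends to gradients: `jax.grad` returns gradients in a pytree mirroring the parameter pytree, so an update step is just a `tree_map` over both trees.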
