A lightweight Python package that provides fast automatic differentiation (AD) in forward and reverse mode for scalar and array computations.
AD is an efficient algorithm for computing exact derivatives of numeric programs. It is a standard tool in numerous fields, including optimization, machine learning, and scientific computing.
Note
This repository focuses on providing Python bindings and does not include the C++ backend, which is part of a separate, standalone C++ library. The C++ version offers additional features not available in these bindings. For more information on the C++ version, please visit the AutoDiff repository.
- Automatic differentiation:
  - Jacobian matrix with forward- and reverse-mode AD
  - Jacobian-vector products, e.g., gradients and directional derivatives
  - Support for scalar, 1D and 2D array, and linear algebra computations
- Fast and efficient implementation:
  - Backend written in C++ (the separate AutoDiff library; see the note above)
  - Leverages the performance of the Eigen linear algebra library
  - Memory efficient (see Variables vs. expressions)
- Simple and intuitive API:
  - Regular control flow: function calls, loops, branches
  - Eager evaluation: what you evaluate is what you differentiate
  - Lazy re-evaluations: offering you precise control over what to evaluate
  - Math-inspired syntax
For more details, see the documentation.
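To make the idea of forward-mode AD concrete, here is a minimal, self-contained sketch using dual numbers. It is plain Python for illustration only and does not use the `autodiff` API: every operation propagates a derivative alongside the value, which is how exact derivatives of ordinary numeric programs are obtained.

# Illustrative forward-mode AD with dual numbers (not the autodiff API)
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value   # the ordinary value
        self.deriv = deriv   # its derivative, propagated alongside

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u * v)' = u' * v + u * v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __radd__ = __add__
    __rmul__ = __mul__

def g(x):
    return x * x + 3.0 * x   # an ordinary numeric program

x = Dual(2.0, 1.0)           # seed dx/dx = 1
y = g(x)
print(y.value)               # 10.0
print(y.deriv)               # 7.0, the exact derivative 2*x + 3 at x = 2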
If you are on Linux, you can download the latest wheel file from the releases page and install it using pip.
python -m pip install autodiff-0.1.0-cp311-cp311-linux_x86_64.whl
The wheel contains the extension modules as well as Python stub files for autocompletion and documentation in your IDE.
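As a quick, purely optional sanity check (assuming the wheel installed cleanly), importing the two sub-modules used in the examples below should succeed:

python -c "import autodiff.array, autodiff.scalar"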
Below are two simple examples of how to use the `autodiff` package to compute the gradient of a function with respect to its inputs. The package provides two sub-modules: `array` and `scalar`. The `array` module is more general and can be used to compute gradients of functions involving both scalars and arrays (1D and 2D).
Caution
It is not possible to mix the `array` and `scalar` modules in the same program, as they use incompatible internal representations for variables.
# Example: gradient computation with NumPy arrays
import numpy as np
from autodiff.array import Function, var, d
# Create two 1D array variables
x = var(np.array([1, 2, 3]))
y = var(np.array([4, 5, 6]))
# Assign their (element-wise) product to a new variable
z = var(x * y)
# Variables are evaluated eagerly
print("z =", z()) # z = [ 4. 10. 18.]
# Create the function f : (x, y) ↦ z = x * y
f = Function(z) # short for: Function(sources=(x, y), target=z)
# Set the (element-wise) derivative of z with respect to itself
z.set_derivative(np.ones((1, 3)))
# Compute the gradient of f at (x, y) using reverse-mode AD
f.pull_gradient()
# Get the components of the (element-wise) gradient
print("∇_x f =", d(x)) # ∇_x f = [[4. 5. 6.]]
print("∇_y f =", d(y)) # ∇_y f = [[1. 2. 3.]]
For functions mapping only scalars to scalars, the `scalar` module is more efficient and convenient. No further imports are required.
# Example: gradient computation with scalars
from autodiff.scalar import Function, var, d
# Create two scalar variables
x = var(1.5)
y = var(-2.0)
# Assign their product to a new variable
z = var(x * y)
# Variables are evaluated eagerly
print("z =", z()) # z = -3.0
# Create the function f : (x, y) ↦ z = x * y
f = Function(z) # short for: Function(sources=(x, y), target=z)
# Compute the gradient of f at (x, y) using reverse-mode AD
f.pull_gradient_at(z)
# Get the components of the gradient
print("∂f/∂x =", d(x)) # ∂f/∂x = -2.0
print("∂f/∂y =", d(y)) # ∂f/∂y = 1.5
- Variables and expressions - writing programs with `autodiff`
- Functions - (deferred) evaluation and differentiation
- The `autodiff.scalar` module - working with scalars only
- The `autodiff.array` module - working with scalars and NumPy arrays
- Applications - common use cases and examples