MOLPIPx is a JAX-based library that implements permutationally invariant polynomial (PIP) models and is:
- compatible with FLAX, a neural network library,
- GPU friendly,
- fully differentiable (see the sketch below).
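As a minimal illustration of the last two points (the toy potential below is only an example, not part of the MOLPIPx API), any JAX-traced energy function can be compiled for CPU/GPU and differentiated to obtain forces:

```python
import jax
import jax.numpy as jnp

# Toy Morse-like potential over a vector of internuclear distances;
# purely illustrative, not a MOLPIPx function.
def energy(r):
    return jnp.sum((1.0 - jnp.exp(-(r - 1.0))) ** 2)

# jit compiles the function for CPU or GPU; grad returns dE/dr,
# so forces are simply the negative gradient.
energy_jit = jax.jit(energy)
grad_energy = jax.jit(jax.grad(energy))

r = jnp.array([0.9, 1.1, 1.5])
print(energy_jit(r), -grad_energy(r))
```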
This library translates the MSA files, specifically the `_file_.MONO` and `_file_.POLY` files, into their JAX counterparts, `_file_mono.py` and `_file_poly.py`.
The MSA files must be generated beforehand; for more information, please see https://github.com/szquchen/MSA-2.0
MSA References:
- Xie, Z.; Bowman, J.M. Permutationally Invariant Polynomial Basis for Molecular Energy Surface Fitting via Monomial Symmetrization. J. Chem. Theory Comput. 2010, 6, 26-34.
Install MOLPIPx using:

```bash
git clone https://github.com/ChemAI-Lab/molpipx.git
cd molpipx
pip install .
```
The MOLPIPx package includes `msa_file_generator`, which translates the MSA monomial and polynomial files of a molecule into JAX and Rust code. The example below shows how to generate these files:
```python
from molpipx import msa_file_generator

head_files = 'MOL_<info>_<deg>'   # base name shared by the .MONO and .POLY files
path = '<path_to_the_files>'      # directory containing the MSA files
label = '<file_label>'            # label used for the generated output files
msa_file_generator(head_files, path, label)
```
The structure of the library is kept simple, since each molecular system may require its own generated files and model components.
MOLPIPx combines PIPs with three main regression models: linear regression, neural networks, and Gaussian processes. The library leverages two automatic differentiation engines, JAX for the Python version and Enzyme-AD for the Rust version, to improve the simulation of a wide range of chemical systems.
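As a rough sketch of the neural-network route (the model class, layer sizes, and feature dimension below are illustrative assumptions, not MOLPIPx defaults), a FLAX module can map a PIP feature vector to an energy while remaining differentiable end to end:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

# Hypothetical MLP from a PIP feature vector to an energy (PIP-NN spirit).
class EnergyMLP(nn.Module):
    @nn.compact
    def __call__(self, pip_features):
        x = nn.tanh(nn.Dense(32)(pip_features))
        x = nn.tanh(nn.Dense(32)(x))
        return nn.Dense(1)(x).squeeze(-1)

model = EnergyMLP()
features = jnp.ones((10,))                              # placeholder PIP feature vector
params = model.init(jax.random.PRNGKey(0), features)
energy = model.apply(params, features)
# Gradients with respect to the features (and, through the PIP transformation,
# the coordinates) follow from jax.grad.
dE_dfeatures = jax.grad(lambda f: model.apply(params, f))(features)
```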
The Rust version makes use of std::autodiff, an experimental feature of Rust that is currently being upstreamed. While upstreaming is in progress, you will need to build our custom fork of Rust, which already includes autodiff; instructions for how to do so are available here. Once upstreaming is complete, you will be able to use any nightly Rust version. This tracking issue shows the progress of upstreaming the remaining autodiff pieces.
Check out our tutorials to get started with MOLPIPx. They show how to define the inputs for the different regression approaches, train machine learning models with or without forces, and make predictions (a minimal sketch of the linear-PIP workflow follows the list below).
- Linear regression with permutationally invariant polynomials (Linear PIP)
- Anisotropic linear regression with permutationally invariant polynomials (Anisotropic Linear PIP)
- Permutationally Invariant Polynomial Neural Networks (PIP-NN)
- Permutationally Invariant Polynomial Gaussian Process (PIP-GP)
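The following is a minimal sketch of the linear-PIP workflow under the assumption of a placeholder feature map; in practice the features would come from the generated `_file_poly.py` module, and MOLPIPx's own training utilities differ from this hand-rolled least-squares fit:

```python
import jax
import jax.numpy as jnp

# Stand-in "polynomial" features of the internuclear distances r.
def pip_features(r):
    y = jnp.exp(-r)
    return jnp.stack([jnp.sum(y), jnp.sum(y**2), jnp.prod(y)])

# In the linear-PIP picture, the energy is linear in the PIP features.
def energy(coeffs, r):
    return jnp.dot(coeffs, pip_features(r))

# Fit the linear coefficients to reference energies by least squares (toy data).
R_train = jnp.array([[0.9, 1.1, 1.4], [1.0, 1.2, 1.5], [1.1, 1.0, 1.6], [0.8, 1.3, 1.7]])
E_train = jnp.array([0.10, 0.05, 0.07, 0.12])
Phi = jax.vmap(pip_features)(R_train)
coeffs, *_ = jnp.linalg.lstsq(Phi, E_train)

# Predict the energy and forces (negative gradient) for a new geometry.
r_new = jnp.array([1.0, 1.1, 1.5])
E_pred = energy(coeffs, r_new)
F_pred = -jax.grad(energy, argnums=1)(coeffs, r_new)
```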
```bibtex
@misc{molpipx2024,
      title={MOLPIPx: an end-to-end differentiable package for permutationally invariant polynomials in Python and Rust},
      author={Manuel S. Drehwald and Asma Jamali and Rodrigo A. Vargas-Hernández},
      year={2024},
      eprint={2411.17011},
      archivePrefix={arXiv},
      primaryClass={physics.chem-ph},
      url={https://arxiv.org/abs/2411.17011},
}
```