PyPartMC

PyPartMC is a Python interface to PartMC, a particle-resolved Monte-Carlo code for atmospheric aerosol simulation. PyPartMC is implemented in C++ and also constitutes a C++ API to the PartMC Fortran internals. The Python API can facilitate using PartMC from other environments - see, e.g., the Julia and Matlab examples below.

For an outline of the project, rationale, architecture, and features, refer to: D'Aquino et al., 2023 (arXiv) (please cite if PyPartMC is used in your research). For a list of talks and other relevant resources, please see the project Wiki.


TL;DR (try in a Jupyter notebook)


! pip install PyPartMC
import PyPartMC
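
The package also exposes the si helper used throughout the examples below for expressing quantities in SI units. A minimal sketch (assuming, consistently with the usage examples below, that the si constants act as plain multiplicative factors yielding values in SI base units):

from PyPartMC import si

diam = 100 * si.nm               # diameter, expressed in metres
dens = 1000 * si.kg / si.m**3    # density, expressed in kg/m3 (as in the AeroData example below)
print(diam, dens)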

Jupyter notebooks with examples

Note: clicking the badges below redirects to cloud-computing platforms. The mybinder.org links allow anonymous execution; Google Colab requires logging in with a Google account; ARM JupyterHub requires logging in with an ARM account (and pointing Jupyter to a particular notebook within the examples folder).

  • Urban plume scenario demo (as in PartMC):
    View notebook Open In Colab Binder ARM JupyterHub
  • Dry-Wet Particle Size Equilibration with PartMC and PySDM:
    View notebook Open In Colab Binder ARM JupyterHub Voila
  • Simulation output processing example (loading from netCDF files using PyPartMC):
    View notebook Open In Colab Binder ARM JupyterHub
  • Optical properties calculation using external Python package (PyMieScatt):
    View notebook Open In Colab Binder ARM JupyterHub

Features

  • works on Linux, macOS and Windows (compatibility assured with CI builds)
  • hassle-free installation using pip (prior PartMC installation not needed)
  • works out of the box on mybinder.org, Google Colab and the like
  • ships with a set of examples maintained in the form of Jupyter notebooks
  • Pythonic API (while retaining PartMC jargon), incl. Python GC deallocation of Fortran objects (see the sketch after this list)
  • specification of parameters using native Python datatypes (lists, dicts) in place of PartMC spec files
  • code snippets in README depicting how to use PyPartMC from Julia and Matlab (also executed on CI)
  • auto-generated API docs on the web
  • support for [de]serialization of selected wrapped structures using JSON
  • based on unmodified PartMC code
  • does not use or require a shell or any pre-installed libraries
  • aiming at 100% unit test coverage
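
For instance, the lifetime of the wrapped Fortran objects is tied to Python garbage collection: dropping the last Python reference to a wrapped object releases its Fortran-side storage. A minimal sketch (constructor arguments as in the usage example below):

import PyPartMC as ppmc
from PyPartMC import si

# construct a wrapped Fortran object (two aerosol species, as in the usage example below)
aero_data = ppmc.AeroData((
    {"OC": [1000 * si.kg / si.m**3, 0, 1e-3 * si.kg / si.mol, 0.001]},
    {"BC": [1800 * si.kg / si.m**3, 0, 1e-3 * si.kg / si.mol, 0]},
))
del aero_data  # dropping the last reference lets Python GC deallocate the Fortran-side data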

Usage examples

The listings below show how the same task, randomly sampling particles from an aerosol size distribution with PartMC, can be carried out in different programming languages.

For a Fortran equivalent of the Python, Julia and Matlab programs below, see the readme_fortran folder.

Python

import numpy as np

import PyPartMC as ppmc
from PyPartMC import si

aero_data = ppmc.AeroData((
    #      [density, ions in solution, molecular weight, kappa]
    {"OC": [1000 *si.kg/si.m**3, 0, 1e-3 *si.kg/si.mol, 0.001]},
    {"BC": [1800 *si.kg/si.m**3, 0, 1e-3 *si.kg/si.mol, 0]},
))

aero_dist = ppmc.AeroDist(
    aero_data,
    [{
        "cooking": {
            "mass_frac": [{"OC": [1]}],
            "diam_type": "geometric",
            "mode_type": "log_normal",
            "num_conc": 3200 / si.cm**3,
            "geom_mean_diam": 8.64 * si.nm,
            "log10_geom_std_dev": 0.28,
        }
    },
    {
        "diesel": {
            "mass_frac": [{"OC": [0.3]}, {"BC": [0.7]}],
            "diam_type": "geometric",
            "mode_type": "log_normal",
            "num_conc": 2900 / si.cm**3,
            "geom_mean_diam": 50 * si.nm,
            "log10_geom_std_dev": 0.24,
        }
    }],
)

n_part = 100
aero_state = ppmc.AeroState(aero_data, n_part, "nummass_source")
aero_state.dist_sample(aero_dist)
print(np.dot(aero_state.masses(), aero_state.num_concs), "# kg/m3")
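
The np.dot call above yields the total aerosol mass concentration of the randomly sampled particle population. As a follow-up to the listing above (same objects in scope), the total number concentration can be obtained analogously; since sampling is random, values vary from run to run:

# total number concentration of the sampled particle population
print(np.sum(aero_state.num_concs), "# 1/m3")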

Julia (using PyCall.jl)

using Pkg
Pkg.add("PyCall")

using PyCall
ppmc = pyimport("PyPartMC")
si = ppmc["si"]

aero_data = ppmc.AeroData((
  #       (density, ions in solution, molecular weight, kappa)
  Dict("OC"=>(1000 * si.kg/si.m^3, 0, 1e-3 * si.kg/si.mol, 0.001)),
  Dict("BC"=>(1800 * si.kg/si.m^3, 0, 1e-3 * si.kg/si.mol, 0))
))

aero_dist = ppmc.AeroDist(aero_data, (
  Dict( 
    "cooking" => Dict(
      "mass_frac" => (Dict("OC" => (1,)),),
      "diam_type" => "geometric",
      "mode_type" => "log_normal",
      "num_conc" => 3200 / si.cm^3,
      "geom_mean_diam" => 8.64 * si.nm,
      "log10_geom_std_dev" => .28,
    )
  ),
  Dict( 
    "diesel" => Dict(
      "mass_frac" => (Dict("OC" => (.3,)), Dict("BC" => (.7,))),
      "diam_type" => "geometric",
      "mode_type" => "log_normal",
      "num_conc" => 2900 / si.cm^3,
      "geom_mean_diam" => 50 * si.nm,
      "log10_geom_std_dev" => .24,
    )
  )
))

n_part = 100
aero_state = ppmc.AeroState(aero_data, n_part, "nummass_source")
aero_state.dist_sample(aero_dist)
print(aero_state.masses()'aero_state.num_concs, "# kg/m3")

Matlab (using the built-in Python interface)

notes (see the PyPartMC Matlab CI workflow for an example of how to set this up on Ubuntu 20):

  • Matlab ships with convenience copies of the C, C++ and Fortran runtime libraries, which are dlopen()ed by default; one way to make PyPartMC compatible is to [pip-]install it by compiling from source with the very same GCC version from which Matlab borrowed these libraries (e.g., GCC 9 for Matlab R2022a);
  • Matlab needs to use the same Python interpreter/venv as the pip invocation used to install PyPartMC;
  • a single-line pybind11_builtins.py file with just pybind11_type=type inside needs to be placed within Matlab's PYTHONPATH to sort out a Matlab-pybind11 incompatibility (see the snippet below).
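
For reference, the shim file mentioned in the last point contains a single assignment (to be saved as pybind11_builtins.py in a directory listed in Matlab's PYTHONPATH):

pybind11_type = type

The Matlab equivalent of the Python and Julia listings above: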
ppmc = py.importlib.import_module('PyPartMC');
si = py.importlib.import_module('PyPartMC').si;

aero_data = ppmc.AeroData(py.tuple({ ...
  py.dict(pyargs("OC", py.tuple({1000 * si.kg/si.m^3, 0, 1e-3 * si.kg/si.mol, 0.001}))), ...
  py.dict(pyargs("BC", py.tuple({1800 * si.kg/si.m^3, 0, 1e-3 * si.kg/si.mol, 0}))) ...
}));

aero_dist = ppmc.AeroDist(aero_data, py.tuple({ ...
  py.dict(pyargs( ...
    "cooking", py.dict(pyargs( ...
      "mass_frac", py.tuple({py.dict(pyargs("OC", py.tuple({1})))}), ...
      "diam_type", "geometric", ...
      "mode_type", "log_normal", ...
      "num_conc", 3200 / si.cm^3, ...
      "geom_mean_diam", 8.64 * si.nm, ...
      "log10_geom_std_dev", .28 ...
    )) ...
  )), ...
  py.dict(pyargs( ... 
    "diesel", py.dict(pyargs( ...
      "mass_frac", py.tuple({ ...
        py.dict(pyargs("OC", py.tuple({.3}))), ...
        py.dict(pyargs("BC", py.tuple({.7}))), ...
      }), ...
      "diam_type", "geometric", ...
      "mode_type", "log_normal", ...
      "num_conc", 2900 / si.cm^3, ...
      "geom_mean_diam", 50 * si.nm, ...
      "log10_geom_std_dev", .24 ...
    )) ...
  )) ...
}));

n_part = 100;
aero_state = ppmc.AeroState(aero_data, n_part, "nummass_source");
aero_state.dist_sample(aero_dist);
masses = cell(aero_state.masses());
num_concs = cell(aero_state.num_concs);
fprintf('%g # kg/m3\n', dot([masses{:}], [num_concs{:}]))

Usage in other projects

PyPartMC is used within the test workflow of the PySDM project.

Implementation outline

  • PyPartMC is written in C++ and Fortran, and uses pybind11 and CMake.
  • JSON support is handled with nlohmann::json and pybind11_json
  • PartMC and selected parts of SUNDIALS are statically linked (and compiled in during pip install or python -m build)
  • C (SUNDIALS, netCDF), C++ (pybind11, ...) and Fortran (PartMC, CAMP, netCDF-fortran) dependencies are linked through git submodules
  • the MOSAIC dependency is optionally linked by setting the MOSAIC_HOME environment variable
  • a drop-in replacement of the PartMC spec file routines is used for i/o from/to JSON

Implementation architecture

flowchart TD
    subgraph J ["Julia"]
        julia_user_code["Julia user code"] --> PyCall.jl
    end
    subgraph M ["Matlab"]
        matlab_user_code["Matlab user code"] --> matlab_python["Matlab built-in\nPython interface"]
    end
    subgraph P ["Python"]
        python_user_code -.-> NumPy
        python_user_code["Python user code"] ---> PyPartMC["pybind11-generated\nPyPartMC module"]
        matlab_python --> PyPartMC
        PyCall.jl --> PyPartMC
    end
    subgraph Cpp ["C++"]
        cpp_user_code["C++ user code"] ----> ppmc_cpp
        PyPartMC --> ppmc_cpp["PyPartMC-C++"]
        ppmc_cpp --> pybind11_json
        pybind11_json ---> nlohmann::JSON
        fake_spec_file_cpp --> nlohmann::JSON
    end
    subgraph C ["C"]
        fake_spec_file_c --> fake_spec_file_cpp["SpecFile-C++"]
        ppmc_cpp --> ppmc_c["PyPartMC-C"]
        netCDF-C
        SUNDIALS
        camp_c["CAMP C code"]
    end
    subgraph Fortran ["Fortran"]
        PartMC -....-> MOSAIC
        ppmc_c --> ppmc_f["PyPartMC-F"]
        ppmc_f ---> PartMC
        PartMC --> netCDF-F
        netCDF-F --> netCDF-C
        PartMC --> SUNDIALS
        PartMC ---> camp_f
        camp_f["CAMP"] --> camp_c
        PartMC ----> fake_spec_file_f[SpecFile-F]
        fake_spec_file_f --> fake_spec_file_c["SpecFile-C"]
    end

    style PartMC fill:#7ae7ff,stroke-width:2px,color:#2B2B2B

FAQ

  • Q: Why does pip install PyPartMC trigger compilation on my brand-new Apple machine, while it quickly downloads and installs binary packages on older Macs, Windows or Linux?
    A: We do not yet provide binary wheels on PyPI for Apple-silicon (arm64) machines. Cross-compilation with gfortran is only supported in experimental unofficial builds and is tricky, while GitHub Actions ARM64 virtual machines are costly.

  • Q: Why do some of the constructors expect data to be passed as lists of single-entry dictionaries instead of multi-element dictionaries?
    A: This is intentional and related to PartMC's reliance on the order of elements within spec-file input; while Python dictionaries preserve (insertion) order, the JSON format does not, and we intend to keep these data structures safe to [de]serialize using JSON, as illustrated below.
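
    A pure-Python illustration of the rationale: a JSON array of single-entry objects encodes the element order in the array itself, whereas relying on the key order of a single multi-key object is not guaranteed to survive every JSON producer/parser (hypothetical mode names and values below, mirroring the usage examples):

import json

# order-preserving representation analogous to the one used by PyPartMC constructors
modes = [{"cooking": {"num_conc": 3200e6}}, {"diesel": {"num_conc": 2900e6}}]
assert json.loads(json.dumps(modes)) == modes  # element order survives the round trip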

  • Q: How can I check the version of PartMC that PyPartMC was compiled against?
    A: Version numbers of compile-time dependencies of PyPartMC, including PartMC, can be accessed as follows:

import PyPartMC
PyPartMC.__versions_of_build_time_dependencies__['PartMC']
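
For example, to list all recorded build-time dependencies along with their versions (a sketch assuming the attribute behaves like a regular dict):

import PyPartMC

for name, version in PyPartMC.__versions_of_build_time_dependencies__.items():
    print(name, version)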

Troubleshooting

Common installation issues

error: [Errno 2] No such file or directory: 'cmake'

Try rerunning after installing CMake, e.g., using apt-get install cmake (Ubuntu/Debian), brew install cmake (Homebrew on macOS) or using MSYS2 on Windows.

No CMAKE_Fortran_COMPILER could be found.

Try installing a Fortran compiler (e.g., brew reinstall gcc with Homebrew on macOS or using MSYS2 on Windows).

Could not find NC_M4 using the following names: m4, m4.exe

Try installing m4 (e.g., using MSYS2 on Windows).

Notes for developers

How to debug

git clone --recursive https://github.com/open-atmos/PyPartMC.git
cd PyPartMC
DEBUG=1 VERBOSE=1 pip --verbose install -e .
gdb python 
(gdb) run -m pytest -s -vv -We -p no:unraisableexception tests
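
The same pytest invocation (with the flags shown above) can also be run directly, without the debugger:

python -m pytest -s -vv -We -p no:unraisableexception tests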

Pre-commit hooks

The PyPartMC codebase benefits from Pylint, Black and isort code analysis, which are all part of the CI workflows (where we also use pre-commit hooks). The pre-commit hooks can be run locally, and the resultant changes then need to be staged before committing. To set up the hooks locally, install pre-commit via pip install pre-commit and set up the git hooks via pre-commit install (this needs to be done every time you clone the project). To run all pre-commit hooks, run pre-commit run --all-files. The .pre-commit-config.yaml file can be modified if new hooks are to be added or existing ones need to be altered.
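
In short, the commands described above:

pip install pre-commit
pre-commit install             # to be repeated after every fresh clone
pre-commit run --all-files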

Credits

PyPartMC:

authors: PyPartMC developers
funding: US Department of Energy Atmospheric System Research programme
copyright: University of Illinois at Urbana-Champaign
licence: GPL v3

PartMC:

authors: Nicole Riemer, Matthew West, Jeff Curtis et al.
licence: GPL v2 or later
