The Software for Testing Accuracy, Reliability and Scalability of Hierarchical computations (STARS-H) is a parallel library that provides a high-performance matrix market of rank-structured matrix operators. STARS-H supports various matrix kernels that serve as proxies for many scientific applications, and can optionally compress them by exploiting their data sparsity. This translates into lower arithmetic complexity and a smaller memory footprint. STARS-H intends to provide a standard software environment for assessing the accuracy and performance of 𝓗-matrix libraries on a given hardware architecture. STARS-H currently supports the tile low-rank (TLR) data format for approximation on shared- and distributed-memory systems, possibly equipped with GPUs, using MPI, OpenMP and task-based programming models.
The vision of STARS-H is to design, implement and provide a community code for generating hierarchical matrices, with support for various approximation data formats, including, but not limited to, TLR, HSS, HODLR, H and H^2. STARS-H aspires to be for the low-rank approximation community what the UF Sparse Matrix Collection is for the sparse linear algebra community, by generating hierarchical matrices coming from a variety of synthetic and real-world applications. Furthermore, extracting performance from the underlying hardware resources (i.e., x86 and GPUs) is in the DNA of STARS-H, since the approximation phase can be time-consuming for large-scale scientific applications.
This project is a work in progress (WIP), with current features limited to:
The only supported data format is Tile Low-Rank (TLR), sketched briefly after this list:
- TLR Approximation
- Multiplication of TLR matrix by dense matrix
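As a rough sketch (generic notation, not STARS-H identifiers): in the TLR format the dense matrix A is partitioned into tiles of size b x b, and each admissible (typically off-diagonal) tile is stored in compressed low-rank form,

    A_{ij} \approx U_{ij} V_{ij}^T, \qquad U_{ij}, V_{ij} \in \mathbb{R}^{b \times k_{ij}}, \quad k_{ij} \ll b,

so that the memory footprint and the cost of multiplying that tile by a dense block drop from O(b^2) to O(k_{ij} b).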
Programming models (backends):
- OpenMP
- MPI
- Task-based using StarPU (with and without MPI)
Applications in matrix-free form:
- Cauchy matrix
- Electrostatics (1/r)
- Electrodynamics (sin(kr)/r and cos(kr)/r)
- Random synthetic TLR matrix
- Spatial statistics (exponential, squared exponential and Matérn kernels; standard forms are given after this list)
- Mesh deformation using radial basis functions, i.e., Gaussian, exponential, inverse quadratic, inverse multi-quadratic, CPTS, and Wendland kernels.
- Acoustic scattering
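For reference, the standard textbook forms of the spatial statistics kernels, written as functions of the distance r between two points and a correlation length \beta (this is one common parameterization; the exact convention used inside STARS-H may differ), are:

    \text{exponential:}\quad C(r) = \exp(-r/\beta)
    \text{squared exponential:}\quad C(r) = \exp\!\left(-r^2/(2\beta^2)\right)
    \text{Matérn:}\quad C(r) = \frac{2^{1-\nu}}{\Gamma(\nu)} \left(\frac{\sqrt{2\nu}\,r}{\beta}\right)^{\nu} K_{\nu}\!\left(\frac{\sqrt{2\nu}\,r}{\beta}\right),

where \nu controls the smoothness and K_\nu is the modified Bessel function of the second kind.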
Low-rank approximation techniques (low-rank engines), each producing a truncated factorization as sketched after this list:
- Ordinary SVD,
- Rank-revealing QR,
- Randomized SVD.
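Each engine produces, for a given tile A and a requested tolerance (or fixed rank k), a truncated factorization of the familiar form

    A \approx U_k \Sigma_k V_k^T,

keeping only the k leading singular values and vectors (or an approximation of them, in the case of rank-revealing QR and randomized SVD), with k chosen so that the truncation error stays below the tolerance.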
Additional:
- CG method for symmetric positive-definite (SPD) systems.
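For reference, the classical CG iteration for an SPD system Ax = b touches A only through matrix-vector products, which is what makes it a natural companion to the TLR matrix-times-dense multiplication listed above; the recurrence below is the textbook method, not STARS-H-specific code:

    r_0 = b - A x_0,  p_0 = r_0
    for k = 0, 1, 2, ...:
        alpha_k = (r_k^T r_k) / (p_k^T A p_k)
        x_{k+1} = x_k + alpha_k p_k
        r_{k+1} = r_k - alpha_k A p_k
        beta_k  = (r_{k+1}^T r_{k+1}) / (r_k^T r_k)
        p_{k+1} = r_{k+1} + beta_k p_k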
Planned future work (non-exhaustive):
- Add support for more matrix kernels and applications
- Extend support to hardware accelerators (e.g., GPUs)
- Provide full StarPU support (GPUs and distributed-memory systems)
- Port to other dynamic runtime systems
- Implement additional low-rank routines like ACA.
- Implement additional formats: HODLR/H/HSS/H^2
Installation requires CMake version 3.2.3 or newer. To build STARS-H, follow these instructions:
- Get STARS-H from the git repository:
    git clone git@github.com:ecrc/stars-h
  or
    git clone https://github.com/ecrc/stars-h
- Go into the STARS-H folder:
    cd stars-h
- Get the submodules:
    git submodule update --init
- Create a build directory and go there:
    mkdir build && cd build
- Use CMake to configure the build and resolve the dependencies:
    cmake .. -DCMAKE_INSTALL_PREFIX=/path/to/install/
- Build STARS-H:
    make -j
- Run the tests (optional):
    make test
- Build the local documentation (optional):
    make docs
- Install STARS-H:
    make install
- Add the line
    export PKG_CONFIG_PATH=/path/to/install/lib/pkgconfig:$PKG_CONFIG_PATH
  to your .bashrc file.
Now you can use the pkg-config executable to collect compiler and linker flags when building applications against STARS-H.
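For example, assuming the installed package is exposed to pkg-config under the name starsh (check the actual .pc file name under /path/to/install/lib/pkgconfig), a C program could be compiled along the lines of:

    gcc my_app.c $(pkg-config --cflags --libs starsh) -o my_app

Here my_app.c is just a placeholder for your own source file.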
The directory examples contains two subfolders: problem and approximation. The sources in problem show how to generate problems (e.g., spatial statistics, minimal or dense) and how to create a STARSH_problem instance, which is required for every step of STARS-H. The examples in approximation build on the problem generation and add the steps needed to approximate the corresponding matrices.
Important notice: the approximation phase does not require the entire dense matrix to be stored, since matrix elements are computed on the fly.
Please see Data.md for information about the datasets.