Lumi
Esteban Ferrer edited this page Apr 10, 2024 · 3 revisions
The modules necessary to compile and run HORSES3D are:

module load PrgEnv-gnu
module swap gcc gcc/11.2.0
module load LUMI/22.12
module load ParMETIS/4.0.3-cpeGNU-22.12
module load gnuplot/5.4.6-cpeGNU-22.12
export METIS_HOME=$EBROOTPARMETIS
git apply mpi.patch
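Since a failed `git apply` can leave the working tree partially patched, it can help to dry-run the patch first. A minimal sketch, assuming you run it from the HORSES3D repository root (the helper name is illustrative, not part of HORSES3D):

```shell
# Sketch: dry-run mpi.patch with --check before applying, so a failed
# apply never leaves a partially patched tree.
apply_mpi_patch() {
    if git apply --check mpi.patch; then
        git apply mpi.patch && echo "mpi.patch applied"
    else
        echo "mpi.patch does not apply cleanly; check your checkout" >&2
        return 1
    fi
}
```

Call `apply_mpi_patch` once after loading the modules; re-running it on an already-patched tree will report a clean failure instead of corrupting sources.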
Then, to compile:

make COMPILER=gfortran COMM=PARALLEL WITH_METIS=YES WITH_HDF5=YES MODE=HPC
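Because the build uses WITH_METIS=YES, it is worth confirming that METIS_HOME was exported from the ParMETIS module before invoking make. A minimal sketch (the function name and fallback path are illustrative only):

```shell
# Sketch: guard the build by checking that METIS_HOME is set; it should
# point at $EBROOTPARMETIS after the module setup above.
check_metis() {
    if [ -n "${METIS_HOME:-}" ]; then
        echo "ok: METIS_HOME=${METIS_HOME}"
    else
        echo "missing METIS_HOME; load ParMETIS first" >&2
        return 1
    fi
}

METIS_HOME=/opt/parmetis   # placeholder path for illustration
check_metis
```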
Special flags:
GNU_RELEASE_FLAGS= -cpp -fallow-argument-mismatch -std=legacy -ffree-line-length-0 -O3 -ftree-vectorize \
-ftree-vectorizer-verbose=0 -fbackslash -D_has_Quad -march=znver3 -funroll-loops #-ffast-math
An example slurm script is included below:
#!/bin/bash -l
#SBATCH --partition=standard        # Partition (queue) name
#SBATCH --nodes=1                   # Total number of nodes
#SBATCH --ntasks-per-node=1         # MPI ranks per node
#SBATCH --cpus-per-task=32          # OpenMP threads per task
#SBATCH --time=60                   # Run time (minutes)
#SBATCH --account=project_XXXXXXXXX # Project for billing
#SBATCH --mem=0                     # Memory (0 requests all memory on the node)
module purge
module load PrgEnv-gnu
module swap gcc gcc/11.2.0
module load LUMI/22.12
module load ParMETIS/4.0.3-cpeGNU-22.12
module load gnuplot/5.4.6-cpeGNU-22.12
export METIS_HOME=$EBROOTPARMETIS
export SRUN_CPUS_PER_TASK=32
export OMP_NUM_THREADS=${SRUN_CPUS_PER_TASK}
export OMP_PROC_BIND=close
export OMP_PLACES=cores
export MPICH_CPUMASK_DISPLAY=1
srun ./horses3d.ns naca63a.control
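When changing the node, rank, and thread counts in the script, it helps to check that the resulting core request still fits the node. A minimal sketch, assuming LUMI-C standard nodes with 128 cores (values mirror the script above):

```shell
# Sketch: compute the total core request from the #SBATCH values above.
nodes=1
ntasks_per_node=1
cpus_per_task=32
total_cores=$(( nodes * ntasks_per_node * cpus_per_task ))
echo "total cores requested: ${total_cores}"
```

Submit the job with `sbatch` once the request looks right; OMP_NUM_THREADS is already tied to the per-task CPU count via SRUN_CPUS_PER_TASK in the script.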