Commit
Merge pull request #1487 from pyiron/docs_env_update
Docs: Add explanation about pyiron environment variables
jan-janssen authored Jun 20, 2024
2 parents 5003ad0 + 8c95c04 commit 64c5158
Showing 1 changed file with 10 additions and 0 deletions.
10 changes: 10 additions & 0 deletions docs/tutorial.md
@@ -184,6 +184,16 @@ to the queuing system by configuring the same `job.server` property. Still it is
string `executable_str` variable to use `mpiexec` or other means of parallelization to execute the external executable
in parallel.

If the `executable_str` supports multiple cores, multiple threads, or GPU acceleration, these resources can be accessed
via the environment variables `PYIRON_CORES` for the number of CPU cores, `PYIRON_THREADS` for the number of threads,
and `PYIRON_GPUS` for the number of GPUs. For example, you can wrap an MPI-parallel executable using
`mpirun -n ${PYIRON_CORES} executable` and then set the number of cores on the `job` object with
`job.server.cores = 10`. Alternatively, you can create a shell script like `executable.sh`:
```bash
#!/bin/bash
mpirun -n ${PYIRON_CORES} executable
```
This shell script can then be set as the `executable_str`.
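The wrapper's behavior can also be sketched in Python. The helper below is only an illustration of how a wrapper might assemble the command line from `PYIRON_CORES` (the function `build_mpi_command` is not part of the pyiron API; the fallback to one core is an assumption for local testing):

```python
import os
import shlex

def build_mpi_command(executable: str) -> list:
    """Assemble an mpirun command line from the PYIRON_CORES variable.

    pyiron_base exports PYIRON_CORES when the job runs; we fall back
    to 1 so the helper also works outside a pyiron job. This helper
    is illustrative only, not part of the pyiron API.
    """
    cores = int(os.environ.get("PYIRON_CORES", "1"))
    return ["mpirun", "-n", str(cores)] + shlex.split(executable)

# Simulate what pyiron does when job.server.cores = 10 is set:
os.environ["PYIRON_CORES"] = "10"
print(build_mpi_command("executable"))
# prints ['mpirun', '-n', '10', 'executable']
```

The same pattern extends to `PYIRON_THREADS` (e.g. exporting `OMP_NUM_THREADS`) and `PYIRON_GPUS` inside the wrapper script.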

### Jupyter Notebook
The third category of job objects supported by the `pyiron_base` workflow manager is the `ScriptJob`, which is used to
submit Jupyter notebooks to the queuing system. While it is recommended to use `wrap_python_function()` to wrap the
