This repository hosts submission scripts and a framework for hyperparameter optimization of the models defined in the main library. It succeeds the original hyperparameter optimization repository, which became obsolete when the main library was refactored to use PyTorch Lightning.
- Configure your models with YAML files only
- Submission files for SLURM
- Upload your results to the Weights & Biases platform
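A SLURM submission file of the kind shipped with this repository might look like the following minimal sketch. All resource values, the environment name, and the script/config file names are placeholders, not the repository's actual settings:

```shell
#!/bin/bash
# Hypothetical SLURM batch file; adjust resources to your cluster.
#SBATCH --job-name=hpo-tune        # job name shown in the queue
#SBATCH --time=24:00:00            # wall-time limit
#SBATCH --gres=gpu:1               # request one GPU
#SBATCH --output=logs/%x-%j.out    # per-job stdout/stderr

# Activate the mamba environment set up for the main library
# (environment name is a placeholder).
source activate main-library-env

# Start training from a YAML config (script and file names are placeholders).
python3 train.py --config config.yaml
```

Such a file would be submitted with `sbatch`.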
- `scripts/`: This is the main content: all the config files and submission scripts to start training are here
- `bin/`: Helper scripts that might be worth adding to your `$PATH` (i.e., that apply to all experiments/subprojects)
- `src/hpo2/`: Python package with additional helper files that don't have a place in the main library (currently empty)
First, follow the instructions from the main library to set up the mamba environment and install the main library. Then run

```bash
pip3 install --editable '.[dev,testing]'
```

for this library.
For the helper scripts, install xonsh and run the linking script:

```bash
pip3 install xonsh
xonsh link_bin.xsh
```
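Presumably, `link_bin.xsh` links the helper scripts from `bin/` into a directory on your `$PATH`. The equivalent logic in plain Python might look like the following sketch (the function name and directory layout are assumptions for illustration, not taken from the actual script):

```python
from pathlib import Path


def link_bin(bin_dir: Path, target_dir: Path) -> list[Path]:
    """Symlink every script in bin_dir into target_dir (e.g. ~/.local/bin).

    Existing entries in target_dir are left untouched; the list of newly
    created symlinks is returned.
    """
    target_dir.mkdir(parents=True, exist_ok=True)
    created = []
    for script in sorted(bin_dir.iterdir()):
        link = target_dir / script.name
        if not link.exists():
            link.symlink_to(script.resolve())
            created.append(link)
    return created
```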
- wandb-osh: package to trigger wandb syncs from compute nodes without internet access
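The idea behind wandb-osh is that the training job on the offline compute node drops small trigger files into a communication directory, which a watcher process on an internet-connected login node picks up to run `wandb sync` on the flagged run directories. A rough stdlib-only illustration of the trigger-file side of that mechanism (the function, directory name, and file format here are assumptions for illustration; use the package's own hooks in practice):

```python
from pathlib import Path


def trigger_sync(run_dir: Path, comm_dir: Path) -> Path:
    """Flag run_dir for syncing by writing a trigger file into comm_dir.

    A watcher on an internet-connected node reads the run directory path
    from the trigger file and syncs it to Weights & Biases.
    """
    comm_dir.mkdir(parents=True, exist_ok=True)
    # One trigger file per run, named after the run directory.
    trigger = comm_dir / f"{run_dir.name}.command"
    trigger.write_text(str(run_dir.resolve()))
    return trigger
```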