Update Checkpoint Loader docs with an example (#6871)
Related to #6687, this adds some documentation on how to resume a network from a checkpoint. I did not find this documented anywhere in MONAI, and it is rather important in my opinion. Please briefly check the doc syntax, as I don't have much experience with it. The code itself is from my own repository and works.

What is potentially still missing here is some information on when to use the CheckpointLoader. As far as I understood the docs, it is mostly meant to resume interrupted training runs. But what about 1) pure inference runs, where the state of the trainer does not matter, only that of the evaluator, and 2) resuming training at epoch 200 but with a learning rate reset (e.g. DeepEdit: train without clicks for the first 200 epochs, then 200 more epochs with clicks on top)? In my experience, 1) works well, and so does 2) if you modify the loaded state_dict to exclude e.g. the learning rate scheduler. A hedged sketch of this setup follows the checklist below.

### Description

A few sentences describing the changes proposed in this pull request.

### Types of changes

<!--- Put an `x` in all the boxes that apply, and remove the not applicable items -->

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.

---------

Signed-off-by: Matthias Hadlich <matthiashadlich@posteo.de>
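For reference, here is a minimal sketch of case 2: restoring the network and optimizer from a checkpoint while leaving the learning rate schedule (and epoch counter) out of the restored state. This is not the code added to the docs in this PR; the checkpoint path, toy dataset, and network configuration are assumptions for illustration only.

```python
import torch
from torch.utils.data import DataLoader

from monai.data import Dataset
from monai.engines import SupervisedTrainer
from monai.handlers import CheckpointLoader, LrScheduleHandler
from monai.losses import DiceLoss
from monai.networks.nets import UNet

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

net = UNet(spatial_dims=3, in_channels=1, out_channels=1, channels=(8, 16, 32), strides=(2, 2)).to(device)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100)

# Toy dataset so the snippet is self-contained; replace with real data.
data = [{"image": torch.rand(1, 32, 32, 32), "label": (torch.rand(1, 32, 32, 32) > 0.5).float()}]
train_loader = DataLoader(Dataset(data), batch_size=1)

trainer = SupervisedTrainer(
    device=device,
    max_epochs=400,
    train_data_loader=train_loader,
    network=net,
    optimizer=optimizer,
    loss_function=DiceLoss(sigmoid=True),
    train_handlers=[LrScheduleHandler(lr_scheduler=lr_scheduler)],
)

# Restore only the network and optimizer from a checkpoint previously written by
# CheckpointSaver with matching keys (path and key names are assumptions here).
# Leaving "lr_scheduler" and "trainer" out of load_dict means the schedule and the
# epoch counter start fresh, i.e. the "learning rate reset" scenario above.
# To resume an interrupted run instead, include them:
#   load_dict = {"net": net, "opt": optimizer, "lr_scheduler": lr_scheduler, "trainer": trainer}
load_dict = {"net": net, "opt": optimizer}
CheckpointLoader(load_path="./runs/checkpoint_epoch=200.pt", load_dict=load_dict).attach(trainer)

trainer.run()
```

For case 1 (pure inference), the same handler can be attached to an evaluator instead, typically with only `load_dict={"net": net}`, since the trainer state is irrelevant there.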