Allow definition of Loss and Optimizer in config file. #132
This is an initial implementation that allows defining a PyTorch Loss (Criterion) and Optimizer from within the config file.

Two new tables are introduced, `[criterion]` and `[optimizer]`. We currently use the relatively familiar libpath approach, which avoids having to build a registry of PyTorch criterion and optimizer functions that would need to be loaded at run time (kind of heavy and slow). The keys in each table need to share the same names as the parameters that would be passed to the loss or optimizer constructor (we just use `**config` to pass the params in).
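For illustration, the config might look something like the sketch below. This is hypothetical: the `criterion`/`optimizer` key names holding the libpath are my assumption, as is the idea that the libpath key is popped from the table before the remaining keys are splatted into the constructor.

```toml
# Hypothetical example -- the exact libpath key names are assumptions.
[criterion]
criterion = "torch.nn.CrossEntropyLoss"

[optimizer]
optimizer = "torch.optim.SGD"
lr = 0.01
momentum = 0.9
```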
Within the `__init__` function of either the built-in models or an external class, the user will still need to call:

(I would like to fix that in a follow-up PR - Issue #134.)
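To make the mechanism concrete, here is a minimal sketch of the libpath resolution and the `**config` splat. `resolve_libpath` is an illustrative helper name, not the actual function in this PR, and the table key names are assumed as above:

```python
import importlib

import torch


def resolve_libpath(libpath):
    """Turn a dotted path like "torch.optim.SGD" into the object it names."""
    module_path, _, attr = libpath.rpartition(".")
    return getattr(importlib.import_module(module_path), attr)


# Sketch of consuming the [optimizer] table (key names assumed).
model = torch.nn.Linear(4, 2)  # stand-in for a real model
table = {"optimizer": "torch.optim.SGD", "lr": 0.01, "momentum": 0.9}

optimizer_cls = resolve_libpath(table.pop("optimizer"))
optimizer = optimizer_cls(model.parameters(), **table)  # the **config splat

criterion = resolve_libpath("torch.nn.CrossEntropyLoss")()
```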
Additionally, if the user has a special `def _optimizer()` or `def _criterion()` function defined in the model class, we will not overwrite it (a sketch of that check follows below).

One final follow-up would be to allow a learning rate scheduler (described here) to be defined in the config as well. That won't happen in this PR, though.
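For reference, the "don't overwrite" behavior could look roughly like this. `attach_optimizer` is an illustrative name (reusing the hypothetical `resolve_libpath` from the sketch above), not the PR's actual helper:

```python
def attach_optimizer(model, table):
    # If the model class defines its own _optimizer(), respect it and
    # skip the config-driven construction entirely.
    if hasattr(model, "_optimizer") and callable(model._optimizer):
        return model._optimizer()
    params = dict(table)
    optimizer_cls = resolve_libpath(params.pop("optimizer"))
    return optimizer_cls(model.parameters(), **params)
```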