TrainingStep's hyperparameters does not work on v2.2.0 #152
Comments
@ohbuchim - thanks for reporting the issue. It does seem to be a regression caused by merging the hyperparameters set in the estimator with the ones established in the TrainingStep constructor. A possible workaround is to set the hyperparameters in the constructor only, but doing so would also get rid of any hyperparameters that were established in the estimator.
Does the workaround actually work, though? I think it just discards the constructor parameter altogether. To support placeholders, I'm thinking perhaps we should skip merging estimator-established hyperparameters if a placeholder is provided. Note that this will result in us dropping any hyperparameters that were established in the estimator. What do you think? |
Hi, thanks for your comments. I expected the estimator's 'hyperparameters' and ExecutionInput's 'hyperparameters' to be merged. What works for me is for the estimator's 'hyperparameters' to act as defaults, with only the parameters set by ExecutionInput updated. I mean, if the estimator sets hyperparameters={'A': a, 'B': b, 'C': c} and ExecutionInput sets hyperparameters={'B': bb, 'D': dd}, the final hyperparameters should be {'A': a, 'B': bb, 'C': c, 'D': dd}. |
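The merge behaviour expected here can be sketched with plain Python dicts (an illustration of the desired semantics only, not the SDK's actual code):

```python
# Hyperparameters established in the estimator act as defaults.
estimator_hps = {'A': 'a', 'B': 'b', 'C': 'c'}

# Values supplied at execution time via ExecutionInput override them.
execution_hps = {'B': 'bb', 'D': 'dd'}

# Desired result: execution-time values win, and estimator values
# not mentioned in the input are preserved.
merged = {**estimator_hps, **execution_hps}
print(merged)  # {'A': 'a', 'B': 'bb', 'C': 'c', 'D': 'dd'}
```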
Some clarification about the expected behaviour for this use case is being discussed in #159 (comment) |
Fixed the breaking change: if a Placeholder is provided as input to hyperparameters in the TrainingStep constructor, the estimator hyperparameters are overwritten. The fix will be included in the next release. Created a new feature request (#163) to facilitate the merging of placeholder hyperparameters with the estimator parameters. |
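The fixed resolution rule described above can be sketched as follows (a simplified illustration using a hypothetical Placeholder class and resolve_hyperparameters helper, not the SDK's internals):

```python
class Placeholder:
    """Stand-in for an ExecutionInput placeholder (hypothetical)."""


def resolve_hyperparameters(estimator_hps, constructor_hps):
    # Per the fix: a Placeholder passed to the constructor skips merging
    # and overwrites the estimator-established hyperparameters entirely.
    if isinstance(constructor_hps, Placeholder):
        return constructor_hps
    # Otherwise merge, with constructor values taking precedence.
    return {**(estimator_hps or {}), **(constructor_hps or {})}


ph = Placeholder()
print(resolve_hyperparameters({'lr': 0.1}, ph) is ph)        # True
print(resolve_hyperparameters({'lr': 0.1}, {'epochs': 5}))   # {'lr': 0.1, 'epochs': 5}
```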
Just to clarify, this is possible today. Passing the placeholder directly resolves to "HyperParameters.$": "$$.Execution.Input['TrainingParameters']", which replaces the hyperparameters wholesale rather than merging them.
To merge the two you need to do the following: training_step = TrainingStep(...,
    hyperparameters={
        'B': execution_input['TrainingParameters']['B'],
        'D': execution_input['TrainingParameters']['D']
    }
) |
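With per-key placeholders as above, the rendered HyperParameters field would take roughly this shape (a sketch extrapolated from the ".$" JSONPath convention shown earlier, not verified SDK output):

```python
# Illustrative shape of the rendered state-definition fragment when
# individual execution-input keys are used as placeholders.
rendered = {
    "HyperParameters": {
        "B.$": "$$.Execution.Input['TrainingParameters']['B']",
        "D.$": "$$.Execution.Input['TrainingParameters']['D']",
    }
}
print(sorted(rendered["HyperParameters"]))  # ['B.$', 'D.$']
```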
I cannot set TrainingStep's 'hyperparameters' parameter using ExecutionInput on v2.2.0; v2.1.0 works well.
When I used v2.1.0, this code worked and I was able to set hyperparameters when I executed the workflow.
When I used v2.2.0, I needed to add get_schema_as_dict() to ExecutionInput. After that, 'hyperparameters' was not updated by ExecutionInput. It seems that only the estimator's 'hyperparameters' setting is applied (ExecutionInput is ignored).