
TrainingStep's hyperparameters do not work on v2.2.0 #152

Closed
ohbuchim opened this issue Jul 13, 2021 · 5 comments
@ohbuchim

I cannot set TrainingStep's 'hyperparameters' parameter using ExecutionInput on v2.2.0; v2.1.0 works correctly.

When I used v2.1.0, this code worked, and I was able to set hyperparameters when I executed the workflow.

training_step = steps.TrainingStep(
    "SageMaker Training Step",
    estimator=sklearn,
    data={"train": sagemaker.TrainingInput(execution_input["PreprocessingOutputDataTrain"], content_type="text/csv")},
    job_name=execution_input["TrainingJobName"],
    hyperparameters=execution_input["TrainingParameters"],
    wait_for_completion=True,
)

When I used v2.2.0, I needed to add get_schema_as_dict() to the ExecutionInput. After that, 'hyperparameters' was no longer updated by the ExecutionInput; it seems that only the estimator's 'hyperparameters' are applied and the ExecutionInput is ignored.

training_step = steps.TrainingStep(
    "SageMaker Training Step",
    estimator=sklearn,
    data={"train": sagemaker.TrainingInput(execution_input["PreprocessingOutputDataTrain"], content_type="text/csv")},
    job_name=execution_input["TrainingJobName"],
    hyperparameters=execution_input["TrainingParameters"].get_schema_as_dict(),
    wait_for_completion=True,
)
@shivlaks
Contributor

@ohbuchim - thanks for reporting the issue. It does seem to be a regression caused by merging the hyperparameters set in the estimator with the ones passed to the TrainingStep constructor.

In v2.1.0, it would resolve to something like the following, but doing so would also discard any hyperparameters that were established in the estimator:

'HyperParameters.$': "$$.Execution.Input['TrainingParameters']"

Does the workaround actually work, though? I think it just discards the constructor parameter altogether.

To support placeholders, I'm thinking perhaps we should skip merging the estimator-established hyperparameters when a placeholder is provided. Note that this will result in us dropping any hyperparameters that were established in the estimator (as it was with v2.1.0).

what do you think?

@ohbuchim
Author

Hi, thanks for your comments. I expected the estimator's 'hyperparameters' and the ExecutionInput's 'hyperparameters' to be merged.

It would work for me if the estimator's 'hyperparameters' were set as defaults and only the parameters set by ExecutionInput were overridden. That is, if the estimator sets hyperparameters={'A': a, 'B': b, 'C': c} and ExecutionInput sets hyperparameters={'B': bb, 'D': dd}, the final hyperparameters should be {'A': a, 'B': bb, 'C': c, 'D': dd}.
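The merge requested here is ordinary dict-overlay semantics. A minimal Python sketch of the expected behaviour, using plain strings as stand-ins for real hyperparameter values:

```python
# Estimator hyperparameters act as defaults; ExecutionInput values
# override keys that conflict and add keys that are new.
estimator_hp = {'A': 'a', 'B': 'b', 'C': 'c'}
execution_hp = {'B': 'bb', 'D': 'dd'}

# Dict unpacking merges left to right, so later entries win on conflicts.
merged = {**estimator_hp, **execution_hp}
print(merged)  # {'A': 'a', 'B': 'bb', 'C': 'c', 'D': 'dd'}
```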

@wong-a
Contributor

wong-a commented Sep 9, 2021

Some clarification about the expected behaviour for this use case is being discussed in #159 (comment)

@ca-nguyen
Contributor

Fixed the breaking change: if a Placeholder is provided as the hyperparameters input in the TrainingStep constructor, the estimator's hyperparameters are overwritten.

The fix will be included in the next release

Created a new feature request (#163) to facilitate the merging of placeholder hyperparameters with the estimator's parameters.

@wong-a
Contributor

wong-a commented Sep 10, 2021

facilitate the merging of placeholder hyperparameters with the estimator parameters

Just to clarify, this is already possible today. The hyperparameters argument in TrainingStep's constructor just needs to be a dict; TrainingStep merges that dict with the estimator's hyperparameters, and the dict's values can themselves be Placeholders that reference dynamic values.

https://github.com/aws/aws-step-functions-data-science-sdk-python/blob/main/src/stepfunctions/steps/sagemaker.py#L72-L74
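A rough sketch of that key-level merge (a hypothetical simplification, not the SDK's actual code; see the link above for the real implementation):

```python
# Hypothetical sketch of TrainingStep's merge behaviour: the constructor's
# hyperparameters dict is overlaid on the estimator's, so a constructor
# value only replaces an estimator value when the keys collide.
def merge_hyperparameters(estimator_hp, constructor_hp):
    merged = dict(estimator_hp or {})
    merged.update(constructor_hp or {})
    return merged

print(merge_hyperparameters({'lr': '0.1', 'epochs': '10'},
                            {'epochs': '20'}))
# {'lr': '0.1', 'epochs': '20'}
```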

The hyperparameters arg will only overwrite the estimator's values where the keys conflict. If the top-level value is a Placeholder, as with hyperparameters=execution_input["TrainingParameters"] in @ohbuchim's example, the constructor arg wins, producing the following ASL:

"HyperParameters.$": "$$.Execution.Input['TrainingParameters']"

if the estimator sets hyperparameters={'A': a, 'B': b, 'C': c} and ExecutionInput sets hyperparameters={'B': bb, 'D': dd}

To merge the two you need to do the following:

training_step = TrainingStep(...,
  hyperparameters={
    'B': execution_input['TrainingParameters']['B'],
    'D': execution_input['TrainingParameters']['D']
  }
)
