
feat: add eval metrics types to get_experiment_df #1648

Merged · 6 commits · Sep 9, 2022

Conversation

@sararob (Contributor) commented Sep 7, 2022

Add the new Model Evaluation metric schema types to get_experiment_df. With this change, calling get_experiment_df will also return metrics produced by Model Evaluation pipelines.
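To illustrate the behavior this PR enables, here is a minimal sketch of what a `get_experiment_df` result might look like once Model Evaluation metrics are included. The metric and parameter column names below are hypothetical examples, not the SDK's guaranteed schema, and the frame is mocked so the example runs without GCP credentials:

```python
import pandas as pd

# In a real project the frame would come from the Vertex AI SDK, e.g.:
#   from google.cloud import aiplatform
#   aiplatform.init(project="my-project", location="us-central1")
#   df = aiplatform.get_experiment_df("my-experiment")
# Here we mock the returned DataFrame so the example runs offline.
df = pd.DataFrame(
    {
        "experiment_name": ["my-experiment"] * 2,
        "run_name": ["run-1", "run-2"],
        "metric.auRoc": [0.91, 0.88],      # hypothetical eval-pipeline metric
        "metric.logLoss": [0.32, 0.41],    # hypothetical eval-pipeline metric
        "param.learning_rate": [0.01, 0.001],
    }
)

# Select only the metric columns, including those produced by
# Model Evaluation pipelines after this change.
metric_cols = [c for c in df.columns if c.startswith("metric.")]
print(df[metric_cols])
```

The `metric.`/`param.` column-prefix convention shown here is an assumption for illustration; consult the SDK docs for the exact output schema.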

@sararob sararob requested a review from a team as a code owner September 7, 2022 14:09
@product-auto-label bot added labels size: s (Pull request size is small) and api: vertex-ai (Issues related to the googleapis/python-aiplatform API) Sep 7, 2022
@SinaChavoshi (Contributor) left a comment

I think we should change the type names in the schema classes in artifact_schema.py to use the same constants as the definitions in the constants file. However, we can cover that in a separate CL.

@rosiezou added and then removed the do not merge label (Indicates a pull request not ready for merge, due to either quality or timing) Sep 8, 2022
@sararob sararob merged commit 944b03f into googleapis:main Sep 9, 2022
4 participants