
feat: support dataset update #1416

Merged — 4 commits merged into googleapis:main on Jun 9, 2022
Conversation

@jaycee-li (Contributor) commented on Jun 8, 2022

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • Ensure the tests and linter pass
  • Code coverage does not decrease (if any source code was changed)
  • Appropriate docs were updated (if necessary)

Fixes #1178 🦕

@jaycee-li jaycee-li requested a review from sasha-gitg June 8, 2022 21:10
@product-auto-label product-auto-label bot added size: m Pull request size is medium. api: vertex-ai Issues related to the googleapis/python-aiplatform API. labels Jun 8, 2022
Review threads (resolved):
  • google/cloud/aiplatform/datasets/dataset.py
  • google/cloud/aiplatform/datasets/dataset.py (outdated)
  • tests/system/aiplatform/test_dataset.py
@sasha-gitg added, then removed, the "do not merge" label (indicates a pull request not ready for merge, due to either quality or timing) on Jun 9, 2022
The review discussion below concerns this snippet from tests/system/aiplatform/test_dataset.py:

        assert dataset.gca_resource.description == _TEST_DATASET_DESCRIPTION
    finally:
        if dataset is not None:
sasha-gitg (Member) commented:
This line is superfluous. dataset will only be bound if the assignment from ImageDataset.create succeeds.

The test should start with dataset = None.

jaycee-li (Contributor, author) replied:

If ImageDataset.create succeeds, we want to delete the dataset in the finally block. But if create() fails, dataset is None and we shouldn't call dataset.delete(). So I think this line is necessary.

sasha-gitg (Member) replied:

If create fails, then dataset is not defined, and `if dataset is not None` will throw a NameError rather than skip the cleanup. dataset isn't bound until the right-hand side of `dataset = aiplatform.ImageDataset.create()` completes successfully. You can validate this behavior in Python:

def f():
    raise RuntimeError('')

try:
    dataset = f()        # f() raises, so the assignment never happens
except:
    if dataset is None:  # NameError: the name 'dataset' was never bound
        pass
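The fix the reviewer suggests (bind dataset to None before the try) can be sketched as below. create_resource and the delete() call are hypothetical stand-ins for aiplatform.ImageDataset.create and dataset.delete; the real test would not need the except clause, since pytest reports the failure:

```python
def create_resource():
    # Hypothetical stand-in for aiplatform.ImageDataset.create();
    # here it always fails before returning anything.
    raise RuntimeError("create failed")

dataset = None  # bind the name up front, so the finally guard can never raise NameError
try:
    dataset = create_resource()
except RuntimeError:
    pass  # create failed; dataset is still None
finally:
    if dataset is not None:
        dataset.delete()  # cleanup runs only when create actually succeeded

print(dataset)  # -> None, because create failed
```

With this shape, the `if dataset is not None` guard is both safe and meaningful: it skips cleanup on a failed create instead of crashing with a NameError.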

jaycee-li (Contributor, author) replied:

Got it!

But I see other system tests like test_create_and_import_image_dataset and test_create_tabular_dataset have the same pattern that checks if dataset is not None. Is there any reason for that?

sasha-gitg (Member) replied:

No, those should be eventually fixed as well.

jaycee-li (Contributor, author) replied:

Got it. Just removed this line. Thank you!

@jaycee-li jaycee-li merged commit e3eb82f into googleapis:main Jun 9, 2022