
feat: Use validated local storage for data uploads #1612


Merged
12 commits merged into main on Apr 14, 2025

Conversation

TrevorBergeron
Contributor

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • Ensure the tests and linter pass
  • Code coverage does not decrease (if any source code was changed)
  • Appropriate docs were updated (if necessary)

Fixes #<issue_number_goes_here> 🦕

product-auto-label bot added the size: m (Pull request size is medium.) and api: bigquery (Issues related to the googleapis/python-bigquery-dataframes API.) labels on Apr 10, 2025
product-auto-label bot added the size: l (Pull request size is large.) label and removed the size: m label on Apr 11, 2025
TrevorBergeron marked this pull request as ready for review on April 12, 2025 01:12
TrevorBergeron requested review from a team as code owners on April 12, 2025 01:12
TrevorBergeron requested a review from ZehaoXU on April 12, 2025 01:12
TrevorBergeron changed the title from "feat: Use validated local storage for load jobs" to "feat: Use validated local storage for data uploads" on Apr 12, 2025
TrevorBergeron requested reviews from tswast and sycai and removed the request for ZehaoXU on April 12, 2025 01:26
pandas_dataframe_copy.columns = pandas.Index(new_col_ids)
pandas_dataframe_copy[ordering_col] = np.arange(pandas_dataframe_copy.shape[0])

timedelta_cols = utils.replace_timedeltas_with_micros(pandas_dataframe_copy)
Contributor

you may want to remove this function from utils.

Contributor Author

makes sense, yeah, done
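
For context, the helper under discussion converts pandas timedelta columns into integer microsecond values before upload. A minimal sketch of that kind of conversion, assuming an illustrative name and signature rather than the library's actual implementation:

import pandas as pd

def _replace_timedeltas_with_micros(df: pd.DataFrame) -> list[str]:
    # Convert each timedelta64 column to an integer microsecond count in
    # place, returning the names of the converted columns so the caller can
    # restore the duration interpretation after the load completes.
    converted = []
    for col in df.columns:
        if pd.api.types.is_timedelta64_dtype(df[col]):
            # Floor-divide by one microsecond to get exact integer micros.
            df[col] = df[col] // pd.Timedelta(microseconds=1)
            converted.append(col)
    return converted

In the diff above, a return value of this shape is what gets captured as timedelta_cols.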


# JSON support incomplete
for item in data.schema.items:
    utils.validate_dtype_can_load(item.column, item.dtype)
Contributor

Can we move validate_dtype_can_load into loader.py? It was put in utils.py so that it could be referenced from two places.

Contributor Author

sure, done
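
For reference, the loop above invokes the validator once per column of the local data. A minimal sketch of the shape such a check could take, with a purely hypothetical restriction (Arrow union types) standing in for the real rules that now live in loader.py:

import pyarrow as pa

def validate_dtype_can_load(column: str, arrow_type: pa.DataType) -> None:
    """Raise if a column's type is not supported by the local upload path.

    Illustrative only: the real helper works on bigframes schema items and
    its actual restrictions are defined in loader.py.
    """
    # Hypothetical example restriction: union types have no BigQuery
    # equivalent, so reject them with a descriptive error.
    if pa.types.is_union(arrow_type):
        raise NotImplementedError(
            f"Column {column!r} has unsupported type {arrow_type} for upload."
        )

Moving the function into loader.py, as suggested, keeps these restrictions local to the loader.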

@@ -763,3 +740,39 @@ def _transform_read_gbq_configuration(configuration: Optional[dict]) -> dict:
configuration["jobTimeoutMs"] = timeout_ms

return configuration


def _search_for_nested_json_type(arrow_type: pa.DataType) -> bool:
Contributor

naming nit: _has_json_arrow_type(...) ?

Contributor Author

done
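
For reference, a recursive search over Arrow types is straightforward to sketch with pyarrow. The snippet below is illustrative only: detecting a JSON column via an extension type whose registered name mentions "json" is an assumption, and the library's real predicate may differ. A validator like the one discussed above could call such a helper to reject JSON nested inside list or struct columns.

import pyarrow as pa

def _is_json_arrow_type(arrow_type: pa.DataType) -> bool:
    # Assumption: JSON columns arrive as an Arrow extension type whose
    # registered name contains "json"; adjust to the actual dtype in use.
    return (
        isinstance(arrow_type, pa.ExtensionType)
        and "json" in arrow_type.extension_name.lower()
    )

def _has_json_arrow_type(arrow_type: pa.DataType) -> bool:
    # True if the type itself is JSON or contains JSON anywhere inside a
    # (possibly deeply) nested list or struct type.
    if _is_json_arrow_type(arrow_type):
        return True
    if pa.types.is_list(arrow_type) or pa.types.is_large_list(arrow_type):
        return _has_json_arrow_type(arrow_type.value_type)
    if pa.types.is_struct(arrow_type):
        return any(_has_json_arrow_type(field.type) for field in arrow_type)
    return False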

we're using a workaround: storing JSON as strings and then parsing them into JSON
objects.
TODO(b/395912450): Remove the workaround once b/374784249 is resolved.
"""
Contributor

We should probably add something to the Python docstring to indicate that an error will be raised if the validation fails.

Contributor Author

added docstring raises
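
The exact wording of the added Raises entry is not shown in this thread; as a hedged example, a Google-style docstring section covering the new validation might read like the following (the function name and exception type here are assumptions, not necessarily what the PR uses):

def read_pandas(pandas_dataframe):
    """Load a local pandas DataFrame into a BigQuery DataFrames object.

    Raises:
        NotImplementedError:
            If a column has a dtype that fails upload validation, for
            example a JSON value nested inside a list or struct column.
    """
    ...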

TrevorBergeron enabled auto-merge (squash) on April 14, 2025 22:17
TrevorBergeron merged commit aee4159 into main on Apr 14, 2025
24 checks passed