Support inserting data into BigQuery directly from a Polars DataFrame #1979
Comments
I have some thoughts on this: I think a separate package for BigQuery + Polars (similar to pandas-gbq) would be most appropriate here, especially given the desire to avoid pyarrow as a dependency.
Thanks for the response, @tswast. It would be great if the BigQuery client accepted Polars DataFrames in addition to pandas DataFrames in `load_table_from_dataframe`. I can also appreciate that a separate package would provide a consolidated reading and writing interface, which I am definitely in support of. Do you have in mind that this package would be developed / owned by Google? If yes, would it be possible / in-scope to support reading from BigQuery into Polars without a pyarrow dependency? At the moment it seems that pyarrow is required for this.

As a heads up, I think I will also request that Polars provide some support for the BigQuery client, and implement the approaches mentioned in their user guide. EDIT: here is that request - pola-rs/polars#18547
Yes, that's my thought, though if the community were to create one before I make it through all the red tape needed to make such a thing happen, I'd gladly contribute there, instead. ;-)
That's my thought with regards to a separate package. No pyarrow necessary if the focus is just polars. The test suite for google-cloud-bigquery is complicated enough as it is without adding a test environment where polars is installed but pyarrow is not.
I know you asked for writes, but I figured I'd try the read path today, and I was able to get BigQuery table -> polars DataFrame without pyarrow. Check out this gist: https://gist.github.com/tswast/99b017b20386e324f5c7d2bd49f21b5f#file-bigquery-to-polars-no-pyarrow-ipynb Obviously, it's single-threaded, missing a lot of boilerplate, and doesn't support query inputs, but as a proof of concept, I was happy to see it's possible.
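For anyone who doesn't want to open the notebook, the general shape is something like the sketch below. This is a minimal, single-threaded illustration rather than the exact code from the gist; the project and table names are placeholders, and it assumes `google-cloud-bigquery` and `polars` are installed without `pyarrow`.

```python
from google.cloud import bigquery
import polars as pl

# Placeholder project for illustration only.
client = bigquery.Client(project="my-project")

# list_rows streams rows over the BigQuery REST API, so pyarrow is not needed.
rows = client.list_rows("my-project.my_dataset.my_table")

# Materialize each Row as a plain dict and let polars infer the schema.
# Fine for a proof of concept; a real implementation would batch the rows
# and map the BigQuery schema to polars dtypes explicitly.
df = pl.from_dicts([dict(row) for row in rows])

print(df.head())
```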
I've confirmed it is possible to write to BigQuery from polars without pyarrow in this gist: https://gist.github.com/tswast/4e2fb2cca1c1fecf8fb697e94102358f

I've mailed pola-rs/polars#20292 to update the polars docs to set this option.
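The write direction follows the same pattern. A rough sketch under the same assumptions (placeholder table name and data, `polars` and `google-cloud-bigquery` installed, no `pyarrow`) looks roughly like this; see the gist and the polars docs PR above for the exact parquet writer settings that keep the file BigQuery-compatible:

```python
import io

from google.cloud import bigquery
import polars as pl

# Placeholder destination table and data, for illustration only.
table_id = "my-project.my_dataset.my_table"
df = pl.DataFrame({"name": ["a", "b"], "value": [1, 2]})

client = bigquery.Client()

# Serialize the DataFrame to parquet in memory with polars' native (Rust)
# writer, so pyarrow is never involved.
buffer = io.BytesIO()
df.write_parquet(buffer)
buffer.seek(0)

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
)
job = client.load_table_from_file(buffer, table_id, job_config=job_config)
job.result()  # Wait for the load job to complete.
```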
Thank you for your Polars PR as well. Is there any movement on creating a separate Polars / BQ package within Google?
Is your feature request related to a problem? Please describe.
The Polars DataFrame library has been gaining a lot of traction and many are writing new pipelines in Polars and/or moving from pandas to Polars. It would be great to add native support between the BigQuery client library and Polars.
Describe the solution you'd like
This request is to allow inserting data directly from a Polars DataFrame into a BigQuery table.
An additional bonus would be not requiring PyArrow to be installed.
I would be open to expanding `client.load_table_from_dataframe` to also accept Polars DataFrames, or new dedicated method(s) being created.

Describe alternatives you've considered
- Convert to pandas at the end of the pipeline and use `client.load_table_from_dataframe` to insert the data. It is not ideal to require an additional dependency just to insert data. Furthermore, I don't believe that pandas supports complex types available in both BigQuery and Polars, such as structs and arrays. (A minimal sketch of this round-trip is shown after this list.)
- Write the DataFrame to a bytes stream as a parquet file and insert the data with `client.load_table_from_file`. The intent of this code is a lot less obvious, and it would be much nicer to have more native support. Note that this is also the suggested approach in the Polars user guide (rightfully so IMO, as it does not require any additional dependencies).
- Do not support Polars directly, but instead support inserting data from a PyArrow table. This is not currently feasible, but would be an alternative feature request. It is not preferable, as the option above already allows inserting data without a PyArrow dependency. From looking at the docs (I haven't checked the source), this potentially has some overlap with what `client.load_table_from_dataframe` already does.
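To make the first alternative concrete, here is a minimal sketch of the pandas round-trip. The table name and data are placeholders, and it assumes pandas and pyarrow are installed, which is exactly the extra dependency this alternative would rather avoid.

```python
from google.cloud import bigquery
import polars as pl

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # placeholder

pl_df = pl.DataFrame({"name": ["a", "b"], "value": [1, 2]})

# The Polars -> pandas conversion itself requires pyarrow, so this path
# pulls in both extra dependencies just to insert data.
job = client.load_table_from_dataframe(pl_df.to_pandas(), table_id)
job.result()  # Wait for the load job to complete.
```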