Support configuring `types_mapper` in `read_gbq` #45
googleapis/python-bigquery#1529 and googleapis/python-bigquery#1547 have recently added arguments for overriding the default type conversions performed by `record_batch.to_pandas()`. This allows, for example, loading string data directly into dtype `string[pyarrow]` (which can be quite a bit more efficient) without doing any expensive conversions after the fact.

I think we could basically just copy the implementation from the above PRs, same kwarg names and everything. Anyone see any potential issues @jrbourbeau @j-bennet @ncclementi?