```python
        raise ValueError("You must provide schema if data is Iterable")
    else:
        raise TypeError(
            f"{type(data).__name__} is not a valid input. Only PyArrow RecordBatchReader, RecordBatch, Iterable[RecordBatch], Table, Dataset or Pandas DataFrame are valid inputs for source."
        )
```
With the Arrow PyCapsule interface, you can support any Arrow-based Python input, regardless of the library implementation.
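As a sketch of what that could look like (illustrative names, not the actual delta-rs code; assumes a pyarrow version, roughly 14.0 or later, whose `pa.table()` accepts PyCapsule producers):

```python
def coerce_to_table(data):
    """Illustrative sketch: accept any Arrow PyCapsule producer.

    The PyCapsule interface is duck-typed, so detection is a hasattr
    check on the dunder methods rather than isinstance checks against
    specific libraries.
    """
    if hasattr(data, "__arrow_c_stream__") or hasattr(data, "__arrow_c_array__"):
        # Assumption: a recent pyarrow can consume PyCapsule producers directly.
        import pyarrow as pa

        return pa.table(data)
    raise TypeError(
        f"{type(data).__name__} does not implement the Arrow PyCapsule interface"
    )
```

Any object implementing those dunders, from any library, would pass through the same code path, with no hard-coded imports of the producing library.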
Use Case
I'm working on a Python library for Arrow-backed geospatial operations, and support for the PyCapsule interface would allow interfacing with my or any other Python Arrow library without needing hard-coded support.
Related Issue(s)
I put up a short initial PR for the write side to get things moving. I think we may be able to wrap some additional classes to export through the PyCapsule interface as well (deferring to `to_pyarrow` under the hood).
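For that export side, a minimal sketch of the wrapping idea (hypothetical class and method names; assumes a `to_pyarrow()`-style method returning an object that itself implements `__arrow_c_stream__`, as pyarrow Tables do in recent versions):

```python
class ArrowStreamExportMixin:
    """Illustrative sketch: export the Arrow C stream by delegating.

    Any class with a to_pyarrow() method (hypothetical name) gains the
    PyCapsule producer dunder, so downstream libraries can consume it
    without knowing about this library.
    """

    def __arrow_c_stream__(self, requested_schema=None):
        # Defer to the wrapped pyarrow object's own PyCapsule export.
        return self.to_pyarrow().__arrow_c_stream__(requested_schema)
```

The delegation keeps the wrapper thin: the pyarrow object already knows how to build the capsule, so the mixin only forwards the call.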
Description
Looking at the code excerpt above, the `write_deltalake` function is hard-coded to support only specific pyarrow and pandas objects (delta-rs/python/deltalake/writer.py, lines 283 to 306 at 6f81b80).