Describe the bug
When exporting a tensor array (a kind of extension array) as a record batch, PyArrow segfaults. This does not happen if the batch is exported as a stream.
To Reproduce
The following test will fail in arrow-pyarrow-integration-testing/tests/test_sql.py:
def test_tensor_array():
    tensor_type = pa.fixed_shape_tensor(pa.float32(), [2, 3])
    inner = pa.array([float(x) for x in range(1, 7)] + [None] * 12, pa.float32())
    storage = pa.FixedSizeListArray.from_arrays(inner, 6)
    f32_array = pa.ExtensionArray.from_storage(tensor_type, storage)
    # Round-tripping as an array gives back the storage type, because arrow-rs
    # has no notion of extension types.
    b = rust.round_trip_array(f32_array)
    assert b == f32_array.storage
    batch = pa.record_batch([f32_array], ["tensor"])
    b = rust.round_trip_record_batch(batch)
    assert b == batch
    del b
Expected behavior
We should round trip the array type successfully.
Additional context
The record batch is exported by exporting each column array individually, which separates the extension arrays from their metadata. I suspect PyArrow segfaults because it first receives a plain storage array and is only later told, via the final schema, that the column is an extension type.