Is your feature request related to a problem or challenge? Please describe what you are trying to do.
Add support to the arrow CSV writer for writing lists.
Currently, a sqllogictest of the form:

```
query T
SELECT array_agg(c13) FROM (SELECT * FROM aggregate_test_100 ORDER BY c13 LIMIT 2) test
----
[0VVIHzxWtNOFLtnhjHEKjXaJOSLJfm0keZ5G8BffGwgF2RwQD59TFzMStxCB]
```

will fail with:

```
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: CsvError("CSV Writer does not support List(Field { name: \"item\", data_type: Utf8, nullable: true, dict_id: 0, dict_is_ordered: false, metadata: {} }) data type")', datafusion/core/tests/sqllogictests/src/main.rs:134:33
```
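For reference, the same limitation reproduces directly against arrow-csv, independent of datafusion. A minimal sketch, assuming a recent arrow-rs with the `csv` feature enabled (the builder APIs differ slightly between versions):

```rust
use std::sync::Arc;

use arrow::array::{ArrayRef, ListBuilder, StringBuilder};
use arrow::csv::Writer;
use arrow::datatypes::{DataType, Field, Schema};
use arrow::record_batch::RecordBatch;

fn main() -> arrow::error::Result<()> {
    // Build a single-row List<Utf8> column, mirroring `array_agg(c13)`.
    let mut builder = ListBuilder::new(StringBuilder::new());
    builder.values().append_value("0VVIHzxWtNOFLtnhjHEKjXaJOSLJfm");
    builder.append(true);
    let list: ArrayRef = Arc::new(builder.finish());

    let item = Arc::new(Field::new("item", DataType::Utf8, true));
    let schema = Schema::new(vec![Field::new("c13", DataType::List(item), true)]);
    let batch = RecordBatch::try_new(Arc::new(schema), vec![list])?;

    // Errors with: CsvError("CSV Writer does not support List(...) data type")
    let mut writer = Writer::new(std::io::stdout());
    writer.write(&batch)?;
    Ok(())
}
```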
This would be an upstream issue in arrow-rs if we wish to add support for it, and I would be happy to review a PR adding that support.
That being said, I'm not sure how nested data would be encoded in CSV; I had understood the format to be strictly tabular. The docs for the Arrow Python bindings appear to suggest that only primitive types are supported, although I haven't tested this.
Source of failure in arrow-csv: https://github.com/apache/arrow-rs/blob/master/arrow-csv/src/writer.rs#L228
I'm not sure if this makes sense to implement upstream (in arrow-csv) or as part of datafusion's test harness.
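Regarding the test-harness option: one approach is to pre-render list columns to strings before the batch reaches the CSV writer. A minimal sketch, assuming a recent arrow-rs and its `arrow::util::display::array_value_to_string` helper; `stringify_batch` is a hypothetical name, not an existing API:

```rust
use std::sync::Arc;

use arrow::array::{ArrayRef, StringArray};
use arrow::datatypes::{DataType, Field, Schema};
use arrow::error::Result;
use arrow::record_batch::RecordBatch;
use arrow::util::display::array_value_to_string;

/// Replace List columns with Utf8 columns holding their display form,
/// so the resulting batch only contains types the CSV writer accepts.
fn stringify_batch(batch: &RecordBatch) -> Result<RecordBatch> {
    let mut fields = Vec::new();
    let mut columns: Vec<ArrayRef> = Vec::new();

    for (field, column) in batch.schema().fields().iter().zip(batch.columns()) {
        match field.data_type() {
            DataType::List(_) | DataType::LargeList(_) => {
                // Render each row with arrow's display utility.
                let rendered = (0..column.len())
                    .map(|row| array_value_to_string(column, row))
                    .collect::<Result<Vec<_>>>()?;
                fields.push(Field::new(
                    field.name().to_string(),
                    DataType::Utf8,
                    field.is_nullable(),
                ));
                columns.push(Arc::new(StringArray::from(rendered)) as ArrayRef);
            }
            _ => {
                // Pass other columns through unchanged.
                fields.push(field.as_ref().clone());
                columns.push(column.clone());
            }
        }
    }

    RecordBatch::try_new(Arc::new(Schema::new(fields)), columns)
}
```

The written CSV would then contain a bracketed rendering similar to what the sqllogictest above expects, without arrow-csv having to pick an encoding for nested types.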