Comment from Xavier Lange (xrl) @ 2019-04-09T19:22:29.534+0000:
[~csun] [~sadikovi] what do you think of this potentially breaking change? I need to confirm the backwards compatibility but I think it might still be a useful change.
Comment from Ivan Sadikov (sadikovi) @ 2019-04-09T19:33:52.271+0000:
[~xrl] Yes, sure. I will be happy to review if you open a PR with the changes. We can create a new method, "write_batch_iter", which implements the new API, and make "write_batch" call the new method, since, as you pointed out, a slice implements IntoIter.
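The layering suggested above can be sketched as follows. This is a hypothetical, simplified illustration, not the actual parquet crate API: the `Writer` type, the `i32` value type, and the return values are stand-ins for the real `ColumnWriterImpl` machinery.

```rust
// Illustrative sketch: keep the existing slice-based `write_batch` as a
// thin wrapper over a new iterator-based `write_batch_iter`.
struct Writer {
    written: usize,
}

impl Writer {
    // New API: accepts anything that can be turned into an iterator of values.
    fn write_batch_iter<I>(&mut self, values: I) -> usize
    where
        I: IntoIterator<Item = i32>,
    {
        let mut count = 0;
        for _v in values {
            // ...the real writer would encode the value here...
            count += 1;
        }
        self.written += count;
        count
    }

    // Old API: delegates to the new one. Iterating a slice yields
    // references, which we copy since i32 is `Copy`.
    fn write_batch(&mut self, values: &[i32]) -> usize {
        self.write_batch_iter(values.iter().copied())
    }
}

fn main() {
    let mut w = Writer { written: 0 };
    // Existing slice-based calls keep working unchanged...
    assert_eq!(w.write_batch(&[1, 2, 3]), 3);
    // ...while the new API can consume a lazy iterator directly,
    // with no intermediate Vec allocation.
    assert_eq!(w.write_batch_iter((0..5).map(|x| x * 2)), 5);
    assert_eq!(w.written, 8);
}
```

This keeps the breaking-change surface small: callers of the old signature are untouched, and only code that wants the allocation-free path needs to use the new method.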
jorgecarleitao changed the title from "[Parquet] Use IntoIter trait for write_batch/write_mini_batch" to "Use IntoIter trait for write_batch/write_mini_batch" on Apr 29, 2021.
Note: migrated from original JIRA: https://issues.apache.org/jira/browse/ARROW-5153
Writing data to a parquet file requires a lot of copying and intermediate Vec creation. Take a record struct like:
```rust
struct MyData {
    name: String,
    address: Option<String>,
}
```
Over the course of working with sets of this data, you'll have the bulk-data Vec, the names column in a Vec<&String>, and the address column in a Vec<Option<&String>>. This puts extra memory pressure on the system; at minimum we have to allocate a Vec the same size as the bulk data even if we are using references.
What I'm proposing is to use an IntoIter style. This will maintain backward compatibility, as a slice automatically implements IntoIter. ColumnWriterImpl#write_batch would go from `values: &[T::T]` to `values: IntoIter<Item = T::T>`. Then you can do things like
```rust
write_batch(bulk.iter().map(|x| x.name), None, None)
write_batch(bulk.iter().map(|x| x.address), Some(bulk.iter().map(|x| x.is_some())), None)
```
and you can see there's no need for an intermediate Vec, so no short-term allocations to write out the data.
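A self-contained sketch of this column-extraction pattern, under the assumption that the writer accepts any `IntoIterator`. The `write_batch` below is a hypothetical stand-in that just counts values and definition levels rather than encoding anything; the real signature in the parquet crate differs.

```rust
// Row type from the example above.
struct MyData {
    name: String,
    address: Option<String>,
}

// Hypothetical iterator-accepting writer: counts values and def levels
// instead of encoding them, to show that the columns stream lazily.
fn write_batch<'a, V, D>(values: V, def_levels: Option<D>) -> (usize, usize)
where
    V: IntoIterator<Item = &'a String>,
    D: IntoIterator<Item = i16>,
{
    let v = values.into_iter().count();
    let d = def_levels.map(|d| d.into_iter().count()).unwrap_or(0);
    (v, d)
}

fn main() {
    let bulk = vec![
        MyData { name: "a".into(), address: Some("x".into()) },
        MyData { name: "b".into(), address: None },
    ];

    // Required column: every row yields a value, no def levels needed.
    let (names, _) = write_batch(bulk.iter().map(|x| &x.name), None::<Vec<i16>>);

    // Optional column: only the Some rows yield values, and a parallel
    // iterator of definition levels marks which rows are present.
    let (addrs, defs) = write_batch(
        bulk.iter().filter_map(|x| x.address.as_ref()),
        Some(bulk.iter().map(|x| x.address.is_some() as i16)),
    );

    assert_eq!(names, 2);
    assert_eq!((addrs, defs), (1, 2));
}
```

Both column iterators borrow directly from `bulk`, so no per-column Vec is materialized before the write.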
I am writing data with many columns and I think this would really help to speed things up.