Hi.
This is a great package, congratulations!
I have a huge dump file, and I'm only interested in analysing the entries and exporting just a few tables to the file system.
I actually want to convert the data to Parquet, but saving each table as a gzipped CSV file would also work.
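For context, once a table is available as a gzipped CSV, the Parquet step on my side is trivial with pandas and pyarrow; the file names below are just examples:

```python
import pandas as pd

# Read a gzipped CSV (pandas infers the compression from the ".gz" suffix)
# and write it back out as Parquet. Requires pyarrow (or fastparquet).
df = pd.read_csv("orders.csv.gz")
df.to_parquet("orders.parquet", index=False)
```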
The problem is that the package always exports every table and blob to a temporary file without compression.
That takes a lot of time and disk space, and I don't need it.
An option to not cache anything at all would be great.
And a method to export the contents of an entry, with parameters for the output filename and whether it should be gzipped, would be perfect. A rough sketch of what I have in mind is below.
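This is only an illustration of the behaviour I'm after, not a patch against the package's code; streaming rows straight into an optionally gzipped CSV is already easy with the standard library:

```python
import csv
import gzip

def export_rows_to_csv(rows, header, filename, use_gzip=True):
    """Write rows to a CSV file, gzip-compressed when use_gzip is True (illustrative sketch)."""
    opener = gzip.open if use_gzip else open
    with opener(filename, "wt", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh)
        writer.writerow(header)
        writer.writerows(rows)

# Example with made-up data:
export_rows_to_csv(
    rows=[(1, "alice"), (2, "bob")],
    header=["id", "name"],
    filename="users.csv.gz",
)
```

Something equivalent exposed per entry, plus a flag to skip the temporary-file cache, would cover my use case completely.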
Thank you!