+1 for this - I think this is a vital feature missing in lrzip. I'd love to have the ability to compress a directory (with multiple files and sub-directories in it) whilst using the maximum RAM window. This would also ensure that the hierarchy of the directory would be maintained.
So is the only difference between using `tar -I` and piping into lrzip the calculated memory usage? Will the output file be the same? Can you manually set the window size larger to compensate, or will that cause issues? What are the command options for extracting the resulting file? I was testing extraction of some large files using the `-I` option in tar, but ran into issues with "can not extract in memory, using temp file". Naturally that was really slow.
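A minimal sketch of the piping route, for comparison (flags per the lrzip man page; `mydir` is a placeholder name, and exact stdin handling can vary between lrzip versions, so treat this as something to verify):

```sh
# Compress: tar streams the whole tree as a single archive, so lrzip sees one
# input and can use a large window. -w sets the maximum compression window in
# hundreds of MB (60 ~= 6 GB); -U requests an unlimited window instead.
tar -cf - mydir/ | lrzip -w 60 -o mydir.tar.lrz

# Extract: decompress to stdout and unpack in one pipeline.
lrzcat mydir.tar.lrz | tar -xf -
```

lrzcat ships alongside lrzip/lrunzip and writes the decompressed data to stdout, so no intermediate .tar has to be written to disk during extraction.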
Cool, that wiki page only shows the `-I` way of using tar. I tried that and it works, but it says it could not fit the whole file into RAM and then started using a temp file instead of extracting directly to the destination. Since the piping version is better, would I simply use the same piping command but with the decompress option for tar? Would that solve the temp file issue? Or would I need to decompress with lrzip first and pipe that into tar? Still very new to piping.
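One way to sidestep the temp file is to let lrzip read the .lrz archive directly (so its input stays seekable) and only pipe the decompressed tar stream. A sketch, assuming the lrzcat/lrunzip helpers from the lrzip package are installed and using placeholder filenames; whether this fully avoids temporary files may still depend on the lrzip version:

```sh
# lrzcat reads the seekable .lrz file itself; only the decompressed tar stream
# goes through the pipe, and tar unpacks it straight to the destination.
lrzcat backup.tar.lrz | tar -xf -

# Two explicit steps with lrunzip (equivalent to lrzip -d), if preferred:
#   lrunzip backup.tar.lrz
#   tar -xf backup.tar
```

In other words, this is the "decompress with lrzip and pipe that into tar" variant, rather than handing the compressed file to tar's `-I` option.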
A long-missed feature of `lrzip` is the ability to zip more than one file at a time. While `tar --use-compress-program=lrzip` can work, the compression window gets reduced as a result. `lrzip` accepts wildcards for compression or decompression, but each file is compressed individually. A better way would be to construct a list of the selected files - whether from wildcards or multiple file entries on the command line - and add that list of files to a single lrz file, chaining them one by one just like stream blocks are chained.