Third Party Compression compatibility #800
-
The output produces:

    Huff0 Stream 0 decodes to at least 1746 bytes.

This indicates that the huff0 stream produces more output than it should.
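For context, the zstd format splits a 4-stream huff0 literals section so that the first three streams each regenerate (Regenerated_Size + 3) / 4 bytes and the fourth regenerates the remainder, which lets a conforming decoder detect a stream that overruns its segment. Below is a minimal Go sketch of that check; checkStreamSizes and the per-stream decoded counts are hypothetical illustrations, not any decoder's real API:

```go
package main

import "fmt"

// checkStreamSizes validates per-stream output sizes for a 4-stream
// huff0 literals section. Per the zstd format, the first three streams
// each regenerate (regenSize+3)/4 bytes and the fourth regenerates the
// remainder. decoded[i] is a hypothetical count of bytes produced while
// decoding stream i.
func checkStreamSizes(regenSize int, decoded [4]int) error {
	segment := (regenSize + 3) / 4
	want := [4]int{segment, segment, segment, regenSize - 3*segment}
	for i, got := range decoded {
		if got > want[i] {
			// The situation the error message above describes: a
			// stream yields more output than its segment allows.
			return fmt.Errorf("huff0 stream %d decodes to at least %d bytes, want %d", i, got, want[i])
		}
	}
	return nil
}

func main() {
	// With a 6962-byte regenerated size each segment is 1741 bytes,
	// so a first stream producing 1746 bytes is rejected.
	fmt.Println(checkStreamSizes(6962, [4]int{1746, 1741, 1741, 1739}))
}
```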
-
Do you have a sample of the huff0 case? I don't think it is possible. My encoder will never generate that case, since it will reject the input as incompressible after doing the histogram. In fact I cannot encode this, since the table is incompressible as FSE and cannot be stored as raw values.

The first check means I will reject trying to compress if no histogram entry is above a minimum count. The second check is if the FSE-encoded table is incompressible and cannot be stored as raw values. Finally, I reject if the actual output plus the table doesn't give a reasonable improvement after compression. This is tweakable, but for zstd it is fixed at a set minimum gain. This is to avoid the decompression penalty when the improvement is marginal.
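A hedged Go sketch of the three rejections described above; all thresholds here (the total>>7 cutoff, tableCost, minGain) are placeholders chosen for illustration, not the encoder's real constants or API:

```go
package main

import (
	"errors"
	"fmt"
)

var errIncompressible = errors.New("input is incompressible")

// rejectIncompressible sketches the three checks described above,
// with placeholder thresholds.
func rejectIncompressible(hist [256]int, total, compressedSize, tableCost, minGain int) error {
	// Check 1: after building the histogram, reject if no entry is
	// above a minimum count, i.e. the symbols are too evenly spread.
	maxCount := 0
	for _, c := range hist {
		if c > maxCount {
			maxCount = c
		}
	}
	if maxCount <= total>>7 { // placeholder threshold
		return errIncompressible
	}
	// Check 2: reject if the Huffman table itself cannot be stored:
	// incompressible as FSE and too large as raw weights. Modeled here
	// as a negative precomputed tableCost.
	if tableCost < 0 {
		return errIncompressible
	}
	// Check 3: reject unless output plus table gives a reasonable
	// improvement over storing the input raw.
	if compressedSize+tableCost > total-minGain {
		return errIncompressible
	}
	return nil
}

func main() {
	var hist [256]int
	hist['a'] = 900 // skewed histogram: passes check 1
	fmt.Println(rejectIncompressible(hist, 1000, 600, 50, 64))
}
```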
-
Hi all,
I am working on ZSTD compression/decompression algorithms. I have used the Facebook zstd decompressor as well as their educational decoder to validate an RTL compression implementation, and found a few issues, stated below.
#1 My RTL-compressed data is not decompressed correctly by the latest Facebook zstd decompressor, whereas their educational decoder handles it.
Note: the production decoder does decode it, but the output data is wrong (something goes wrong when the literals are 4-stream encoded; the sequences are found okay). See the first sketch below.
#2 The educational decoder fails for a few blocks of webster and samba due to a corner case:
all 256 literals have the same weight (say 1), so the Huffman codes would be like 0x1000.
It is not able to find the end, since there is no overflow on the stream. See the second sketch below.
Files:
1st_case.txt.zip
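Regarding #1: per the zstd format, a 4-stream literals payload begins with a 6-byte jump table holding the little-endian 16-bit sizes of streams 1-3, and stream 4 takes the remaining bytes; a mistake here is one plausible way to get 4-stream output that decodes to wrong data while the sequences remain fine. A minimal Go sketch of the split, where splitStreams is a hypothetical helper rather than reference code:

```go
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
)

// splitStreams carves a 4-stream huff0 literals payload into its four
// bitstreams. Per the zstd format, the payload starts with a 6-byte
// jump table holding the little-endian 16-bit sizes of streams 1-3;
// stream 4 takes whatever remains.
func splitStreams(payload []byte) (streams [4][]byte, err error) {
	if len(payload) < 6 {
		return streams, errors.New("payload too short for jump table")
	}
	rest := payload[6:]
	for i := 0; i < 3; i++ {
		size := int(binary.LittleEndian.Uint16(payload[2*i : 2*i+2]))
		if size > len(rest) {
			return streams, fmt.Errorf("stream %d size %d exceeds remaining %d bytes", i+1, size, len(rest))
		}
		streams[i] = rest[:size]
		rest = rest[size:]
	}
	streams[3] = rest // stream 4 is everything after the first three
	return streams, nil
}

func main() {
	// Jump table: streams 1-3 are 2, 3 and 1 bytes; stream 4 gets the rest.
	payload := []byte{2, 0, 3, 0, 1, 0, 0xAA, 0xBB, 1, 2, 3, 9, 7, 8}
	streams, err := splitStreams(payload)
	fmt.Println(streams, err)
}
```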
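Regarding #2: huff0 bitstreams are written forward and read backward, with a single 1-bit appended after the last symbol followed by zero padding, so a decoder must find the end by locating the highest set bit of the final byte rather than waiting for the stream to overflow, even when every code has the same length. A sketch of that initialization, where initBackwardStream is an illustrative helper:

```go
package main

import (
	"errors"
	"fmt"
	"math/bits"
)

// initBackwardStream locates the start of a backward-read bitstream.
// The encoder appends a single 1-bit after the last symbol and
// zero-pads to a byte boundary, so the decoder finds the highest set
// bit of the final byte instead of running off the end of the stream.
func initBackwardStream(stream []byte) (payloadBits int, err error) {
	if len(stream) == 0 {
		return 0, errors.New("empty bitstream")
	}
	last := stream[len(stream)-1]
	if last == 0 {
		// No end marker at all: the stream is corrupted.
		return 0, errors.New("corrupted bitstream: missing end marker")
	}
	padding := 8 - bits.Len8(last) // zero bits above the sentinel 1-bit
	// Everything below the sentinel is payload, even when all codes
	// are equal-length and the stream never "overflows".
	return len(stream)*8 - padding - 1, nil
}

func main() {
	// Final byte 0x01: sentinel in the lowest bit, 7 padding zeros.
	fmt.Println(initBackwardStream([]byte{0xF0, 0x0F, 0x01}))
}
```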