do we need a different strategy for compressible payloads? #196

Closed
brayniac opened this issue Apr 11, 2024 · 1 comment
Labels
question Further information is requested

Comments

@brayniac
Contributor

Currently, we take a naive approach: a payload (either a value for cache or a message for pubsub) contains a fixed number of random bytes to achieve a target compression ratio. These bytes are grouped at the head of the payload (ignoring the additional header we pack into pubsub messages to track timing and integrity).
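
As a minimal sketch of that naive construction (the names and the ratio convention are assumptions here, not the project's actual code), with the ratio defined as uncompressed size / compressed size:

```rust
use rand::RngCore; // assumes the `rand` crate

// Hypothetical sketch of the naive scheme: roughly 1/ratio of the
// payload is random (incompressible) and the remainder is a constant
// fill, with all of the entropy grouped at the head.
fn naive_payload(len: usize, target_ratio: f64) -> Vec<u8> {
    let random_len = ((len as f64 / target_ratio).round() as usize).min(len);
    let mut payload = vec![0u8; len];
    rand::thread_rng().fill_bytes(&mut payload[..random_len]);
    payload
}
```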

We should do some analysis to determine whether the entropy needs to be spread throughout the payload, either by shuffling the bytes, by using a reduced set of symbols, or ...

It would be interesting to know whether this even affects the compression/decompression overheads in an appreciable way for expected value/message sizes.
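
One way to answer that empirically (a sketch assuming the `flate2` crate; the encoder choice is arbitrary) is to time a single compression pass over payloads of the expected sizes, comparing head-grouped and shuffled variants:

```rust
use std::io::Write;
use std::time::{Duration, Instant};
use flate2::{write::ZlibEncoder, Compression};

// Compress a payload once and report (compressed size, elapsed time).
fn measure(payload: &[u8]) -> (usize, Duration) {
    let start = Instant::now();
    let mut encoder = ZlibEncoder::new(Vec::new(), Compression::default());
    encoder.write_all(payload).expect("write failed");
    let compressed = encoder.finish().expect("finish failed");
    (compressed.len(), start.elapsed())
}
```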

@brayniac
Contributor Author

brayniac commented Nov 7, 2024

This is not needed after #301 which shuffles the bytes around.
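
For illustration only (a sketch of the idea, not the actual change in #301), shuffling spreads the entropy uniformly across the payload:

```rust
use rand::seq::SliceRandom; // brings `shuffle` into scope

// Build the head-grouped payload, then shuffle it in place so the
// random bytes are no longer concentrated at the head.
fn shuffled_payload(len: usize, target_ratio: f64) -> Vec<u8> {
    let mut payload = naive_payload(len, target_ratio); // sketch above
    payload.shuffle(&mut rand::thread_rng());
    payload
}
```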

@brayniac brayniac closed this as completed Nov 7, 2024