fix: Use proper length for number of total cells in area encoder #5
For a simple area with 3 nodes, I ran the encoder using the latest code on the master branch and got a long run of zeros at the end of the output. (I can send you the file in a gist if you want.)
Looking at the code, there is a calculation for the size of an area that I don't entirely understand. It seems like it should allocate space for the number of cells encoded as varints, but I'm not sure what the old code was actually calculating. If we do need it, we should probably add a comment documenting the 🧙 magic 🌟 numbers 👍 😄
EDIT: it isn't a big deal, I think. If the predicted length is too long, the output could just be trimmed once it's determined that the actual cell length is smaller than predicted.
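To illustrate what I mean by trimming, here's a minimal sketch (the names are hypothetical, not from this repo): size the buffer by a worst-case estimate per varint, track how many bytes are actually written, and slice off the unused tail so no trailing zeros end up in the output.

```python
def encode_varint(value: int) -> bytes:
    """LEB128-style unsigned varint: 7 data bits per byte, high bit = continuation."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_area(cells: list[int]) -> bytes:
    """Hypothetical area encoder: preallocate worst case, write, then trim."""
    # Worst case: a 32-bit value needs at most 5 varint bytes.
    buf = bytearray(5 * len(cells))
    pos = 0
    for cell in cells:
        encoded = encode_varint(cell)
        buf[pos:pos + len(encoded)] = encoded
        pos += len(encoded)
    # Small values use fewer bytes than the estimate, so return only
    # the bytes actually written -- no zero padding at the end.
    return bytes(buf[:pos])
```

For example, `encode_area([1, 2, 300])` produces 4 bytes instead of the 15-byte worst case, because 1 and 2 fit in one varint byte each and 300 fits in two.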
Before:
After: