Currently, every time a file within the archive is opened, a new copy of the decompressed stream is created and read until the start of the file in question is reached. This means there is a performance hit that gets worse as you descend into the archive: to read the 10th file, you have to read and discard the first 9 files; to read the 100th file, you have to read past 99 files; and so on.
A performance improvement would be to keep the decompressed stream reader around for any future files that are further along in the archive (you can't go backwards).
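The issue doesn't identify the language or library involved, but as a minimal sketch of the idea, here is a hypothetical Go wrapper around the standard archive/tar and compress/gzip packages. The `seekingReader` type, its `entry` method, and the `open` callback are illustrative names, not part of any real API; the point is the caching logic: restart the stream only on a backward seek, otherwise skip forward from the current position.

```go
package main

import (
	"archive/tar"
	"bytes"
	"compress/gzip"
	"fmt"
	"io"
)

// seekingReader (hypothetical) caches a decompressed tar stream so that
// opening a file further along in the archive only skips forward from
// the current position instead of re-reading from the start. Seeking
// backwards still restarts the stream, since gzip is forward-only.
type seekingReader struct {
	open func() (io.Reader, error) // reopens the raw compressed stream
	tr   *tar.Reader               // cached decompressed reader
	pos  int                       // index of the next entry tr will return
}

// entry positions the cached reader at entry i and returns its header.
// The entry's contents can then be read from s.tr before the next seek.
func (s *seekingReader) entry(i int) (*tar.Header, error) {
	// Restart from the beginning only on first use or a backward seek.
	if s.tr == nil || i < s.pos {
		raw, err := s.open()
		if err != nil {
			return nil, err
		}
		gz, err := gzip.NewReader(raw)
		if err != nil {
			return nil, err
		}
		s.tr = tar.NewReader(gz)
		s.pos = 0
	}
	// Skip forward over intervening entries; nothing before the
	// current position is read again.
	var hdr *tar.Header
	for s.pos <= i {
		var err error
		if hdr, err = s.tr.Next(); err != nil {
			s.tr = nil // cache is no longer positioned correctly
			return nil, err
		}
		s.pos++
	}
	return hdr, nil
}

func main() {
	// Build a tiny gzipped tar archive in memory for demonstration.
	var buf bytes.Buffer
	gz := gzip.NewWriter(&buf)
	tw := tar.NewWriter(gz)
	for i := 0; i < 5; i++ {
		body := []byte(fmt.Sprintf("contents of file %d", i))
		tw.WriteHeader(&tar.Header{
			Name: fmt.Sprintf("file%d.txt", i),
			Mode: 0o644,
			Size: int64(len(body)),
		})
		tw.Write(body)
	}
	tw.Close()
	gz.Close()

	data := buf.Bytes()
	sr := &seekingReader{
		open: func() (io.Reader, error) { return bytes.NewReader(data), nil },
	}

	// 0 -> 3 reuses the cached stream (skips entries 1 and 2);
	// 3 -> 1 goes backwards, so the stream is restarted once.
	for _, i := range []int{0, 3, 1} {
		hdr, err := sr.entry(i)
		if err != nil {
			panic(err)
		}
		fmt.Println(hdr.Name)
	}
}
```

With caching like this, opening the 10th file and then the 100th reads the archive once up to entry 100, instead of once to entry 10 and then again from the start to entry 100.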