
bpo-43613: Faster implementation of gzip.compress and gzip.decompress #27941

Merged: 22 commits into python:main on Sep 2, 2021

Conversation

@rhpvorderman (Contributor) commented on Aug 25, 2021

This PR also includes the changes from #25011 for bpo-43612; they make more sense in the context of these changes.

Currently, gzip.compress and gzip.decompress are implemented on top of GzipFile. This adds a lot of overhead when simple in-memory compression or decompression is all that is needed. As the benchmarks below show, the overhead is considerable for data sizes below 4096 bytes, which are probably very common targets for in-memory compression and decompression.

This PR changes both implementations to compress and decompress entirely in memory.
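For illustration, here is a minimal sketch of the in-memory approach (assumed for this write-up, not the exact code merged in this PR; the helper names are invented): write the fixed 10-byte gzip header directly, deflate the whole buffer with zlib in one shot, and append the CRC32/size trailer, so no GzipFile or BytesIO objects are involved.

import gzip
import struct
import zlib

def gzip_compress_sketch(data, compresslevel=9):
    # Illustrative only: a hand-rolled gzip member, not CPython's implementation.
    # 10-byte gzip header: magic, deflate method, no flags, mtime=0, no extra flags, OS=unknown.
    header = b"\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\xff"
    # Raw deflate stream (negative wbits suppresses the zlib header/trailer).
    compressor = zlib.compressobj(compresslevel, zlib.DEFLATED, -zlib.MAX_WBITS)
    body = compressor.compress(data) + compressor.flush()
    # gzip trailer: CRC32 and uncompressed size as little-endian 32-bit fields.
    trailer = struct.pack("<LL", zlib.crc32(data), len(data) & 0xFFFFFFFF)
    return header + body + trailer

def gzip_decompress_sketch(data):
    # wbits=31 makes zlib expect and validate a gzip wrapper; unlike gzip.decompress,
    # this only handles a single gzip member.
    return zlib.decompress(data, wbits=31)

payload = b"some example payload" * 100
assert gzip_decompress_sketch(gzip_compress_sketch(payload)) == payload
assert gzip.decompress(gzip_compress_sketch(payload)) == payload  # interoperates with gzip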

I compiled Python before and after this change with --enable-optimizations to ensure a fair comparison, and used the following script to benchmark:

import gzip
import pathlib
import statistics
import sys
import timeit

DATA = pathlib.Path(sys.argv[1]).read_bytes()

SIZES = [0, 128, 512, 1024, 4096, 8192, 16384]

def benchmark(bench_string, number=1000, repetitions=10):
    for size in SIZES:
        data = DATA[:size]
        compressed_data = gzip.compress(data)
        timeit_kwargs=dict(globals=dict(**locals(), **globals()),
                           number=number)
        results = [timeit.timeit(bench_string, **timeit_kwargs) for _ in range(repetitions)]
        average = statistics.mean(results)
        print(f"Data size {size}: {round(average * (1_000_000 / number),2)} microseconds average")

if __name__ == "__main__":
    print("gzip compression")
    benchmark("gzip.compress(compressed_data)")
    print()
    print("gzip decompression")
    benchmark("gzip.decompress(compressed_data)")

Before:

gzip compression
Data size 0: 7.92 microseconds average
Data size 128: 12.1 microseconds average
Data size 512: 18.45 microseconds average
Data size 1024: 22.41 microseconds average
Data size 4096: 32.51 microseconds average
Data size 8192: 41.03 microseconds average
Data size 16384: 57.99 microseconds average

gzip decompression
Data size 0: 8.99 microseconds average
Data size 128: 10.26 microseconds average
Data size 512: 12.62 microseconds average
Data size 1024: 13.55 microseconds average
Data size 4096: 21.12 microseconds average
Data size 8192: 30.59 microseconds average
Data size 16384: 61.24 microseconds average

After:

gzip compression
Data size 0: 3.68 microseconds average
Data size 128: 7.64 microseconds average
Data size 512: 14.06 microseconds average
Data size 1024: 17.42 microseconds average
Data size 4096: 27.25 microseconds average
Data size 8192: 37.09 microseconds average
Data size 16384: 53.48 microseconds average

gzip decompression
Data size 0: 1.98 microseconds average
Data size 128: 3.74 microseconds average
Data size 512: 5.36 microseconds average
Data size 1024: 6.72 microseconds average
Data size 4096: 14.1 microseconds average
Data size 8192: 23.57 microseconds average
Data size 16384: 52.72 microseconds average
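
In relative terms, the fixed per-call overhead drops sharply: at 0 bytes, compression goes from 7.92 to 3.68 microseconds (about 2.2x) and decompression from 8.99 to 1.98 microseconds (about 4.5x), while at 16384 bytes the gains shrink to roughly 8% and 14% respectively, as the actual (de)compression work starts to dominate.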

https://bugs.python.org/issue43613

rhpvorderman and others added 3 commits August 30, 2021 14:23
Co-authored-by: Łukasz Langa <lukasz@langa.pl>
@rhpvorderman (Contributor, Author)

Thanks for your review and suggestions @ambv! I updated the code.

@ambv merged commit ea23e78 into python:main on Sep 2, 2021