Improve performance for Python 3.x #751
Merged
This PR attempts to improve performance under Python 3.x while maintaining or slightly improving 2.x performance.
Some of the reasons Python 3.x is slower than 2.x:
These redis commands are represented by the connection.Token object. To speed things up, I've implemented a cache of Token objects and modified the Token object to store its encoded value. This greatly reduces the number of times redis commands need to be encoded. It also reduces the number of times Token objects are created, although that has a smaller impact.
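A minimal sketch of the caching approach described above (the class and method names here are illustrative, not necessarily the exact redis-py implementation): each Token stores its encoded bytes at construction time, and a class-level cache returns the same Token instance for a given command name, so encoding happens at most once per command.

```python
class Token:
    """Represents a fixed string sent to the server, e.g. a Redis command name."""

    _cache = {}  # maps value -> Token, so each command name is encoded only once

    def __init__(self, value):
        if isinstance(value, Token):
            value = value.value
        self.value = value
        # Store the encoded bytes up front so repeated uses of the same
        # command avoid re-encoding under Python 3.
        self.encoded_value = value.encode('utf-8')

    @classmethod
    def get_token(cls, value):
        """Return a cached Token for value, creating and caching it on first use."""
        try:
            return cls._cache[value]
        except KeyError:
            token = cls(value)
            cls._cache[value] = token
            return token

    def __repr__(self):
        return self.value
```

The command-packing code can then use `token.encoded_value` directly instead of calling `encode()` on every request.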
I also wrote a basic benchmark (basic_operations.py) modeled on redis-benchmark (although much more limited). It tests the performance of some of the basic commands.
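A rough sketch of what such a benchmark loop might look like (hedged: the function names here are assumptions for illustration, and the commented-out portion requires the redis package and a running Redis server):

```python
import time

def benchmark(func, n=10000):
    """Run func() n times and return the throughput in requests per second."""
    start = time.perf_counter()
    for _ in range(n):
        func()
    elapsed = time.perf_counter() - start
    return n / elapsed

# Example usage against a live server (requires `pip install redis` and
# a Redis instance on localhost):
# import redis
# r = redis.StrictRedis()
# print('SET: %.0f req/s' % benchmark(lambda: r.set('key', 'value')))
# print('GET: %.0f req/s' % benchmark(lambda: r.get('key')))
```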
Testing on my machine has shown a 10-20% improvement for Python 3.5. The change should have the biggest impact on workloads with small, frequent requests to redis.