Memory growth in 0.12.1 #239
Thank you for a thorough bug report. I'll try to find time to investigate this soon.
1st1 added a commit that referenced this issue on Mar 20, 2019:
The crux of the problem is that TimerHandle did not clean up a strong reference from the event loop to `self`. This typically isn't a problem unless there's another strong reference to the loop from the callback or from its arguments (such as a Future). A few new unit tests should ensure this kind of bug won't happen again. Fixes: #239.
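The reference chain described in that commit message can be sketched with a toy model. All class names below are illustrative stand-ins, not uvloop's actual implementation; the point is that the loop holds a strong reference to the handle, and the handle's arguments (a Future) reference the loop back:

```python
import gc
import weakref

class TimerHandle:
    """Toy model of a timer handle (illustrative, not uvloop's code)."""
    def __init__(self, loop, callback, args):
        self.loop = loop
        self.callback = callback
        self.args = args

    def run(self):
        self.callback(*self.args)
        # The pre-0.12.2 bug amounted to skipping this cleanup, leaving a
        # permanent loop -> handle -> args strong-reference chain alive.
        self.loop._timers.discard(self)

class Loop:
    """Toy event loop holding strong references to its timer handles."""
    def __init__(self):
        self._timers = set()

    def call_later(self, callback, *args):
        handle = TimerHandle(self, callback, args)
        self._timers.add(handle)
        return handle

class Future:
    """Stand-in for asyncio.Future: it references its loop."""
    def __init__(self, loop):
        self.loop = loop

loop = Loop()
fut = Future(loop)
probe = weakref.ref(fut)
loop.call_later(lambda f: None, fut)

# Fire all pending timers; run() removes each handle from the loop,
# breaking the loop -> handle -> Future chain.
while loop._timers:
    loop._timers.pop().run()

del fut
gc.collect()
print(probe() is None)  # True: the Future was released after the timer fired
```

With the `discard` line removed, `probe()` would still return the Future after collection, which is exactly the one-object-per-tick growth reported below.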
Thanks to your excellent analysis, it was fairly trivial to find the actual bug. I've created a PR with a fix. Once it's green I'll work on releasing 0.12.2 asap. Thank you.
Fixed in uvloop 0.12.2.
PYTHONASYNCIODEBUG in env?: yes

Long story short
Our services show a small increase in memory usage over time and at some point get OOM killed.
The service is deployed with gunicorn and uvloop using the worker [GunicornUVLoopWebWorker](https://github.com/aio-libs/aiohttp/blob/master/aiohttp/worker.py#L211).
An investigation into the memory usage with tracemalloc shows the increase in aiohttp/worker.py:124. Using objgraph to show the most common types every 60 seconds, we see the instance counts of Context, Future, TimerHandle and UVTimer each grow by about one per second. We ran another test with GunicornWebWorker using the asyncio loop and could not reproduce the increased memory usage.
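The objgraph technique used above (periodically counting live instances per type and watching which counts grow) can be approximated with the stdlib alone. This is a sketch, not objgraph's implementation; the `Suspect` class is a hypothetical leaking type used to demonstrate the idea:

```python
import gc
from collections import Counter

def type_census():
    """Count live, GC-tracked objects by type name -- a stdlib
    approximation of objgraph's most-common-types view."""
    return Counter(type(o).__name__ for o in gc.get_objects())

class Suspect:  # hypothetical leaking type
    pass

before = type_census()["Suspect"]
keep = [Suspect() for _ in range(500)]  # simulate a leak: 500 retained objects
after = type_census()["Suspect"]

# Diffing two censuses taken over time reveals which types are growing,
# just as the per-second Context/Future/TimerHandle growth was spotted.
print(after - before)  # 500
```

Taking a census every 60 seconds and diffing consecutive snapshots is essentially what the reporter's objgraph dumps do.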
Reading the code, this feels like an issue with uvloop's call_later or create_future (big guess).

Expected behaviour
Memory usage stays constant regardless of the loop used.
Actual behaviour
uvloop shows a steady increase in memory usage compared to the asyncio loop.
Steps to reproduce
Create a file uvleak.py, then run:

```shell
pip install aiohttp uvloop objgraph gunicorn
export PYTHONTRACEMALLOC=10
gunicorn uvleak:create_app --max-requests 1000 --bind 0.0.0.0:8080 --workers 1 --worker-class aiohttp.worker.GunicornUVLoopWebWorker
```
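The uvleak.py script itself is not shown above. As a minimal stand-in, the suspected pattern (a timer scheduled via call_later whose callback holds a Future, which in turn references the loop) can be exercised directly; this is a hypothetical repro sketch using the stock asyncio loop, not the reporter's actual script:

```python
import asyncio

async def main():
    loop = asyncio.get_running_loop()
    # Each iteration schedules a timer whose callback (fut.cancel) holds
    # a Future referencing the loop -- the reference shape that leaked
    # one Context/Future/TimerHandle/UVTimer per tick on uvloop 0.12.1.
    for _ in range(100):
        fut = loop.create_future()
        loop.call_later(0, fut.cancel)
        await asyncio.sleep(0)  # yield so timers can fire
    return "done"

# To try this under uvloop, one would call uvloop.install() here
# (assuming uvloop is installed) before asyncio.run().
result = asyncio.run(main())
print(result)
```

On a fixed uvloop (0.12.2+) or on stock asyncio, the handles and futures are released after the timers fire; on 0.12.1 they accumulate.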
Every 60 seconds objgraph will dump a file ending in .objgraph to snapshots/{pid}/ that shows the memory growth. Additional tracemalloc snapshots can be found in the directory too.
Your environment
Ubuntu Linux 18.10
python 3.7.1
aiohttp 3.5.4
uvloop 0.12.1
gunicorn 19.9.0