Memory leak when doing get() and scope closes #4833
Comments
I'm having this exact same issue.
Any update on this? Would love to see this issue addressed!
Experiencing the same issue.
On Python 3.7.8 it works fine, but with Python 3.8+ I get the memory leak. I hope someone solves this problem.
Just did a check on my machine on 2a9d5e9 (master):
$ python --version
Python 3.8.5
$ uname -a
Linux 5.4.0-52-generic #57-Ubuntu SMP Thu Oct 15 10:57:00 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
$ lsb_release -a
Distributor ID: Ubuntu
Description: Ubuntu 20.04.1 LTS
Release: 20.04
Codename: focal
Same PC on aiohttp 3.7.2
$ python --version
Python 3.8.5
$ python -m pip show aiohttp
Name: aiohttp
Version: 3.7.2
Summary: Async http client/server framework (asyncio)
Home-page: https://github.com/aio-libs/aiohttp
Author: Nikolay Kim
Author-email: [email protected]
License: Apache 2
Location: /tmp/mem_test/venv/lib/python3.8/site-packages
Requires: multidict, yarl, attrs, typing-extensions, chardet, async-timeout
Required-by:
$ python -m pip show multidict
Name: multidict
Version: 5.0.0
Summary: multidict implementation
Home-page: https://github.com/aio-libs/multidict
Author: Andrew Svetlov
Author-email: [email protected]
License: Apache 2
Location: /tmp/mem_test/venv/lib/python3.8/site-packages
Requires:
Required-by: yarl, aiohttp
$ python -m pip show yarl
Name: yarl
Version: 1.6.2
Summary: Yet another URL library
Home-page: https://github.com/aio-libs/yarl/
Author: Andrew Svetlov
Author-email: [email protected]
License: Apache 2
Location: /tmp/mem_test/venv/lib/python3.8/site-packages
Requires: idna, multidict
Required-by: aiohttp
aiohttp doesn't use weak references to break cycles between moving parts. Does it help?
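A minimal way to check whether the retained memory is just uncollected reference cycles rather than a genuine leak is to force a collection after the session is closed and see whether usage drops. This is only an illustrative sketch, not something aiohttp does itself:

```python
import gc

# Sketch: force a full collection once the ClientSession has been closed.
# If the retained memory is only reference cycles awaiting collection,
# this should release it; a genuine leak will survive the call.
unreachable = gc.collect()
print("unreachable objects found:", unreachable)
print("objects still tracked by the GC:", len(gc.get_objects()))
```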
@asvetlov in my environment it is not needed. I will check this leak later in a CentOS docker container.
OK. BTW, for checking on different Python versions I use https://github.com/pyenv/pyenv
Works like a charm.
For CentOS I've got the leak :-(
Dockerfile:
FROM centos
RUN yum install -y python3
RUN yum groupinstall -y 'Development Tools'
RUN yum install -y python3-devel
RUN pip3 install memory_profiler
RUN yum install -y wget
RUN wget https://www.hq.nasa.gov/alsj/a17/A17_FlightPlan.pdf
RUN pip3 install aiohttp==3.7.2
ADD 4833_test.py /

$ python3 --version
Python 3.6.8
Oooh. Thanks.
Having a memory leak in one of my aiohttp apps, I've been following this closely. If memory_profiler is to be trusted, this still happens with aiohttp 3.7.x and cpython 3.9.4. I've tried the following with no change to the apparent leak reported by memory_profiler:
There are dependencies between a
I experience the same problem with aiohttp 3.8.1 and Python 3.7.1.
Here's some bells and whistles to make spotting the issue easier:
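The commenter's snippet did not survive the page capture; as a stand-in, here is a minimal helper of the kind that makes the growth easy to see. The function name and the use of the stdlib `resource` module are assumptions, not the original code:

```python
import resource


def report_rss(label: str) -> None:
    # Hypothetical helper: print the process's peak resident set size so
    # growth across iterations is easy to spot. ru_maxrss is KiB on Linux.
    peak_kib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"{label}: peak RSS {peak_kib / 1024:.1f} MiB")
```

Calling `report_rss()` before and after the download scope closes shows the retained memory without running a full memory_profiler pass.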
🐞 Describe the bug
When downloading content from a website, memory usage increases roughly proportionally to the size of the content downloaded, with some overhead for the library and other objects in memory.
Once the scope where that content was downloaded is closed and inaccessible, I expect the garbage collector to deallocate the downloaded objects; instead, memory usage remains near the peak level reached right after the downloads finish.
💡 To Reproduce
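The exact reproduction script is in the scripts.zip attached below; as a rough sketch of the pattern it exercises (the function names, iteration count, and test URL are placeholders borrowed from elsewhere in this thread, not the attachment itself):

```python
# Hypothetical reproduction sketch; the attached script is the authoritative version.
import asyncio

import aiohttp
from memory_profiler import profile

# Large file, same one used in the CentOS Dockerfile above (placeholder choice).
URL = "https://www.hq.nasa.gov/alsj/a17/A17_FlightPlan.pdf"


async def download(session: aiohttp.ClientSession) -> bytes:
    async with session.get(URL) as resp:
        return await resp.read()


async def quicktest() -> None:
    # All downloaded content lives only inside this coroutine's scope.
    async with aiohttp.ClientSession() as session:
        payloads = [await download(session) for _ in range(5)]
        print(sum(len(p) for p in payloads), "bytes downloaded")


@profile
def main() -> None:
    asyncio.run(quicktest())
    # quicktest() and its locals are now out of scope, so the expectation is
    # that memory usage drops back close to start-up levels here.


if __name__ == "__main__":
    main()
```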
💡 Expected behavior
Once the downloaded objects are no longer in scope, they should be de-allocated by the garbage collector and memory usage should fall back to a similar level as when the script initialized.
📋 Logs/tracebacks
I have no specific logs/tracebacks (no exception occurred), but I have profiled the relevant functions in two scripts with memory_profiler to demonstrate memory usage.
The first script is created with aiohttp, demonstrating the bug:
aiohttp_profile.txt
The bug is observed here: on line 38 I expect memory usage to reach its peak, and on the next line it should drop back to initialization levels. However, on line 40 memory usage is still at the same level, even though the scope where the memory was allocated is gone and inaccessible.
The second script uses requests in a nearly identical way and demonstrates the behavior I expect from aiohttp:
requests_profile.txt
Here the profile shown by memory_profiler is considerably different: memory usage drops as the function quicktest() on line 33 ends, so the scope of main() never sees the same peak.
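For comparison, a rough sketch of that requests-based counterpart (again with placeholder names and URL; requests_profile.txt corresponds to the real attached script):

```python
# Hypothetical sketch of the requests-based counterpart, not the exact attachment.
import requests
from memory_profiler import profile

URL = "https://www.hq.nasa.gov/alsj/a17/A17_FlightPlan.pdf"  # placeholder test file


def quicktest() -> None:
    # All downloaded content lives only inside this function's scope.
    with requests.Session() as session:
        payloads = [session.get(URL).content for _ in range(5)]
        print(sum(len(p) for p in payloads), "bytes downloaded")


@profile
def main() -> None:
    quicktest()
    # Here memory_profiler shows usage dropping once quicktest() returns,
    # which is the behaviour expected from the aiohttp script as well.


if __name__ == "__main__":
    main()
```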
📋 Your version of the Python
📋 Your version of the aiohttp/yarl/multidict distributions
📋 Additional context
The relevant scripts and my Pipfile* are here:
scripts.zip
And my Pipfile:
I have not embedded my Pipfile.lock due to its length, but it is included in the scripts.zip above and shows the same versions for my dependencies as listed before. All of my dependencies are listed above.
I created and tested the scripts above on my personal machine running Archlinux, but originally observed the bug that led me to create these scripts on a machine running CentOS.
Linux HOSTNAME 3.10.0-862.9.1.el7.x86_64 #1 SMP Mon Jul 16 16:29:36 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux