Disclaimer: I'm new to Python and asyncio, so this may just be my own misuse.
I've written some code to integrate with the auto-discovery feature of AWS ElastiCache. Part of this is connecting to a memcached cluster address every 60 seconds (it is important to re-connect each time so we resolve the DNS and ensure we reach a healthy cluster member). Everything is working fine, but it seems this process of frequently connecting / disconnecting is leaking dicts.
Here is a minimal reproducer using pympler to demonstrate the leak:
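(Sketch of the reproducer only, assuming `aiomcache` as the client library and pympler's `SummaryTracker` for the measurement; the host/port are placeholders and the real code borrows tornado's asyncio loop rather than `asyncio.get_event_loop()`.)

```python
import asyncio

import aiomcache
from pympler import tracker


async def churn(cycles=100):
    tr = tracker.SummaryTracker()
    for _ in range(cycles):
        # Re-connect on every cycle, as the auto-discovery refresh does.
        client = aiomcache.Client("127.0.0.1", 11211)
        await client.set(b"key", b"value")
        await client.get(b"key")
        await client.close()
    # Show objects (notably dicts) that accumulated across the connect/close cycles.
    tr.print_diff()


loop = asyncio.get_event_loop()
loop.run_until_complete(churn())
# The leaked dicts only appear to be released once loop.close() is called.
```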
It looks like these dicts will hang around forever until `loop.close()` is called. I'm confused by this: I don't think I ever want to close the loop, which I borrowed from tornado via `tornado.ioloop.IOLoop.current().asyncio_loop`. Is there any other way to properly close / clean up these connections without closing the loop?