
[BUG]: Telemetry dependency collection massively consumes CPU at 1 minute mark impacting our production services #12436

Open
kravnaut opened this issue Feb 20, 2025 · 3 comments

Tracer Version(s)

2.19.3

Python Version(s)

3.12.8

Pip Version(s)

24.3.1

Bug Report

The dependency collection feature, recently added (or at least new in the version we just upgraded to) and gated by DD_TELEMETRY_DEPENDENCY_COLLECTION_ENABLED, appears to massively consume CPU at the 1 minute mark, impacting our production services.

We profiled the application, and upon learning that this feature was responsible, we saw in your code that it is gated by the env var DD_TELEMETRY_DEPENDENCY_COLLECTION_ENABLED, which we have since set to disable the feature. This resolved the problem for us.
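For reference, a minimal sketch of how the feature can be disabled from code (in practice the variable is usually set in the container environment before the process starts; the accepted values, e.g. `false`, should be confirmed against the ddtrace documentation):

```python
# Disable dependency collection telemetry. This must happen before
# ddtrace initializes, so in practice the container environment is
# the safer place to set it; shown in code here for illustration only.
import os

os.environ["DD_TELEMETRY_DEPENDENCY_COLLECTION_ENABLED"] = "false"
```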

Reproduction Code

CPU spikes to 100% and stays there for 5+ seconds, impacting ongoing request handling in the container.

Error Logs

No response

Libraries in Use

No response

Operating System

No response

@kravnaut kravnaut added the bug label Feb 20, 2025
kravnaut commented Feb 20, 2025

Here is an austin profiler snapshot of the process running this code in ddtrace's thread, with the -s setting on to measure only non-idle frames.

The heaviest path:

get_module_distribution_versions (/usr/local/lib/python3.12/site-packages/ddtrace/internal/packages.py:77)
> update_imported_dependencies (/usr/local/lib/python3.12/site-packages/ddtrace/internal/telemetry/data.py:76)
> TelemetryWriter._app_dependencies_loaded_event (/usr/local/lib/python3.12/site-packages/ddtrace/internal/telemetry/writer.py:419)
> TelemetryWriter.periodic (/usr/local/lib/python3.12/site-packages/ddtrace/internal/telemetry/writer.py:627)

(Screenshot: austin profiler snapshot of the hot path above)

P403n1x87 (Contributor) commented Feb 20, 2025

@kravnaut thanks for reporting this. We have merged a fix for this recently (#12327) and it should become available with the next patch releases.


joshfermin commented Feb 21, 2025

@P403n1x87 Was this part of the 2.21.2 release? We are still seeing significant CPU performance issues on that version as well.
