Bug Description
Hi LiveKit team, Langfuse maintainer here. Many Langfuse users are logging OpenTelemetry spans generated by the livekit-agents library to Langfuse.
Currently, the input of the LLM call (`lk.chat_ctx` and `lk.function_tools`) is not part of the `llm_request` OpenTelemetry span exposed by the livekit-agents library; this information is currently only attached to the `llm_node` span.
Below is the current raw `llm_request` OpenTelemetry span, which does not include `lk.chat_ctx` and `lk.function_tools`:

```json
{
  "name": "llm_request",
  "context": {
    "trace_id": "34f1e50ec02e200bcc858e3c62c49ef7",
    "span_id": "d80cd37e1dac89e4",
    "trace_state": "[]"
  },
  "kind": "SpanKind.INTERNAL",
  "parent_id": "d3de5900e354aff5",
  "start_time": "2025-11-28T09:31:43.184986Z",
  "end_time": "2025-11-28T09:31:43.863360Z",
  "status": {
    "status_code": "UNSET"
  },
  "attributes": {
    "gen_ai.request.model": "openai/gpt-4o-mini",
    "lk.llm_metrics": "{\"type\":\"llm_metrics\",\"label\":\"livekit.agents.inference.llm.LLM\",\"request_id\":\"chatcmpl-Cgp7HQ5t1FPEBIduluS9hFmiiz7O3\",\"timestamp\":1764322303.861996,\"duration\":0.6770540000288747,\…",
    "gen_ai.usage.input_tokens": 72,
    "gen_ai.usage.output_tokens": 10,
    "langfuse.observation.completion_start_time": "\"2025-11-28T09:31:43.772340+00:00\""
  },
  "events": [
    {
      "name": "gen_ai.choice",
      "timestamp": "2025-11-28T09:31:43.862140Z",
      "attributes": {
        "role": "assistant",
        "content": "Hi! How can I assist you today?"
      }
    }
  ],
  "links": [],
  "resource": {
    "attributes": {
      "telemetry.sdk.language": "python",
      "telemetry.sdk.name": "opentelemetry",
      "telemetry.sdk.version": "1.34.1",
      "service.name": "unknown_service"
    },
    "schema_url": ""
  },
  "instrumentationScope": {
    "name": "livekit-agents",
    "version": "",
    "schema_url": "",
    "attributes": null
  }
}
```
Expected Behavior
I would expect the `llm_request` OpenTelemetry span to also include the system message, all other chat messages, and the tool definitions (`lk.chat_ctx` and `lk.function_tools`) that the model receives as input.
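For illustration, the `attributes` block of the span might then look roughly like the sketch below. The attribute names (`lk.chat_ctx`, `lk.function_tools`) come from the existing `llm_node` span; the serialized values and their exact shape are hypothetical and would be up to the livekit-agents maintainers.

```json
"attributes": {
  "gen_ai.request.model": "openai/gpt-4o-mini",
  "lk.chat_ctx": "[{\"role\":\"system\",\"content\":\"You are a helpful assistant.\"},{\"role\":\"user\",\"content\":\"Hi\"}]",
  "lk.function_tools": "[\"lookup_weather\"]",
  "gen_ai.usage.input_tokens": 72,
  "gen_ai.usage.output_tokens": 10
}
```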
Reproduction Steps
1. Run the Langfuse & LiveKit example application: https://github.com/livekit/agents/blob/main/examples/voice_agents/langfuse_trace.py
2. Check the generated trace in Langfuse and observe that the `llm_request` span does not include any input information for the model call.
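For anyone reproducing this without the example script, the sketch below shows a generic way to point the standard OTLP/HTTP exporter at Langfuse via environment variables. The keys are placeholders; the endpoint path and Basic-auth scheme follow Langfuse's OpenTelemetry documentation, and some SDK versions expect the space in the header value to be URL-encoded as `%20`.

```python
# Sketch: configuring a generic OTLP exporter for Langfuse (placeholder keys).
import base64
import os

LANGFUSE_PUBLIC_KEY = "pk-lf-..."   # placeholder, not a real key
LANGFUSE_SECRET_KEY = "sk-lf-..."   # placeholder, not a real key
LANGFUSE_HOST = "https://cloud.langfuse.com"

# Langfuse authenticates OTLP requests with Basic auth over public:secret key.
auth = base64.b64encode(
    f"{LANGFUSE_PUBLIC_KEY}:{LANGFUSE_SECRET_KEY}".encode()
).decode()

# Standard environment variables read by opentelemetry-exporter-otlp.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = f"{LANGFUSE_HOST}/api/public/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Basic {auth}"
```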
Example trace in Langfuse: https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/8f3ebfb558c02a91a8ced187e81c5fcf?timestamp=2025-11-27T10%3A16%3A07.597Z&observation=0a84b5af424e870b
Operating System
macOS
Models Used
openai/gpt-4o-mini
Package Versions
livekit 1.0.19
livekit-agents 1.3.5
livekit-api 1.0.7
langfuse 3.10.1
opentelemetry-api 1.34.1
opentelemetry-exporter-otlp 1.34.1
opentelemetry-sdk 1.34.1
Session/Room/Call IDs
No response
Proposed Solution
Additional Context
Please let me know if there is any other information I can provide.
Screenshots and Recordings
No response