Describe the bug
My agent uses 4.1 models served on Azure Foundry. The default OpenAI tracing is somewhat broken: traces are created and timed correctly, as are their spans, but any span that corresponds to an LLM call is broken. Selecting it shows no content.

Debug information
- Agents SDK version: 0.1.0 (latest)
- Python version: reproduced across 3.10 to 3.13
- openai Python package: 1.90.0
Repro steps
# `settings`, `idparameter`, `agent`, and `conversation_items` are defined elsewhere in my project
import os

from openai import AsyncOpenAI
from agents import OpenAIResponsesModel, Runner, trace

# the endpoint looks like this: "https://XXX.openai.azure.com/"
client = AsyncOpenAI(
    api_key=settings.main_key,
    base_url=str(settings.endpoint) + "/openai/v1/",
    default_query={"api-version": "preview"},
)

model = OpenAIResponsesModel(
    model=settings.model_name,
    openai_client=client,
)

# a regular OpenAI key, used by the tracing exporter
os.environ["OPENAI_API_KEY"] = "sk-..."

# inside an async function
with trace(workflow_name="Testrun - " + idparameter):
    result = await Runner.run(agent, conversation_items)
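
To check whether the generation spans carry any data before they are uploaded, a local processor like the one below could dump each finished span. This is a minimal sketch, assuming the SDK exposes a TracingProcessor interface, an add_trace_processor() hook, and a Span.export() method; please correct me if those names are off:

from agents.tracing import TracingProcessor, add_trace_processor

class PrintSpansProcessor(TracingProcessor):
    """Print every finished span locally so its payload can be inspected."""

    def on_trace_start(self, trace):
        pass

    def on_trace_end(self, trace):
        pass

    def on_span_start(self, span):
        pass

    def on_span_end(self, span):
        # export() should yield the dict that the backend exporter would upload
        print(span.export())

    def shutdown(self):
        pass

    def force_flush(self):
        pass

add_trace_processor(PrintSpansProcessor())

Whatever prints here is what the SDK captured locally, before any upload happens, so it would show whether the LLM spans are empty at the source or only on the dashboard.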
During the run, each agent LLM call produces a log line like:
INFO:httpx:HTTP Request: POST https://api.openai.com/v1/traces/ingest "HTTP/1.1 204 No Content"
I assume this means that the trace upload calls are slightly off when using OpenAI Responses models served from Azure Foundry.
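
Note that the exporter posts to api.openai.com regardless of the Azure base_url on the model client, which is why a regular OpenAI key is set via OPENAI_API_KEY above. A sketch of the relevant tracing configuration, assuming the SDK's set_tracing_export_api_key() and set_tracing_disabled() helpers:

from agents import set_tracing_disabled, set_tracing_export_api_key

# Send traces with an explicit OpenAI key instead of relying on OPENAI_API_KEY
set_tracing_export_api_key("sk-...")

# Or switch tracing off entirely while the LLM spans stay empty
# set_tracing_disabled(True)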