Open
Labels
enhancement (New feature or request)
Description
Feature Type
Would make my life easier
Feature Description
Feature Request
Problem
Some Azure OpenAI deployments expose only the Responses API endpoint (/openai/responses) rather than the traditional Chat Completions API (/chat/completions). This is particularly true for newer models such as gpt-5.1-codex-mini.
LiveKit Agents currently supports only the Chat Completions API via LLM.with_azure(), so these Responses-API-only deployments cannot be used.
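For context, the two Azure endpoint shapes differ roughly as follows. This is a minimal sketch: the resource name, deployment name, and api-version are placeholders, and the exact Responses path may vary by api-version.

```python
# Placeholder values -- substitute your own Azure resource/deployment.
RESOURCE = "my-resource"
DEPLOYMENT = "gpt-5.1-codex-mini"
API_VERSION = "2025-01-01-preview"  # assumption, not a verified version

# Traditional Chat Completions endpoint (what LLM.with_azure() targets today):
chat_completions_url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)

# Responses API endpoint (the only one exposed on some deployments):
responses_url = (
    f"https://{RESOURCE}.openai.azure.com/openai/responses"
    f"?api-version={API_VERSION}"
)
```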
Proposed Solution
Add a new ResponsesLLM class that uses the OpenAI Responses API instead of Chat Completions. This would:
- Support Azure OpenAI deployments that only expose /openai/responses
- Maintain compatibility with existing ChatContext and function tool patterns
- Handle the different request/response formats internally
Use Case
Azure OpenAI users deploying newer models (e.g., gpt-5.1-codex-mini) that are only available via the Responses API endpoint.
References
Workarounds / Alternatives
No response
Additional Context
No response