Support for OpenAI Responses API (Azure deployments) #4121

@IanSteno

Description

Feature Type

Would make my life easier

Feature Description

Feature Request

Problem

Some Azure OpenAI deployments only expose the Responses API endpoint (/openai/responses) instead of the traditional Chat Completions API (/chat/completions). This is
particularly true for newer models like gpt-5.1-codex-mini.

Currently, LiveKit Agents supports only the Chat Completions API via LLM.with_azure(), so these Responses-API-only deployments cannot be used.
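For illustration, the two Azure endpoint shapes differ roughly as follows. The resource name, deployment name, and api-version values below are hypothetical placeholders; the URL patterns are a sketch based on Azure's documented conventions and should be checked against a real deployment:

```python
# Illustrative sketch of the two Azure OpenAI endpoint shapes.
# RESOURCE and DEPLOYMENT are hypothetical; api-version values vary.
RESOURCE = "https://my-resource.openai.azure.com"
DEPLOYMENT = "gpt-5.1-codex-mini"

# Traditional Chat Completions endpoint (scoped to a deployment):
chat_url = (
    f"{RESOURCE}/openai/deployments/{DEPLOYMENT}"
    f"/chat/completions?api-version=2024-10-21"
)

# Responses endpoint (resource-scoped; the model is passed in the request body):
responses_url = f"{RESOURCE}/openai/responses?api-version=2025-03-01-preview"

print(chat_url)
print(responses_url)
```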

Proposed Solution

Add a new ResponsesLLM class that uses the OpenAI Responses API instead of Chat Completions. This would:

  • Support Azure OpenAI deployments that only expose /openai/responses
  • Maintain compatibility with existing ChatContext and function tool patterns
  • Handle the different request/response formats internally
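To make the "different request/response formats" point concrete, here is a rough sketch of how the two request payloads differ and what an internal mapping might look like. Field names follow OpenAI's public API shapes (Chat Completions takes a `messages` array; the Responses API takes `input` plus top-level `instructions` and returns `output` items rather than `choices`); the `to_responses_request` helper is purely hypothetical, not an existing LiveKit function:

```python
# Sketch contrasting the two request payload shapes (assumed field names
# per OpenAI's public API; not LiveKit internals).

chat_completions_request = {
    "messages": [
        {"role": "system", "content": "You are a voice assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

def to_responses_request(chat_req: dict, model: str) -> dict:
    """Hypothetical helper: map a Chat Completions payload to a
    Responses-API payload. System messages become `instructions`;
    the remaining messages become `input` items; the model moves
    into the request body (Azure Responses is not deployment-scoped)."""
    system_parts = [
        m["content"] for m in chat_req["messages"] if m["role"] == "system"
    ]
    return {
        "model": model,
        "instructions": "\n".join(system_parts) or None,
        "input": [m for m in chat_req["messages"] if m["role"] != "system"],
    }
```

A ResponsesLLM class could apply a mapping like this (and the inverse for streamed output items) so that existing ChatContext and function-tool code keeps working unchanged.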

Use Case

Azure OpenAI users deploying newer models (e.g., gpt-5.1-codex-mini) that are only available via the Responses API endpoint.

Workarounds / Alternatives

No response

Additional Context

No response

Metadata

Assignees

No one assigned

    Labels

    enhancement (New feature or request)
