[Feature Request] Support for Streaming Responses in ChatAgent #2494

Open
@fengju0213

Description

Required prerequisites

Motivation

The current model module can pass the `stream` parameter through OpenAI's SDK, but the chat agent's `step()` method still waits for the model to finish its entire response before returning all the output at once. We want to achieve true streaming output in the chat agent.
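For reference, here is a minimal sketch of what true streaming could look like, assuming the OpenAI Python SDK (`openai>=1.0`). The `stream_step` generator and the simplified `ChatAgent` below are hypothetical illustrations of the requested behavior, not CAMEL's actual API:

```python
# Hypothetical sketch: a generator-based step that yields partial output as
# chunks arrive from the API, instead of blocking until the full completion
# is assembled. `ChatAgent` and `stream_step` here are illustrative only.
from typing import Iterator

from openai import OpenAI


class ChatAgent:
    def __init__(self, model: str = "gpt-4o-mini") -> None:
        self.client = OpenAI()  # reads OPENAI_API_KEY from the environment
        self.model = model

    def stream_step(self, user_message: str) -> Iterator[str]:
        """Yield response text incrementally as chunks arrive."""
        stream = self.client.chat.completions.create(
            model=self.model,
            messages=[{"role": "user", "content": user_message}],
            stream=True,  # ask the SDK for server-sent chunks
        )
        for chunk in stream:
            # Some chunks carry no text (e.g. role-only or tool-call deltas).
            if chunk.choices and chunk.choices[0].delta.content:
                yield chunk.choices[0].delta.content


if __name__ == "__main__":
    agent = ChatAgent()
    for token in agent.stream_step("Explain streaming in one sentence."):
        print(token, end="", flush=True)
    print()
```

A generator like this would let callers render tokens as they arrive, while a blocking `step()` could be kept alongside it for existing code paths.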

Solution

No response

Alternatives

No response

Additional context

No response

Metadata

Labels

P0 (Task with high level priority), enhancement (New feature or request)
