Fix/readme (#51)
* fix: adjust react agent & host multi agent imple, adjust readme for graph example
shentongmartin authored Jan 26, 2025
1 parent 291e855 commit 8f485a1
Showing 10 changed files with 201 additions and 193 deletions.
Binary file modified .github/static/img/eino/react.png
Binary file removed .github/static/img/eino/simple_graph.png
Binary file not shown.
Binary file removed .github/static/img/eino/simple_workflow.png
Binary file not shown.
Binary file added .github/static/img/eino/tool_call_graph.png
32 changes: 21 additions & 11 deletions README.md
@@ -61,19 +61,29 @@ chain, _ := NewChain[map[string]any, *Message]().
chain.Invoke(ctx, map[string]any{"query": "what's your name?"})
```

Now let's create a graph that uses a ChatModel to generate tool calls, then uses a ToolsNode to execute those tools, then feeds the tool responses back to the ChatModel.
Now let's create a graph that uses a ChatModel to generate an answer or tool calls, then uses a ToolsNode to execute those tools when needed.

![](.github/static/img/eino/simple_graph.png)
![](.github/static/img/eino/tool_call_graph.png)

```Go
graph := NewGraph[[]*Message, *Message]()
graph.AddChatModelNode("node_model", model)
graph.AddToolsNode("node_tools", toolsNode)
graph.AddEdge(START, "node_model")
graph.AddEdge("node_tools", "node_model")
graph.AddBranch("node_model", branch)
runnable, _ := graph.Compile(ctx)
runnable.Stream(ctx, []*Message{UserMessage("help me plan my weekend")})
graph := NewGraph[map[string]any, *schema.Message]()

_ = graph.AddChatTemplateNode("node_template", chatTpl)
_ = graph.AddChatModelNode("node_model", chatModel)
_ = graph.AddToolsNode("node_tools", toolsNode)
_ = graph.AddLambdaNode("node_converter", takeOne)

_ = graph.AddEdge(START, "node_template")
_ = graph.AddEdge("node_template", "node_model")
_ = graph.AddBranch("node_model", branch)
_ = graph.AddEdge("node_tools", "node_converter")
_ = graph.AddEdge("node_converter", END)

compiledGraph, err := graph.Compile(ctx)
if err != nil {
return err
}
out, err := compiledGraph.Invoke(ctx, map[string]any{"query": "Beijing's weather this weekend"})
```
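The `branch` used above is defined elsewhere; in Eino it would be built with `compose.NewGraphBranch`. As a hedged sketch of the decision it encodes — route to the ToolsNode when the model output contains tool calls, otherwise end the graph — here is a stand-alone version with a hypothetical `Message` type (not the actual Eino `schema.Message` API):

```go
package main

import "fmt"

// Message is a stand-in for schema.Message; only the field the
// branch inspects is modeled here.
type Message struct {
	ToolCalls []string
}

const (
	nodeTools = "node_tools"
	end       = "END"
)

// selectNext mirrors the branch condition: if the model emitted
// tool calls, route to the ToolsNode, otherwise finish the graph.
func selectNext(m *Message) string {
	if len(m.ToolCalls) > 0 {
		return nodeTools
	}
	return end
}

func main() {
	fmt.Println(selectNext(&Message{ToolCalls: []string{"get_weather"}})) // routes to node_tools
	fmt.Println(selectNext(&Message{}))                                   // routes to END
}
```

The real branch receives the model node's output and returns the name of the next node to run; the stand-in above only models that condition.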

Now let's create a 'ReAct' agent: a ChatModel bound to Tools. It receives input Messages and independently decides whether to call a Tool or to output the final result. The Tool's execution result becomes an input Message for the ChatModel again, serving as context for the next round of independent judgment.
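The loop just described can be sketched as plain control flow. The `chatModel` and `runTool` stubs below are hypothetical stand-ins, not Eino's ReAct implementation:

```go
package main

import "fmt"

type Message struct {
	Role     string
	Content  string
	ToolCall string // non-empty when the model wants a tool invoked
}

// chatModel is a stub: it asks for a tool on the first turn and
// produces a final answer once a tool result is in the context.
func chatModel(history []*Message) *Message {
	for _, m := range history {
		if m.Role == "tool" {
			return &Message{Role: "assistant", Content: "final answer using " + m.Content}
		}
	}
	return &Message{Role: "assistant", ToolCall: "search"}
}

// runTool is a stub tool executor.
func runTool(name string) *Message {
	return &Message{Role: "tool", Content: name + " result"}
}

// react runs the loop: model output either ends the run or becomes
// a tool invocation whose result is appended back into the context.
func react(input *Message) string {
	history := []*Message{input}
	for {
		out := chatModel(history)
		if out.ToolCall == "" {
			return out.Content
		}
		history = append(history, out, runTool(out.ToolCall))
	}
}

func main() {
	fmt.Println(react(&Message{Role: "user", Content: "plan my weekend"}))
	// prints "final answer using search result"
}
```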
@@ -174,7 +184,7 @@ The Eino framework consists of several parts:

- [EinoExt](https://github.com/cloudwego/eino-ext): Component implementations, callback handlers implementations, component usage examples, and various tools such as evaluators, prompt optimizers.

- [Eino Devops](https://github.com/cloudwego/eino-ext/devops): visualized developing, visualized debugging
- [Eino Devops](https://github.com/cloudwego/eino-ext/tree/main/devops): visualized developing, visualized debugging
etc.

- [EinoExamples](https://github.com/cloudwego/eino-examples) is the repo containing example applications and best practices for Eino.
32 changes: 21 additions & 11 deletions README.zh_CN.md
@@ -62,19 +62,29 @@ chain, _ := NewChain[map[string]any, *Message]().
chain.Invoke(ctx, map[string]any{"query": "what's your name?"})
```

Now let's create a graph: first a ChatModel generates tool-call instructions, then a ToolsNode executes those tools, and the tool responses are fed back to the ChatModel.
Now let's create a graph: a ChatModel first generates a reply or tool-call instructions; if tool-call instructions are generated, a ToolsNode executes those tools.

![](.github/static/img/eino/simple_graph.png)
![](.github/static/img/eino/tool_call_graph.png)

```Go
graph := NewGraph[[]*Message, *Message]()
graph.AddChatModelNode("node_model", model)
graph.AddToolsNode("node_tools", toolsNode)
graph.AddEdge(START, "node_model")
graph.AddEdge("node_tools", "node_model")
graph.AddBranch("node_model", branch)
runnable, _ := graph.Compile(ctx)
runnable.Stream(ctx, []*Message{UserMessage("help me plan my weekend")})
graph := NewGraph[map[string]any, *schema.Message]()

_ = graph.AddChatTemplateNode("node_template", chatTpl)
_ = graph.AddChatModelNode("node_model", chatModel)
_ = graph.AddToolsNode("node_tools", toolsNode)
_ = graph.AddLambdaNode("node_converter", takeOne)

_ = graph.AddEdge(START, "node_template")
_ = graph.AddEdge("node_template", "node_model")
_ = graph.AddBranch("node_model", branch)
_ = graph.AddEdge("node_tools", "node_converter")
_ = graph.AddEdge("node_converter", END)

compiledGraph, err := graph.Compile(ctx)
if err != nil {
return err
}
out, err := compiledGraph.Invoke(ctx, map[string]any{"query": "Beijing's weather this weekend"})
```

Now let's create a 'ReAct' agent: a ChatModel bound to some Tools. It receives input messages and decides on its own whether to call a Tool or to output the final result. The Tool's execution result becomes an input message for the ChatModel again, serving as context for the next round of autonomous judgment.
@@ -172,7 +182,7 @@ compiledGraph.Invoke(ctx, input, WithCallbacks(handler).DesignateNode("node_1"))
The Eino framework consists of several parts:
- Eino (this repo): type definitions, stream processing mechanisms, component abstractions, orchestration capabilities, aspect mechanisms, etc.
- [EinoExt](https://github.com/cloudwego/eino-ext): component implementations, callback handler implementations, component usage examples, and various tools such as evaluators and prompt optimizers.
- [Eino Devops](https://github.com/cloudwego/eino-ext/devops): visualized developing, visualized debugging, etc.
- [Eino Devops](https://github.com/cloudwego/eino-ext/tree/main/devops): visualized developing, visualized debugging, etc.
- [EinoExamples](https://github.com/cloudwego/eino-examples): the repo containing example applications and best practices for Eino.

## Detailed Documentation
42 changes: 31 additions & 11 deletions flow/agent/multiagent/host/compose.go
@@ -20,6 +20,7 @@ import (
"context"
"fmt"

"github.com/cloudwego/eino/components/model"
"github.com/cloudwego/eino/compose"
"github.com/cloudwego/eino/schema"
)
@@ -39,10 +40,28 @@ type state struct {
// the default StreamToolCallChecker may not work properly since it only checks the first chunk for tool calls.
// In such cases, you need to implement a custom StreamToolCallChecker that can properly detect tool calls.
func NewMultiAgent(ctx context.Context, config *MultiAgentConfig) (*MultiAgent, error) {
if err := config.validateAndFillDefault(); err != nil {
if err := config.validate(); err != nil {
return nil, err
}

var (
hostPrompt = config.Host.SystemPrompt
name = config.Name
toolCallChecker = config.StreamToolCallChecker
)

if len(hostPrompt) == 0 {
hostPrompt = defaultHostPrompt
}

if len(name) == 0 {
name = "host multi agent"
}

if toolCallChecker == nil {
toolCallChecker = firstChunkStreamToolCallChecker
}

g := compose.NewGraph[[]*schema.Message, *schema.Message](
compose.WithGenLocalState(func(context.Context) *state { return &state{} }))

@@ -69,7 +88,7 @@ func NewMultiAgent(ctx context.Context, config *MultiAgentConfig) (*MultiAgent,
agentMap[specialist.Name] = true
}

if err := addHostAgent(config, agentTools, g); err != nil {
if err := addHostAgent(config.Host.ChatModel, hostPrompt, agentTools, g); err != nil {
return nil, err
}

@@ -78,15 +97,15 @@ func NewMultiAgent(ctx context.Context, config *MultiAgentConfig) (*MultiAgent,
return nil, err
}

if err := addDirectAnswerBranch(convertorName, g, config); err != nil {
if err := addDirectAnswerBranch(convertorName, g, toolCallChecker); err != nil {
return nil, err
}

if err := addSpecialistsBranch(convertorName, agentMap, g); err != nil {
return nil, err
}

r, err := g.Compile(ctx, compose.WithNodeTriggerMode(compose.AnyPredecessor), compose.WithGraphName(config.Name))
r, err := g.Compile(ctx, compose.WithNodeTriggerMode(compose.AnyPredecessor), compose.WithGraphName(name))
if err != nil {
return nil, err
}
@@ -127,32 +146,33 @@ func addSpecialistAgent(specialist *Specialist, g *compose.Graph[[]*schema.Messa
return g.AddEdge(specialist.Name, compose.END)
}

func addHostAgent(config *MultiAgentConfig, agentTools []*schema.ToolInfo, g *compose.Graph[[]*schema.Message, *schema.Message]) error {
if err := config.Host.ChatModel.BindTools(agentTools); err != nil {
func addHostAgent(model model.ChatModel, prompt string, agentTools []*schema.ToolInfo, g *compose.Graph[[]*schema.Message, *schema.Message]) error {
if err := model.BindTools(agentTools); err != nil {
return err
}

preHandler := func(_ context.Context, input []*schema.Message, state *state) ([]*schema.Message, error) {
state.msgs = input
if len(config.Host.SystemPrompt) == 0 {
if len(prompt) == 0 {
return input, nil
}
return append([]*schema.Message{{
Role: schema.System,
Content: config.Host.SystemPrompt,
Content: prompt,
}}, input...), nil
}
if err := g.AddChatModelNode(hostName, config.Host.ChatModel, compose.WithStatePreHandler(preHandler), compose.WithNodeName(hostName)); err != nil {
if err := g.AddChatModelNode(hostName, model, compose.WithStatePreHandler(preHandler), compose.WithNodeName(hostName)); err != nil {
return err
}

return g.AddEdge(compose.START, hostName)
}

func addDirectAnswerBranch(convertorName string, g *compose.Graph[[]*schema.Message, *schema.Message], config *MultiAgentConfig) error {
func addDirectAnswerBranch(convertorName string, g *compose.Graph[[]*schema.Message, *schema.Message],
toolCallChecker func(ctx context.Context, modelOutput *schema.StreamReader[*schema.Message]) (bool, error)) error {
// handles the case where the host agent returns a direct answer, instead of handing off to any specialist
branch := compose.NewStreamGraphBranch(func(ctx context.Context, sr *schema.StreamReader[*schema.Message]) (endNode string, err error) {
isToolCall, err := config.StreamToolCallChecker(ctx, sr)
isToolCall, err := toolCallChecker(ctx, sr)
if err != nil {
return "", err
}
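The doc comment in this file warns that the default `StreamToolCallChecker` inspects only the first chunk of model output. A hedged sketch of why that matters, using a plain slice of chunks as a stand-in for `schema.StreamReader` (these are not the actual Eino types):

```go
package main

import "fmt"

type chunk struct {
	Content   string
	ToolCalls []string
}

// firstChunkChecker mirrors the default behavior the doc comment
// warns about: only the first chunk is inspected.
func firstChunkChecker(stream []chunk) bool {
	return len(stream) > 0 && len(stream[0].ToolCalls) > 0
}

// fullStreamChecker keeps reading until it sees a tool call or the
// stream ends, so leading "thinking" text does not hide a tool call.
func fullStreamChecker(stream []chunk) bool {
	for _, c := range stream {
		if len(c.ToolCalls) > 0 {
			return true
		}
	}
	return false
}

func main() {
	// A model that emits some text before its tool call.
	s := []chunk{{Content: "Let me check."}, {ToolCalls: []string{"get_weather"}}}
	fmt.Println(firstChunkChecker(s)) // false: the tool call is missed
	fmt.Println(fullStreamChecker(s)) // true: the tool call is detected
}
```

A real custom checker would consume the `StreamReader` and must close it when done; the slice above only models the scan-all-chunks idea.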
14 changes: 1 addition & 13 deletions flow/agent/multiagent/host/types.go
@@ -79,7 +79,7 @@ type MultiAgentConfig struct {
StreamToolCallChecker func(ctx context.Context, modelOutput *schema.StreamReader[*schema.Message]) (bool, error)
}

func (conf *MultiAgentConfig) validateAndFillDefault() error {
func (conf *MultiAgentConfig) validate() error {
if conf == nil {
return errors.New("host multi agent config is nil")
}
@@ -92,10 +92,6 @@ func (conf *MultiAgentConfig) validateAndFillDefault() error {
return errors.New("host multi agent specialists are empty")
}

if len(conf.Host.SystemPrompt) == 0 {
conf.Host.SystemPrompt = defaultHostPrompt
}

for _, s := range conf.Specialists {
if s.ChatModel == nil && s.Invokable == nil && s.Streamable == nil {
return fmt.Errorf("specialist %s has no chat model or Invokable or Streamable", s.Name)
@@ -106,14 +102,6 @@ func (conf *MultiAgentConfig) validateAndFillDefault() error {
}
}

if len(conf.Name) == 0 {
conf.Name = "host multi agent"
}

if conf.StreamToolCallChecker == nil {
conf.StreamToolCallChecker = firstChunkStreamToolCallChecker
}

return nil
}

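The net effect of this refactor is that `validate` becomes side-effect-free while defaults are applied by the constructor into locals, leaving the caller's config untouched. A minimal sketch of that pattern with a hypothetical `Config` type (not the actual Eino types):

```go
package main

import (
	"errors"
	"fmt"
)

type Config struct {
	Name   string
	Prompt string
}

// validate only reports problems; it no longer mutates the config.
func (c *Config) validate() error {
	if c == nil {
		return errors.New("config is nil")
	}
	return nil
}

// New applies defaults to local variables instead of writing them
// back into the caller's Config.
func New(c *Config) (name, prompt string, err error) {
	if err := c.validate(); err != nil {
		return "", "", err
	}
	name, prompt = c.Name, c.Prompt
	if name == "" {
		name = "host multi agent"
	}
	if prompt == "" {
		prompt = "default prompt"
	}
	return name, prompt, nil
}

func main() {
	cfg := &Config{}
	name, _, _ := New(cfg)
	fmt.Println(name)            // defaults applied locally
	fmt.Println(cfg.Name == "") // caller's config left unmodified
}
```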
