Configured LLM model in graph-memory is ignored #26

@Ruiww

Description
| Item | Details |
|------|---------|
| Bug | The `config.llm` setting is ignored |
| Symptom | `alibaba/qwen3.5-flash` is configured, but `moonshot/kimi-k2.5` is actually used |
| Root cause | The `readProviderModel` function only reads the global `api.config`, never the plugin-level `api.pluginConfig` |
| Location | `index.ts`, lines 85-103 and line 173 |
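A minimal sketch of the fix described above: have `readProviderModel` prefer the plugin-scoped config and fall back to the global one. The interface shape and field names (`PluginApi`, `pluginConfig.llm`) are assumptions for illustration; the actual types in `index.ts` may differ.

```typescript
// Hypothetical shapes -- the real plugin API in this repo may differ.
interface PluginApi {
  config: { llm?: string };        // global config (currently the only source read)
  pluginConfig?: { llm?: string }; // plugin-scoped config (ignored by the buggy code)
}

// Prefer the plugin-level setting; fall back to the global one.
function readProviderModel(api: PluginApi): { provider: string; model: string } {
  const raw = api.pluginConfig?.llm ?? api.config.llm ?? "";
  const [provider, ...rest] = raw.split("/");
  return { provider, model: rest.join("/") };
}

// Scenario from the report: the plugin config should win over the global default.
const api: PluginApi = {
  config: { llm: "moonshot/kimi-k2.5" },
  pluginConfig: { llm: "alibaba/qwen3.5-flash" },
};
console.log(readProviderModel(api)); // provider "alibaba", model "qwen3.5-flash"
```

With this ordering, users who only set the global `config.llm` see no behavior change, while a plugin-level override is respected.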
