
1.Add Local Models #531


Open · Annisun2016 wants to merge 1 commit into main

Conversation


@Annisun2016 Annisun2016 commented Apr 15, 2025

Summary by mrge

Added support for local model integration and fixed message parsing issues. These changes allow users to connect to locally hosted models and improve message handling by stripping `<think>` tags from AI responses.

New Features

  • Added local model provider option with configuration in environment variables (see the sketch after this list).
  • Added local model to the provider list with appropriate model options.
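
As a rough illustration of that environment-variable configuration: a local OpenAI-compatible endpoint (Ollama, vLLM, LM Studio, etc.) typically needs a base URL, an optional API key, and a model name. The variable names below are assumptions for illustration, not necessarily the PR's exact keys:

```python
import os

# Hypothetical keys for the local provider; the PR's actual names may differ.
LOCAL_ENDPOINT = os.getenv("LOCAL_ENDPOINT", "http://127.0.0.1:11434/v1")  # local server URL
LOCAL_API_KEY = os.getenv("LOCAL_API_KEY", "")        # many local servers accept any/no key
LOCAL_MODEL_NAME = os.getenv("LOCAL_MODEL_NAME", "")  # e.g. "qwen2.5:7b"
```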

Bug Fixes

  • Fixed message parsing by stripping `<think>` tags from AI responses with a regex.
  • Updated model parameter naming from "model_name" to "model" for consistency.

2.Fix Message think errors
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


liyan seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.

@mrge-io (bot) left a comment

mrge found 4 issues across 3 files. View them in mrge.io

```diff
@@ -223,6 +224,7 @@ async def get_next_action(self, input_messages: list[BaseMessage]) -> AgentOutput
         ai_content = ai_message.content

         try:
+            ai_content = re.sub(r"<think>.*?</think>", "", ai_content, flags=re.DOTALL).strip()
```

Missing comment explaining the purpose of removing tags from the AI response
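
A minimal sketch of the same stripping, with the explanatory comment the reviewer asks for (the `<think>` pattern comes from the diff above; the helper name is illustrative):

```python
import re

def strip_think_tags(ai_content: str) -> str:
    # Reasoning-style local models wrap their chain of thought in
    # <think>...</think> blocks. Strip them so downstream parsing sees
    # only the final answer; DOTALL lets the pattern span newlines.
    return re.sub(r"<think>.*?</think>", "", ai_content, flags=re.DOTALL).strip()
```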

```diff
@@ -180,7 +181,22 @@ def get_llm_model(provider: str, **kwargs):
         return ChatOpenAI(
             api_key=api_key,
             base_url=base_url,
-            model_name=kwargs.get("model_name", "Qwen/QwQ-32B"),
+            model=kwargs.get("model_name", "Qwen/QwQ-32B"),
```

Parameter name changed from model_name to model for the siliconflow provider, indicating a bug fix. This suggests there may be other inconsistencies in other providers.
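
For reference, in recent langchain_openai versions the documented constructor keyword is `model` (internally aliased to the `model_name` field), so a consistent pattern across provider branches might look like this sketch (the helper name is illustrative, not the PR's code):

```python
from langchain_openai import ChatOpenAI

def make_openai_compatible_llm(api_key: str, base_url: str, **kwargs) -> ChatOpenAI:
    # Always pass the documented `model=` keyword, whatever the caller's
    # kwarg is named, so every provider branch reads the same way.
    return ChatOpenAI(
        api_key=api_key,
        base_url=base_url,
        model=kwargs.get("model_name", "Qwen/QwQ-32B"),
        temperature=kwargs.get("temperature", 0.0),
    )
```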

```python
        return ChatOpenAI(
            api_key=api_key,
            base_url=base_url,
            model=kwargs.get("model_name", ""),
```

Empty string as default model name for local provider could cause API errors if the model name is required
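
One way to fail fast rather than send an empty model name to the server (a sketch; the helper name and error text are illustrative):

```python
def require_model_name(kwargs: dict) -> str:
    # Local OpenAI-compatible servers generally reject requests whose
    # model field is empty, so validate up front with a clear error.
    model = kwargs.get("model_name", "").strip()
    if not model:
        raise ValueError("provider 'local' requires a model name; pass model_name explicitly")
    return model
```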

```python
            model=kwargs.get("model_name", "Qwen/QwQ-32B"),
            temperature=kwargs.get("temperature", 0.0),
        )
    elif provider == "local":
```

Local provider implementation duplicates API key and base_url extraction logic
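
A sketch of how the duplicated extraction could be factored out (the function and environment-variable names are assumptions, not the PR's actual code):

```python
import os

def resolve_credentials(provider: str, **kwargs) -> tuple[str, str]:
    # Shared helper: prefer explicit kwargs, then fall back to
    # provider-prefixed environment variables (names illustrative).
    prefix = provider.upper()
    api_key = kwargs.get("api_key") or os.getenv(f"{prefix}_API_KEY", "")
    base_url = kwargs.get("base_url") or os.getenv(f"{prefix}_ENDPOINT", "")
    return api_key, base_url
```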

@Annisun2016 (Author)

Add Local Models
